InkCop’s AI features require at least one configured LLM provider.
## Supported Providers (16+)

### Chinese Providers
| Provider | Model | Features |
|---|---|---|
| Alibaba Bailian | qwen3-235b-a22b | Strong Chinese, multimodal |
| DeepSeek | deepseek-reasoner | Excellent reasoning, cost-effective |
| Moonshot (Kimi) | kimi-latest | Long-context, Chinese-friendly |
| Zhipu GLM | glm-4.7 | Stable tool calling |
### International Providers
| Provider | Model | Features |
|---|---|---|
| OpenAI | gpt-4o | Strongest overall |
| Anthropic | claude-4-sonnet | Great academic writing |
| Google Gemini | gemini-2.5-pro | Strong multimodal |
| Ollama | Local models | Fully offline |
## Configuration

1. Open Settings → AI → LLM Providers
2. Select a provider, then enter your API Key and Model
3. Click Test Connection to verify the credentials
4. Toggle Enable
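Most of the cloud providers listed above (DeepSeek, Moonshot, Zhipu, Bailian) and Ollama expose an OpenAI-compatible chat-completions API, so a connection test is essentially a minimal one-token request. The sketch below shows roughly what such a probe sends; the endpoint URL, key, and helper name are illustrative, not InkCop internals:

```python
import json

def build_test_request(base_url: str, api_key: str, model: str) -> dict:
    """Build a minimal one-token chat request to use as a connectivity probe."""
    return {
        "url": f"{base_url.rstrip('/')}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({
            "model": model,
            "messages": [{"role": "user", "content": "ping"}],
            "max_tokens": 1,  # keep the probe as cheap as possible
        }),
    }

req = build_test_request("https://api.deepseek.com/v1", "sk-test", "deepseek-reasoner")
print(req["url"])  # https://api.deepseek.com/v1/chat/completions
```

Sending this payload with any HTTP client and checking for a 200 response is what a provider health check amounts to.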
## Recommended Setups
| Plan | Main Model | Embedding | Best For |
|---|---|---|---|
| Cost-effective | DeepSeek | Bailian text-embedding-v4 | Beginners |
| Best experience | Claude 4 / GPT-4o | OpenAI text-embedding-3-large | English papers |
| Fully offline | Ollama + Qwen3-32B | Ollama + nomic-embed-text | Privacy-focused |
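Each plan pairs one chat model with one embedding model, so it maps to two provider entries. As a sketch, the cost-effective plan might look like the following (field names and environment-variable names are illustrative, not InkCop's actual settings schema):

```python
# Illustrative only: InkCop's real settings schema may differ.
cost_effective_plan = {
    "chat": {
        "provider": "deepseek",
        "model": "deepseek-reasoner",
        "api_key_env": "DEEPSEEK_API_KEY",
    },
    "embedding": {
        "provider": "alibaba-bailian",
        "model": "text-embedding-v4",
        "api_key_env": "DASHSCOPE_API_KEY",  # assumed: Bailian runs on DashScope
    },
}
print(cost_effective_plan["chat"]["model"])  # deepseek-reasoner
```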
> ⚠️ Local models require 16 GB+ of GPU VRAM. Use cloud services if your hardware is limited.
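The 16 GB figure follows from simple arithmetic: a 32B-parameter model at 4-bit quantization needs about 32 × 10⁹ × 0.5 bytes = 16 GB for the weights alone, before KV cache and runtime overhead. A rough back-of-the-envelope helper (my own sketch, not an InkCop utility):

```python
def weight_vram_gb(params_billion: float, bits_per_param: int) -> float:
    """Rough VRAM needed for model weights alone (excludes KV cache/overhead)."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9  # decimal GB

# Qwen3-32B at 4-bit quantization: ~16 GB of weights before any overhead.
print(round(weight_vram_gb(32, 4)))  # 16
```

The same formula shows why unquantized (16-bit) weights for the same model would need roughly 64 GB, well beyond a single consumer GPU.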