## Summary
Adds local LLM support via Ollama for privacy-first, offline-capable package management.
## Features
- ✅ Auto-detect Ollama installation (see the sketch after this list)
- ✅ Smart model selection (prefers code-focused models)
- ✅ Streaming responses (sketch after the model list)
- ✅ Fallback to Claude/OpenAI (routing sketch under the CodeRabbit summary)
- ✅ Works completely offline
- ✅ Zero data sent to cloud
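A minimal sketch of how the detection and model-selection steps could work, assuming Ollama's default local endpoint (`http://localhost:11434`) and its `/api/tags` model-listing route. The helper names and preference order here are illustrative, not the shipped `ollama_integration.py` API:

```python
import requests

OLLAMA_URL = "http://localhost:11434"  # Ollama's default local endpoint

# Hypothetical preference order; the real integration may rank models differently.
PREFERRED_PREFIXES = ("codellama", "deepseek-coder", "llama3.2", "mistral", "phi")

def ollama_available(timeout: float = 1.0) -> bool:
    """Return True if a local Ollama server answers on the default port."""
    try:
        return requests.get(f"{OLLAMA_URL}/api/tags", timeout=timeout).ok
    except requests.RequestException:
        return False

def pick_model() -> str | None:
    """Prefer code-focused models among those already pulled locally."""
    models = requests.get(f"{OLLAMA_URL}/api/tags").json().get("models", [])
    names = [m["name"] for m in models]
    for prefix in PREFERRED_PREFIXES:
        for name in names:
            if name.startswith(prefix):
                return name
    return names[0] if names else None  # fall back to whatever is installed
```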
## Supported Models
- CodeLlama (7B, 13B, 34B)
- Llama 3.2
- Mistral / Mixtral
- DeepSeek Coder
- Phi-2
- And more...
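Any of these models can serve streamed completions. A hedged sketch of the streaming feature, using Ollama's `/api/generate` endpoint, which returns one JSON object per line until `done` is true (the function name is illustrative):

```python
import json
import requests

def stream_completion(model: str, prompt: str):
    """Yield text chunks as the local model produces them (no cloud round-trip)."""
    with requests.post(
        "http://localhost:11434/api/generate",
        json={"model": model, "prompt": prompt, "stream": True},
        stream=True,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            chunk = json.loads(line)
            if chunk.get("done"):
                break
            yield chunk.get("response", "")

# Example: print tokens as they arrive, entirely offline.
# for token in stream_completion("codellama:7b", "Explain what pip install does"):
#     print(token, end="", flush=True)
```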
## Files Added

- `cortex/providers/ollama_integration.py` - Core integration
- `tests/test_ollama_integration.py` - Test suite
- `docs/README_OLLAMA.md` - Documentation
## Testing

`pytest tests/test_ollama_integration.py -v`

Bounty: $150 (+ $150 bonus after funding)
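An illustrative test in the spirit of that suite, assuming the `ollama_available()` helper sketched above; the actual tests in `tests/test_ollama_integration.py` may be structured differently:

```python
from unittest import mock

import requests

from cortex.providers import ollama_integration  # path from "Files Added" above

def test_detection_when_server_is_down():
    # Simulate no local Ollama server: the HTTP call raises a connection error,
    # and detection should report the provider as unavailable.
    # (Assumes the module calls requests.get, as in the sketch above.)
    with mock.patch.object(
        ollama_integration.requests, "get",
        side_effect=requests.ConnectionError,
    ):
        assert ollama_integration.ollama_available() is False
```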
🤖 Generated with Claude Code
## Summary by CodeRabbit

**New Features**
- Added local LLM support through Ollama integration with automatic model discovery and selection
- Enabled offline and privacy-preserving AI capabilities with model management and configuration options
- Introduced provider routing with automatic fallback support (see the sketch below)
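A self-contained sketch of what such routing could look like, assuming a simple mapping from provider name to a completion callable; the exception type and provider names are hypothetical, not the integration's actual API:

```python
from typing import Callable, Dict, Iterable

class ProviderUnavailable(Exception):
    """Raised when a provider cannot serve a request (hypothetical exception)."""

def route_completion(
    prompt: str,
    providers: Dict[str, Callable[[str], str]],
    order: Iterable[str] = ("ollama", "claude", "openai"),
) -> str:
    """Try local Ollama first, then fall back to cloud providers in order."""
    errors = {}
    for name in order:
        try:
            return providers[name](prompt)
        except (KeyError, ProviderUnavailable) as exc:
            errors[name] = exc  # record the failure and try the next provider
    raise RuntimeError(f"all providers failed: {errors}")
```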
**Documentation**
- Added comprehensive integration guide covering setup, configuration, usage examples, troubleshooting, and architecture details
**Tests**
- Added test coverage for LLM provider functionality and integration scenarios