One-click, tray-accessible agent host for AbstractFramework.
AbstractAssistant runs agentic loops (ReAct/CodeAct/MemAct) on top of:
- AbstractAgent (agent patterns)
- AbstractRuntime (durable runs, waits, ledgers)
- AbstractCore (providers + tools + media handling)
- AbstractVoice (STT/TTS)
Docs:
- docs/getting-started.md
- docs/architecture.md
```bash
pip install "abstractassistant"
```

Tray (macOS):

```bash
assistant tray
```

Alias:

```bash
abstractassistant tray
```

Terminal (one turn):

```bash
assistant run --prompt "What is in this repo and where do I start?"
```

Provider/model override:

```bash
assistant run --provider ollama --model qwen3:4b-instruct --prompt "Summarize my changes"
```

AbstractAssistant enforces a durable tool boundary:
- read-only / known-safe tools can auto-run
- anything else pauses and requires approval (tray dialog or terminal prompt)
This aligns with the framework’s durability + safety model: tools are executed by the host, not persisted as callables inside run state.
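As a minimal sketch of this boundary (the tool names, `ToolCall` shape, and `execute` helper below are hypothetical, not the actual AbstractAssistant API; the real host routes execution through AbstractRuntime's durable run and ledger machinery):

```python
from dataclasses import dataclass
from typing import Any, Callable

# Hypothetical allowlist for illustration: read-only / known-safe tools.
AUTO_APPROVED = {"read_file", "list_dir", "search"}

@dataclass
class ToolCall:
    name: str
    args: dict

def execute(call: ToolCall, tools: dict[str, Callable[..., Any]],
            ask_user: Callable[[ToolCall], bool]) -> Any:
    """Run allowlisted tools immediately; anything else pauses for approval."""
    if call.name not in AUTO_APPROVED and not ask_user(call):
        return {"status": "denied", "tool": call.name}
    return tools[call.name](**call.args)

# Example: a read tool auto-runs, a destructive tool needs approval.
tools = {"read_file": lambda path: f"<contents of {path}>",
         "delete_file": lambda path: f"deleted {path}"}
print(execute(ToolCall("read_file", {"path": "README.md"}), tools, lambda c: False))
print(execute(ToolCall("delete_file", {"path": "README.md"}), tools, lambda c: False))
```

In the real host, `ask_user` corresponds to the tray dialog or terminal prompt, and the run itself is persisted so a pending approval survives restarts.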
By default, assistant state is stored in ~/.abstractassistant/ (configurable via --data-dir):
- `session.json`: fast UI snapshot (transcript + last run id)
- `runtime/`: run store + ledger + artifacts (source of truth)
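The on-disk schema is internal to AbstractAssistant, but as an illustration only (the field names `transcript` and `last_run_id` are assumptions drawn from the description above), the UI snapshot can be read independently of the runtime store:

```python
import json
from pathlib import Path

def load_snapshot(data_dir: Path) -> dict:
    """Return the UI snapshot, or an empty default if none exists yet.

    Field names here are illustrative; the real session.json schema
    belongs to AbstractAssistant and may differ.
    """
    path = data_dir / "session.json"
    if not path.exists():
        return {"transcript": [], "last_run_id": None}
    return json.loads(path.read_text())

snapshot = load_snapshot(Path.home() / ".abstractassistant")
print(snapshot.get("last_run_id"))
```

Since `runtime/` is the source of truth, a stale or deleted `session.json` costs only the fast UI view, not run history.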
```bash
pip install -e ".[dev]"
python -m pytest -q
assistant tray --debug
```

- 📱 Unobtrusive: Lives quietly in your menu bar until needed
- 🔊 Conversational: Optional voice mode for natural AI interactions
| Guide | Description |
|---|---|
| 📖 Installation Guide | Complete setup instructions, prerequisites, and troubleshooting |
| 🎯 Getting Started Guide | Step-by-step usage guide with all features explained |
| 🏗️ Architecture Guide | Technical documentation and development information |
- macOS: 10.14+ (Mojave or later)
- Python: 3.10+
- Qt Framework: PyQt5, PySide2, or PyQt6 (automatically detected)
- Core deps: AbstractAgent + AbstractRuntime + AbstractCore + AbstractVoice (installed with `abstractassistant`)
- Audio note: audio attachments are auto-transcribed via AbstractVoice (first run may download model weights)
- Video note: frame-sampling fallback may require `ffmpeg` on your PATH
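To check whether the optional `ffmpeg` fallback is available, a quick stdlib probe is enough (this helper is illustrative, not part of AbstractAssistant):

```python
import shutil

def has_ffmpeg() -> bool:
    """True if an ffmpeg executable is discoverable on PATH."""
    return shutil.which("ffmpeg") is not None

if not has_ffmpeg():
    print("ffmpeg not found; the video frame-sampling fallback will be unavailable")
```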
Contributions welcome! Please read the architecture documentation and follow the established patterns:
- Clean Code: Follow PEP 8 and use type hints
- Modular Design: Keep components focused and reusable
- Modern UI/UX: Maintain the sleek, native feel
- Error Handling: Always include graceful fallbacks
- Documentation: Update docs for any new features
MIT License - see LICENSE file for details.
AbstractAssistant is built on excellent open-source projects:
- AbstractCore: Universal LLM interface - enables seamless multi-provider support
- AbstractVoice: High-quality text-to-speech engine with natural voice synthesis
- PyQt5/PySide2/PyQt6: Cross-platform GUI framework for the modern interface
- pystray: Cross-platform system tray integration
- Pillow: Image processing for dynamic icon generation
AbstractAssistant integrates seamlessly with other AbstractX projects:
- 🧠 AbstractCore: Universal LLM provider interface
- 🗣️ AbstractVoice: Advanced text-to-speech capabilities
See ACKNOWLEDGMENTS.md for complete attribution.
Built with ❤️ for macOS users who want AI at their fingertips