Description
Package LunaAI (formerly cosmic_llm), a native COSMIC desktop AI assistant application built with Rust and the libcosmic framework. LunaAI provides a modern, native desktop interface for AI conversations with integrated Model Context Protocol (MCP) support, conversation history management, and multi-provider backend support (OpenAI, Anthropic, Gemini, Ollama).
Repository: https://github.com/digit1024/LunaAI
Application Review
✅ Strengths
- Native COSMIC Integration: Built with libcosmic framework for seamless COSMIC desktop experience
- Multi-Provider Support: OpenAI, Anthropic, Gemini, Ollama, and custom endpoints
- MCP Protocol: Full Model Context Protocol support for tool calling and external service integration
- Real-time Streaming: Async architecture with tokio for smooth, non-blocking UI updates
- Conversation Management: SQLite-based persistent storage for chat history
- Secure Credentials: keyring integration for secure API key storage
- Well-Structured: Clean Rust codebase with proper dependency management
- Active Development: Recent commits and active maintenance
⚠️ Considerations
- COSMIC Dependency: Requires COSMIC desktop environment (available in NixOS 25.05+)
- Build Complexity: Uses git dependency for libcosmic (requires handling in Nix)
- Node.js for MCP: MCP servers often use npx, requiring Node.js runtime
- Storage Requirements: SQLite database, config files, conversation history
- API Keys: Needs secure secret management integration (our agenix setup)
Technical Analysis
Dependencies
Core Dependencies:
- libcosmic (git: pop-os/libcosmic, branch: master)
- tokio (async runtime)
- reqwest (HTTP client for LLM APIs)
- rusqlite (conversation storage)
- keyring (secure credential storage)
- tokio-tungstenite (WebSocket for MCP)
System Requirements:
- COSMIC desktop environment
- SQLite3
- OpenSSL
- D-Bus
- Node.js (for MCP servers)
NixOS Packaging Strategy
Approach: buildRustPackage with cargoLock.lockFile
```nix
{ lib
, rustPlatform
, fetchFromGitHub
, pkg-config
, sqlite
, openssl
, dbus
, makeWrapper
, nodejs
}:
# Note: libcosmic is pulled in as a Cargo git dependency via
# cargoLock.outputHashes, not as a Nix build input.

rustPlatform.buildRustPackage rec {
  pname = "luna-ai";
  version = "0.1.0";

  src = fetchFromGitHub {
    owner = "digit1024";
    repo = "LunaAI";
    rev = "v${version}"; # Or a specific commit
    hash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=";
  };

  cargoLock = {
    lockFile = ./Cargo.lock;
    outputHashes = {
      # The libcosmic git dependency needs its hash pinned here
      "libcosmic-..." = "sha256-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX=";
    };
  };

  nativeBuildInputs = [
    pkg-config
    makeWrapper
  ];

  buildInputs = [
    sqlite
    openssl
    dbus
  ];

  postInstall = ''
    # Wrap with Node.js in PATH for MCP support
    wrapProgram $out/bin/cosmic_llm \
      --prefix PATH : ${lib.makeBinPath [ nodejs ]}
  '';

  meta = with lib; {
    description = "Native COSMIC desktop AI assistant with MCP support";
    homepage = "https://github.com/digit1024/LunaAI";
    license = licenses.mit;
    maintainers = [ ];
    platforms = platforms.linux;
  };
}
```

Research Summary
Recommended Approach
- Use Modern Rust Packaging: Leverage `cargoLock.lockFile` for easier maintenance
- Handle Git Dependencies: Use `outputHashes` for the libcosmic git dependency
- Create Modular Configuration: Separate module in `modules/desktop/luna-ai.nix`
- Integrate with Existing AI Infrastructure: Connect to our agenix secrets for API keys
- Wrap with Dependencies: Include Node.js in PATH for MCP server support
- Desktop Integration: Provide .desktop file and proper XDG integration
Key Considerations
Technical:
- libcosmic git dependency requires handling with `outputHashes`
- MCP servers need npx/Node.js runtime in PATH
- SQLite database needs proper permissions and location
- Config files use TOML format, need generation from Nix options
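
A minimal sketch of what the TOML generation could look like as a Home Manager module (the `programs.luna-ai` option namespace and the `cosmic_llm/config.toml` path are assumptions; LunaAI's actual config schema and XDG paths must be verified against the upstream source):

```nix
{ config, lib, pkgs, ... }:
let
  cfg = config.programs.luna-ai; # hypothetical option namespace
  tomlFormat = pkgs.formats.toml { };
in
{
  options.programs.luna-ai = {
    enable = lib.mkEnableOption "LunaAI COSMIC AI assistant";
    settings = lib.mkOption {
      type = tomlFormat.type;
      default = { };
      description = "Contents of LunaAI's config.toml (schema assumed).";
    };
  };

  config = lib.mkIf cfg.enable {
    # Assumed config location; confirm where LunaAI looks for its config.
    xdg.configFile."cosmic_llm/config.toml".source =
      tomlFormat.generate "luna-ai-config.toml" cfg.settings;
  };
}
```

Using `pkgs.formats.toml` keeps the option type and the generated file in sync, which is the standard pattern for modules that render structured config files.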
Integration:
- Connect to existing AI provider secrets (api-openai, api-anthropic, api-gemini)
- Ollama integration with our existing Ollama service
- Configuration management through Home Manager or NixOS options
- Desktop file for application launcher integration
Host Compatibility:
- P620: ✅ GNOME currently, would need COSMIC desktop for full experience
- Razer: ✅ GNOME currently, same consideration
- Alternative: Could run as standalone app even without COSMIC
Packaging Challenges
- libcosmic Git Dependency: Need to handle master branch tracking or pin to specific commit
- COSMIC Desktop Availability: Ensure NixOS 25.05+ or nixos-cosmic flake integration
- MCP Configuration: Generate proper mcp_config.json from Nix options
- API Key Management: Integrate with our agenix secret management
- Storage Paths: Ensure proper XDG directory structure for data persistence
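
For the `mcp_config.json` challenge, a sketch of rendering it from Nix attrs (the `mcpServers` key layout follows the common MCP client convention and the target path is an assumption; both need checking against what LunaAI actually reads):

```nix
{ config, ... }:
let
  # Example server set; names, packages, and args are illustrative only.
  mcpServers = {
    filesystem = {
      command = "npx";
      args = [ "-y" "@modelcontextprotocol/server-filesystem" "/home/user" ];
    };
  };
in
{
  # Assumed location; verify where LunaAI expects mcp_config.json.
  xdg.configFile."cosmic_llm/mcp_config.json".text =
    builtins.toJSON { inherit mcpServers; };
}
```

`builtins.toJSON` avoids hand-maintaining JSON and lets MCP servers be added as plain Nix attributes in the module options.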
References
- LunaAI Repository
- COSMIC NixOS Wiki
- nixos-cosmic Flake
- Rust Packaging in NixOS
- buildRustPackage Documentation
- Model Context Protocol
Use Cases for Our Infrastructure
P620 (Primary AI Host - AMD Workstation)
- Benefit: Native desktop AI assistant for development workflows
- Integration: Use existing Ollama service, Anthropic/OpenAI API keys
- MCP Use: File operations, system tasks, development automation
- Consideration: Currently runs GNOME; would need COSMIC or run as standalone
Razer (Mobile Development - Intel/NVIDIA Laptop)
- Benefit: AI assistant for mobile development work
- Integration: Connect to remote Ollama on P620, use cloud providers
- MCP Use: Email management, todo lists, web search
- Consideration: Currently runs GNOME; consider COSMIC for full experience
Alternatives if COSMIC Not Desired
- Run as Flatpak: Use official Flatpak release (less Nix integration)
- Wait for GTK/Qt Port: Monitor if project adds non-COSMIC support
- COSMIC in VM: Run COSMIC desktop in MicroVM for testing
Implementation Plan
Phase 1: Basic Packaging (M)
- Create Nix package derivation with buildRustPackage
- Handle libcosmic git dependency with outputHashes
- Configure build inputs and dependencies
- Test basic build and runtime
Phase 2: NixOS Module (M)
- Create module in `modules/desktop/luna-ai.nix`
- Define configuration options (profiles, backends, MCP servers)
- Generate config.toml from Nix options
- Integrate with agenix for API key management
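
One possible shape for the agenix wiring (secret file paths are assumptions, and whether LunaAI can read keys from a file or environment rather than insisting on keyring is an open question to verify before relying on this):

```nix
{ config, ... }:
{
  # Reuse the existing provider secret rather than creating a new one.
  age.secrets.api-anthropic.file = ../../secrets/api-anthropic.age;

  # Hypothetical: if LunaAI supports a *_FILE-style override, point it at
  # the decrypted secret; otherwise a launch wrapper or the keyring is needed.
  environment.sessionVariables.ANTHROPIC_API_KEY_FILE =
    config.age.secrets.api-anthropic.path;
}
```

If LunaAI only supports keyring storage, this step collapses to documenting a one-time manual key entry instead of declarative wiring.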
Phase 3: AI Provider Integration (S)
- Connect to existing api-openai, api-anthropic, api-gemini secrets
- Configure Ollama backend to use existing service
- Set up default profiles for each provider
- Test multi-provider switching
Phase 4: MCP Configuration (M)
- Generate mcp_config.json from Nix options
- Wrap binary with Node.js in PATH
- Configure common MCP servers (filesystem, weather, etc.)
- Test MCP tool calling functionality
Phase 5: Desktop Integration (S)
- Create .desktop file for application launcher
- Set up proper XDG directory structure
- Configure SQLite database location
- Test conversation history persistence
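
A sketch of the launcher entry (the `Exec` name matches the `cosmic_llm` binary wrapped in the derivation above; the icon name is an assumption, and upstream may already ship a `.desktop` file worth reusing):

```ini
[Desktop Entry]
Type=Application
Name=LunaAI
Comment=Native COSMIC desktop AI assistant
Exec=cosmic_llm
Icon=luna-ai
Terminal=false
Categories=Utility;Chat;
```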
Phase 6: Documentation & Testing (S)
- Document configuration options
- Create usage examples
- Test on P620 and Razer
- Update roadmap and CLAUDE.md
Acceptance Criteria
- LunaAI packages successfully with buildRustPackage
- All dependencies (including libcosmic) resolve correctly
- NixOS module provides clean configuration interface
- API keys integrate with agenix secrets
- Ollama backend connects to existing service
- MCP servers work with Node.js wrapper
- Application launches and streams responses correctly
- Conversation history persists across restarts
- Desktop file appears in application launcher
- Configuration follows PATTERNS.md best practices
- No anti-patterns from NIXOS-ANTI-PATTERNS.md
- Documentation updated with usage examples
- Tested on both P620 and Razer hosts
Implementation Checklist
- Research COSMIC desktop integration requirements
- Create feature branch: `feature/[issue-number]-luna-ai-package`
- Implement Nix package derivation
- Create NixOS module with configuration options
- Integrate with existing AI infrastructure
- Configure MCP server support
- Set up desktop integration
- Write comprehensive documentation
- Run validation: `just validate`
- Test on P620: `just test-host p620`
- Test on Razer: `just test-host razer`
- Create pull request with detailed description
- Code review focusing on Rust packaging patterns
- Merge after approval and testing
Estimated Effort
Total: L (2 weeks)
- Phase 1 (Basic Packaging): M (1 week) - Handling git dependencies
- Phase 2 (NixOS Module): M (1 week) - Configuration generation
- Phase 3 (AI Integration): S (2-3 days) - Using existing infrastructure
- Phase 4 (MCP Config): M (1 week) - Node.js integration and testing
- Phase 5 (Desktop): S (2-3 days) - Standard desktop integration
- Phase 6 (Docs/Testing): S (2-3 days) - Documentation and validation
Alternative: Quick Evaluation
For immediate testing without full packaging:
```shell
# Clone and build locally
cd /tmp
git clone https://github.com/digit1024/LunaAI.git
cd LunaAI
nix-shell -p cargo rustc pkg-config sqlite openssl dbus --run "cargo build --release"

# Test with existing API keys
mkdir -p ~/.local/share/cosmic_llm
# Configure with our API keys, then launch
./target/release/cosmic_llm
```

Related Issues
- Consider creating sub-issues for each phase if implementation is complex
- Link to any COSMIC desktop integration issues in nixpkgs
Notes
IMPORTANT: This requires evaluating whether we want to:
- Switch to COSMIC desktop environment (available in NixOS 25.05)
- Use nixos-cosmic flake for testing
- Run as standalone application (may have limited COSMIC integration)
The application is well-structured and would be an excellent addition to our AI infrastructure, providing a native desktop experience that complements our existing Claude Code and shell-based AI tools.