
feature: Package LunaAI - Native COSMIC Desktop AI Assistant with MCP Support #6

@olafkfreund

Description

Package LunaAI (formerly cosmic_llm), a native COSMIC desktop AI assistant application built with Rust and the libcosmic framework. LunaAI provides a modern, native desktop interface for AI conversations with integrated Model Context Protocol (MCP) support, conversation history management, and multi-provider backend support (OpenAI, Anthropic, Gemini, Ollama).

Repository: https://github.com/digit1024/LunaAI

Application Review

Strengths

  1. Native COSMIC Integration: Built with libcosmic framework for seamless COSMIC desktop experience
  2. Multi-Provider Support: OpenAI, Anthropic, Gemini, Ollama, and custom endpoints
  3. MCP Protocol: Full Model Context Protocol support for tool calling and external service integration
  4. Real-time Streaming: Async architecture with tokio for smooth, non-blocking UI updates
  5. Conversation Management: SQLite-based persistent storage for chat history
  6. Secure Credentials: keyring integration for secure API key storage
  7. Well-Structured: Clean Rust codebase with proper dependency management
  8. Active Development: Recent commits and active maintenance

⚠️ Considerations

  1. COSMIC Dependency: Requires COSMIC desktop environment (available in NixOS 25.05+)
  2. Build Complexity: Uses git dependency for libcosmic (requires handling in Nix)
  3. Node.js for MCP: MCP servers often use npx, requiring Node.js runtime
  4. Storage Requirements: SQLite database, config files, conversation history
  5. API Keys: Needs secure secret management integration (our agenix setup)

Technical Analysis

Dependencies

Core Dependencies:

  • libcosmic (git: pop-os/libcosmic, branch: master)
  • tokio (async runtime)
  • reqwest (HTTP client for LLM APIs)
  • rusqlite (conversation storage)
  • keyring (secure credential storage)
  • tokio-tungstenite (WebSocket for MCP)

System Requirements:

  • COSMIC desktop environment
  • SQLite3
  • OpenSSL
  • D-Bus
  • Node.js (for MCP servers)

NixOS Packaging Strategy

Approach: buildRustPackage with cargoLock.lockFile

{ lib
, rustPlatform
, fetchFromGitHub
, pkg-config
, sqlite
, openssl
, dbus
, libcosmic
, makeWrapper
, nodejs
}:

rustPlatform.buildRustPackage rec {
  pname = "luna-ai";
  version = "0.1.0";

  src = fetchFromGitHub {
    owner = "digit1024";
    repo = "LunaAI";
    rev = "v${version}";  # Or specific commit
    hash = "sha256-AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA=";
  };

  cargoLock = {
    lockFile = ./Cargo.lock;
    outputHashes = {
      # Git dependency hash; build once with lib.fakeHash and copy the
      # correct value from the resulting hash-mismatch error.
      "libcosmic-..." = "sha256-XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX=";
    };
  };

  nativeBuildInputs = [
    pkg-config
    makeWrapper
  ];

  buildInputs = [
    sqlite
    openssl
    dbus
  ];

  postInstall = ''
    # Wrap with Node.js in PATH for MCP support
    wrapProgram $out/bin/cosmic_llm \
      --prefix PATH : ${lib.makeBinPath [ nodejs ]}
  '';

  meta = with lib; {
    description = "Native COSMIC desktop AI assistant with MCP support";
    homepage = "https://github.com/digit1024/LunaAI";
    license = licenses.mit;
    maintainers = [ ];
    platforms = platforms.linux;
  };
}

Research Summary

Recommended Approach

  1. Use Modern Rust Packaging: Leverage cargoLock.lockFile for easier maintenance
  2. Handle Git Dependencies: Use outputHashes for libcosmic git dependency
  3. Create Modular Configuration: Separate module in modules/desktop/luna-ai.nix
  4. Integrate with Existing AI Infrastructure: Connect to our agenix secrets for API keys
  5. Wrap with Dependencies: Include Node.js in PATH for MCP server support
  6. Desktop Integration: Provide .desktop file and proper XDG integration

Key Considerations

Technical:

  • libcosmic git dependency requires handling with outputHashes
  • MCP servers need npx/Node.js runtime in PATH
  • SQLite database needs proper permissions and location
  • Config files use TOML format, need generation from Nix options
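For the TOML point above, generation can lean on the nixpkgs formats helper. A minimal Home Manager sketch, assuming LunaAI reads its config from ~/.config/cosmic_llm/config.toml (the path and the option names are assumptions; LunaAI's actual schema must be checked):

```nix
{ pkgs, ... }:
let
  settingsFormat = pkgs.formats.toml { };
  # Hypothetical settings; verify against LunaAI's real config keys.
  lunaSettings = {
    default_provider = "ollama";
    providers.ollama.endpoint = "http://localhost:11434";
  };
in
{
  xdg.configFile."cosmic_llm/config.toml".source =
    settingsFormat.generate "luna-ai-config.toml" lunaSettings;
}
```

Using `pkgs.formats.toml` keeps serialization correct (quoting, nesting) without hand-writing TOML in strings.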

Integration:

  • Connect to existing AI provider secrets (api-openai, api-anthropic, api-gemini)
  • Ollama integration with our existing Ollama service
  • Configuration management through Home Manager or NixOS options
  • Desktop file for application launcher integration

Host Compatibility:

  • P620: ✅ GNOME currently, would need COSMIC desktop for full experience
  • Razer: ✅ GNOME currently, same consideration
  • Alternative: Could run as standalone app even without COSMIC

Packaging Challenges

  1. libcosmic Git Dependency: Need to handle master branch tracking or pin to specific commit
  2. COSMIC Desktop Availability: Ensure NixOS 25.05+ or nixos-cosmic flake integration
  3. MCP Configuration: Generate proper mcp_config.json from Nix options
  4. API Key Management: Integrate with our agenix secret management
  5. Storage Paths: Ensure proper XDG directory structure for data persistence
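For challenge 2, wiring in the nixos-cosmic flake could look roughly like this (a sketch; pin the input and adapt host paths to our layout):

```nix
{
  inputs.nixos-cosmic.url = "github:lilyinstarlight/nixos-cosmic";

  outputs = { nixpkgs, nixos-cosmic, ... }: {
    nixosConfigurations.p620 = nixpkgs.lib.nixosSystem {
      modules = [
        # Module shipped by the nixos-cosmic flake
        nixos-cosmic.nixosModules.default
        { services.desktopManager.cosmic.enable = true; }
        ./hosts/p620/configuration.nix
      ];
    };
  };
}
```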


Use Cases for Our Infrastructure

P620 (Primary AI Host - AMD Workstation)

  • Benefit: Native desktop AI assistant for development workflows
  • Integration: Use existing Ollama service, Anthropic/OpenAI API keys
  • MCP Use: File operations, system tasks, development automation
  • Consideration: Currently runs GNOME; would need COSMIC or run as standalone

Razer (Mobile Development - Intel/NVIDIA Laptop)

  • Benefit: AI assistant for mobile development work
  • Integration: Connect to remote Ollama on P620, use cloud providers
  • MCP Use: Email management, todo lists, web search
  • Consideration: Currently runs GNOME; consider COSMIC for full experience

Alternatives if COSMIC Not Desired

  1. Run as Flatpak: Use official Flatpak release (less Nix integration)
  2. Wait for GTK/Qt Port: Monitor if project adds non-COSMIC support
  3. COSMIC in VM: Run COSMIC desktop in MicroVM for testing

Implementation Plan

Phase 1: Basic Packaging (M)

  • Create Nix package derivation with buildRustPackage
  • Handle libcosmic git dependency with outputHashes
  • Configure build inputs and dependencies
  • Test basic build and runtime

Phase 2: NixOS Module (M)

  • Create module in modules/desktop/luna-ai.nix
  • Define configuration options (profiles, backends, MCP servers)
  • Generate config.toml from Nix options
  • Integrate with agenix for API key management
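The module interface sketched above could start out like this (all option names are illustrative, not taken from any existing module):

```nix
{ config, lib, pkgs, ... }:
let
  cfg = config.programs.luna-ai;
in
{
  options.programs.luna-ai = {
    enable = lib.mkEnableOption "LunaAI COSMIC desktop AI assistant";

    defaultProvider = lib.mkOption {
      type = lib.types.enum [ "openai" "anthropic" "gemini" "ollama" ];
      default = "ollama";
      description = "Backend used for new conversations.";
    };

    mcpServers = lib.mkOption {
      type = lib.types.attrsOf (lib.types.attrsOf lib.types.anything);
      default = { };
      description = "MCP server definitions rendered into mcp_config.json.";
    };
  };

  config = lib.mkIf cfg.enable {
    environment.systemPackages = [ pkgs.luna-ai ];
  };
}
```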

Phase 3: AI Provider Integration (S)

  • Connect to existing api-openai, api-anthropic, api-gemini secrets
  • Configure Ollama backend to use existing service
  • Set up default profiles for each provider
  • Test multi-provider switching
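Wiring the agenix secrets could follow our usual pattern; a sketch for one provider (owner and the env-var consumption are assumptions, since it is not yet verified whether LunaAI reads keys from the environment or only from keyring):

```nix
{ config, ... }:
{
  age.secrets.api-anthropic = {
    file = ../secrets/api-anthropic.age;
    owner = "olafkfreund";  # assumed desktop user
    mode = "0400";
  };

  # If LunaAI honors provider environment variables (to be verified),
  # point a session variable at the decrypted secret path:
  # environment.sessionVariables.ANTHROPIC_API_KEY_FILE =
  #   config.age.secrets.api-anthropic.path;
}
```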

Phase 4: MCP Configuration (M)

  • Generate mcp_config.json from Nix options
  • Wrap binary with Node.js in PATH
  • Configure common MCP servers (filesystem, weather, etc.)
  • Test MCP tool calling functionality
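The mcp_config.json generation can reuse the JSON format helper; the server entry below is illustrative and the config path is assumed to match LunaAI's lookup location:

```nix
{ pkgs, ... }:
let
  jsonFormat = pkgs.formats.json { };
  # Example server; the real schema must match what LunaAI parses.
  mcpConfig = {
    mcpServers = {
      filesystem = {
        command = "npx";
        args = [ "-y" "@modelcontextprotocol/server-filesystem" "/home/olafkfreund" ];
      };
    };
  };
in
{
  xdg.configFile."cosmic_llm/mcp_config.json".source =
    jsonFormat.generate "mcp_config.json" mcpConfig;
}
```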

Phase 5: Desktop Integration (S)

  • Create .desktop file for application launcher
  • Set up proper XDG directory structure
  • Configure SQLite database location
  • Test conversation history persistence
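If upstream does not ship a .desktop file, one can be generated inside the derivation with makeDesktopItem (add copyDesktopItems to nativeBuildInputs; the icon name is an assumption, the binary name follows the wrapProgram call above):

```nix
desktopItems = [
  (makeDesktopItem {
    name = "luna-ai";
    exec = "cosmic_llm";   # binary name used elsewhere in the derivation
    icon = "luna-ai";      # assumed; depends on what the repo ships
    desktopName = "LunaAI";
    comment = "Native COSMIC desktop AI assistant";
    categories = [ "Utility" ];
  })
];
```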

Phase 6: Documentation & Testing (S)

  • Document configuration options
  • Create usage examples
  • Test on P620 and Razer
  • Update roadmap and CLAUDE.md

Acceptance Criteria

  • LunaAI packages successfully with buildRustPackage
  • All dependencies (including libcosmic) resolve correctly
  • NixOS module provides clean configuration interface
  • API keys integrate with agenix secrets
  • Ollama backend connects to existing service
  • MCP servers work with Node.js wrapper
  • Application launches and streams responses correctly
  • Conversation history persists across restarts
  • Desktop file appears in application launcher
  • Configuration follows PATTERNS.md best practices
  • No anti-patterns from NIXOS-ANTI-PATTERNS.md
  • Documentation updated with usage examples
  • Tested on both P620 and Razer hosts

Implementation Checklist

  • Research COSMIC desktop integration requirements
  • Create feature branch: feature/[issue-number]-luna-ai-package
  • Implement Nix package derivation
  • Create NixOS module with configuration options
  • Integrate with existing AI infrastructure
  • Configure MCP server support
  • Set up desktop integration
  • Write comprehensive documentation
  • Run validation: just validate
  • Test on P620: just test-host p620
  • Test on Razer: just test-host razer
  • Create pull request with detailed description
  • Code review focusing on Rust packaging patterns
  • Merge after approval and testing

Estimated Effort

Total: L (roughly 2-3 weeks sequentially; the S-sized phases can overlap the larger ones)

  • Phase 1 (Basic Packaging): M (1 week) - Handling git dependencies
  • Phase 2 (NixOS Module): M (1 week) - Configuration generation
  • Phase 3 (AI Integration): S (2-3 days) - Using existing infrastructure
  • Phase 4 (MCP Config): M (1 week) - Node.js integration and testing
  • Phase 5 (Desktop): S (2-3 days) - Standard desktop integration
  • Phase 6 (Docs/Testing): S (2-3 days) - Documentation and validation

Alternative: Quick Evaluation

For immediate testing without full packaging:

# Clone and build locally
cd /tmp
git clone https://github.com/digit1024/LunaAI.git
cd LunaAI
nix-shell -p cargo rustc pkg-config sqlite openssl dbus \
  --run "cargo build --release"

# Test with existing API keys
mkdir -p ~/.local/share/cosmic_llm
# Configure with our API keys
./target/release/cosmic_llm

Related Issues

  • Consider creating sub-issues for each phase if implementation is complex
  • Link to any COSMIC desktop integration issues in nixpkgs

Notes

IMPORTANT: This requires evaluating whether we want to:

  1. Switch to COSMIC desktop environment (available in NixOS 25.05)
  2. Use nixos-cosmic flake for testing
  3. Run as standalone application (may have limited COSMIC integration)

The application is well-structured and would be an excellent addition to our AI infrastructure, providing a native desktop experience that complements our existing Claude Code and shell-based AI tools.
