Foxus - Local-First AI Coding Assistant

Project Overview

Foxus is a privacy-focused, fully offline coding assistant that provides AI-powered code completion, refactoring, bug fixing, and code explanation using locally running language models. Built with modern technologies, it offers a seamless development experience without sending your code to external servers.

Key Features

🔒 Privacy & Security

  • Fully Local: All processing happens on your machine
  • No Internet Required: Works completely offline
  • Zero Data Collection: Your code never leaves your computer

🧠 AI-Powered Assistance

  • Code Explanation: Understand complex code snippets
  • Smart Refactoring: Improve code quality and maintainability
  • Bug Detection & Fixing: Identify and resolve issues
  • Code Completion: Intelligent autocomplete suggestions
  • Documentation Generation: Auto-generate comments and docs

💻 Developer Experience

  • Multi-Language Support: Python, JavaScript, TypeScript, Go, Java, Rust, and more
  • Keyboard Shortcuts: Quick access to AI commands (/explain, /refactor, /fix)
  • Modern UI: Clean, responsive interface with dark theme
  • Project Context: Multi-file analysis and understanding

Performance & Compatibility

  • Cross-Platform: Windows, Linux, and macOS support
  • Lightweight: Built with Tauri for minimal resource usage
  • Fast Response: Local models provide quick feedback
  • Extensible: Easy to add new AI models and commands

Technology Stack

Frontend

  • Framework: Tauri + React + TypeScript
  • Editor: Monaco Editor (VS Code engine)
  • Styling: Tailwind CSS
  • State Management: Zustand
  • Build Tool: Vite

Backend

  • API: FastAPI (Python)
  • LLM Integration: Ollama
  • Models: CodeLlama, Deepseek-Coder, StarCoder
  • File Handling: Python file system APIs
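
The bullets above name the moving parts; a minimal sketch of how the FastAPI layer can forward a request to Ollama is shown below. The route path, request model, and default model tag are illustrative assumptions rather than the exact Foxus API; only the Ollama endpoint and payload shape are standard.

```python
# Minimal sketch: a FastAPI route that forwards a prompt to a local Ollama
# instance. Route path, field names, and default model tag are illustrative.
import httpx
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint


class ExplainRequest(BaseModel):
    code: str
    model: str = "codellama:7b-code"


@app.post("/api/ai/explain")
async def explain(req: ExplainRequest) -> dict:
    prompt = f"Explain what the following code does:\n\n{req.code}"
    async with httpx.AsyncClient(timeout=120) as client:
        resp = await client.post(
            OLLAMA_URL,
            json={"model": req.model, "prompt": prompt, "stream": False},
        )
        resp.raise_for_status()
    return {"explanation": resp.json()["response"]}
```

Because the call never leaves localhost, latency is dominated by model inference rather than the network; streaming and error handling are omitted for brevity.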

Desktop Application

  • Framework: Tauri (Rust + Web Technologies)
  • Bundle Size: ~15MB (significantly smaller than Electron)
  • Performance: Near-native speed and responsiveness

Supported AI Models

Code-Specialized Models

  1. CodeLlama 7B/13B - Meta's code generation model
  2. Deepseek-Coder 6.7B - Advanced code understanding
  3. StarCoder 7B - Multi-language code completion
  4. CodeGemma 7B - Google's code model
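
All four models are served through Ollama, so switching between them is just a matter of changing the tag passed to the standard /api/generate endpoint. A hedged sketch (the helper and comparison loop are illustrative; the tags follow Ollama's library naming and should match whatever you have pulled):

```python
# Compare how each locally pulled code model completes the same snippet.
# /api/generate is Ollama's standard endpoint; the helper itself is illustrative.
import requests

OLLAMA = "http://localhost:11434/api/generate"
CODE_MODELS = ["codellama:7b-code", "deepseek-coder:6.7b", "starcoder:7b", "codegemma:7b"]


def complete(prompt: str, model: str) -> str:
    resp = requests.post(
        OLLAMA,
        json={"model": model, "prompt": prompt, "stream": False},
        timeout=300,
    )
    resp.raise_for_status()
    return resp.json()["response"]


for model in CODE_MODELS:
    print(f"--- {model} ---")
    print(complete("def fibonacci(n):", model))
```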

Model Capabilities

  • Code generation and completion
  • Bug detection and fixing
  • Code explanation and documentation
  • Refactoring suggestions
  • Multi-language understanding

Architecture Benefits

Local-First Approach

  • Privacy: Code never leaves your machine
  • Speed: No network latency
  • Reliability: Works without internet
  • Cost: No API fees or usage limits

Modular Design

  • Extensible: Easy to add new features
  • Maintainable: Clear separation of concerns
  • Testable: Well-structured codebase
  • Scalable: Can handle large projects

Use Cases

Individual Developers

  • Learning: Understand unfamiliar code
  • Productivity: Speed up coding with AI assistance
  • Quality: Improve code through AI suggestions
  • Debugging: Get help fixing complex issues

Teams & Organizations

  • Code Reviews: AI-assisted code analysis
  • Standards: Consistent code quality
  • Documentation: Auto-generated code docs
  • Training: Help junior developers learn

Enterprise

  • Security: Keep sensitive code private
  • Compliance: Meet data residency requirements
  • Customization: Add domain-specific models
  • Integration: Embed in existing workflows

Getting Started

Quick Setup (5 minutes)

  1. Install Prerequisites: Node.js, Python, Rust, Ollama
  2. Install Dependencies: npm install (frontend) and pip install -r requirements.txt (backend)
  3. Download AI Model: ollama pull codellama:7b-code
  4. Start Services: launch the backend API and the frontend app
  5. Start Coding: Open files and use AI commands

Development Workflow

  1. Open Foxus: Launch the desktop application
  2. Load Project: Open your code files
  3. Select Code: Highlight code you want help with
  4. Use AI Commands:
    • Ctrl+Shift+E - Explain code
    • Ctrl+Shift+R - Refactor code
    • Ctrl+Shift+F - Fix bugs
    • Ctrl+K - Command palette
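
Each shortcut sends the current selection to the local backend, which builds a prompt and queries the model through Ollama. A rough sketch of that round trip from the client's point of view (the endpoint path and payload fields are assumptions for illustration, not the exact Foxus API):

```python
# Illustrative round trip for an "explain" command against the local backend.
# Endpoint path and payload fields are assumptions, not the exact Foxus API.
import requests

BACKEND = "http://localhost:8000"
selection = "def add(a, b):\n    return a + b"  # code highlighted in the editor

resp = requests.post(
    f"{BACKEND}/api/ai/explain",
    json={"code": selection, "language": "python"},
    timeout=120,
)
resp.raise_for_status()
print(resp.json())
```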

Project Structure

foxus/
├── backend/                 # FastAPI Python backend
│   ├── app/
│   │   ├── api/            # API routes (ai, files, models)
│   │   ├── core/           # Configuration and settings
│   │   ├── models/         # Pydantic data models
│   │   └── services/       # Ollama integration
│   ├── main.py             # FastAPI entry point
│   └── requirements.txt    # Python dependencies
├── frontend/               # Tauri + React frontend
│   ├── src/
│   │   ├── components/     # React UI components
│   │   ├── stores/         # Zustand state management
│   │   ├── hooks/          # Custom React hooks
│   │   └── App.tsx         # Main application
│   ├── src-tauri/          # Tauri Rust backend
│   │   ├── src/main.rs     # Rust entry point
│   │   └── tauri.conf.json # Tauri configuration
│   └── package.json        # Node.js dependencies
├── docs/                   # Documentation
├── README.md               # Project overview
├── SETUP.md               # Installation guide
└── PROJECT_SUMMARY.md     # This file

Development Roadmap

Phase 1: MVP

  • Basic code editor interface
  • Local AI model integration
  • Core AI commands (explain, refactor, fix)
  • Desktop application framework

Phase 2: Enhanced Features

  • Advanced code completion
  • Project-wide context awareness
  • Custom AI prompts
  • File tree and project management
  • Settings and preferences

Phase 3: Advanced Capabilities

  • Plugin system
  • Custom model training
  • Team collaboration features
  • Integration with version control
  • Advanced debugging assistance

Comparison with Alternatives

| Feature        | Foxus        | GitHub Copilot | Cursor        | Windsurf      |
|----------------|--------------|----------------|---------------|---------------|
| Privacy        | Fully local  | Cloud-based    | Cloud-based   | Cloud-based   |
| Offline        | Yes          | No             | No            | No            |
| Cost           | Free         | 💰 $10/month   | 💰 $20/month  | 💰 $15/month  |
| Customization  | Full control | Limited        | Limited       | Limited       |
| Multi-language | Yes          | Yes            | Yes           | Yes           |
| Speed          | Local        | 🌐 Network     | 🌐 Network    | 🌐 Network    |

Contributing

How to Contribute

  1. Fork the repository
  2. Create a feature branch
  3. Implement your changes
  4. Add tests if applicable
  5. Submit a pull request

Areas for Contribution

  • UI/UX Improvements: Better design and user experience
  • AI Model Integration: Support for new models
  • Language Support: Additional programming languages
  • Performance: Optimization and speed improvements
  • Documentation: Guides and examples

Technical Highlights

Performance Optimizations

  • Tauri vs Electron: roughly 10x smaller bundles than comparable Electron apps
  • Local Processing: Zero network latency
  • Efficient State Management: Zustand for minimal re-renders
  • Code Splitting: Lazy loading for faster startup

Security Features

  • No External Calls: All processing happens locally
  • File System Sandboxing: Tauri security model
  • Input Validation: Comprehensive API validation
  • Error Handling: Graceful failure recovery
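
Input validation leans on the Pydantic models under backend/app/models/: every request body is parsed and checked before a prompt is built. A hedged sketch of what such a model might look like (field names and limits are assumptions):

```python
# Illustrative Pydantic request model; field names and limits are assumptions.
from pydantic import BaseModel, Field


class RefactorRequest(BaseModel):
    code: str = Field(..., min_length=1, max_length=50_000)  # reject empty or oversized payloads
    language: str = Field(..., min_length=1)                 # e.g. "python", "typescript"
    instructions: str | None = None                          # optional user guidance
```

When a model like this is declared as the body of a FastAPI route, malformed requests are rejected with a 422 response before anything reaches the language model.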

Future Vision

Foxus aims to become the de facto standard for privacy-conscious developers who want AI assistance without compromising their code security. The goal is to create an ecosystem where:

  • Developers have full control over their AI assistant
  • Organizations can maintain code privacy and compliance
  • AI Models can be easily customized for specific domains
  • Innovation happens locally without external dependencies

Getting Help

Documentation

  • README.md: Quick project overview
  • SETUP.md: Detailed installation guide
  • API Documentation: Available at http://localhost:8000/docs while the backend is running
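
Since the backend is FastAPI, the interactive docs are generated from an OpenAPI schema that is also served as JSON, which makes it easy to inspect the available routes programmatically (assumes the backend is running on its default port 8000):

```python
# Print every route exposed by the running backend, read from FastAPI's
# auto-generated OpenAPI schema (assumes the default port 8000).
import requests

schema = requests.get("http://localhost:8000/openapi.json", timeout=5).json()
for path, methods in schema["paths"].items():
    print(path, "->", ", ".join(m.upper() for m in methods))
```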

Community

  • Issues: Report bugs and request features
  • Discussions: Ask questions and share ideas
  • Contributing: Help improve Foxus

Support

  • Check troubleshooting in SETUP.md
  • Review API logs for debugging
  • Ensure Ollama is running properly
  • Verify model availability
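
The last two checks can be scripted against Ollama's standard local API; the snippet below confirms the server is reachable and that the expected model has been pulled (adjust the model tag to whatever you chose during setup):

```python
# Quick health check: is Ollama reachable, and is the expected model pulled?
# Uses Ollama's standard /api/tags endpoint on its default port 11434.
import sys
import requests

MODEL = "codellama:7b-code"  # adjust to the model you pulled during setup

try:
    resp = requests.get("http://localhost:11434/api/tags", timeout=5)
    resp.raise_for_status()
except requests.RequestException as exc:
    sys.exit(f"Ollama does not appear to be running: {exc}")

available = {m["name"] for m in resp.json().get("models", [])}
if MODEL in available:
    print(f"OK: {MODEL} is available")
else:
    sys.exit(f"Model {MODEL} not found; run: ollama pull {MODEL}")
```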

Foxus represents the future of AI-assisted development: powerful, private, and fully under your control. Join us in building the next generation of coding tools that respect developer privacy while maximizing productivity.

🦊 Start coding smarter, not harder, with Foxus!