# Foxus Setup Guide
This guide will walk you through setting up Foxus, a local-first AI coding assistant.
## Prerequisites
Before starting, ensure you have the following installed:
### Required Software

- **Node.js** (v18 or higher)

  ```bash
  # Check version
  node --version
  npm --version
  ```

- **Python** (3.9 or higher)

  ```bash
  # Check version
  python --version
  pip --version
  ```

- **Rust** (for Tauri)

  ```bash
  # Install Rust
  curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
  source ~/.cargo/env
  # Check version
  rustc --version
  cargo --version
  ```

- **Ollama** (for local AI models)

  ```bash
  # Linux/macOS
  curl -fsSL https://ollama.ai/install.sh | sh
  # Windows: Download from https://ollama.ai/download
  ```
## Installation Steps

### 1. Install Dependencies

#### Backend Dependencies

```bash
cd backend
pip install -r requirements.txt
```

#### Frontend Dependencies

```bash
cd frontend
npm install
```
### 2. Set Up Ollama and AI Models

- Start the Ollama service:

  ```bash
  ollama serve
  ```

- Pull a coding model (choose one):

  ```bash
  # Recommended for most users (lighter model)
  ollama pull codellama:7b-code
  # For better performance (larger model)
  ollama pull codellama:13b-code
  # Alternative models
  ollama pull deepseek-coder:6.7b
  ollama pull starcoder:7b
  ```

- Verify the model installation (or query Ollama's API directly, as shown below):

  ```bash
  ollama list
  ```
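If you prefer a programmatic check, the following minimal Python sketch queries Ollama's `/api/tags` endpoint (the same endpoint the troubleshooting section below checks with `curl`). It assumes Ollama is listening on its default port, 11434.

```python
# check_models.py - list the models a local Ollama instance reports.
# Standard library only; assumes Ollama's default address.
import json
from urllib.request import urlopen

OLLAMA_BASE_URL = "http://localhost:11434"

with urlopen(f"{OLLAMA_BASE_URL}/api/tags") as response:
    payload = json.load(response)

# Ollama answers with {"models": [{"name": ..., "size": ...}, ...]}
for model in payload.get("models", []):
    print(model["name"])
```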
### 3. Configure Environment

- Create backend environment file:

  ```bash
  cd backend
  cp .env.example .env  # If example exists
  ```

- Edit `.env` file (optional):

  ```env
  OLLAMA_BASE_URL=http://localhost:11434
  DEFAULT_MODEL=codellama:7b-code
  DEBUG=true
  HOST=127.0.0.1
  PORT=8000
  ```
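For orientation, FastAPI backends commonly load variables like these through `pydantic` settings. The sketch below is only illustrative: it assumes the `pydantic-settings` package, and the actual configuration code in `backend/app/core/` may be organized differently.

```python
# Illustrative only: one common way a FastAPI backend reads the .env
# values above. Assumes the pydantic-settings package is installed;
# the real code in backend/app/core/ may differ.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Field names map onto the upper-case variables in .env
    # (ollama_base_url <- OLLAMA_BASE_URL, and so on).
    ollama_base_url: str = "http://localhost:11434"
    default_model: str = "codellama:7b-code"
    debug: bool = False
    host: str = "127.0.0.1"
    port: int = 8000

settings = Settings()
```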
### 4. Install Tauri CLI

```bash
# Install Tauri CLI globally
npm install -g @tauri-apps/cli

# Or use with npx (no global install needed)
npx @tauri-apps/cli --version
```
## Running the Application

### Development Mode

- Start the backend API server:

  ```bash
  cd backend
  python main.py
  ```

  The backend will start on `http://localhost:8000`.

- Start the frontend application (in a new terminal):

  ```bash
  cd frontend
  npm run tauri:dev
  ```

  The Foxus application window should open automatically.
### Production Build

```bash
cd frontend
npm run tauri:build
```

This will create a distributable application in `frontend/src-tauri/target/release/bundle/`.
## Verification

### Test Backend API

```bash
# Check API health
curl http://localhost:8000/health

# Check AI service
curl http://localhost:8000/api/ai/health

# List available models
curl http://localhost:8000/api/models/list
```
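To run all three checks at once, here is a small Python sketch built only on the endpoints listed above. It assumes the backend is reachable at the default `http://localhost:8000` from the `.env` example and uses just the standard library.

```python
# verify_backend.py - probe the three verification endpoints above.
# Assumes the default backend address from the .env example.
from urllib.error import URLError
from urllib.request import urlopen

BASE_URL = "http://localhost:8000"
ENDPOINTS = ["/health", "/api/ai/health", "/api/models/list"]

for path in ENDPOINTS:
    try:
        with urlopen(BASE_URL + path, timeout=5) as response:
            print(f"OK   {path} -> HTTP {response.status}")
    except URLError as exc:
        print(f"FAIL {path} -> {exc}")
```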
### Test Frontend

- Open the Foxus application
- Check for "AI Service Connected" status
- Try opening a file
- Test AI commands using `Ctrl+K` (or `Cmd+K`)
## Troubleshooting

### Common Issues

- **Ollama not running**:

  ```bash
  # Start Ollama service
  ollama serve
  # Check if running
  curl http://localhost:11434/api/tags
  ```

- **Port conflicts**:
  - Backend: Change `PORT` in the `backend/.env` file
  - Frontend: Change the port in `frontend/vite.config.ts`
- **Model not found**:

  ```bash
  # Pull the default model
  ollama pull codellama:7b-code
  # Verify installation
  ollama list
  ```
- **Rust compilation errors**:

  ```bash
  # Update Rust
  rustup update
  # Clear cached build artifacts (Tauri's Rust build lives in src-tauri)
  cd frontend/src-tauri
  cargo clean
  ```
- **Node.js/NPM issues**:

  ```bash
  # Clear npm cache
  npm cache clean --force
  # Delete node_modules and reinstall
  rm -rf node_modules package-lock.json
  npm install
  ```
## Development

### Project Structure

```
foxus/
├── backend/                 # FastAPI Python backend
│   ├── app/
│   │   ├── api/             # API routes
│   │   ├── core/            # Configuration
│   │   ├── models/          # Data models
│   │   └── services/        # Business logic
│   ├── main.py              # Entry point
│   └── requirements.txt     # Dependencies
├── frontend/                # Tauri + React frontend
│   ├── src/
│   │   ├── components/      # React components
│   │   ├── stores/          # State management
│   │   ├── hooks/           # Custom hooks
│   │   └── App.tsx          # Main app
│   ├── src-tauri/           # Tauri Rust backend
│   └── package.json         # Dependencies
└── README.md
```
### Adding New Features

- **Backend API endpoints**: Add to `backend/app/api/` (see the sketch after this list)
- **Frontend components**: Add to `frontend/src/components/`
- **State management**: Use Zustand stores in `frontend/src/stores/`
- **AI commands**: Extend `backend/app/services/ollama_service.py`
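As a starting point, a new backend endpoint might look like the sketch below. The file name, URL prefix, and response shape are illustrative assumptions; check the existing modules in `backend/app/api/` for the project's actual conventions.

```python
# backend/app/api/greeting.py - hypothetical example of a new route.
# Module name, prefix, and response shape are assumptions, not the
# project's actual conventions.
from fastapi import APIRouter

router = APIRouter(prefix="/api/greeting", tags=["greeting"])

@router.get("/hello")
async def say_hello(name: str = "world") -> dict:
    """Minimal endpoint showing where new routes live."""
    return {"message": f"Hello, {name}!"}
```

The router still needs to be registered on the FastAPI app (typically via `app.include_router(router)` in the application entry point) before it is served.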
Keyboard Shortcuts
Ctrl+K
/Cmd+K
: Open command paletteCtrl+S
/Cmd+S
: Save current fileCtrl+Shift+E
: Explain selected codeCtrl+Shift+R
: Refactor selected codeCtrl+Shift+F
: Fix selected code
## Next Steps

- **Customize AI models**: Download and test different models
- **Configure file associations**: Add support for new languages
- **Extend AI commands**: Add custom prompts and commands
- **UI customization**: Modify themes and layouts
## Support
For issues and questions:
- Check the troubleshooting section above
- Review logs in the terminal
- Ensure all prerequisites are installed
- Verify Ollama is running and models are available
## Contributing
- Fork the repository
- Create a feature branch
- Make your changes
- Add tests if applicable
- Submit a pull request
Happy coding with Foxus! 🦊