# Foxus Setup Guide

This guide will walk you through setting up **Foxus**, a local-first AI coding assistant.

## Prerequisites

Before starting, ensure you have the following installed:

### Required Software
1. **Node.js** (v18 or higher)

   ```bash
   # Check version
   node --version
   npm --version
   ```

2. **Python** (3.9 or higher)

   ```bash
   # Check version
   python --version
   pip --version
   ```

3. **Rust** (for Tauri)

   ```bash
   # Install Rust
   curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
   source ~/.cargo/env

   # Check version
   rustc --version
   cargo --version
   ```

4. **Ollama** (for local AI models)

   ```bash
   # Linux/macOS
   curl -fsSL https://ollama.ai/install.sh | sh

   # Windows: Download from https://ollama.ai/download
   ```
## Installation Steps

### 1. Install Dependencies

#### Backend Dependencies

```bash
cd backend
pip install -r requirements.txt
```

#### Frontend Dependencies

```bash
cd frontend
npm install
```
### 2. Set Up Ollama and AI Models

1. **Start the Ollama service**:

   ```bash
   ollama serve
   ```

2. **Pull a coding model** (choose one):

   ```bash
   # Recommended for most users (lighter model)
   ollama pull codellama:7b-code

   # Higher-quality completions (larger, slower model)
   ollama pull codellama:13b-code

   # Alternative models
   ollama pull deepseek-coder:6.7b
   ollama pull starcoder:7b
   ```

3. **Verify model installation**:

   ```bash
   ollama list
   ```
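Once `ollama list` shows your model, you can optionally smoke-test it over Ollama's HTTP API. This is a minimal sketch using only the Python standard library; `/api/generate` is Ollama's documented endpoint, and the model name assumes you pulled `codellama:7b-code`:

```python
import json
import urllib.request

# One-off, non-streaming completion request against the local Ollama
# server (default port 11434). Assumes codellama:7b-code was pulled.
payload = {
    "model": "codellama:7b-code",
    "prompt": "# Python function that reverses a string\n",
    "stream": False,  # return a single JSON object instead of a stream
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req, timeout=120) as resp:
    print(json.load(resp)["response"])  # the generated completion text
```

If this prints a code snippet, the model is loaded and responding.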
### 3. Configure Environment

1. **Create backend environment file**:

   ```bash
   cd backend
   cp .env.example .env  # If example exists
   ```

2. **Edit `.env` file** (optional):

   ```env
   OLLAMA_BASE_URL=http://localhost:11434
   DEFAULT_MODEL=codellama:7b-code
   DEBUG=true
   HOST=127.0.0.1
   PORT=8000
   ```
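For context, these keys typically map onto a settings object in the backend's configuration module (`backend/app/core/`). Below is a minimal sketch of how they might be loaded, assuming `pydantic-settings` is available; the class and field names are illustrative assumptions, not the project's actual code:

```python
# Hypothetical settings loader mirroring the .env keys above.
from pydantic_settings import BaseSettings, SettingsConfigDict

class Settings(BaseSettings):
    model_config = SettingsConfigDict(env_file=".env")

    # Field names match the .env keys (matching is case-insensitive).
    ollama_base_url: str = "http://localhost:11434"
    default_model: str = "codellama:7b-code"
    debug: bool = True
    host: str = "127.0.0.1"
    port: int = 8000

settings = Settings()  # reads .env, falling back to the defaults above
```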
### 4. Install Tauri CLI

```bash
# Install Tauri CLI globally
npm install -g @tauri-apps/cli

# Or use with npx (no global install needed)
npx @tauri-apps/cli --version
```
## Running the Application

### Development Mode

1. **Start the backend API server**:

   ```bash
   cd backend
   python main.py
   ```

   The backend will start on `http://localhost:8000`.

2. **Start the frontend application** (in a new terminal):

   ```bash
   cd frontend
   npm run tauri:dev
   ```

The Foxus application window should open automatically.
### Production Build

```bash
cd frontend
npm run tauri:build
```

This will create a distributable application in `frontend/src-tauri/target/release/bundle/`.
## Verification

### Test Backend API

```bash
# Check API health
curl http://localhost:8000/health

# Check AI service
curl http://localhost:8000/api/ai/health

# List available models
curl http://localhost:8000/api/models/list
```
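If you would rather script these checks, here is a small sketch using only the Python standard library; the URLs are the ones listed above (adjust the port if you changed it in `.env`):

```python
import urllib.request

# The three verification endpoints from this guide.
ENDPOINTS = [
    "http://localhost:8000/health",
    "http://localhost:8000/api/ai/health",
    "http://localhost:8000/api/models/list",
]

for url in ENDPOINTS:
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            print(f"OK   {url} (HTTP {resp.status})")
    except OSError as exc:  # URLError is a subclass of OSError
        print(f"FAIL {url} ({exc})")
```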
### Test Frontend

- Open the Foxus application
- Check for "AI Service Connected" status
- Try opening a file
- Test AI commands using Ctrl+K (or Cmd+K)
## Troubleshooting

### Common Issues

1. **Ollama not running**:

   ```bash
   # Start Ollama service
   ollama serve

   # Check if running
   curl http://localhost:11434/api/tags
   ```

2. **Port conflicts**:

   - Backend: Change `PORT` in the backend `.env` file
   - Frontend: Change the port in `frontend/vite.config.ts`

3. **Model not found**:

   ```bash
   # Pull the default model
   ollama pull codellama:7b-code

   # Verify installation
   ollama list
   ```

4. **Rust compilation errors**:

   ```bash
   # Update Rust
   rustup update

   # Clear Tauri build artifacts
   cd frontend/src-tauri
   cargo clean
   ```

5. **Node.js/npm issues**:

   ```bash
   # Clear npm cache
   npm cache clean --force

   # Delete node_modules and reinstall
   rm -rf node_modules package-lock.json
   npm install
   ```
## Development

### Project Structure

```
foxus/
├── backend/                 # FastAPI Python backend
│   ├── app/
│   │   ├── api/             # API routes
│   │   ├── core/            # Configuration
│   │   ├── models/          # Data models
│   │   └── services/        # Business logic
│   ├── main.py              # Entry point
│   └── requirements.txt     # Dependencies
├── frontend/                # Tauri + React frontend
│   ├── src/
│   │   ├── components/      # React components
│   │   ├── stores/          # State management
│   │   ├── hooks/           # Custom hooks
│   │   └── App.tsx          # Main app
│   ├── src-tauri/           # Tauri Rust backend
│   └── package.json         # Dependencies
└── README.md
```
### Adding New Features

1. **Backend API endpoints**: Add to `backend/app/api/` (see the sketch below)
2. **Frontend components**: Add to `frontend/src/components/`
3. **State management**: Use Zustand stores in `frontend/src/stores/`
4. **AI commands**: Extend `backend/app/services/ollama_service.py`
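As an example of the first point, a new backend endpoint is just another FastAPI router module. This is a minimal sketch assuming the conventional FastAPI layout; the file name, route path, and wiring shown in the comments are illustrative assumptions, not existing project code:

```python
# backend/app/api/hello.py (hypothetical example module)
from fastapi import APIRouter

router = APIRouter()

@router.get("/api/hello")
async def hello() -> dict:
    """Tiny placeholder endpoint; swap in real logic."""
    return {"message": "Hello from Foxus"}

# To expose the route, include the router where the app is created,
# e.g. in backend/main.py:
#   from app.api.hello import router as hello_router
#   app.include_router(hello_router)
```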
### Keyboard Shortcuts

- `Ctrl+K` / `Cmd+K`: Open command palette
- `Ctrl+S` / `Cmd+S`: Save current file
- `Ctrl+Shift+E`: Explain selected code
- `Ctrl+Shift+R`: Refactor selected code
- `Ctrl+Shift+F`: Fix selected code
## Next Steps

1. **Customize AI models**: Download and test different models
2. **Configure file associations**: Add support for new languages
3. **Extend AI commands**: Add custom prompts and commands
4. **UI customization**: Modify themes and layouts

## Support

For issues and questions:

1. Check the troubleshooting section above
2. Review logs in the terminal
3. Ensure all prerequisites are installed
4. Verify Ollama is running and models are available

## Contributing

1. Fork the repository
2. Create a feature branch
3. Make your changes
4. Add tests if applicable
5. Submit a pull request

Happy coding with Foxus! 🦊