
Uplink Deployment Guide

This document provides a central reference for the deployment process and initial setup when working with the Uplink codebase.


Table of Contents

  1. Initial Setup - Opening the Codebase
  2. Development Environment Options
  3. Creating a New Branch
  4. Working with Database Changes
  5. Code Changes Best Practices
  6. Submitting Changes for Review
  7. Deployment Process
  8. Branch Merge Strategy
  9. Common Issues and Solutions
  10. Understanding uv in Different Environments
  11. Docker vs Local Development

Initial Setup - Opening the Codebase

Every time you start working on the codebase, follow these steps:

1. Update Your Local Main Branch

git checkout main
git pull origin main

This ensures you have the latest version of the codebase.

2. Choose Your Development Environment

You have two options for running Uplink locally:

Option A: Docker (Recommended for consistency)

./deploy.sh docker-up

Option B: Local Virtual Environment

source .venv/bin/activate

3. Sync Your Local Database

With Docker:

./deploy.sh migrate
# Or directly: docker compose exec web python manage.py migrate

With Local venv:

python manage.py migrate


Development Environment Options

Docker provides a consistent environment that matches production more closely.

Advantages:

  • No need to install uv, Python, MySQL, or Redis locally
  • Consistent environment across all developers
  • Matches production setup
  • Automatic dependency management via uv inside containers

Setup:

# Start all services (web, database, redis, huey, daphne)
./deploy.sh docker-up

# View logs
./deploy.sh docker-logs

# Run Django commands
docker compose exec web python manage.py [command]

# Or use helper script (auto-detects Docker)
./deploy.sh migrate
./deploy.sh check

How uv Works with Docker:

  • uv is installed inside the Docker containers
  • Dependencies are installed automatically when building images
  • You don't need uv installed on your local machine
  • When you update pyproject.toml, rebuild with: ./deploy.sh docker-build

Key Ports:

  • Web application: http://localhost:8001
  • Admin panel: http://localhost:8001/admin
  • MySQL: localhost:3309
  • Redis: localhost:6381
  • Daphne (WebSockets): localhost:9000

Daily Workflow:

# Morning - start containers
./deploy.sh docker-up

# Work on code (files auto-reload)
# Edit files as normal

# Run migrations when needed
./deploy.sh migrate

# Evening - stop containers
./deploy.sh docker-down

Local Virtual Environment (Legacy)

For those who prefer local development without Docker.

Setup:

# Activate virtual environment
source .venv/bin/activate

# Install/update dependencies with uv
uv pip sync

# Run migrations
python manage.py migrate

# Start development server
honcho start

Requirements:

  • uv installed locally: pip install uv
  • Python 3.9+
  • Local MySQL server (port 3307)
  • Local Redis server (port 6379)

Key Points:

  • Requires manual setup of MySQL and Redis
  • Need to install uv on your local machine
  • .env needs to be configured for local services


Creating a New Branch

Option 1: Creating a New Feature Branch

Create a new branch from the updated main:

git checkout -b <branch-name>

Branch naming convention:

  • Use descriptive names: feature/add-contact-favourites, fix/order-deletion-bug, docs/update-deployment-guide
  • Include issue numbers if applicable: 123-add-favourite-contacts
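The convention can also be checked mechanically before pushing; a minimal sketch (the regex and the helper function name are illustrative, not an existing project check):

```shell
#!/bin/sh
# Check a branch name against the naming convention above.
# The regex is an assumption; tune it to the team's actual rules.
check_branch() {
    echo "$1" | grep -Eq '^(feature|fix|docs|db-migration)/[a-z0-9-]+$|^[0-9]+-[a-z0-9-]+$' \
        && echo "$1: ok" \
        || echo "$1: does not match convention"
}

check_branch "feature/add-contact-favourites"
check_branch "123-add-favourite-contacts"
check_branch "myRandomBranch"
```

Running the current branch name (`git rev-parse --abbrev-ref HEAD`) through such a check makes a cheap pre-push hook.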

Option 2: Updating an Existing Branch

If you're continuing work on an existing branch, rebase it with main:

git checkout <your-branch>
git rebase main

Important: Resolve any conflicts that arise during the rebase before continuing.
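The conflict-resolution flow can be sketched end to end; the snippet below builds a throwaway repository purely to demonstrate it (the repo location, branch, and file names are all illustrative):

```shell
#!/bin/sh
# Throwaway demonstration of a rebase conflict and its resolution.
set -e
cd "$(mktemp -d)"
git init -q -b main
git config user.email "dev@example.com"
git config user.name "Dev"

echo "original" > settings.txt
git add settings.txt
git commit -qm "initial commit"

# Diverge: the feature branch and main both edit the same line
git checkout -q -b my-feature
echo "feature version" > settings.txt
git commit -qam "feature edit"
git checkout -q main
echo "main version" > settings.txt
git commit -qam "main edit"

# Rebase the feature branch onto main; a conflict is expected here
git checkout -q my-feature
git rebase main || echo "conflict detected, resolving..."

# Edit the file to remove the <<<<<<< / >>>>>>> markers, stage, continue
echo "reconciled version" > settings.txt
git add settings.txt
GIT_EDITOR=true git rebase --continue

cat settings.txt
```

If the conflict turns out to be a mistake, `git rebase --abort` returns the branch to its pre-rebase state.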


Working with Database Changes

⚠️ Critical Rule: Database Changes Must Be Isolated

Database changes should ALWAYS be on a separate branch from general code changes.

Creating a Database Migration Branch

  1. Create a dedicated branch for database changes:

    git checkout -b db-migration/<description>
    

  2. Make your model changes in the appropriate models.py file

  3. Create the migration:

With Docker:

docker compose exec web python manage.py makemigrations

With Local venv:

python manage.py makemigrations

  4. Review the generated migration file in <app>/migrations/

  5. Test the migration:

With Docker:

docker compose exec web python manage.py migrate

With Local venv:

python manage.py migrate

  6. Commit ONLY the migration files:
    git add <app>/migrations/
    git commit -m "Add migration for <description>"
    

Why Separate Branches?

  • Prevents conflicts between database schema changes and code changes
  • Allows database migrations to be deployed independently
  • Makes rollback easier if issues arise
  • Keeps pull requests focused and easier to review

Code Changes Best Practices

Keep Branches Focused

  • Each branch should address ONE specific feature, bug, or improvement
  • Avoid mixing unrelated changes in the same branch
  • If you notice other issues while working, create separate branches for them

Examples of Well-Scoped Branches

Good:

  • Branch fixes the order deletion cascade issue
  • Branch adds favourite contact functionality
  • Branch updates API documentation

Bad:

  • Branch fixes order deletion, adds contact favourites, updates 5 unrelated files, and reformats CSS

Commit Messages

Write clear, descriptive commit messages:

# Good examples
git commit -m "Fix cascade deletion for service items"
git commit -m "Add auto-linking for product instances in service groups"
git commit -m "Update deployment documentation with rebase instructions"

# Bad examples
git commit -m "fixes"
git commit -m "wip"
git commit -m "stuff"
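A policy like this can be enforced mechanically with a commit-msg hook; below is a minimal sketch of the check such a hook might run (the 10-character threshold and the helper name are assumptions, not project conventions):

```shell
#!/bin/sh
# Minimal commit-message style check: reject trivially short subjects.
# The threshold is an assumption; a real hook might also check tense or prefixes.
check_message() {
    msg="$1"
    if [ "${#msg}" -lt 10 ]; then
        echo "rejected: subject too short"
    else
        echo "accepted"
    fi
}

check_message "Fix cascade deletion for service items"
check_message "wip"
```

Saved as `.git/hooks/commit-msg` (reading the message from `$1`, the message file), this would block "wip"-style commits before they reach the branch.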

Run Tests

Before pushing, run the test suite:

./deploy.sh test

Or with Docker: docker compose exec web python manage.py test

Or with local venv: python manage.py test


Submitting Changes for Review

1. Check Your Changes

Before creating a pull request:

# Check what files have changed
git status

# Review your changes
git diff

# Ensure all tests pass (when we have a proper test suite)
python manage.py test

2. Push Your Branch

git push origin <branch-name>
Then create the pull request via the GitHub site.

3. Create a Pull Request

  1. Go to the GitHub repository
  2. Click "Pull Requests" → "New Pull Request"
  3. Select your branch
  4. Write a clear description covering:
       • What does this PR do?
       • Why is this change needed?
       • How has it been tested?
       • Any special deployment considerations?
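The four questions above can be captured in a pull-request template so every PR starts from them; GitHub reads such a template from `.github/pull_request_template.md` (the wording below is illustrative):

```
## What does this PR do?

## Why is this change needed?

## How has it been tested?

## Any special deployment considerations?
```

With the template in place, the sections appear pre-filled in every new PR description.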

4. Address Review Feedback

  • Respond to all review comments
  • Make requested changes in new commits
  • Push updates to the same branch
  • Re-request review when ready

Deployment Process

Production deployment uses traditional server setup (not Docker currently).

1. Pre-Deployment Checks

On the server (production environment):

# Check current status
git status

# Review recent changes
git log

# Ensure you're on the correct branch (usually main)
git branch

2. Pull Latest Changes

# Pull the latest approved changes
git pull origin main

Important: Only pull changes that have been:

  • Reviewed and approved via pull request
  • Merged into the main branch (or relevant deployment branch)
  • Tested in development/staging environment

3. Run Database Migrations

# Apply any new database migrations (current pipenv setup)
pipenv run ./manage.py migrate

4. Collect Static Files

If there were frontend changes:

pipenv run ./manage.py collectstatic --noinput

5. Install/Update Dependencies

Current production setup (pipenv):

# Install/update dependencies
pipenv install

After migration to uv (see PRODUCTION_MIGRATION.md):

# Activate virtual environment
source .venv/bin/activate

# Sync dependencies from pyproject.toml
uv pip sync

6. Restart Application Services

Restart the application to load the new code:

sudo systemctl restart uplink

7. Post-Deployment Verification

  • Check application logs: sudo journalctl -u uplink -f
  • Test critical functionality
  • Monitor for any issues
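These checks can be partly scripted; a minimal smoke-check sketch (a python http.server stands in for the real application so the example is self-contained; in practice, point BASE_URL at the deployed service and add app-specific endpoints):

```shell
#!/bin/sh
# Post-deployment smoke-check sketch: assert the app answers HTTP at all.
# The stand-in server and port are illustrative only.
set -e
cd "$(mktemp -d)"
python3 -m http.server 8901 --bind 127.0.0.1 >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

BASE_URL="${BASE_URL:-http://127.0.0.1:8901}"
status=$(curl -s -o /dev/null -w "%{http_code}" "$BASE_URL/")
echo "GET / -> $status"

kill "$SERVER_PID"
```

A real check would also probe a health endpoint and fail the deploy (non-zero exit) on anything other than the expected status codes.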

Using the Deployment Script

The deploy.sh helper script simplifies deployment:

# Full deployment (pull, migrate, static, sync deps, restart)
./deploy.sh deploy

# Individual steps
./deploy.sh update      # Pull latest code
./deploy.sh migrate     # Run migrations
./deploy.sh static      # Collect static files
./deploy.sh check       # Check status
./deploy.sh logs        # View logs

**Note:** For migrating production from pipenv to uv, see [PRODUCTION_MIGRATION.md](PRODUCTION_MIGRATION.md)

After deploying:
- Check logs for errors
- Test critical functionality
- Monitor for any issues

---

## Branch Merge Strategy

### Simple Changes
- Create feature branch → PR to `main` → Deploy from `main`

### Complex Multi-Part Deployments

For features requiring multiple coordinated changes:

1. Create a parent integration branch (e.g., `release/v2.0`)
2. Create feature branches off the integration branch
3. PR feature branches into the integration branch
4. Once all features are complete and tested, PR integration branch into `main`
5. Deploy from `main`
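The five steps above can be sketched as plain git commands; the snippet builds a throwaway repository to walk through them (branch and file names are illustrative, and the merges would normally happen via PRs rather than local `git merge`):

```shell
#!/bin/sh
# Sketch of the integration-branch flow in a throwaway repo.
set -e
cd "$(mktemp -d)"
git init -q -b main
git config user.email "dev@example.com"
git config user.name "Dev"
echo base > app.txt
git add app.txt && git commit -qm "base"

git checkout -q -b release/v2.0          # 1. parent integration branch
git checkout -q -b feature/part-one      # 2. feature branch off integration
echo part-one >> app.txt
git commit -qam "feature: part one"

git checkout -q release/v2.0             # 3. PR feature -> integration
git merge -q --no-ff -m "Merge part one" feature/part-one

git checkout -q main                     # 4. once tested: PR integration -> main
git merge -q --no-ff -m "Merge release/v2.0" release/v2.0

git log --oneline                        # 5. main now carries the release; deploy from here
```

The `--no-ff` merges keep an explicit merge commit per feature, which makes rolling back a single part of the release easier.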

---

## Common Issues and Solutions

### Database Conflicts

**Issue:** Migration conflicts after pulling main

**Solution:**
```bash
# Reset migrations if in development
python manage.py migrate <app> zero
python manage.py migrate

# Or resolve manually by editing migration files
```

### Merge Conflicts

**Issue:** Conflicts when rebasing or merging

**Solution:** Resolve the conflicted files, `git add` them, then run `git rebase --continue` (or `git merge --continue`). To back out entirely, use `git rebase --abort`.
Docker Development Commands

# Container management
./deploy.sh docker-up        # Start all containers
./deploy.sh docker-down      # Stop all containers
./deploy.sh docker-restart   # Restart containers
./deploy.sh docker-build     # Rebuild images (after pyproject.toml changes)
./deploy.sh docker-logs      # View container logs
./deploy.sh docker-shell     # Open bash in web container

# Django commands (auto-detects Docker)
./deploy.sh migrate          # Run migrations
./deploy.sh check            # Run system checks
./deploy.sh test             # Run tests
./deploy.sh static           # Collect static files

# Direct Docker commands
docker compose exec web python manage.py migrate
docker compose exec web python manage.py createsuperuser
docker compose exec web python manage.py shell
docker compose ps            # Check container status
docker compose logs -f web   # Follow web container logs

Local Development Commands (Non-Docker)

# Start working
git checkout main
git pull origin main
source .venv/bin/activate
python manage.py migrate
git checkout -b my-feature-branch

# During development
git status                    # Check what's changed
git add <files>              # Stage changes
git commit -m "message"      # Commit changes
git push origin <branch>     # Push to remote

# Database work
python manage.py makemigrations
python manage.py migrate
python manage.py showmigrations

# Update branch with latest main
git checkout main
git pull origin main
git checkout <your-branch>
git rebase main


Understanding uv in Different Environments

### uv with Docker (Development)

When using Docker for development:
- **You don't need uv installed locally**
- uv is installed inside Docker containers via the Dockerfile
- Dependencies are installed automatically when building images
- Updates happen when you rebuild: `./deploy.sh docker-build`

**How it works:**
1. Dockerfile installs uv from GitHub
2. During image build, uv reads `pyproject.toml`
3. uv installs all dependencies at lightning speed
4. Container is ready with all packages installed

**Adding dependencies with Docker:**
```bash
# 1. Edit pyproject.toml and add your package
# 2. Rebuild the Docker images
./deploy.sh docker-build

# 3. Restart containers
./deploy.sh docker-restart
```

uv with Local Development

When running locally without Docker:

  • You need uv installed: pip install uv
  • uv manages your .venv virtual environment
  • Much faster than pip or pipenv for installing packages

Initial setup:

# Create virtual environment
uv venv

# Activate it
source .venv/bin/activate

# Install all dependencies
uv pip install -e ".[dev]"

Adding dependencies locally:

# Edit pyproject.toml and add your package
# Then sync environment
uv pip sync

uv in Production

Once production is migrated (see PRODUCTION_MIGRATION.md), servers use uv for deployment:

  • Installed once on the server: pip install uv
  • Used to sync dependencies during deployment
  • Much faster deployments than pip or pipenv

Production workflow:

git pull origin main
source .venv/bin/activate
uv pip sync              # Fast dependency sync
sudo systemctl restart uplink

Why uv?

  • 10-100x faster than pip/pipenv
  • Reliable: Better dependency resolution
  • Modern: Written in Rust, actively maintained
  • Compatible: Works with standard pyproject.toml
  • Future-proof: Industry is moving toward uv

Docker vs Local Development

When to Use Docker

Use Docker when:

  • You want a quick, consistent setup
  • You don't want to install MySQL/Redis locally
  • You're working on features that need a production-like environment
  • You're new to the project (faster onboarding)
  • You want to test production configurations

When to Use Local Development

Use Local when:

  • You prefer a traditional Python development workflow
  • You need better IDE integration with installed packages
  • You're doing deep debugging with Python debuggers
  • You have MySQL/Redis already installed locally
  • You prefer the speed of native execution

Switching Between Environments

You can switch freely between Docker and local:

Switch to Docker:

# Update .env to use Docker services
DATABASE_HOST=db
REDIS_HOST=redis

# Start Docker
./deploy.sh docker-up

Switch to Local:

# Update .env to use local services
DATABASE_HOST=127.0.0.1
DATABASE_PORT=3307
REDIS_HOST=127.0.0.1

# Start local services
honcho start
Migrating Production from pipenv to uv

Old process (pipenv):

git pull origin main
pipenv install
sudo systemctl restart uplink

New process (uv):

git pull origin main
source .venv/bin/activate
uv pip sync
python manage.py migrate
python manage.py collectstatic --noinput
sudo systemctl restart uplink

Adding New Dependencies

When adding packages after migration to uv:

  1. Locally (development):

    # Add package to pyproject.toml [project.dependencies]
    # Then install
    uv pip install -e ".[dev]"
    

  2. Commit and push:

    git add pyproject.toml
    git commit -m "Add new dependency: package-name"
    git push
    

  3. Deploy to production:

    git pull origin main
    source .venv/bin/activate
    uv pip sync  # Installs new dependencies
    sudo systemctl restart uplink
    

Transition Period: Running Both

During the transition, you can keep both systems available:

  • Development: Use uv (faster, modern)
  • Production: Use pipenv (stable, proven)

Keep both pyproject.toml and Pipfile in sync:

  • When you update one, update the other
  • Test with uv locally, deploy with pipenv to production
  • Switch production to uv when confident


Contact

For questions about the deployment process, contact the development team or refer to:

  • API Documentation
  • Testing Documentation
  • Services Documentation