Overview

The local development setup runs PostgreSQL and Redis inside Docker containers while you run the backend (FastAPI) and frontend (Next.js) natively on your machine. This gives you fast hot-reload for both services and direct debugger access.

Prerequisites

Tool             Version   Install
Docker           20.10+    docker.com
Docker Compose   v2+       Included with Docker Desktop
Python           3.11+     python.org
Node.js          22+       nodejs.org
Git              2.x       git-scm.com

Step-by-Step Setup

Step 1: Clone the Repository

git clone https://github.com/nadoo-ai/nadoo-ai.git
cd nadoo-ai
git submodule init
git submodule update

Step 2: Start Docker Services

Launch PostgreSQL and Redis:
docker-compose -f infrastructure/docker-compose.local.yml up -d
This starts:
  • PostgreSQL 15 with pgvector on port 5432
  • Redis 7 on port 6379

Step 3: Install Backend Dependencies

cd packages/backend
pip install -r requirements.txt
We recommend using a Python virtual environment (venv or conda) to avoid dependency conflicts.

Step 4: Configure Environment Variables

Copy the example environment file and adjust as needed:
cp .env.example .env
The defaults work out of the box with the local Docker Compose setup. See Environment Variables for the full reference.
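
For orientation, the entries most relevant to this setup look roughly like the fragment below. DATABASE_URL is the key referenced later in this guide; the Redis key name is an assumption here, so confirm the exact keys against .env.example:

```
# Assumed local defaults; confirm key names against .env.example
DATABASE_URL=postgresql://postgres:4432646294A404D6351@localhost:5432/nadoo_db
REDIS_URL=redis://localhost:6379/0   # key name illustrative
```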

Step 5: Run Database Migrations

alembic upgrade head
This creates all required tables in PostgreSQL.

Step 6: Start the Backend

uvicorn src.main:app --reload --port 8000
The API server starts at http://localhost:8000 with auto-reload enabled.

Step 7: Start the Frontend

Open a new terminal:
cd packages/frontend
npm install
npm run dev
The frontend starts at http://localhost:3000.

Quick Start Script

Alternatively, use the all-in-one start script that handles all of the above steps:
# From the project root
npm run start
Or:
./scripts/start-local.sh

Service URLs

Once everything is running, access the following:
Service       URL                               Description
Frontend      http://localhost:3000             Nadoo AI web application
Backend API   http://localhost:8000             FastAPI REST API
Swagger UI    http://localhost:8000/api/docs    Interactive API documentation
ReDoc         http://localhost:8000/api/redoc   Alternative API documentation

Database Credentials

The local Docker Compose file uses the following default credentials:
Parameter   Value
Host        localhost
Port        5432
User        postgres
Password    4432646294A404D6351
Database    nadoo_db
Connection string:
postgresql://postgres:4432646294A404D6351@localhost:5432/nadoo_db
These are default development credentials. Never use them in production. See Environment Variables for guidance on setting secure credentials.
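
If a helper script needs the individual credential fields rather than the full URL, the connection string decomposes with the standard library; a minimal sketch:

```python
from urllib.parse import urlsplit

url = "postgresql://postgres:4432646294A404D6351@localhost:5432/nadoo_db"
parts = urlsplit(url)

print(parts.username)          # user
print(parts.hostname)          # host
print(parts.port)              # port
print(parts.path.lstrip("/"))  # database name
```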

Troubleshooting

Port already in use

Another service is using the port. Either stop the conflicting service or change the port mapping in docker-compose.local.yml:
ports:
  - "15432:5432"  # Map to a different host port
Then update DATABASE_URL in your .env file to use the new port.
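
If you just need an unused host port for the remapping above, one quick way (an illustrative sketch, not part of the project tooling) is to ask the OS for one:

```python
import socket

def find_free_port() -> int:
    """Ask the kernel for an unused TCP port on the loopback interface."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
        s.bind(("127.0.0.1", 0))  # port 0 = let the OS pick
        return s.getsockname()[1]

print(find_free_port())
```

Use the printed number as the host side of the mapping, e.g. "PORT:5432".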

Containers will not start

Check that Docker is running:
docker info
If you use Docker Desktop, ensure it is started. On Linux, verify the Docker daemon:
sudo systemctl status docker

Database migrations fail

Ensure PostgreSQL is fully started before running migrations. You can check:
docker-compose -f infrastructure/docker-compose.local.yml logs postgres
Wait for the message "database system is ready to accept connections" before running alembic upgrade head.
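
If you script the setup, you can poll the port instead of tailing logs. This is a sketch only: a successful TCP connection confirms PostgreSQL is listening, not that initialization has fully finished.

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll a TCP port until it accepts connections or the timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)  # not up yet; retry
    return False

if wait_for_port("localhost", 5432):
    print("PostgreSQL port is open")
```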

Frontend cannot reach the backend

Verify the backend is running on port 8000 and that your frontend .env file sets the correct API URL:
NEXT_PUBLIC_API_URL=http://localhost:8000

Backend fails with a Python version error

Ensure you are using Python 3.11 or later:
python --version
If you have multiple Python versions installed, invoke python3.11 explicitly or create a virtual environment with the correct version.
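
To fail fast with a readable message instead of a cryptic syntax error, a version guard near the entrypoint can help; a minimal sketch (the project may already include something similar):

```python
import sys

def check_python(minimum=(3, 11)):
    """Return True when the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= minimum

if not check_python():
    print(f"warning: Python 3.11+ required, found {sys.version.split()[0]}")
```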

Stopping Services

# Stop Docker containers
docker-compose -f infrastructure/docker-compose.local.yml down

# Stop with volume cleanup (removes database data)
docker-compose -f infrastructure/docker-compose.local.yml down -v
Using down -v will delete all data in PostgreSQL and Redis. Only use this when you want a clean slate.