FastAPI service with Postgres + Alembic migrations and optional local LLM (LM Studio).
- Python 3.11
- Docker (for Postgres)
- (Optional) LM Studio for local LLM server
This repo includes a Makefile that wraps the most common dev commands.
make install
make start

What make start does: db-up (Postgres) → db-upgrade (Alembic) → run (Uvicorn with reload).
make help # list all targets
make venv # create .venv
make install # install backend deps (incl dev)
make db-up # start Postgres (docker compose)
make db-down # stop Postgres
make db-logs # follow Postgres logs
make db-wait # wait for postgres health
make db-upgrade # run Alembic migrations against compose DB
make db-dump # dump data to backups/ directory
make db-restore # restore data from SQL file (requires FILE=...)
make run # run API (reload)
make run-prod # run API (no reload)
make test # run tests
make fmt # ruff format
make lint # ruff lint
make type # mypy
make ci # CI-like: fmt-check + lint + migrate + test

Frontend targets:

make frontend-install
make frontend-dev
make frontend-lint
make frontend-typecheck
make frontend-test
make frontend-build

If you prefer, you can use make start (see above). Manual steps are below.
Start Postgres:

docker compose up -d

Makefile equivalent:
make db-up

Create .env (or export vars in your shell). Minimum:
export DATABASE_URL="postgresql+psycopg://postgres:postgres@localhost:5432/claims_assistant"
export JWT_SECRET="dev-secret"
export ENV="dev"
# Optional: readiness should not depend on LLM during local debugging
export READY_CHECK_LLM="false"
# Optional: policy parser integration
export PARSER_MODE="local"
export PARSER_BASE_URL="http://localhost:8001"

Run migrations:

python -m alembic upgrade head

Makefile equivalent:
make db-upgrade

Run the API:

uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload

Makefile equivalent:
make run

Check health and readiness:

curl -i http://localhost:8000/health
curl -i http://localhost:8000/ready

Deploying to Vercel:

This repo is ready for Vercel Python Serverless Functions via api/index.py. All paths route to the
FastAPI app so /docs and /openapi.json work by default.
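For reference, api/index.py typically just re-exports the FastAPI instance so Vercel's Python runtime can serve it; this is a minimal sketch, and the actual file in this repo may differ:

```python
# api/index.py — minimal sketch; the real file may differ.
# Vercel's Python runtime serves an ASGI application exported as `app`,
# so re-exporting the FastAPI instance routes every path (including
# /docs and /openapi.json) to the same application.
from app.main import app  # noqa: F401
```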
Set the following environment variables for deployment:

- DATABASE_URL (Postgres connection string)
- JWT_SECRET
- ENV=prod
- ADMIN_API_KEY (needed to bootstrap admins)
- CORS_ORIGINS (comma-separated or JSON array)
- LMSTUDIO_BASE_URL, LLM_MODEL, LLM_TIMEOUT_SECONDS, LLM_MAX_STEPS, LLM_TEMPERATURE
- READY_CHECK_LLM (set false to avoid readiness blocking on LLM; see the sketch after this list)
- PARSER_MODE, PARSER_BASE_URL
- CHAT_FILE_LOGS=false (recommended for serverless; defaults to disabled in ENV=prod)
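For context, this is roughly how a readiness endpoint can honor READY_CHECK_LLM. It is a hedged sketch only; the actual readiness logic in this repo may differ:

```python
# Hypothetical /ready handler; the repo's actual readiness checks may differ.
import os
import httpx
from fastapi import FastAPI, Response

app = FastAPI()

@app.get("/ready")
async def ready() -> Response:
    # Database checks would normally run here as well (omitted for brevity).
    if os.getenv("READY_CHECK_LLM", "true").lower() == "true":
        base_url = os.getenv("LMSTUDIO_BASE_URL", "http://127.0.0.1:1234/v1")
        try:
            async with httpx.AsyncClient(timeout=2.0) as client:
                # LM Studio exposes an OpenAI-compatible API, so /models is a cheap ping.
                resp = await client.get(f"{base_url}/models")
                resp.raise_for_status()
        except httpx.HTTPError:
            return Response(status_code=503)
    return Response(status_code=200)
```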
- The SQLAlchemy engine is created with create_engine(...) and opens a connection on first use; use a managed Postgres or a pooler if you see connection churn on cold starts (a sketch of serverless-friendly settings follows below).
- The app avoids writing to disk in ENV=prod; if you enable chat file logs, set CHAT_LOG_DIR to a writable mount.
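A hedged sketch of engine settings that tend to behave well on serverless platforms; the parameter choices are assumptions, not the repo's actual configuration:

```python
# Hypothetical engine setup for serverless cold starts; the repo's real
# create_engine(...) call may use different settings.
import os
from sqlalchemy import create_engine

engine = create_engine(
    os.environ["DATABASE_URL"],
    pool_pre_ping=True,   # re-validate connections a managed pooler may have dropped
    pool_size=1,          # keep the per-instance connection footprint small
    max_overflow=0,       # don't open extra connections under burst load
)
```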
- The PDF parser lives under app/parsers/pdf/ (pdf_parse.py + aetna_eob.py).
- The backend policy adapter is app/parsers/policy/aetna_policy.py and imports dlc_modul parser modules when PARSER_MODE=local.
- Set PARSER_MODE=http to call the external parser service at PARSER_BASE_URL (/api/policy/parse).
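In HTTP mode the adapter presumably POSTs the policy document to the external service; the following is a hedged sketch (the endpoint path comes from this README, but the request and response shape are assumptions):

```python
# Hypothetical client for PARSER_MODE=http; the parser service's actual
# request and response schema may differ.
import os
import httpx

PARSER_BASE_URL = os.environ.get("PARSER_BASE_URL", "http://localhost:8001")

def parse_policy_pdf(pdf_bytes: bytes) -> dict:
    # Send the PDF to the external parser and return its JSON result.
    resp = httpx.post(
        f"{PARSER_BASE_URL}/api/policy/parse",
        files={"file": ("policy.pdf", pdf_bytes, "application/pdf")},
        timeout=60.0,
    )
    resp.raise_for_status()
    return resp.json()
```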
To create a new user with admin privileges, you must use the ADMIN_API_KEY backdoor.
Ensure your .env file (or environment) has the ADMIN_API_KEY variable set.
# .env
# Example (development only):
ADMIN_API_KEY=5b241278440774e6c74d3019bb74f2585d8762b4d66134d17db66b723c8c6709013afc738ef5fa60b685f2bbabd143595dc7751ffb829259041b4526b2d42098

With the server running (e.g., via make run), you can use the create-admin or create-user make targets. They will automatically pick up ADMIN_API_KEY from your .env.
Create an Admin:
make create-admin EMAIL=admin@example.com PASSWORD=strong_password FULL_NAME="System Admin"

Create a Standard User:
make create-user EMAIL=doctor@example.com PASSWORD=secret

Note: If you don't use .env, you can pass the key manually:
make create-admin ADMIN_API_KEY=secret-key ...
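Under the hood, the admin-only endpoints check the X-Admin-Token header against ADMIN_API_KEY. The following is only a hedged sketch of such a dependency, not the repo's actual implementation:

```python
# Hypothetical admin-token dependency; the repo's real check may differ.
import os
import secrets
from fastapi import Depends, FastAPI, Header, HTTPException

app = FastAPI()

def require_admin_token(x_admin_token: str = Header(...)) -> None:
    # Constant-time comparison of the X-Admin-Token header with ADMIN_API_KEY.
    expected = os.environ.get("ADMIN_API_KEY", "")
    if not expected or not secrets.compare_digest(x_admin_token, expected):
        raise HTTPException(status_code=403, detail="invalid admin token")

@app.post("/auth/admin/users", dependencies=[Depends(require_admin_token)])
def create_user(payload: dict) -> dict:
    # ...create the user record (omitted)...
    return {"email": payload["email"]}
```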
- Navigate to /app/admin and open the "Companies & Policies" tab.
- In the Policy Links table:
- Use "Refresh Rules" to parse and review extracted rules. Confirm to store.
- Use "View Rules" to open the policy rules page.
- On the policy rules page:
- Pick an MCP code to browse related policy links.
- Select a policy link to see the latest title, next review date, medical necessity text, criteria hierarchy, and notes.
- If no rules have been parsed yet, the page shows a "No rules parsed yet" message.
The make command will output the JSON response containing the access token. You can verify login:
# Verify login
curl -s -X POST http://localhost:8000/auth/login \
-H "Content-Type: application/json" \
-d '{"email":"admin@example.com","password":"strong_password"}'Create user (admin-only):
export ADMIN_API_KEY="dev-admin-key"
curl -s -X POST http://localhost:8000/auth/admin/users \
-H "Content-Type: application/json" \
-H "X-Admin-Token: dev-admin-key" \
-d '{"email":"doc1@example.com","password":"secret"}'; echoLogin:
TOKEN=$(curl -s -X POST http://localhost:8000/auth/login \
-H "Content-Type: application/json" \
-d '{"email":"doc1@example.com","password":"secret"}' \
| python -c "import sys, json; print(json.load(sys.stdin)['access_token'])")
echo "$TOKEN"Me:
curl -i http://localhost:8000/auth/me \
-H "Authorization: Bearer $TOKEN" \
-H "X-Request-ID: me-1"Minimal chat:
curl -i -X POST http://localhost:8000/chat \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-H "X-Request-ID: chat-1" \
-d '{"message":"Say exactly OK"}'; echoContinue a session:
SESSION_ID="<paste from previous response>"
curl -i -X POST http://localhost:8000/chat \
-H "Authorization: Bearer $TOKEN" \
-H "Content-Type: application/json" \
-H "X-Request-ID: chat-2" \
-d "{\"session_id\":\"$SESSION_ID\",\"message\":\"What did I just ask you to say? Answer in one word.\"}"; echoRun LM Studio server on your LLM machine, e.g.:
- Same machine: http://127.0.0.1:1234/v1
- Another machine on the same Wi-Fi: http://<LM_IP>:1234/v1
Then set:
export LLM_BASE_URL="http://<LM_IP>:1234/v1"

Quick pre-PR checks:

.venv/bin/python -m compileall app
.venv/bin/ruff check app || true
.venv/bin/pytest -q tests/test_claim_ingest_idempotent.py --maxfail=1 || true

Full suite (optional): .venv/bin/pytest -q
GitHub Actions workflow:
- starts Postgres as a service
- runs Alembic migrations and tests
- builds Docker image
- smoke-tests /health and /ready on port 8000
Local CI-like checks:
make ci

Local Docker smoke (build + run + health/ready + stop):
make docker-ci

You can dump the local Docker-based database to the backups/ directory:
make db-dump

To restore from a dump, specify the file path:
make db-restore FILE=backups/dump_20230101_120000.sql

Note: depending on the dump content, this may drop and recreate existing data; use with caution.
If you want next steps:
- add CI badge to README
- split CI into jobs (lint / test / docker-smoke)
- prepare a separate docker-compose.ci.yml