
COFFEE provides students with feedback on their answers to free-text questions. This feedback is based on criteria specified by teachers. This means that COFFEE knows what is important in the exam.


COFFEE - Corrective Formative Feedback

AI-powered feedback system for educational institutions using Django and Large Language Models.

(Screenshot: COFFEE's start page)

Quick Demo with Docker Compose

Try COFFEE instantly with a single command! Download the docker-compose.demo.yml file and run:

docker compose -f docker-compose.demo.yml up

Or use this one-liner (macOS/Linux):

curl -O https://raw.githubusercontent.com/hansesm/coffee/main/docker-compose.demo.yml && docker compose -f docker-compose.demo.yml up

Windows (PowerShell):

Invoke-WebRequest -Uri https://raw.githubusercontent.com/hansesm/coffee/main/docker-compose.demo.yml -OutFile docker-compose.demo.yml; docker compose -f docker-compose.demo.yml up

This spins up PostgreSQL, Ollama (with the phi4 model), and the app itself using the pre-built image ghcr.io/hansesm/coffee:latest. On startup, migrations run automatically, default users are created, and demo data is imported.

Note: The phi4 model download can take a while. Ollama may run slowly or time out when running in Docker. You can adjust the request_timeout setting in the Admin Panel to prevent timeouts.

Access the app at http://localhost:8000.

To tear everything down:

docker compose -f docker-compose.demo.yml down -v

Important: Restarting the demo reruns the migrations and will likely fail, so this compose file is meant strictly for a one-off demo environment.

Getting Started

  Prerequisites: install uv.

  1. Clone and setup

    git clone <repository-url>
    cd COFFEE
    uv venv --python 3.13
    uv sync
  2. Configure environment

    cp .env.example .env
    # Edit .env with your settings
  3. Set up the database. Without the DATABASE_URL environment variable, Django creates a SQLite database:

    uv run task migrate
    uv run task create-groups

    If you want to use a PostgreSQL database, you can spin it up with Docker Compose:

    docker compose up -d 
    uv run task migrate
    uv run task create-groups
  4. Run

    uv run task server
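If you prefer a single connection string over the individual DB_* variables, DATABASE_URL (mentioned in step 3) conventionally follows the scheme://user:password@host:port/name form used by dj-database-url-style loaders. This is a hypothetical example, not values shipped with this repo:

```shell
# Hypothetical credentials - substitute your own
export DATABASE_URL=postgres://coffee:secret@localhost:5432/coffee
```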

Optional: Local Ollama Setup for Development

  1. Install Ollama

  2. Start the Ollama service

    • After installation the daemon normally starts automatically. If it is not running, start it in the foreground with:
      ollama serve
      (If the daemon is already running in the background, this command reports that the address is in use; press Ctrl+C to stop a foreground instance.)
  3. Download a model

    ollama pull phi4
  4. Test the model locally

    ollama run phi4

    The default API endpoint is available at http://localhost:11434.

  5. Register Ollama in Django Admin

    • Sign in at <BASE_URL>/admin.
    • Go to LLM Providers → Add, pick Ollama, set the host (e.g. http://localhost:11434), and save.
    • Go to LLM Models → Add, select the newly created Ollama provider, enter the model name (e.g. phi4), choose a display name, and save.
    • The provider and model can now be assigned to tasks and criteria inside the app.
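Before registering the provider in the admin, you can check that the endpoint answers. Ollama exposes a small REST API, and GET /api/tags lists the locally installed models (assuming the default port):

```shell
# Prints the installed models as JSON, or a hint if the daemon is down
curl -s http://localhost:11434/api/tags || echo "Ollama is not reachable on localhost:11434"
```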

Optional: Populate Database with Demo Data

uv run task import-demo-data

Configuration

All configuration is environment-based. Copy .env.example to .env and customize:

Required Settings

# Django (REQUIRED)
SECRET_KEY=your-secret-key-here  
DEBUG=True
DB_PASSWORD=<YOUR_DB_PASSWORD>
DB_USERNAME=<user>
DB_HOST=<host>
DB_PORT=<port>
DB_NAME=<db>
DB_PROTOCOL=<postgres|sqlite>
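For SECRET_KEY, any long random string works; a quick way to generate one with the Python standard library (no Django required):

```shell
# Prints a ~67-character URL-safe random string
python3 -c "import secrets; print(secrets.token_urlsafe(50))"
```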

Custom LLM Providers

You can add your own LLM Providers and LLM Models in the Django Admin Panel (<BASE_URL>/admin).

Currently supported LLM providers are Ollama, Azure AI, and Azure OpenAI (see LLM Backends below).

Contributions for additional providers such as LiteLLM, AWS Bedrock, Hugging Face, and others are very welcome! 🚀

LLM Backends

Add providers and models in the Django admin under LLM Providers / LLM Models. Each backend needs different connection details:

  • Ollama – Set Endpoint to your Ollama host (e.g. http://ollama.local:11434 or http://localhost:11434). Leave the API key empty unless you enabled token auth; optional TLS settings live in the JSON config.
  • Azure AI – Use the Inference endpoint that already includes the deployment segment, for example https://<azure-resource>/openai/deployments/<deployment>. Add the matching API key.
  • Azure OpenAI – Point Endpoint to the service base URL like https://<azure-resource>.cognitiveservices.azure.com/. Add the matching API key.

Default Login Credentials

After running python manage.py create_users_and_groups, use these credentials:

  • Admin: username admin, password reverence-referee-lunchbox
  • Manager: username manager, password expediter-saline-untapped

Usage

  1. Admin: Create courses, tasks, and criteria at /admin/
  2. Students: Submit work and receive AI feedback
  3. Analysis: View feedback analytics and export data

Docker Deployment

docker build -t coffee .
docker run -p 8000:8000 --env-file .env coffee  # On Windows add '--network host'

Podman Deployment (RedHat/RHEL)

For RedHat Enterprise Linux systems using Podman:

# Install podman-compose if not already installed
sudo dnf install podman-compose

# Copy and configure environment
cp .env.example .env
# Edit .env with your actual configuration values

# Deploy with podman-compose
podman-compose -f podman-compose.yaml up -d

# Create initial users and database schema
podman exec -it coffee_app python manage.py migrate
podman exec -it coffee_app python manage.py create_users_and_groups

# Access the application
curl http://localhost:8000

Useful Podman commands:

# View logs
podman-compose logs -f coffee_app

# Stop services
podman-compose down

# Rebuild and restart
podman-compose up -d --build

Credits

This project was developed with assistance from Claude Code, Anthropic's AI coding assistant.

License

See LICENSE.md for details.
