28 changes: 28 additions & 0 deletions .github/PULL_REQUEST_TEMPLATE.md
@@ -0,0 +1,28 @@
<!--
Thank you for contributing! Please fill out the sections below to help maintainers review your change faster.
Delete any sections that don't apply.
-->

# Pull Request

## Summary
[SBP-XXX](https://biocloud.atlassian.net/browse/SBP-XXX) <Provide a short description of the changes in this pull request and the motivation behind them.>

Copilot AI Dec 18, 2025


The placeholder link format uses 'biocloud.atlassian.net', which may be specific to the organization. Ensure this is the correct Jira instance URL for the project and that contributors have access to it.

Suggested change

```diff
-[SBP-XXX](https://biocloud.atlassian.net/browse/SBP-XXX) <Provide a short description of the changes in this pull request and the motivation behind them.>
+<Tracking ticket link (e.g., Jira, GitHub issue) and a short description of the changes in this pull request and the motivation behind them.>
```


## Changes
- <What changed? (files, components, behavior)>
- <Any migration or backwards-incompatible changes?>

## How to Test
<Provide step-by-step instructions for testing the changes locally. Include any setup, commands, or screenshots where helpful.>

## Type of change
- [ ] Bug fix (non-breaking change which fixes an issue)
- [ ] New feature (non-breaking change which adds functionality)
- [ ] Breaking change (fix or feature that would cause existing functionality to change)
- [ ] Documentation update

## Checklist
- [ ] I have added tests that prove my fix is effective or that my feature works
- [ ] I have added or updated documentation where necessary
- [ ] I have run linting and unit tests locally
- [ ] The code follows the project's style guidelines
36 changes: 36 additions & 0 deletions .github/workflows/lint.yml
@@ -0,0 +1,36 @@
name: Lint

on:
push:
branches: [main, workflows]
pull_request:
branches: [main, workflows]
Comment on lines +5 to +7

Copilot AI Dec 18, 2025


The workflow is configured to run on both 'main' and 'workflows' branches. The 'workflows' branch appears to be a temporary development branch. This should typically only run on 'main' and on pull requests to keep the CI configuration cleaner and more maintainable.

Suggested change

```diff
-    branches: [main, workflows]
-  pull_request:
-    branches: [main, workflows]
+    branches: [main]
+  pull_request:
+    branches: [main]
```

Collaborator


This is right: you generally only need CI to run on pushes to main and on pull requests targeting main.


jobs:
ruff:
runs-on: ubuntu-latest
steps:
- name: Checkout
uses: actions/checkout@v4

- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.11"
cache: "pip"
cache-dependency-path: requirements.txt

- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
python -m pip install ruff black mypy
Collaborator


These types of requirements should be in requirements-dev.txt so you can run the same checks locally (or in pyproject.toml as discussed below)
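A `requirements-dev.txt` along these lines would keep CI and local checks in sync. The package list is inferred from the tools this PR's workflows invoke; the pins are illustrative, not taken from the repository:

```text
# requirements-dev.txt — dev/CI tooling (versions illustrative)
ruff==0.7.0
black==24.1.1
mypy==1.8.0
pytest
pytest-cov
coverage
pre-commit
```

The workflow's install step could then run `python -m pip install -r requirements-dev.txt` instead of listing tools inline, so local runs and CI use identical versions.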

- name: Run Ruff
run: ruff check app tests

- name: Run Black
run: black --check app tests

- name: Run MyPy
run: mypy app --ignore-missing-imports
69 changes: 69 additions & 0 deletions .github/workflows/test-coverage.yml
@@ -0,0 +1,69 @@
name: Coverage

on:
push:
branches: ["main"]
pull_request:
branches: ["main"]

jobs:
tests:
runs-on: ubuntu-latest
strategy:
matrix:
python-version: ["3.10", "3.11", "3.12"]

steps:
- name: Checkout
uses: actions/checkout@v4

- name: Set up Python ${{ matrix.python-version }}
uses: actions/setup-python@v5
with:
python-version: ${{ matrix.python-version }}
cache: "pip"
cache-dependency-path: requirements.txt

- name: Install dependencies
run: |
python -m pip install --upgrade pip
python -m pip install -r requirements.txt
python -m pip install -r requirements-dev.txt

- name: Run tests with coverage
env:
ALLOWED_ORIGINS: http://localhost
SEQERA_API_URL: https://example.com/api
SEQERA_ACCESS_TOKEN: test-token
WORK_SPACE: demo-workspace
COMPUTE_ID: compute-123
WORK_DIR: /tmp/work
run: |
pytest --cov=app --cov-report=xml --cov-report=term-missing --cov-report=html -v

- name: Check coverage threshold (90%)
run: |
coverage report --fail-under=90

- name: Upload coverage to Codecov
uses: codecov/codecov-action@v4
if: matrix.python-version == '3.11'
with:
file: ./coverage.xml
flags: unittests
name: codecov-umbrella
fail_ci_if_error: false
Comment on lines +48 to +55

Copilot AI Dec 18, 2025


The Codecov action is using v4, but there's no token provided in the 'with' section. Codecov v4 typically requires a token for authentication. Either add the token using `token: ${{ secrets.CODECOV_TOKEN }}` or verify that the repository is public and tokenless upload is intended.

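A sketch of the tokened upload step, assuming a `CODECOV_TOKEN` repository secret has been configured (the secret name is an assumption; any name works as long as it matches):

```yaml
- name: Upload coverage to Codecov
  uses: codecov/codecov-action@v4
  if: matrix.python-version == '3.11'
  with:
    token: ${{ secrets.CODECOV_TOKEN }}  # required for private repos on v4
    file: ./coverage.xml
    flags: unittests
    fail_ci_if_error: false
```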

- name: Coverage summary
uses: irongut/CodeCoverageSummary@v1.3.0
if: matrix.python-version == '3.11'
with:
filename: coverage.xml
badge: true
fail_below_min: true
format: markdown
hide_branch_rate: false
hide_complexity: true
indicators: true
thresholds: "90 90"
output: both
32 changes: 32 additions & 0 deletions .pre-commit-config.yaml
@@ -0,0 +1,32 @@
# Pre-commit hooks configuration
repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-yaml
- id: check-added-large-files
- id: check-json
- id: check-toml
- id: check-merge-conflict
- id: debug-statements

- repo: https://github.com/astral-sh/ruff-pre-commit
rev: v0.1.15

Copilot AI Dec 18, 2025


The ruff-pre-commit version v0.1.15 is outdated; more recent releases (v0.7.x or later) are available with additional features and bug fixes. Update to a newer version for better linting capabilities.

Suggested change

```diff
-    rev: v0.1.15
+    rev: v0.7.0
```

hooks:
- id: ruff
args: [--fix, --exit-non-zero-on-fix]

- repo: https://github.com/psf/black
rev: 24.1.1
hooks:
- id: black
language_version: python3.11

- repo: https://github.com/pre-commit/mirrors-mypy
rev: v1.8.0
hooks:
- id: mypy
additional_dependencies: [types-httpx]
args: [--ignore-missing-imports]
60 changes: 58 additions & 2 deletions README.md
@@ -1,10 +1,14 @@
# SBP Portal Backend Server

![Lint](https://github.com/AustralianBioCommons/sbp-backend/actions/workflows/lint.yml/badge.svg)
![Coverage](https://github.com/AustralianBioCommons/sbp-backend/actions/workflows/test-coverage.yml/badge.svg)
[![codecov](https://codecov.io/gh/AustralianBioCommons/sbp-backend/branch/main/graph/badge.svg)](https://codecov.io/gh/AustralianBioCommons/sbp-backend)

FastAPI backend for handling Seqera Platform workflow launches.

## Prerequisites

- Python 3.9+ (matching the version used by your deployment target)
- Python 3.10+ (matching the version used by your deployment target)
- [uvicorn](https://www.uvicorn.org/) and other dependencies listed in `requirements.txt`

## Setup
@@ -22,7 +26,13 @@ FastAPI backend for handling Seqera Platform workflow launches.
pip install -r requirements.txt
```

3. Configure environment variables:
3. Install development dependencies (for testing and linting):

```bash
pip install -r requirements-dev.txt
```

4. Configure environment variables:

```bash
cp .env.example .env
@@ -46,6 +56,52 @@ FastAPI backend for handling Seqera Platform workflow launches.
- `GET /api/workflows/{runId}/details` — Placeholder details endpoint
- `POST /api/workflows/datasets/upload` — Create a Seqera dataset and upload submitted form data as a CSV

## Testing

Run the test suite with coverage:

```bash
# Run all tests with coverage report
pytest --cov=app --cov-report=term-missing --cov-report=html

# Run tests with verbose output
pytest -v

# Run specific test file
pytest tests/test_main.py

# Check coverage threshold (90%)
coverage report --fail-under=90
```

View HTML coverage report:

```bash
open htmlcov/index.html # macOS
xdg-open htmlcov/index.html # Linux
start htmlcov/index.html # Windows (Command Prompt / PowerShell)
```

## Linting and Code Quality

```bash
# Run ruff linter
ruff check app tests

# Run black formatter
black app tests

# Run type checking with mypy
mypy app --ignore-missing-imports

# Install pre-commit hooks
pip install pre-commit
pre-commit install

# Run pre-commit on all files
pre-commit run --all-files
```

## Environment Variables

Required entries in `.env`:
5 changes: 4 additions & 1 deletion app/main.py
@@ -1,4 +1,5 @@
"""FastAPI application entry point for the SBP Portal backend."""

from __future__ import annotations

import logging
@@ -29,7 +30,9 @@ def create_app() -> FastAPI:
if not allowed_origins_env:
raise RuntimeError("ALLOWED_ORIGINS environment variable is required but not set")

allowed_origins = [origin.strip() for origin in allowed_origins_env.split(",") if origin.strip()]
allowed_origins = [
origin.strip() for origin in allowed_origins_env.split(",") if origin.strip()
]

app.add_middleware(
CORSMiddleware,
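The reformatted comprehension above can be sketched in isolation: a comma-separated `ALLOWED_ORIGINS` value is split, whitespace-stripped, and empty entries dropped. The env value below is illustrative, not from the project's configuration:

```python
# Standalone sketch of the ALLOWED_ORIGINS parsing shown in the diff.
allowed_origins_env = " http://localhost:3000, https://portal.example.com ,,  "

# Split on commas, strip surrounding whitespace, and skip empty fragments
# (trailing commas and double commas are tolerated).
allowed_origins = [
    origin.strip() for origin in allowed_origins_env.split(",") if origin.strip()
]

print(allowed_origins)
# ['http://localhost:3000', 'https://portal.example.com']
```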
37 changes: 17 additions & 20 deletions app/routes/workflows.py
@@ -1,9 +1,9 @@
"""Workflow-related HTTP routes."""

from __future__ import annotations

import asyncio
from datetime import datetime, timezone
from typing import Optional

from fastapi import APIRouter, HTTPException, Query, status

@@ -36,26 +36,23 @@ async def launch_workflow(payload: WorkflowLaunchPayload) -> WorkflowLaunchRespo
"""Launch a workflow on the Seqera Platform."""
try:
dataset_id = payload.datasetId

# If formData is provided, create and upload dataset
if payload.formData:
dataset_result = await create_seqera_dataset(
name=payload.launch.runName or "workflow-dataset"
)
dataset_id = dataset_result.dataset_id

await upload_dataset_to_seqera(
dataset_id=dataset_id,
form_data=payload.formData
)

result: SeqeraLaunchResult = await launch_seqera_workflow(
payload.launch, dataset_id
)

await upload_dataset_to_seqera(dataset_id=dataset_id, form_data=payload.formData)

result: SeqeraLaunchResult = await launch_seqera_workflow(payload.launch, dataset_id)
except SeqeraConfigurationError as exc:
raise HTTPException(status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(exc))
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(exc)
) from exc
except SeqeraServiceError as exc:
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc))
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc

return WorkflowLaunchResponse(
message="Workflow launched successfully",
@@ -77,8 +74,8 @@ async def cancel_workflow(run_id: str) -> CancelWorkflowResponse:

@router.get("/runs", response_model=ListRunsResponse)
async def list_runs(
status_filter: Optional[str] = Query(None, alias="status"),
workspace: Optional[str] = Query(None),
status_filter: str | None = Query(None, alias="status"),
workspace: str | None = Query(None),
limit: int = Query(50, ge=1, le=200),
offset: int = Query(0, ge=0),
) -> ListRunsResponse:
@@ -143,23 +140,23 @@ async def upload_dataset(payload: DatasetUploadRequest) -> DatasetUploadResponse
except SeqeraConfigurationError as exc:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(exc)
)
) from exc
except SeqeraServiceError as exc:
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc))
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc

# Allow Seqera time to finish dataset initialization before uploading
await asyncio.sleep(2)

try:
upload_result = await upload_dataset_to_seqera(dataset.dataset_id, payload.formData)
except ValueError as exc:
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc))
raise HTTPException(status_code=status.HTTP_400_BAD_REQUEST, detail=str(exc)) from exc
except SeqeraConfigurationError as exc:
raise HTTPException(
status_code=status.HTTP_500_INTERNAL_SERVER_ERROR, detail=str(exc)
)
) from exc
except SeqeraServiceError as exc:
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc))
raise HTTPException(status_code=status.HTTP_502_BAD_GATEWAY, detail=str(exc)) from exc

return DatasetUploadResponse(
message="Dataset created and uploaded successfully",
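The repeated `from exc` additions in this file preserve the causal traceback when wrapping Seqera errors in `HTTPException`. A minimal illustration of why that matters (the names here are hypothetical stand-ins, not the project's classes):

```python
class ServiceError(Exception):
    """Stand-in for SeqeraServiceError in this sketch."""

def call_upstream() -> None:
    try:
        raise ValueError("upstream failed")
    except ValueError as exc:
        # `raise ... from exc` records the original error on __cause__, so
        # tracebacks show "The above exception was the direct cause of ...".
        raise ServiceError("wrapped for the API layer") from exc

try:
    call_upstream()
except ServiceError as err:
    print(type(err.__cause__).__name__)  # ValueError
```

Without `from exc`, the original exception would still appear implicitly ("During handling of the above exception, another exception occurred"), but explicit chaining makes the intent clear and survives re-raising through middleware.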