81 changes: 81 additions & 0 deletions CHANGELOG.md
@@ -0,0 +1,81 @@
# Changelog

All notable changes to the Infobús project will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [Unreleased]

### Added - Storage and Data Access Layer (feat/storage-reading-dal)

#### Storage Layer
- **Data Access Layer (DAL)** with the repository pattern for GTFS schedule data (sketched after this list)
  - `ScheduleRepository` interface defining the contract for schedule data access
  - `PostgresScheduleRepository` implementation using the Django ORM
  - `CachedScheduleRepository` decorator for Redis caching with a configurable TTL
  - `RedisCacheProvider` for cache operations
  - Factory pattern (`get_schedule_repository()`) for obtaining configured repository instances
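
A minimal sketch of how these pieces might compose. The class and function names come from this changelog; the method signature, constructor arguments, and the stubbed bodies are illustrative assumptions, not the actual implementation:

```python
from __future__ import annotations

import os
from abc import ABC, abstractmethod
from datetime import date, time


class ScheduleRepository(ABC):
    """Contract for reading scheduled departures."""

    @abstractmethod
    def next_departures(self, feed_id: str, stop_id: str, service_date: date,
                        from_time: time, limit: int) -> list[dict]: ...


class PostgresScheduleRepository(ScheduleRepository):
    """Django ORM implementation; the actual query is elided in this sketch."""

    def next_departures(self, feed_id, stop_id, service_date, from_time, limit):
        raise NotImplementedError("real version queries StopTime via the ORM")


class RedisCacheProvider:
    """Thin wrapper over a Redis client (stubbed here)."""

    def get(self, key):
        return None  # stub: real version deserializes the cached value

    def set(self, key, value, ttl):
        pass  # stub: real version stores the value with an expiry


class CachedScheduleRepository(ScheduleRepository):
    """Decorator adding read-through caching to any ScheduleRepository."""

    def __init__(self, inner: ScheduleRepository, cache: RedisCacheProvider,
                 ttl_seconds: int):
        self._inner, self._cache, self._ttl = inner, cache, ttl_seconds

    def next_departures(self, feed_id, stop_id, service_date, from_time, limit):
        # Key format documented under Configuration below
        key = (f"schedule:next_departures:feed={feed_id}:stop={stop_id}:"
               f"date={service_date.isoformat()}:"
               f"time={from_time.strftime('%H%M%S')}:limit={limit}:v1")
        hit = self._cache.get(key)
        if hit is not None:
            return hit  # cache hit: skip the database entirely
        rows = self._inner.next_departures(feed_id, stop_id, service_date,
                                           from_time, limit)
        self._cache.set(key, rows, ttl=self._ttl)  # populate on miss
        return rows


def get_schedule_repository() -> ScheduleRepository:
    """Factory: Postgres wrapped in caching, TTL taken from the environment."""
    ttl = int(os.environ.get("SCHEDULE_CACHE_TTL_SECONDS", "60"))
    return CachedScheduleRepository(PostgresScheduleRepository(),
                                    cache=RedisCacheProvider(), ttl_seconds=ttl)
```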

#### API Endpoints
- **GET /api/schedule/departures/** - Retrieve scheduled departures for a stop
  - Query parameters:
    - `stop_id` (required): Stop identifier
    - `feed_id` (optional): Feed identifier, defaults to the current feed
    - `date` (optional): Service date in YYYY-MM-DD format, defaults to today
    - `time` (optional): Departure time in HH:MM or HH:MM:SS format, defaults to now
    - `limit` (optional): Maximum number of results (1-100), defaults to 10
  - Returns enriched departure data with route information:
    - Route short name and long name
    - Trip headsign and direction
    - Formatted arrival and departure times (HH:MM:SS)
  - Validates stop existence (returns 404 if not found)
  - Uses PostgreSQL as the data source with Redis read-through caching

#### Configuration
- `SCHEDULE_CACHE_TTL_SECONDS` environment variable for cache duration (default: 60 seconds)
- Cache key format: `schedule:next_departures:feed={FEED_ID}:stop={STOP_ID}:date={YYYY-MM-DD}:time={HHMMSS}:limit={N}:v1`
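
For example, a lookup for stop `STOP_123` on feed `FEED_1` at `08:00:00` on `2025-09-28` with `limit=5` (the values used in the README example) renders as:

```
schedule:next_departures:feed=FEED_1:stop=STOP_123:date=2025-09-28:time=080000:limit=5:v1
```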

#### Testing
- Comprehensive test suite for the schedule departures endpoint
  - Response structure validation
  - Stop validation (404 handling)
  - Time format validation (HH:MM:SS)
  - Programmatic test dataset creation

#### Documentation
- OpenAPI/Swagger schema generation with drf-spectacular
- API endpoint annotations for automatic documentation (sketched below)
- Architecture documentation for DAL strategy
- README updates with endpoint usage examples and cache configuration
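
A hedged sketch of what the drf-spectacular annotation might look like; `ScheduleDeparturesView` and `DalDeparturesResponseSerializer` are real names from this change, while the exact decorator arguments are assumptions:

```python
from drf_spectacular.utils import OpenApiParameter, extend_schema
from rest_framework.views import APIView

from api.serializers import DalDeparturesResponseSerializer


class ScheduleDeparturesView(APIView):
    @extend_schema(
        parameters=[
            OpenApiParameter("stop_id", str, OpenApiParameter.QUERY, required=True),
            OpenApiParameter("feed_id", str, OpenApiParameter.QUERY),
            OpenApiParameter("date", str, OpenApiParameter.QUERY,
                             description="Service date, YYYY-MM-DD"),
            OpenApiParameter("time", str, OpenApiParameter.QUERY,
                             description="HH:MM or HH:MM:SS"),
            OpenApiParameter("limit", int, OpenApiParameter.QUERY),
        ],
        responses=DalDeparturesResponseSerializer,
    )
    def get(self, request):
        ...  # delegates to get_schedule_repository() and serializes the rows
```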

### Removed - Storage and Data Access Layer (feat/storage-reading-dal)

#### Fuseki Implementation
- Removed Apache Jena Fuseki as optional SPARQL backend
- Deleted `storage/fuseki_schedule.py` implementation
- Removed `api/tests/test_fuseki_schedule.py` integration tests
- Removed Fuseki Docker service from docker-compose.yml
- Deleted `fuseki_data` Docker volume
- Removed `docker/fuseki/` configuration directory
- Deleted `docs/dev/fuseki.md` documentation
- Removed Fuseki-related configuration
  - `FUSEKI_ENABLED` environment variable
  - `FUSEKI_ENDPOINT` environment variable
  - Fuseki references in `.env.local.example`
- Updated `storage/factory.py` to use only PostgreSQL repository
- PostgreSQL with Redis caching is now the sole storage backend

### Changed - Storage and Data Access Layer (feat/storage-reading-dal)

#### Documentation
- Updated README.md to document new DAL architecture and API endpoints
- Updated docs/architecture.md with storage strategy and repository pattern
- Added project structure documentation including `storage/` directory
- Removed all Fuseki references from documentation

---

## [Previous Releases]

<!-- Future releases will be documented above this line -->
46 changes: 46 additions & 0 deletions README.md
@@ -190,6 +190,51 @@ docker compose down

## 📚 API Documentation

### New: Schedule Departures (Data Access Layer)
The new DAL backs an HTTP endpoint that returns scheduled departures for a stop. By default it uses PostgreSQL as the source of truth and Redis as a read-through cache.

- Endpoint: GET /api/schedule/departures/
- Query params:
  - stop_id (required)
  - feed_id (optional; defaults to current feed)
  - date (optional; YYYY-MM-DD; defaults to today)
  - time (optional; HH:MM or HH:MM:SS; defaults to now)
  - limit (optional; default 10; max 100)

Example:
```bash
curl "http://localhost:8000/api/schedule/departures/?stop_id=STOP_123&limit=5"
```

Response shape:
```json
{
  "feed_id": "FEED_1",
  "stop_id": "STOP_123",
  "service_date": "2025-09-28",
  "from_time": "08:00:00",
  "limit": 5,
  "departures": [
    {
      "route_id": "R1",
      "route_short_name": "R1",
      "route_long_name": "Ruta 1 - Centro",
      "trip_id": "T1",
      "stop_id": "STOP_123",
      "headsign": "Terminal Central",
      "direction_id": 0,
      "arrival_time": "08:05:00",
      "departure_time": "08:06:00"
    }
  ]
}
```

Caching (keys and TTLs):
- Key pattern: schedule:next_departures:feed={FEED_ID}:stop={STOP_ID}:date={YYYY-MM-DD}:time={HHMMSS}:limit={N}:v1
- Default TTL: 60 seconds
- Configure TTL via env: SCHEDULE_CACHE_TTL_SECONDS=60
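
To inspect a cached entry from the development environment, assuming the Redis service in docker-compose.yml is named `redis` (the key below is the one the example request above would produce):

```bash
# TTL returns the remaining seconds, or -2 if the entry is not cached
docker compose exec redis redis-cli TTL \
  "schedule:next_departures:feed=FEED_1:stop=STOP_123:date=2025-09-28:time=080000:limit=5:v1"
```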

### REST API Endpoints
- **`/api/`** - Main API endpoints with DRF browsable interface
- **`/api/gtfs/`** - GTFS Schedule and Realtime data
@@ -213,6 +258,7 @@ infobus/
β”œβ”€β”€ πŸ“ gtfs/ # GTFS data processing (submodule)
β”œβ”€β”€ πŸ“ feed/ # Data feed management
β”œβ”€β”€ πŸ“ api/ # REST API endpoints
β”œβ”€β”€ πŸ“ storage/ # Data Access Layer (Postgres) and cache providers
β”œβ”€β”€ πŸ“¦ docker-compose.yml # Development environment
β”œβ”€β”€ πŸ“¦ docker-compose.production.yml # Production environment
β”œβ”€β”€ πŸ“„ Dockerfile # Multi-stage container build
21 changes: 21 additions & 0 deletions api/serializers.py
@@ -198,6 +198,27 @@ class Meta:
        fields = "__all__"


class DalDepartureSerializer(serializers.Serializer):
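    """One scheduled departure row from the DAL, enriched with route data."""
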
    route_id = serializers.CharField()
    route_short_name = serializers.CharField(allow_null=True, required=False)
    route_long_name = serializers.CharField(allow_null=True, required=False)
    trip_id = serializers.CharField()
    stop_id = serializers.CharField()
    headsign = serializers.CharField(allow_null=True, required=False)
    direction_id = serializers.IntegerField(allow_null=True, required=False)
    arrival_time = serializers.CharField(allow_null=True, required=False)
    departure_time = serializers.CharField(allow_null=True, required=False)


class DalDeparturesResponseSerializer(serializers.Serializer):
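    """Envelope returned by GET /api/schedule/departures/."""
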
    feed_id = serializers.CharField()
    stop_id = serializers.CharField()
    service_date = serializers.DateField()
    from_time = serializers.CharField()
    limit = serializers.IntegerField()
    departures = DalDepartureSerializer(many=True)


class FareAttributeSerializer(serializers.HyperlinkedModelSerializer):

    feed = serializers.PrimaryKeyRelatedField(read_only=True)
77 changes: 77 additions & 0 deletions api/tests/README.md
@@ -0,0 +1,77 @@
# API Tests

This directory contains test suites for the Infobús API endpoints.

## Test Structure

### `test_schedule_departures.py`
Tests for the `/api/schedule/departures/` endpoint which provides scheduled departure information using the Data Access Layer (DAL).

**Test Cases:**
- `ScheduleDeparturesTests`: Complete test suite for the schedule departures endpoint
  - `test_returns_404_when_stop_missing`: Validates 404 error handling for non-existent stops
  - `test_returns_departures_with_expected_shape`: Validates response structure and data format

**What's Tested:**
- Endpoint returns proper HTTP status codes
- Response JSON structure matches API specification
- Required fields are present in response
- Time fields are formatted correctly (HH:MM:SS)
- Stop validation and error handling
- Integration with PostgreSQL via DAL
- Data enrichment (route names, trip information)

## Running Tests

### Run all API tests
```bash
docker compose exec web uv run python manage.py test api
```

### Run specific test file
```bash
docker compose exec web uv run python manage.py test api.tests.test_schedule_departures
```

### Run specific test class
```bash
docker compose exec web uv run python manage.py test api.tests.test_schedule_departures.ScheduleDeparturesTests
```

### Run specific test method
```bash
docker compose exec web uv run python manage.py test api.tests.test_schedule_departures.ScheduleDeparturesTests.test_returns_404_when_stop_missing
```

## Test Data

Tests use Django's test database, which is created and destroyed automatically. Each test case sets up its own minimal test data using:
- `Feed.objects.create()` for GTFS feeds
- `Stop.objects.create()` for stop locations
- `StopTime.objects.bulk_create()` for scheduled stop times

## Test Dependencies

- `rest_framework.test.APITestCase`: Base class for API testing
- `django.test.TestCase`: Django test framework
- `gtfs.models`: GTFS data models (Feed, Stop, StopTime)
- PostgreSQL test database with PostGIS extension

## Coverage

Current test coverage focuses on:
- ✅ Schedule departures endpoint functionality
- ✅ Error handling and validation
- ✅ Response format verification
- ✅ DAL integration (PostgreSQL)

## Adding New Tests

When adding new API endpoint tests (a minimal skeleton follows this list):
1. Create a new test file named `test_<feature>.py`
2. Import necessary test base classes and models
3. Add class-level and method-level docstrings
4. Set up minimal test data in `setUp()` method
5. Test both success and error cases
6. Validate response structure and data types
7. Update this README with the new test file information
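
A hypothetical skeleton following these steps; the endpoint path, class name, and assertions are placeholders, not code that exists in this project:

```python
from rest_framework import status
from rest_framework.test import APITestCase

from gtfs.models import Feed


class MyFeatureTests(APITestCase):
    """Tests for the hypothetical /api/my-feature/ endpoint."""

    def setUp(self):
        """Create the minimal data the endpoint needs."""
        self.feed = Feed.objects.create(feed_id="TEST", is_current=True)

    def test_success_case(self):
        """Happy path: 200 and the documented response keys."""
        resp = self.client.get("/api/my-feature/?feed_id=TEST")
        self.assertEqual(resp.status_code, status.HTTP_200_OK)
        self.assertIn("feed_id", resp.json())

    def test_error_case(self):
        """Error path: an unknown feed yields a 404 with an error body."""
        resp = self.client.get("/api/my-feature/?feed_id=DOES_NOT_EXIST")
        self.assertEqual(resp.status_code, status.HTTP_404_NOT_FOUND)
        self.assertIn("error", resp.json())
```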
1 change: 1 addition & 0 deletions api/tests/__init__.py
@@ -0,0 +1 @@
# makes tests a package for unittest discovery
17 changes: 17 additions & 0 deletions api/tests/data/fuseki_sample.ttl
@@ -0,0 +1,17 @@
@prefix ex: <http://example.org/gtfs#> .

# Minimal sample data for Fuseki integration tests
# One departure at stop S1 for feed TEST

[] a ex:Departure ;
ex:feed_id "TEST" ;
ex:stop_id "S1" ;
ex:trip_id "T1" ;
ex:route_id "R1" ;
ex:route_short_name "R1" ;
ex:route_long_name "Ruta 1" ;
ex:headsign "Terminal" ;
ex:direction_id "0" ;
ex:service_date "2099-01-01" ;
ex:arrival_time "08:05:00" ;
ex:departure_time "08:06:00" .
109 changes: 109 additions & 0 deletions api/tests/test_schedule_departures.py
@@ -0,0 +1,109 @@
from __future__ import annotations

import re
from datetime import time

from django.contrib.gis.geos import Point
from rest_framework import status
from rest_framework.test import APITestCase

from gtfs.models import Feed, Stop, StopTime


class ScheduleDeparturesTests(APITestCase):
    """Test suite for the /api/schedule/departures/ endpoint.

    This endpoint uses the Data Access Layer (DAL) to retrieve scheduled
    departures from PostgreSQL with Redis caching.
    """

    def setUp(self):
        """Set up minimal test data: feed, stop, and stop_time records."""
        # Minimal dataset for the endpoint
        self.feed = Feed.objects.create(
            feed_id="TEST",
            is_current=True,
        )
        self.stop = Stop.objects.create(
            feed=self.feed,
            stop_id="S1",
            stop_name="Test Stop",
            stop_point=Point(0.0, 0.0),
        )
        # Create StopTime without triggering model save() logic that requires Trip
        StopTime.objects.bulk_create(
            [
                StopTime(
                    feed=self.feed,
                    trip_id="T1",
                    stop_id=self.stop.stop_id,
                    stop_sequence=1,
                    pickup_type=0,
                    drop_off_type=0,
                    arrival_time=time(8, 5, 0),
                    departure_time=time(8, 6, 0),
                )
            ]
        )

    def test_returns_404_when_stop_missing(self):
        """Verify endpoint returns 404 when querying a non-existent stop_id."""
        url = "/api/schedule/departures/?stop_id=THIS_DOES_NOT_EXIST&limit=1"
        resp = self.client.get(url)
        self.assertEqual(resp.status_code, status.HTTP_404_NOT_FOUND)
        self.assertIn("error", resp.json())

    def test_returns_departures_with_expected_shape(self):
        """Verify endpoint returns departures with the expected JSON structure.

        Validates that all required fields are present in the response and
        time fields are formatted correctly (HH:MM:SS).
        """
        feed = Feed.objects.filter(is_current=True).first() or Feed.objects.first()
        self.assertIsNotNone(feed, "Expected test data to provide at least one feed")

        # Find a stop_id that actually has stop times
        st = StopTime.objects.filter(feed=feed).order_by("departure_time").first()
        self.assertIsNotNone(st, "Expected test data to provide at least one StopTime")
        stop_id = st.stop_id

        url = f"/api/schedule/departures/?stop_id={stop_id}&time=08:00:00&limit=1"
        resp = self.client.get(url)
        self.assertEqual(resp.status_code, status.HTTP_200_OK)
        data = resp.json()

        # Top-level keys
        for key in ["feed_id", "stop_id", "service_date", "from_time", "limit", "departures"]:
            self.assertIn(key, data)

        self.assertIsInstance(data["departures"], list)
        self.assertGreaterEqual(len(data["departures"]), 1)

        item = data["departures"][0]
        for key in [
            "route_id",
            "route_short_name",
            "route_long_name",
            "trip_id",
            "stop_id",
            "headsign",
            "direction_id",
            "arrival_time",
            "departure_time",
        ]:
            self.assertIn(key, item)

        # Time fields formatted HH:MM:SS
        time_pattern = re.compile(r"^\d{2}:\d{2}:\d{2}$")
        if item["arrival_time"] is not None:
            self.assertRegex(item["arrival_time"], time_pattern)
        if item["departure_time"] is not None:
            self.assertRegex(item["departure_time"], time_pattern)

        # from_time string formatted HH:MM:SS
        self.assertRegex(data["from_time"], time_pattern)
3 changes: 2 additions & 1 deletion api/urls.py
@@ -29,7 +29,8 @@
path("next-trips/", views.NextTripView.as_view(), name="next-trips"),
path("next-stops/", views.NextStopView.as_view(), name="next-stops"),
path("route-stops/", views.RouteStopView.as_view(), name="route-stops"),
path("schedule/departures/", views.ScheduleDeparturesView.as_view(), name="schedule-departures"),
path("api-auth/", include("rest_framework.urls", namespace="rest_framework")),
path("docs/schema/", views.get_schema, name="schema"),
path("docs/schema/", SpectacularAPIView.as_view(), name="schema"),
path("docs/", SpectacularRedocView.as_view(url_name="schema"), name="api_docs"),
]