ERAS is a microservices-based emergency dispatch system designed to enhance emergency response operations. The system processes incoming emergency calls in real-time, extracts critical information, generates actionable suggestions for dispatchers, and facilitates optimal resource allocation.
The system consists of 5 Python microservices, 1 React frontend, and supporting infrastructure:
- **Audio Ingestion Service** (`services/audio-ingestion/`)
  - Receives audio file uploads from dispatchers
  - Chunks and publishes audio data to Kafka
  - Port: `8001`
- **Audio Processing Service** (`services/audio-processing/`)
  - Consumes audio chunks from Kafka
  - Performs Speech-to-Text transcription
  - Publishes transcripts to Kafka
- **Suggestion Engine** (`services/suggestion-engine/`)
  - Consumes transcripts from Kafka
  - Generates suggestions
  - Publishes suggestions to Kafka
- **Geospatial Dispatch Service** (`services/geospatial-dispatch/`)
  - Manages vehicle tracking and locations
  - Provides vehicle assignment recommendations
  - Handles route suggestions
  - Port: `8002`
- **Dashboard API Service** (`services/dashboard-api/`)
  - Aggregated REST API for the frontend
  - WebSocket server for real-time updates
  - Consumes from Kafka and broadcasts to clients
  - Port: `8000`
- **Frontend Dashboard** (`frontend/`)
  - React + TypeScript application
  - Split-screen layout: transcripts (left) and map (right)
  - Real-time updates via WebSocket
  - Port: `3000` (development)
- Kafka + Zookeeper: Message queue for event-driven communication
- PostgreSQL: Persistent storage for sessions, transcripts, suggestions, and vehicles
- Docker Compose: Orchestration for all services
```
Audio File Upload
        ↓
Audio Ingestion Service
        ↓
Kafka: audio-chunks topic
        ↓
Audio Processing Service
        ↓
Kafka: transcripts topic
        ↓
Suggestion Engine ──→ Kafka: suggestions topic
        ↓                          ↓
Dashboard API ←────────────────────┘
        ↓
WebSocket → Frontend Dashboard
        ↓
Geospatial Dispatch Service (for vehicle assignments)
```
The system uses the following Kafka topics for event-driven communication:
- `audio-chunks`: Raw audio data chunks published by Audio Ingestion Service
- `transcripts`: Transcribed text from Audio Processing Service
- `suggestions`: AI-generated suggestions from Suggestion Engine

Topics are automatically created when messages are first published (via `KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"`).
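For reference, below is a minimal sketch of how a service might publish to and consume from these topics with `kafka-python`. The project's actual helpers live in `shared/kafka_client.py`; the function calls and payload shapes here are illustrative assumptions, not the project's API.

```python
# Minimal sketch (assumed payload shapes) of producing/consuming ERAS topics
# with kafka-python. The project's actual helpers are in shared/kafka_client.py.
import json
from kafka import KafkaProducer, KafkaConsumer

BOOTSTRAP = "localhost:9092"  # inside the Compose network, use kafka:29092

producer = KafkaProducer(
    bootstrap_servers=BOOTSTRAP,
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
# Publish a transcript event (hypothetical payload shape)
producer.send("transcripts", {"session_id": "uuid-here", "text": "Caller reports a fire"})
producer.flush()

# Consume suggestions as the Suggestion Engine produces them
consumer = KafkaConsumer(
    "suggestions",
    bootstrap_servers=BOOTSTRAP,
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    auto_offset_reset="earliest",
)
for message in consumer:
    print(message.value)
```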
```
ERAS/
├── services/                 # Microservices
│   ├── audio-ingestion/      # Audio file upload service
│   ├── audio-processing/     # STT processing worker
│   ├── suggestion-engine/    # AI suggestion generator
│   ├── geospatial-dispatch/  # Vehicle management service
│   └── dashboard-api/        # Frontend API gateway
├── frontend/                 # React application
│   ├── src/
│   │   ├── components/       # React components
│   │   ├── App.tsx           # Main app component
│   │   └── main.tsx          # Entry point
│   ├── package.json
│   └── vite.config.ts
├── shared/                   # Shared Python utilities
│   ├── types.py              # Data models (Pydantic)
│   └── kafka_client.py       # Kafka producer/consumer helpers
├── infrastructure/           # Infrastructure configuration
│   └── postgres/
│       └── init.sql          # Database schema
├── docker-compose.yml        # Service orchestration
├── .env.example              # Environment variables template
└── README.md                 # This file
```
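To illustrate the shared data models, here is a hedged sketch of the kind of Pydantic models `shared/types.py` could define. Only the ingestion response fields are confirmed by the example later in this README; the transcript model is an assumption.

```python
# Illustrative sketch only: the actual models in shared/types.py are not shown
# in this README. IngestResponse mirrors the /ingest response documented below;
# Transcript is a hypothetical example of an event payload model.
from pydantic import BaseModel

class IngestResponse(BaseModel):
    session_id: str
    status: str
    filename: str
    size: int

class Transcript(BaseModel):
    session_id: str
    text: str
```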
- **Clone the repository**

  ```bash
  git clone https://github.com/het1613/ERAS.git
  cd ERAS
  ```

- **Verify Docker is running**

  ```bash
  docker --version
  docker-compose --version
  ```

- **Set up environment variables**

  Create a `.env` file in the project root if you want to customize defaults:

  ```bash
  POSTGRES_DB=eras_db
  POSTGRES_USER=eras_user
  POSTGRES_PASSWORD=eras_pass
  KAFKA_BOOTSTRAP_SERVERS=localhost:9092
  ```

- **Install frontend dependencies**

  ```bash
  cd frontend
  npm install
  cd ..
  ```
Key environment variables (with defaults):
| Variable | Default | Description |
|---|---|---|
| `POSTGRES_DB` | `eras_db` | PostgreSQL database name |
| `POSTGRES_USER` | `eras_user` | PostgreSQL username |
| `POSTGRES_PASSWORD` | `eras_pass` | PostgreSQL password |
| `KAFKA_BOOTSTRAP_SERVERS` | `kafka:29092` | Kafka broker address (internal) |
| `GEOSPATIAL_DISPATCH_URL` | `http://geospatial-dispatch:8002` | Geospatial service URL (internal) |
```bash
# Start all infrastructure and services
docker-compose up -d
```

This will start:
- Zookeeper and Kafka (message queue infrastructure)
- PostgreSQL (database)
- All 5 microservices
- Services will be accessible on their respective ports
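
Once the containers are up, a quick way to confirm the HTTP services respond is to hit their `/health` endpoints on the ports listed above. The sketch below assumes the `requests` package is installed on your machine.

```python
# Quick sanity check (sketch): poll each service's /health endpoint.
# Ports follow the service list above; requests is assumed to be installed.
import requests

for name, port in [
    ("dashboard-api", 8000),
    ("audio-ingestion", 8001),
    ("geospatial-dispatch", 8002),
]:
    resp = requests.get(f"http://localhost:{port}/health", timeout=5)
    print(name, resp.status_code)
```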
After the backend services are running, start the frontend development server:

```bash
cd frontend
npm run dev
```

The frontend will be available at http://localhost:3000.
Check service status:

```bash
docker-compose ps
```

All services should show `Up` status. You can also check logs:

```bash
# All services
docker-compose logs

# Specific service
docker-compose logs dashboard-api
docker-compose logs audio-ingestion
```

To stop the system:

```bash
# Stop all services (keeps data)
docker-compose down

# Stop and remove volumes (clears database)
docker-compose down -v
```

Dashboard API Service (port `8000`) endpoints:

- `GET /health` - Health check
- `GET /sessions` - List all active sessions
- `GET /sessions/{id}/transcript` - Get transcripts for a session
- `GET /sessions/{id}/suggestions` - Get suggestions for a session
- `GET /sessions/{id}/assignment` - Get vehicle assignment for a session
- `GET /vehicles` - Get all vehicles (proxied from the geospatial service)
- `WS /ws` - WebSocket endpoint for real-time updates
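For the `WS /ws` endpoint, here is a minimal client sketch. It assumes the `websockets` package is installed and that the server pushes JSON-encoded messages; the message schema itself is not documented in this README.

```python
# Sketch of a WebSocket subscriber for ws://localhost:8000/ws.
# Assumes the server sends JSON-encoded update events (schema not shown here).
import asyncio
import json
import websockets

async def listen():
    async with websockets.connect("ws://localhost:8000/ws") as ws:
        async for raw in ws:
            event = json.loads(raw)
            print("update:", event)

asyncio.run(listen())
```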
Audio Ingestion Service (port `8001`) endpoints:

- `GET /health` - Health check
- `POST /ingest` - Upload audio file (multipart/form-data)

  Example:

  ```bash
  # Test with a text file
  curl -X POST http://localhost:8001/ingest \
    -F "file=@/path/to/your/file.txt"

  # Test with an audio file
  curl -X POST http://localhost:8001/ingest \
    -F "file=@/path/to/your/audio.wav"
  ```

  **Response:**

  ```json
  {
    "session_id": "uuid-here",
    "status": "ingested",
    "filename": "file.txt",
    "size": 123
  }
  ```
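The same upload can be made from Python. This is a small sketch using `requests`; the form field name `file` is taken from the curl example above, and the filename is a placeholder.

```python
# Sketch: upload an audio file to the Audio Ingestion Service with requests.
# "sample.wav" is a placeholder path; the form field name "file" matches the
# curl example above.
import requests

with open("sample.wav", "rb") as f:
    resp = requests.post("http://localhost:8001/ingest", files={"file": f})
print(resp.json())  # e.g. {"session_id": "...", "status": "ingested", ...}
```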
Geospatial Dispatch Service (port `8002`) endpoints:

- `GET /health` - Health check
- `GET /vehicles` - List vehicles (with optional `?status=available` filter)
- `GET /vehicles/{id}` - Get specific vehicle
- `GET /assignments/{session_id}` - Get/generate vehicle assignment
  - Optional query params: `?lat=43.4643&lon=-80.5204`
- `POST /assignments/{session_id}/accept` - Accept vehicle assignment
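As a usage sketch, the assignment endpoint can be called with the optional incident coordinates shown above; the session ID below is a placeholder.

```python
# Sketch: request (or generate) a vehicle assignment for a session, passing
# the optional incident coordinates shown above. "uuid-here" is a placeholder.
import requests

resp = requests.get(
    "http://localhost:8002/assignments/uuid-here",
    params={"lat": 43.4643, "lon": -80.5204},
)
print(resp.json())
```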
Services are automatically built when starting with `docker-compose up`. To rebuild:

```bash
# Rebuild all services
docker-compose build

# Rebuild specific service
docker-compose build dashboard-api
```

To view logs:

```bash
# All services (follow mode)
docker-compose logs -f

# Specific service
docker-compose logs -f dashboard-api
docker-compose logs -f audio-processing

# Last 100 lines of a service
docker-compose logs --tail=100 suggestion-engine
```

To make a code change:

- Python services: Edit files in `services/<service-name>/main.py`
- Rebuild the service: `docker-compose build <service-name>`
- Restart the service: `docker-compose restart <service-name>`
TODO: Mount volumes to avoid rebuilds
Connect to PostgreSQL:
```bash
# Using docker exec
docker exec -it eras-postgres psql -U eras_user -d eras_db

# Or using connection string
psql postgresql://eras_user:eras_pass@localhost:5432/eras_db
```
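The database can also be inspected from Python. The sketch below uses `psycopg2` with the default credentials above and only enumerates the tables, since the schema itself is defined in `infrastructure/postgres/init.sql` rather than in this README.

```python
# Sketch: connect to the ERAS database with psycopg2 (default credentials from
# the table above) and list the public tables created by init.sql.
import psycopg2

conn = psycopg2.connect("postgresql://eras_user:eras_pass@localhost:5432/eras_db")
with conn, conn.cursor() as cur:
    cur.execute(
        "SELECT table_name FROM information_schema.tables "
        "WHERE table_schema = 'public' ORDER BY table_name"
    )
    for (name,) in cur.fetchall():
        print(name)
conn.close()
```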