Project documentation
AI for Kuala Lumpur — Documents
This page is the full documentation layer of the project. It explains the origin of the platform, the business problem it addresses, the architecture choices, the data logic, the AI copilot design, the implementation roadmap, and the value of the project from both an engineering perspective and a recruiting perspective. It is designed as a hybrid of a product handbook, a README, an interview preparation aid, and a portfolio-ready technical narrative.
Why this project exists
Project vision
How the platform was born
Project story
What the project is trying to solve
Problem statement
Operational and business value
Why this project is useful
Product walkthrough
Current pages and features
How city signals are represented
Data sources and APIs
• traffic APIs (Google Maps, TomTom, Waze-like services)
• air quality APIs (OpenAQ, governmental sensor networks)
• weather APIs (OpenWeather, Meteostat)
• transport APIs (GTFS feeds, city transport data)
• event / disruption feeds (public notices, city incidents)
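Signals arriving from these heterogeneous sources need a common internal shape before they can flow through the rest of the platform. The sketch below shows one way to do that, assuming a simplified OpenAQ-style payload; the `CitySignal` fields and the `normalize_openaq` helper are illustrative, not the project's actual schema.

```python
from dataclasses import dataclass

# Hypothetical common shape for a city signal; field names are
# illustrative, not the project's actual schema.
@dataclass
class CitySignal:
    source: str        # e.g. "openaq", "openweather", "gtfs"
    kind: str          # e.g. "air_quality", "traffic", "weather"
    location: str      # district or station identifier
    value: float       # reading; units depend on `kind`
    unit: str
    observed_at: str   # ISO-8601 UTC timestamp

def normalize_openaq(payload: dict) -> CitySignal:
    """Map a (simplified) OpenAQ-style measurement into the common shape."""
    return CitySignal(
        source="openaq",
        kind="air_quality",
        location=payload["location"],
        value=float(payload["value"]),
        unit=payload["unit"],
        observed_at=payload["date"]["utc"],
    )

# One simplified measurement record, shaped like an OpenAQ result
raw = {"location": "Kuala Lumpur", "value": 18.5, "unit": "µg/m³",
       "date": {"utc": "2024-05-01T08:00:00Z"}}
signal = normalize_openaq(raw)
print(signal.kind, signal.value)  # air_quality 18.5
```

A similar per-source adapter (one per API family) keeps the downstream pipeline independent of any single provider's payload format.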
How data moves across the platform
End-to-end data pipeline
• raw signal generation
• live cache storage in Redis
• API serving via FastAPI
• analytical storage in DuckDB
• transformation logic in dbt marts
• consumption by dashboard pages and AI endpoints
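The steps above can be sketched end to end. In this runnable sketch a plain dict stands in for the Redis live cache and a list of rows stands in for DuckDB, so the flow works anywhere; all function names, key formats, and the "mart" aggregation are illustrative stand-ins for the real Redis/FastAPI/DuckDB/dbt pieces.

```python
import json
from statistics import mean

# In-memory stand-ins: a dict plays the Redis live cache, a list of rows
# plays the DuckDB analytical store. All names are illustrative.
live_cache: dict = {}
warehouse_rows: list = []

def ingest(signal: dict) -> None:
    """Raw signal generation -> live cache (a Redis SET in the real pipeline)."""
    live_cache[f"signal:{signal['kind']}:{signal['location']}"] = json.dumps(signal)

def serve_live(kind: str, location: str):
    """What a FastAPI endpoint would return from the live cache."""
    raw = live_cache.get(f"signal:{kind}:{location}")
    return json.loads(raw) if raw else None

def archive() -> None:
    """Copy the live cache into analytical storage (a DuckDB insert)."""
    warehouse_rows.extend(json.loads(v) for v in live_cache.values())

def mart_avg(kind: str) -> float:
    """A dbt-mart-style aggregate: average reading per signal kind."""
    return mean(r["value"] for r in warehouse_rows if r["kind"] == kind)

ingest({"kind": "air_quality", "location": "KLCC", "value": 18.0})
ingest({"kind": "air_quality", "location": "Bukit Bintang", "value": 22.0})
archive()
print(serve_live("air_quality", "KLCC")["value"])  # 18.0
print(mart_avg("air_quality"))                     # 20.0
```

The key property the sketch preserves is the dual read path: dashboard pages hit the hot cache for current values, while analytical queries run over the accumulated warehouse rows.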
Why both layers are necessary
Live data vs warehouse analytics
How the assistant reasons
AI copilot and RAG logic
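The retrieval half of a RAG loop can be shown in miniature: score stored city snippets against the question, keep the top matches, and build a grounded prompt from them. This is only the shape of the idea, assuming simple keyword overlap as the scorer; a production copilot would use embeddings and an LLM call, and every name here is hypothetical.

```python
# Minimal retrieval step of a RAG loop: score stored snippets by keyword
# overlap with the question, then assemble a grounded prompt.
def tokenize(text: str) -> set:
    return {w.strip(".,?").lower() for w in text.split()}

def retrieve(question: str, snippets: list, k: int = 2) -> list:
    q = tokenize(question)
    scored = sorted(snippets, key=lambda s: len(q & tokenize(s)), reverse=True)
    return scored[:k]

def build_prompt(question: str, snippets: list) -> str:
    context = "\n".join(f"- {s}" for s in retrieve(question, snippets))
    return f"Answer using only this context:\n{context}\nQuestion: {question}"

snippets = [
    "Traffic on Jalan Tun Razak is heavy this morning.",
    "Air quality in KLCC is moderate, PM2.5 at 18 µg/m³.",
    "Light rain expected over Kuala Lumpur this afternoon.",
]
print(build_prompt("How is the air quality in KLCC?", snippets))
```

Grounding the prompt in retrieved snippets is what keeps the assistant's answers tied to actual city signals instead of free-form generation.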
Current and target architecture
Technical stack
Why it was designed this way
Architecture logic
What made the project hard
Main difficulties encountered
What will be added next
Implementation roadmap
Enterprise readiness
Why this roadmap matters
Real-world constraints
Deployment challenges & technical decisions
• Local mode: full producer/consumer streaming with Redis
• Public demo mode: frontend-driven live simulation
This split demonstrates the ability to adapt the architecture to real-world constraints while preserving the product experience.
What the finished platform becomes