Documentation site for the Gnosis Analytics platform, built with MkDocs Material.
Live site: docs.analytics.gnosis.io
```sh
pip install -r requirements.txt
mkdocs serve            # http://127.0.0.1:8000
mkdocs build --strict   # static site → site/
```
```sh
docker build -t cerebro-docs .   # Docker image (Nginx)
```

Builds now also generate LLM-facing artifacts in `site/`:

- `llms.txt` -- curated public index
- `llms-ctx.txt` -- core docs and research corpus
- `llms-ctx-full.txt` -- full public corpus
- `*.html.md` -- mirrors for each page in the MkDocs navigation
These files are generated from the `mkdocs.yml` navigation and page metadata; do not edit them by hand under `docs/`.
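The generation step essentially walks the `mkdocs.yml` navigation tree and emits one index line per page. A minimal sketch of that traversal, assuming an mkdocs-style nav of nested `{title: page-or-sublist}` mappings (the real script is not shown here):

```python
# Sketch: flatten an mkdocs-style nav tree into llms.txt-style index lines.
# The nav below is illustrative; the real one lives in mkdocs.yml.

def nav_to_index(nav, prefix=""):
    """Flatten a nested nav into 'Section / Title: path' lines."""
    lines = []
    for item in nav:
        for title, target in item.items():
            label = f"{prefix}{title}"
            if isinstance(target, list):   # a section: recurse into children
                lines.extend(nav_to_index(target, prefix=f"{label} / "))
            else:                          # a page: emit one index line
                lines.append(f"{label}: {target}")
    return lines

nav = [
    {"Getting Started": [{"Quickstart": "getting-started/quickstart.md"}]},
    {"API": [{"Authentication": "api/authentication.md"}]},
]
print("\n".join(nav_to_index(nav)))
```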
Model catalogs, API endpoints, and dashboard metrics are auto-generated from live data sources. Sections between `<!-- BEGIN/END AUTO-GENERATED -->` markers are overwritten by these scripts:
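The overwrite step can be pictured as a marker-delimited splice: everything between the markers is replaced while the markers themselves stay put. A minimal sketch, assuming the exact comment markers shown above (the real scripts may differ):

```python
import re

def splice_generated(page: str, generated: str) -> str:
    """Replace the content between AUTO-GENERATED markers with fresh
    content, keeping the marker comments in place."""
    pattern = re.compile(
        r"(<!-- BEGIN AUTO-GENERATED -->).*?(<!-- END AUTO-GENERATED -->)",
        re.DOTALL,
    )
    return pattern.sub(
        lambda m: f"{m.group(1)}\n{generated}\n{m.group(2)}", page
    )

page = (
    "# Model catalog\n"
    "<!-- BEGIN AUTO-GENERATED -->\nstale table\n<!-- END AUTO-GENERATED -->\n"
)
updated = splice_generated(page, "| model | rows |")
```

Because the regex is non-greedy and keeps the markers, the splice is idempotent: re-running it simply replaces the generated block again.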
```sh
python scripts/update_docs.py                                 # regenerate from dbt manifest
python scripts/update_docs.py --dry-run                       # preview changes
DUNE_API_KEY=xxx python scripts/sync_dune_queries.py          # sync 1,288 Dune queries
python scripts/sync_dune_queries.py --cache dune_cache.json   # use cached data
```

```
docs/
├── getting-started/   Platform overview, quickstart, architecture
├── data-pipeline/     Ingestion, crawlers, transformation
├── api/               Authentication, endpoints, filtering, rate limits
├── models/            dbt model catalog (8 modules, 387 models)
├── esg-reporting/     ESG methodology, carbon footprint, data pipeline
├── mcp/               MCP server tools, reports, setup
├── dashboard/         Sectors and configuration
├── developer/         Adding endpoints, models, scrapers
├── operations/        Infrastructure, deployment, monitoring
└── reference/         Env vars, glossary, data dictionary, Dune queries
```
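The `--cache` flag presumably short-circuits the API: load results from the JSON file when it exists, otherwise fetch and write the cache. A hedged sketch of that pattern; the function name and payload shape are illustrative, not taken from the real `sync_dune_queries.py`:

```python
import json
from pathlib import Path

def load_queries(cache_path: str, fetch):
    """Return query metadata from a local JSON cache if present;
    otherwise call fetch() (e.g. the Dune API) and persist the result."""
    cache = Path(cache_path)
    if cache.exists():
        return json.loads(cache.read_text())
    data = fetch()
    cache.write_text(json.dumps(data))
    return data

# Illustrative use with a stand-in fetcher instead of a real Dune client.
queries = load_queries("dune_cache.json", fetch=lambda: [{"id": 1, "name": "tvl"}])
```

Keeping the fetcher as a callable makes the cache logic trivially testable without network access.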
- Edit markdown in `docs/`
- Preview with `mkdocs serve`
- Verify with `mkdocs build --strict`
- Open a PR
