Automated EDA, interactive charts, a lightweight forecast, and an AI narrative report from your CSV/Excel. No setup. Instant results.
- Live: https://query-lens-test.vercel.app/
- API Docs (in‑app): https://query-lens-test.vercel.app/apidocs
- © 2025 Himesh. Released under the MIT License.
Upload a file → get answers. QueryLens runs automated exploratory data analysis (EDA), builds interactive charts, explains key insights in plain English, and projects a short forecast when a time series is detected. Open an AI narrative report in a new tab and print to PDF, or grab a server‑rendered PDF in one click.
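The automated EDA pass boils down to a handful of pandas calls. A minimal sketch (not the actual backend code, but it mirrors the `summary` fields that `/analyze` returns):

```python
import pandas as pd

def eda_summary(df: pd.DataFrame) -> dict:
    """Compute the high-level stats reported for an upload."""
    return {
        "rows": int(len(df)),
        "columns": int(df.shape[1]),
        "missing_total": int(df.isna().sum().sum()),
        "duplicate_rows": int(df.duplicated().sum()),
        "dtypes": {col: str(dtype) for col, dtype in df.dtypes.items()},
    }

# Tiny demo frame: one missing value, one fully duplicated row
df = pd.DataFrame({"a": [1, 2, 2, None], "b": ["x", "y", "y", "z"]})
print(eda_summary(df))
```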
- Upload CSV/Excel (robust encoding and date parsing)
- EDA summary: missingness, types, duplicates, quick stats
- Interactive charts: histograms, bar/pie, scatter, correlation heatmap, time series
- AI insights: strongest relationships, category dominance, skew, trends
- Quick forecast: trend + simple seasonality with MAE/RMSE/MAPE
- AI Report: clean HTML narrative, printable to PDF
- Backend PDF: ReportLab‑generated, printer‑friendly
- Mobile‑friendly UI, light/dark theme
- Minimal REST API you can integrate anywhere
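The quick forecast above (trend plus simple seasonality, scored with MAE/RMSE/MAPE) can be sketched with NumPy. This is an illustrative approximation of the approach, not the backend's exact implementation; the scoring here is in-sample on the last season of points:

```python
import numpy as np

def quick_forecast(y, season=7, horizon=14):
    """Fit a linear trend plus mean seasonal offsets; score on the last `season` points."""
    t = np.arange(len(y))
    slope, intercept = np.polyfit(t, y, 1)               # linear trend
    resid = y - (slope * t + intercept)
    # Mean residual for each position within the seasonal cycle
    seasonal = np.array([resid[i::season].mean() for i in range(season)])

    def predict(idx):
        return slope * idx + intercept + seasonal[idx % season]

    tail = t[-season:]                                   # in-sample scoring tail
    err = y[tail] - predict(tail)
    metrics = {
        "MAE": float(np.abs(err).mean()),
        "RMSE": float(np.sqrt((err ** 2).mean())),
        "MAPE": float(np.mean(np.abs(err / y[tail])) * 100),
    }
    future = np.arange(len(y), len(y) + horizon)
    return predict(future), metrics
```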
- Frontend: Vite + React + Tailwind (deployed on Vercel)
- Backend: FastAPI + NumPy/Pandas + ReportLab (host anywhere: HF Spaces/Render/Railway)
- Optional AI narrative: OpenAI (or compatible) model via `/ai-report`
- Privacy‑first: analyses cached in memory (short‑lived), not persisted by default
- Production: frontend proxies `/api/*` to the backend to keep your origin private
```
frontend/
  src/
    components/    # NavBar, Logo, SummaryCards, ChartGrid, etc.
    pages/         # Landing, Upload, Dashboard, Report, Features, ApiDocs
    services/      # api.js (API_BASE -> "/api" in prod)
  index.html       # SEO, OG tags, JSON-LD
  vercel.json      # /api proxy + SPA fallback
backend/
  main.py          # FastAPI with /analyze, /analysis/{id}, /ai-report, /download-report
```
Prereqs: Node 18+, Python 3.10+, pip
```bash
cd backend
python -m venv .venv && source .venv/bin/activate  # Windows: .venv\Scripts\activate
pip install -U fastapi "uvicorn[standard]" numpy pandas reportlab openai
# Optional for AI narrative:
# export OPENAI_API_KEY=your_key
uvicorn main:app --host 0.0.0.0 --port 8000 --reload
```
```bash
cd frontend
cp .env.example .env
# For local dev, point to your local backend:
# VITE_API_URL=http://localhost:8000
npm install
npm run dev
# open http://localhost:5173
```
Production deploy (Vercel with hidden backend)
- Ensure the frontend defaults to a relative API in production:

```js
// src/services/api.js
export const API_BASE = import.meta.env.VITE_API_URL || "/api";
```
- Add `vercel.json` (at repo root):

```json
{
  "version": 2,
  "rewrites": [
    { "source": "/api/(.*)", "destination": "https://YOUR_BACKEND_URL/$1" },
    { "source": "/(.*)", "destination": "/" }
  ]
}
```
- `/api/*` is proxied to your backend (the origin stays private).
- The SPA fallback ensures deep links like `/report/:id` render.
- Vercel project settings:
  - Framework: Vite
  - Build command: `npm run build`
  - Output directory: `dist`
  - Env var: `VITE_API_URL=/api`
- Deploy and attach your custom domain (optional).
Base (prod): `https://YOUR_VERCEL_URL/api` (proxied)
- `POST /analyze` (multipart/form‑data; field `file`)
  - Returns:

```json
{
  "analysis_id": "uuid",
  "summary": { "rows": 123, "columns": 7, "missing_total": 12, "duplicate_rows": 0, "dtypes": {} },
  "columns": [{ "name": "col", "dtype": "float64", "missing": 0, "unique": 12, "sample_values": [] }],
  "charts": { "histograms": [], "barCounts": [], "pie": null, "heatmap": null, "scatter": null, "timeseries": null },
  "forecast": { /* optional */ },
  "insights": ["..."],
  "detected": { "dateCol": "OrderDate", "target": "Sales", "agg": "sum", "freq": "D" },
  "forecast_reason": "",
  "preview_rows": [{ /* first 50 rows */ }]
}
```
- `GET /analysis/{analysis_id}`
  - Returns the cached analysis (~24h, in‑memory)
- `POST /ai-report`
  - Body: `{ "analysis_id": "<uuid>" }` or `{ "analysis": { ...full analysis... } }`
  - Returns: `{ "html": "<semantic HTML narrative>" }`
- `GET /download-report?analysis_id=<uuid>`
  - Returns a printer‑friendly PDF (ReportLab)
cURL snippets:

```bash
curl -X POST -F "[email protected]" https://YOUR_VERCEL_URL/api/analyze
curl https://YOUR_VERCEL_URL/api/analysis/1234-uuid
curl -X POST https://YOUR_VERCEL_URL/api/ai-report \
  -H "Content-Type: application/json" -d '{"analysis_id":"1234-uuid"}'
curl -L -o report.pdf "https://YOUR_VERCEL_URL/api/download-report?analysis_id=1234-uuid"
```
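Once you have the JSON from `/analyze` (via cURL above or any HTTP client), the payload is straightforward to work with. An offline sketch using a trimmed sample of the documented response shape (`describe` and the insight text are illustrative, not part of the API):

```python
# Trimmed sample payload matching the documented /analyze schema
analysis = {
    "analysis_id": "1234-uuid",
    "summary": {"rows": 123, "columns": 7, "missing_total": 12, "duplicate_rows": 0},
    "detected": {"dateCol": "OrderDate", "target": "Sales", "agg": "sum", "freq": "D"},
}

def describe(a: dict) -> str:
    """One-line human summary of an analysis response."""
    s = a["summary"]
    line = f'{s["rows"]} rows x {s["columns"]} cols, {s["missing_total"]} missing, {s["duplicate_rows"]} duplicates'
    if a.get("detected", {}).get("dateCol"):
        line += f'; time series on {a["detected"]["dateCol"]} -> {a["detected"]["target"]}'
    return line

print(describe(analysis))
```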
Frontend:
- `VITE_API_URL`
  - Local dev: `http://localhost:8000`
  - Vercel prod: `/api`
Backend:
- `OPENAI_API_KEY` (optional): enables the AI narrative
- `OPENAI_BASE_URL` (optional): custom endpoint/Azure/proxy
- `LLM_MODEL` (optional): defaults to `gpt-4o-mini`
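Resolving these on the backend is a one-liner per variable. A sketch with the documented names and default (the `llm_config` helper itself is illustrative):

```python
import os

def llm_config() -> dict:
    """Read the optional AI settings from the environment."""
    return {
        "api_key": os.getenv("OPENAI_API_KEY"),          # None disables the AI narrative
        "base_url": os.getenv("OPENAI_BASE_URL"),        # custom endpoint / Azure / proxy
        "model": os.getenv("LLM_MODEL", "gpt-4o-mini"),  # documented default
    }

print(llm_config()["model"])
```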
- `frontend/index.html` includes title/description, canonical, OG/Twitter tags, and JSON‑LD (Organization, WebSite, SoftwareApplication, FAQ).
- Add `public/robots.txt` and `public/sitemap.xml` for indexing.
- Publish helpful pages targeting intent keywords: “AI EDA tool”, “Upload CSV for data analysis”, “time series forecast online”.
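A minimal `public/robots.txt` could look like the following (the sitemap URL assumes the live domain above; adjust for your own deployment):

```
User-agent: *
Allow: /
Sitemap: https://query-lens-test.vercel.app/sitemap.xml
```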
- Analyses are cached in memory for convenience; not persisted by default.
- For persistence or multi‑replica, plug in Redis/DB.
- Frontend calls `/api` on the same origin; the backend origin stays hidden behind Vercel rewrites.
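The short-lived in-memory cache can be as simple as a dict keyed by `analysis_id` with a TTL check on read. A sketch (illustrative, not the backend's actual code; a multi-replica deployment would swap this for Redis):

```python
import time
import uuid

class AnalysisCache:
    """Dict-backed cache; entries silently expire after `ttl_seconds`."""

    def __init__(self, ttl_seconds: float = 24 * 3600):
        self.ttl = ttl_seconds
        self._store: dict[str, tuple[float, dict]] = {}

    def put(self, analysis: dict) -> str:
        analysis_id = str(uuid.uuid4())
        self._store[analysis_id] = (time.monotonic(), analysis)
        return analysis_id

    def get(self, analysis_id: str):
        entry = self._store.get(analysis_id)
        if entry is None:
            return None
        created, analysis = entry
        if time.monotonic() - created > self.ttl:
            del self._store[analysis_id]  # expired -> caller returns 404
            return None
        return analysis
```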
- 404 for `/analysis/{id}`: the ID was created on a different server or the cache expired. Re‑analyze, or pass the full analysis to `/ai-report`.
- CORS errors: ensure requests go to `/api` (proxied), not the raw backend hostname.
- PDF download issues: use fetch → blob download in the UI to avoid exposing the origin.
- Advanced forecasting adapters (Prophet/ARIMA/LSTM)
- Saved projects & auth
- Column profiling (drift, anomalies)
- Exports (Excel/Parquet)
PRs welcome! Please open an issue for proposals, keep PRs focused, and add screenshots for UI changes.
This project is released under the MIT License.
Copyright © 2025 Himesh
See LICENSE for the full text.