Hacker News

My story for Datasette and Big Data at the moment is that you can use Big Data tooling - BigQuery, Parquet, etc - then run aggregate queries against it that produce an interesting ~10MB/~100MB/~1GB summary, which you pipe into Datasette for people to explore.

I've used that trick myself a few times. Most people don't need to be able to interactively query TBs of data; they need to be able to quickly filter against a useful summary of it.
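A minimal sketch of the pattern, using only stdlib sqlite3 - the table and column names (`events`, `daily_counts`, etc.) are illustrative, and the in-memory source stands in for a real BigQuery/Parquet dataset:

```python
import sqlite3

# Stand-in for the big dataset: in practice this would be a GROUP BY
# query run against BigQuery or a Parquet file.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE events (day TEXT, country TEXT)")
src.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [("2024-01-01", "US"), ("2024-01-01", "US"), ("2024-01-02", "FR")],
)

# The aggregate query that produces the small, interesting summary.
rows = src.execute(
    "SELECT day, country, COUNT(*) FROM events GROUP BY day, country"
).fetchall()

# Write the summary into a SQLite file you can then serve with
# `datasette summary.db` for people to filter and explore.
summary = sqlite3.connect("summary.db")
summary.execute(
    "CREATE TABLE IF NOT EXISTS daily_counts (day TEXT, country TEXT, n INTEGER)"
)
summary.executemany("INSERT INTO daily_counts VALUES (?, ?, ?)", rows)
summary.commit()
```

The same shape works whatever produces the summary - a `bq query` export, a DuckDB scan over Parquet - as long as the result is small enough to drop into a single SQLite file.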


