Hacker News
How I store server logs in Google Spreadsheets (hakanu.net)
39 points by hakanu on Sept 15, 2014 | 15 comments



I'm pretty disappointed with Google Sheets at the moment. I built some mildly complex spreadsheets for time tracking, nutrition, planning and such. They worked fine a year ago, but I have experienced a massive drop in performance over the past few months. They take a minute to open, are awfully slow to update calculations, and no longer work reliably offline either. This is on a quad-core desktop, ffs.

This will kill cloud office apps if what works today doesn't work tomorrow. Cloud should not mean total loss of control over the version of the application. Something like Google Docs should offer frozen past versions of the app to prevent this kind of thing. It's not like frozen versions would actually add any maintenance cost to Google.

These sheets are small and just have a few array formulae. Nothing Excel from 20 years ago would have broken a sweat over.


Have you upgraded to the newest version of Google Spreadsheets? We were having some performance issues that sound just like what you're experiencing, and as soon as we made the switch, our sheets were super fast.


I upgraded a couple when it was first released, but a few basic features weren't supported, so I had to abandon the attempt.

I've tried copying a problem sheet to a new file and first impressions are positive. Thanks.


As far as I've observed, the latest big update improved the performance of large sheets a lot. Plus, using Chrome also makes a positive difference. However, copying and pasting large amounts of data is still a problem. Instead, I recommend using the copy option within Google Drive.


The best change was definitely the removal of the cell and row limits. If I remember correctly, it used to be 50k lines, which you reach in no time. The new spreadsheets are way faster indeed!


The limits are here: https://support.google.com/drive/answer/37603?hl=en

Number of cells: 400,000 total cells across all sheets - this is the one I don't like; if you log a substantial number of columns, it fills up pretty quickly
Number of columns: 256 columns per sheet
Number of formulas: 40,000 cells containing formulas across all sheets
Number of tabs: 200 sheets per workbook
GoogleFinance formulas: 1,000 GoogleFinance formulas


You needed to read one more paragraph:

"All spreadsheet limits mentioned above have been removed in the new version of Google Sheets. The new version of Google Sheets should support 2 million cells of data, though please note that extremely large spreadsheets may have slower performance. Learn more about switching to the new version of Google Sheets."


If those server logs could contain customer-identifiable and/or confidential data, storing them in a cloud service might not be a good idea.


The spreadsheets are not open to the world; they are restricted to certain Gmail accounts. Don't you think that's safe? Plus, what's the difference between storing logs in spreadsheets versus parse.com or loggly.com, etc.? In-house storage is cool, but it's costly and hard to analyze.


No, this is not safe — Google still has access, and can give additional access to whoever compels them to do so.

This is, indeed, no different from other cloud-based solutions. In-house storage “costly”? Really? Well, what is security worth?


We kinda whipped up something for our own purposes (logging any data using Google Spreadsheets). It's called Sheetstorm, and basically you just post data using the API and it'll automatically create a new sheet or append to an existing one.

It's handy for logging and analyzing arbitrary stats (a rough usage sketch follows the link below).

https://www.sheetstorm.io/
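
As a hedged illustration of the workflow described above, here is a minimal Python sketch of posting one log row to a Sheetstorm-style endpoint. The URL path, payload fields, and Bearer-token auth are assumptions made up for the example; the comment only says you post data to the API, so check the actual documentation before relying on any of this.

    # Hypothetical sketch: send one log row to a Sheetstorm-style HTTP API.
    # The endpoint path, field names, and auth header are assumptions,
    # not the service's documented interface.
    import requests

    API_KEY = "YOUR_API_KEY"                                       # assumed auth scheme
    ENDPOINT = "https://www.sheetstorm.io/api/sheets/server-logs"  # assumed path

    row = {
        "timestamp": "2014-09-15T12:34:56Z",
        "level": "ERROR",
        "message": "upstream timed out",
    }

    resp = requests.post(
        ENDPOINT,
        json=row,
        headers={"Authorization": "Bearer " + API_KEY},
        timeout=10,
    )
    resp.raise_for_status()  # surface HTTP errors instead of failing silently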


This story reminded me of this:

http://www.pcpro.co.uk/news/80885/new-p2p-service-uses-gmail...

Note: it's from 2005, a blast from the past.


Haha, so smart! This sentence hit me: "Ever wondered what you might do with the 2GB storage".

Such nice hackery!


To address the speed problem you could still leverage sqlite locally as cache layer in between your in-memory cache and longer-term google spreadsheet cache, and use a repeating job to move the sqlite data over to google spreadsheets (and subsequently delete those rows in the sqlite, keeping the sqlite db at a small size).
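
A minimal sketch of that buffering idea, assuming a local SQLite table as the intermediate cache and a repeating flush job. Here append_rows_to_sheet is a hypothetical placeholder for whatever Google Spreadsheets upload you already have (for example, the article's own approach); the file name and flush interval are likewise made up for the example.

    # Sketch: buffer log rows in SQLite, then periodically flush them to the
    # spreadsheet and delete only the rows that were actually sent, so a
    # failed upload simply retries on the next pass.
    import sqlite3
    import time

    DB_PATH = "log_buffer.db"   # assumed local cache file
    FLUSH_EVERY = 60            # seconds between flushes (assumption)

    def init_db():
        con = sqlite3.connect(DB_PATH)
        con.execute("CREATE TABLE IF NOT EXISTS logs "
                    "(id INTEGER PRIMARY KEY AUTOINCREMENT, ts TEXT, line TEXT)")
        con.commit()
        return con

    def buffer_log(con, ts, line):
        # Cheap local write; the slow spreadsheet call happens later, in bulk.
        con.execute("INSERT INTO logs (ts, line) VALUES (?, ?)", (ts, line))
        con.commit()

    def append_rows_to_sheet(rows):
        # Hypothetical stand-in: replace with your actual Sheets upload.
        print("would upload %d rows to the spreadsheet" % len(rows))

    def flush(con):
        rows = con.execute("SELECT id, ts, line FROM logs ORDER BY id").fetchall()
        if not rows:
            return
        append_rows_to_sheet([(ts, line) for _, ts, line in rows])
        # Delete only up to the last id we sent, keeping the DB small.
        con.execute("DELETE FROM logs WHERE id <= ?", (rows[-1][0],))
        con.commit()

    if __name__ == "__main__":
        con = init_db()
        buffer_log(con, "2014-09-15T12:34:56Z", "example log line")
        while True:
            flush(con)
            time.sleep(FLUSH_EVERY)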


That's a good idea, actually. It will be safer for retries, thanks! For future reference, increasing the number of rows persisted also increases the time each write takes.



