Press "Enter" to skip to content

Snippet: 4Gb in, no Gigs out

I spent last week running my computer 24/7 processing several gigabytes of data, and eventually filled a MySQL database table with 4Gb of data (I’ve still got 1Gb to load in!). So I’ve had to redesign the database schema, branch out to flat-file databases, and now I’m trying to get the data out of the MySQL database into the flat files…

Except, it won’t export the data now. Every time I try to extract it using a very basic Perl script, it bombs out with “Out of memory (Needed XXXXX bytes)”, despite the fact that I’ve got 0.5Gb of RAM and 4Gb of “swap space” currently configured.

I need to get this thing sorted ASAP – I want to upload it this weekend (ready for a launch on the 1st of October), and I want to be able to use my PC again. Whilst it’s “number crunching” and processing and whatnot, my machine slows to a crawl 🙁

3 Comments

  1. Hmmm…

    I think the problem is that you’re trying to do large selects on the database tables, and Perl is waiting for the complete result set rather than processing it row by row.

    Have you tried doing a dump of the data with mysqldump and the -q option? The -q option disables buffering so that mysqldump doesn’t buffer the complete result set into memory.

  2. >> Have you tried doing a dump of the data with mysqldump and the -q option? The -q option disables buffering so that mysqldump doesn’t buffer the complete result set into memory. <<

    Good thinking Peter (I’ve hardly used mysqldump), but to make the data usable I needed to create additional key fields and everything. Anyway, I’ve had to dump that database now and start afresh, but thanks for the hint! 😉
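
For anyone hitting the same “Out of memory” error: Peter’s row-by-row suggestion can also be done straight from Perl with DBI, rather than shelling out to mysqldump. Here’s a minimal sketch, assuming DBD::mysql and made-up table, column and file names. The key bit is the mysql_use_result statement attribute, which streams rows from the server instead of buffering the whole result set in client memory.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;

# Connection details, table and column names below are placeholders.
my $dbh = DBI->connect(
    'DBI:mysql:database=mydb;host=localhost',
    'user', 'password',
    { RaiseError => 1 },
);

# mysql_use_result => 1 tells DBD::mysql to stream rows from the server
# rather than buffering the entire result set client-side, which is the
# usual cause of "Out of memory (Needed XXXXX bytes)" on big selects.
my $sth = $dbh->prepare(
    'SELECT id, body FROM big_table',
    { mysql_use_result => 1 },
);
$sth->execute;

open my $out, '>', 'big_table.txt' or die "Cannot open output file: $!";

# Fetch and write one row at a time; memory use stays roughly constant.
while ( my $row = $sth->fetchrow_arrayref ) {
    print {$out} join( "\t", @$row ), "\n";
}

close $out;
$sth->finish;
$dbh->disconnect;

The command-line equivalent is the -q option Peter mentions, which makes mysqldump do the same unbuffered retrieval.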
