
Category: Snippet

Snippet: Still At Work

It’s gone 7pm and I’m still at work, and it looks like I’ll be here for another hour finishing a recruitment site (I was meant to be able to pass the design on to a coworker, but unfortunately they didn’t “meet requirements” and hence I’m having to do all of the work myself). And once I get back home, I’ve got to give the DNS system a really good kicking: by the looks of things, my main website is still showing the “Under maintenance” page (how it can do that with Apache down I’m not sure: attempting to stop Apache comes up with “Apache not running”, attempting to start it comes up with “segmentation fault”). And checking on DNS Report shows there are some really strange things happening with my DNS at the moment.

Groan. Can’t I have an “easy week” for once?

Snippet: Server Failure

My London web server (yep, I’ve got two dedicated web servers at the moment) has just had a minor crash and Apache has failed. Removing, reinstalling and reconfiguring it still pulls up faults, but luckily I’m on the brink of decommissioning that three-year-old server anyway – however, it wasn’t scheduled to happen for another fortnight at least, to give me time to write all the scripts that run on it (it’s completely CGI driven).

So I’ve just had to copy all the files across to this server and juggle everything I can – I think it’s all worked as well as can be expected (I’ve still got a few problems with the scripting, but I’ve patched around them for now), but I won’t be able to check whether the DNS has properly updated for another 12 hours or so. The thing is, I’m not sure if I’ve got the configuration “spot on” – the old server ran a totally customised DNS system and the new one is automated: but it’s still got to interact with my external slave nameservers (all 12 of them!). Fingers crossed – but if my blog disappears for a day, you’ll know what I screwed up (hopefully all my other sites will update shortly). It doesn’t really help that I’m “between registries” at the moment 🙁
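Once the wait is over, the simplest sanity check is just resolving the hostname and comparing it against the new server’s address. A minimal sketch in Python (the hostname and IP below are placeholders, not my actual setup):

```python
import socket

def dns_updated(hostname, expected_ip):
    """Resolve a hostname and report whether it now points at the new server."""
    try:
        return socket.gethostbyname(hostname) == expected_ip
    except socket.gaierror:
        # Resolution failed entirely -- the zone may still be mid-transfer.
        return False
```

Run something like `dns_updated("example.com", "203.0.113.5")` once the 12-hour window is up and the slaves have had a chance to pull the new zone.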

Snippet: Nooo!!!!!

Nooo!!!! 200,000 MySQL inserts comprising over 2.16Gb of data won’t work because there’s a single comma missing from the SQL insert statement in every single row… Nooo! I just hope I can rescue the data with a Perl regexp, or else I’ll have to wipe out the last 3.5 days of work and processing (that’s 24/7 processing, by the way: in the last ten days, I’ve downloaded over 5.3Gb of data and already made over 20,190,979 MySQL inserts – and I’ve still got all of the above to do: working on project DBC is a lot harder than first expected!)
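For what it’s worth, the kind of repair I’m hoping for looks something like this (sketched in Python rather than Perl, with a made-up row layout – where the missing comma actually falls depends on the real schema, and a substitution this naive would need checking against real data before being trusted):

```python
import re

def fix_row(line):
    """Insert the missing comma between two adjacent quoted values,
    e.g. VALUES ('a','b' 'c') becomes VALUES ('a','b','c')."""
    return re.sub(r"'\s+'", "','", line)
```

The same one-liner piped through the whole dump file beats re-running 3.5 days of processing.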

Snippet: Microsoft Word Maximum File Size

*snippet* I’ve just tried loading a 2Gb file into Microsoft Office 2000 Word (thinking that as Word supports “virtual memory” et al., it’ll be able to cope with the fact I’ve only got half a gig of RAM). It chugs along for a number of minutes loading the file and then pops up:

Word cannot open this file because it is larger than 32 Megabytes in size.

I never even knew Word had a maximum file size! But what really peeved me was that the Microsoft programmers didn’t implement a very small routine to check the file size before even attempting to open a file – it’s not like it’s a difficult thing to do…
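The check I’d have expected is about as small as routines get – something along these lines (a Python sketch of the idea, not how Word actually works internally):

```python
import os

MAX_BYTES = 32 * 1024 * 1024  # Word's apparent 32 megabyte ceiling

def can_open(path):
    """Cheaply reject oversized files before wasting minutes loading them."""
    return os.path.getsize(path) <= MAX_BYTES
```

One stat call up front instead of several minutes of chugging before the error dialog.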

Snippet: Groan… Lots of Data :(

*snippet* I’m currently in the process of rebuilding large sections of my website(s) and need to import a substantial amount of data into the new content management system. By “substantial amount” we’re talking around 2Gb of data(!). However, the data seems ever so slightly corrupted (around a quarter of a record in every 1 million entries), so I have to run another script to correct the corruption and then rerun the parser utility. And I’ll tell you this: even on a 2.4GHz machine, parsing 2Gb of data takes a looong time. Especially when it fails and you’ve got to restart from scratch.

Of course, once it’s parsed, I’ve then got to import it all into a MySQL database (I’m having it write the SQL statements to a file instead of importing directly, for speed reasons), then index it (which will take ages: believe me, once you pass the half-a-million-row point, MySQL begins to crawl), then link all the data together and finally export it into a suitable format: there’s no way I’m going to bog down my server by having it make around 50 database requests per page!
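The “write the SQL statements instead of importing directly” step is nothing fancy – batching rows into multi-row INSERT statements and dumping them to a file that mysql can slurp in one go. A rough Python sketch (the table name and escaping are invented for illustration, not my actual CMS schema):

```python
def write_inserts(rows, out, table="cms_content", batch=1000):
    """Write batched multi-row INSERT statements instead of one per row,
    so the import is one fast file load rather than thousands of round trips."""
    for i in range(0, len(rows), batch):
        values = ",".join(
            "(%s)" % ",".join("'%s'" % v.replace("'", "''") for v in row)
            for row in rows[i:i + batch]
        )
        out.write("INSERT INTO %s VALUES %s;\n" % (table, values))
```

Feeding the resulting file to the mysql client in one go is far quicker than issuing 200,000 individual statements over a connection.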

Fingers crossed that I’ll have it all parsed by Sunday…

(That’s why there haven’t been that many blog entries: my machine begins to crawl whilst parsing – and it’s just a command-line parsing system anyway: no GUI to slow it down.)