Archive for the ‘Life: Work and Techy’ category

Magento: Class Zend_Log not found in GoMage Lightcheckout Help file

March 27th, 2013

Do you receive an error message such as:

Fatal error: Class 'Zend_Log' not found in /.../app/code/local/GoMage/Checkout/Block/Adminhtml/System/Config/Fieldset/Help.php on line 48

when running the GoMage Lightcheckout v3.1 plugin for Magento? If so, it’s a quick fix: just open up that file and remove the section:

protected function _getFieldsetCss()
{
    $configCss = (string)$this->getGroup()->fieldset_css;
    return 'config collapseable' . ($configCss ? ' ' . $configCss : '');
}

and your Magento ecommerce shop should start working again! (Basically, in a recent version of Magento CE – version 1.7.0.2 or before – that method was declared as “protected” in Mage_Adminhtml_Block_System_Config_Form_Fieldset, and so the child class GoMage_Checkout_Block_Adminhtml_System_Config_Fieldset_Help was not able to redeclare it.)

Magento: Associating customer accounts

March 22nd, 2013

If you run a Magento e-commerce store, you may occasionally find existing customers placing orders without being logged into their account. This usually isn’t a problem, unless the customer is one of those very rare ones who actually logs into the customer frontend to track/check their order(s). If they weren’t logged in when ordering, they won’t see that order in their account.

So how do you fix this? Well, there is a Magento module available to do this, but (as is increasingly common in the Magento world) it costs $99. Or there is my way, which is free and requires just two SQL statements:

UPDATE sales_flat_order, customer_entity
SET sales_flat_order.customer_id = customer_entity.entity_id
WHERE sales_flat_order.customer_email = customer_entity.email;

UPDATE sales_flat_order_grid, sales_flat_order
SET sales_flat_order_grid.customer_id = sales_flat_order.customer_id
WHERE sales_flat_order_grid.increment_id = sales_flat_order.increment_id;

Done.
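Incidentally, if you’re worried about the first statement overwriting orders that are already associated with an account, a slightly more cautious variant (just a sketch, assuming the default Magento 1 table names above and that guest orders have a NULL customer_id) only fills in the missing ones:

```sql
-- Only touch orders with no customer attached, so existing
-- associations are never overwritten:
UPDATE sales_flat_order, customer_entity
SET sales_flat_order.customer_id = customer_entity.entity_id
WHERE sales_flat_order.customer_email = customer_entity.email
  AND sales_flat_order.customer_id IS NULL;
```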

Techy: Dedicated server prices compared

February 16th, 2013

We’ve just started our annual “provider audit” and our first “task” was to look at comparable dedicated hosting providers, to make sure we are getting a suitable package at the lowest reasonable rate. Our requirements are:
* Minimum 4GB RAM
* 4 CPU cores at 2.0GHz or above
* 2x 500GB hard drives
* RAID 1 or RAID 10 (software or hardware)
* 1TB monthly data transfer limit
* Minimum of 4 IPv4 addresses
* Europe hosting (UK preferred)
* CentOS/RedHat operating system
* Ideally cPanel, IPv6 and remote “console” support
* Not “Cloud/VPS” (we’re going to be using the server for shared hosting, and shared hosting on shared hardware just seems silly)

Our budget for this was £200+vat. Here’s what we found:
View in new window.

We haven’t yet made a decision (I must admit, we’ve been extremely happy with Memset for quite a few years now), but it is interesting to see what all the companies offer. Please remember that this spreadsheet is incomplete in parts (the ?s should show where) and doesn’t actually assess the providers’ networks or what support package is offered (i.e. there may be an “unlisted” reason one costs £100pm more than another!).

Is there anyone you think we’ve missed?

Gmail: Search for mail between two dates

February 12th, 2013

If you want to search your Gmail/Google Mail for emails received between two dates, the following search term should help:
before:2013/02/01 after:2012/12/31

Command Line awk Regular Expression for Apache logs

February 10th, 2013

For code testing against a live site, I’ve had to extract all URLs from an Apache access log – but how do you do this from the Linux command line?

The secret is to use two regular expressions (regexps) in an “awk” command – for example:

cat examine.txt | awk 'sub(/.*(GET|POST) \//,"")&&sub(/ HTTP.*/,"")'

This pipes the contents of the file examine.txt to awk, which runs two regular-expression substitutions. The first removes the “phrase” “GET /” or “POST /” and anything before it, and the second removes the “phrase” “ HTTP” and anything after it. That leaves you with a nice list of URLs to test.
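To see it in action on a single made-up log line (the IP, path and user agent below are just placeholders):

```shell
# A sample Apache access-log line piped straight into the awk command:
printf '%s\n' '127.0.0.1 - - [10/Feb/2013:12:00:00 +0000] "GET /index.php?id=1 HTTP/1.1" 200 512 "-" "curl/7.29"' \
  | awk 'sub(/.*(GET|POST) \//,"")&&sub(/ HTTP.*/,"")'
# prints: index.php?id=1
```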

Oh – and if you’d like it to produce a nice “curl friendly” file of just URLs starting “xyz.php” from host example.com then:

cat examine.txt | grep "GET /xyz.php" | awk 'sub(/.*(GET|POST) \//,"http://example.com/")&&sub(/ HTTP.*/,"")' > curl.txt

should do the trick (combine that with cat curl.txt | xargs -n1 -i curl {} > /dev/null to test)
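One last refinement: access logs usually list the same URL many times, so it can be worth de-duplicating the extracted list before feeding it to curl (same examine.txt file as above):

```shell
# Extract the URLs as before, then sort and de-duplicate them so
# each URL is only fetched once:
awk 'sub(/.*(GET|POST) \//,"")&&sub(/ HTTP.*/,"")' examine.txt | sort -u
```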
