Cache invalidation with memcache

“There are only two hard things in Computer Science: cache invalidation and naming things.” — Phil Karlton

Mr. Karlton was not wrong. In my day-to-day job, cache invalidation is something that can easily disrupt releases – the fact that the result from an API is cached can be easily forgotten, for example. It has often caused us to pause and re-evaluate exactly what our applications are doing. Some of our APIs are quite static in nature and are cached appropriately (the general rule of thumb being that the more static the data, the longer it is cached for). When it comes to updating these APIs, there are often many layers of cache to bust through in order to prune stale data. If we forget to clear one cache, the stale data can propagate back out to all the other caches in the stack. Not cool.

Not unlike an onion, this can definitely cause tears.

Memcache & Redis

Invalidating keys in Redis is relatively simple via redis-cli (note that KEYS is a blocking command, so this is best kept away from busy production instances):

redis-cli KEYS "session:*" | xargs redis-cli DEL

memcached, on the other hand, supports neither namespaced deletes nor a bundled client tool. The only real way to interact with the server is over TCP/IP, via telnet or a similar tool (such as nc). This prompted me to write a tool to invalidate cache keys quickly so that I could test these problem APIs more effectively. Below is the source code (a bash script) – it requires netcat to be installed and within your path.


#!/bin/bash

# Memcached servers to sweep, in SERVER:PORT format
SERVERS=("localhost:11211")
TIMEOUT=1
KEYLIMIT=10000
LOOPS=0

function usage {
    echo ""
    echo "$0 [regex]"
    echo "Used to invalidate keys on memcached servers"
    echo ""
}

function memcache_netcat {
    netcat -q $TIMEOUT $SERVER $PORT
}

function memcache_delete {
    echo "DELETING: $1"
    RESULT=$(echo "delete $1" | memcache_netcat)
}

# Parameter is required
if [ -z "$1" ]; then
    usage
    exit 1
fi

# For each server... (in SERVER:PORT format)
for definition in "${SERVERS[@]}"; do
    IFS=":" read -ra server <<< "$definition"
    SERVER="${server[0]}"
    PORT="${server[1]}"

    echo ""
    echo "Invalidating keys on: $SERVER:$PORT"
    echo "Searching for       : $1"
    echo ""

    echo "stats items" | memcache_netcat | while read line; do
        let "LOOPS += 1"
        let "ITERS = $LOOPS % 10"

        # Each slab brings back ~10 statistics rows. Skip all but the first.
        if [ $ITERS -eq 1 ]; then
            IFS=":" read -ra chunks <<< "$line"

            # If this is not a "STAT items:<slab>:..." row, skip it
            if [ ${#chunks[@]} -lt 3 ]; then
                continue
            fi

            SLAB="${chunks[1]}"

            # Search this slab for the keys it contains
            echo "stats cachedump $SLAB $KEYLIMIT" | memcache_netcat | while read row; do
                # Extract the key name from the "ITEM <key> ..." row
                KEY=$(echo "$row" | tr -d '\b\r' | sed 's/^.\{4\} \([^ ]*\).*$/\1/')
                if [[ $KEY = "END" ]]; then
                    break
                fi

                # If the key matches the search, delete it
                if [[ $KEY =~ $1 ]]; then
                    memcache_delete "$KEY"
                fi
            done
        fi
    done
done

echo "DONE."
echo ""
Usage example
To use this script, make sure it is executable and pass a regular expression in as the first argument. As an example, let's invalidate all keys that start with 'session':

chmod a+x memcache_invalidator
./memcache_invalidator ^session

Here is some sample output, showing that we have deleted two keys (that I added for testing purposes) from local memcache:

jonnu@onion:$ ./memcache_invalidator ^session

Invalidating keys on:
Searching for : ^session

DELETING: session_9fc9575c7eb47fbcdb39c2a872ea74d8
DELETING: session_2bdace452a1904970c457f7ddfd6a132
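
Under the hood, each DELETING line corresponds to one command of memcached's line-based text protocol, where requests are terminated with \r\n. Here is a minimal sketch of the payload the script pipes through netcat – no server involved, and the key name is purely illustrative:

```shell
# Build the raw memcached "delete" request for a key. The protocol is plain
# text with CRLF line endings; this only constructs the payload.
memcache_request() {
    printf 'delete %s\r\n' "$1"
}

# To send it for real, pipe it into netcat, e.g.:
#   memcache_request session_abc | netcat -q 1 localhost 11211
memcache_request session_abc
```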


Suggestions on how to improve this tool are welcome – either comment here, or fork the project and send me a pull request on GitHub.


Why I Love Earthbound (Mother 2)

For those that know what Earthbound is, you may have noticed that it has quite the following online. ‘Following’ doesn’t fully explain it – mentioning the series awakens a feverish and almost cult-like devotion amongst the legions of fans that can be found on the internet. I have always been fond of the game myself and always list it as one of my all-time favourites, but how has the game become a cult classic when it sold so poorly when it was released?

First, let me rewind for those few that do not know what Earthbound is. ‘Earthbound’ is the English name given to the second game in the Japanese ‘Mother’ series of video games. It was released in North America in 1995 for the Super NES. The Mother series was created by Shigesato Itoi for Nintendo. Mr. Itoi is also notable for having been the voice of Tatsuo Kusakabe, the father of the protagonist in the Japanese version of “My Neighbor Totoro” (which is coincidentally one of my favourite movies). There are three games in the series: Mother, Mother 2 (known as Earthbound in the West, and the one I am focusing on here) and Mother 3. Earthbound is unique in that it is the only one of the three that saw an original release outside of Japan.

The game follows a boy named Ness, who is contacted by a time-travelling alien from the future about a powerful being that has taken over the universe named Giygas. The life-form (who appears to Ness as an insect named Buzz-Buzz) instructs Ness to destroy this menace in the present before it has the chance to become too powerful. Ness sets out on a journey, seeking out three friends and visiting eight special ‘sanctuaries’ in the world in order to become powerful enough to defeat Giygas.

“So”, you may ask, “Why does the game have such a devoted following?” Well, the game is quirky – almost to a fault. It forgoes the standard RPG tooling of a fantasy world, swords, demons, princesses, magic and a foreboding atmosphere, and replaces it with a light-hearted look at the West through a lens that is heavily coloured by pop culture. For this reason alone the series is close to unique, and the world in which Earthbound takes place is disarmingly endearing. Set in the 1990s and in a world that feels distinctly American (called ‘Eagleland’), the cast of characters use childhood toys and household objects in lieu of swords and magic to make progress in their quest. In place of battling grotesque monsters, the party face stray dogs, bizarre inanimate objects come to life and other denizens driven mad by Giygas’ influence.

The game leaves many questions open to the interpretation of the player. The complex interpersonal relationships between the characters seem almost out of place in a game where the cast of protagonists are all middle-schoolers, but they enrich the world and make it more believable. Why does Ness only interact with his father via the telephone? Why is Pokey so obnoxious and passive-aggressive towards Ness? What is the relationship between Ness and Paula? Having to call home to speak to Ness’ Mom to alleviate ‘home sickness’ is another great touch. The game world is filled with rich ideas.

Of all the RPGs I played on the SNES, Earthbound was one of the few that really stuck with me. It really is like a surreal acid trip in places. The front-facing battle system was initially disappointing, as I had preferred the side-on and isometric styles that ‘Final Fantasy’ and ‘Super Mario RPG’ respectively employed. That disappointment soon wanes once the dialogue is presented, both in and out of battle – it is witty and quick-paced. The initial ‘breaking out’ into the open world within the sleepy town of Onett is a great experience – the streets are filled with interesting stores and houses, and the citizens are delightfully condescending (you are school children, after all). The lack of random battles in the field is refreshing – instead the local ‘bestiary’, including a gang of punks called ‘The Sharks’ who hang out near Onett’s arcade, chase after you.

Difficulty curves are very difficult to balance in RPGs, and they often fall into the trap of either being too grindy in nature or an absolute cakewalk. Earthbound managed to walk the line very well, despite having a couple of encounters that veered towards the ‘That One Boss’ trope (I am looking at you, Diamond Dog & the lights-off event at Fourside Dept. Store). The storyline is varied and always engaging, the environments that the party visits are vibrant and the dialogue presented by the citizens of Eagleland never fails to raise a smile. It is a genuine delight from start to finish, even if it is close to two decades old.

Getting hold of the game cheaply can be difficult. It is highly prized by collectors due to its relatively small run (140,000 copies in North America). Copies on eBay still regularly go for upwards of $200 USD, and if you have a complete copy with the box, manual, guide and fabled “Scratch ‘n Sniff” cards, you could be looking at around $1,000 USD. Prices have also been driven up by the lack of availability of the game on services such as the Wii’s Virtual Console. Due to the repeated riffing on pop culture (and almost blatant plagiarism when it came to the soundtrack) it was deemed unfit for re-release in its current state.

So there you have it. A game that stands the test of time and one that all RPG fans should play. From the family of Exit Mice (It’s a very smart mouse), the house at Beak Point, the cult of Happy Happyism and Ness’ lazy dog through to the village of Mr. Saturns at Saturn Valley and swapping a lost contact lens for a pair of dirty socks – it oozes charm and is comfortably one of the best console RPGs available. Now, where did I put my copy of Mother 3…


PNGs & browser colour management

Subtle colour differences in hex #3FA868 between browsers

Ahh, the joys of colour management. Within the realms of web development, managing colour can be a real pain. It is a well-known fact that browsers are already guilty of subtle variations in how they render web pages, but it can be true of how they render colour too.

You might have noticed when saving PNGs that the colour varies ever so slightly between different browsers (usually Firefox). The image above shows what should be #3FA868 between Chrome and Firefox, both running on Mac OS X. For designers who like their websites to look the same in all browsers, this is evidently a problem, and more so when trying to blend an image into a background colour.

The problem stems from how each browser handles colour. Images often come with something called a ‘colour profile’ embedded within them which allows displays to be calibrated in order to give the best colour. In the above example, the difference is caused by Firefox rendering the image with the colour profile, whilst Chrome opts to ignore it. You can, if required, turn this on in Chrome.

There are two solutions. If you are using Adobe Photoshop, ensure that PNGs are saved using the sRGB colour profile (under ‘Save For Web’). The second, and my preferred method, is to strip out the colour profile from the PNG file. This has the added bonus of shrinking the file size (sometimes as much as 25%).

This is done with a command-line tool called pngcrush. If you are not comfortable with the command line, there is a GUI alternative that embeds pngcrush's functionality, called trimmage (also available for Windows).

Stripping out an image's colour profile can be done with the following command:

pngcrush -q -rem gAMA -rem cHRM -rem iCCP -rem sRGB oldfile.png newfile.png
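
After running pngcrush it is worth verifying that the chunks are actually gone. PNG chunk names (iCCP, gAMA, cHRM, sRGB) appear as plain ASCII inside the file, so a quick-and-dirty binary grep is enough (a rare false positive is possible if the same bytes occur in compressed image data; the helper name here is my own):

```shell
# Returns success if the named chunk's ASCII tag appears anywhere in the file.
# -a treats the binary file as text; -q suppresses output.
png_has_chunk() {
    grep -aq "$2" "$1"
}

# Example (filename illustrative):
#   png_has_chunk newfile.png iCCP && echo "iCCP chunk still present"
```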

After performing this many, many times, I found it quite labour intensive. This is mainly due to pngcrush refusing to write over the old file in place – the source and destination cannot be the same file. To bypass the monotony, I wrote a quick shell script that converts all PNGs in the current folder:


#!/bin/bash

echo " "
shopt -s nullglob
for file in ./*.png; do
	echo "Working on $file"
	pngcrush -q -rem gAMA -rem cHRM -rem iCCP -rem sRGB "$file" "$file.tmp"
	mv "$file" "$file.old"
	mv "$file.tmp" "$file"
done

echo " "
echo "Complete."
echo " "

# Remove old files
rm -f ./*.png.old

I tend to save this in a folder listed in $PATH (such as /usr/local/bin) for easy access – et voila!  PNG-based headaches are now a thing of the past.  Unless of course, you want to start a discussion on transparent PNGs and IE6…

Written by .

OS X, dot underscore and .DS_Store

Like most developers these days I consider myself platform agnostic.  This has led me into several jobs where I develop exclusively on OS X, but store and stage work on non-AFP server volumes.  Of course this is no problem but it does come with its own set of idiosyncrasies.

One such annoyance is the automatic creation of ‘dot underscore’ files.  These metafiles quickly litter any non-HFS+ formatted drive (a common issue in a mixed-platform environment), and are irritating when it comes to tasks such as version control (although you can have them ignored) and archiving directories with tar. By far the easiest method for disposing of these files in OS X is via a tool called BlueHarvest.  The only downside is it’ll cost you – $14.95 USD at the time of writing.
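
For the tar case specifically, you can sidestep the problem at archive time by excluding the metadata files. A minimal sketch (the demo directory is created here purely so the example is self-contained; real paths will differ):

```shell
# Build an archive that skips AppleDouble (._*) and .DS_Store files.
mkdir -p project/assets
touch project/app.js project/._app.js project/assets/.DS_Store

tar --exclude '._*' --exclude '.DS_Store' -cf project.tar project
tar -tf project.tar    # no ._app.js or .DS_Store in the listing
```

On OS X itself, I believe setting COPYFILE_DISABLE=1 in the environment also stops tar generating ._ entries from extended attributes in the first place.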

So, is there a free alternative?  Well, yes – it does however involve a little bit of work.  You can recursively remove the offending dot underscore files with the following one-liner:

find . -name '._*' -print | xargs rm

If you find that you are having issues due to files containing spaces, you can get around this by using the following command instead:

find . -name '._*' -print0 | xargs -0 rm

This finds all files recursively that match the glob pattern ‘._*’ and prints them, then each result is piped through ‘rm’.  The other common sight is the .DS_Store file, which holds Finder metadata (the ‘._*’ files are the actual AppleDouble resource forks).  These can be removed using the same snippet as above (switching ‘._*’ for ‘.DS_Store’), or you can stop OS X creating .DS_Store files on network volumes in the first place with this snippet:

defaults write com.apple.desktopservices DSDontWriteNetworkStores true
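
Both kinds of metadata file can also be swept in a single pass; the null-delimited form keeps paths containing spaces safe. A sketch (the demo directory exists only to make the example self-contained):

```shell
# Create a small demo tree containing both kinds of metadata file
mkdir -p work/sub
touch work/._thumb 'work/sub/.DS_Store' 'work/sub/my file.txt'

# Delete ._* and .DS_Store in one null-delimited pass
find work \( -name '._*' -o -name '.DS_Store' \) -print0 | xargs -0 rm -f

find work -type f    # only "my file.txt" should remain
```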