Author Archives: John Gibbard

Continuous Improvement?

A little thing I did this afternoon.

I don’t Like numbers

There’s a not-unreasonable perception that agencies are staffed with people who are preoccupied with the new and trendy. At Dare it’s fair to say there are gentlemen who don’t wear socks. There are girls who were in maxi-dresses last year and preposterous jumpsuits this year. There are even people who wear hats and scarves in June. But that’s just textiles. When it comes to digital, I’d challenge you to find a more cynical bunch. Propose an idea in an internal meeting to make use of the latest in social networkery and you’d better have some solid evidence to prove it works, because you are going to face a barrage of critical analysis from 24-year-old juniors to 45-year-old seen-it-alls.

With this in mind, it’s been a matter of debate this week that the sheer volume of requests we get from clients asking us to take them (for example) from 10,000 to a million fans on Facebook is getting to be a bit of a problem. I mean, how important to a brand is a Facebook fan?

Consider an example from Taco Bell in the States. During a spell of bad news for the Mexican fast-food chain, when it was challenged over the volume of beef in its tacos, the Yum! brand reached out to its fans as part of a $4m ad campaign. Presumably these people – who, let’s remind ourselves, had actively said they were fans of the brand on Facebook – would be up for some positive activity? Over the period in question the fanbase had swelled from 500k to 6 million. Jonathan Blum explains that they offered these fans free food: hey friend, come get some free tacos, on the house.

What happened was that just 200k people did. That’s 3%.

97% decided not to take-up the offer of free food from a brand they liked.

When you can’t even give away free food to people who like you, how can you possibly expect people to pay for it? But we’re being asked to generate bigger fan numbers with the assumption that this equates to more sales somewhere down the line.
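
The arithmetic behind those figures is trivial but worth laying out, since the whole argument rests on it. Here’s a quick sanity check using only the numbers quoted above (roughly 6 million fans, roughly 200k redemptions):

```python
fans = 6_000_000        # approximate fanbase at the time, per the figures above
redeemed = 200_000      # approximate number who took the free-taco offer

take_up = redeemed / fans
print(f"Take-up: {take_up:.1%}")            # ~3.3%, which rounds to the 3% quoted
print(f"Walked away: {1 - take_up:.1%}")    # ~96.7%, the '97%' in round numbers
```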

So, what’s at work here? Why did so many people look the gift horse in the mouth and walk away? What are the implications?

Friction
I don’t know how the mechanic was resolved logistically. It would have to be easy to redeem the offer. If you’re on Facebook it’s not that likely you’re in Taco Bell right then or perhaps even in the frame of mind for a taco. You might see the offer but unless it’s promoted in-store and can be obtained and redeemed at the point of transaction then there’s sufficient friction that customers (fans) are less likely to follow through with the process. It doesn’t take an expert in ethnography or service design to see that printing a voucher at home and remembering to take it next time you get a taco is a bit clunky.

Mixed messages
On the one hand there’s a lawsuit alleging your food is only 50% beef and at the same time you’re offering it up for free. What does that say about the quality and the value you place on your product? Does it display confidence in your taco or does it smack of desperation?

Like ≠ like
Perhaps we don’t actually care that much about the brands we like. Like has become a substitute for ‘join in’ on Facebook. In order to see or interact with content we have to Like it. We might not actually like the brand but are just intrigued to see what the fuss is about – Like has become nothing more than a threshold. The freedom with which likes are handed out has devalued them; it was presumed that the peer pressure of being seen to like something you clearly wouldn’t would act as a moderator. For example, I might want to see what all the fuss is about Justin Bieber’s page, but to like him would be to broadcast that to much derision amongst my friends. The reality is that I can hide this like instantly to spare my blushes, yet it still counts as another positive vote for the precocious little twerp. Many of those 6m Taco Bell fans aren’t real fans then; they’re just people who had a passing interest or were perhaps mindlessly clicking away on anything they recognised. What would scarcity do to modify this behaviour? Perhaps if we could only issue three likes per week we might think more carefully about where we used them.

They just didn’t see it
Data at AllFacebook.com suggests that the magic of EdgeRank (Facebook’s ostensibly intelligent method of prioritising content for you) means that only 3%–7.5% of fans actually see business page posts in their feed. The reasons for this are not entirely understood (EdgeRank isn’t transparent), but brands that post infrequently, or with low-engagement content, aren’t going to be helping themselves.
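
Facebook has never published how EdgeRank works, but the model usually described at the time is a sum, over every interaction (‘edge’) between a user and a page, of affinity × weight × time decay. The sketch below is purely illustrative – the field names, weights and the exponential half-life are my assumptions, not Facebook’s – but it shows why an infrequently posting, low-engagement page quietly drops out of the feed: old, unloved stories score close to zero.

```python
import time

def edgerank_score(edges, now=None, half_life_hours=24.0):
    """Illustrative EdgeRank-style score: sum of affinity * weight * time decay.

    Each edge is a dict with hypothetical fields:
      affinity - how often this user interacts with the page (0..1)
      weight   - action-type weight (comments > likes > impressions)
      created  - unix timestamp of the interaction
    The exponential half-life is a guess; Facebook's real decay curve is unknown.
    """
    now = now or time.time()
    score = 0.0
    for edge in edges:
        age_hours = (now - edge["created"]) / 3600.0
        decay = 0.5 ** (age_hours / half_life_hours)
        score += edge["affinity"] * edge["weight"] * decay
    return score

# A rarely-touched page posting stale content barely registers against a fresh, engaging one:
stale = [{"affinity": 0.1, "weight": 1.0, "created": time.time() - 5 * 24 * 3600}]
fresh = [{"affinity": 0.8, "weight": 4.0, "created": time.time() - 2 * 3600}]
print(edgerank_score(stale), edgerank_score(fresh))
```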

But but but…
What about those people that did take up the offer and do see the posts? Their numbers may be small but are these the mavens and connectors, the influencers? Before we entirely dismiss the idea of the fan we should at least acknowledge the benefits of the engaged superfan.

Part of the problem is that Facebook’s irritatingly quantifiable. Chief Marketing Officers and their subordinates can hang their targets on tangible numbers – more fans, beat our competitors, more likes than last year, etc. This is data they can see daily; it’s not something they need to commission research for or wait until the next quarter. He or she can log in at home or in the office and beat their agency with a stick as the numbers rise and fall in real time. Trouble is, the acquisition of these fans costs money (c. £10 per fan in marketing spend, some say) – even more when you’re honest and realise that something like 1 in 10 fans is likely to be a truly engaged one.
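
To put the CMO’s dashboard numbers in perspective, the sum is brutally simple. The figures below just restate the estimates quoted above (c. £10 of marketing spend per fan, roughly 1 in 10 fans genuinely engaged) – back-of-the-envelope, not research:

```python
cost_per_fan = 10.0      # rough acquisition cost per fan in GBP, as cited above
engaged_ratio = 0.10     # roughly 1 in 10 fans is a truly engaged one

cost_per_engaged_fan = cost_per_fan / engaged_ratio
print(f"Effective cost per engaged fan: £{cost_per_engaged_fan:.0f}")   # £100
```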

It’s because of these superfans that I’m reluctant to call bullshit on the whole Facebook thing. It’s still important to (ahem) fish where the fish are. There are plenty of us swimming about in the big blue sea, but just trying to get loads of us into your net and assuming we’ll all eat your taco… (this isn’t working is it?)

Let’s just teach some of this to our clients and make sure that proposals aren’t about numbers but about genuine engagement, conversations that are acted upon and voucher-redemption activities that are as free from friction as possible. Let’s not go around issuing desperate calls for people to share; let’s think instead about strong, scarce engagement ideas for some brands and tactical offers in volume for others. I’m glad I’m surrounded by so many cynics, I just wish they’d wear better clothes.


Carrots and Sticks Wielded at the RSA

In doing a little research into behavioural change theory as part of my day job I came across this wonderfully brief talk that Ian Ayres gave at the RSA back in April. I’ve been toying with carrots and sticks (I think both approaches can be wonderfully split-tested online) in my own work, particularly around financial services. However, Ian introduces the idea of the anti-incentive and it’s a bit of a head-scratcher that I’m going to spend some time exploring for my clients. I think it has some potential, but it’s perilous: implement it incorrectly and you could be setting yourself up for quite an outlay. So, without further ado, take a moment:

> More on anti-incentives found by Liz Danzico


Press Here to Play

For some reason this new bit of work from my colleagues here at Dare has me clicking the video ‘play’ call-to-action a lot. I still haven’t got to the bottom of whether that was intentional or not…

Wood Preservation Society: Participation Marketing At Its Worst

It’s been a great and sunny weekend in this Sceptred Isle. Britain has come out (well, at least in this rather Royalist corner of Surrey) in red, white and blue. There’s a frisson of wedding buzz and I even saw several amorous middle-aged couples dry-humping in the evening sun on the grassy banks of the Thames as I enjoyed my first post-marathon run last night. A weekend of BBQs, Zinfandel rosé and wall-to-wall sunshine will do that to our repressed Northern Hemisphere blood.

Imagine how sad it was, therefore, to find it all come crashing down last night during the ad break in The Suspicions of Mr Whicher on ITV. To the unmistakably British melody of ‘The Self Preservation Society’, written for The Italian Job, the British TV viewer was subjected to the latest ‘wood preservation society’ advert from either Cuprinol or Ronseal (so memorable that I couldn’t remember which one it was). It encouraged people to sign up to their (it turns out it’s Cuprinol) Facebook page and, presumably, to tell lots of stories about painting and preserving external wood.

This is Britain at its worst. A mind-bogglingly stupid consequence of a creative idea that spun out of that song from The Italian Job. The incentive (there HAS to be an incentive, because clearly nobody in their right mind cares that much about external wood preservation) is to win a shed or something. But of course you do benefit in other ways: ergo … tips … (there’s always a market for tips, isn’t there? “Try using a brush!” “Consider sealing your wood when it isn’t raining!”) and inspiration (“Why not use green instead of brown, or brown instead of green if you prefer!”) and the worryingly vague ‘other updates’, all lurking behind a terrible stock shot of a yummy mummy in pink gloves watering her pansies. Presumably they have people hanging on the line for the latest in ICI’s wood-preservation solvent technology.

I hate Facebook participation campaigns so very much now.

EDIT: I have since discovered this ‘idea’ has been around since 2009, and, as this post explains even more acerbically, it was pretty lame the first time around…


Marathon: What went wrong?

I have waited almost a week to write this post. I’ve composed at least some of it many times over, starting close to midnight last Sunday, 17th April. So, what went wrong on marathon day?

Firstly, it’s important to state that for many people 3:57:32 is a perfectly reasonable – even impressive – time in which to complete a marathon. Friends, family and colleagues have assured me that ‘sub 4 hours’ seems pretty good. Only it wasn’t. I don’t see myself as an average amateur, and I didn’t at any point through my training. I knew this was training that saw me run a personal best (PB) of 1:42 for the half marathon just 12 weeks after starting the programme on Boxing Day, 26th December 2010. Plugging that time into the McMillan running calculator had given me a predicted 26.2 time of 3:35:28. From that, an A-goal target of 3:40:00 was set (with a B goal at 3:45:00) and subsequent training runs were tweaked to aim for a 5’11″/km (8’24″/mile) marathon pace. I wasn’t going to shoot for the 3:35; I was playing it safe with an achievable goal.
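
For anyone wondering how a half-marathon time turns into a marathon prediction: McMillan’s own pace curves aren’t published, but the usual back-of-the-envelope equivalent is Riegel’s power law, T2 = T1 × (D2/D1)^1.06. The sketch below is an illustration under that assumption, not McMillan’s actual method; it comes out a couple of minutes quicker than the 3:35:28 McMillan gave me, which only reinforces why a conservative 3:40 A goal felt sensible.

```python
def riegel_predict(known_time_s, known_dist_km, target_dist_km, exponent=1.06):
    """Predict a race time from a shorter race using Riegel's power law.

    Not the McMillan calculator's own model (which is unpublished pace-curve
    data); the 1.06 exponent is the commonly quoted assumption.
    """
    return known_time_s * (target_dist_km / known_dist_km) ** exponent

half = 1 * 3600 + 42 * 60                        # 1:42:00 half marathon, in seconds
marathon = riegel_predict(half, 21.0975, 42.195)
h, rem = divmod(int(marathon), 3600)
m, s = divmod(rem, 60)
print(f"Predicted marathon: {h}:{m:02d}:{s:02d}")   # roughly 3:32:40
```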

What happened?
A familiar tale to those of you who have spoken to me this week, but by mile 1 I was 2 minutes outside A-goal pace and hugely frustrated by the congestion. I was tense and feeling it more than I usually do, right the way through to at least the 1-hour mark – concentrating on not weaving too much or getting slowed down. By mile 13 it was hurting and I was at 1:53:22 – about 9 minutes down. Descending to Docklands I was feeling miserable, experiencing unfamiliarly heavy legs and ‘knowing’ I was too slow; at 17 miles I pulled up, gave in to the central governor and walked for two minutes, stretching off and then running a clean mile or so before pulling up again. At that point the pattern was set. Over the next ten miles I think I walked about 5–6 times (the Runkeeper data is unclear) for various spells, all under 2 minutes. By then the damage was done.

On the day I did too many things differently
Music: In almost all of my training runs I have run with music or a podcast. On marathon day I’d heard and read that the atmosphere is such a huge part of the event that it would be almost churlish to run with headphones in. So I didn’t. I didn’t take strength from the crowds so much as I missed the monotonous pounding of generic dance music that, throughout my usual training runs, helped me dissociate from what my body was saying.

Fuel: It was a hot day (more on that later) and, despite knowing the course was peppered with Lucozade and water, I wanted to stick to what I’d trained with (sensible), so I ran with 500ml of Lucozade Lite and aimed to drink most of it by mile 13 and all of it by 20, relying on water and regular on-course drinks for the final 6. As it happens I just didn’t drink regularly enough in the first half of the race: by mile 10 I had most of my drink left, and by 13 probably over half remained. Coupled with this, I added a little flexibility to my gel strategy too, taking the first gel around 70 minutes and then not really knowing what to do about the remaining gels, taking them roughly 25 to 40 minutes apart. On my long runs I had been disciplined about taking them every 30 minutes from the 70-minute mark. Perhaps this lack of hydration and possible bonking (depletion of glycogen stores and reliance on fat burning) was making my legs feel so damned heavy?

Walking: I never walked in my training runs. Even on runs that averaged faster than I ran on marathon day and on runs that exceeded 20 miles. Once I’d given in, it was psychologically impossible not to fall into the pattern of walk-run. It might be fine by Jeff Galloway but it’s not fine by me; I signed up to run 26.2, not walk it. Perhaps it allowed me to finish at all – I’ll never know.

Panic: The start was not good. My position in pen 6 (of 9) was because I’d underestimated my finish time when I applied two years ago. It meant I was running with people who were not sub-4 runners, and it meant that ahead of me was a group of runners all linked together for charity. Within 600m of the start line we were standing still as people peeled off to pee. When mile 1 ticked by I was already stressed about an even-paced run, and that tension remained through the congested period of the first few miles. Tragically, my data shows I cleared my first 5km in 28 minutes and the 10km in 54 minutes: I was slow, but not crawling like I felt. Still, that stress and weaving (as well as the odd stride to clear the pack) had taken its toll.

External factors
These are not excuses, more contributory factors. I’m convinced my own mistakes (above) were more important to the final time, but those mistakes were caused, in part, by the following:

Heat: The temperature rose steadily throughout the race to a maximum at 13:45; Wolfram Alpha shows it peaked at 19°C (66°F), though I believe I heard on the day that it went higher in central London. My two longest long runs were run in the range 5–13°C (40–55°F). Tom Williams on Marathon Talk mentioned, after the high temperatures the week before at the Brighton Marathon, that heat should only account for c. 6 minutes’ variation in time, based on results in the hot 22°C (72°F) 2007 London Marathon. Clearly I wasn’t used to it and I reacted badly; that was then compounded by poor hydration early on – something my attempts later in the race to take on water more regularly failed to correct.

Congestion & Crowds: I should have run an organised race as part of my training plan. I didn’t, and I’d forgotten what it’s like to run both behind and around fellow runners; even compared with my experience of other events, this race was particularly bad. The first miles out of Blackheath are heavily congested and do not always use the full width of the road. There are toilets 600m from the start, which creates a bottleneck; there were runners with slow predicted times running in groups; and there were no pacers in my pen – I had assumed I could rely on running with a pacer to keep me in check for the first half.

Nike+: I use the sensor (non-GPS) version, and the short strides in the first part of the race meant it wasn’t tracking my pace properly, so it counted off the first mile well before I reached it. It was then inaccurate for the remainder of the race, right the way through to the point where it packed up entirely on my iPod Nano (6th gen) after just under 30km, thanks to too much sweat and an ‘end workout’ action registering that I never actually gave. In all of my training runs I’ve used this – well calibrated – to judge my pace. I had nothing giving me clear data on the day, and that made me feel I was too slow. As it happens, my Runkeeper app was chugging away in my back pocket and recording that I was running some great splits: from 5km to 10km I was under 5:16/km, and I peaked at 4:56/km (under 8-minute miles) at the 13km (8-mile) mark. In future I will not run a marathon without a GPS watch.
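
For readers who think in minutes per mile, those Runkeeper splits convert as follows – a trivial sum, but it’s exactly the check I’d have loved a GPS watch to do for me mid-race (the conversion factor is just kilometres per mile):

```python
def per_km_to_per_mile(minutes, seconds):
    """Convert a pace expressed in min:sec per km to min:sec per mile."""
    km_pace_s = minutes * 60 + seconds
    mile_pace_s = km_pace_s * 1.609344          # km per mile
    m, s = divmod(round(mile_pace_s), 60)
    return f"{m}:{s:02d}/mile"

print(per_km_to_per_mile(5, 16))   # 5:16/km -> about 8:29/mile
print(per_km_to_per_mile(4, 56))   # 4:56/km -> about 7:56/mile, i.e. under 8-minute miles
```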

Final thoughts
So it wasn’t atrocious, but it wasn’t at all what I wanted. It’s a day I wanted to enjoy, and I didn’t do it justice. I experienced some amazing moments that eclipse anything I’ve ever done in sport: turning onto Tower Bridge to face the crowds, crying weakly as I got my medal. The crowds are joyous, but I’ll be honest and say that at times you just don’t care; you’re selfishly wrapped up in your own world – even without the headphones.

I know I’ve moaned a bit, even whinged at my performance but reading the wonderful Sir Jog A Lot post this week I feel a little better. A seasoned runner, a pacer for Runners World, also got caught out by a few of the same issues.

I’ll be back to apply for the ballot on 26th April, and I’d consider a charity place if that doesn’t work out. I know I might not get my best runs at London unless the stars align, and I might look outside the city events. But in the meantime I’m back into training, with at least the Humanrace Kingston 16-miler in my schedule for the autumn. And that 1:42 half marathon could do with a little trim too…


Marathon Lessons From The Past 16 Weeks

Opening Titles for the BBC Sport Coverage

I started my 16-week training journey on Boxing Day (26 December) and it culminates on race day on Sunday. Over that time I’ve run, read and eaten all I could to prepare myself for the day. I’ve done that whilst sporting a chronic tendinopathy in my knee, holding down a full-time job, commuting and coping with a bathroom renovation in a one-bathroom house. I’m not saying that this is amazing by any stretch – thousands run marathons each year and many do so with far more obstacles in their lives – but frankly it is because my life is so ordinary, so middle-class, so typical that I thought I could share a few tips. It’s not arrogant to think that at some point friends and colleagues might attempt the same and might come to me for some help (as I did with others), so this is pre-emptive:

Start with a running base. I started a 16-week plan whose first Sunday run was already about 30–40 minutes, and even with slow progression in volume it built to an hour-plus quite quickly. If you can’t yet run non-stop for 40 minutes, or run for 30 minutes 3–4 times a week, then that’s your first port of call before you start marathon training. So, if you’re doing London in 2012, then building your running base through November and December would be wise.

Budget for and find a good sports rehab specialist. Plenty of places offer ‘physio’ and plenty offer ‘massage’. Look for a local person who understands your work obligations and – importantly – understands running. I’ve had four or so sessions of deep-tissue massage, both preventative and curative, at a cost of about £50 each time. Consider it a monthly cost for an otherwise stupidly cheap activity. Even if you don’t feel sore, have one anyway.

Listen to Marathon Talk. Honestly, as much as I read Sam’s book cover-to-cover, the weekly delivery of fresh news, genuine trials and tribulations of fellow runners and the sense of community that they build is inspiring. It reminds you on a run why you run. It’s a podcast but that doesn’t mean you need to own an iPod, see the Marathon Talk site for full details.

Run your long runs s..l..o..w..l..y, or at least easily. This was something I really struggled with. A significant reason for this was my obsession with logging each run on Nike+ & Runkeeper: by sharing the data on Facebook/Twitter I was reluctant to show slow times. This was stupid. Running 2–3 hours at close to marathon pace each week puts a lot of strain on your body for little gain. I got plenty of niggles in my calves, achilles and knees – so much so that I regularly missed my Monday runs in the middle parts of my training. Psychologically, the feeling of extreme fatigue after those runs makes you fear them more too. If I’d treated them as easy sessions and run them at the 5’30″+ pace I should have done, I’d have covered fewer miles in those sessions but more miles overall, and progressively I would have seen more than the 400-metre improvement between my 11-week and 13-week 3-hour runs. I ran my long runs alone or ahead of my wife, but next time I would run with friends and talk along the way to keep the pace easy.

Sam’s training for beginners urges you to run to time in your long runs. That is, run for 3 hours rather than shoot for 21 miles. This was fine for me up until the final few weeks, when I started to realise that three hours was not getting me (even at the faster pace) close to the 22 miles I really believe I should have covered on my longest of long runs. Her ‘experienced’ training programmes do cover miles; I guess it’s fair to say that after 13 weeks I just didn’t feel like a beginner any more (and to some extent I never was).

It could be that post-event I find even more things that I’d like to talk about but I’m not going to promise an update, I’ll just leave this here nicely archived to link-to when someone asks me about it in future.

So, as the build-up continues, how about those of us running get all excited by the BBC’s wonderful opening title film to this year’s event.


Virgin London Marathon: Countdown

This week I’ll try and post a couple of things as I lead in to one of the biggest days of my life on Sunday. There are a large number of people blogging about their marathon this year (estimates put the figure at 20% of this year’s entrants) but these two stood out: Gemma Bardsley’s Beats Per Mile is a well-engineered assimilation of running data from RunKeeper, the music she will listen to, and photos from points around the course taken at the time she may be passing them. It’s no surprise that Gemma and her developers (Marc & Howard) are in the industry; the design is simple, sophisticated and bang on-trend with its single-scroll approach. A great showcase for all concerned.

Secondly, I’m a fan of Foot4ward simply because it’s content-rich. Plenty of regular posts, diverse content (data, stories, cartography) and some charming photos. So many event-motivated bloggers set up a page and post infrequent, repetitive and impersonal material. Once you’ve read one mid-pack runner’s miserable grumbles about training you’ve read a lot of them, so I’m always looking for something a little richer. Foot4ward is just about the only non-elite runner’s blog I’ve added to my Google Reader this year.

There’s a good reason I didn’t start a new blog for this: I simply couldn’t design/code stuff like this, as much as I might sketch it out. I will, however, be revealing my progress live on Runkeeper during the race. More on that later.

I promise this is the last time I’ll mention it

Just a little note to readers new and old that we* won our first award for myFry last night at the Media Guardian Innovation awards. It was nominated in the Mobile App category alongside the well-known iHobo work (which won two other awards last night).

You’ll probably already have seen my thoughts on my time developing the app at Dare, and they still ring true today. Even though I still read the odd critical comment (and then assume it’s naive criticism), the truth is, as the awards recognised and Stephen himself states here, it fundamentally challenges the notion of linear reading for certain texts and it – more than many other alternative digital readers – understood the cultural and physical context in which this book would be read.

And that, said Pooh, is that.

—–

* – Dare, Stefanie Posavec & Penguin

The Waitrose Redesign: Perspective Required

This week eConsultancy’s report on the apparent usability calamity of the new Waitrose site has been widely shared: “New Waitrose website panned by users“. People queued up to take pot shots at this aspirational brand, criticising a range of issues from taxonomy and speed to the apparent non-disclosure of prices.

Several cried out “why wasn’t this tested?”, “they didn’t listen to users” and so on and so forth. Compounding the issue was the revelation that the design ‘cost £10 million’.

An unmitigated disaster, eh? Well no, not in my opinion. Firstly, I think the £10m figure is driving a lot of the bad publicity. The general public, and this is not to patronise, simply do not understand the price of design (c.f. the Olympics 2012 logo). I don’t understand the price of building a new bridge, or of anti-retroviral drugs, and I don’t presume to tell the people in those industries that the cost of such things is too much. For some reason, the great British public assume that design work is just a 17-year-old with Photoshop tinkering about. That completely misses the point that work like this involves high levels of expertise in visual design, logistics, accountancy, information systems, security, project management and so on. It’s massive, expensive stuff. You might redesign a local dentist’s website for £1,000, but really this isn’t even vaguely comparable.

Secondly, it does actually work. To claim it’s “not fit for purpose … beyond fixing” is bonkers. Show me the evidence that no one is shopping on the site, that usage is down, that average basket sizes are down, etc. I suspect you would find the opposite [EDIT 25 March: orders are in fact up by 34% on the previous site, according to The Guardian]. Yes, there are problems. Some of the nomenclature and taxonomy is a little unconventional. Sally pointed out that browsing freezer products was done by brand and not by type, which seems peculiarly specific; most users would at least like the choice to filter by meal or by category (fish, poultry, ready-meal, dessert) and so on. Other glaring errors include the (now fixed) inability to identify sizes or quantities of items like milk and meat.

And then there’s the speed. The speed at which it renders is not great. I’m no developer, so I can only speculate that it could be either an interface-layer issue or one related to pulling items out of the eCommerce catalogue (the back end). To the consumer this distinction is irrelevant: it just takes time, and time-precious consumers get understandably narky. Fixing the speed is critical to the perception of performance.

It infuriates me to suggest that this wasn’t thought about or tested. In our industry, with so much money at stake, it is inconceivable that it wasn’t tested in some way at several points throughout the process. It was designed in part by some very talented user-centred people, and the fact that certain elements have been included (drop-down category breadcrumbs) suggests a user-experience designer’s hand. The key is whether the user testing was sufficiently rigorous, sufficiently real-world and sufficiently analysed to feed back into the design process.

The interactions causing the most concern involve long lists – the heavy-duty users at home doing £100+ shops with many items. In these scenarios they are likely to be juggling multiple threads of activity: searching for goods, ticking them off a paper list, popping to and from the kitchen, considering recipes and so on. Keeping as much of the action (‘add to basket’) visible at the same time as the browse activity is a tricky ask. Often user testing is done in a lab with a user isolated from the context in which they normally perform their activity. It’s not a real shop, it’s a simulated list, and the observations you make can subsequently be quite misleading.

Work like this is so dependent on context that it needs to be stress-tested in real-world situations. That means sample shoppers using a staging version of the site, or a high-fidelity prototype, to do their normal shopping routine. It might have happened here; I speculate that it probably didn’t.

I remember Catriona Campbell of Foviance telling me once about some ethnography work done for Tesco, where they observed online shoppers ordering in bulk from the value range. Actually observing the users in their homes showed that these were consolidated orders for their community, where one person acted as a distributor from a single paid-for delivery. Insight like this rarely comes from a two-way mirror, eye-tracking and a moderator.

Returning to the Waitrose site, I’d urge you not to get caught up in the hype but to actually use the site. The majority of problems cited on the forum seem to be resolvable coding/performance issues, not fundamental interface design issues – by which I mean buttons not working as intended, technical errors and so on. The remaining issues surround a nostalgia for old site features like the jotter. I’ve seen this sort of thing before: when a quirky feature barely anyone used gets removed, the one or two people who did use it take to the web to complain.

I’m not saying it’s brilliant – it clearly needs work – but I personally feel the need to call for some calm and reflection, in light of the fact that passionately user-centred people would have been involved in this and working with the very best of intentions, albeit perhaps without the backing to see it through to final development or the support of adequate contextual user tests.
