“A wholly original kind of app”: The Information Architecture Behind myFry

My esteemed colleague Flo described it as what happens when “the stars align”; I separately described it as a ‘perfect storm’. This week Penguin launched myFry, an iPhone app that we at Dare had been asked to create for Stephen Fry.

This was one of those projects that sets sparks flying the moment the brief comes in. We were already engaged on another app (this time more iPad than iPhone) for an established Penguin talent when word reached us that Penguin would like us to look at this Fry work – in an incredibly short space of time. As the days and weeks progressed we were fortunate (and I know this all sounds gushing) to be working alongside a client in Jeremy Ettinghausen who not only profoundly understood his client [Fry] but equally understood the capabilities and talents of his staff and his agency [Dare].

The first step was to meet and discuss with information artist Stefanie Posavec, who was employed by Penguin as a cover artist and whose extra-curricular work had caught everyone’s imagination. We discussed the taxonomy of the manuscript, the experience of Stephen Fry’s writing and ultimately the opportunities for an interaction. Armed with sketches and notes, I began by laying out a visual of the manuscript, demonstrating how this could be tagged and chunked, how these chunks could form navigable elements and how we could represent relationships through the text.

This work went back to Stefanie, Jeremy and others at Penguin, from which Stefanie produced conceptual visuals and the hard work of beginning to read, analyse and tag the book began. As this was an editorial task, it was best left to the publishing team (and Flo – see his tagging output), but it left me (and I suspect Stefanie) nervously awaiting the outcome; ultimately the visualisation and interaction would be profoundly affected by how the book was re-cut. It is important to note here that nothing was removed from the manuscript or changed in ordering.
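The constraint above – chunks that are tagged and navigable, with relationships expressed through shared tags, but with the manuscript itself never removed from or reordered – can be sketched as a data model. This is purely illustrative (the chunk texts and tag names are invented; the real tagging scheme was Penguin’s), but it shows the principle: related passages surface by tag, not by moving text.

```python
from dataclasses import dataclass, field

@dataclass
class Chunk:
    """One navigable slice of the manuscript."""
    index: int                 # position in the book -- never reordered
    text: str
    tags: set = field(default_factory=set)

def chunks_with_tag(chunks, tag):
    """Relationships through the text: passages linked by a shared tag."""
    return [c for c in chunks if tag in c.tags]

# Hypothetical chunks, in original manuscript order
book = [
    Chunk(0, "An early memory...", {"childhood"}),
    Chunk(1, "On language...", {"wordplay"}),
    Chunk(2, "School days...", {"childhood", "wordplay"}),
]

print([c.index for c in chunks_with_tag(book, "childhood")])  # [0, 2]
```

Filtering never mutates `book`, so the original order is always recoverable – the re-cut is a view, not an edit.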

In the meantime, I began to work with Ron and Luke at Dare (both visual designers) to start to refine the interface layer, understand the interaction journeys and turn so-far static PSDs into beautifully engaging experiences. All the time we were working on this, Penguin and Flo were tagging away and the tech guys (James, Joe & Perry) were astonishingly already producing prototype experiences.

These tech prototypes were incredibly important to the project. They enabled us to see how the click-wheel navigation would work. The detail required in understanding how many ‘spines’ we could fit on the wheel while keeping it usable for the majority of people on an iPhone was crucial to determining how many sections the book would be separated into. In prototyping this work the guys creatively developed the liquid ‘bounce’ feedback on the wheel, which told the user where their finger was in relation to the wheel – something we were unlikely to solve in static IA/IxD.
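The spine-count question is, at heart, a geometry problem: each segment’s outer arc has to stay large enough to hit with a finger. A rough back-of-the-envelope sketch of that trade-off might look like the following – the radius and the 44pt minimum touch target are my assumptions (44pt being Apple’s usual guideline), not figures from the project.

```python
import math

def max_spines(wheel_radius_pt, min_arc_pt=44.0):
    """Upper bound on wheel segments whose outer arc still meets a
    minimum touch-target size. Illustrative only: real usability testing
    (as on myFry) will land on a lower, comfortable number."""
    circumference = 2 * math.pi * wheel_radius_pt
    return int(circumference // min_arc_pt)

# e.g. a wheel filling most of a 320pt-wide original-iPhone screen
print(max_spines(150))  # -> 21
```

The arithmetic gives a ceiling; prototyping in hand, with real thumbs on real glass, is what actually decided how many sections the book became.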

As the days and weeks passed I continued to refine the screen flows for the app, demonstrating every single interaction from the point of clicking the app icon, through reading in each conceivable scenario, to managing the internal app settings. We toyed with ideas such as a horizontal histogram view (rotate your phone to landscape to view the wheel stretched out in linear fashion) and we experimented with section-to-section navigation, but each of these experiences was debated and eventually dropped. It is important to say that we thought as hard about the stuff we dropped as we did about the stuff we kept.

I continued to learn about the limitations of iPhone development: (detail bit coming up) changing values in the central Settings app doesn’t affect the app until it is next launched, which means toggles rather than realtime action buttons, for example. As the real data came in, our experts in tech began to work out a way of tying the design/IxD visuals together with the data to make it work, and I performed some Excel analyses on the outputs to establish how the live data would affect the visualisation patterns. Over time my role switched to reviewing and tweaking interactions, consulting to ensure that the final build stayed close to the original concept and, eventually, on the 10th September (my birthday) the app was approved.
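The kind of spreadsheet analysis described above – checking how the live tag data would shape the visualisation – amounts to a per-section frequency count. A minimal sketch, with invented section names and tags (the real categories were Penguin’s editorial choices):

```python
from collections import Counter

def tag_density(sections):
    """Per-section tag counts: the pivot-table view that reveals which
    parts of the wheel will render busy and which will look sparse."""
    return {name: Counter(tags) for name, tags in sections.items()}

# Hypothetical tagging output, one tag occurrence per list entry
sections = {
    "section 1": ["childhood", "childhood", "wordplay"],
    "section 2": ["wordplay"],
}

density = tag_density(sections)
print(density["section 1"]["childhood"])  # 2
```

Summaries like this flag outliers early, before skewed data distorts the finished visualisation.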

In general, it simply could not have happened this way without the enthusiasm, collaboration and skill of everyone involved. In my vocation we spend a stupid amount of time redefining our job titles: IA/IxD/UX etc. etc. What I can say is that this job gave me full exposure to just about everything we think of in this sphere: true information architecture with the taxonomic analysis of the manuscript, cutting-edge interaction design and strategic experience planning; it had it all. That all this was done for a client in Penguin and a subject in Stephen Fry who were as enthusiastic and involved as we were made it doubly exciting. The icing on the cake? Stephen’s contact with Messrs. Jobs and Ive, experience and design royalty, ensuring that my work, our work, is on their radar.

I left Norwich Union (now Aviva) nearly three years ago to join Dare for the express purpose of doing work like this; long may it continue.


Note: As of writing, within five hours of launch, the app is already No. 4 on the iPhone Top Grossing chart.

Credits: Everyone involved in this project is writing about it and being gloriously magnanimous and humble in their praise for the team:

Official press stuff: Campaign wrote a simple summary and the Press Association have a piece too. As a long-time reader of Infosthetics, I’m delighted that they have picked it up too.
