Just over two weeks ago we launched our latest piece of work, Concept Lens, at the most excellent Web Directions conference in Sydney, Australia.
Web Directions is Australia’s leading web-focused conference, and Maxine, John and their team make a fantastic effort at doing their bit to help develop the web industry here in Australia.
We approached Maxine back in August about working together to produce a visualisation of the “back channel” conversation occurring during the conference. With today’s technology, that means Twitter for the conversation and Flickr for the photos. I remember Joi Ito presenting the IRC backchannel at eTech on a scrolling LED display on stage back in 2004. Times have changed, and the richness of conversations, topics and visualisations has developed.
Over the month prior to the conference, and between client work, we pulled together our “thumb in the pie” Twitter visualisation. We had seen many abstract pieces visualising Twitter, and while we wanted to make sure we delivered something aesthetically pleasing, it was equally important that the application and its associated visualisation were useful for the audience in gaining insight into the back channel conversations around the event.
Using our tool of choice, Processing, we proceeded to sculpt and sketch some initial ideas of how the visualisation could work. Along the same lines as Berg’s recent excellent article on working in the data mines, we always find that until we play, explore and get our hands dirty with the datasets, we can’t come up with the appropriate design or visualisation. Concepts that we think will work often don’t, and schemas in the data that at first glance look appropriate and complete turn out to be null or incomplete.
Our initial data explorations focused on the Twitter search feed and understanding what was in the data and how we could use it. One of the concepts I have been very keen on since first hearing about it from Jeff Jonas in 2007 is that of “data finds data”. Jeff has written plenty about this topic but here is a brief example, “you have two records that refer to the same person, but you don’t know that they do. Then a third record appears which relates to each of the first two, and which establishes that all three refer to the same person. The first two pieces of data find one another, through the agency of a third piece of data.”
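To make the idea concrete, here’s a minimal sketch of “data finds data” as we read it (our own illustration, not Jeff’s implementation): every record’s attribute values are indexed as they arrive, so a new record that shares a value immediately surfaces the earlier ones, with no rescan of the full dataset.

```java
import java.util.*;

// A minimal sketch of "data finds data" (our illustration, not Jeff Jonas's
// implementation): each record's attribute values are indexed on arrival,
// so a later record sharing a value immediately finds the earlier records.
public class DataFindsData {
    // attribute value -> ids of the records that carried it
    private final Map<String, Set<String>> index = new HashMap<>();

    // Index a record and return the ids of previously seen records
    // that share at least one attribute value with it.
    public Set<String> add(String id, String... attributes) {
        Set<String> related = new TreeSet<>();
        for (String attr : attributes) {
            Set<String> holders = index.computeIfAbsent(attr, k -> new HashSet<>());
            related.addAll(holders);
            holders.add(id);
        }
        related.remove(id);
        return related;
    }

    public static void main(String[] args) {
        DataFindsData d = new DataFindsData();
        d.add("r1", "phone:555-0100");
        d.add("r2", "email:jo@example.com");
        // The third record overlaps each of the first two, so the moment it
        // arrives it connects all three.
        System.out.println(d.add("r3", "phone:555-0100", "email:jo@example.com"));
        // prints [r1, r2]
    }
}
```

In Jeff’s example, the first two records “find one another” exactly when the third arrives, because the third record’s lookups touch both of their index buckets.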
After developing an understanding of the Twitter search result data and the inconsistencies between the schema and actual results, along with getting a “feel” for the data inside the results, we proceeded to sketch out some initial visualisations. A key concept was being able to show data over time and, in this case, the representation we were drawn to was a “zoomable” timeline as a way of pegging the data in time and space. The zoom interface gives the user a macroscope-style tool that lets them see activity over anything from a 10-minute period up to a three-day window, giving a quick visual cue to hot points and areas of heavy discussion.
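A rough sketch of what such a zoomable time-to-pixel mapping can look like (the class and field names, and the linear zoom interpolation, are our assumptions for illustration, not the Concept Lens source):

```java
// A rough sketch of a zoomable time-to-pixel mapping (names and the linear
// zoom interpolation are our assumptions, not Concept Lens internals).
// The zoom level picks a visible window between ten minutes and three days;
// a tweet's timestamp is then interpolated into a screen x-coordinate.
public class TimelineMapper {
    static final long MIN_WINDOW_MS = 10L * 60 * 1000;           // 10 minutes
    static final long MAX_WINDOW_MS = 3L * 24 * 60 * 60 * 1000;  // 3 days

    final long centreMs;    // timestamp shown at the centre of the screen
    final double zoom;      // 0.0 = fully zoomed in, 1.0 = fully zoomed out
    final int screenWidth;  // pixels

    public TimelineMapper(long centreMs, double zoom, int screenWidth) {
        this.centreMs = centreMs;
        this.zoom = zoom;
        this.screenWidth = screenWidth;
    }

    // Visible span in milliseconds for the current zoom level.
    public long windowMs() {
        return (long) (MIN_WINDOW_MS + zoom * (MAX_WINDOW_MS - MIN_WINDOW_MS));
    }

    // Screen x-coordinate for a timestamp; points outside the visible
    // window simply fall off-screen (x < 0 or x > screenWidth).
    public float xFor(long timestampMs) {
        long windowStart = centreMs - windowMs() / 2;
        return (float) (timestampMs - windowStart) / windowMs() * screenWidth;
    }
}
```

Because only the window span changes with zoom, the same tweets re-layout smoothly as the user moves between the macro and micro views.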
Once we had the basic rendering happening, we shifted to the key focus of the project: identifying relationships between the snippets of conversation. Back to the data finds data concept here. While the volume of tweets was by no means huge, we decided to develop an approach that didn’t require the application to revisit every piece of data each time a new tweet came in, just to find relationships that may or (most likely) may not exist.
To this end, we created some structures that supported key lookups, driven by plugins that each understand a specific concept around the tweets (an RT, a conversation thread, a hashtag topic, a keyword, or an author), and let them handle the relationship building as each new tweet comes into the application.
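A sketch of what such a plugin scheme might look like (the interface and class names are our invention; the actual Concept Lens internals will differ): each plugin turns a tweet into lookup keys, and an index keyed by plugin-name:key links each new tweet to earlier ones without any rescan.

```java
import java.util.*;

// A sketch of plugin-driven relationship building as described above.
// Interface and class names are our invention, not Concept Lens code.
interface ConceptPlugin {
    String name();
    // The lookup keys this tweet contributes under this concept.
    List<String> keysFor(String author, String text);
}

// Links tweets that share a hashtag.
class HashtagPlugin implements ConceptPlugin {
    public String name() { return "hashtag"; }
    public List<String> keysFor(String author, String text) {
        List<String> tags = new ArrayList<>();
        for (String word : text.split("\\s+"))
            if (word.startsWith("#")) tags.add(word.toLowerCase());
        return tags;
    }
}

// Links tweets by the same author.
class AuthorPlugin implements ConceptPlugin {
    public String name() { return "author"; }
    public List<String> keysFor(String author, String text) {
        return List.of(author.toLowerCase());
    }
}

class RelationshipIndex {
    private final List<ConceptPlugin> plugins;
    // "pluginName:key" -> ids of tweets filed under that key
    private final Map<String, List<Integer>> byKey = new HashMap<>();

    RelationshipIndex(List<ConceptPlugin> plugins) { this.plugins = plugins; }

    // File a new tweet and return the ids of earlier related tweets --
    // only the buckets this tweet touches are consulted, never the full set.
    List<Integer> add(int tweetId, String author, String text) {
        List<Integer> related = new ArrayList<>();
        for (ConceptPlugin plugin : plugins) {
            for (String key : plugin.keysFor(author, text)) {
                List<Integer> bucket =
                    byKey.computeIfAbsent(plugin.name() + ":" + key, k -> new ArrayList<>());
                related.addAll(bucket);
                bucket.add(tweetId);
            }
        }
        return related;
    }
}
```

Adding a new concept, say a keyword matcher, is then just one more small plugin implementation, which is what makes a half-hour turnaround for new plugins plausible.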
This turned out to work exceptionally well: fast enough to be done on the fly, and customisable enough that we could, and did, implement new plugins with 30 minutes’ work.
The result was a zoomable, timeline-based visualisation of the conversation around the event that enabled the viewer to navigate the large set of tweets and view the relationships between them in an efficient and useful manner.
It’s by no means perfect, but we did achieve the two goals of delivering something that was well accepted at the conference and creating a tool that we continue to find ourselves using to gain insight into specific events such as sporting matches and conferences.
Of course it wasn’t all sunshine and roses: we experienced a lot of pain dealing with the native OpenGL calls required to get any performance out of Processing, especially around text display and, unexpectedly, the rendering of “points” for each tweet. Inconsistent OpenGL problems took up a few days of development time, but we did end up learning a heap about monitoring Processing’s performance and optimising it using appropriate OpenGL calls. The lack of GUI widgets in Processing also meant we had to roll our own components for everything on the screen, and while it was an interesting intellectual exercise, it was slightly painful in the last few days as we crunched to deliver.
We learnt heaps from actually seeing the visualisation in use in the wild and getting some wonderfully constructive feedback from many of the conference attendees.
We’ve updated the Concept Lens code to enable it to be used around any topic or search term. It’s available for Mac and PC at www.flinklabs.com/projects/conceptlens
We received a LOT of interest in using Concept Lens to monitor conversations around brands, and several conference organisers are very keen to use it at upcoming events. So the next steps are to investigate integrating some sentiment analysis into the visualisation, probably move it to Flex, and polish up some of the rough edges, taking into consideration what we learnt from Web Directions. We’ll probably also remove Flickr as it, somewhat expectedly, gets used after the event as people return home, sort out their photos and upload them, removing the immediacy of the images.
Finally, we really want to thank the team at Web Directions for supporting us in implementing Concept Lens and giving us the space and facilities at the conference. Also, Jeremy Yuille was exceptionally helpful during the ideation phase and provided many user-focused insights that helped clarify a lot of our thinking.