What should we know about the way storytelling and news are presented on tablets?
In 1990, Poynter tested how people read news in print. In 2000 and 2003, we tested how people read news online. Then, we dove into the differences between the two with research in 2007.
Next up? The device that incorporates the magical elements of touch and location.
Of course, readers touch and transport newspapers and laptops, but the interaction on tablets really is quite different.
Users pinch, swipe, and scroll, horizontally and vertically. They multitask and they physically move from place to place. Tablet devices know where people are, and they give people access to real-time information on the go, based on where they are.
These capabilities add new dimensions to the user experience. And we’re ready to look for the tools and practices that help define standards for what works well and for what is to come.
To that end, we’re embarking on research that hopefully will include you.
We’d like you to tell us what you would like to know. Where possible, we’ll incorporate your ideas into our study, using eyetracking gear, gesture tracking, media diaries, observation, surveys, prototypes and more.
While this research can’t answer every question related to the way people read on tablets, we hope to provide good insight into the opportunities that the devices bring for news, storytelling and advertising.
All of the design, code and content from our prototypes will be open source. The data we gather will be, too. If you are interested in replicating the experiments or parsing through the data, we’ll give you the goods over the next six to nine months as they become available.
We’re also seeking possible funders for the project — that’s another way to get involved. With a $50,000 grant from the John S. and James L. Knight Foundation and other funds from CCI Europe, we’ll begin with this basic framework of questions:
Tools and tasks: How intuitive can tablet navigation be and how long does it take to successfully complete a task?
Satisfaction: How happy are users with the overall experience, and how does that affect their perception of the source’s credibility?
Comprehension and retention: Which forms help people to understand and remember what they have seen or read?
Business and revenue: What strategies might work for news organizations? For advertisers? For consumers? How might editors set up a newsroom to create content for a tablet product?
In addition to me, our key research team includes:
Dr. Mario Garcia, CEO and founder of Garcia Media, who pioneered Poynter eyetracking research with Dr. Pegie Stark Adam.
Jeremy Gilbert, assistant professor of media product design at Medill Northwestern University.
David Stanton, managing developer for Smart Media Creative and research consultant.
Rick Edmonds, media analyst for Poynter.
Regina McCombs, Poynter faculty for multimedia and mobile.
And our advisory group draws an even longer list of talent, including:
Roger Black, CEO of A Narrative Design Studio.
Rusty Coats, President of Coats2Coats, Consultants for a Media Future.
Andrew DeVigal, New York Times multimedia editor and interactivenarratives.org.
Jeff Sonderman, digital media fellow at Poynter.
Jennifer George-Palilonis, George & Frances Ball Distinguished Professor of Multimedia, Ball State University.
Michael Holmes, The Center for Media Design.
Damon Kiesow, The Boston Globe.
Miranda Mulligan, The Boston Globe.
Tor Bøe-Lillegraven, CCI Europe.
Nora Paul, Institute for New Media Studies, University of Minnesota.
Robin Sloan, former director of media partnerships at Twitter.
Will Sullivan, Lee Enterprises and the blog journerdism.com.
Matt Thompson, National Public Radio.
Here are a few introductory thoughts from our research team.
Dr. Mario Garcia:
“EyeTrack has always been one of the most important contributions from the Poynter Institute to the industry. A generation of editors, journalists and designers have benefited from the Institute’s EyeTrack findings through the years, which, in turn, have contributed to easier-to-read newspapers and online editions.
Now, EyeTrack turns its attention to the tablet, and, again, this will come at a time when the industry needs the information the most. I am happy to be involved again, and I am convinced that this new EyeTrack will advance the cause of effective visual storytelling on new platforms to the next level.”
“In only a few years, touch-based mobile computing has gone from a novelty to an industry-changing technology. Now is a critical time for media companies to evaluate what works for users and to explore what a touch-based news experience should look like.
Previous EyeTrack research has helped designers and researchers understand how people interact with old media. Now is the time for the Poynter Institute, university researchers and media professionals to join together to explore these issues.”
“Instead of trying to create a unified theory, we want to build a research framework that is testable across devices and content.
Our choices are driven by our research objectives, but you may have different objectives. To let you extend our research, all design, code, content and data will be open source. We want everyone to contribute to a better understanding of news consumption.”
“Analysts are suggesting that the adoption of iPads and competing tablets will be far faster than that of any previous popular device. But there is a great deal still to be learned about how users interact with their tablets. As a business matter, we particularly need to begin moving past hypothesis to observation and other primary research on how users view advertising, and on what kinds are most engaging and effective. EyeTrack can contribute significantly to that body of knowledge.”
In awarding the grant, Knight Foundation noted that the research comes at a critical time for the industry. “With half of all data traffic expected to come from mobile devices by 2015, this research could provide new insights for news organizations about revenue generation strategies that work for mobile and tablet platforms,” said Knight Foundation Program Associate Amy Starlight Lawrence.