Innovation is all around us: cameras that shoot 360-degree video, audio that makes the user feel immersed in the scene. What journalists need to know is whether these new technologies will result in more engaged and informed news consumers. That was the focus at our workshop entitled Sensing the News, co-sponsored by the Institute for New Media Studies at the University of Minnesota. This workshop followed an earlier session, Painting the News.
Participants in Sensing the News viewed and discussed the new technologies already available for telling stories online. Then they got some hands-on practice with existing technologies, including 360-degree and 3-D video. They worked in small groups to produce short “white papers” advising the news industry on possible uses of the new technologies, as well as on concerns and research questions raised by their use. Take the opening-day tour yourself by following the links below. The project reports are available here.
Jeff Gralnick: Remote Media Immersion
USC scientists are working on a fully immersive experience for users, combining HDTV and immersive audio online. “When the president of Nvidia saw a demonstration,” Jeff said, “his reaction was, ‘Television as we know it is now dead.’” The system could eventually include haptic technology, allowing users to “touch the news.” It would let journalists take people to a story in a way never before possible, and users could learn in entirely new ways.
Susannah Gardner: Immersive Panoramic Video
Susannah is working on “user-directed news,” an approach to journalism based on panoramic video technology. With 360-degree video, the user can self-select what to view and can see what is behind and around the reporter. Immersion gives the user a sense of being part of the event; the reporter is less a gatekeeper than a guide. “This will have an impact not seen before,” Susannah said, and it raises lots of questions: “When is it appropriate? When will people want to have news delivered this way? Will they use it? Will reporters adapt to it?” Susannah’s group shot a news story using the IMSC technology, which is enormously expensive and cumbersome at the moment. The user views the results with a laptop and special goggles with a head-mounted display. Lessons learned: Set-up time is much greater than for regular video (for example, the unit uses five cameras that must all be color-calibrated identically), so planning is essential. Reporters don’t know where to look, because the five lenses shoot into a mirror and there is no single lens to address. Limitations: The camera’s resolution is better than that of a parabolic or fisheye lens, but there is no tilt option; the final output covers 76×360 degrees, not a complete 360×360. Still, Susannah believes issues of time and cost will be resolved in the near term.
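The self-selected view Susannah describes amounts to cropping a window out of a panoramic frame wherever the user is looking. A minimal sketch in Python/NumPy, assuming an equirectangular frame whose columns map linearly to viewing angle (the 76×360-degree coverage comes from the text above; the function and parameter names are illustrative, not IMSC’s actual software):

```python
import numpy as np

def extract_view(panorama, pan_deg, fov_deg=60):
    """Crop a viewing window from a 360-degree panorama frame.

    panorama: H x W x 3 array covering 360 degrees horizontally
    pan_deg:  where the user is looking (0-360)
    fov_deg:  horizontal field of view of the window
    """
    h, w, _ = panorama.shape
    px_per_deg = w / 360.0
    start = int(pan_deg * px_per_deg) % w
    width = int(fov_deg * px_per_deg)
    # gather columns with wraparound so the view crosses the seam smoothly
    cols = [(start + i) % w for i in range(width)]
    return panorama[:, cols, :]

# toy frame: 1 pixel per degree, 76 rows of vertical coverage
frame = np.zeros((76, 360, 3), dtype=np.uint8)
view = extract_view(frame, pan_deg=350, fov_deg=60)
print(view.shape)  # (76, 60, 3)
```

The wraparound indexing is the key detail: a viewer panning past 360 degrees should come back around to 0 without hitting an edge.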
Russell Bellamy: Telegensis
Russell has developed a portable, inexpensive 360-degree video unit that uses an off-the-shelf digital video camera, a specially designed lens, and a high-end laptop. Total cost: less than $3,500. The lens shoots into a mirror that picks up a 360-degree view. The unit can be held above the shooter’s head to give the viewer essentially the same experience as the shooter. Online, the viewer can “drive” the picture by deciding what to look at. Imagine coverage of a demonstration in which you could turn the camera around to see just how many people are really taking part. One participant commented: “It forces more honesty on you as a journalist.” Russell also has worked with a mini remote-controlled chopper to get aerial video (http://rcwhirlybird.com/rcwhirlybird/site.html). The cost: $4,500 a day, including pilot and photographer, for up to six hours of flying. The drawbacks: it must be operated by a qualified pilot, the pilot on the ground needs to keep the chopper in sight, and it could be dangerous if it goes out of control.
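A camera shooting into a curved mirror records the scene as a ring, which software must unwrap into a flat panorama strip before the viewer can pan through it. This is a rough sketch of the standard polar-to-rectangular unwarp, not Russell’s actual code; the radii and dimensions are illustrative:

```python
import math
import numpy as np

def unwarp(donut, out_w=360, out_h=76, r_inner=20, r_outer=96):
    """Unwrap a circular mirror image into a flat panorama strip.

    donut: square H x W grayscale image where the mirror appears as a ring
    Each output column corresponds to one viewing angle; each output row
    samples between the inner and outer radius of the ring.
    """
    h, w = donut.shape[:2]
    cx, cy = w / 2.0, h / 2.0
    out = np.zeros((out_h, out_w), dtype=donut.dtype)
    for x in range(out_w):
        theta = 2 * math.pi * x / out_w       # angle around the ring
        for y in range(out_h):
            r = r_inner + (r_outer - r_inner) * y / (out_h - 1)
            sx = int(cx + r * math.cos(theta))
            sy = int(cy + r * math.sin(theta))
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = donut[sy, sx]
    return out

ring = np.random.randint(0, 255, (200, 200), dtype=np.uint8)
pano = unwarp(ring)
print(pano.shape)  # (76, 360)
```

In a real player this lookup would be precomputed once as a pixel map, since the mirror geometry never changes between frames.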
Amy Talkington: The New Arrival
Amy Talkington, a former journalist, is a filmmaker who produced the world’s first immersive movie, “The New Arrival.” Shot with Be Here’s 360-degree video technology, the movie allows the viewer to navigate through the scenes. The story was written specifically for the 360-degree camera, and Talkington created 360-degree storyboards to anticipate whether an edit would make sense. Even so, she says, every cut is a jump cut. Because of the way it was shot, though, there were fewer discussions in the edit room about which shot to put where. “The nature of the medium is to be watched multiple times,” she says. It changes the way a director thinks about the medium, too. “As a filmmaker, you tell stories with juxtaposition, and this explodes that. It pretty much obliterated film language, because any shot could be close, medium or wide.” The way Amy sees it, “It’s the difference between a painting and a sculpture. A sculptor can’t decide where a person will stand to look at the work. It’s just different.” The four-minute film was produced on a very short schedule: five days of pre-production, one day of shooting, six days of post-production.
Paul Morin: Geowall
Paul’s technology, developed for teaching geology students, uses 3-D photography and off-the-shelf hardware to project images so an entire class can see the same thing at once. Instead of expensive goggles, students wear cheap “passive stereo” glasses fitted with polarizing lenses. “It doesn’t replace going into the field,” Paul said, “but it helps you see what you’ll see when you get there.” Read how it’s done by clicking on applications/stereo photography. (See also this review of new technologies making it possible to capture 3-D photos online and view them without glasses of any kind.)
Jamason Chen: 3-D Video
Jamason is experimenting with 3-D video, shooting with two DV cams at once. He believes viewers can better share the experience if it’s in 3-D, but he is still searching for the kinds of stories that can only be done in this new way.
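The source doesn’t say how Jamason displays his two simultaneous streams, but one common, low-cost way to fuse a left and right camera frame for 3-D viewing is a red/cyan anaglyph, watchable with paper glasses. A minimal sketch (assuming grayscale frames; the function name is illustrative):

```python
import numpy as np

def anaglyph(left, right):
    """Combine two grayscale frames into one red/cyan 3-D image.

    left, right: equal-size H x W arrays from two side-by-side cameras.
    The left eye sees the red channel; the right eye sees green and blue.
    """
    h, w = left.shape
    out = np.zeros((h, w, 3), dtype=np.uint8)
    out[:, :, 0] = left    # red channel carries the left camera
    out[:, :, 1] = right   # green carries the right camera
    out[:, :, 2] = right   # blue carries the right camera
    return out

L = np.full((240, 320), 100, dtype=np.uint8)
R = np.full((240, 320), 50, dtype=np.uint8)
img = anaglyph(L, R)
print(img.shape)  # (240, 320, 3)
```

Getting the two DV cams genlocked and spaced at roughly eye distance matters more than the merge step itself; mismatched frames break the depth illusion.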
Rex Sorgatz: Fimoculous
We think of Web cams as devices that can observe public spaces–the penguins at the zoo, Logan Airport, and so on. But they could be personal communication devices. Macromedia has a new server for video chat that allows for 10 streams. Boxes pop up with video and audio, and people can talk to each other, and they can type in instant chat messages. Check out what’s being done now at Remote Lounge. This could be a way of letting people interact with an expert source, beyond the live-chat typing sessions done now. Webcams could get you access to sources that can’t be reached any other way (like someone in a war zone, or under house arrest).
Loren Omoto: Stribcam
“Web cams are the salted peanuts of the Internet,” Loren said. “People can’t get enough.” The Star Tribune’s camera has added functions: you can drive it, shoot with it, save photos from it, and email the results. The paper has several fixed cams, and a portable camera that goes to events. Last year, it was at the Minnesota Vikings training camp when Korey Stringer collapsed and died from the heat. The camera had 90,000 page views the next day. “We provide the tools, they create community.” But journalists could add content to Web cameras using technology like that developed by Perceptual Robotics. You create “hot spots” in the video, and when you click one, you get something else. Most applications have been for commerce (buy the guy’s tie, for example), but journalists could use it too.
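Perceptual Robotics’ implementation isn’t described here, but the hot-spot idea reduces to hit-testing a click against annotated regions of the frame. A minimal sketch, with all names and the example link hypothetical:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HotSpot:
    x: int      # region's top-left corner in the video frame
    y: int
    w: int      # region width and height in pixels
    h: int
    link: str   # content shown when the region is clicked

def find_hotspot(spots, click_x, click_y) -> Optional[HotSpot]:
    """Return the first hot spot containing the clicked point, or None."""
    for s in spots:
        if s.x <= click_x < s.x + s.w and s.y <= click_y < s.y + s.h:
            return s
    return None

spots = [HotSpot(100, 50, 80, 40, "player-bio.html")]
hit = find_hotspot(spots, 120, 60)
print(hit.link)  # player-bio.html
```

For a steerable camera the regions would also have to move with the pan/tilt position, so in practice hot spots are anchored in scene coordinates rather than screen coordinates.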
Jim Andrews: Vispo
Jim’s NIO program allows the user to select audio elements, decide what order to put them in, and play them back together. Check “The Art of Interactive Audio” for more examples. What is the role of the composer, then? “Musicians would be offering unfinished pieces, raw material, but highly evolved raw material.” Future options would allow users to save their “compositions,” or add their own sounds by recording them through audio plug-ins. Imagine using this to let users experience the sounds of a specific location, like a forest, to effectively recreate the environment. What if a user could click on a story and have it read to them by the person who wrote it?
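At its core, letting users layer chosen audio elements means summing the selected tracks sample by sample. A minimal sketch of that mixing step (not NIO’s actual code; averaging is one simple way to keep the combined signal from clipping):

```python
import numpy as np

def mix(tracks):
    """Play back a user's arrangement by averaging the selected tracks.

    tracks: list of equal-length sample arrays (floats in -1..1).
    Averaging, rather than plain summing, keeps the mix within range.
    """
    stacked = np.stack(tracks)     # shape: (n_tracks, n_samples)
    return stacked.mean(axis=0)

# two illustrative one-second tones at an 8 kHz sample rate
t = np.linspace(0, 1, 8000)
low = 0.5 * np.sin(2 * np.pi * 110 * t)
high = 0.5 * np.sin(2 * np.pi * 880 * t)
out = mix([low, high])
print(out.shape)  # (8000,)
```

Saving a user’s “composition” would then just mean storing the list of chosen elements and their order, not the rendered audio.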
The Future of Flash
Rex Sorgatz: Fimoculous
Rex posits that we need new tools to attract what he calls the “drag and drop DJ culture” to news. As an example, check out what ESPN is doing with Flash technology using multiple cameras to create 360-degree video that the user can manipulate while it’s streaming. You could create a kind of video mixer so people could arrange clips in the order they want and then play them, save them, send them. At a minimum, this would be a great classroom exercise for beginning reporters and producers. According to Jupiter Research, 30 percent of online users want to create their own material.
Laurence Bricker: Continental Harmony
Laurence created the “Sound Lounge” for the PBS project, Continental Harmony. The user can create his own story by selecting video clips and choosing music to match. In his new project for Valparaiso University, you choose characters who then enter into dialogue (pre-scripted) about the subject you select.
Other Issues and Research Questions
Regina McCombs: Double Play
Regina’s project for the Minneapolis Star Tribune is a video-driven story in which the video, not text, tells the story; text supplements it with additional information. It was designed to be more than a TV story dumped on the Web, though it is still a linear story with a beginning, middle and end. Her approach was to create a timeline and give people places they could go for more information as they followed it. She included “sidebar” videos and infographics, as well as games you can play as you move through the story.
Julie Jones: Crutch Freestyle
Julie’s research studied how people navigated an online story presented three different ways: as linear video, as video with text, and as non-linear video with several different entry points. Her project is summarized here. Among her remaining research questions: Could you add hints that would get people to view the story in a specific order? Could you add different levels of interactivity and see how younger subjects use them? Could you add a measure of comprehension to the basic comparison, to see whether people choosing one approach understand the story better or worse than people choosing another?
Christina Fiebich: Elements of Digital Storytelling
Christina and Nora Paul are working on an operational definition of interactivity. They are developing a vocabulary for studying current usage so they can begin to analyze effects on audiences. Some of the stories they have studied are available here. One research project compared users of two different versions of an online project, Voices for the Land, one of which used Flash technology and the other of which did not. The results found that teenagers and seniors who viewed the multimedia presentation had slightly better recall, but the opposite was true for college-age students. In the case of the seniors, the reason seemed to be that they found it hard to read the screen. Three-quarters of them said they would have preferred the project in a newspaper format.