This is ubiquitous media

@ubistudio: Introducing the Ubiquitous Media Studio

Tuesday, July 13th, 2010

As promised during my talk at ARE2010, I’m launching a new project called the Ubiquitous Media Studio, a.k.a. @ubistudio. The idea is to gather an open network of technologists, artists, experience designers, social scientists and other interested folks to explore the question “If the world is our platform, then what is our creative medium?” I’m provisionally calling this notion “ubiquitous media”, building on initial research I did in this area several years back. The idea is also very much inspired and influenced by my friends at the most excellent Pervasive Media Studio in Bristol, England, who you should know as well.
So what is ubiquitous media? I don’t know exactly, thus the exploration. But it seems to me that its outlines can be sensed in the choppy confluence of ubicomp, social networks, augmented reality, physical computing, personal sensing, transmedia and urban systems. It’s like that ancient parable of the blind monks trying to describe an elephant; the parts all feel very weird and different, and we’re trying to catch a glimpse of the future in its entirety. When you look through an AR magic lens, ubiquitous media is in there. When your kid went crazy over the Pokemon and Yu-Gi-Oh story-game universes, it was in there too. When you snap your Nike+ sensor into your running shoe, you’re soaking in it. When you go on a soundwalk or play a mediascape, there’s more than a bit of ubiquitous media in the experience.

[Image: the blind monks and the elephant]

Anyway, we are going to investigate this, with the goals of learning new creative tools and applying them in creative projects. And “we” includes you. If you’re in the Bay Area and you think you might be interested, just jump right in! We’re having a little get-together in Palo Alto:

@ubistudio: Ubiquitous Media Studio #1
Thursday July 22, 2010 5:30-8:30PM
Venue: The Institute for the Future
Details & RSVP: http://meetup.com/ubistudio

I hope you’ll join us. You can also stay connected through @ubistudio on Twitter, and a soon-to-be-more-than-a-placeholder website at ubistudio.org.

Beyond Augmented Reality: Ubiquitous Media

Saturday, June 19th, 2010

Here are the slides I presented during my talk at ARE2010, the first Augmented Reality Event, held June 3, 2010 in Santa Clara. Many thanks to all who attended, asked questions and gave feedback. For interested Bay Area folks, I will be organizing some face-to-face gatherings of the Ubiquitous Media Studio to explore the ideas raised here. The first one will be in July; follow @ubistudio on Twitter for further details.

What is Ubiquitous Media?

Friday, June 26th, 2009

In the 2003 short paper “Creating and Experiencing Ubimedia”, members of my research group sketched a new conceptual model for interconnected media experiences in a ubiquitous computing environment. At the time, we observed that media was evolving from single content objects in a single format (e.g., a movie or a book) to collections of related content objects across several formats. This was exemplified by media properties like Pokemon and Star Wars, which manifested as coherent fictional universes of character and story across TV, movies, books, games, physical action figures, clothing and toys, and by American Idol, which harnessed large-scale participatory engagement across TV, phones/text, live concerts and the web. Along the same lines, social scientist Mimi Ito wrote about her study of Japanese media mix culture in “Technologies of the Childhood Imagination: Yugioh, Media Mixes, and Otaku” in 2004, and Henry Jenkins published his notable Convergence Culture in 2006. We know this phenomenon today as cross-media, transmedia, or any of dozens of related terms.

Coming from a ubicomp perspective, our view was that the implicit semantic linkages between media objects would also become explicit connections, through digital and physical hyperlinking. Any single media object would become a connected facet of a larger interlinked media structure that spanned the physical and digital worlds. Further, the creation and experience of these ubimedia structures would take place in the context of a ubiquitous computing technology platform combining fixed, mobile, embedded and cloud computing with a wide range of physical sensing and actuating technologies. So this is the sense in which I use the term ubiquitous media; it is hypermedia that is made for and experienced on a ubicomp platform in the blended physical/digital world.
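
To make the idea of explicit linking a bit more concrete, here is a toy sketch in Python. It is purely illustrative and not from the 2003 paper; every class, field and example name is invented. The point is just the shape of the structure: a media object that carries hyperlinks to other media objects and anchors into both the physical and digital worlds.

```python
# A minimal, hypothetical sketch of a ubimedia structure: media objects
# explicitly linked to one another and anchored in the physical world
# (tags, places, objects) and the digital world (URLs).

from dataclasses import dataclass, field


@dataclass
class Anchor:
    """A point where a media object touches the world: an RFID tag,
    a GPS location, a URL, a broadcast episode, a physical toy..."""
    kind: str        # e.g. "rfid", "gps", "url", "object"
    reference: str   # tag ID, coordinates, URL, product code


@dataclass
class MediaObject:
    """One facet of a larger interlinked media universe."""
    title: str
    fmt: str                                   # "tv", "game", "toy", "audio"...
    anchors: list[Anchor] = field(default_factory=list)
    links: list["MediaObject"] = field(default_factory=list)

    def link(self, other: "MediaObject") -> None:
        """Make an implicit semantic relationship an explicit, two-way hyperlink."""
        self.links.append(other)
        other.links.append(self)


# A tiny Pokemon-style universe: a TV episode, a handheld game, and a
# physical trading card, all cross-linked and anchored in both worlds.
episode = MediaObject("Episode 12", "tv", [Anchor("url", "http://example.org/ep12")])
game = MediaObject("Handheld game", "game", [Anchor("object", "cartridge-4711")])
card = MediaObject("Trading card #25", "toy", [Anchor("rfid", "tag-0425")])

episode.link(game)
game.link(card)

for linked in game.links:
    print(f"'{game.title}' links to '{linked.title}' ({linked.fmt})")
```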

Of course the definitions of ubicomp and transmedia are already quite fuzzy, and the boundaries are constantly expanding as more research and creative development occur. A few examples of ubiquitous media might help demonstrate the range of possibilities:

[Image: the Nike+ running system]

An interesting commercial application is the Nike+ running system, jointly developed by Nike and Apple. A small wireless pressure sensor installed in a running shoe sends footfall data to the runner’s iPod, which also plays music selected for the workout. The data from the run is later uploaded to an online service for analysis and display. The online service includes social components, game mechanics, and the ability to mash up running data with maps. Nike-sponsored professional athletes endorse Nike-branded music playlists on Apple’s iTunes store. A recent feature extends Nike+ connectivity to specially designed exercise machines in selected gyms. Nike+ is a simple but elegant example of embodied ubicomp-based media that integrates sensing, networking, mobility, embedded computing, cloud services, and digital representations of people, places and things. Nike+ creates new kinds of experiences for runners, and gives Nike new ways to extend their value proposition, expand their brand footprint, and build customer loyalty. Nike+ has been around since 2006, but with the recent buzz about personal sensing and quantified selves it is receiving renewed attention, including a solid article in the latest Wired.
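
For the technically curious, the data flow is easy to picture: sensor samples collected on the device get summarized into a run record and posted to a cloud service. The sketch below is my own rough illustration of that flow; the endpoint, field names and stride arithmetic are all invented for the example, not Nike’s or Apple’s actual protocol.

```python
# A hypothetical sketch of a Nike+-style flow: footfall samples from the shoe
# sensor are summarized on the device, then uploaded to an online service.

import json
from dataclasses import dataclass
from urllib import request


@dataclass
class FootfallSample:
    timestamp_s: float   # seconds since the start of the run
    contact_ms: float    # foot-to-ground contact time reported by the sensor


def summarize_run(samples: list[FootfallSample], stride_m: float = 1.0) -> dict:
    """Turn raw sensor samples into the kind of summary a runner sees online.
    Distance is estimated from step count and a calibrated stride length."""
    steps = len(samples)
    duration_s = samples[-1].timestamp_s - samples[0].timestamp_s if steps > 1 else 0.0
    distance_m = steps * stride_m
    pace = (duration_s / 60) / (distance_m / 1000) if distance_m else None
    return {
        "steps": steps,
        "duration_s": round(duration_s, 1),
        "distance_km": round(distance_m / 1000, 2),
        "pace_min_per_km": round(pace, 2) if pace is not None else None,
    }


def upload_run(summary: dict, url: str = "https://example.org/api/runs") -> None:
    """Post the run summary to a (hypothetical) cloud service for analysis,
    social features, and map mashups."""
    req = request.Request(
        url,
        data=json.dumps(summary).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:   # real network call; endpoint is made up
        print("service replied:", resp.status)


samples = [FootfallSample(t * 0.45, 210.0) for t in range(4000)]  # ~30 minutes of steps
print(summarize_run(samples))
# upload_run(summarize_run(samples))  # uncomment once you have a real endpoint
```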

[Image: mscape mediascapes]

A good pre-commercial example is HP Labs’ mscape system for creating and playing a media type called mediascapes. These are interactive experiences that overlay audio, visual and embodied media interactions onto a physical landscape. Elements of the experience are triggered by player actions and sensor readings, especially location-based sensing via GPS. In the current generation, mscape includes authoring tools for creating mediascapes on a standard PC, player software for running the pieces on mobile devices, and a community website for sharing user-created mediascapes. Hundreds of artists and authors are actively using mscape, creating a wide variety of experiences including treasure hunts, biofeedback games, walking tours of cities, historical sites and national parks, educational tools, and artistic pieces. Mscape enables individuals and teams to produce sophisticated, expressive media experiences, and its open innovation model gives HP access to a vibrant and engaged creative community beyond the walls of the laboratory.
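
The core mechanic is simple enough to sketch in code: media clips that fire when the player’s GPS position enters a trigger region. The snippet below is my own illustration of that idea, not mscape’s actual authoring format or API; the region names, coordinates and clip paths are made up.

```python
# A small, hypothetical sketch of location-triggered media: clips play when
# the player's GPS fix enters a circular trigger region.

import math
from dataclasses import dataclass


@dataclass
class Region:
    """A circular trigger zone tied to a media clip."""
    name: str
    lat: float
    lon: float
    radius_m: float
    clip: str            # path or URL of the audio/visual asset
    fired: bool = False  # play once per visit in this simple sketch


def distance_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Approximate ground distance using an equirectangular projection;
    good enough for trigger zones tens of metres across."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6_371_000 * math.hypot(dx, dy)


def update(regions: list[Region], lat: float, lon: float) -> list[str]:
    """Called on each GPS fix; returns the clips that should start playing."""
    triggered = []
    for r in regions:
        if not r.fired and distance_m(lat, lon, r.lat, r.lon) <= r.radius_m:
            r.fired = True
            triggered.append(r.clip)
    return triggered


# A two-stop walking tour: the harbour, then the old bridge.
tour = [
    Region("harbour", 51.4495, -2.6080, 30, "audio/harbour_story.mp3"),
    Region("bridge", 51.4510, -2.6030, 25, "audio/bridge_history.mp3"),
]

for fix in [(51.4470, -2.6100), (51.4496, -2.6079), (51.4510, -2.6031)]:
    for clip in update(tour, *fix):
        print("play", clip)   # a real player would hand this to its audio engine
```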

These two examples demonstrate an essential point about ubiquitous media: in a ubicomp world, anything – a shoe, a city, your own body – can become a touchpoint for engaging people with media. The potential for new experiences is quite literally everywhere. At the same time, the production of ubiquitous media pushes us out of our comfort zones – asking us to embrace new technologies, new collaborators, new ways of engaging with our customers and our publics, new business ecologies, and new skill sets. It seems there’s a lot to do, so let’s get to it.