Interactive Narratives

From CSISWiki


Introduction to Interactive Storytelling

Interactive storytelling can be broadly defined as a story in which the reader or audience has agency of their own. This agency gives rise to a dialogue between the story and the viewer, which ultimately shapes the story for that person.

Interactive storytelling is a relatively new field. It was first conceived in the late 1960s and first experimented with in the 1990s, but it did not attract much attention until the 2000s. By 2010, however, it had become a hot topic across the world.

A linear story follows a single track from start to finish in the most powerful and expeditious manner possible. In contrast, interactive storytelling focuses on providing diverse dramatic possibilities, perspectives or outcomes, and the approaches to this have varied across different works. For example, Chris Crawford defines interactive storytelling as “a form of interactive entertainment in which the player plays the role of the protagonist in a dramatically rich environment.” (Crawford, 2004)

The general idea behind an interactive narrative is to focus on the active creation and authoring of the story by the user, rather than having a plot set in stone that is only received by a passive audience. This can be done in several different ways.

One important aspect to keep in mind is that interactive storytelling is not simply an extension of the field of cinema or video games but must be approached as an entirely new concept.

A common problem that comes up in interactive stories concerns the authorship of the story. For a story to be interactive, the audience must be able to change the plot. In doing so, they might go against what the original author intended and potentially ruin the story. The solution to this is a concept called abstraction: here we talk about the storyworld as opposed to the story itself. Each storyworld can contain several different possibilities, and it is up to the author to define the general direction of the plot, with the player or audience deciding the details that lead there. (Crawford, 2012)


Analogue Interactive

Before talking about the emergence of interactive narratives in digital media, it is interesting to note that interaction with audiences has always been a recurring theme in the arts. Theatre, for instance, has always tried to facilitate this kind of audience interaction for the drama to unfold. In the 1960s, Augusto Boal proposed one of the most important initiatives of the decade with his “Teatro do Oprimido” (Theatre of the Oppressed), in which the “spect-actors” (the audience) become active. In literature, too, relevant writers dealt with the idea of the active reader: Julio Cortázar's novel “Hopscotch”, published in 1963, has chapters written with the reader's decisions in mind. This practice belongs to the decade in which theorists like Foucault and Derrida were thinking about the idea of readers as authors.

Turning to more popular culture, some books in the Goosebumps series featured interactive plotlines where the ending depended on the reader's decisions.


Early Development

The theory of digital interactive storytelling was developed in the 1980s, when several artists and theorists came up with systems to create interactive storytelling experiences. One of the best-known works on the topic is “Toward the Design of a Computer-Based Interactive Fantasy System” (1986), written by Brenda Laurel.

“There exists in our culture a number of deeply felt, shared fantasies of the kinds of experiences we might have with high technology. The idea of first-person, interactive excursions through imaginary worlds is one such powerful fantasy, which has been expressed in literature, film, and popular culture for decades. It is at the root of the desire which is evoked but only partially fulfilled by the video game fad. The wished-for experience might be compared to volitional dreaming, or to the idea of becoming a character in a play, affecting the action and outcome by making choices and performing actions in the imaginary world of drama. What would it be like to be Hamlet or Captain James T. Kirk? How would it feel? How might one do things differently than the characters who have already been created? What could one learn by doing it?”

This excerpt articulates very coherently the kind of path this new medium was trying to explore at that point, and its potential uses. (Laurel, 1986)

Façade (Interactive Drama, 2006)

Façade is an AI-based interactive story created by Michael Mateas and Andrew Stern. It was well ahead of its time and received the Grand Jury Prize at the 2006 Slamdance Independent Games Festival.

Façade puts the player into the shoes of Andrew, a friend of a couple (Trip and Grace) living in New York City. Andrew is invited to their place for cocktails and is left feeling awkward because of the clear tension between Trip and Grace. The player, as Andrew, can “speak” to the couple by typing, and they respond accordingly using the language-processing software that is built into the game.

This is a very good example of interactive storytelling at a time when it was hardly a common form of narrative media. The many possible outcomes, which depend primarily on how the viewer chooses to proceed with the story, are the key characteristic of an interactive story.


Applications

Film & Games

Making films in an interactive format can bend all the rules of linear filmmaking. Over the past couple of years, interactive films have been made by several prominent film-makers and, as mentioned before, they give rise to various approaches to telling a story. There are now major players in this field, including Google, which has come up with a platform called Google Spotlight Stories: a mobile app featuring immersive, 360-degree animated films, originally developed by Motorola ahead of its 2011 Google acquisition. Viewers can look around inside the animated content by moving their body and the phone to see different parts or angles of the story taking place. The app takes advantage of the device's sensors, such as its gyroscope and accelerometer, to offer an immersive viewing experience. It includes films like “Duet” from Glen Keane, “Buggy Night” from ATAP, and “Help” by “The Fast and the Furious” director Justin Lin. Example: Pearl, an Oscar-nominated animated short film. https://www.youtube.com/watch?v=WqCH4DNQBUA
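As a rough sketch of this sensor-driven viewing (not Google's actual implementation), the following Python snippet shows how gyroscope readings could be accumulated into a viewing direction for a 360-degree film; the update_orientation function and the numbers are our own illustration.

# Minimal sketch: steering a 360-degree viewport from device rotation.
# In a real app the angular rates would come from the platform's gyroscope
# API and the result would drive the renderer; both are omitted here.

def update_orientation(yaw, pitch, gyro_rates, dt):
    """Integrate angular rates (deg/s) over dt seconds into yaw/pitch."""
    yaw = (yaw + gyro_rates["yaw"] * dt) % 360.0                      # wrap horizontally
    pitch = max(-90.0, min(90.0, pitch + gyro_rates["pitch"] * dt))   # clamp vertically
    return yaw, pitch

# Example: the viewer turns right at 30 deg/s for one second (60 frames).
yaw, pitch = 0.0, 0.0
for _ in range(60):
    rates = {"yaw": 30.0, "pitch": 0.0}   # would come from the gyroscope
    yaw, pitch = update_orientation(yaw, pitch, rates, dt=1/60)
print(yaw, pitch)                         # ~30.0, 0.0 -> the camera has panned 30 degrees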

Journalism

Journalism in virtual reality can, in a way, transport the audience to the news site. Major news websites such as The Guardian and The New York Times have started exploring this new medium to cover numerous stories and give them a new dimension. A news report or documentary can be presented in many different ways once the element of interaction is introduced. Examples of immersive journalism: http://www.scoop.it/t/interactive-narratives, and the Quipu project at https://interactive.quipu-project.com/.

Education: Schools & Museums

Interactive narratives can be brilliantly utilised in the field of education since they facilitate user engagement. Having children, or even adults, delve into the world they are learning about helps them learn more quickly and in a far more interesting way than conventional methods. The element of interaction can bring a new dimension to learning, whether in schools or museums. Google is working on a project called Expeditions which aims to take students wherever they would like to go. Another example is the use of augmented reality to bring fossils back to life, a technique used at the Natural History Museum in London. https://www.theguardian.com/science/video/2010/nov/30/natural-history-museum-evolution


Classification Models of Interactive Digital Stories

Interactive Digital Storytelling can be further classified into several broad categories. There are numerous ways to classify Interactive Digital Stories, but for the purpose of this seminar we have chosen to elaborate on three particular models, because they offer three very different perspectives on Interactive Digital Stories.

On the basis of how the Interactive Digital Story is navigated

Some academics classify the different types of Interactive Digital Storytelling on the basis of how we navigate through the Interactive Digital Story. This method of classification was proposed by the Spanish scholar Arnau Gifreu (Gifreu, 2010).

Under this method of classification, we find different modalities and submodalities for navigating the content.

On the basis of what narrative structure the Interactive Digital Story follows

Other academics, however, classify Interactive Digital Storytelling on the basis of the narrative structures used in the story. One of the leading academics who proposed this method of classification is Marie-Laure Ryan (Ryan, 2001), who has classified the different types of Interactive Digital Stories into nine basic categories. They are as follows: 1. The Complete Graph Structure 2. The Network Structure 3. The Tree Structure 4. The Vector with Side Branches 5. The Maze Structure 6. The Directed Network or Flow Chart 7. The Hidden Story Structure 8. The Braided Plot 9. Action Space, Epic Wandering and Story-World.
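To make one of these structures a bit more concrete, the following small Python sketch (our own invented example, not taken from Ryan) represents a Tree Structure narrative as data: branches never rejoin, so every sequence of choices leads to its own ending.

# Hypothetical example of the "Tree Structure": branches never rejoin,
# so each route through the choices reaches a distinct ending.
tree_story = {
    "start":          {"text": "You hear a knock at the door.",
                       "choices": {"open it": "hallway", "ignore it": "window"}},
    "hallway":        {"text": "A stranger hands you a letter.",
                       "choices": {"read it": "ending_letter", "refuse": "ending_refusal"}},
    "window":         {"text": "You see a figure walking away.",
                       "choices": {"follow": "ending_chase", "stay inside": "ending_quiet"}},
    "ending_letter":  {"text": "The letter changes everything.", "choices": {}},
    "ending_refusal": {"text": "You never learn what it said.", "choices": {}},
    "ending_chase":   {"text": "You lose them in the crowd.", "choices": {}},
    "ending_quiet":   {"text": "Nothing happens. Or does it?", "choices": {}},
}

def play(story, decisions):
    """Follow a fixed list of decisions through the tree and return the visited nodes."""
    node, path = "start", ["start"]
    for d in decisions:
        node = story[node]["choices"][d]
        path.append(node)
    return path

print(play(tree_story, ["open it", "read it"]))   # ['start', 'hallway', 'ending_letter']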


On the basis of Authorial Intent and the Autonomy of the Players or Audience

Another method of classification followed by certain academics categorises Interactive Digital Stories on the basis of authorial intent and the autonomy of their players or audience; that is, stories are grouped by how much freedom the creators give to the players/viewers. This form of classification has been suggested by Mark Riedl and Vadim Bulitko (Riedl and Bulitko, 2013).

Our Simplified Classification Model

After careful study of these classification models, we have decided to take this seminar forward using a classification model that we have come up with ourselves. We have aimed for a simpler model, bearing in mind the scope of the seminar, the time limit that had to be adhered to, the depth we were expected to reach, and the fact that our audience members are neither academics nor students from this background.

For our own classification model, we have decided to categorise the different types of Interactive Digital Narratives on the basis of the kind of engagement that different technologies provide.

Under this classification system, we have identified the following five types of Interactive Digital Narratives: 1. Desktop-based 2. Collaborative 3. Immersive 4. Location-aware 5. Tangible

Desktop-based Interactive Digital Narratives

Desktop-based Interactive Digital Narratives are one of the earliest known forms of Interactive Digital Narratives. As the name suggests, they make use of the now-familiar technology of a keyboard and a mouse, or of touch screens. The audience navigates through the narrative by using keys on a keyboard, clicking a mouse or tapping a touch screen.

Example 1: Last Hijack. Last Hijack is an Interactive Digital Narrative in documentary form about a Somali hijacker who has been forced into a dangerous life because he is struggling to make ends meet. The audience can view the entire documentary by navigating through it with a keyboard and a mouse. The almost constant presence of a timeline at the bottom of the screen, with clearly marked points signifying the chapters, allows users to see their progress.

It is particularly interesting because the creators made it both as a traditional linear documentary, meant to be viewed by large audiences in theatres, and as an interactive documentary, meant to be played/viewed by single individuals on computers. Last Hijack was created by Femke Wolting and Tommy Pallotta. http://lasthijack.com/

Example 2: Fort McMoney. Fort McMoney is another Interactive Digital Narrative that is also a documentary. This interactive documentary, however, is played/viewed from a first-person perspective: the player/viewer plays the part of a journalist who wants to dig up the truth about Fort McMurray in Alberta, Canada, the world's largest industrial site and third-largest oil reserve.

The scenario depicted in this interactive documentary is all true. The maker of the documentary is David Dufresne, who is primarily a journalist. http://www.fortmcmoney.com

Collaborative Interactive Digital Narratives

In the Collaborative type of Interactive Digital Storytelling, the audience works together with the actual creators of the story as co-creators to generate the end product. The audience therefore creates its own unique story by contributing to the supplied story, following whatever rules or constraints the creators have outlined.

Example 1: The Johnny Cash Project. The Johnny Cash Project is a perfect example of collaborative storytelling. As part of this project, users from all around the world contribute towards the creation of an animated music video for Johnny Cash's popular song “Ain't No Grave”.

This is done by taking advantage of the fact that animation, or the illusion of movement, is created by viewing a vast number of still images one after the other at very high speed. For the Johnny Cash Project, the company behind the project lets users draw one still image, or “frame”, and users are free to draw as many frames as they want. In this way, people from all over the world are exposed to the project and each creates one frame or several. At the end of the drawing process, the website hosting this interface plays the entire song with its music video, composed entirely of individual frames drawn by people from all around the world.

The Johnny Cash Project was created by Chris Milk, who is a heavyweight in this industry. http://www.thejohnnycashproject.com/

Location-Aware

In this mode the user creates a narrative through their location. It has been used in museums with AR and for some experimental projects.

For example, Walking the Edit. This is an experimental project whose starting point is the following question: nowadays, videos, pictures and other data can be geolocated, creating an “augmented space”. Can we build a narrative with all this geolocated information?

The workflow of the project is as follows (it is an experiment; the experience has only been carried out in certain zones of a few cities: Paris, Geneva, Brussels and Rennes):

1. The videos already exist on the web and are already geolocated. In Geneva, for instance, almost 3,500 video files are placed on the map of the city. The project uses OpenStreetMap. It is not clear how the videos were obtained.

2. Equipped with an app, the user can hear the sound of those videos while walking. The editing software converts the way you walk into the structure of the edit; the logic is: fast walk = short shots, slow walk = long shots (see the sketch after this list).

3. The smartphone records the walk.

4. Once the walk is complete, visitors can watch “their trajectories translated into films” on the project website and can share them.
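The project does not publish its exact editing rules, but a minimal sketch of the “fast walk = short shots, slow walk = long shots” logic could look like the Python snippet below; the speed thresholds and shot durations are assumptions made purely for illustration.

# Sketch of the Walking the Edit logic: walking speed drives shot length.
# The thresholds and durations below are invented; the project's actual
# mapping is not documented here.

def shot_duration(speed_m_per_s):
    """Map a walking speed to a shot duration in seconds."""
    if speed_m_per_s > 1.6:      # fast walk -> short shots
        return 3.0
    elif speed_m_per_s > 1.0:    # normal pace -> medium shots
        return 6.0
    else:                        # slow walk (or standing) -> long shots
        return 12.0

def build_edit(walk_samples, geolocated_clips):
    """Pair each walk sample (speed) with a nearby clip, trimmed to the mapped length."""
    return [(clip, shot_duration(speed))
            for speed, clip in zip(walk_samples, geolocated_clips)]

walk = [0.7, 1.2, 1.8, 1.9, 0.9]                            # metres per second
clips = ["clip_a", "clip_b", "clip_c", "clip_d", "clip_e"]  # clips found along the route
print(build_edit(walk, clips))
# [('clip_a', 12.0), ('clip_b', 6.0), ('clip_c', 3.0), ('clip_d', 3.0), ('clip_e', 12.0)]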

Example: http://walking-the-edit.net/en/movies/paris/?movie_id=5df14071-d0e3-4232-9bf0-f94fe0512374

http://walking-the-edit.net/en/

Another interesting project is CHESS (Cultural Heritage Experiences through Socio-personal interactions and Storytelling), a project funded by the EU Commission in 2011. http://www.chessexperience.eu/

The concept of the project is to engage visitors, especially young “digital natives”, with the collections of cultural heritage institutions. The principal objective of CHESS is to research, implement and evaluate an innovative conceptual and technological framework that enables both the experiencing of personalised interactive stories by visitors of cultural sites and their authoring by cultural content experts.

The main idea is to create “adventures” through hybrid structures, which adapt continuously to their visitors, extend over space and time, and involve multiple users with different interfaces.

https://www.youtube.com/watch?v=fZRiE7VR-xw

We found it very interesting because it is a different way to visit a museum and it creates a new experience for audiences. The functionality is very similar to that of the previous project, Walking the Edit.

Tangible interface

The interaction is based on physical objects and what they afford. The first example is a project called Magic Story Cube: an Interactive Tangible Interface for Storytelling, developed by researchers from the well-known Mixed Reality Laboratory of the Department of Electrical & Computer Engineering at the National University of Singapore.

The basis of this project is that traditional storytelling with books enables multi-sensory experiences, including speech (narration), vision (seeing the book) and touch (turning pages and pointing). The researchers' aim was to enhance the interactions of traditional books while keeping their main advantages.

On the other hand, there had been good experiments using AR with books, but the planar configuration (cards and paddles) of these experiments is similar to the traditional book and does not allow 3D exploration of the contents.

To solve these issues, the researchers designed the "Magic Story Cube". The user wears a head-mounted display with a camera mounted on the front, providing a first-person viewpoint of the 3D scenes while directly manipulating the progress of the story through two-handed interaction.

The project has a physical constraint: the user can only unfold the cube continuously in a single order, which keeps the storytelling coherent.

For this prototype, the designers used a famous Bible story, “Noah's Ark”.

As an example of the functionality: when children unfold the cube, different pieces of the story are played back in 3D with multimedia support (human voice, sound and music).

This project is very interesting because it enhances the book experience with something fun and aimed directly at children: a cube and the action of unfolding it.


Another example is PuzzleTale: A Tangible Puzzle Game for Interactive Storytelling, made by researchers from National Cheng Kung University (Tainan, Taiwan) and the Georgia Institute of Technology (Atlanta).

The PuzzleTale system allows users to interact with story characters through tangible puzzle pieces and to compose a causal story ending on the table.

The PuzzleTale system includes three layers: the digital content, which is projected onto the surface; the tangible interface; and the image recognition system.

They designed one white piece and 27 identical dog-shaped puzzle pieces as the tangible objects.

The shape of puzzle pieces provides a physical constraint which hints to the user how to assemble them in a particular direction.

The puzzle pieces can be arranged on the table surface according to their shape and to the constraint of the table frame.

To create the interactive storytelling, the designers used one leading character and several supporting characters distributed on the tabletop. For the prototype, for instance, they designed a story with four characters: one leading character, the male dog, and three supporting characters, the dog catcher, the female dog and the old couple.

In the beginning, the three digital supporting characters are randomly projected around the surface. They remain fixed in these positions until the end of the story.

The leading character is projected onto, and attached to, the initial tangible puzzle piece.

The story begins when the user places the white piece on the surface.

The task in this prototype was to lead the male dog home (destination).

When the user places the first brown tangible puzzle piece against the white one, it attracts the male dog, who jumps onto it.

The user keeps assembling other tangible puzzle pieces in sequence to make a track. The male dog follows the track step-by-step to reach where the reader wants him to go. During the interaction process, the user can decide whether the male dog meets other supporting characters.

Two variables define the ending of the story: first, how many times the male dog encounters the other characters, and second, in what sequence he meets them.
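The papers we read do not spell out how these two variables map to concrete endings, so the Python sketch below only illustrates the idea: the ending is looked up from the sequence of characters the dog has met (the ending table is invented for illustration).

# Illustrative sketch of PuzzleTale's ending logic: the ending depends on
# how many supporting characters the male dog meets and in what order.
# The ending table below is invented; the published paper does not list it.

ENDINGS = {
    (): "The dog walks straight home alone.",
    ("female dog",): "The dog brings a companion home.",
    ("dog catcher",): "The dog barely escapes and hides at home.",
    ("dog catcher", "female dog"): "The pair outwit the catcher together.",
    ("old couple", "female dog"): "The couple adopts both dogs.",
}

def ending_for(encounters):
    """Pick an ending from the sequence of characters encountered on the way home."""
    return ENDINGS.get(tuple(encounters), "An open ending: the dog keeps wandering.")

print(ending_for(["dog catcher", "female dog"]))   # The pair outwit the catcher together.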

The project was conceived as a system; there is no evidence, in the articles we read, of how and what type of multimedia was used to tell the story of the dog, but the idea of interacting with the puzzle was interesting enough to describe the process.


They also detailed some methods to evaluate the prototype, which is very useful for us as designers.

The researchers carried out quantitative and qualitative evaluations with ten students from Georgia Tech, using NASA-TLX, a usability assessment method that measures Mental Demand, Physical Demand, Temporal Demand, Own Performance, Effort, and Frustration. They discovered, for example, that better feedback was needed every time the dog encountered a supporting character.
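For reference, NASA-TLX ratings are usually combined into a single workload score. The sketch below computes the unweighted "raw TLX" score as the mean of the six subscales; the PuzzleTale study may have used the weighted variant, and the example ratings here are fictional.

# Raw (unweighted) NASA-TLX: the mean of the six subscale ratings (0-100).
# The weighted variant would scale each subscale by pairwise-comparison weights.

SUBSCALES = ["mental_demand", "physical_demand", "temporal_demand",
             "own_performance", "effort", "frustration"]

def raw_tlx(ratings):
    """Return the unweighted workload score for one participant."""
    return sum(ratings[s] for s in SUBSCALES) / len(SUBSCALES)

# Example ratings for one (fictional) participant:
participant = {"mental_demand": 35, "physical_demand": 20, "temporal_demand": 30,
               "own_performance": 25, "effort": 40, "frustration": 45}
print(raw_tlx(participant))   # 32.5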

We liked this openness about the prototype's shortcomings, because in academia it is sometimes hard to find this kind of "self-criticism"; write-ups often read like a string of successes.

It is an exciting project because it shows a new direction for storytelling.

Immersive

In this kind of interaction the user “shares” space and presence with the story. As Parijat mentioned before, this kind of immersive storytelling has been practised in theatre since the beginning of the 1960s.

Another medium that shares this immersion is VR. As a side note, we will use “VR” for all kinds of immersive environments viewed with an HMD (Head-Mounted Display); in other words, VR here covers virtual reality, 360-degree video and mixed reality.


Linking these two practices, the study carried out by Sandy Louchart and Ruth Aylett at the Centre for Virtual Environments of the University of Salford, called "Towards a narrative theory of Virtual Reality", is interesting. (Aylett and Louchart, 2003)

Their study includes a comparative table of the different narrative forms. The comparison between theatre and VR in terms of sharing time and space, and of how presence is experienced, is particularly relevant; we stress that these are really complex concepts to explain within this presentation.

In VR, unlike theatre, the spectator's presence is not physical, that is to say, the spectator is not really inside the action; but their presence is immersive, since they share time and space with the logic of the story.

In recent years we have seen the emergence of all kinds of VR experiences.

One interesting example is "Madame Bovary on the Holodeck" (Cavazza and Lugrin, 2007)made ten years ago. (In tech time is like a century) We chose this project because exposes one of the milestones to achieve, when we talk about interactive storytelling and is the Holodeck. Holodeck was the "holographic" technology and a plot device used in stories set within the Star Trek universe.

They used a commercial game engine as the development environment, supporting real-time visualisation as well as the inclusion of Artificial Intelligence components controlling virtual actors. The hardware platform was built around a 4-sided CAVE-like immersive display operated by a PC cluster. CAVE stands for Cave Automatic Virtual Environment: an immersive virtual reality environment in which projectors are directed at the walls of a room-sized cube.

They used excerpts from Madame Bovary, the novel written by Gustave Flaubert in 1856, in particular the love affair between Emma and Rodolphe (whose role is played by the user).

https://www.youtube.com/watch?v=oZwtVz7z0wM

-The interactive storytelling engine is based on the characters' motivations and emotional states. -The user can interact with the virtual world using multimodal interaction: speech and various forms of physical interaction (such as user position, body attitude and gestures).

The narrative unfolds as a real-time stereoscopic 3D animation, featuring virtual actors and controlled by the interactive storytelling engine. Characters express themselves using speech synthesis as well as body animations, and the user can interact with them naturally, using speech and attitudes, as if acting on stage.

The software architecture for immersive storytelling is quite complex. It includes:

-An interactive storytelling engine that unfolds the narrative. On the animation side, they describe a list of 21 feelings, each taking a value in the {low, medium, high} range, and roughly every 200 ms the engine decides the mood and the character that shapes the next scene (see the sketch after this list).

-The user, as mentioned, interacts with the virtual world using speech, position and gestures.
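The paper gives only this broad outline, so the Python sketch below is a loose illustration of what such a decision step could look like; the feeling names, the scoring and the selection rule are all our assumptions, not the authors' actual algorithm.

# Loose sketch of the storytelling engine's decision step: each character
# carries a set of feelings valued low/medium/high, and every ~200 ms the
# engine picks the character and mood that drive the next beat of the scene.
# Feeling names, scores and selection rule are invented for illustration.

import random

LEVELS = {"low": 0, "medium": 1, "high": 2}
FEELINGS = ["love", "anger", "boredom", "desire", "shame"]  # the real engine tracks 21

def dominant(character_states):
    """Return the (character, feeling) pair with the strongest current value."""
    best = None
    for character, feelings in character_states.items():
        for feeling, level in feelings.items():
            score = LEVELS[level]
            if best is None or score > best[0]:
                best = (score, character, feeling)
    return best[1], best[2]

states = {
    "Emma":     {f: random.choice(list(LEVELS)) for f in FEELINGS},
    "Rodolphe": {f: random.choice(list(LEVELS)) for f in FEELINGS},
}
# In the real system this decision would be re-run roughly every 200 ms.
character, mood = dominant(states)
print(f"Next beat is driven by {character}, dominated by {mood}.")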

Another exciting project is 6x9. The synopsis: what is it like to spend 23 hours a day in a cell measuring 6x9 feet for days, weeks, months or even years?

6x9 is the Guardian's first virtual reality experience. It places you inside a US solitary confinement prison cell and tells the story of the psychological damage caused by isolation.

We will do a short navigation that is not ideal, since the project is meant to be experienced individually with an HMD. The original experience has interactivity through gaze, that is, through looking at a specific point: for example, looking at the letters over the bed launches an audio clip or a piece of text. In this navigation, the interaction is already activated.
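As a rough idea of how gaze-based triggering works in general (this is not the Guardian's actual code), a hotspot can fire its audio or text once the viewer's gaze stays inside it for a short dwell time, as in the Python sketch below.

# Generic gaze-dwell sketch (not the 6x9 implementation): a hotspot fires
# its content once the gaze direction stays inside it long enough.

DWELL_SECONDS = 1.5

class Hotspot:
    def __init__(self, name, yaw_range, pitch_range):
        self.name = name
        self.yaw_range = yaw_range        # (min, max) in degrees
        self.pitch_range = pitch_range
        self.dwell = 0.0
        self.triggered = False

    def update(self, yaw, pitch, dt):
        """Accumulate dwell time while the gaze stays inside the hotspot."""
        inside = (self.yaw_range[0] <= yaw <= self.yaw_range[1] and
                  self.pitch_range[0] <= pitch <= self.pitch_range[1])
        self.dwell = self.dwell + dt if inside else 0.0
        if inside and not self.triggered and self.dwell >= DWELL_SECONDS:
            self.triggered = True
            print(f"Play audio/text for: {self.name}")

# Example: the viewer looks down at the letters on the bed for two seconds.
letters = Hotspot("letters over the bed", yaw_range=(80, 110), pitch_range=(-60, -30))
for _ in range(120):                      # 120 frames at 60 fps = 2 seconds
    letters.update(yaw=95.0, pitch=-45.0, dt=1/60)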

We found this experience interesting for several reasons. First, because it uses 3D as a medium to represent reality, which raises something very relevant, close to technological determinism: how the technology forces the stories to be told in a certain way.

In the case of VR projects, we are facing an entirely new paradigm of storytelling, based on choreography, on mapping instead of shooting, and on positional sound: new concepts that will bring new aesthetics and new ways of thinking.

On the other hand, this project is attractive because it opens new routes of interactivity, perhaps less explored by interaction design, such as seeing as an act of deciding. Usually, when we talk about interactive media, we are used to buttons and keyboards to click and select the path. These technologies open up not only new interactive ideas but also new semantics.

Why is this important now?

In recent years, two relevant things have been happening.

First, the significant increase in internet users worldwide and the change in consumer habits. According to Internet Live Stats, the number of internet users increased tenfold from 1999 to 2013; the first billion was reached in 2005, the second billion in 2010 and the third billion in 2014. Around 40% of the world population has an internet connection today.

Furthermore, these habits bring new ways to communicate, entertain and inform through the digital medium.

Second, in recent years there has been a paradigm shift in the production and creation of content. Ten years ago there were few channels and creators, huge audiences, and time and money to spend on a single production; now this pyramid has changed completely. Journalists, filmmakers and game designers around the world are thinking about how viewers interact with stories, and several players are interested in producing this content. Long-established broadcasters such as the BBC (UK), ARTE (France-Germany), the NFB (Canada), and ITVS and POV (USA) are producing new media content: interactive documentaries, advertising, interactive dramas and immersive journalism.

http://www.arte.tv/sites/webdocs/ https://www.nfb.ca/interactive/ https://itvs.org/digital-and-interactive

New players have also appeared, for instance newspapers such as The Guardian, The New York Times and Le Monde, among others.

For example, Notes on Blindness. Synopsis: "A VR journey into a world beyond sight. In 1983, after decades of steady deterioration, John Hull became totally blind. To help him make sense of the upheaval in his life, he began documenting his experiences on audio cassette. These original diary recordings form the basis of this project, an interactive non-fiction work using new forms of storytelling to explore his cognitive and emotional experience of blindness." https://www.youtube.com/watch?v=W2eTgbyiY_0


Other players on the horizon: Netflix (news from 2013): http://siobhanoflynn.com/netflix-arrested-development-future-storytelling/

References

Crawford, C. (2012) Chris Crawford on Interactive Storytelling. 2nd edn. Berkeley, CA: New Riders.

Laurel, B. (1986) Toward the Design of a Computer-Based Interactive Fantasy System. PhD thesis, Ohio State University. OhioLINK Electronic Theses and Dissertations Center. Accessed 12 Feb 2017.

Riedl, M. O. and Bulitko, V. (2013) 'Interactive Narrative: An Intelligent Systems Approach', AI Magazine, 34(1), pp. 67-77.

Gifreu, A. (2010) The Interactive Multimedia Documentary: A Proposed Model of Analysis. Department of Communication, Universitat Pompeu Fabra, pp. 122-136.

Ryan, M.-L. (2001) Narrative as Virtual Reality: Immersion and Interactivity in Literature and Electronic Media. Baltimore: Johns Hopkins University Press.

Shen, Y. T. and Mazalek, A. (2010) 'PuzzleTale: A Tangible Puzzle Game for Interactive Storytelling', Computers in Entertainment, 8(2), p. 1. doi: 10.1145/1899687.1899693.

Zhou, Z., Cheok, A. and Pan, J. (2004) ‘3D story cube: An interactive tangible user interface for storytelling with 3D graphics and audio’, Personal and Ubiquitous Computing, 8(5). doi: 10.1007/s00779-004-0300-0.

Aylett, R. and Louchart, S. (2003) ‘Towards a narrative theory of virtual reality’, Virtual Reality, 7(1), pp. 2–9. doi: 10.1007/s10055-003-0114-9.

Vayanou, M., Karvounis, M. et al. (2012) 'The CHESS Project: Adaptive Personalized Storytelling Experiences in Museums'.

Cavazza, M., Lugrin, J.-L., Pizzi, D. and Charles, F. (2007) 'Madame Bovary on the Holodeck: Immersive Interactive Storytelling', in Proceedings of the 15th ACM International Conference on Multimedia. ACM 978-1-59593-701-8/07/0009.
