The Role of the Body-2013


Introduction

Our seminar will explore how the human body is used to create a more natural, engaging and immersive experience in games, interactive installations, medical technology and future technologies. The seminar will consist of two workshops. Attendance will be much appreciated. Please have a look at the reading list in preparation for the seminar.


Role of the body in Gaming

I researched and talked about the “gaming” element in relation to “The Role of the Body.” Initially, I started researching virtual games and the Kinect, Wii and Sony Move, which use the body as a controller. However, after stumbling upon an academic research paper, “Understanding the role of body movement in player engagement” by Nadia Bianchi-Berthouze at University College London, I shifted my focus to how the body relates to games, interactive installation art, medical technologies and future technologies. This research married neatly into the rest of the group work, allowing a fluid transition into the other subtopics.

After extensive research, I discovered that Thomas Toliver Goldsmith Jr. developed the first interactive electronic game in 1947. He called it the “Cathode Ray Tube Amusement Device,” and it closely resembled the radar displays of World War II. I continued my research into why we involve our bodies the way we do in gaming. Christian Schönauer et al. from the Vienna University of Technology explain that full-body interaction is highly engaging, involves natural body movements, and is immersive, motivating and, most importantly, “fun.” All of this relates to interactive installation art, medical technology and future technology. My research extended further into understanding the role of body movement in player engagement. Nadia Bianchi-Berthouze (UCL) studies movement taxonomy in computer games, conducting tests on particular body movements, such as expressing interest or boredom through gestures, e.g. yawning or a still posture. I later studied her “body movement taxonomy” and her “novel engagement model” in more depth. The taxonomy consists of five classes of body movements: task control body movements, task facilitating body movements, role related movements, affective expressions, and expressions of social behaviour. These classes explain how our bodies engage with, adapt to and coordinate with particular games and simulations; a valuable tool for understanding the four main topics of this exploration (gaming, interactive installation art, medical technology and future technology). I noted that she distinguishes clearly between “hard fun” (players who want to test their skill, and create and test new gaming strategies) and “easy fun” (what captures the player’s attention is the sensation of wonder, awe and mystery; the player has the desire to feel part of the game, to be the character of the game).
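
To make the taxonomy concrete, here is a minimal Python sketch that tags observed player gestures with one of the five classes. The class names come from the paper as quoted above, but the example gestures and the lookup table are my own illustrative assumptions, not Bianchi-Berthouze's material.

```python
from enum import Enum
from typing import Optional

class MovementClass(Enum):
    """The five classes in Bianchi-Berthouze's body movement taxonomy."""
    TASK_CONTROL = "task control body movements"
    TASK_FACILITATING = "task facilitating body movements"
    ROLE_RELATED = "role related movements"
    AFFECTIVE_EXPRESSION = "affective expressions"
    SOCIAL_EXPRESSION = "expressions of social behaviour"

# Illustrative gesture-to-class lookup; these example gestures are assumptions for this sketch.
EXAMPLE_GESTURES = {
    "swing the controller to return a ball": MovementClass.TASK_CONTROL,
    "lean sideways to peer around an obstacle": MovementClass.TASK_FACILITATING,
    "strike a rock-star pose while playing a guitar game": MovementClass.ROLE_RELATED,
    "slump into a still posture out of boredom": MovementClass.AFFECTIVE_EXPRESSION,
    "high-five another player after a win": MovementClass.SOCIAL_EXPRESSION,
}

def classify(gesture: str) -> Optional[MovementClass]:
    """Return the taxonomy class for a known gesture, or None if it has not been labelled."""
    return EXAMPLE_GESTURES.get(gesture)

if __name__ == "__main__":
    for gesture, cls in EXAMPLE_GESTURES.items():
        print(f"{gesture} -> {cls.name}")
```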

I also found it fascinating to learn about body movements and the social factor. Bianchi-Berthouze predicted that “controllers that encourage movement will support increased social behaviour due to their stronger social affordances, and that higher levels of engagement will occur when the use of the input device entails natural movements.” As I discussed this with my colleagues, it became apparent that natural movement and embodiment work well in any given setting.

Finally, this thorough research into gaming gave me new insight into the psychophysiological aspects of the role of the body.


Bibliography and useful links.

  • Altinsoy, M.E., Jekosch, U. & Brewster, S., 2009. Haptic and Audio Interaction Design: 4th International Conference, HAID 2009, Dresden, Germany, September 10-11, 2009, Proceedings. Springer. Available at: http://books.google.com/books?id=JFuZ-kvDcx4C&pgis=1 [Accessed March 6, 2013].
  • Anon, BBC - Science & Nature - Human Body and Mind - Interactive Body. Available at: http://www.bbc.co.uk/science/humanbody/body/index_interactivebody.shtml [Accessed March 6, 2013].
  • Schönauer, C., Pintaric, T. & Kaufmann, H., 2011. Full body interaction for serious games in motor rehabilitation. Interactive Media Systems, TU Vienna. Available at: https://www.ims.tuwien.ac.at/publications/tuw-196339 [Accessed March 6, 2013].
  • Anon, Design and evaluation of user's physical experience in an Ambient Interactive Storybook and full body interaction games. Available at: http://connection.ebscohost.com/c/articles/61463803/design-evaluation-users-physical-experience-ambient-interactive-storybook-full-body-interaction-games [Accessed March 6, 2013].
  • Anon, Feedtank LLC - Full Body Games. Available at: http://www.feedtank.com/index.php?strProject=full-body-games [Accessed March 6, 2013].
  • Anon, Interactive Entertainment Technology (M.Sc/P.Grad.Dip), Trinity College Dublin. Available at: https://www.scss.tcd.ie/postgraduate/msciet/current/Dissertations/0708/Dumitrescu_Silviu.php [Accessed March 6, 2013].
  • Anon, Interactive Systems and User Experience Lab: Research: 3D User Interfaces. Available at: http://www.eecs.ucf.edu/isuelab/research/3dui.php [Accessed March 6, 2013].
  • Anon, Serious Games Institute (SGI): AIS educational full body interactive games & user's physical experience in schools. Available at: http://www.seriousgamesinstitute.co.uk/events/pages.aspx?section=75&item=436 [Accessed March 6, 2013].
  • Levisohn, A., The Body as a Medium: Reassessing the Role of Kinesthetic Awareness in Interactive Applications. Available at: http://www.academia.edu/209900/The_Body_as_a_Medium_Reassessing_the_Role_of_Kinesthetic_Awareness_in_Interactive_Applications [Accessed March 6, 2013].
  • Bianchi-Berthouze, N., Understanding the role of body movement in player engagement. Available at: http://web4.cs.ucl.ac.uk/uclic/people/n.berthouze/BerthouzeHCI12.pdf [Accessed March 6, 2013].
  • Kanev, K. & Kimura, S., 2002. Integrating dynamic full-body motion devices in interactive 3D entertainment. IEEE Computer Graphics and Applications, 22(4), pp.76-86. Available at: http://ieeexplore.ieee.org/xpl/articleDetails.jsp?reload=true&tp=&arnumber=1016701&contentType=Journals+&+Magazines&sortType=asc_p_Sequence&filter=AND(p_IS_Number:21873) [Accessed March 6, 2013].
  • Koštomaj, M. & Boh, B., 2009. In: M.E. Altinsoy, U. Jekosch & S. Brewster, eds. Haptic and Audio Interaction Design. Berlin, Heidelberg: Springer. Available at: http://dl.acm.org/citation.cfm?id=1612265.1612287 [Accessed March 6, 2013].

Role of the body in Interactive Installations

Interactive installations allow users to participate, using an element of their body, with or without a physical device. By providing an input, the user prompts the installation technology to generate an outcome. When users interact with an installation they become an integral part of, or an actor within, an environment that has been programmed to receive body actions. Through this interaction the body provokes the environment and discovers aspects of it.

Interactive Devices

Input Device Development

An interesting quote from Joy Mountford (an HCI developer) is: "The mouse is probably the narrowest straw you could try to suck all of human expression through." She is talking specifically about interactivity between people on computer systems, but I think this is applicable when we look at the development of interactivity in installations. Up until the late 1990s the user was somewhat restricted in how they could interact within an installation. Devices like the mouse and touch screens are limiting in that the user only uses their hands and fingers to interact. When full-body input devices like the Kinect were developed, the user became freer to move around the interactive environment, both physically and virtually, as they were no longer tethered to a physical input device. This gave the user greater creative agency and allowed more of the user's personality to come out when interacting with the installation. Virtual environments are also becoming more 'real' looking with the development of 3D stereoscopic imaging, and these technological developments further enhance the user's experience, but there are other forms of interactivity that are in some ways more fundamental than motion sensing.

Sound / Voice Interaction

One of the earliest forms of human expression is the voice. Two installations discussed were Aparna Rao's (Pygmies) and Golan Levin's (Mark). In Aparna Rao's installation, small creatures hide behind blank frames and only emerge when the room is quiet. As soon as there is any foreground noise or voices, they quickly hide back behind the frames. It is a playful environment that involves only minimal bodily interaction, but it creates a fun atmosphere that invites users to play and discover the limits of the creatures' movements.
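
A minimal sketch of the hide/emerge behaviour described above. The get_room_level() function, the threshold and the timings are invented stand-ins for illustration; the real installation's sound analysis is not documented here.

```python
import random
import time

QUIET_THRESHOLD = 0.1   # assumed normalised amplitude below which the room counts as quiet
QUIET_SECONDS = 2.0     # creatures only emerge after the room has been quiet this long

def get_room_level() -> float:
    """Hypothetical stand-in for a real microphone reading (0.0 = silent, 1.0 = loud)."""
    return random.random()

def run(creatures_hidden: bool = True) -> None:
    quiet_since = None
    while True:
        level = get_room_level()
        if level >= QUIET_THRESHOLD:
            quiet_since = None
            if not creatures_hidden:
                creatures_hidden = True
                print("Noise detected - creatures dart back behind the frames")
        else:
            quiet_since = quiet_since or time.monotonic()
            if creatures_hidden and time.monotonic() - quiet_since >= QUIET_SECONDS:
                creatures_hidden = False
                print("Room is quiet - creatures peek out from behind the frames")
        time.sleep(0.2)
```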

In Golan Levin's installation (Mark), the interactivity is based on voice analysis technology. The installation converts speech into letters and shapes: when a phoneme is recognised it is spelled out, and if it is not recognised by the software a shape is output instead. The shape is coupled to the harshness and timbre of the user's voice, i.e. words with letters like 't', 'k' and 'p' generate spiky shapes, while words with sounds like 'o', 'm', 'n' and 'a' create rounder shapes.
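
A rough sketch of that spiky-versus-round mapping, using an assumed split of letters into "harsh" and "soft" sets. The real installation uses proper phoneme recognition; this only illustrates the idea.

```python
# Assumed letter groupings for illustration; a real system analyses phonemes, not letters.
HARSH = set("tkpbdg")    # plosive-like sounds -> spiky shapes
SOFT = set("omnaeiu")    # vowel/nasal-like sounds -> rounder shapes

def shape_for_word(word: str) -> str:
    """Return 'spiky', 'round' or 'mixed' depending on the letters in the word."""
    letters = set(word.lower())
    has_harsh = bool(letters & HARSH)
    has_soft = bool(letters & SOFT)
    if has_harsh and not has_soft:
        return "spiky"
    if has_soft and not has_harsh:
        return "round"
    return "mixed"

for word in ["tsk", "moon", "pocket", "aroma"]:
    print(word, "->", shape_for_word(word))
```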

Eye Interaction

Another interesting area of interactivity is eye interaction. An installation I looked at was by Golan Levin. He creates a recursive piece of art by standing in front of a screen fitted with a webcam. The camera tracks his eyes, takes movie snapshots of them and places them in an array of multiple snapshots on the screen. The installation is quite basic, but behind it lies the concept of ever-changing art. When the next user comes along, their eyes are filmed and the screen is rewritten with snapshots of their eyes looking at the previous user's eyes. This continues each time a new user interacts with the installation. It is an interesting area because it not only lets the current user become an integral part of the installation, it also creates an organic piece of art that is forever changing.
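
A minimal OpenCV sketch of that mechanism, assuming a webcam, a Haar cascade eye detector, a 6 x 6 grid and single-frame snapshots rather than short clips; all of these parameters are assumptions, not details of Levin's piece.

```python
import cv2
import numpy as np

GRID = 6     # assumed 6 x 6 grid of past eye snapshots
CELL = 100   # each snapshot is resized to 100 x 100 pixels

eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
canvas = np.zeros((GRID * CELL, GRID * CELL, 3), dtype=np.uint8)
snapshots = []

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    for (x, y, w, h) in eyes[:1]:                 # keep at most one eye per frame
        snap = cv2.resize(frame[y:y + h, x:x + w], (CELL, CELL))
        snapshots.append(snap)
        snapshots = snapshots[-GRID * GRID:]      # the grid is rewritten as new eyes arrive
    for i, snap in enumerate(snapshots):
        r, c = divmod(i, GRID)
        canvas[r * CELL:(r + 1) * CELL, c * CELL:(c + 1) * CELL] = snap
    cv2.imshow("eye grid", canvas)
    if cv2.waitKey(30) & 0xFF == 27:              # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```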

Body Architecture

An area of interactivity involving the human body that encompasses conceptual, artistic and technological paradigms is body architecture. An artist who describes herself as a 'body architect' is Lucy McRae. She has several areas of design and research, and attempts to blur the physical lines that separate the human form from its environment. She explores this concept using new materials science and technology. Another of her areas worth mentioning is emotive technology, where the human user's emotional state is measured and physically alters the state of a piece of technology or material. A research project relating to this was Philips' Skin Probe Project, in which a dress responded to the wearer's emotional state by 'blushing'. At the moment these are primarily concept design artefacts, but there is potential for them to be incorporated into interactive installation design, where the user's physical and emotional state is integrated into the response of the installation.

Interaction Technologies for People with Disabilities

In this section we discussed a selection of technologies that have been developed to help people with disabilities interact with computers and other devices.

Examples

BrainPort

Developed by Paul Bach-y-Rita, originally as an aid to help stroke victims regain their sense of balance. A camera transmits images to an electrode array that the user rests against their tongue. This enables the user to 'see' the image with their tongue.
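
A simplified sketch of the image-to-electrode idea, assuming a 20 x 20 electrode grid and treating pixel brightness as stimulation intensity. The real device's signal processing is far more involved; this only shows the downsampling step.

```python
import cv2
import numpy as np

ARRAY_SIZE = 20  # assumed 20 x 20 tongue electrode grid

def frame_to_stimulation(frame: np.ndarray) -> np.ndarray:
    """Reduce a camera frame to a small grid of stimulation intensities (0-255)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Each cell of the electrode array receives the average brightness of its image region.
    return cv2.resize(gray, (ARRAY_SIZE, ARRAY_SIZE), interpolation=cv2.INTER_AREA)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
cap.release()
if ok:
    grid = frame_to_stimulation(frame)
    print(grid.shape)  # (20, 20): one intensity value per electrode
```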

Mobile Lorm Glove

Designed for communication with deaf-blind people. Lorm is a tactile hand-touch alphabet used by deaf-blind people, invented by Hieronymus Lorm in the 19th century. The Mobile Lorm Glove uses buttons on the palm side of the glove so the user can compose and send messages. Micro vibrators, of the type used in mobile phones, are placed at positions on the back of the glove, giving haptic feedback so the user can receive messages. The glove is designed to interface with mobile phones, through which it offers the potential to control other devices.
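
A hedged sketch of how such a glove could be driven in software. The button layout and vibration mapping below are invented for illustration; the actual Lorm alphabet maps letters to specific touch positions and strokes on the hand, which this toy example does not reproduce.

```python
from dataclasses import dataclass

# Invented layout for illustration only; the real Lorm alphabet is more elaborate.
BUTTON_TO_LETTER = {0: "a", 1: "e", 2: "i", 3: "o", 4: "u"}        # one button per fingertip
LETTER_TO_MOTORS = {v: [k] for k, v in BUTTON_TO_LETTER.items()}   # vibrate the matching fingertip

@dataclass
class LormGlove:
    outgoing: str = ""

    def press(self, button: int) -> None:
        """Sender side: a button press on the palm adds a letter to the outgoing message."""
        self.outgoing += BUTTON_TO_LETTER.get(button, "?")

    def receive(self, text: str) -> None:
        """Receiver side: each letter triggers the vibration motors that encode it."""
        for letter in text:
            motors = LETTER_TO_MOTORS.get(letter, [])
            print(f"letter {letter!r} -> pulse motors {motors}")

glove = LormGlove()
glove.press(0)
glove.press(2)
print("outgoing message:", glove.outgoing)
glove.receive("oi")
```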

Tongue Drive

A magnet is attached to the user's tongue. The position of this magnet is picked up by sensors, positioned either outside the face near the user's cheeks or in a dental retainer. The system turns tongue position into accurate pointing commands for interfacing with computers or powered wheelchairs.
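
A minimal sketch of turning magnet-position readings into commands. The coordinate convention, dead zone and command set are assumptions; the real system uses magnetic field sensors and trained classification.

```python
# Invented convention: (x, y) is the tongue magnet's offset from its rest position,
# x = left/right and y = forward/back, roughly in centimetres.
DEADZONE = 0.5  # assumed: movements smaller than this are ignored

def magnet_to_command(x: float, y: float) -> str:
    """Map a tongue-magnet position to a simple wheelchair/cursor command."""
    if abs(x) < DEADZONE and abs(y) < DEADZONE:
        return "stop"
    if abs(x) > abs(y):
        return "right" if x > 0 else "left"
    return "forward" if y > 0 else "back"

for reading in [(0.1, 0.2), (1.2, 0.3), (-0.9, 0.1), (0.2, 1.5)]:
    print(reading, "->", magnet_to_command(*reading))
```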

Tobii PCEye

This is a consumer device costing around €3,500. It enables basic interaction with computer operating systems using eye movements alone.

Eye Control Headset by Dr. Aldo Faisal

Dr. Faisal is a neuroscientist who has developed an eye-control headset that can be built for around €25, using parts from games console controllers and glasses frames.

Brain Computer Interface (BCI)

Invasive or partially invasive BCIs offer the most reliable “interface” with the brain. With invasive BCI, electrodes are placed directly into the grey matter of the brain. A disadvantage is that scar tissue builds up around the electrodes over time, reducing the signal quality and eventually making it unusable. Partially invasive BCIs place the electrodes inside the skull, but not in the grey matter of the brain. Non-invasive BCI uses sensors external to the skull. This is not as accurate or reliable as invasive BCI; however, recent advances in this technology mean that commercial devices are available for just a few hundred euros. Examples include the Emotiv EPOC headset and toys that use this technology, such as the Japanese Necomimi, a pair of animal-like ears attached to a headband which respond to the wearer's mood. Although these are consumer technologies, what is particularly exciting is that their availability and accessibility enable accelerated development of applications in many different areas, not just medical research.
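
A rough sketch of the kind of processing a consumer headset might perform, estimating alpha-band power from a simulated EEG trace with NumPy. The sample rate, frequency bands and the relaxed/alert rule are assumptions, not the EPOC's or Necomimi's actual algorithm.

```python
import numpy as np

FS = 256          # assumed sample rate in Hz
ALPHA = (8, 12)   # alpha band in Hz, often associated with relaxed states

def band_power(signal: np.ndarray, fs: int, band: tuple) -> float:
    """Estimate power in a frequency band from the FFT of the signal."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(spectrum[mask].mean())

# Simulated one-second EEG trace: a 10 Hz rhythm buried in noise.
t = np.arange(FS) / FS
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(FS)

relaxed = band_power(eeg, FS, ALPHA) > band_power(eeg, FS, (20, 30))
print("ears droop (relaxed)" if relaxed else "ears perk up (alert)")
```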

Role of the body in Future Technology

This sub-topic discusses four main areas:

  • Bio Printers, an invention in medical technology
  • Wearable Technology; this falls under art
  • Ubiquitous Computing and Embedded Interfaces
  • Nano-science Technology, where all these areas are combined

Bio Printers

  • Bio printers are a recent development of 3D printers: they construct living tissue using living cells, unlike conventional 3D printers, which use materials like metal and plastic.
  • They deposit cells from a bio-print head that moves left and right, back and forth, and up and down in order to place cells exactly where required.
  • Over a period of several hours, an organic object is printed.

Bioprinting Pioneers

In 2002 Professor Makoto Nakamura realized that the droplets of ink in a standard inkjet printer are about the same size as human cells. He therefore decided to adapt the technology, and by 2008 had created a working bioprinter that can print out biotubing similar to a blood vessel.
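
A toy sketch of the path such a print head might follow to lay cell droplets down as a tube, layer by layer, in the spirit of the biotubing described above. The radius, droplet spacing and layer height are made-up numbers, and real bioprinters interleave cells with supporting gel, which this sketch ignores.

```python
import math

RADIUS = 1.5          # assumed tube radius in mm
LAYER_HEIGHT = 0.3    # assumed vertical step between layers in mm
DROPLETS_PER_RING = 24
LAYERS = 10

def tube_droplet_positions():
    """Yield (x, y, z) positions for cell droplets forming a simple tube."""
    for layer in range(LAYERS):
        z = layer * LAYER_HEIGHT
        for i in range(DROPLETS_PER_RING):
            angle = 2 * math.pi * i / DROPLETS_PER_RING
            yield (RADIUS * math.cos(angle), RADIUS * math.sin(angle), z)

positions = list(tube_droplet_positions())
print(f"{len(positions)} droplets, first: {positions[0]}, last: {positions[-1]}")
```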

Future Vision

  • The bioprinting pioneer Organovo has successfully implanted bioprinted nerve grafts into rats and anticipates human trials of bioprinted tissues by 2015.
  • At the moment bio printers are used only for research purposes, but they will produce simple human tissue structures for toxicology tests, thereby reducing the need for animal testing.
  • This has also led to the creation of the EnvisionTEC Bioplotter, which prints tissue spheroids and supportive scaffold materials, including fibrin and collagen hydrogels, that may be used to support and help form artificial organs, and which may even be used as bioprinted substitutes for bone.
  • In future, bio printers will print human organs using cultures of a patient's own cells to reduce the risk of rejection. Others are developing techniques that will enable cells to be printed directly onto or into the human body, e.g. for skin grafting.
  • Organovo Products

For more information about the future of bio printing and bioprinting companies visit:-



Wearable Technology

I-Garment

  • This is a smart full-body garment integrated with a civil protection unit management system that uses satellite communications and space technology.
  • The garment is equipped with a Global Positioning System to pinpoint the location of the wearer, so firefighters can be located during an emergency.
  • It is also fitted with body sensors and can therefore continuously monitor the firefighter's state of health (body temperature and heartbeat).
  • This information enables the coordination centre to initiate the activities necessary to rescue endangered firefighters (a sketch of the kind of telemetry message involved follows after this list).
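
A minimal sketch of a telemetry message such a garment might send to the coordination centre. The field names, units, example coordinates and alert thresholds are assumptions made for illustration, not the I-Garment's actual protocol.

```python
import json
from dataclasses import dataclass, asdict

MAX_SAFE_TEMP_C = 39.0       # assumed alert threshold for body temperature
MAX_SAFE_HEART_RATE = 180    # assumed alert threshold in beats per minute

@dataclass
class GarmentReading:
    firefighter_id: str
    latitude: float
    longitude: float
    body_temp_c: float
    heart_rate_bpm: int

    def needs_attention(self) -> bool:
        """Flag the wearer if their vital signs cross the (assumed) safety thresholds."""
        return self.body_temp_c > MAX_SAFE_TEMP_C or self.heart_rate_bpm > MAX_SAFE_HEART_RATE

reading = GarmentReading("FF-07", 52.668, -8.630, 39.4, 172)
message = json.dumps(asdict(reading))   # what would be sent over the satellite link
print(message, "ALERT" if reading.needs_attention() else "OK")
```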

History

This garment was developed as part of the European Space Agency Telecom Programme, which aims to develop applications for satellite communications. The project has developed prototypes and carried out field tests involving firefighters.

Evolution of Smart Textiles

Smart textiles and clothing are proceeding through a series of evolutionary stages on a path that will finally result in complete, seamless, invisible integration with the fabric of the clothing itself. These gradual steps can be summarised as:

  • Step 1: Side by Side Systems

Here the electronics are attached to the textile through external elements (pockets, pouches) and remain rigid and enclosed. Cables and soft switches are sewn into the clothing, as can rigid electronics such as LED lights, which are either removable for washing or encapsulated.

  • Step 2: Hybrid Systems

Electronics are permanently attached to the textile through a closer coupling, for example through embroidered patches or woven connections, and the system becomes flexible and washable.

  • Step 3: Complete Integration

Electronic functions are completely integrated into the clothing, the textiles and even the fibres themselves, providing active functions such as sensors, actuators and processors. There are no longer separate electronics and textiles, only electronic textiles.

Exoskeleton

Exoskeletons for human performance augmentation are a new type of body cover being developed for soldiers that will significantly increase their capacity. An exoskeleton will allow soldiers to carry more without feeling the weight, and to move faster too.

History

  • General Electric developed the first exoskeleton device in the 1960s. Called the Hardiman, it was a hydraulic and electrical body suit; however, it was too heavy and bulky to be of military use.
  • Currently, exoskeleton development is being done by DARPA under their Exoskeletons for Human Performance Augmentation Program, led by Dr. John Main.
  • DARPA began phase I of the exoskeleton program in 2001. Phase I contractors included Sarcos Research Corporation, University of California, Berkeley, and the Oak Ridge National Laboratory.
  • DARPA selected two contractors to enter the program’s second phase in 2003, Sarcos Research Corporation and the University of California, Berkeley.
  • The program’s final phase, which began in 2004, is being conducted by the Sarcos Research Corporation and focuses on development of a fast-moving, heavily armored, high-power lower and upper body system.
  • http://inventors.about.com/od/estartinventions/a/Exoskeleton.htm

For more information visit:-

Ubiquitous Computing and Embedded Interfaces

Ubiquitous Computing

This is where computers are everywhere, all the time; we use them without thinking about them. Mark Weiser calls this "calm technology": technology that recedes into the background of our lives. Ubiquitous computing is driven not so much by the problems of the past as by the possibilities of the future.

The Present

At the moment we have ubiquitous computing of a sort: we have computers in our phones, watches (such as the Pebble watch), TVs, stereos, kitchens, tables and floors, but we have not yet achieved the perfect vision. Ubiquitous computing nowadays has been largely application driven, reporting on technical developments and new applications for RFID (radio-frequency identification) technologies, smartphones, active sensors and wearable computing. The risk is that, in focusing on the technical capabilities, the end result is a host of advanced applications that bear little resemblance to Weiser's original vision. This is a classic case of not seeing the forest for the trees.

Nanoscience Technology

First of all, we have to understand that nano is very small: a nanometre is one billionth of a metre. A man's beard grows about five nanometres every second.
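
As a quick sanity check of the beard figure, assuming a typical growth rate of about 0.4 mm per day (an assumed round number, not a measured value from this seminar):

```python
mm_per_day = 0.4                            # assumed typical beard growth rate
nm_per_second = mm_per_day * 1e6 / 86400    # 1 mm = 1,000,000 nm; 86,400 seconds in a day
print(round(nm_per_second, 1), "nm per second")  # roughly 4.6 nm/s, i.e. about five nanometres
```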

  • Nanotechnology is the science of interacting with atoms or molecules to modify the way they behave. Nanotechnology has been around since the early Romans, who created windows using different-coloured glass; they used nanotechnology to give this glass its different colours.
  • All around us we see things happening at the nano scale, for example geckos sticking to trees, birds navigating and the sunset looking red.
  • As gold particles get smaller they stop looking gold and start reflecting different colours, such as red, purple, blue and green.
  • If scientists understand nature's nano secrets, they can use them to better our world.
  • Tropical plants have leaves that allow water to run off them instead of sticking. At the nano scale these leaves have tiny structures that stop water from sticking. Scientists can use this knowledge to make water-repellent surfaces.
  • Ants can stick to surfaces upside down, holding up to 100 times their own weight, and release when they want to move on. At the nano scale, ants have tiny pads on their feet that help them stick to surfaces. This knowledge could lead to the development of quick-release adhesives and miniature grippers ideal for manipulating microscopic components or holding tiny bits of tissue together during nanosurgery.
  • How Do Ants Stick to the Ceiling
  • Pitcher plants eat insects that slip into them, but this does not work unless the rim is covered with water. The water can spread even upward, against gravity, on their rims. Looked at closely, pitcher plants have grooves running across the rim, and at the nano scale these grooves have even smaller grooves. This is called a superhydrophilic surface: a surface that attracts water. Scientists can use this in water filters to filter even the dirtiest water and make it safe for drinking.

Nanoscience Instruments


Seminar Activities

We had two class activities.

First Activity

  • Choose a random envelope of each colour
  • Create a concept for the selected theme
  • Explain the design and how it works
  • Discuss its uses

The class was divided into four groups and each group was presented with two envelopes, one containing a body part and the other one of the four areas. This activity was very interesting; we provided the class with materials to use for the project, and the final products were very colourful. All members of the groups discussed their projects, and we took photos and videos of them.

Second Activity

  • Use only your head, any part
  • Create a piece of Art Work
  • Upload it to your blog or student share

The class had to draw, paint, colour or scribble using only the head, no hands. We provided the class with rubber bands, straws, strings and coloured pencils. This activity made people reflect on what disabled people go through every day in their daily activities.

Class Seminar Slides

Slides: [1]

Class Seminar Photos

Photos: [2]

Class Seminar Videos

Videos: [3]
