Interfaces


What is an interface?

First, let’s talk about systems:

  • Systems

Our daily lives are full of systems: lighting systems, door-access systems, computer systems; even our bodies are made up of systems. Normally, a system has three parts:

These are the system inputs, the system outputs, and the system "black box".

The black box is where the actual work or computation takes place. I call it a "black box" because it is encapsulated: we don't need to know what is really going on inside.

Here is a simple picture of a simplified system: inputs, outputs, and the black box.


If we pick out all or some of the inputs and outputs and put them together in some kind of order or arrangement, we get an interface! That is my personal understanding.


  • Inputs

So, first are the system inputs. Since we all mainly focus on media systems, the inputs here are all about computers.

    • Mouse, keyboard, microphone, camera…

These are the most common computer input devices, and we know them well from using them every day. There is another kind of input device that is less common for us but very important for our studies: sensors.

    • Sensors

I believe that everyone sitting in this classroom wants to know how many kinds of sensors there are, and which ones are feasible for us to use in our projects, right? So, first, thanks to Gabriela for giving us this picture of sensor categories. It doesn't include all sensor types, but I believe it is sufficient for our study. Wenbin and I are going to give a very rough introduction to some of these sensors.



      • Flex sensor

- Flex sensors change resistance according to the amount of bend on the sensor. They convert bend into a change in electrical resistance: the more the bend, the greater the resistance.

Applications: Human joint movement or placement. A flex sensor can be placed on a moving human joint to provide an electrical indication of movement or position.

      • Gaming or control gloves

A flex sensor can be placed onto a glove to make a controller.
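To make this concrete, here is a minimal sketch (my own illustration, not from the source) of how glove software might turn a flex-sensor reading into a bend angle. It assumes the sensor sits in a voltage divider read by a 10-bit ADC; all component values and calibration points are hypothetical.

 # Minimal sketch: estimating bend angle from a flex sensor in a voltage divider.
 # Assumed (hypothetical) hardware: 5 V supply, 10 kOhm fixed resistor, 10-bit ADC,
 # sensor calibrated at 25 kOhm when flat and 100 kOhm at 90 degrees of bend.

 VCC = 5.0           # supply voltage, volts
 R_FIXED = 10_000.0  # fixed divider resistor, ohms
 ADC_MAX = 1023      # 10-bit ADC full scale
 R_FLAT = 25_000.0   # calibrated resistance when flat, ohms
 R_BENT = 100_000.0  # calibrated resistance at 90 degrees, ohms

 def flex_resistance(adc_value: int) -> float:
     """Convert a raw ADC reading of the divider midpoint into sensor resistance."""
     v_out = VCC * adc_value / ADC_MAX
     # Divider equation v_out = VCC * R_flex / (R_FIXED + R_flex), solved for R_flex:
     return R_FIXED * v_out / (VCC - v_out)

 def bend_angle(adc_value: int) -> float:
     """Linearly interpolate the bend angle (degrees) between the calibration points."""
     r = flex_resistance(adc_value)
     return 90.0 * (r - R_FLAT) / (R_BENT - R_FLAT)

 # Example with a fake reading; a real glove would poll the ADC in a loop:
 print(round(bend_angle(850), 1), "degrees")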


      • MEMS gyroscope

– It measures the motion, position, and rotation of a moving object. MEMS gyroscopes can measure complex motion accurately in multiple dimensions, tracking the position and rotation of a moving object, unlike accelerometers, which can only detect that an object has moved or is moving in a particular direction.

Applications:

It can be used in GPS navigation systems. It can be used in video game controllers.
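As a minimal sketch of the idea (the sample values and timing are made up for illustration), the snippet below integrates gyroscope rate readings into a rotation angle:

 # A gyro reports angular velocity (degrees per second); integrating it over
 # time yields the rotation angle. Readings below are invented.
 samples_dps = [0.0, 15.0, 30.0, 30.0, 10.0, 0.0]  # z-axis rate, deg/s
 DT = 0.1  # sampling interval, seconds

 heading = 0.0
 for rate in samples_dps:
     heading += rate * DT  # simple rectangular integration

 print(f"estimated rotation: {heading:.1f} degrees")
 # Small rate errors are integrated too, so the estimate drifts over time;
 # that is why gyros are usually fused with accelerometers (see the IMU sketch below).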

      • Conductive ribbon

– It measures stretching. As you stretch it out, its resistance increases linearly.


Application:

Wearables. Imagine a piece of clothing made of this material.

      • Light sensor

– Light sensors detect the presence of light in an area.

Applications:

Car light control: turn the car lights on automatically when it gets dark.
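Here is a minimal sketch of that car-light logic, assuming a raw brightness value from an ADC; the threshold numbers are invented. Two different thresholds (hysteresis) keep the lights from flickering at dusk:

 DARK_ON = 200     # turn the lights on below this brightness reading (assumed)
 BRIGHT_OFF = 400  # turn them off above this reading (assumed)

 def update_lights(brightness: int, lights_on: bool) -> bool:
     """Return the new lights state for the current brightness reading."""
     if not lights_on and brightness < DARK_ON:
         return True   # it got dark: switch the lights on
     if lights_on and brightness > BRIGHT_OFF:
         return False  # it got bright again: switch them off
     return lights_on  # inside the hysteresis band: keep the current state

 state = False
 for reading in [500, 350, 180, 250, 390, 450]:  # fake sensor readings
     state = update_lights(reading, state)
     print(reading, "->", "ON" if state else "OFF")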

      • Color/Image sensor

- It converts an optical image into an electronic signal. Here is a video that visualizes the process beautifully: http://www.youtube.com/watch?v=ezIZjFt80kQ

Application: Any imaging device.

      • Sound sensor

- A sound sensor converts sound into electrical signals: it detects pressure differences in the air and transforms them into an electrical signal.

Applications: Sound control, voice commands.

      • Gas sensor

- A gas sensor measures the concentration of a gas in its vicinity by interacting with it. Each gas has a unique breakdown voltage, i.e. the electric field strength at which it is ionized, and the sensor identifies gases by measuring these voltages.

Applications: Fire detection (we have one right here), alcohol breath testing, environmental monitoring.

      • IR sensor

– An IR sensor can measure the heat of an object and detect motion. It senses characteristics of its surroundings by either emitting or detecting infrared radiation.

Applications: Collision or obstacle detection; heating, hyperspectral imaging, night vision...

      • PIR sensor

– A passive infrared sensor measures infrared light radiating from objects in its field of view. All objects with a temperature above absolute zero emit heat energy in the form of radiation. Usually this radiation is invisible to the human eye because it is emitted at infrared wavelengths, but it can be detected by electronic devices designed for that purpose.

Application:

Motion detection, as in a burglar alarm.

      • Ultrasonic sensor

– It works on a principle similar to radar or sonar, which evaluate attributes of a target by interpreting the echoes of radio or sound waves respectively. Ultrasonic sensors generate high-frequency sound waves and evaluate the echo received back by the sensor, calculating the time interval between sending the signal and receiving the echo to determine the distance to the object.

Applications:

Object detection and distance measurement: an ultrasonic sensor can detect the movement of targets and measure the distance to them. Imaging parts of the body: ultrasound is also used in medicine.
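The distance calculation itself is simple. A minimal sketch, assuming the sensor reports the round-trip echo time in seconds:

 SPEED_OF_SOUND = 343.0  # metres per second in air at about 20 degrees Celsius

 def echo_to_distance(echo_time_s: float) -> float:
     """Round-trip echo time (seconds) -> distance to the object (metres)."""
     # The pulse travels to the object and back, so divide the path by two.
     return SPEED_OF_SOUND * echo_time_s / 2.0

 # Example: an echo after 5.8 milliseconds means the object is about 1 m away.
 print(f"{echo_to_distance(0.0058):.2f} m")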

      • Accelerometer

– It is one of the most common inertial sensors. Accelerometers can measure acceleration along one, two, or three orthogonal axes.

Applications:

As an inertial sensor, it measures velocity and position. As an inclination sensor, it detects inclination, tilt, or orientation in two or three dimensions. As a vibration sensor, it senses vibration or impact.
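Here is a minimal sketch of the inclination-sensor use: when the device is at rest, the accelerometer measures only gravity, so the direction of the measured vector gives the tilt. The readings are made-up values in units of g:

 import math

 def tilt_angles(ax: float, ay: float, az: float) -> tuple[float, float]:
     """Return (roll, pitch) in degrees from a static 3-axis reading."""
     roll = math.degrees(math.atan2(ay, az))
     pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
     return roll, pitch

 # Device lying almost flat, tilted slightly:
 roll, pitch = tilt_angles(0.10, 0.05, 0.99)
 print(f"roll {roll:.1f} deg, pitch {pitch:.1f} deg")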

      • IMU

– Inertial measurement unit. It measures and reports a craft's velocity, orientation, and gravitational forces using a combination of accelerometers and gyroscopes, and sometimes also magnetometers.

Application:

IMUs are typically used to maneuver aircraft, including unmanned aerial vehicles (UAVs), and spacecraft. The IMU is the main component of the inertial navigation systems used in aircraft, spacecraft, watercraft, and guided missiles, among others.
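A minimal sketch of the fusion idea inside an IMU: a complementary filter blends the gyro's short-term accuracy with the accelerometer's drift-free tilt reference. The filter constant and sample data are assumptions for illustration:

 import math

 ALPHA = 0.98  # trust placed in the integrated gyro versus the accelerometer
 DT = 0.01     # sampling interval, seconds

 def fuse(angle: float, gyro_rate_dps: float, ax: float, az: float) -> float:
     """One filter step: blend gyro integration with the accelerometer angle."""
     accel_angle = math.degrees(math.atan2(ax, az))  # tilt implied by gravity
     return ALPHA * (angle + gyro_rate_dps * DT) + (1.0 - ALPHA) * accel_angle

 angle = 0.0
 for gyro, ax, az in [(10.0, 0.02, 1.0), (12.0, 0.03, 1.0), (11.0, 0.05, 0.99)]:
     angle = fuse(angle, gyro, ax, az)
 print(f"fused tilt estimate: {angle:.2f} degrees")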

Non-GUI interfaces

GUI stands for graphical user interface, the most common type of interface. Wenbin and I are going to talk about other types of interfaces:

  • Haptic interfaces
  • Locomotion interfaces
  • Olfactory interfaces
  • Gesture interfaces
  • Auditory interfaces (of several kinds)


Haptic interfaces

So, what is a haptic interface? Haptic interfaces receive haptic information from the user and/or deliver haptic feedback to the user.

This kind of interface generally involves two kinds of sensation: the cutaneous sense and the kinesthetic sense. Cutaneous touch is about sensing surface features and tactile perception; normally it is delivered through our skin. Kinesthetic touch is related to muscles and tendons; it gives us an understanding of where our limbs are in space. The main components of a haptic interface are:

  • a mechanism that defines the motion capabilities of the human user when interacting with the device;
  • sensors that track user motion in the virtual environment;
  • actuators (motors) that render the desired forces or textures.

Why haptic interfaces?

The feedback from a haptic interface is mainly force or touch delivered through the human-machine interface. Haptic feedback can therefore give remote or visually simulated objects actual physical attributes such as roughness, hardness, and mass, granting the user the ability to feel and manipulate remote or virtual objects.

Haptic interfaces also enable perception of limb movement and position.

Not long ago, Geoff showed us the PlayStation Move. It offers very basic haptic feedback, namely vibration, but it is mainly a motion controller. Take that gladiator game as an example: the user cannot feel the weight of the weapon or the bounce of hitting objects. If we could somehow deploy a haptic interface to convey this information, the game experience would improve revolutionarily. Haptic interfaces also allow performing skilled tasks, especially tasks that require precision and speed. A good example is remote surgery. To perform an operation, the information received through the eyes is far from enough; doctors also need the information felt through their hands to perform the surgery successfully. If a good haptic interface could simulate these feelings and pass them to the doctor, remote surgery would be more feasible, much safer for the patient, and much easier for the doctor.

Design:

There is a lot of information about designing haptic interfaces; I have picked out several points to talk about.

  • Tactile or kinesthetic? Designers need to determine which type of feedback is preferred. If the purpose is to display surface roughness or provide a simple alert, a tactile device is preferred; if the purpose is sensing shapes, positions, or forces, a kinesthetic device is preferred.
  • Consider the inherent capabilities of the human. The haptic device will be mechanically coupled to the human user, so the designer must make sure the system characteristics, such as workspace size, force magnitude, and velocity, are well matched to the human user.
  • Consider human sensitivity to tactile stimuli. Sensitivity to tactile stimuli varies with many factors; gender and even body location can affect detection thresholds. Designers need to know what level of vibration or how strong a pressure is needed to make sure the user feels it.
  • Use active rather than passive movement. Active movement ensures more accurate limb positioning. The designer should avoid static positions at the end of the range of motion to reduce fatigue.
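To make the kinesthetic case concrete, here is a minimal sketch (my own illustration, not from the source) of the classic one-dimensional "virtual wall": the device reports the probe position, and the actuator pushes back with a spring force proportional to how far the probe has penetrated the wall. The wall position and stiffness are invented values:

 WALL_POSITION = 0.05  # wall location along the axis, metres (assumed)
 STIFFNESS = 800.0     # virtual spring constant, newtons per metre (assumed)

 def wall_force(probe_position: float) -> float:
     """Force (N) the actuator should apply for a given probe position."""
     penetration = probe_position - WALL_POSITION
     if penetration <= 0.0:
         return 0.0  # in free space: no force
     return -STIFFNESS * penetration  # inside the wall: push back out (Hooke's law)

 for pos in [0.02, 0.05, 0.052, 0.06]:  # fake position samples, metres
     print(f"x = {pos:.3f} m -> force {wall_force(pos):+.1f} N")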

Video 1 Video 2 Video 3


Locomotion interface

Here is a small game for you to play first.


What is a locomotion interface? Locomotion refers to the action of an organism moving itself from one place to another, such as a bird flying or a fish swimming. For humans, locomotion means walking, running, jumping, and so on.

Locomotion interfaces let users move around in real or virtual spaces and feel as if they are moving. Common virtual-locomotion interfaces take input only from the user's hands; for example, 3D video games are operated by keyboard, mouse, or game controller. Here we are going to look at virtual-locomotion interfaces that involve more of the user's body.

The key point of virtual-locomotion interfaces is to understand the user's intention from data derived from sensing the pose and movement of the user's body.

There are three types of systems for virtual locomotion:

  • walking-style - input movements and the resulting movements are as natural and as much like real walking as possible.

video

  • vehicle-style - input movements and responses are close to driving a vehicle. example

  • magic-style - allows movements that have no natural counterpart in the real world but that serve a useful purpose when moving about a virtual scene. For example, the ability to teleport between two locations in a virtual world is a magic property.

Sensing body position and movement

Sensors that measure, record and report body motions are normally called trackers. Trackers can measure the position and orientation of parts of the body, or can measure body movement. Here are two common tracking systems:

  • Trackers with sensors and beacons - Commonly used in virtual-reality systems, this type of tracker has one or more sensors worn on the user's body and beacons fixed in the room. In this system, each sensor reports its pose relative to the room.
  • Beaconless trackers - Tracking technologies that do not rely on beacons. Some of them determine the orientation of a body part from the Earth's magnetic field or the Earth's gravitational field. Others use inertial sensors.
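To show how tracker data can reveal locomotion intent, here is a minimal sketch (my own illustration) of a walking-in-place detector: a head-worn tracker's height bobs with each step, so counting those oscillations gives a step signal the system can map to forward motion. The trace and threshold are made up:

 def count_steps(heights: list[float], threshold: float = 0.01) -> int:
     """Count steps as up-then-down excursions of the head around its mean height."""
     mean = sum(heights) / len(heights)
     steps = 0
     above = False
     for h in heights:
         if h > mean + threshold:
             above = True
         elif above and h < mean - threshold:
             steps += 1  # head came back down: one step completed
             above = False
     return steps

 # Simulated head heights (metres) while stepping in place twice:
 trace = [1.70, 1.72, 1.74, 1.71, 1.68, 1.70, 1.73, 1.74, 1.70, 1.67]
 print(count_steps(trace), "steps detected")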

Displaying feedback

As users specify how they want to move, the system must provide feedback via a display to indicate how and where they are moving. Display is a general term here; it can also refer to stimuli for other senses.

HMD - Head-mounted displays are head-worn visual displays that provide visual feedback. With this kind of display, no matter how the user turns the body or head, the display is always directly in front of the eyes. The downside is that many HMDs are heavy to wear, and their cables may interfere with the user's head motion. example


Large screen display example

Display for other senses:

Auditory displays - most systems use auditory displays as a supplement to the primary visual display. Hearing your own footsteps is one cue that helps you remain oriented during movement, and subtle echoes tell you much about the size of the room you are in and your position in it.

Motion display - a platform that is physically moved under computer control while the user sits or stands on it, or a motorized treadmill, where the computer system continually adjusts the speed and direction of the moving belt in response to the user's inputs and position.

The locomotion metaphor (walking-style, vehicle-style, or magic-style) should match the goals of the interface

The designer should always consider whether the locomotion metaphor suits the goal of the locomotion interface. For example, if the interface's goal is to simulate real walking, then the interface should require the user to really turn their body to turn around (walking metaphor), rather than turning by manipulating a steering wheel or hand controller (vehicle metaphor).

Consider supplementing visual display with other senses

A visual display alone may not convey certain kinds of movement well. For example, in a video game, when the user is hit by a bullet, an auditory display and a motion platform would work better.

Consider how long and how often the interface will be used

The designer must consider how long and how often the user will be using the interface, and how fatiguing and stressful the required motions are.


Smell/Olfactory Interfaces

What is olfactory interface?

Olfaction is the sense of smell, and olfactory interfaces are devices/systems that deliver information to users through smells. Smell interfaces are poorly developed compared with visual, auditory, or haptic interfaces. Olfaction is a chemical sense, while vision, hearing, and touch respond to physical stimuli; this makes olfactory interfaces hard to develop in the same manner as interfaces based on sensing physical stimuli. https://www.youtube.com/watch?v=oDQpF3NWoJQ

Producing smells

There are two ways:

  • Vaporization

The straightforward way of vaporizing smells is natural evaporation. This method can be used for liquid sources with relatively high volatility. To accelerate evaporation, we can heat up the material, but be careful: some types of odor molecules are easily destroyed by high temperatures. Another way is to make a fine mist of the liquid using a tool such as a sprayer.

  • Blending

There are two ways to do so: blending in liquid form and blending after vaporization. To blend odors while they are in liquid form, dilution fluids such as alcohol are typically used. To blend odors after they are vaporized, valves are usually used.

Delivering the smells

The traditional way of enjoying smells involves diffusing aromas throughout a room, where they are continuously smelled by users for a sufficient period of time. Widespread, long-lasting aromas are good enough for conveying environmental, background, or ambient information, but this traditional delivery method is not sufficient for computer media technologies, because it cannot remove a smell fast enough once it has diffused into the space. If we want to deploy smells in movies, video games, etc., the smells must change very quickly between scenes or pictures. For the moment, there are two ways to control smells:

Deliver the minimum amount of generated smell directly to the nose. When the scent generator produces a smell, we can use something like a tube to send the smell to the nose. If the scent generator is small enough, we can also attach it directly to the nose and make it wearable.

Use smell-elimination equipment. Suction pumps and filters can be used to eliminate large amounts of smell. With this approach, the scent-delivery system is less critical.
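As a small illustration of valve-based blending (my own sketch, not a real device API), each odor channel's valve can be opened for a fraction of a fixed control period, so the blend ratio is set by duty cycle:

 PERIOD_MS = 1000  # one blending control period, milliseconds (assumed)

 def valve_schedule(recipe: dict[str, float]) -> list[tuple[str, int]]:
     """Turn a blend recipe {odor: parts} into per-valve open times in ms."""
     total = sum(recipe.values())
     return [(odor, int(PERIOD_MS * parts / total)) for odor, parts in recipe.items()]

 # A hypothetical "forest" scent: three parts pine, one part soil.
 for odor, open_ms in valve_schedule({"pine": 3.0, "soil": 1.0}):
     print(f"open {odor} valve for {open_ms} ms of every {PERIOD_MS} ms")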

Application


Design

Is it meaningful or effective for the entire system to append an olfactory channel? If the olfactory interface is inherent to the designer's intention, this decision is reasonable. But if the olfactory interface is not the primary mechanism of interaction, the designer should consider whether the interface is worth the cost.

Number of odors Humans can distinguish thousands of smells, but the number of smells required for a single application may be relatively small. The number of simultaneously required odors directly affects the complexity of the system; thus, careful selection of the odors is essential for creating cost-effective systems.

Number of users and spatial size

The designer should consider the number of users. If the system is for a single person, it should be personal-sized; if it is for a group, larger devices might be needed.

Odor control in timing aspects

As mentioned before, using olfactory interfaces in audiovisual programs or applications with rapid interaction requires fast switching of odors, so designers need to deal with the fact that emitted smells cannot be erased instantaneously.


Gesture Interfaces

GESTURES

Gestures originate from natural interaction between people. They consist of movements of the body and face used as nonverbal communication that complements verbal communication. This is the inspiration behind using gesture interfaces between human and machine.

TECHNOLOGY AND APPLICABILITY

Mechanical and Tactile Interfaces

Early gesture interfaces relied on mechanical or magnetic input devices. Examples include the data glove and the body suit.


(http://static2.actualidadgps.com/wp-content/uploads/2010/02/guantes-gps.jpg)


(http://www-personal.umich.edu/~hamms/portfolio/motioncapture/images/compo_01.jpg)

Computer Vision Interfaces

When the aim is to make gesture interfaces invisible to the user, computer vision is a nice way to detect gestures.

Face Gaze and Expression

Face gaze and expression form a subdomain of gesture research. Facial expressions may be used as input modalities in accessibility research, for example by disabled people who cannot make any movement other than a facial one, such as blinking or smiling.

Applicability

Gesture interfaces are popular wherever the interface requires some freedom of movement or an immersive feeling, such as in virtual-reality environments.

FUNDAMENTAL NATURE OF THE INTERFACE

The simplest gesture interface is the well-known motion detector that turns on a light in a room.


(https://www.youtube.com/watch?v=COt2O6mpu1g)

Gesture Taxonomies

Semantic labels describe the meaning of a gesture, that is, what it communicates and its purpose. They are commonly used in nonverbal-communication studies.


Functional labels describe what the gesture does in an interface. They are commonly used in technical human-computer interaction (HCI) descriptions.

Descriptive labels describe how the gesture is performed, such as its movement. They are also commonly used in technical HCI descriptions. (https://www.youtube.com/watch?v=T63BDr3RLb8)

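As a toy example of a gesture described by its movement, the sketch below (my own illustration) classifies a normalized x-coordinate trace of the hand as a left or right swipe; the thresholds are assumptions:

 def classify_swipe(xs: list[float], min_travel: float = 0.3) -> str:
     """Classify a sequence of normalized x-positions (0..1) as a swipe."""
     travel = xs[-1] - xs[0]
     if abs(travel) < min_travel:
         return "no gesture"  # the hand did not move far enough
     # Require the motion to be mostly monotonic, i.e. a deliberate stroke:
     steps = [b - a for a, b in zip(xs, xs[1:])]
     forward = sum(1 for s in steps if s * travel > 0)
     if forward / len(steps) < 0.8:
         return "no gesture"  # too much back-and-forth jitter
     return "swipe right" if travel > 0 else "swipe left"

 print(classify_swipe([0.2, 0.3, 0.45, 0.6, 0.75]))  # -> swipe right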

HUMAN FACTORS INVOLVED IN INTERFACE DESIGN

One of the most difficult tasks is to find a feasible gesture vocabulary that is easy for the user to remember and perform.


[TEST TIME] Find four students and form two groups; each group guesses the word "tiger". Group one may only communicate with language: result, one minute. Group two may communicate with both language and gestures: result, just one second!


Auditory/Voice user/Voice response Interfaces

Auditory Interfaces

Auditory interfaces are bidirectional, communicative connections between two systems, typically a human user and a technical product. Auditory displays are not new: they have been used as alarms (bells), for communication, and as feedback tools for many decades.


Some of the needs that can be met through auditory displays include (1) presenting information to visually impaired people, (2) providing an additional information channel for people whose eyes are busy attending to a different task, (3) alerting people to error or emergency states of a system, and (4) providing information via devices with small screens such as PDAs or cell phones that have a limited ability to display visual information.

Sonification of Complex Data

Sonification is the use of nonspeech sound to render data, either to enhance a visual display or as a purely auditory display. (https://www.youtube.com/watch?v=3PJxUPvz9Oo)
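As a minimal sketch of the idea (the data series is invented), the snippet below maps each value of a series onto a pitch and writes the result as a short WAV file using only the Python standard library:

 import math, struct, wave

 RATE = 44100                  # samples per second
 data = [3, 5, 9, 6, 2, 7, 8]  # the series to sonify (made-up values)
 lo, hi = min(data), max(data)

 frames = bytearray()
 for value in data:
     # Map the value linearly onto a pitch range of 220-880 Hz:
     freq = 220.0 + (value - lo) / (hi - lo) * (880.0 - 220.0)
     for n in range(int(RATE * 0.25)):  # a 0.25 s tone per data point
         sample = int(20000 * math.sin(2 * math.pi * freq * n / RATE))
         frames += struct.pack("<h", sample)  # 16-bit little-endian PCM

 with wave.open("sonification.wav", "wb") as f:
     f.setnchannels(1)   # mono
     f.setsampwidth(2)   # 16-bit samples
     f.setframerate(RATE)
     f.writeframes(bytes(frames))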

NATURE OF THE INTERFACE

Basic Properties of Sound

Sound arises from variations in air pressure caused by the motion or vibration of an object.

Human Sensitivity to Auditory Dimensions

Human listeners can detect sounds with frequencies between about 16 Hz and 20 kHz, with sensitivity falling off at the edges of this range.

TECHNOLOGY OF THE INTERFACE

CURRENT INTERFACE IMPLEMENTATIONS

Drawbacks to using sound: (1) annoyance, (2) privacy, (3) auditory overload, (4) impermanence.

Advanced audio interfaces: (1) audio for monitoring, (2) audio in games.


Applications of Auditory Interfaces to Accessibility

(1) Accessibility and the desktop: the accessibility settings on a Mac


(2) Mobile accessibility: traffic lights



Voice User Interfaces

A voice user interface (VUI) is the script for a conversation between an automated system and a user. Example: Apple Siri (https://www.youtube.com/watch?v=8ciagGASro0)


The technology behind VUIs is automatic speech recognition (ASR). The system is often a speech-enabled IVR, but increasingly it may also be a personal computer, an in-car navigation system, or another speech-enabled handheld device. All these systems depend on the same basic technology to recognize speech input: in each case, the speech signal is captured, digitized, segmented, and then compared against a set of stored acoustic models for speech sounds.

Interactive Voice Response Interfaces

Interactive voice response (IVR) interfaces are chiefly telephony interfaces. The most common IVR application is voice mail. Activity: call Vodafone 1741 and experience the voice response.
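Here is a minimal sketch of the call-flow logic behind a touch-tone IVR menu, modelled as a small state machine. The prompts and options are invented for illustration; a real system would run on a telephony platform with DTMF or ASR input:

 MENU = {
     "main": ("Press 1 for voice mail, 2 for your balance, 0 for an operator.",
              {"1": "voicemail", "2": "balance", "0": "operator"}),
     "voicemail": ("You have no new messages. Press 9 to return.", {"9": "main"}),
     "balance": ("Your balance is 10 euros. Press 9 to return.", {"9": "main"}),
     "operator": ("Connecting you to an operator.", {}),
 }

 def ivr(keys: list[str]) -> None:
     """Replay a sequence of key presses through the menu, printing each prompt."""
     state = "main"
     prompt, options = MENU[state]
     print(prompt)
     for key in keys:
         if key not in options:
             print("Sorry, that is not a valid option.")  # stay in the same state
             continue
         state = options[key]
         prompt, options = MENU[state]
         print(prompt)

 ivr(["2", "9", "5", "1"])  # balance -> back to main -> invalid key -> voice mail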

Authors: Jieling Yang, Wenbin Tian, iMedia 2013-2014
