Spring 2005: Seminars and guidance for student work (3 cu)
Dept. of Computer Sciences
Kanslerinrinne 1 (Pinni B, room 4043)
FIN-33014 University of Tampere
Voice +358 3 215 8549
Fax +358 3 215 6070
Final Report (~9 MB: lectures + students’ projects)
“There are a number of assistive technology solutions available today, being used for general computer access, that could also be used with games;
some are designed for disabilities but not for games, while others are designed for games but not for disabilities.”
Games are a natural environment in which to study and improve interface design and interaction techniques, and to test usability and accessibility. Nevertheless, games are primarily intended for people without sensory problems. There are many blind and visually impaired people; there are people with limited dexterity or cognitive deficits; there are deaf or mute people. There is also a small group of deaf-blind users who need special educational and training tools. Perceptual testing & training are exceptionally important for all.
The projects and ideas shared during the seminars present a novel view of, and understanding of, human feelings, which should be tested as early as possible; moreover, actual or residual feelings might be involved and developed through technologies and augmented communication.
The seminars consist of an introduction (lectures, 6 hrs); student work (5-8 weeks, ~100 hrs), which includes development (a literature review and report, software evaluation) and a presentation of the topic at the Student Conference (20 minutes + 15 minutes for Q&A per person); and guidance for individual student work. Continuous supervision will be provided through individual consultations once a week and online by email.
The working language of the course is English.
All the papers will be published in Series B - student projects in courses and seminars, etc. (see, for example, Report B-2003-5, Department of Computer and Information Sciences, University of Tampere, pp. XX-XX, http://www.cs.uta.fi/reports/bsarja/B-2003-5.pdf)
The ACM publication format is therefore strongly required for students’ reports: http://www.acm.org/sigs/pubs/proceed/template.html
See also the general graphical errors and the comments on data imaging and presentation in the paper.
The single-track lectures (10.01, 17.01, 24.01) covered the following topics:
Common errors in the physiology of analyzers and the psychophysics of perception. Wishes & challenges.
Artifacts, intentional motions, behavioral strategy and interaction style
Parameters, signals & patterns, mapping, modality
Game. Games & assistive technology.
Games with sounds and the sounds for special games.
Logic games with sounds and touch (puzzles).
Games based on tactile signs & navigation (grids).
Interaction with graphs. Vibrations. Motions & tracking.
Leena Vesterinen Sonification and Basic Behavioral Patterns in Blind Inspection of Hidden Graphs
Antti Nyman Games with Sounds: the Blind Navigation and Target Acquisition
Jalo Kääminen Exploring Micro-Movements for Diagnostics of Neurological Problems
Ruijie Ban eSmileys: Imaging Emotions through Electro-Tactile Patterns
Jarno Jokinen Math-Puzzle: Equation Tutor for Sighted and Visually Impaired Children
Deepa Mathew vSmileys: Imaging Emotions through Vibration Patterns
Oleg Špakov EyeChess: the tutoring game with visual attentive interface
Sample reference to students’ papers:
Vesterinen, L. Sonification and Basic Behavioral Patterns in Blind Inspection of Hidden Graphs, in G. Evreinov (ed.), Alternative Access: Feelings & Games 2005 (Report B-2005-2), Department of Computer Sciences, University of Tampere (2005) 45-50. http://www.cs.uta.fi/reports/bsarja/B-2005-2.pdf
Sonification has become important in encoding information for blind and visually impaired users. The goal of the Hidden Graphs game project was to optimize the “sonification dialogue” with the player through basic behavioral patterns (BBP) coordinated with a capture radius and directional-predictive sound signals (DPS), to facilitate shaping a personal behavioral strategy for discovering the features of the hidden graphs. BBPs were applied to build up three different behavioral strategies for playing the game. The concepts of capture radius [Walker and Lindsay, 2004] and DPS were used for guiding the player toward the goal in the game. Four subjects, who started as novices and became experienced players in blind inspection, took part in testing the game. The statistical data analysis showed a significant difference in performance when the three behavioral strategies with different capture radii and directional-predictive sound signals were employed. The performance of the subjects was evaluated in terms of the stylus deviation, the relative frequency of the DPS sounds used, and the task completion time for the hidden graphs. The results of the proposed sonification technique, based on the directional-predictive sounds paradigm and BBPs, are discussed.
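The capture-radius and directional-cue mechanism described above can be sketched as follows. This is a minimal illustration, not the project’s implementation: the function name, the default radius, and the four-direction cue set are assumptions for the sake of the example.

```python
import math

def direction_cue(stylus, target, capture_radius=40):
    """Return an illustrative sonification cue for guiding a stylus
    toward a hidden-graph target (names and radius are assumptions,
    not taken from the original game)."""
    dx = target[0] - stylus[0]
    dy = target[1] - stylus[1]
    if math.hypot(dx, dy) <= capture_radius:
        return "capture"                 # target acquired: play success sound
    # otherwise pick one of four directional-predictive sounds
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(direction_cue((0, 0), (10, 5)))    # within the default radius → "capture"
print(direction_cue((0, 0), (100, 30)))  # mostly horizontal offset → "right"
```

A real sonification dialogue would map these cue labels to actual sounds and vary the capture radius per behavioral strategy, as the abstract describes.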
Lack of feedback makes mouse use difficult for blind persons. Additional sound feedback can be used to facilitate spatial navigation in a screen space. The game-like navigation task was based on capturing 25 circular targets (spots) with a radius of 65 pixels arranged in a square grid (5 rows by 5 columns). Two different sound mappings were tested and evaluated under blind conditions; one grid was augmented with 6 different sounds and the other with 3 different sounds. It was revealed that the mapping with 6 sounds provided more effective non-visual interaction, making the target acquisition time about 1.39 times shorter and reducing the number of errors by about 32%. The reasons for navigational problems were also analyzed. It was noticed that the number of spots passed during target acquisition was smallest when the target was located along the edges of the grid. While the sounds associated with the corners of the grid were more distinctive than the other sounds, the eight spots surrounding the central position of the grid were generally the hardest group to detect and capture. The features of the sound mappings and the behavior of the subjects are discussed in detail.
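The hit-testing behind the 5×5 grid of 65-pixel spots can be sketched as follows. The spot spacing and the modulo scheme for assigning one of the 6 sounds are illustrative assumptions; the abstract does not specify the actual layout or sound mapping.

```python
import math

GRID, SPACING, SPOT_R = 5, 150, 65   # SPACING is an assumed layout value

def hit_spot(x, y):
    """Return (row, col) of the captured spot, or None.
    Spot centres lie on a 5x5 grid; a spot is captured when the
    pointer falls within its 65-pixel radius."""
    for row in range(GRID):
        for col in range(GRID):
            cx, cy = col * SPACING, row * SPACING
            if math.hypot(x - cx, y - cy) <= SPOT_R:
                return row, col
    return None

def sound_index(row, col, n_sounds=6):
    """Assign one of n_sounds to a spot (an illustrative scheme;
    the mapping used in the study is not given in the abstract)."""
    return (row * GRID + col) % n_sounds
```

With this layout, `hit_spot(300, 150)` lands on the spot in row 1, column 2, while a point midway between four centres (e.g. `hit_spot(75, 75)`) captures nothing.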
Tremor can be a symptom of such neurological diseases as Parkinson's disease, multiple sclerosis, and damage to the cerebellum. A successful screening method could open the way for earlier treatment that may delay the progression of presently incurable diseases. The goal of this project was to explore a pen-based technique for early diagnostics of the level of deterioration in a person’s ability to control micro-movements. Eight subjects from different age groups took part in the pilot test. The method was based on a comparison of the person's immediate handwriting performance in copying graphical patterns. The performance of the subjects in game-like testing was evaluated in terms of the stylus deviation, the correlation of the scan path to the graph on the X-axis and Y-axis separately, and the task completion time, when the output-to-input ratio was no less than 4. The results of the pilot testing are analyzed. They reveal that coordination problems can be registered even when the problems have not previously been detected by the person himself. We believe that further exploration of the pen-based technique for screening hand-eye coordination problems can increase the selective sensitivity of the method with regard to verified symptoms of neurological problems.
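The per-axis correlation measure used in the evaluation can be sketched as a plain Pearson correlation between the copied stroke and the reference pattern, computed on each axis separately. This is a straightforward reading of the abstract, not the project’s actual code.

```python
import math

def pearson(a, b):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def coordination_scores(scan_path, reference):
    """Correlate the copied stroke with the reference pattern on the
    X-axis and Y-axis separately, as in the abstract's evaluation.
    Both arguments are lists of (x, y) samples of equal length."""
    xs, ys = zip(*scan_path)
    rx, ry = zip(*reference)
    return pearson(xs, rx), pearson(ys, ry)
```

A perfect copy yields a correlation of 1.0 on both axes; tremor or coordination problems would lower the per-axis scores and raise the stylus deviation.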
Graphical input and visualization have irrefutable benefits as an intermediary between the analog perception of the human being and a computer that operates on and produces discrete data. However, graphical images pose great problems for blind and visually impaired users. Converting visual images into another sense (modality) could open new ways in visualization techniques and promote alternative communication and user interface technology for all. The tongue is one of the most sensitive parts of the human body. There have been several attempts to apply the tongue in human-computer interaction, both to control and to display information. The current study is based on electro-tactile stimulation of the tongue to transform short conditional messages, presented by symbols or graphics, into electro-tactile patterns. The goal is to estimate the efficiency of sensory substitution of conditional semantic information regarding emotions with the use of Composite Electro-Tactile Patterns (CETP). Rectangular pulses of stabilized current of 0.3 mA with alternating polarity were used to shape the CETPs. Three volunteers took part in the pilot testing of the technique. The performance of the subjects was evaluated through a match game in terms of the number of repetitions needed to memorize each of the 9 CETPs, the test completion time, and the error rate in recognition of the test patterns. The benefits and shortcomings of the eSmileys technique are discussed in detail.
Teaching mathematics to blind people is a challenging task, since most mathematical methods require logical skills and the manipulation of abstract notions that are not always easy to understand without vision. Math-Puzzle has been designed to support the learning of mathematical operators and equations. The simple graphical interface was augmented with short speech cues and earcons, so the game can be played in blind mode. The game helps blind and visually impaired children develop spatial imagination, memory, and logic.
The haptic computer interface is taking a giant leap into the future for millions of people, especially people with sensory deficiencies. A lot of research has been done worldwide on how to create or improve haptic interfaces and on the challenges and possibilities that haptic technology can offer. When a computer interface is augmented with haptic (or tactile) signals, people with sensory deficiencies can play various computer games, learn mathematics by tracing touchable curves, and gain better access to graphical user interfaces [Wall, 2005]. Vibro-tactile patterns can play a vital role for both blind and deaf users by substituting for Bliss symbols and earcons. This paper describes the design and evaluation of vibro-tactile patterns (tactons) for the match game “vSmiley”, which uses the tactons for imaging emotions. Based on the results of the pilot testing of the game, it is clear that carefully encoded tactons are easy to identify and distinguish. The game is also intended for deaf and/or visually impaired children who employ the sense of touch as a way of communication.
Advances in eye tracking have enabled physically challenged people to type, draw, and control their environment with their eyes. However, entertainment applications for this user group are still few. The EyeChess project described in this paper is a PC-based tutorial to assist novices in playing chess endgames. The player always starts first and has to checkmate the Black King in three moves. To make a move, the player first selects a piece and then its destination square. Color highlighting was applied to indicate which squares could be activated and which were forbidden for selection: a green highlight indicated a valid action, and red denoted an invalid one. There were three options for making a selection: blinking, an eye gesture (i.e., gazing at off-screen targets), and dwell time. If the player does not know how to solve the task, or plays by making mistakes, the tutorial provides a hint: a blinking green highlight appears when the gaze points at the right square. Preliminary evaluation of the system revealed that dwell time was the preferred selection technique. The participants reported that the game was fun and easy to play using this method. Meanwhile, both the blinking and eye gesture methods were characterized as quite fatiguing. The tutorial was rated helpful in guiding the decision-making process and in training novice users in gaze interaction.
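A dwell-time selection loop of the kind the participants preferred can be sketched as follows. The 500 ms threshold and the class interface are illustrative assumptions, not taken from EyeChess.

```python
class DwellSelector:
    """Minimal dwell-time selection sketch: a square is 'selected'
    once the gaze has rested on it continuously for dwell_ms.
    (Illustrative only; not the EyeChess implementation.)"""

    def __init__(self, dwell_ms=500):
        self.dwell_ms = dwell_ms
        self.current = None   # square currently under the gaze
        self.since = None     # timestamp when the gaze entered it

    def update(self, square, t_ms):
        """Feed one gaze sample (square name or None, timestamp in ms);
        return the selected square, or None if no selection fired."""
        if square != self.current:
            # gaze moved to a new square: restart the dwell timer
            self.current, self.since = square, t_ms
            return None
        if square is not None and t_ms - self.since >= self.dwell_ms:
            self.since = t_ms          # reset to avoid repeated firing
            return square
        return None
```

Feeding the selector samples on the same square for at least the dwell threshold triggers a selection; glancing away at any point restarts the timer, which is why dwell time is less fatiguing than deliberate blinks or off-screen gestures.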
This page maintained by Grigori Evreinov (firstname.lastname@example.org)
10 May 2005