Grigori Evreinov's Home Page
Department of Computer Sciences
33014 University of Tampere
Interaction Techniques papers
Selina Sharmin Evaluating Non-Visual Feedback Cues for Touch Input Device
Natalie Jhaveri Two Characters per Stroke – A Novel Pen-Based Text Input Technique
Remberto Martinez Gonzalez F-Pointer: Finger-Manipulated Device and Usability Evaluation
Weiwei Zhang Target Selection under Time Pressure Conditions
Tatiana Evreinova and Oleg Spakov Usability Exploration of the Visual Textual Imaging
Ioulia Guizatdinova and Zhiguo Guo Sonification of Facial Expressions
Melinda Luoma Symbol Creator: Usability Evaluation of the Novel Pen-Based Text Input Technique
Xiaoqing Yang Text Entry through a Single Button and Auditory Feedback
Juha Pieviläinen Text Input through One-Dimensional Head Tracking
in Proceedings of "New
Interaction Techniques 2003", G. Evreinov (ed).
Dept. of Computer and Information Sciences, University of Tampere,
Report B-2003-5, June 2003, ISBN 951-44-5713-7, ISSN 1457-2079
*** project-report is temporarily unavailable
The practical phase of the course included individual and group activity in the design and/or usability exploration of a prototype (hardware or software) of one of the novel interaction techniques, that is, carrying out a pilot project (about 8 weeks). That part was more difficult for both the students and the instructor because of the very short time, the different research directions, and the students' individual skills. In any case, by doing the project the students gained experience in how to design an experiment, how to organize a usability evaluation of a technique or device, how to collect and process data, and, finally, how to present the results of a research project in a scientific report. After a difficult selection, 11 projects were chosen for investigation. Before starting project development, all students wrote a Research plan [see Contents ] and gave a special presentation. The goal of this stage was to clarify the tasks for the researchers and to involve the other students in active discussion of the proposed topics (similar to a brainstorming session). The presentations took 12 hrs.
Text manipulation as a model for menu pointing and navigation
A novel pen-based text entry technique, which yields two characters with one stroke, is introduced, and the results of its investigation are presented. The 2CPS technique is implemented as a gesture-based user interface on a QWERTY soft keyboard. Eight participants completed eight text entry trials on a fully functional version of this stroke-based system on a Pocket PC. The difference between the means of the eighth trial and the first trial demonstrated a substantial improvement of 41%. In spite of this evident progress on the learning curve, the participants’ mean rate for the eighth trial was a mere 5.6 wpm. The reasons behind the results are discussed and future improvements to the system are proposed.
Keywords: soft keyboard, pen-based text entry, gesture-based user interface, stroke
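The speed and learning figures above follow the standard text-entry conventions (one word = five characters; improvement measured relative to the first trial). A minimal sketch with illustrative numbers chosen to land near the reported values, not the study's raw data:

```python
# Standard text-entry speed metric: wpm = (chars / 5) / minutes.
# The character counts and times below are illustrative only.

def wpm(chars: int, seconds: float) -> float:
    """Words per minute, with one word defined as five characters."""
    return (chars / 5) / (seconds / 60)

def improvement(first: float, last: float) -> float:
    """Relative gain from the first to the last trial, in percent."""
    return (last - first) / first * 100

print(round(wpm(140, 300), 1))          # 5.6 wpm: 140 chars in 5 min
print(round(improvement(4.0, 5.64), 1))  # 41.0 percent
```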
The main objective of this project was to study how users perceive spatial-temporal mapping through the sonification of spatial patterns. Alternative text entry with a single button was considered as a model for a menu selection task. A traditional seven-segment display element was used as the spatial layout for symbol input and imaging. Empirical research was carried out on rhythmic musical sequences coordinated with the spatial seven-segment layout. Seven notes were assigned to the segments, one per segment, forming a temporal sequence from which the user chose by pressing a button. The segments were activated cyclically after the first click. When all segments had been cycled, the result was interpreted as a character according to a set of rules that depended on the character set used. The research focused on examining temporal components, user behavior strategy, and decision-making under time pressure. The rationale for the test design and the results of a preliminary evaluation are presented.
Keywords: spatial-temporal pattern, sonification, user behavior strategy, decision-making under time pressure
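The cyclic selection scheme described above can be sketched as follows. This is a minimal illustration, not the project's code: the segment order, the subset of decoding rules, and the one-press-per-step input model are assumptions.

```python
# Illustrative sketch of cyclic single-button text entry on a
# seven-segment layout: the segments are cycled through once, a
# button press marks the current segment as selected, and the
# resulting segment pattern is decoded into a character.

SEGMENTS = "abcdefg"  # conventional seven-segment names

# A small subset of seven-segment patterns mapped to characters;
# the real rules depend on the character set used.
DECODE = {
    frozenset("abcdef"): "0",
    frozenset("bc"): "1",
    frozenset("abdeg"): "2",
    frozenset("abcdg"): "3",
}

def enter_character(presses):
    """Cycle through the segments once; a True in `presses`
    selects the segment sounding on that step."""
    selected = {seg for seg, pressed in zip(SEGMENTS, presses) if pressed}
    return DECODE.get(frozenset(selected), "?")

# Selecting segments b and c during the cycle yields "1".
print(enter_character([False, True, True, False, False, False, False]))
```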
Human-computer interaction techniques include sensor technology and the user's behavioural strategy as joint parts of the interface. Miniaturisation and space constraints call for simple designs that perform pointing and selection tasks while keeping efficiency as high as possible. With finger motion detected along a surface, the usability of a single-finger-manipulated device was assessed in a text entry scenario. The experiment was designed to measure novice user performance in screen typing, with speed, accuracy, and response time recorded. The results showed a throughput of 1.66 bps and a speed of 10 wpm. Future applications should take advantage of the integration capability of this device.
Keywords: finger manipulation device, pointing and selection tasks, text entry
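Throughput in bps is commonly computed Fitts-style, as the index of difficulty divided by movement time; whether this project used exactly that formulation is an assumption, and the distance, width, and time below are illustrative:

```python
import math

# Fitts-style throughput as commonly reported for pointing devices:
# TP = ID / MT, with index of difficulty ID = log2(D/W + 1).
# The movement parameters below are hypothetical.

def throughput(distance: float, width: float, movement_time_s: float) -> float:
    index_of_difficulty = math.log2(distance / width + 1)  # bits
    return index_of_difficulty / movement_time_s           # bits per second

# e.g. a 210-px move to a 30-px target completed in 1.8 s
print(round(throughput(210, 30, 1.8), 2))
```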
The goal of this project was a usability study of Symbol Creator, a pen-based text input technique. Symbol Creator (SC) is a new technique that aims at faster and more intuitive text entry. It is based on assembling characters from symbols that resemble the basic elements of Latin cursive*. Since keyboards are getting smaller and smaller, and the number of text messages has grown over the last decade, it is important to study text input techniques that are faster and easier to learn than multi-tap. Usability was evaluated through text entry speed, the number of errors made, and subjective user experiences. The results showed the multi-tap technique to be faster than Symbol Creator. However, fewer errors were made with Symbol Creator than with multi-tap. Also, when text entry rate was measured in keystrokes per character, the two techniques reached the same rate of 2 KSPC. Subjective impressions of Symbol Creator were negative before testing but became more positive as the technique was tested.
Keywords: pen-based text entry, handwriting, KSPC, software keyboard, Symbol Creator, multi-tap.
* Handwriting styles, fontware, http://www.educationalfontware.com/LG_style.html
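The KSPC figure above follows the standard definition, total keystrokes divided by characters produced. A minimal sketch with hypothetical log values:

```python
# KSPC (keystrokes per character), the metric used to compare the
# two techniques. The keystroke counts below are hypothetical.

def kspc(keystrokes: int, characters: int) -> float:
    """Total keystrokes divided by characters produced."""
    return keystrokes / characters

# e.g. 52 strokes to produce a 26-character phrase -> KSPC = 2.0
print(kspc(52, 26))  # 2.0
```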
In everyday life we encounter many situations where we need to operate a machine when using our hands is not possible. There are commercial solutions that allow hands-free remote control of a stationary device such as a computer or TV set [**], but they do not give the user mobility, since they are based on interaction between an external unit (an IR/RF transmitter and receiver, or a receiver alone) and a marker (a mirror or transmitter) attached to the user's head, forehead, finger, glasses, etc. This paper describes a new head-tracking technique that allows the user to be mobile by using the user's torso as a reference point. While usability testing of the prototype was carried out under stationary conditions, our input device allows the user mobility and could be applied in wearable devices such as mobile phones and PDAs, or as an alternative input device for people with special needs.
Keywords: hands-free technique, head tracking, wearable device, Symbol Creator
** The Boost Tracer http://www.boosttechnology.com/
Smart-Nav™ hands free mouse http://www.naturalpoint.com/
Tracker One, 2000 etc http://www.madentec.com/
The DynaSightTM Sensor http://orin.com/index.htm
The project aims to study the stimulus-dependent goal-directed behavior of the user under time pressure, in particular during text input. Users produce different results when they enter text in the conventional way with a keyboard, with a mouse and software keyboard, or through single-button manipulation with text entry models such as Morse code or a cyclic mode*. Two text entry techniques were selected for exploration: mouse with software keyboard, and cyclic input through a single digital button. We will vary the time conditions (waiting time, segment exposition) in order to evaluate the critical temporal parameters for decision making, the number of errors, and subjective user experiences.
* Scanning Mouse (MauSi scan), Product information is available at:
Evreinov, G., Raisamo, R. Cyclic Input of Characters through a Single Button Manipulation. In J.Klaus, K.Miesenberger, W. L. Zagler (Eds.), Proc. ICCHP 2002 Linz, Austria, 15-20 July 2002, LNCS Vol. 2398, Springer-Verlag Berlin Heidelberg 2002, pp. 259-266.
Haptic User Interfaces
Non-visual tactile feedback improves the user experience by providing the user with a wider target area and faster access to it. The development of haptic communication and the appearance of input devices with tactile feedback make it possible to systematically investigate how people perceive the world indirectly through various intermediate objects. Non-visual tactile feedback is applicable to blind manipulation as well as to situations in which vision is occupied with other tasks.
The goal of this study was to evaluate non-visual feedback cues, tactile and sound, during navigation in a Maze. The results of the study can be used in the development of novel pen-based input techniques, haptic interfaces, and applications for people with special needs.
Keywords: haptics, non-visual feedback, tactile feedback, texture, sound feedback, force feedback.
Navigation, orientation, communication
Many factors can negatively influence decision-making. This project presents an exploration of a new technique for analyzing human behavior under simulated stress conditions: two-hand coordination during the task of pointing at and selecting targets under time pressure. As mentioned by S. Keele, planning goal-directed actions involves two important components: first, the intention to attain an action goal or to experience a desired event after executing an action; and, second, the selection of a motor program for a movement that elicits precisely this event. What would happen if different hands controlled the X and Y coordinates separately? What strategy would be chosen in a critical situation? A new input device (analog buttons) and software designed for this purpose allow an objective investigation of human performance and an analysis of individual behavior patterns. Handedness and performance are the two main competing factors that determine the behavioral dynamics of the subjects. Under time pressure we observed a variety of techniques with different speed-accuracy trade-offs. However, we can say that there are at least three fields along the track to the target, and three temporal intervals, in which the dominant hand and the strategy can change: long-distance behavior, 0-600 ms; tuning for capture, within two target radii and lasting 300-600 ms; and approach, the middle-distance behavior and the most variable part of the track.
Keywords: Time pressure, target selection, decision making, human behavior, leading hand.
Spatial sound application
Visual attributes that describe human facial expression can be inaccessible when visual perception is blocked or an image is hidden from the observer. Current computer programs can present facial expressions with speech output or, alternatively, by adding tactile feedback cues. However, facial traits and expressions are still not sonified. The goal of this project was the development and usability evaluation of a sonification system for alternative access to facial expressions through eARmoticons. eARmoticons should evoke in a listener emotional feelings similar to those a visual image would produce. The results of our study showed that after some training the auditory imaging of facial expressions also became accessible to a listener. The proposed technique, which can briefly display an array of related attributes such as facial traits, could facilitate the communication and interpretation of visual images for people with special needs.
Keywords: emotional expression, facial traits, sonification, earcons, eARmoticons.
Ekman, P. and Friesen, W.V. Facial Action Coding System (FACS): A Technique for the Measurement of Facial Action. Palo Alto, CA: Consulting Psychologists Press, 1978. http://www.paulekman.com
Schwende, H. Auditory Emotional Access to Visual Information. In Proceedings of Computers Helping People with Special Needs, 8th International Conference, ICCHP (July 15-20, Linz), 2002, Austria, pp. 445-447.
Bresin, R. & Friberg, A. Expressive musical icons. In Proceedings of the 2001 International Conference on Auditory Display, Espoo, Finland, July 29 - August 1, 2001, pp. 141-143.
Alternative Visualization Techniques
The lack of access to verbal communication with hearing individuals is a major problem for profoundly deaf people. The goal of our work was to develop communication techniques based on graphical imaging of textual information for profoundly deaf and hard-of-hearing people. Our empirical research focused on exploring the visibility of the proposed pseudo-graphic typeface in comparison with five conventional phonetic typefaces. Our results show that the Impact and Styled typefaces were perceived more easily and seemed more legible than Courier, Arial, Comic and Times New Roman. The tachistoscopic analysis of the number of recognized tokens (target stimuli) among distractors showed that subjects had fewer difficulties recognizing the target stimulus among distractors with the Arial and Times New Roman typefaces. Regarding attractiveness, Comic was perceived as more attractive than Arial and Courier, while Styled and Impact were perceived as more attractive than Times New Roman. Of the fonts studied, Impact and Styled appear to be the most visible. Besides being the most preferred, they could be perceived fairly preattentively. We suppose that the proposed Styled typeface may have wider applications, such as public display systems for the dynamic imaging of current financial events in stock exchanges, or other environments with different constraints.
Keywords: hearing impaired, pseudo-graphic tokens, syllabic tokens
We acknowledge all the students who were not daunted by the practical difficulties and successfully finished the projects and the course.
This page maintained by Grigori Evreinov (email@example.com)
21 October 2003