User interfaces: visual & haptic
CONTENTS: Information Visualisation | Reading news online - research | On-line Library of Information Visualisation Environments (OLIVE) | Visualisation problems and resources | Auditory image representations | Haptic user interfaces | Harvard endorses tactile feedback | Haptics community | Multimodal interfaces
A project run by Stanford University and The Poynter Institute focuses on how frequent Internet news readers peruse news online. The research concentrates on tracking eye movement and aims to answer questions such as:
- Exactly where do Internet news readers go to catch their news?
- Which stories do they read, which skim, which ignore?
- Do they read only headlines and briefs, or full articles?
- If they hyperlink to a related story, do they return to the original site?
Many common perceptions of the way users consume news and information on the web may be wrong, according to the study's preliminary findings.
The reading behaviour observed by the project may surprise many editors and producers of news- and information-rich sites. For example, what do users look at first when they visit a site - the text, the photos, or the graphics? The study found that on a typical web page users looked at 22% of graphics, 64% of photos and 92% of text.
Learn more from the project website. 16/06/00
The website covers eight categories of information visualisation environments differentiated by data types: 1-D, 2-D, 3-D, multi-dimensional, temporal, tree, network and workspaces. It provides a wealth of information sources organised by these categories and is built around a "Taxonomy of Information Visualisation User-Interfaces" published on the site. There are a number of related US academic sites which cover the topic of Human-Computer Interaction:
- Main site of Univ. of Maryland HCI Laboratory
- Technical reports from Univ. of Maryland HCI Laboratory
- Students' HCI on-line research experiments
Bill Hibbard from the SSEC Visualisation Project at the University of Wisconsin-Madison has published (for SIGGRAPH 99) a list of the Top Ten Visualisation Problems. His web site also carries information about software and projects developed in the SSEC programmes, and links to other sites such as the NASA scientific visualisation sites list.
Prof. Dr. Gitta Domik of the University of Paderborn (Germany) has a set of web documents entitled Curriculum for Visualisation that includes an extensive Tutorial on Visualisation. 18/08/00
"Soundscapes from The vOICe" is research exploring ways to help the blind to "see with their ears". The work centres on an experimental system for auditory image representations. The prototypes are designed as a step towards a vision substitution device for the blind, based on the real-time conversion of arbitrary images into soundscapes. It is characterised as "a quest for options beyond the guide dog and the long cane".
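The project's own implementation is not published here, but the general idea of an image-to-soundscape conversion can be sketched as follows: scan the image column by column over time, map each row to a pitch (higher rows sound higher) and each pixel's brightness to loudness. The function name, parameter values and frequency range below are illustrative assumptions, not the system's actual design.

```python
import numpy as np

def image_to_soundscape(image, duration=1.0, sample_rate=22050,
                        f_min=200.0, f_max=5000.0):
    """Illustrative sketch: map a grayscale image (2-D array, values 0..1,
    row 0 = top) to an audio signal.

    Columns are scanned left to right over `duration` seconds; each row
    drives a sine oscillator whose pitch rises toward the top of the
    image and whose loudness follows pixel brightness.
    """
    rows, cols = image.shape
    # Higher image rows -> higher frequencies (exponential spacing).
    freqs = f_min * (f_max / f_min) ** (np.arange(rows)[::-1] / (rows - 1))
    samples_per_col = int(duration * sample_rate / cols)
    t0 = 0.0
    out = []
    for c in range(cols):
        t = t0 + np.arange(samples_per_col) / sample_rate
        # Sum one oscillator per row, weighted by this column's brightness.
        col = image[:, c][:, None] * np.sin(2 * np.pi * freqs[:, None] * t)
        out.append(col.sum(axis=0))
        t0 += samples_per_col / sample_rate
    audio = np.concatenate(out)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio
```

A blind listener scanning such soundscapes can, in principle, learn to associate pitch with vertical position and time with horizontal position, recovering a coarse image of the scene.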
Logitech will incorporate Immersion's TouchSense Technology to enable computer mice to simulate the sense of touch by using Immersion's Inertial Harmonic Drive hardware design and Immersion Desktop software.
"The addition of tactile feedback to computer mice can significantly enhance user performance," says Jack Dennerlein, Assistant Professor of Ergonomics at Harvard University. "Our laboratory studies show that people complete basic cursor targeting tasks faster with tactile feedback." 30/08/00
The Haptics Community web page is building resources for researchers and developers in the area of sensory/touch interfaces. Topics covered include:
- Image gallery of haptic displays around the world
- Control issues
- Mechanical design
- Simulation design
- Tactile display
A project at Carnegie Mellon University is looking at ways to give users convincingly real haptic interaction with computers. A user interacts with the computer by grasping a rigid tool whose behaviour is simulated by the computer, and employs this tool to interact with computed environments that are semantically meaningful in terms of the application. At the same time, the environment exerts realistic forces and torques on the tool's handle, which the user feels. The approach is based on a recently developed magnetic levitation technology and advances in the art of physically-based simulation.
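The CMU system's own algorithms are not given here, but the standard penalty-based technique used in haptic rendering conveys the idea: while the simulated tool tip penetrates a virtual surface, a spring-damper law pushes it back out. The function name and the stiffness/damping values below are illustrative assumptions only.

```python
def wall_force(position, velocity, wall=0.0, k=1000.0, b=5.0):
    """Illustrative penalty-based rendering of a virtual wall at x = wall.

    While the tool tip is inside the wall (position < wall), push it back
    with a spring force proportional to penetration depth, plus a damping
    term that resists motion deeper into the wall; in free space, no force.

    position, velocity: tool tip state along one axis (m, m/s)
    k: surface stiffness (N/m); b: damping coefficient (N*s/m)
    """
    penetration = wall - position
    if penetration <= 0.0:
        return 0.0              # tool is in free space: no force
    force = k * penetration     # spring pushes the tool out of the wall
    if velocity < 0.0:          # damp only motion into the wall
        force -= b * velocity   # velocity < 0, so this adds resistance
    return force
```

In a real device this law is evaluated in a high-rate control loop (typically around 1 kHz) so that the rendered surface feels stiff rather than spongy; the magnetic levitation handle then applies the computed force directly to the user's hand.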
Immersion Corp. and Logitech have developed a force feedback mouse that enables users to "feel" the weight and texture of objects through the mouse. The companies have already collaborated in producing the Wingman Force Feedback joystick for gamers.
URL: product info http://www.force-feedback.com/feelit/feelit.html
URL: BBC review with Real audio interview http://news.bbc.co.uk/hi/english/sci/tech/newsid_388000/388928.stm
An extensive review paper of Multimodal Interfaces (in French) is published at the website of The Centre de recherche informatique de Montréal. The site provides many links to papers on different aspects of the subject in the bibliographic section.
URL: http://www.crim.ca/ipsi/multi/mulmod_bienvenue.html
El.pub - Interactive
Electronic Publishing R & D News and Resources
We welcome feedback and contributions to the information service, and proposals for subjects for the news service (mail to: firstname.lastname@example.org)
Edited by: Logical Events Limited - electronic marketing, search engine marketing, pay per click advertising, search engine optimisation, website optimisation consultants in London, UK. Visit our website at: www.logicalevents.org
Last updated: 29 June 2018
© 2018 Copyright and disclaimer El.pub and www.elpub.org are brand names owned by Logical Events Limited - no unauthorised use of them or the contents of this website is permitted without prior permission.