HCI


Human-Computer Interaction (also referred to as Human-Computer Interface; commonly abbreviated as HCI) is the study and design of the interfaces through which humans interact with and/or control machines.


Mouse

For roughly forty years (1972-2009), the mouse (first the ball-and-track design, later LED optical, and finally laser/invisible-light optical variants) was the de facto standard pointing device for human-computer interaction.


Serial Mouse

Douglas Engelbart[1] at the Stanford Research Institute[2] invented the mouse in 1963[3] after extensive usability testing[4]. He never received any royalties for it, as his patent ran out before it became widely used in personal computers.[5]

Eleven years earlier, the Royal Canadian Navy[6] had invented the trackball[7], using a Canadian five-pin bowling[8] ball as the user interface for their DATAR[9] system.[10]

Mouse Listener

Capturing the mouse position means determining the X and Y co-ordinates of the mouse's pointer. This is typically done when a certain action occurs, for example when the mouse is moved, placed over a specific region, or clicked on a specific object. A class whose primary function is to capture mouse positions and other related mouse events is called a Listener class. A Listener is a coding mechanism designed to listen for events within a given environment; when certain events occur, a given set of computations or pre-set responses is carried out.
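
As a minimal sketch of the idea (using Java's standard AWT/Swing listener classes; the frame and label names are illustrative), the following program prints the pointer's X and Y co-ordinates whenever the mouse moves over a window:

    import java.awt.event.MouseEvent;
    import java.awt.event.MouseMotionAdapter;
    import javax.swing.JFrame;
    import javax.swing.JLabel;

    public class MousePositionDemo {
        public static void main(String[] args) {
            JFrame frame = new JFrame("Mouse Listener Demo");
            JLabel label = new JLabel("Move the mouse...");
            frame.add(label);

            // The Listener: invoked by the toolkit whenever the pointer moves.
            frame.addMouseMotionListener(new MouseMotionAdapter() {
                @Override
                public void mouseMoved(MouseEvent e) {
                    // Capture the X and Y co-ordinates of the pointer.
                    label.setText("X=" + e.getX() + ", Y=" + e.getY());
                }
            });

            frame.setSize(320, 120);
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.setVisible(true);
        }
    }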

USB Mouse

Bill English[11] finally built Engelbart's original vision for the mouse. The ball mouse replaced the external wheels with a single ball that could rotate in any direction, and it came as part of the hardware package of the Xerox Alto[12] computer. Perpendicular chopper wheels housed inside the mouse's body chopped beams of light on their way to light sensors, thereby detecting the motion of the ball. This variant of the mouse resembled an inverted trackball[13] and became the predominant form used with personal computers[14] throughout the 1980s and 1990s. The Xerox PARC group also settled on the modern technique of using both hands to type on a full-size keyboard and grabbing the mouse when required.

The ball mouse uses two rollers pressed against two sides of the ball: one roller detects the horizontal motion of the mouse and the other the vertical motion. The motion of these rollers causes two disc-like encoder wheels to rotate, interrupting optical beams to generate electrical signals. The mouse sends these signals to the computer system over its connecting wires, and the driver software in the system converts the signals into motion of the mouse pointer along the X and Y axes of the screen.[15]
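
Each encoder wheel typically produces two light-beam signals per axis, 90 degrees out of phase, so the order in which they change reveals the direction of rotation (quadrature encoding). The following is a purely illustrative sketch of that decoding step, not any real driver's code:

    public class QuadratureDecoder {
        private boolean lastA, lastB;
        private int position; // accumulated ticks along one axis

        // Feed the current state of the two optical beams for one axis;
        // returns the accumulated position in encoder ticks.
        public int update(boolean a, boolean b) {
            if (a != lastA) {
                // Beam A changed first: one direction of rotation.
                position += (a == b) ? -1 : 1;
            } else if (b != lastB) {
                // Beam B changed first: the opposite direction.
                position += (a == b) ? 1 : -1;
            }
            lastA = a;
            lastB = b;
            return position;
        }
    }

The driver would run one such decoder per axis and translate the tick deltas into pointer movement on screen.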

Eventually, the serial mouse was replaced by the USB mouse: users increasingly needed to use one mouse across a number of different systems, and the advent of USB made this possible.


Keyboard

QWERTY

On-Screen

Touch-Screen

A prerequisite for the "On-Screen Keyboards" mentioned above, touch-screen interface technology has been around for quite some time[16][17], particularly in industrial applications; however, the technology did not see widespread use in consumer electronics until 21st-century innovations in Multi-Touch.

Multi-Touch

Multi-Touch quickly emerged as the next big thing in computing. It is essentially a resistive or capacitive surface, such as specialized glass, plastic or translucent metals, that allows a user both to see an interface and to interact with it directly by touching it (as opposed to traditional screens, which are displays only, not themselves input devices). Especially since the famous 2006 TED Talk by Jeff Han[18][19] and the release of the first iPhone and iPod Touch models by Apple[20][21], most significant tablets, smartphones and mobile devices include some form of touch-screen interface, functionality or features.
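
As one concrete illustration (a minimal sketch using Android's standard MotionEvent API, which reports every simultaneously tracked finger; the class name is illustrative), a multi-touch handler can read each pointer's position individually:

    import android.view.MotionEvent;
    import android.view.View;

    // Logs every simultaneously tracked finger in a single MotionEvent.
    public class MultiTouchLogger implements View.OnTouchListener {
        @Override
        public boolean onTouch(View view, MotionEvent event) {
            for (int i = 0; i < event.getPointerCount(); i++) {
                // Each finger keeps a stable id while it stays on the screen.
                System.out.printf("finger id=%d at (%.1f, %.1f)%n",
                        event.getPointerId(i), event.getX(i), event.getY(i));
            }
            return true; // consume the event
        }
    }

Attaching it with view.setOnTouchListener(new MultiTouchLogger()) is all that is needed for a view to start receiving such events.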


Haptics

Haptic communication recreates the sense of touch by applying forces, vibrations, or motions to the user. This mechanical stimulation can be used to assist in the creation of virtual objects in a computer simulation, to control such virtual objects, and to enhance the remote control of machines and devices (i.e. Telerobotics). Haptic devices may incorporate tactile sensors that measure forces exerted by the user on the interface. Most researchers distinguish three sensory systems related to the sense of touch in humans: cutaneous, kinesthetic and haptic.[22]
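
As a small, hedged example of the simplest kind of haptic output (assuming Android's standard Vibrator/VibrationEffect APIs, available since API level 26; the helper class is illustrative), a short vibration pulse can acknowledge a touch:

    import android.content.Context;
    import android.os.VibrationEffect;
    import android.os.Vibrator;

    public class HapticHelper {
        // Play a short pulse to acknowledge a touch (Android API 26+).
        public static void tap(Context context) {
            Vibrator vibrator = (Vibrator) context.getSystemService(Context.VIBRATOR_SERVICE);
            if (vibrator != null && vibrator.hasVibrator()) {
                // A 20 ms one-shot vibration: mechanical stimulation standing
                // in for the cutaneous sense of touch described above.
                vibrator.vibrate(VibrationEffect.createOneShot(20, VibrationEffect.DEFAULT_AMPLITUDE));
            }
        }
    }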


Oral

Voice

Voice-controlled (discrete spoken commands).

Speech

Speech-controlled (natural language and discussions).


Gesture Interfaces

Hand

The Hand Gesture Interface is a real-time gestural interface based on 3D dynamic hand gesture recognition using simple webcams. The application captures and recognizes hand gestures of a user wearing colored gloves, where the hand coordinates are obtained via 3D reconstruction from stereo.

The developed system aims to be a generic interface for Windows-based applications, and provides supplementary features such as an interactive training and gesture-defining system, a gesture tutor, a self-calibration utility for the cameras, and a tool for linking the interface to different applications. The interface can be used by people with disabilities for various tasks, as well as by ordinary users for controlling desktop applications. The system can be linked to any third-party program through the generation of Windows mouse and keyboard events.
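
The output side of such a system (injecting synthetic mouse events into whatever application has focus) can be sketched with Java's standard java.awt.Robot class; the coordinates and the gesture mapping below are illustrative assumptions, not the actual system's code:

    import java.awt.AWTException;
    import java.awt.Robot;
    import java.awt.event.InputEvent;

    public class GestureOutputDemo {
        public static void main(String[] args) throws AWTException {
            Robot robot = new Robot();
            // Move the pointer to wherever the recognized hand position
            // maps on screen (the coordinates here are placeholders).
            robot.mouseMove(400, 300);
            // Emit a synthetic click, as the interface might for a "select" gesture.
            robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
            robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
        }
    }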

Body

Body Gesture Interfaces use a camera (e.g. a webcam) or another tracking mechanism (such as the body suits commonly used in CGI motion capture) to recognize movements for the purpose of controlling a user interface (e.g. a projection or large-screen display).

Eye Tracking

Emotion sensing/control[23]

Situational Awareness

[24] [25] [26] [27] [28] [29] [30] [31] [32] [33] [34] [35] [36] [37] [38] [39]


Mind-Controlled

A Brain-Machine Interface (BMI) is any device that uses fluctuations or patterns in brainwaves and/or human behaviours to control devices, services, applications or other systems (such as a piece of equipment or machinery).
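
A purely hypothetical sketch of the control loop this implies (sample a brainwave-derived signal, smooth it, fire an action on a threshold crossing): every name and number below is invented for illustration, and the "EEG" is simulated.

    import java.util.Random;

    public class BrainwaveSwitchDemo {
        public static void main(String[] args) {
            Random noise = new Random(42);     // stands in for a real EEG source
            double smoothed = 0.0;
            final double alpha = 0.1;          // exponential smoothing factor
            final double threshold = 0.7;      // arbitrary trigger level
            boolean triggered = false;

            for (int i = 0; i < 100; i++) {
                // Simulated band power: low at rest, high once the user "concentrates".
                double sample = (i < 50 ? 0.3 : 0.9) + 0.1 * noise.nextDouble();
                smoothed = alpha * sample + (1 - alpha) * smoothed;
                if (smoothed > threshold && !triggered) {
                    System.out.println("Trigger: toggle the device at sample " + i);
                    triggered = true;          // act once per crossing
                }
            }
        }
    }

Real BMIs use calibrated, multi-channel EEG pipelines, but the smooth-then-threshold pattern is the same basic idea.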

Brainwave[40]

Intent Prediction

Tools

OpenInterface

The objective of the OpenInterface project is to provide an open source platform for developing interfaces that communicate intelligently through several modalities. OpenInterface focuses on human-human and human-machine natural interaction and the physical or virtual interaction environment.

The platform tools are aimed at designing, implementing and testing natural and easy-to-use multimodal interfaces that:

* handle a rich and extensible set of modalities,
* enable quick replication,
* enable a focus on innovation (new modalities or forms of multimodality),
* support dynamic selection and combination of modalities to fit the ongoing context of use,
* enable iterative user-centered design.

The open-source, reusable OpenInterface platform seeks to enable efficient cooperation between users from both industry and academia.
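
OpenInterface's own API is not shown here; the following is a generic, hypothetical sketch (all names invented) of the core idea in the list above, i.e. routing events from several modalities through one dispatcher whose active set of modalities can change at runtime:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.Consumer;

    // Hypothetical multimodal fusion: several input modalities (speech,
    // touch, gesture, ...) publish events into one dispatcher, and the
    // enabled set of modalities can change to fit the context of use.
    public class MultimodalDispatcher {
        public record InputEvent(String modality, String payload) {}

        private final List<Consumer<InputEvent>> handlers = new ArrayList<>();
        private final List<String> enabledModalities = new ArrayList<>();

        public void enableModality(String name)  { enabledModalities.add(name); }
        public void disableModality(String name) { enabledModalities.remove(name); }
        public void subscribe(Consumer<InputEvent> handler) { handlers.add(handler); }

        // Deliver an event only if its modality is currently enabled.
        public void publish(InputEvent event) {
            if (enabledModalities.contains(event.modality())) {
                handlers.forEach(h -> h.accept(event));
            }
        }

        public static void main(String[] args) {
            MultimodalDispatcher dispatcher = new MultimodalDispatcher();
            dispatcher.subscribe(e -> System.out.println(e.modality() + ": " + e.payload()));
            dispatcher.enableModality("speech");
            dispatcher.publish(new InputEvent("speech", "open mail"));   // delivered
            dispatcher.publish(new InputEvent("gesture", "swipe left")); // ignored until enabled
        }
    }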


Resources


Tutorials


External Links


References

  1. wikipedia: Douglas Engelbart
  2. wikipedia: Stanford Research Institute
  3. The Invention of the Computer Mouse: http://web.archive.org/web/20061018141720/http://www.afrlhorizons.com/Briefs/Mar02/OSR0103.html (Retrieved 31 December, 2006)
  4. wikipedia: usability testing
  5. Template:Cite news
  6. wikipedia: Royal Canadian Navy
  7. wikipedia: trackball
  8. wikipedia: five-pin bowling
  9. wikipedia: DATAR
  10. Ferranti-Packard: Pioneers in Canadian Electrical Manufacturing, Norman R. Ball, John N. Vardalas, McGill-Queen's Press, 1993
  11. wikipedia: Bill English (computer engineer)
  12. wikipedia: Xerox Alto
  13. wikipedia: trackball
  14. wikipedia: personal computer
  15. wikipedia: Mouse (computing)
  16. Who Invented Touch Screen Technology?: http://inventors.about.com/od/tstartinventions/a/Touch-Screen.htm
  17. The Evolution of Touchscreen Technology: http://www.makeuseof.com/tag/evolution-touchscreen-technology/
  18. Jeff Han's first public demo of his innovations in Multi-Touch technology: http://www.ted.com/talks/jeff_han_demos_his_breakthrough_touchscreen
  19. Jeff Han - Biography by TED: https://www.ted.com/speakers/jeff_han
  20. How the iPhone's Touchscreen Works: http://electronics.howstuffworks.com/iphone1.htm/printable
  21. How do touch-screen monitors know where you're touching?: http://computer.howstuffworks.com/question716.htm
  22. Haptic feedback with the Taptic Engine - WKInterfaceDevice and WKHapticType in WatchKit and watchOS 2: http://www.sneakycrab.com/blog/2015/6/22/haptic-feedback-with-the-taptic-engine-in-watchkit-and-watchos-2-wkinterfacedevice-and-wkhaptic
  23. Intel calls its AI that detects student emotions a teaching tool. Others call it "morally reprehensible": https://www.protocol.com/enterprise/emotion-ai-school-intel-edutech
  24. Situational Awareness explained: https://securityadviser.net/situational-awareness-definition/
  25. Dynamic Removal of Clutter to Improve Situational Awareness in (Air) Traffic Management Systems: https://oied.osu.edu/technologies/dynamic-removal-clutter-improve-situational-awareness-traffic-management-systems
  26. Using PMU Data to Increase Situational Awareness (in electric power grid): http://overbye.engr.tamu.edu/wp-content/uploads/sites/146/2021/04/Using-PMU-Data-to-Increase-Situational-Awareness.pdf
  27. An Industrial Control System Situation Awareness Method based on Weighting Algorithm: https://www.atlantis-press.com/article/55917281.pdf
  28. Study on multimedia network Weibo situational awareness model and emotional algorithm: https://link.springer.com/article/10.1007/s11042-019-07779-8
  29. Quantitative situational awareness algorithm of land state network based on neutral statistics: https://link.springer.com/article/10.1007/s12652-020-02827-w
  30. Improving stand-on ship's situational awareness by estimating the intention of the give-way ship: https://www.sciencedirect.com/science/article/abs/pii/S0029801820301761
  31. Machine learning algorithms promise better situational awareness: https://www.army.mil/article/236647/machine_learning_algorithms_promise_better_situational_awareness
  32. Robust Coreset Construction for Distributed Machine Learning: https://ieeexplore.ieee.org/document/9109724
  33. Advances in Artificial Intelligence and Machine Learning for Networking: https://www.comsoc.org/publications/journals/ieee-jsac/cfp/advances-artificial-intelligence-and-machine-learning
  34. SPACE SITUATIONAL AWARENESS - IT’S NOT JUST ABOUT THE ALGORITHMS: http://iaassconference2013.space-safety.org/wp-content/uploads/sites/19/2013/06/0920_Schonberg.pdf
  35. Real-time Fault Detection and Situational Awareness for Rovers - Report on the Mars Technology Program Task: http://robots.stanford.edu/papers/Dearden04a.pdf
  36. Geo maneuver detection for space situational awareness: https://www.researchgate.net/publication/289550326_Geo_maneuver_detection_for_space_situational_awareness
  37. Evaluation of PNT Situational Awareness Algorithms and Methods: https://www.ion.org/publications/abstract.cfm?articleID=17935
  38. INGENIOUS situational awareness algorithms for UAVs: https://ingenious-first-responders.eu/ingenious-situational-awareness-algorithms-for-uavs/
  39. Towards Algorithms for Effective Stabilization and Situational Awareness for Humanoid Robots: https://celebration.tcnj.edu/wp-content/uploads/sites/115/2021/04/Bland-Madison-MUSE-2020-Poster.pdf
  40. A Brain Implant Helped a Paralyzed Man Turn Thought Into Text: https://www.reviewgeek.com/83180/a-brain-implant-helped-a-paralyzed-man-turn-thought-into-text/
  41. Fitts' Law: http://www.interaction-design.org/encyclopedia/fitts_law.html
  42. How to Hack Toy EEGs: https://frontiernerds.com/brain-hack

See Also

UI/UX | A11Y | Testing | Device | Mobile Device | Computer | Technology | Design | Bionics | Robot | Internet of Things | VR/AR | LBS