4 results match your criteria: ACM Transactions on Applied Perception [Journal]

  • Page 1 of 1

Keppi: A Tangible User Interface for Self-Reporting Pain.

ACM Trans Appl Percept 2018 Apr

University of Cincinnati.

Motivated by the need to support people self-managing chronic pain, we report on the development and evaluation of a novel pressure-based tangible user interface (TUI) for the self-report of scalar values representing pain intensity. Our TUI consists of a conductive foam-based, force-sensitive resistor (FSR) covered in a soft rubber, with embedded signal conditioning, an ARM Cortex-M0 microprocessor, and Bluetooth Low Energy (BLE). In-lab usability and feasibility studies with 28 participants found that individuals were able to use the device to make reliable reports with four degrees of freedom, as well as to map squeeze pressure to pain level and visual feedback.
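The core mapping the abstract describes, turning a continuous squeeze pressure into one of four discrete report levels, can be sketched as follows. This is an illustrative sketch, not the authors' firmware: the 12-bit ADC range and the evenly spaced bins are assumptions.

```python
ADC_MAX = 4095  # assumed 12-bit ADC reading from the FSR

def pressure_to_level(adc_reading: int, n_levels: int = 4) -> int:
    """Map a raw ADC reading (0..ADC_MAX) to a pain level 1..n_levels."""
    adc_reading = max(0, min(adc_reading, ADC_MAX))
    # Evenly spaced bins; a real device would calibrate per user.
    level = int(adc_reading * n_levels / (ADC_MAX + 1)) + 1
    return min(level, n_levels)

# Example: a moderate squeeze lands in level 2 of 4.
print(pressure_to_level(1800))  # prints 2
```

In practice the bin edges would be calibrated to each user's grip strength, since maximum comfortable squeeze force varies widely between individuals.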

DOI: http://dx.doi.org/10.1145/3173574.3174076
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC6287633

Scene-Motion Thresholds During Head Yaw for Immersive Virtual Environments.

ACM Trans Appl Percept 2012 Mar;9(1)

University of North Carolina at Chapel Hill.

In order to better understand how scene motion is perceived in immersive virtual environments, we measured scene-motion thresholds under different conditions across three experiments. Thresholds were measured during quasi-sinusoidal head yaw, single left-to-right or right-to-left head yaw, different phases of head yaw, slow to fast head yaw, scene motion relative to head yaw, and two scene-illumination levels. We found that, across various conditions, (1) thresholds are greater when the scene moves with head yaw (corresponding to gain < 1.0) than when the scene moves against head yaw (corresponding to gain > 1.0), and (2) thresholds increase as head motion increases.
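Threshold measurements of this kind typically rely on an adaptive psychophysical procedure. The sketch below shows a generic 1-up/2-down staircase with a simulated noisy observer; it illustrates the technique such experiments rest on, not the authors' exact procedure, and the noise level, step size, and trial count are invented for illustration.

```python
import random

def run_staircase(true_threshold, start=2.0, step=0.1, trials=300, seed=1):
    """Estimate a detection threshold with a 1-up/2-down staircase,
    which converges near the ~70.7%-correct point."""
    rng = random.Random(seed)
    intensity, streak, prev_dir = start, 0, 0
    reversals = []
    for _ in range(trials):
        # Simulated observer: detects motion above threshold, with noise.
        detected = intensity + rng.gauss(0.0, 0.05) > true_threshold
        if detected:
            streak += 1
            direction = 0
            if streak >= 2:            # two correct in a row: harder
                streak, direction = 0, -1
        else:                          # one miss: easier
            streak, direction = 0, 1
        if direction != 0:
            if prev_dir != 0 and direction != prev_dir:
                reversals.append(intensity)   # record reversal point
            prev_dir = direction
            intensity = max(0.0, intensity + direction * step)
    # Threshold estimate: mean of the last few reversal intensities.
    tail = reversals[-8:]
    return sum(tail) / max(1, len(tail))

print(round(run_staircase(0.7), 2))  # lands near 0.7
```

Averaging only the last reversals discards the initial descent from the deliberately easy starting intensity, so the estimate reflects the converged oscillation around threshold.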

DOI: http://dx.doi.org/10.1145/2134203.2134207
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC4334481

A Feedback-Controlled Interface for Treadmill Locomotion in Virtual Environments.

ACM Trans Appl Percept 2007 Jan;4(1)

The Schepens Eye Research Institute, Harvard Medical School.

Virtual environments (VEs) allow safe, repeatable, and controlled evaluations of the obstacle-avoidance and navigation performance of people with visual impairments using visual aids. Proper simulation of mobility in a VE requires an interface that allows subjects to set their own walking pace. Using conventional treadmills, subjects can change their walking speed by pushing the tread with their feet while leveraging handrails or ropes (self-propelled mode).
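The general idea behind a feedback-controlled treadmill interface can be sketched as a proportional controller that adjusts belt speed to keep the subject near the treadmill's midpoint. The gain, tick rate, and speed limit below are invented for illustration, not taken from the paper.

```python
def update_belt_speed(subject_pos_m: float, belt_speed_mps: float,
                      midpoint_m: float = 0.0, gain: float = 0.8,
                      dt: float = 0.05, max_speed: float = 2.0) -> float:
    """One control tick: speed the belt up if the subject drifts forward
    of the midpoint, slow it down if they drift backward."""
    error = subject_pos_m - midpoint_m      # positive = walked forward
    belt_speed_mps += gain * error * dt     # proportional correction
    return min(max(belt_speed_mps, 0.0), max_speed)

# A subject stepping 0.3 m ahead of center nudges the belt faster.
v = update_belt_speed(0.3, belt_speed_mps=1.0)
print(round(v, 3))  # prints 1.012
```

Run at each control tick, this loop lets the subject's own walking pace drive the belt, rather than forcing them to match a preset speed.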

DOI: http://dx.doi.org/10.1145/1227134.1227141
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2132658

Heading assessment by "tunnel vision" patients and control subjects standing or walking in a virtual reality environment.

ACM Trans Appl Percept 2007 Jan;4(1)

Schepens Eye Research Institute, Harvard Medical School.

Virtual reality locomotion simulators are a promising tool for evaluating the effectiveness of vision aids to mobility for people with low vision. This study examined two factors to gain insight into the verisimilitude requirements of the test environment: the effects of treadmill walking and the suitability of using controls as surrogate patients. Ten "tunnel vision" patients with retinitis pigmentosa (RP) were asked to identify which side of a clearly visible obstacle their heading through the virtual environment would take them, and were scored both on accuracy and on their distance from the obstacle when they responded.
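The judgment this task asks for is geometric: given a heading direction and an obstacle position in the subject's ground-plane coordinates, decide which side of the heading line the obstacle falls on. The sketch below illustrates that judgment with a 2D cross product; it illustrates the task itself, not the study's scoring code, and the coordinate convention is assumed.

```python
def side_of_heading(heading_xy, obstacle_xy):
    """Return 'left' or 'right': the side of the heading line on which
    the obstacle lies (the subject passes it on the opposite side).
    Coordinates: x = rightward, y = forward, origin at the subject."""
    hx, hy = heading_xy
    ox, oy = obstacle_xy
    cross = hx * oy - hy * ox   # z-component of heading x obstacle
    return "left" if cross > 0 else "right"

# Heading straight ahead (+y), obstacle slightly to the right (+x):
print(side_of_heading((0.0, 1.0), (0.5, 3.0)))  # prints right
```

Comparing a subject's response against this geometric answer, trial by trial, yields the accuracy score the abstract mentions.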

DOI: http://dx.doi.org/10.1145/1227134.1227142
PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1920177