ISAAC 2016

The International Society for Augmentative and Alternative Communication met in Toronto on 8–11 August 2016. Abstracts are available here.

My collaborators and I made three presentations. Through these, we got feedback on our projects and, gratifyingly, made contact with like-minded people working in our area.

Some have indicated that they could use some SensAct boards. This gives us the impetus to begin planning for a small-scale production.

Others asked us to join their effort in making an Accessible OS.

A shoutout to Mary Elizabeth McCulloch and the folks at Project Vive. They want to provide a simple but effective aural device for persons with CP. Their devices, to be locally assembled and maintained, could well have widespread applications. Check them out!

All in all, a very exciting and fruitful four days.

Posted in Uncategorized

Sense + Act = SensAct!

For the last 3+ years I have been making custom switches for AAC users at Bruyère Continuing Care’s Saint-Vincent Hospital. The impetus came from the fact that commercial switches sometimes do not suit our users’ needs. That is understandable: mass-produced devices by definition cannot cover the full spectrum of people’s capabilities. When one falls outside a device’s stated specifications, one simply cannot use it. A user may also ‘grow’ out of spec. For instance, we had a user who for a number of years quite adeptly used a single-switch device to operate her computer. That is, until she gradually lost both strength and movement in her one mobile finger. When she could move it only 2 mm, the commercial switch became impossible to operate. That’s when we outfitted her with a custom light-sensor switch.

We have come across quite a few cases where off-the-shelf means out-of-reach.

Having worked with a number of users, we now see some commonalities in their needs. That’s when SensAct was born.

The sense aspect is handled by a number of analog ports. We supply 5 V, so many hobbyist sensors can be readily adopted. In addition, we accept I2C input, which caters for an even wider array of sensors such as accelerometers and gyroscopes. We adopted common 3.5 mm audio cables as connectors, so the sensors are truly plug-and-play. Occupational therapists will be able to try different sensors to find the one best suited to a particular user.

The act aspect deals with commonly needed AAC functions such as controlling a desktop, tablet or smartphone, so we have both USB and Bluetooth (HID) outputs. In addition, we recognize that many users need to operate TVs and entertainment devices, so we also have on-board IR. Lastly, we provide two relays for activating common items such as the call bell.
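The core idea, then, is a configurable mapping from sensor readings to output actions. As a rough sketch of that idea (my illustration only, not the actual SensAct firmware; all names and thresholds here are hypothetical), a trigger table might pair each input port with a threshold and an action:

```cpp
#include <cassert>
#include <cstdint>

// Illustrative sketch: each input port has a configurable threshold, and
// crossing it fires a configured output action. Names and values here are
// hypothetical, not taken from the real firmware.
enum Action { NONE, MOUSE_CLICK, IR_TV_POWER, RELAY_CALL_BELL };

struct Trigger {
    uint8_t port;       // which analog port (0-based)
    int     threshold;  // 10-bit ADC value (0..1023)
    bool    above;      // fire when the reading rises above (true) or falls below
    Action  action;     // what to do when the trigger condition holds
};

// Return the first action whose trigger condition the reading satisfies.
Action evaluate(const Trigger* table, int n, uint8_t port, int reading) {
    for (int i = 0; i < n; ++i) {
        if (table[i].port != port) continue;
        bool fired = table[i].above ? (reading > table[i].threshold)
                                    : (reading < table[i].threshold);
        if (fired) return table[i].action;
    }
    return NONE;
}
```

An occupational therapist could then tune only the table, not the code, when adapting the device to a new user.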

I was determined to build something that is easy to configure. The software is thus browser-based and cross-platform. We experimented with two different approaches and will likely iterate more in order to come up with something flexible and useful that is also approachable to non-technical people.

SensAct is open source both in hardware and software. We realize that not everyone has the resources to DIY. We plan a small-scale production so that SensAct will be available off-the-shelf.

At last, off-the-shelf can mean within-reach!

*Thanks to Bocar N’Diaye for coordinating and spearheading at SVH; Hilary McKee, also at SVH, for unfailing faith in testing the various versions; Bruce Braidek for board layout; Nathan Lim for v2 of the configuration software; Bill Dawson for championing and advocacy; and many others who are part of the journey!


A $10 Blink Detector for Locked-in Syndrome Users

This DIY blink detector can be made in less than 10 minutes for under $10.

A borescope is mounted on a goggle to the side of one eye. Software interprets the camera’s video image to identify two (maybe three) eye gestures. The user could use a scanning ‘keyboard’ to blink out a message. Alternatively, she could blink Morse code.
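For the Morse option, a blink pattern of short and long closures maps to a letter. A minimal sketch of such a lookup (my illustration, not the project's actual code) could be:

```cpp
#include <cassert>
#include <cstring>

// Sketch of Morse decoding: "." is a short blink, "-" a long one.
// Table covers A-Z in order; lookup is a linear scan for clarity.
const char* MORSE[26] = {
    ".-",   "-...", "-.-.", "-..",  ".",    "..-.", "--.",  "....", "..",
    ".---", "-.-",  ".-..", "--",   "-.",   "---",  ".--.", "--.-", ".-.",
    "...",  "-",    "..-",  "...-", ".--",  "-..-", "-.--", "--.."};

// Return 'A'..'Z' for a recognized pattern, or 0 if unknown.
char decodeMorse(const char* pattern) {
    for (int i = 0; i < 26; ++i)
        if (strcmp(pattern, MORSE[i]) == 0) return 'A' + i;
    return 0;
}
```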

Instructional video: https://www.youtube.com/watch?v=C95J9l0416I&feature=youtu.be

[Photos: the borescope and its goggle mount, front and side views]

The borescope is mounted on a goggle so the camera does not move when the head moves. This makes the design a lot more robust than one that uses a webcam (see https://abilityspectrum.wordpress.com/2015/05/18/blink-is-bliss/).

The software is written in Processing and has been tested on OS X. Work is progressing to make it cross-platform.

We also intend to add functions such as email and texting.

https://github.com/AbilitySpectrum/blink_based_aural_scanning_keyboard_with_Morse_code_option


Blink is Bliss

I have made a webcam-based blink system that allows a user to blink out a message. Actually, the system only requires that one can effect two gazes: a regular gaze and a ‘select’ gaze. A helper uses the keyboard to register these two gazes at the beginning. I use template matching to determine whether the eye has blinked or made the ‘select’ gaze.

[Screenshot: the scanning keyboard interface]

How to use the system

See https://youtu.be/1–6nZVQz3c

Press 1 to start or reset, then press 2 when the gaze is steady (‘regular’ gaze). Press 3 to register the ‘select’ or ‘yes’ gaze. Press 4 to register the ‘no’ gaze (not used currently).

Press 5 to go into action!

The system reads out the ‘menus’ one at a time. There are 5 menus, with menus 1 to 4 corresponding to four rows of letters:

  • menu 1 – A, B, C, D, E, F
  • menu 2 – G, H, I, J, K, L, M
  • menu 3 – N, O, P, Q, R, S
  • menu 4 – T, U, V, W, X, Y, Z

When a menu is selected, the system reads out each of its letters in turn. A ‘select’ (‘yes’) gaze picks the letter and adds it to the line (in red). When a line fills up, it is automatically hoisted into a buffer (upper section, in green). The user can thus compose multiple lines of text.
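The selection itself is a simple two-level lookup: first a row (menu), then a position within it. A minimal sketch of that mapping, using the row contents listed above (illustrative code, not the actual Processing sketch):

```cpp
#include <cassert>

// Two-level scan: pick a menu (row), then a letter within it.
// Row contents follow the post; indices are 0-based here for illustration.
const char* ROWS[4] = {"ABCDEF", "GHIJKLM", "NOPQRS", "TUVWXYZ"};

// Return the letter at (row, pos), or 0 if either index is out of range.
char pickLetter(int row, int pos) {
    if (row < 0 || row > 3 || pos < 0) return 0;
    const char* r = ROWS[row];
    for (int i = 0; r[i]; ++i)
        if (i == pos) return r[i];
    return 0;
}
```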

Menu 5 provides a number of functions:

  • add a ‘space’ character
  • delete a letter, a word, or the line of text
  • warn before ‘line’ deletion: the announcement ‘erase coming’ lets the user skip the deletion by not selecting the next item
  • read the line of text
  • go into Morse entry mode
  • pause the program for 3 minutes
  • retrieve the last line in the buffer and put it in the current line for editing
  • make a bell sound to attract attention

There are several settings that can be set on the keyboard (e.g. to make the scan faster or slower). These are listed on the screen.

The Morse code entry option

In the main mode, the user waits for the selection to come by. This means that a typical word takes just under a minute to blink out. If the user knows Morse code, then he controls the pace of typing and can blink out a word in under 15 seconds (the speed is also adjustable using the keyboard).


Where we can, however we can

Featured in the Ottawa Citizen.


Head-mounted Gyro Mouse

Using a gyroscope in a mouse is not new. I bought one some ten years ago. As an in-the-air mouse replacement that does not require a mouse pad, it was novel, but not very effective: one’s movements have to be much slower to be accurate. For someone without the manual control a conventional mouse demands, however, a gyroscope becomes a good intermediary, letting another part of the body, such as the head, signal movement. A gyro mouse may be slower, but it can work accurately.

In the current iteration of the design, the gyroscope is mounted on a headband. Two additional sensors are used: one effects a click; a second, optional sensor acts as a switch to disable the gyroscope temporarily, avoiding unintended mouse movement. Either sensor can be a light or touch sensor. Conceivably, for a user who has no output other than head movement, we could algorithmically detect a special head movement for click control.

I have used the L3G4200D as well as the newer L3GD20, but I now use the InvenSense MPU-6050, which includes a 3-axis accelerometer as well. I’m standardizing on the MPU-6050 only to ease code maintenance.

Using head movement to control the mouse presents a different challenge. Originally, I used the accelerometer to detect positional change: when the head moves to the left, the mouse moves to the left to a corresponding degree. This, however, requires the head’s resting position to be where the mouse is initially centred on the screen. Moreover, careful calibration would be needed to map the sweep of the head movement to the size of the screen.

A more intuitive way to control the mouse is to use the gyroscope to detect the head movement: a slight tilt to the left (or right, or upwards, downwards) will kick the mouse into motion. Once in motion, the head could remain still. The mouse continues to move and is then stopped with a small opposing head movement. I find that an audible click signalling the stop provides valuable feedback.

I also built in a threshold for motion detection so that slow, natural movement is ignored. This allows the user to relax and move her head (slowly) around without triggering unintended mouse movement.
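My reading of the start/stop behaviour described above can be sketched for one axis as follows (an illustration of the idea, not the actual firmware; the state encoding and threshold handling are my assumptions):

```cpp
#include <cassert>

// One axis of the gyro mouse: a rotation rate beyond the threshold starts the
// cursor moving in that direction; a rate beyond the threshold in the opposite
// direction stops it. Rates within the threshold leave the state unchanged,
// so slow natural head movement is ignored.
struct AxisState { int dir; };  // cursor velocity sign: -1, 0, or +1

void update(AxisState& s, int rate, int threshold) {
    if (rate > threshold) {
        s.dir = (s.dir < 0) ? 0 : 1;   // opposing tilt stops; otherwise move +
    } else if (rate < -threshold) {
        s.dir = (s.dir > 0) ? 0 : -1;  // opposing tilt stops; otherwise move -
    }                                  // |rate| <= threshold: no change
}
```

A main loop would then move the cursor by `dir` (scaled) each tick, which is where the audible click on a transition to `dir == 0` would be emitted.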

The Arduino Leonardo has an HID interface, which sends out mouse control codes via the USB port. The Arduino-based gyro mouse easily interfaces with any PC without device drivers. It also interfaces with Android tablets using an OTG cable, which powers the Arduino at the same time.


More media

Canadian Healthcare Technology  Nov/Dec 2014
Saint-Vincent Hospital Helps Disabled Patients Communicate

Hospital News November 2014
The Power of People


Media

CBC Radio All in A Day
High-tech Headband

CFRA
Saint-Vincent Hospital helps patients with disabilities communicate 

Metro
Saint-Vincent patients get ‘window to the world’ with new technology

Ottawa SUN
Technology Brightens Lives at Saint-Vincent Hospital


Light Sensor as Multi-Level Switch

A simple light sensor provides a measure of the light incident on its photoresistor. If one moves a finger to touch and cover the sensor, the reading may range from 0 to 1023. In principle, that would differentiate more than a thousand shades.

In practice, I could reliably position a finger at perhaps 4 different points, from fully covering the sensor to about 2cm away, with 2 differentiable intermediate positions.

The code in the sensor library allows you to set the number of levels you wish to detect, say n, and returns a number between 0 and (n-1). It assumes that an analog sensor is connected to one of the Arduino’s analog pins; of course, the code works with any analog sensor. The code recalibrates when the readings are held, for a period of time (currently set to 10 seconds), at a value that is not the normal at-rest value (1 or 0).

To use, include the header file at the beginning of the code:

#include "AACsensors.h"

and then create the object, s:

AnalogSensor s(A0, 500, 0, 0);

where the four arguments are: pin number, the refractory period in milliseconds, normal high or low (1 or 0), and debug mode (0 if no debug messages are needed). The refractory period ensures that the reading is stable within that period.

To take a reading,

reading = s.Level(3);

where, in this example, the argument 3 is the number of levels one wishes to distinguish. The number returned is 0, 1, or 2.
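The quantization behind `Level(n)` might look like the sketch below. This is my guess at the idea, not the actual library code; the real implementation may quantize against calibrated minimum and maximum readings rather than the full 0..1023 range:

```cpp
#include <cassert>

// Quantize a 10-bit analog reading (0..1023) into n bands, returning 0..n-1.
// Illustrative only; AACsensors.h may use calibrated bounds instead.
int level(int reading, int n) {
    if (reading < 0) reading = 0;
    if (reading > 1023) reading = 1023;
    return (reading * n) / 1024;  // integer division yields 0 .. n-1
}
```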

There is a corresponding digital sensor function. Normally, one would just take the reading directly; my code for the digital sensor allows a refractory period to be used, again in order to obtain a stable reading. To set up, use the same argument signature as above:

DigitalSensor s(0, 500, 0, 0);

To take a reading,

reading = s.Read();

Code repository: https://github.com/AbilitySpectrum/ArduinoAAC

I have used both the wRobot Light Sensor as well as the Minimum Luminence Light Sensor.

[Screenshots: the two light sensor modules]


ISAAC 2014

The International Society for Augmentative and Alternative Communication met in Lisbon from July 21 to 24, 2014. This is a premier forum for researchers, practitioners and users. In my technical career, I have never been to a forum that is at once technical and practical, serious and fun.

The next conference will take place in Toronto, from August 6th to 13th, 2016.

The synopsis of our presentation is here. Some of the presentations are available here.
