Wednesday, April 25, 2012

EyeRing: Wearable Camera That Detects Objects For The Visually Impaired

EyeRing camera mounted on a finger
Image source: Engadget

Suranga Nanayakkara and Roy Shilkrot, researchers at the MIT Media Lab, have developed a new Bluetooth camera that is worn on a finger and used to identify objects. The camera has been designed with visually impaired people in mind, and can perform various tasks (based on the "mode" selected) that not only help with everyday chores but also assist in self-learning.

The concept of EyeRing is simple - a camera, connected via Bluetooth to a phone or similar device with voice output, is mounted on the user's finger. The user points the camera at an object and presses the shutter button on the side. The camera takes a picture of the object and sends it to the phone, which analyzes the image and speaks the result back to the user.

Picture this: a visually impaired shopper goes to a store. Wearing EyeRing on his finger, he selects the "distance" mode and points ahead; EyeRing calculates the free space available and speaks it out to him. He then starts looking at shirts. He switches to "color" mode and points the camera at the shirts; EyeRing detects and announces the color of each one. He then switches to "tag" mode and points EyeRing at the tag of the shirt he has selected; EyeRing reads out the price printed on it. Finally, to check how much money he has, the shopper pulls a couple of bills out of his wallet and points EyeRing at them; EyeRing reads the denominations and tells him how much the bills are worth.
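
To make that flow concrete, here is a minimal Python sketch of the point-shoot-speak loop, with a mode table like the one the scenario above walks through. Every name in it - the mode handlers, on_shutter_pressed, the stub speak() function - is invented for illustration; none of this is EyeRing's actual code.

```python
# Hypothetical sketch of the EyeRing loop: the ring captures a photo,
# sends it over Bluetooth, and the phone analyzes it according to the
# selected mode and speaks the result. All names are illustrative.

def estimate_distance(image_bytes):
    return "about two meters of free space ahead"   # stand-in result

def detect_color(image_bytes):
    return "this shirt is blue"                     # stand-in result

def read_text(image_bytes):
    return "twenty dollars"                         # stand-in OCR pass

# The phone dispatches on whichever mode the user selected on the ring.
MODES = {
    "distance": estimate_distance,
    "color": detect_color,
    "tag": read_text,       # price tags and currency are both
    "currency": read_text,  # handled by reading printed text
}

def speak(text):
    print(text)             # stand-in for the phone's text-to-speech

def on_shutter_pressed(mode, image_bytes):
    """Phone side: analyze the photo for the current mode, speak it."""
    speak(MODES[mode](image_bytes))

on_shutter_pressed("color", b"...")  # bytes received over Bluetooth
```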

Everyday shopping just got so much simpler for visually impaired people!

EyeRing can also be used by little kids who are learning how to read. Instead of someone telling them what a word is, they can simply point EyeRing at the word, and its OCR feature will speak the word back to them, enabling self-learning without anyone else's assistance.
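
As a rough stand-in for that point-and-read flow, here is a sketch built from off-the-shelf Python libraries (pytesseract for OCR, pyttsx3 for speech). The post does not describe EyeRing's actual recognition stack, so treat this only as an analogy.

```python
# A stand-in for EyeRing's point-and-read flow using off-the-shelf
# libraries (not EyeRing's actual stack):
#   pip install pillow pytesseract pyttsx3
# (pytesseract also needs the Tesseract OCR engine installed.)

from PIL import Image
import pytesseract
import pyttsx3

def read_word_aloud(photo_path):
    """OCR a photo of a word and speak whatever was recognized."""
    text = pytesseract.image_to_string(Image.open(photo_path)).strip()
    if text:
        engine = pyttsx3.init()
        engine.say(text)
        engine.runAndWait()   # blocks until speech has finished
    return text

print(read_word_aloud("word.jpg"))  # e.g. a photo the child just took
```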

According to the source link, EyeRing is still buggy, but it is being actively worked on, and hopefully in the coming months and years it will reach the market and make everyday life a lot easier for people with visual impairment.

Watch the following videos to see EyeRing in action. The first video shows how a shopper can use it at a store; in the second, the researchers talk about EyeRing in more detail.


Source: Engadget, EyeRing website

Monday, April 23, 2012

Blind Man Drives Google's Self Driving Car

"This is some of the best driving I have ever done."


Meet Steve Mahan - a man who, having lost 95% of his sight, is well past the threshold of legal blindness. Recently, Google asked him to test drive their self-driving car, and the result is just mind-blowing.

Watch the video to see Steve Mahan "driving" the self-driving car, with Google employees keeping him company. "Look ma, no hands," he says, as the Prius, decorated with Google logos, drives off down the street. The video shows the car carefully and quite accurately maneuvering turns and entering a Taco Bell drive-through, where Steve orders a burrito. While driving, Steve chats with his co-passengers (with, of course, no hands or feet touching any of the controls).

The car is still not perfect, but according to Google, the self-driving car project has completed more than 200,000 miles of computer-assisted driving since 2010. From what we see in the video, the car is on the right track!

Hit the source links to read more.

Source: Google+ via The Verge

Sunday, April 22, 2012

Mobile Accessibility Suite Free On Sprint, Boost and Virgin Mobile

Sprint, Boost Mobile, and Virgin Mobile subscribers in the US who are blind or visually impaired can download a suite of accessibility apps for their Android phones for free - a suite that usually costs $99. It consists of ten apps that an average user relies on daily -

  • Phone - Make/answer calls; manage call log.
  • Contacts - Manage contacts (even from Facebook).
  • Calendar - Manage and view calendar entries.
  • SMS - Send, receive and manage text messages.
  • Alarms - Set alarms.
  • Email - Full access to a Gmail account.
  • Where Am I - Tells you your current location.
  • Web - Browse the internet.
  • Apps - Access all apps on your phone.
  • Settings
The suite also provides the following:
  • A screen reader
  • Spoken status information - date/time, battery level, network coverage, missed calls, etc.
The suite can be installed on any phone running Android 2.1 and above. The screen reader feature, however, requires a phone with a physical navigation control (like a trackball) and only works on Android 2.2 and above.

Watch the video to see a quick demo of Mobile Accessibility.


Visit the following link to read more about the suite, read the phone providers' terms and conditions, and download the suite onto your phone.

Mobile Accessibility on Google Play

Remember, you can still buy this suite for $99 if you are signed up with another phone service provider.

Source: Engadget

Tongue Interface Using Kinect

Researchers at the University of Electro-Communications are busy developing an interface that detects tongue movements and lets users operate devices with their tongues. The interface is being designed for people who have difficulty speaking and swallowing (stroke victims, for example).

The interface uses the Kinect and works by first detecting the locations of the two eyes. From the eyes, it estimates the location of the tip of the nose, then the mouth area, and from the mouth area it finally extracts the actual movement of the tongue.
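
To illustrate that coarse-to-fine idea, here is a sketch of the eyes-to-nose-to-mouth chain using OpenCV's stock Haar cascades on a regular webcam. The Kinect pipeline and the researchers' actual offsets are not public in the post, so the fractions below are guesses made purely for illustration.

```python
# A stand-in sketch of the eyes -> nose -> mouth localization chain,
# using OpenCV Haar cascades on a webcam instead of a Kinect.
#   pip install opencv-python
# The fractions used to guess the nose and mouth positions are
# illustrative, not the researchers' actual values.

import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_mouth_region(gray):
    """Return an (x, y, w, h) mouth region estimated from the eyes."""
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    for (fx, fy, fw, fh) in faces:
        roi = gray[fy:fy + fh, fx:fx + fw]
        eyes = eye_cascade.detectMultiScale(roi)
        if len(eyes) < 2:
            continue
        # Midpoint between the two detected eye centers.
        (x1, y1, w1, h1), (x2, y2, w2, h2) = eyes[:2]
        mid_x = fx + (x1 + w1 // 2 + x2 + w2 // 2) // 2
        eye_y = fy + (y1 + h1 // 2 + y2 + h2 // 2) // 2
        nose_y = eye_y + int(0.35 * fh)    # nose tip: below the eyes
        mouth_y = nose_y + int(0.15 * fh)  # mouth: below the nose
        return (mid_x - fw // 4, mouth_y, fw // 2, fh // 4)
    return None  # no usable face in this frame

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("estimated mouth region:", locate_mouth_region(gray))
cap.release()
```

Tracking the tongue itself would then be a matter of watching how the dark region inside that mouth box moves from frame to frame.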

Using the interface efficiently requires some tongue training; one way to train is to practice moving the tongue left and right. The researchers have even created a game based on left-to-right tongue movement.

The interface is still a work in progress - quite raw and not very robust. The researchers are working on detecting tongue movements more precisely, and they plan to add lip-movement detection alongside tongue tracking in the future. Even so, this is another fine example of how technology and innovation can make lives better for everyone.

Watch the video to see this interface (and the game) in action.

Source: DigInfo via Engadget

Monday, April 16, 2012

Bionic Eye Research at Monash University


Microchip for the bionic eye (source: Monash)

Researchers at Monash University, who have been working on a project to deliver a direct-to-brain bionic eye implant by 2014, have started testing the microchip that will power the bionic eye. So far, the results have been very good, which means the project is on the right track.

The bionic eye would consist of three main parts - a camera mounted on a pair of glasses (standing in for the retina), a pocket processor that takes the information from the camera and converts it into signals the brain can understand, and a cortical implant made up of several tiles that stimulate the visual cortex (the part of the brain that processes signals from the eyes). Each tile carries the tiny microchips that are currently being tested.
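
As a toy model of that three-stage flow (camera, pocket processor, implant tiles), here is a sketch that reduces a frame to one brightness value per tile. The grid size and the mapping are invented for illustration only and do not reflect Monash's actual design.

```python
# A toy model of the pipeline described above: glasses camera ->
# pocket processor -> cortical implant tiles. The grid size and the
# brightness mapping are invented for illustration only.

import random

TILE_GRID = (5, 5)  # hypothetical layout of the implant's tiles

def capture_frame(width=100, height=100):
    """Glasses camera: return a grayscale frame (0-255 per pixel)."""
    return [[random.randint(0, 255) for _ in range(width)]
            for _ in range(height)]

def pocket_processor(frame):
    """Reduce the frame to one stimulation level per implant tile.

    A real processor would do far more (edge detection, contrast
    enhancement, safety limits); this only shows the data flow."""
    rows, cols = TILE_GRID
    h, w = len(frame), len(frame[0])
    levels = []
    for r in range(rows):
        for c in range(cols):
            block = [frame[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            levels.append(sum(block) // len(block))  # average brightness
    return levels

def stimulate(levels):
    """Cortical implant: drive each tile with its level (stub)."""
    for i, level in enumerate(levels):
        print(f"tile {i}: stimulation level {level}")

stimulate(pocket_processor(capture_frame()))
```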

The researchers' aim is to create something equivalent to a seeing-eye dog or a white cane. Initially, the bionic eye would complement those aids, but eventually it would replace them.

Pre-clinical assessments of the bionic eye will start soon.

Watch the following video to understand how a bionic eye works. This video is not related to the research being done at Monash, but gives a good overview of what researchers who work on such projects try to achieve.

Source: Monash

Wednesday, April 11, 2012

Tobii IS 2 - The Next Generation Eye Tracker

If you are a fan of the Tobii Controller or are interested in knowing more about eye controllers, you should check out the newest eye controller from Tobii - the IS 2. The Tobii IS 2 is 75% smaller than its predecessor, the IS 1, and consumes 40% less power.

The IS 2 has two infrared projectors that are used to measure the user's pupil size and two cameras that record the pupils' positions. The captured information is sent to the IS 2's processor, which in turn passes it on to whatever hardware or program the IS 2 is connected to.

Tobii plans to integrate the IS 2 into pretty much everything, from medical imaging and diagnostics to attention monitoring and training. If you are a developer, you can download the SDK and create apps that users can download from the Application Market for Tobii Eye Trackers.
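
If you are wondering what an app built on top of an eye tracker's data stream might look like, here is a hypothetical Python sketch with a fake tracker object standing in for the SDK. The post does not document the real Tobii SDK's API, so every name here is invented.

```python
# A hypothetical sketch of an app consuming an eye tracker's data
# stream. FakeTracker stands in for whatever object the real Tobii
# SDK provides; all names here are invented for illustration.

import random
import time

class FakeTracker:
    """Emits random gaze samples in place of a real SDK object."""
    def latest_sample(self):
        return {
            "x": random.random(),                  # normalized screen x
            "y": random.random(),                  # normalized screen y
            "pupil_mm": random.uniform(2.0, 5.0),  # pupil size estimate
        }

def watch_gaze(tracker, seconds=2.0, hz=10):
    """Poll gaze samples and report where the user is looking."""
    for _ in range(int(seconds * hz)):
        s = tracker.latest_sample()
        print(f"gaze=({s['x']:.2f}, {s['y']:.2f}) "
              f"pupil={s['pupil_mm']:.1f}mm")
        time.sleep(1 / hz)

watch_gaze(FakeTracker())
```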

Watch this video to learn more about the Tobii IS 2.

Source: Tobii via Engadget, Mike Paciello