
Wednesday, April 24, 2013

Computer Game Could Improve Sight of Visually Impaired Children

 
Apr. 24, 2013 — Visually impaired children could benefit from a revolutionary new computer game being developed by a team of neuroscientists and game designers.

Academics from the University of Lincoln, UK, are working with WESC, one of the UK's most respected specialist schools for visually impaired children, to create and evaluate a new 'visual search rehabilitation game'.

There are around 25,000 children in Britain -- equating to two children per 1,000 -- with a visual impairment of such severity they require specialist education support. The causes of blindness in children are extremely varied, but cerebral visual impairment (damage to areas of the brain associated with vision, rather than damage to the eye itself) is among the most common.

Researchers from Lincoln's School of Psychology and School of Computer Science will work with staff and children from WESC -- the specialist centre for visual impairment. The school and college, based in Exeter, has been providing education and care for young people with visual impairment since 1838 and is a designated High Performing Specialist School.

Together they have been awarded a grant worth around £130,000 for a Knowledge Transfer Partnership (KTP) which will apply the very latest research in visual neuroscience to the rehabilitation of childhood cerebral visual impairment and special education.

Timothy Hodgson, Professor of Cognitive Neuroscience in the School of Psychology at the University of Lincoln, will lead the project.

He said: "Previous research has shown that visual search training can lead to significant recovery of sight following damage to visual centres of the brain in adults. The problem is these training programmes are just too boring to use with children.

"Our game will be a fun computer-based tool which will benefit children with visual field loss -- holes in their vision due to damage to the brain's visual pathways.

"This is an exciting research project which brings together expertise from diverse disciplines and puts this knowledge into practice in a way that could make a real difference to the quality of life of visually impaired children.

"At the same time, we also expect the game will be suitable for the rehabilitation of adults who have suffered sight loss due to stroke."

The game will use principles derived from existing programmes used in adults with visual field loss, whereby patients have to search for hard-to-find objects on a computer screen (a 'visual search' task), but the game will be modified to make the task more stimulating and fun for children and structured to maximise the efficiency of learning.
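
As a rough illustration of what such a 'visual search' trial involves, the minimal Python/tkinter sketch below hides one odd-coloured target among uniform distractors and records how long the player takes to find and click it. The grid size, colours and scoring here are illustrative assumptions; the article does not describe the actual game's stimuli or mechanics.

```python
# A minimal sketch of a 'visual search' trial, for illustration only.
# Assumes desktop Python with tkinter; the WESC/Lincoln game's real
# stimuli, difficulty structure and scoring are not described here.
import random
import time
import tkinter as tk

GRID = 6                      # 6 x 6 grid of stimuli
CELL = 80                     # pixel size of each grid cell
TARGET_COLOUR = "red"
DISTRACTOR_COLOUR = "blue"

def run_trial():
    root = tk.Tk()
    root.title("Visual search trial (sketch)")
    canvas = tk.Canvas(root, width=GRID * CELL, height=GRID * CELL, bg="white")
    canvas.pack()

    target_cell = (random.randrange(GRID), random.randrange(GRID))
    start = time.monotonic()

    for row in range(GRID):
        for col in range(GRID):
            colour = TARGET_COLOUR if (row, col) == target_cell else DISTRACTOR_COLOUR
            x, y = col * CELL + CELL // 2, row * CELL + CELL // 2
            item = canvas.create_oval(x - 15, y - 15, x + 15, y + 15, fill=colour)
            if (row, col) == target_cell:
                # Record the search time when the target is clicked.
                canvas.tag_bind(
                    item, "<Button-1>",
                    lambda _e: (print(f"Found in {time.monotonic() - start:.2f}s"),
                                root.destroy()))

    root.mainloop()

if __name__ == "__main__":
    run_trial()
```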

Working alongside Professor Hodgson on the KTP will be Dr Conor Linehan, a specialist in computer game development based in Lincoln's School of Computer Science. They will oversee the work of KTP Associate Jonathan Waddington, an experienced computational neuroscientist, who will be based at WESC for the duration of the two-year project. Financial support for the project is provided by the Technology Strategy Board and the UK's Medical Research Council (MRC).

Tracy de Bernhardt Dunkin, Principal and CEO at the WESC Foundation, said: "This is a tremendously exciting development for WESC and the culmination of five years' work to introduce learning and research around neurological visual impairment. We are delighted to be employing our first visual neuroscientist, supervised by the University of Lincoln. We plan to expand our research and development department further over the coming years to reflect our interest in this highly specialist area of work which is so relevant to many young people with visual impairment across the UK as a whole."

Tuesday, October 16, 2012

Watch this robotic wheelchair turn its wheels into legs and climb over stairs

by George Dvorsky



Traditionally, wheels and stairs are a tragic combination, akin to mixing oil and water. Looking to overturn this convention, researchers at the Chiba Institute of Technology have developed a robotic wheelchair that can actually climb over steps.
The wheelchairbot can also line its wheels up and extend stabilizers to the left and right, enabling it to turn in a circle. This makes it easy to reverse, even in a narrow space.

The researchers explain: "We were particular about using wheels, because this kind of vehicle will mostly move on ordinary paved surfaces. The most efficient way of getting around on paved surfaces is to use wheels, like a car. So, this robot mainly uses wheels, but the wheels can become legs.

"For now, we're presenting this system and form as a concept, and the motion has mostly been worked out. So, we're at the stage where we can show this robot to the world. In the next phase, we'll get a variety of people to try it, so we can fine-tune the user experience."

Sunday, September 30, 2012

OpenDyslexic font gains ground with help of Instapaper



A free-to-use font designed to help people with dyslexia read online content is gaining favour.

OpenDyslexic's characters have been given "heavy-weighted bottoms" to prevent them from flipping and swapping around in the minds of their readers.

A recent update to the popular app Instapaper has added the font as an option for its users.

The font has also been built into a word processor, an ebook reader and has been installed on school computers.

The project was created by Abelardo Gonzalez, a New Hampshire-based mobile app designer, who released his designs onto the web at the end of last year.

http://www.bbc.com/news/technology-19734341

Tuesday, July 24, 2012

Smart Accessible Mobile Challenge 2012

The Vodafone Foundation Smart Accessibility Awards 2012 offer a €200k prize fund. The competition is calling on developers across Europe to design smartphone applications and services which consider the needs of older people and people with disabilities.

In support of the competition, Vodafone and Mobile Monday, in conjunction with NDRC Inventorium, have partnered with the NCBI Centre for Inclusive Technology and the Irish Internet Association (IIA) to run a two-day workshop aimed at generating smartphone app ideas to submit to the competition for a share of the €200k prize fund.

The two-day workshop will explore the challenges, problems and commercial opportunities that exist for developers and entrepreneurs when building smartphone applications that consider the needs of this significant demographic.

http://smartaccessiblemobile-estw.eventbrite.com/

Monday, July 9, 2012

Surgical implant will allow cyborg artist to see colors through sound

by George Dvorsky

For years now, colorblind artist Neil Harbisson has used a special head-mounted device to help him translate colors into sound. Not content to wear it on the head for the rest of his life, however, Harbisson has decided to have it surgically implanted. The upcoming procedure is part of the European artist's larger effort to get people accustomed to the idea of cybernetic implants.

Harbisson was born with a rare condition called achromatopsia, which limits his color perception to black and white. Eight years ago he developed a device that helped him correlate sound frequencies to the wavelengths of colors. At first he used headphones, but he has increasingly incorporated the device into his body. Even his passport photo shows him wearing the device — what he calls the eyeborg.
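
To make the idea concrete, the toy Python sketch below sonifies colour by converting a wavelength of light to its frequency and transposing it down a fixed number of octaves into the audible range. This is only a plausible illustration of the principle; the particular octave shift and example wavelengths are assumptions, not Harbisson's actual eyeborg mapping.

```python
# A toy sketch of colour-to-sound sonification, for illustration only.
# Transposing the light frequency down by a fixed number of octaves is
# an assumption, not the eyeborg's documented algorithm.
SPEED_OF_LIGHT = 3.0e8  # metres per second

def colour_to_tone(wavelength_nm: float, octaves_down: int = 40) -> float:
    """Map a visible wavelength (nanometres) to an audible frequency (Hz)."""
    light_hz = SPEED_OF_LIGHT / (wavelength_nm * 1e-9)
    return light_hz / (2 ** octaves_down)

if __name__ == "__main__":
    # Approximate wavelengths chosen purely as examples.
    for name, nm in [("red", 650), ("green", 530), ("blue", 470)]:
        print(f"{name:>5}: {colour_to_tone(nm):7.1f} Hz")
```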


Ukrainian students invent gloves that convert sign language into speech

by George Dvorsky

With all the marvelous text-to-speech and speech-to-text technologies currently in our midst, it's surprising to realize just how few of them actually serve as assistive devices — particularly for the hearing-impaired. But a new invention from a group of Ukrainian students is set to change all that: they have developed gloves that can translate the hand movements of sign language into speech.

Called EnableTalk, the gloves are fitted with flex sensors, touch sensors, gyroscopes and accelerometers — as well as solar cells to increase battery life (talk about attention to detail). A built-in system translates sign language into text and then into spoken words using a text-to-speech engine, and the whole setup works over Bluetooth, so it can connect to a smartphone. Created by the QuadSquad team, the project was a finalist at Microsoft's Imagine Cup, held in Sydney, Australia.
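
As a rough sketch of the pipeline described here (sensor readings, gesture recognition, then text-to-speech), the short Python example below matches a hypothetical sensor vector against stored gesture templates and hands the recognised word to a stand-in 'speak' step. The templates, sensor layout and nearest-neighbour matching rule are all illustrative assumptions; EnableTalk's actual recognition models are not described in the article.

```python
# A minimal sketch of a sign-to-speech pipeline:
# sensor readings -> recognised word -> text-to-speech.
# All values and the matching rule are illustrative assumptions.
import math

# Hypothetical calibration data: one averaged sensor vector per sign
# (e.g. five flex-sensor readings plus three accelerometer axes).
TEMPLATES = {
    "hello":     [0.9, 0.8, 0.8, 0.7, 0.9, 0.1, 0.0, 0.2],
    "thank you": [0.2, 0.3, 0.2, 0.1, 0.2, 0.5, 0.4, 0.1],
}

def classify(reading: list[float]) -> str:
    """Return the template word closest (Euclidean distance) to a reading."""
    return min(TEMPLATES, key=lambda word: math.dist(TEMPLATES[word], reading))

def speak(text: str) -> None:
    # A real glove would hand this string to a text-to-speech engine,
    # e.g. on a paired smartphone; printing stands in for that step here.
    print(f"[TTS] {text}")

if __name__ == "__main__":
    sample = [0.85, 0.82, 0.79, 0.72, 0.88, 0.12, 0.05, 0.18]
    speak(classify(sample))
```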
