Wednesday, April 24, 2013
Academics from the University of Lincoln, UK, are working with WESC, one of the UK's most respected specialist schools for visually impaired children, to create and evaluate a new 'visual search rehabilitation game'.
There are around 25,000 children in Britain -- equating to two children per 1,000 -- with a visual impairment of such severity they require specialist education support. The causes of blindness in children are extremely varied, but cerebral visual impairment (damage to areas of the brain associated with vision, rather than damage to the eye itself) is among the most common.
Researchers from Lincoln's School of Psychology and School of Computer Science will work with staff and children from WESC -- the specialist centre for visual impairment. The school and college, based in Exeter, has been providing education and care for young people with visual impairment since 1838 and is a designated High Performing Specialist School.
Together they have been awarded a grant worth around £130,000 for a Knowledge Transfer Partnership (KTP) which will apply the very latest research in visual neuroscience to the rehabilitation of childhood cerebral visual impairment and special education.
Timothy Hodgson, Professor of Cognitive Neuroscience in the School of Psychology at the University of Lincoln, will lead the project.
He said: "Previous research has shown that visual search training can lead to significant recovery of sight following damage to visual centres of the brain in adults. The problem is these training programmes are just too boring to use with children.

"Our game will be a fun, computer-based tool which will benefit children with visual field loss -- holes in their vision due to damage to the brain's visual pathways.

"This is an exciting research project which brings together expertise from diverse disciplines and puts this knowledge into practice in a way that could make a real difference to the quality of life of visually impaired children.

"At the same time, we also expect the game will be suitable for the rehabilitation of adults who have suffered sight loss due to stroke."
The game will use principles derived from existing programmes for adults with visual field loss, in which patients search for hard-to-find objects on a computer screen (a 'visual search' task). It will be modified to make the task more stimulating and fun for children, and structured to maximise the efficiency of learning.
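The core of such a visual search trial is simple to sketch: one target is hidden among distractors at random positions, and difficulty can be tuned by board size and distractor count. The function name and parameters below are hypothetical, for illustration only -- the article does not describe the game's actual implementation.

```python
import random

def make_search_trial(grid_size=8, n_distractors=10):
    """Place one target and several distractors at distinct random
    cells of a grid_size x grid_size board, as in a visual search task.
    Raising n_distractors (or grid_size) makes the target harder to find."""
    cells = [(r, c) for r in range(grid_size) for c in range(grid_size)]
    picks = random.sample(cells, n_distractors + 1)  # all positions distinct
    target, distractors = picks[0], picks[1:]
    return target, distractors

target, distractors = make_search_trial()
```

A training programme would then present trial after trial, logging search times to adapt difficulty to the child's performance.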
Working alongside Professor Hodgson on the KTP will be Dr Conor Linehan, a specialist in computer game development based in Lincoln's School of Computer Science. They will oversee the work of KTP Associate Jonathan Waddington, an experienced computational neuroscientist, who will be based at WESC for the duration of the two-year project. Financial support for the project is provided by the Technology Strategy Board and the UK's Medical Research Council (MRC).
Tracy de Bernhardt Dunkin, Principal and CEO at the WESC Foundation, said: "This is a tremendously exciting development for WESC and the culmination of five years' work to introduce learning and research around neurological visual impairment. We are delighted to be employing our first visual neuroscientist, supervised by the University of Lincoln. We plan to expand our research and development department further over the coming years to reflect our interest in this highly specialist area of work, which is so relevant to many young people with visual impairment across the UK as a whole."
Sunday, September 30, 2012
A free-to-use font designed to help people with dyslexia read online content is gaining favour.
OpenDyslexic's characters have been given "heavy-weighted bottoms" to prevent them from flipping and swapping around in the minds of their readers.
A recent update to the popular app Instapaper has adopted the text format as an option for its users.
The font has also been built into a word processor, an ebook reader and has been installed on school computers.
The project was created by Abelardo Gonzalez, a New Hampshire-based mobile app designer, who released his designs onto the web at the end of last year.
Tuesday, July 24, 2012
In support of the competition, Vodafone and Mobile Monday, in conjunction with NDRC Inventorium, have partnered with the NCBI Centre for Inclusive Technology and the Irish Internet Association (IIA) to run a two-day workshop to stimulate smartphone app ideas for submission to the competition, which offers a share of €200k in prize money.
The two-day workshop will explore the challenges, problems and commercial opportunities that exist for developers and entrepreneurs when building smartphone applications that consider the needs of this significant demographic.
Monday, July 9, 2012
by George Dvorsky
For years now, colorblind artist Neil Harbisson has used a special head-mounted device to help him translate colors into sound. Not content to wear it on his head for the rest of his life, however, Harbisson has decided to have it surgically implanted. The upcoming procedure is part of the European artist's larger effort to get people accustomed to the idea of cybernetic implants.
Harbisson was born with a rare condition called achromatopsia, which limits his color perception to black and white. Eight years ago he developed a device that helped him correlate sound frequencies to the wavelengths of colors. At first he used headphones, but he has increasingly incorporated the device into his body. Even his passport photo shows him wearing the device — what he calls the eyeborg.
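The principle behind such a device can be sketched as a simple transposition from the visible spectrum onto an audible band. The mapping below is purely illustrative -- the eyeborg's actual transposition is not described in the article, and the ranges chosen here are assumptions.

```python
def wavelength_to_tone(nm, f_low=200.0, f_high=700.0):
    """Map a visible wavelength (roughly 380-750 nm) onto an audible
    frequency band. Illustrative linear mapping only: long (red)
    wavelengths become low tones, short (violet) wavelengths high tones."""
    lo, hi = 380.0, 750.0
    nm = max(lo, min(hi, nm))        # clamp to the visible range
    frac = (hi - nm) / (hi - lo)     # 0.0 at deep red, 1.0 at violet
    return f_low + frac * (f_high - f_low)
```

Fed a stream of such tones, a wearer can learn over time to associate pitch with hue, which is how Harbisson describes perceiving color.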
by George Dvorsky
With all the marvelous text-to-speech and speech-to-text technologies currently in our midst, it's surprising to realize just how few of these devices actually serve as assistive devices -- particularly for the hearing-impaired. But a new invention from a group of Ukrainian students is set to change all that: they have developed a glove that can translate sign-language movements into speech.
Called EnableTalk, the gloves are fitted with flex sensors, touch sensors, gyroscopes and accelerometers -- as well as solar cells to extend battery life (talk about attention to detail). A built-in system translates sign language into text and then into spoken words using a text-to-speech engine, and the whole system works over Bluetooth, enabling smartphone connection. The project, created by the QuadSquad team, was a finalist at Microsoft's Imagine Cup, held in Sydney, Australia.
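The sign-to-text step at the heart of such a glove can be sketched as gesture classification: compare the current sensor reading against stored templates and pick the nearest match. Everything below is a hypothetical illustration -- the template values are made up, and EnableTalk's actual recognition method is not described in the article.

```python
import math

# Hypothetical gesture templates: one flex-sensor value per finger,
# paired with the word the gesture stands for. Values are invented.
TEMPLATES = {
    "hello":     [0.1, 0.1, 0.1, 0.1, 0.1],
    "thank you": [0.9, 0.9, 0.2, 0.2, 0.9],
}

def classify(reading):
    """Nearest-neighbour match of a sensor reading against the templates,
    sketching the sign-to-text stage of a glove like EnableTalk."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda word: dist(reading, TEMPLATES[word]))

word = classify([0.15, 0.05, 0.1, 0.2, 0.1])  # nearest template is "hello"
```

The recognized word would then be handed to a text-to-speech engine, completing the sign-to-speech pipeline the students describe.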