
Communications of the ACM

ACM News

Can AI Detect Dyslexia?


[Image: a dyslexia word cloud. Credit: Paradoxfx/Dreamstime.com]

Dyslexia is a learning disability affecting 5% to 15% of Americans that makes it difficult for a person to read, write, and spell. It often isn't detected until a child is in the 4th grade (around age 9) or later. Since most children make mistakes when they start learning to read and write, warning signs of the condition are rarely spotted early on.  "At the moment, we're often waiting for students to fail or to be far behind everyone else before they get help," says Maria Rauschenberger, a researcher at the Max Planck Institute for Software Systems in Saarbrücken, Germany.

Detecting dyslexia early could be critical to managing the condition. Although dyslexia can be treated through training, often by teaching compensation strategies, it typically takes about two years for a child identified as having dyslexia to catch up to their peers. If they fall behind academically for too long, they are at growing risk of dropping out of high school. Furthermore, undiagnosed dyslexic children often suffer from frustration and low self-esteem, which becomes more pronounced with time and increasingly harder to remedy.

"Not all people can easily figure out how to compensate," says Ricardo Baeza-Yates, a professor and founder of the Web Science and Social Computing Research Group at Pompeu Fabra University in Barcelona, Spain, and a professor at the Khoury College of Computer Sciences of Northeastern University's Silicon Valley campus. "That's why it's important to intervene very early, basically when they are starting to learn to read and write." 

A growing number of researchers are looking at how artificial intelligence (AI) could help detect dyslexia early on. Rauschenberger, Baeza-Yates, and their colleagues, for example, are developing what they call the first Web-based game that could be used to screen a child's risk of having the condition before they can read and write, based on language-independent content and using machine learning. Although specialized therapists, fMRI scans, and eye-tracking systems can help assess dyslexia in pre-readers, no cheap and easy technique is currently available, says Rauschenberger. "I think this could be very helpful because it would be the first consumer product," she says. "It would be easily accessible."

Instead of focusing on aspects of dyslexia linked to reading and writing, the game uses visual and auditory cues to screen for less-obvious indications of the condition. Dyslexics confuse certain similar sounds and shapes, for example, and their short-term memory can be impaired, too. Because the game relies on these indicators rather than on text, it is accessible to non-readers, as well as to speakers of any alphabetic language.

In the visual mode, a player is shown a visual cue, which then disappears; a few seconds later, the player is asked to pick that cue out of a group of nine similar shapes. Since dyslexics are known to confuse letters with similar symmetries, such as b and d, some of the distractor shapes are simply flipped or differ in orientation from the original.
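The game itself is not published as code, but the mechanic can be illustrated with a small, entirely hypothetical sketch: distractors are derived from a target shape by mirroring it, mimicking the b/d-style confusions the game exploits.

```python
# Hypothetical illustration, not the researchers' code: build look-alike
# distractors by mirroring a target shape, as with b vs. d.
def hflip(shape):
    """Mirror a shape left-to-right."""
    return [row[::-1] for row in shape]

def vflip(shape):
    """Mirror a shape top-to-bottom."""
    return shape[::-1]

target = ["X..",
          "XX.",
          "XXX"]
# a few orientation variants; the real game presents nine similar shapes
variants = [target, hflip(target), vflip(target), hflip(vflip(target))]
```

The player's task would then be to pick the original `target` back out of a grid of such variants after a short delay.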

In the auditory version of the game, the gameplay is similar to that of the classic Memory pair-matching game. A player is presented with cards on a screen, and clicks on one to hear a sound; the goal is to find two sounds that are the same.

In initial tests, the game was played by 313 children who could read and write and spoke either German or Spanish. Some had been diagnosed with dyslexia, while others had no signs of the condition. Machine learning models, which involved weighted decision trees, were then trained with data collected from the game to see if they could predict which players had dyslexia and which did not.
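The article does not describe the models in detail, but the core idea of a weighted decision tree can be sketched with its simplest building block: a single weighted decision stump that splits on one gameplay feature. The feature (mean seconds per click), the data, and the class weights below are all hypothetical.

```python
# Minimal sketch, not the authors' pipeline: a weighted decision stump,
# the one-split building block of a weighted decision tree.

def fit_stump(xs, ys, weights):
    """Find the threshold on one feature that maximizes weighted accuracy."""
    best = None
    for t in sorted(set(xs)):
        # predict dyslexia (1) when the measurement exceeds the threshold
        preds = [1 if x > t else 0 for x in xs]
        acc = sum(w for p, y, w in zip(preds, ys, weights) if p == y) / sum(weights)
        if best is None or acc > best[1]:
            best = (t, acc)
    return best

# hypothetical data: mean seconds per click; 1 = diagnosed with dyslexia
clicks = [1.2, 1.4, 2.8, 3.1, 1.1, 2.9]
labels = [0, 0, 1, 1, 0, 1]
# up-weight the minority class, a common choice for imbalanced screening data
weights = [1.0 if y == 0 else 1.5 for y in labels]

threshold, acc = fit_stump(clicks, labels, weights)
```

A full tree would apply such splits recursively over many gameplay features; the weighting lets rarer dyslexic examples count more during training.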

The team found its models could predict dyslexia reasonably well. The highest accuracy they achieved was 74% with German speakers. Typically, tests with reading and writing cues, which are stronger indicators of dyslexia, can detect the condition with an accuracy of about 85%. Baeza-Yates thinks their system is promising, since the difference in accuracy isn't huge and it's their first attempt. "We hope to increase [the accuracy] by using more data," says Rauschenberger.

Analyzing those models revealed which aspects of the gameplay were being used to distinguish children with dyslexia from the rest. Similar indicators were expected for German and Spanish speakers, but Rauschenberger and her colleagues found many differences between the two language groups.

For example, children with dyslexia typically take more time to process information, so it was thought that they would play the game more slowly. This was observed among the Spanish speakers, where dyslexic children took longer on average to click on game icons compared to non-dyslexics. In the sound game, Spanish children with dyslexia took longer to find matching pairs, but there was little difference in the activity speed of German participants with and without the condition.

The team thinks cultural differences could explain varying performance in the sound game. "The only guess we have is that maybe German students have more knowledge of music compared to Spanish students," says Baeza-Yates.

Once the game is perfected, Rauschenberger thinks it could be used as an initial test for dyslexia. Parents could then follow up by taking their child to a therapist for an evaluation or, if the child can already read and write, by using linguistic screening tools. "Having it as a consumer product to guide people to more help would be very helpful and possible," she says.

Another research group is trying to detect dyslexia using handwriting, another indicator that hasn't been used much before. Katie Spoon of IBM Research was at Indiana University Bloomington when she and her colleagues became keen to speed up the process of getting screened for dyslexia, since they were aware of how long it can take. "We were looking for something that would be fast and cheap and easy to collect," she recalls, adding that "handwriting came up as a possibility."

Previous research has identified aspects of handwriting that can differ between dyslexics and non-dyslexics. Instead of writing horizontally, for example, dyslexics will often write in a way that slopes upward or downward. They also may spell poorly and omit spaces between words, but it is often a combination of factors that requires a trained professional to pick out.

Spoon and her colleagues wanted to see if deep learning would be able to predict dyslexia from handwriting. A deep learning model learns on its own, so it doesn't need to be trained to recognize specific features. "You hope it learns the features that are important," says Spoon.

Up to 800 handwriting samples were collected from elementary school students, about 15% of whom had been diagnosed with dyslexia. Random patches of handwriting of the same size were extracted from the samples and were fed into a standard convolutional neural network (CNN), a deep learning model widely used to analyze images, which would then decide if a sample was indicative of dyslexia or not.   
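The patch-extraction step described above can be illustrated with a short sketch (our own, not IBM's code): fixed-size windows are cut at random positions from a scanned page, here represented as a 2-D list of grayscale pixel values.

```python
import random

# Illustration of random fixed-size patch extraction, not IBM's code.
def random_patches(image, patch_size, n, rng):
    """Cut n patch_size x patch_size windows at random positions."""
    h, w = len(image), len(image[0])
    patches = []
    for _ in range(n):
        top = rng.randrange(h - patch_size + 1)
        left = rng.randrange(w - patch_size + 1)
        patches.append([row[left:left + patch_size]
                        for row in image[top:top + patch_size]])
    return patches

# a toy 8x10 "scan"; real inputs would be scanned handwriting pages
image = [[(r * 10 + c) % 256 for c in range(10)] for r in range(8)]
patches = random_patches(image, patch_size=4, n=5, rng=random.Random(0))
```

Each patch would then be passed to the CNN, which outputs a score indicating whether the handwriting looks dyslexic; sampling many patches per page gives the network uniform-sized inputs regardless of page layout.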

Initial results showed the system could pick out samples that belonged to dyslexic children with about 77% accuracy. Although Spoon says that isn't accurate enough to be used for diagnostic purposes, it suggests the model is identifying features of handwriting that can distinguish the two groups. "I think it's very encouraging," she says.

Spoon and her colleagues tried to figure out which features the CNN might be recognizing. They first visually inspected writing samples to see if there were noticeable differences between those identified as having dyslexia or not.  

The most obvious distinguishing feature was what Spoon describes as messiness. When the researchers classified handwriting samples as legible, partially legible, or illegible, they found that 84% of those marked illegible and 60% of those that were partially legible were from children with dyslexia. "That shows a pretty clear distinction for what we saw visually," says Spoon, who thinks their model could be recognizing messiness, too.

Spoon stresses they are still in the early stages. So far, their model hasn't been trained with a lot of data, and it is all from the same school system, which could introduce bias. The researchers are still exploring whether handwriting is an appropriate indicator to use in the first place.

With further work, however, their system could help with early screening of dyslexia. Spoon thinks it has potential as an easy way to spot red flags, which could be followed up with a more thorough evaluation, for example. "It's not supposed to say anything definite," says Spoon. "I see it being used in conjunction with other tools."

Sandrine Ceurstemont is a freelance science writer based in London, U.K.
