
Chip Technology Could Help the Blind, Robots, See

A patient in Paris wears Pixium's Bionic Vision Restoration System. A time-encoded ATIS imaging chip inside the eyeglasses starts the process by which patients such as this man are able to spot doors and stairs, identify objects on tables, and accomplish other rudimentary visual tasks.

Over a decade ago, physicist Christoph Posch helped develop particle detector chips for the Large Hadron Collider (LHC) at Switzerland's renowned European Organization for Nuclear Research (CERN) laboratory. Some 60,000 of his circuits inside the ATLAS muon detector would go on to help find the long-hypothesized Higgs boson in 2012. The CMOS devices, still active today, use an "event-based" scheme to record when particles fly by.

The LHC, it turns out, was just an opening act.

Posch left CERN in 2004. Enthused by the California Institute of Technology's pioneering work on circuitry that mimics biological and neurological processes, he set about developing a chip that would deliver computer vision based on how the human eye and brain work together, in collaboration with Tobias Delbruck of the Swiss Federal Institute of Technology in Zurich (ETH Zurich).

"Initially, there was no obvious connection between the stuff I did back at CERN and the new research on bio-inspired vision," Posch said. "However, at some point it appeared to me that combining ideas and approaches from these two seemingly unrelated fields may result in something disruptive."

Disruptive indeed.

Fast-forward to today, to two young companies that Posch helped found in Paris.

One of the outfits, publicly traded Pixium Vision, has already restored rudimentary sight to 18 blind people with a system it calls IRIS, which combines chips and digitized glasses connected to the brain via retinal implants and the optic nerve. Later this year, it will equip several patients with an advanced version, called PRIMA, that provides an improved level of vision.

In an entirely different application, the other company, Chronocam, will help machines on the factory floor perform tasks such as inspecting finished goods, monitoring safety conditions, and picking and packing consumer goods from assembly lines, and will make it easier for robots to collaborate with humans.

Beyond the Frame

At the root of Pixium's and Chronocam's endeavors is an imaging chip that Posch and colleague Daniel Matolin invented in 2008 at the Austrian Institute of Technology in Vienna, one that radically departs from imaging's 150-year-old "frame" approach. The chip, called an Asynchronous Time-based Image Sensor (ATIS), captures only things that change, rather than constantly recapturing everything in an image for each new frame.

ATIS takes the "event-based" approach that Posch also applied in his CERN semiconductors, using what he described as "time-domain encoding of information." Simply put, whereas conventional imaging chips encode pixel values in electrical quantities such as voltage, current, or charge, the ATIS devices convert the amount of light shining onto a sensor's pixel into a time interval: the time between two circuit pulse edges.

"The information is transmitted off the sensor this way; the timing of electrical pulses sent by the sensor is carrying the image information," Posch explained. It's a twist on the CERN chips, which did not use pixels and light; rather, they took the charge generated by a passing particle, rather than light, and encoded it as time.

The move away from frames to the event-based time approach ushers in enormous benefits in system efficiency. It avoids constant snapshots, and the constant re-recording and resending of unchanged information. It also taps only the pixels that have changed, whereas the frame approach continuously updates all pixels whether needed or not, which, as Posch noted in an IEEE paper, "leads to redundancy in the recorded image data, unnecessarily inflating data rate and volume."

The frame approach can also miss an event if it occurs in between frames.

The practical ramifications are that the technology can underpin visioning systems that consume fewer computing resources per action, making far better use of power, storage, data transfer rates, you name it.
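The efficiency argument is easy to see in a small simulation. The sketch below is illustrative only; the log-intensity thresholding and the per-event byte count are assumptions made for the example, not Chronocam's implementation. It compares what a frame-based sensor retransmits for two consecutive frames with the handful of change events an event-based sensor would emit when a single bright spot moves by one pixel:

```python
import numpy as np

def change_events(prev, curr, t, threshold=0.15):
    """Emit (y, x, t, polarity) records only for pixels whose log
    intensity changed by more than `threshold` since the last reading."""
    diff = np.log(curr) - np.log(prev)
    ys, xs = np.nonzero(np.abs(diff) > threshold)
    return [(y, x, t, 1 if diff[y, x] > 0 else -1) for y, x in zip(ys, xs)]

# Two synthetic 64x64 frames: one bright spot moves by a single pixel.
prev = np.full((64, 64), 0.2)
curr = prev.copy()
prev[10, 10] = 1.0
curr[10, 11] = 1.0

events = change_events(prev, curr, t=0.001)
frame_bytes = curr.size        # 1 byte per pixel, resent every frame
event_bytes = 8 * len(events)  # ~8 bytes per event record (assumed)
print(f"{len(events)} events vs {curr.size} pixels per frame "
      f"({event_bytes} bytes vs {frame_bytes} bytes)")
```

For this toy scene, the sensor reports just two events (an OFF event where the spot left, an ON event where it arrived) instead of re-sending 4,096 pixel values, changed or not.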

Bio-inspired

The ATIS approach takes its cue from neuromorphic engineering, a concept pioneered in the late 1980s at the California Institute of Technology by Carver Mead, and pursued over the years by a host of others, including Posch and Matolin, and researchers at ETH Zurich. Neuromorphic engineering recognizes that the way biological nervous systems transmit and process information can provide a superior template for man-made computing systems; thus, computer scientists should emulate them.

"If the brain worked like a computer, it would weigh 17 kilograms," said Bernard Gilly, chairman of iBioNext, a Paris-based investment firm behind Pixium, Chronocam, and other young companies working with ATIS and related bio-influenced technologies. To put that in perspective, 17 kg., or 38.5 pounds, is about 12 times larger than the average human brain.

"ATIS is encoding visual information in the timing of pulses, just like biology does," said Posch. "All information in biology is encoded in the timing of 'spikes'; all computation in the brain is done based on spike timing. And the temporal resolution and dynamic range of data produced by ATIS match those of the human eye. ATIS inherently talks the language of biology."

For Pixium and the blind, ATIS "makes it straightforward to derive stimulation signals for the retina from ATIS data. With ATIS, it becomes possible to stimulate the retina's ganglion cells at their native temporal resolution," Posch added. Such stimulation (at less than one millisecond) would be impossible with conventional 30-frame-per-second imaging, which refreshes only once every 33 milliseconds, he said.

Mimicking human biology is a fine idea, but it takes more than a single chip to put the concept to use for things like correcting blindness and giving sight to robots.

Pixium's bionic answer to alleviating blindness was to engineer an entire system. In the current IRIS model, a large pair of glasses houses the ATIS imaging chip, which sends the information it gathers to a "pocket processor" housed in a black box about the size of a small walkie-talkie. The box sends the information about what ATIS has seen back to the glasses, which transmit it via infrared light waves to a receiver chip implanted on the side of the eye, hidden in the orbital socket. A ribbon runs from the receiver to a set of tiny electrodes tacked to the back of the retina in a replaceable and upgradeable manner; those electrodes stimulate retinal ganglion cells. The cells then transmit to the brain via the optic nerve, which remains intact in most cases of retinitis pigmentosa and macular degeneration, the types of blindness Pixium targets.

The company is now on its second version of IRIS, which includes 150 electrodes, 100 more than in version one. With the enhancement, system users can see a little bit more than before. The vision is nowhere near full sight, but it gives them a degree of light and shape perception, which allows them to localize an object such as, say, a glass on a table, or to find a door, or to note whether stairs go up or down.

The next big step will be PRIMA, which aims to improve the picture so the user will approach the ability to recognize faces. It eliminates the intermediary receiver chip and instead allows the glasses to send signals straight to wireless micro photodiodes implanted underneath the retina, where the photoreceptors (rods and cones) have degenerated. The photodiodes house as many as 400 electrodes on a single chip, and eventually several thousand with multiple chips. PRIMA also shrinks the size of the pocket processor, and reduces the amount of time it takes surgeons to implant the photodiodes. Five trial patients are scheduled to receive PRIMA implants at the University of Pittsburgh Medical Center late this year or early next, according to Pixium CEO Khalid Ishaque.

Combining Disciplines

To make all of this work, Pixium has relied on a multidisciplinary team of experts including Posch, as well as ophthalmologist José-Alain Sahel, mathematician Ryad Benosman (who derived the system's algorithms), and entrepreneur Gilly. All are co-founders.

Sahel helped oversee development of the system at the Pierre and Marie Curie University's Vision Institute in Paris, before spinning it out as Pixium. Sahel recently left his Paris university post to chair the ophthalmology department at the University of Pittsburgh Medical Center, which will undertake the first PRIMA surgery.

The person overseeing the ongoing coordination and strategy at Pixium is Ishaque, who started out in yet another field of expertise before moving into digital neuroscience and biology, spending 10 years with Boston Scientific before signing on to lead Pixium in 2014.

"I'm from aeronautics, where people specialize in different things; you have structures, you have mechanics, you have materials, electronics, and signal processing," said Ishaque. "I was fascinated by 'how does it all work together?' And this project is about all these things coming together. One of the key components is, how do you replicate the function of the human retina into a pair of glasses with a camera and transmit via the implant in a way that can be interpreted by the brain? Until one day when we find a way to transplant an eye, we are trying to do all that in an artificial way, using computing, intelligent algorithms, and the way you capture a scene—totally challenging the history of how a video camera worked."

It's a similar mix at Chronocam, where the co-founders again include chief technology officer Posch, as well as Benosman and Gilly. Also on the founders roster are ATIS co-inventor Matolin, and CEO Luca Verre, whose background is in business strategy and development. Investors include Intel Capital, Renault, and Robert Bosch Venture Capital Partners. The company also has a partnership with the Renault-Nissan Alliance.

"It's about algorithms' conception, programming, and porting in an efficient and performing manner into a processing platform for a dedicated usage," said Verre. As Chronocam notes on its website, "The company combines more than two decades of experience in neuromorphic computing, CMOS sensor design, and VLSI chip engineering. Its development team has built the foundation a true paradigm shift in computer vision that significantly improve traditional methods of capturing visual information."

Verre noted the ATIS-based system allows visioning systems to collect only a thousandth of the data that conventional, frame-based systems collect. He anticipates installing the technology on a live factory floor by the end of the year. He won't say where, but said the application will be along the lines of helping machines to inspect finished goods, monitor safety, or pick and pack consumer goods. Verre described the system's current quality as being at the VGA level, and said Chronocam expects to deliver "high definition" by the end of next year.

Chronocam envisions a $50-billion market for computer visioning by 2022, with uses including guiding autonomous vehicles; collision warning systems for cars; driver monitoring and assistance; robot-to-human collaboration; Internet of Things devices for the home, city, and workplace; and prosumer virtual reality, augmented reality, and health monitoring devices.

The system does not yet support color images. Benosman explained that color would triple the computational requirement because it would have to process separately for red, green, and blue (as color television cameras do). With that as a challenge, he said Chronocam is working on a "unique" way to support color, which Verre expects to be ready by next year.

It won't have a free ride. Competitors include Samsung, as well as spinoffs from ETH Zurich such as iniLabs, iniVation, and Insightness. But Chronocam is off to an impressive start: the World Economic Forum in June anointed the company as a WEF Technology Pioneer.

The underlying technology, with its roots in event-based "time domains" and in biology, and with its wide range of possible applications, is certainly one to keep an eye on.

Mark Halper is a freelance journalist based near Bristol, England. He covers everything from media moguls to subatomic particles.
