
Communications of the ACM


Embracing Noise or Why Computer Scientists Should Stop Worrying and Learn to Love the Errors

Geeky Ventures Founder Greg Linden
When people talk about differences between computers and the brain, they often emphasize the brain's massively parallel processing.  There are on the order of a hundred billion neurons in the brain, each firing at a relatively slow rate, compared to just a few very fast processors in a modern computer.
There is another very important difference between the brain and computers: tolerance of error.  Computers are derailed if even a couple of bits are flipped during a computation.  Computers expect everything to be perfect.  Error recovery is an afterthought, a protection against what is assumed to be a rare event such as a data read error.
By contrast, our wetware embraces error.  The brain is a cacophony, a battle of competing patterns, from which understanding emerges.  It is a system built on noise, one that expects constant error and recovery from error.  The brain abandons precision and thrives on approximation.
There are examples of embracing noise and error in computing.  Machine translation has made great strides by building likely translation rules from patterns in large amounts of data, then expecting, welcoming, and correcting frequent errors in the translated text.  Recommender systems make no claim to be perfect, only to be helpful, and expect to be judged by how useful their lists of movies or books you might like are compared to the alternatives.  Web search spreads a query across hundreds or thousands of machines and simply ignores those that fail to respond quickly, preferring a good enough answer delivered quickly over a complete answer delivered slowly.
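The web search pattern above is often called scatter-gather with a deadline.  Here is a minimal sketch in Python, assuming a hypothetical `query_shard` function standing in for a real index shard; the simulated latencies and the 100ms deadline are illustrative, not drawn from any real system:

```python
import concurrent.futures
import random
import time

def query_shard(shard_id, query):
    """Hypothetical shard lookup; some shards respond slowly."""
    time.sleep(random.uniform(0.01, 0.2))  # simulated, uneven latency
    return f"results from shard {shard_id} for {query!r}"

def scatter_gather(query, num_shards=20, deadline=0.1):
    """Fan the query out to all shards; keep whatever arrives by the deadline."""
    with concurrent.futures.ThreadPoolExecutor(max_workers=num_shards) as pool:
        futures = [pool.submit(query_shard, i, query) for i in range(num_shards)]
        done, not_done = concurrent.futures.wait(futures, timeout=deadline)
        for f in not_done:
            f.cancel()  # ignore stragglers: good enough now beats complete later
        return [f.result() for f in done]

results = scatter_gather("noise tolerance")
print(f"answered with {len(results)} of 20 shards")
```

The key design choice is that a slow or failed shard degrades result quality slightly rather than stalling the whole query.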
Often we find estimates are good enough.  Recent work on data mining in large clusters, such as Hadoop, has found that orders-of-magnitude speedups can be gained if only an approximate answer -- such as ±1% with 95% probability -- is required.  Some computer security researchers are embracing the idea that classifying malware is not black or white but a gray area, and are starting to use techniques such as slowing suspicious processes rather than expecting to identify and kill all wrongdoers.
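The "approximate answer with a probability bound" idea amounts to answering from a random sample and reporting a confidence interval instead of scanning everything.  A minimal sketch, not tied to Hadoop or any particular system (the dataset and sample size here are made up for illustration):

```python
import math
import random

def approximate_mean(data, sample_size=10_000, z=1.96):
    """Estimate the mean from a random sample, with a 95% confidence margin
    (z = 1.96 is the normal quantile for 95% confidence)."""
    sample = random.sample(data, min(sample_size, len(data)))
    n = len(sample)
    mean = sum(sample) / n
    variance = sum((x - mean) ** 2 for x in sample) / (n - 1)
    margin = z * math.sqrt(variance / n)  # standard error scaled by z
    return mean, margin

# A million synthetic records; scanning 10,000 is 100x cheaper than scanning all.
data = [random.gauss(100, 15) for _ in range(1_000_000)]
estimate, margin = approximate_mean(data)
print(f"mean is about {estimate:.2f} +/- {margin:.2f} (95% confidence)")
```

Because the margin shrinks with the square root of the sample size, a fixed-size sample gives a usefully tight bound no matter how large the full dataset grows, which is where the large speedups come from.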
Of the lessons we take from biological systems, one should be that wetware is sloppy.  And that is okay.  Precision is not required in everything or even most things.  Failures are best handled by expecting them all the time, not treating them as exceptions.  We should expect errors and handle them routinely.  We should stop worrying and learn to love the noise.




from anonymous
what logical mush!

Silicon-ware versus Bio-wetware
Greg Linden
April 28, 2011

1.) "massively parallel processing of the brain."
1a.) Not an 'apples to apples' comparison; there are architectures other than Von Neumann.
2.) "Computers expect everything to be perfect."
So what? To err is human means LOW productivity and efficiency.

3.) "brain is ... battle of competing patterns."
"brain expects constant error and recovery from error."
3a.) Logical fallacy: error classes include intentional, systemic, systematic, and RANDOM.
"brain thrives on approximation"
3b.) No mention whatsoever of analog versus digital computation models (i.e., slide rule versus electronic calculator).

4.) "Recommender systems only expect to be judged by how useful they are ..."
4a.) For example, the Netflix Prize. Not mentioned at all is that a machine-human hybrid beats machine only and humans only.
5.) "data mining has found orders of magnitude speed ups can be gained if only an approximate answer..."
5a.) Even a simple comparison (using zoology, as in the TV series Animal Planet) shows that the human has weak muscle strength in 'arm wrestling' with a gorilla! He doesn't bite like a SHARK does. Darwin favors 'random mutations' and generality.
5b.) However, non-human-scale problems (a nuclear reactor) require MUCH higher 'probability standards' than 'good enough.'

6.) "We should expect errors and handle them routinely."
6a.) Violates 'laws of psychology.' When I drive a car, I do so NOT expecting to crash.
I do NOT overtighten my grip on the wheel from fear and worry; nor do I grasp it loosely expecting 'routine errors.'


Re: 6, 6a above.
Like most drivers, I do expect errors, but not from me, as I am a good if not perfect driver. Now the other drivers, oh my, and road conditions, weather, etc. Errors abound, and to not expect them is to create them. The problem is always, except under very special circumstances, the other.

James Byrd

Comment 3 - the brain fails on all counts all the time, not just at approximation. If you had to process the amount of information coming into a living brain optically and sonically, you'd make a few guesses and toss things you 'supposed' weren't important. Fixating on part of the environment at the expense of other parts is a luxury of civilization - earlier in history it was a great way to end up as something's lunch.
