Massachusetts Institute of Technology adjunct professor Madhu Sudan, a principal researcher at Microsoft Research New England, says he and his colleagues have begun to describe theoretical limits on the degree of imprecision that communicating computers can tolerate, with implications for the design of communication protocols.
"Most of the work is really in trying to abstract, 'What is the kind of problem that human communication tends to solve nicely, [and] designed communication doesn't?'--and let's now see if we can come up with designed communication schemes that do the same thing," Sudan says.
The research builds on work from 2011 that established the minimum number of bits one device would need to send to another in order to convey all of the information in a data file.
In the new research, the sender and receiver have not only somewhat different probability estimates but also slightly different codebooks, yet the researchers were able to devise a protocol that still provides good compression. A standard way for servers to verify that they hold identical copies of a file is to exchange a checksum computed by adding up bits at corresponding locations; however, that method works only if the servers know in advance which bits to add up, and if they store the files in such a way that data locations correspond perfectly.
The new protocol could provide a way for servers using different file-management schemes to generate consistency checks in real time.
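To make the limitation concrete, here is a minimal sketch of the kind of bit-summing consistency check the abstract describes. This is an illustrative toy, not the researchers' actual protocol: the function `bit_checksum` and the choice of positions are assumptions made up for this example.

```python
def bit_checksum(data: bytes, positions) -> int:
    """Sum the bits at a set of agreed-upon positions (hypothetical scheme)."""
    total = 0
    for pos in positions:
        byte_index, bit_offset = divmod(pos, 8)
        total += (data[byte_index] >> bit_offset) & 1
    return total

file_a = b"identical contents"
file_b = b"identical contents"

# Both servers must agree in advance on which bit positions to sum.
positions = range(0, len(file_a) * 8, 7)

# The comparison is meaningful only because both copies store the data
# so that bit locations line up perfectly -- exactly the requirement
# the new protocol is meant to relax.
print(bit_checksum(file_a, positions) == bit_checksum(file_b, positions))
```

If one server stored the same contents with even a slightly different layout, the agreed positions would point at different bits and the checksums would diverge, which is why a layout-tolerant consistency check is useful.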
From MIT News
Abstracts Copyright © 2014 Information Inc., Bethesda, Maryland, USA