C.A.R. Hoare’s viewpoint "Retrospective: An Axiomatic Basis for Computer Programming" (Oct. 2009) reminded me of a saying attributed to both Jan L.A. van de Snepscheut and Yogi Berra: "In theory, there is no difference between theory and practice. But, in practice, there is." I recall as an undergraduate the elegance of using induction to prove that a recursive program was correct. Induction and recursion were two sides of the same coin, one theory, the other practice.
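The induction/recursion duality can be made concrete with a minimal sketch (my own illustrative example, not one from Hoare's paper): the structure of a recursive function mirrors, line for line, the structure of the induction proof of its correctness.

```python
def factorial(n: int) -> int:
    """Compute n! recursively; the proof of correctness is by induction on n."""
    # Base case of the recursion -- base case of the induction: 0! = 1.
    if n == 0:
        return 1
    # Recursive case -- inductive step: assuming factorial(n - 1)
    # correctly returns (n - 1)!, this line returns n * (n - 1)! = n!.
    return n * factorial(n - 1)

print(factorial(5))  # → 120
```

The base case discharges the base of the induction, and the recursive call plays the role of the induction hypothesis: theory and practice as two sides of the same coin.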
I’ve been a software engineer for almost 25 years. Though I’ve used axiomatic techniques when designing, implementing, and debugging my code, no project as a whole could possibly rely on them. Any form of axiomatic verification requires a rock-solid foundation on which to argue the correctness of an implementation. Code with well-defined functionality (such as data-structure manipulation) can be verified axiomatically, but as a project’s size and complexity grow, its behavioral specification becomes less rigorous. Testing doesn’t verify that a large project’s functionality is correct, only that it matches some interpretation of what that functionality should be.
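Data-structure manipulation is the kind of well-defined functionality that yields to axiomatic reasoning. As a hedged sketch (my own example, with preconditions, an invariant, and a postcondition expressed as runtime assertions rather than formal proof obligations):

```python
def insert_sorted(xs: list, v) -> list:
    """Insert v into an ascending sorted list, preserving order.
    Precondition:  xs is sorted ascending.
    Postcondition: the result is sorted and contains xs's elements plus v."""
    # Precondition check (axiomatically: the antecedent of the Hoare triple).
    assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1))
    i = 0
    # Loop invariant: every element of xs[:i] is <= v.
    while i < len(xs) and xs[i] <= v:
        i += 1
    result = xs[:i] + [v] + xs[i:]
    # Postcondition check (the consequent of the triple).
    assert all(result[j] <= result[j + 1] for j in range(len(result) - 1))
    assert sorted(result) == sorted(xs + [v])
    return result

print(insert_sorted([1, 3, 5], 4))  # → [1, 3, 4, 5]
```

The assertions hold only as long as callers respect the precondition, which is exactly the limitation the letter describes: the argument is sound on a rigorous foundation and erodes as specifications do.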
Customers are rarely able to define what they want but tend to know it when they see (or don’t see) it. This makes it difficult for programmers to create complete, consistent, unambiguous requirements. Early in my career, I used highlighter pens to extract requirements from whitepapers. Fortunately, requirements today are enumerated, version-controlled, and placed within context via use cases. Still, there are too many details and too few systems-engineering resources to document every behavior, condition, and exception.
Since an implementation must be able to handle every case, developers must make assumptions that cannot always be confirmed. Axiomatic confirmation and unit testing by developers verify code only as long as those assumptions hold true. Verification is also needed from independent testers who can ignore the implementation but must be sure the product matches their interpretation of the requirements. For this reason alone, software is still as much an art as it is a science.
Jim Humelsine, Neptune, NJ
DIY Biological (Nervous) Systems
Congratulations to Corrado Priami for sharing his insight into biological simulations in his article "Algorithmic Systems Biology" (May 2009). My own interest in emulating biological systems has yielded similar conclusions. In addition to computerized algorithmic representations in software, I’ve designed analog component circuits and linear coprocessors using operational amplifiers, including integrate-and-fire artificial neurons based on Hodgkin’s and Huxley’s research, synthetic emotion-processing neurons using sum-and-difference operational amplifiers, and artificial neural networks for machine vision. These components add instantaneous analog parallelism to the digital computer’s software concurrency, as Priami said.
For the past 10 years I’ve been developing a fairly elaborate nervous-system emulator that embodies many of Priami’s concepts. Originally designed as a robotics control system written in a multitasking version of Forth, the project has grown into a modular, extensible, open-systems design embodied in a multiprocessor network that emulates the major functions of the human nervous system. Included are interchangeable hardware/software components, a socketed software bus with plug-and-play capability, and self-diagnostics. The computer hardware is based on IEEE P996.1 bus cards; the operating system uses IEEE 1275-1994 standard software. The overall system features object-oriented design techniques and programming. I’ve also created a machine-independent, high-level, byte-coded script command language to manage it all.
Emulated neural-anatomical structures include cortex, brain stem, cerebellum, spinal cord, and autonomic and peripheral nervous systems, along with motor, sensory, auto-regulatory, and higher-cognitive AI behavior and synthetic emotions. Emulated body functions range from hormones and drugs acting on cell membranes to high-level responses.
As part of the IEEE 1275 standard, Forth helped me create a source-code library of individually compilable nervous-system components, per Priami. The library includes human childhood development milestones, epinephrine and oxytocin hormone functions, a pain mechanism and narcotic effects, the fear mechanism, and retrograde neuronal signaling via the endocannabinoid system. Recent enhancements include a form of autism based on a defective oxytocin receptor, the fibromyalgia syndrome (with simulated viral activity, immune-system responses, and antiviral and antibiotic effects), and Bayesian probabilistic functions.
The system reflects intentional software and hardware flexibility. Using tiny six-pin eight-bit PIC10Fxx series microcontrollers, I’ve designed 35 different digital McCulloch-Pitts and analog Hebb artificial neurons. I also added eight-core 32-bit Parallax processors for coordinating brain stem sensorimotor, cerebellar, and low-level cortical activities. Moreover, the system can extend its original Forth-based, byte-coded AI scripting language via genetic algorithms to provide a form of machine learning and execution. It is also capable of examining its own internal variables, short- and long-term memory, knowledge base, and preferences profile to provide a limited form of self-awareness and personality expression.
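A digital McCulloch-Pitts unit of the kind the letter mentions can be sketched in a few lines (a generic textbook formulation; the weights and threshold below are my own illustrative values, not those of the PIC10Fxx designs):

```python
def mcculloch_pitts(inputs: list, weights: list, threshold: float) -> int:
    """Binary threshold unit: fires (returns 1) iff the weighted sum
    of its binary inputs meets or exceeds the threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# A two-input AND gate realized as a single McCulloch-Pitts neuron.
print(mcculloch_pitts([1, 1], [1, 1], 2))  # → 1
print(mcculloch_pitts([1, 0], [1, 1], 2))  # → 0
```

A Hebb-style unit differs in that the weights are not fixed but strengthened when input and output are active together; the all-or-none threshold behavior above is what makes the digital variant a natural fit for tiny microcontrollers.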
I look forward to more such intelligent machines created through the kind of algorithmic systems biology explored by Priami.
Paul Frenger MD, Houston, TX