My career started when I joined Kodak in the U.K. in 1959, where I was taught to program by Conway Berners-Lee, father of Sir Tim Berners-Lee, inventor of the World Wide Web. At that time, we knew of only about 300 stored-program computers in the world, although there were probably 300 more in ‘secret’ places like the military or government.
By 1963, computing in the U.K. had surpassed the rest of the world. The British government decided to build the world’s most powerful computer: the mighty Atlas, widely regarded as the first computer with an operating system. The manufacturer, Ferranti, couldn’t get it to work and asked would-be buyers to send their best programmers to help. Kodak sent me.
The sales price was about £3 million (about $150 million USD today). The sales manager told his team that if Ferranti sold three Atlases to the Russian government, it would meet Russian computing needs until the year 2000. It was the time of MAD (Mutually Assured Destruction), when computers guided nuclear weapons and supported the space race to put a man on the moon. Atlas was less powerful than the Mac I’m using today.
As Special Ambassador for Univac, Grace Hopper, whose work laid the foundation for COBOL (the most widely used programming language from the 1960s to the 1980s), toured the world giving lectures about the industry. I had the pleasure of escorting her to various British Computer Society functions whenever she visited the U.K.
After a dinner in 1973, she asked if we would like to see the new computer Univac had loaned her. She dived into her handbag and brought out an object the size of a cigarette packet. We all stared, amazed, as she opened the box and picked up an even smaller object. Grace told us the impossibly small computer held a 64-kilobyte COBOL compiler. We wanted to see it in action, so someone brought over a teletype with a printer. From the side of the device, Grace pulled out a fine cable the width of a human hair, along with a transformer and an adaptor the cable plugged into as its power supply.
The group watched as Grace ran a simple COBOL program. We didn’t know it then, but we had just witnessed an early silicon chip-based computer. In Grace’s opinion, the mainframe was dead and would be replaced by ‘multitudes of minicomputers’ that would be linked by telephone lines, all working together. It was quite possible she had seen a demonstration of the U.S. Department of Defense’s ARPANET, the precursor of the Internet.
In 1974, I moved to Australia. I was the only woman executive at AMP, then Australia’s largest company in terms of assets. Part of my job was to buy all the hardware. Computers still used core memory, which was sensitive to heat fluctuations. Our Univac salesman told me there was a new type of memory, Metal Oxide Semiconductor (MOS), that was not affected by heat fluctuations. It cost $1.5 million a megabyte. My boss was flummoxed, not only by the price but also by the size of the memory, saying we already had two of the largest computers in the southern hemisphere, each with half a megabyte. I explained that the system we were developing was to run real-time database systems, and more memory would be a distinct advantage. I was told that if I could get the salesman to bring the cost down to $1 million, we would buy it.
I wish I had saved the invoice, because it would be of great historical and hysterical value now. Today we refer to MOS memory as computer chips, and a megabyte costs a fraction of a cent. Although manufacturing processes have changed considerably, today’s chips are not fundamentally different from that first megabyte I bought for AMP in the mid-1970s.
And so it goes. Always new things to learn, always tremendous advances in technology accompanied by drops in cost. Always exciting new applications.
It’s a wonderful career, especially for women. Come and join us.