One Man’s 50-year History of Moving Computers from Large Rooms into the Clouds
Many of us have met long-time computer types who began working with computers long before PCs appeared in the late 1970s. We recently interviewed a fascinating man who is celebrating 50 years of working with computers. He loves explaining how computers evolved from complex, monstrous beasts run by a certain “intelligentsia” into practical, analytical tools for everyday people – and how computers today help us communicate and work with each other within different, newer social frameworks.
His career led him to some interesting developments in computer science, including early contributions to the beginnings of the Internet. He now sits at the helm of a strategic team tackling some of the most challenging tasks in moving government information into the clouds.
Dr. George O. Strawn is not just another computer/IT official who rose from the ranks – he is one of the most important thought leaders in Federal Government IT circles today. Plus, he loves the National Archives because, he says, “we bring to the table some of the toughest IT problems for all of the federal government that need to be solved in our time.”
Bob Chadduck, Principal Technologist for NARA’s Applied Research Division, recently sat down with Dr. Strawn to get his perspective on the direction of IT for the federal government over the next few years. As I listened and took notes, what emerged was a fascinating trip down memory lane.
Dr. Strawn began working with computers as a student aide in a backroom laboratory at the Atomic Energy Commission in 1961. It was his first exposure to a computer – at that time computers were so rare that he had never seen one. He became interested in how programmers made the computers work. In those days, Strawn said, you were given the reference manual to read and were expected to ask for work only when you were ready to be productive. He worked very hard for about a month to learn and understand a computer language, and eventually wrote some rudimentary programs. “…I’ve not been away from the beast since,” Dr. Strawn exclaimed. “The fact that I’m still here after 50 years means I still like it.”
He came to computers at a critical juncture in the middle of the 20th century. In our last blog post, we discussed early computers (e.g., the Seaton and Hollerith devices, ENIAC/UNIVAC, and FOSDIC) that were advanced machines for their time and were used by the U.S. Census Bureau to speed up the task of tabulating population data.
After World War II, computers were experimental electronic tools that used complex mathematical algorithms and data structures, run by specialized experts who operated these machines in room-sized, mainframe environments. Around that time, a community of computer scholars explored questions and defined theories that became the foundations of Computer Science: What is computable? How can computers be made more efficient? And furthermore, as Strawn put it, “How do we hide these complexities from mere mortals, so the complex stuff is ‘under the covers’?”
By the 1960s, key developments in software and operating systems had improved computing performance, and computers were no longer room-sized operations. In 1971, the Integrated Electronics (Intel) Corporation introduced the first commercial microprocessor “chip,” which put a computer’s central processor on a single piece of silicon and paved the way for the powerful new personal computers (PCs) that emerged later in the decade.
It was the introduction of these chips that made it possible to “hide the complex stuff,” making computers smaller, more portable, and more efficient. Scientists were thrilled about this because they no longer had to “bow before the intelligentsia – the keeper of the keys.”
Strawn described the 1980s as a disjointed period for computing – many federal agencies were using computers as departmental machines, performing operations for specific tasks and functions, or creating work-specific systems that produced incompatible data sets. These “stovepipe” systems were not intended to “talk” with each other, and many were built with technologies or proprietary software and hardware that are now obsolete.
Dr. Strawn’s career eventually led him to the National Science Foundation, where he served as Chief Information Officer (CIO). Around 1985, NSF began innovative work with IBM on a grand project to connect NSF-funded advanced research centers with science communities; together they created a network of supercomputing centers dubbed “NSFnet,” which significantly contributed to the beginnings of the Internet.
In part two of this write-up, we’ll talk about Dr. Strawn’s current special assignment from the NSF: serving the Executive Office of the President on an initiative to move government information into the clouds!
Our interview with Dr. Strawn is the first in our Applied Research interview series, “At the Top of our List: Thought Leaders You Should Know.” Stay tuned, and in the meantime, learn more about Dr. Strawn on the NITRD (Networking and Information Technology Research and Development) website.
Links related to this blog post:
- 4/12/2011 – Looking Back, Looking Forward at NARA Research
- 11/10/2010 – NARA @ NITRD