Catching Up With . . . Rick Landau
Close Encounter with the Computer Industry
by Brooke C. Stoddard '69
Rick was born in Pittsburgh to two old Pittsburgh families. His home was near Carnegie Tech (now Carnegie Mellon) and the University of Pittsburgh. “There were libraries, museums, restaurants – it was a great place for a kid to grow up,” he says. Rick’s mother came from a line of chemists, and Rick and his two brothers had extraordinary chemical paraphernalia in the basement. There were also the old family farms in the country. In summertime, the three brothers helped harvest crops and sold squash and corn door to door. (Photo, right: Rick in the White Mountains.)
By seventh grade, Rick was performing so well in math – calculus at this point – that his teacher suggested a more rigorous school, so Rick was sent off to Mercersburg Academy, just east of the Appalachians in southern Pennsylvania. He continued to do well, but recalls more the exacting standards of grammar, writing, and speaking that the (then) all-male coat-and-tie prep school instilled. He disliked the discipline and chafed at the dress code but understands that he received an outstanding education.
He entered Princeton with the Class of 1967 but in the spring got a bad case of mononucleosis. He missed weeks of classes and even after recovery found it difficult to rouse himself for morning classes. The legendary “Flagrant Neglect” was draped upon him, and he and the university separated.
“The next year changed my life,” Rick says. He had already lined up a summer job at Pitt being an operator in its computer center. He recalls: “I didn’t know a thing about computers but the people there thought a math major could handle it. The job left time on my hands because just watching the lights blink was not taxing. The job turned into a full-time one and some of the faculty gave me programming to do, simple at first and then more complicated. The computers were mostly IBM mainframes using cards and magnetic tape; some cost millions of dollars -- for one of them we bought a disk that cost $1 million! I took some courses at Carnegie Tech and I got better at programming. At that time we used (the now-ancient) Fortran II and machine assembly language. I discovered you could make money programming for people, and that has been my money maker ever since.” [For the uninitiated such as this writer: programming, or writing computer code, uses the same keyboard the rest of us use and has grown increasingly sophisticated over the decades; writing code requires sticking to a strict syntax, as well as a good memory, because a task often involves code thousands of lines long. –bcs]
Rick took up discussions again with Princeton. “They said, ‘We know you can do the work. Convince us that you will take it seriously this time.’ I said I would. I re-entered in the spring semester of 1966 with the Class of 1969.”
By this time Rick understood that the Princeton Mathematics Department was the best in the world and that “likely I would not be a star. I had, though, been working with the celebrated mathematician John Tukey, who was chairman of the Statistics Department at Princeton. Though I took some Electrical Engineering and Economics courses, and a whole lot of Philosophy, I signed up as a Stat major.” Doing so proved fruitful. With Prof. Tukey’s guidance Rick wrote a junior paper on random access devices (disks), for which he conducted interviews with engineers at computer companies in the region. His senior thesis, on the algebraic structure of formal computer languages, was again a winner. And despite what one might infer from his owning a Cannon Club “Retread” T-shirt (still extant in his wardrobe), he departed with good grades.
After graduation he took a job with a computer company across the road from the Princeton Airport. Two months later while he was at work in his two-story building, a private aircraft crashed through the roof and began a conflagration. Fortunately, no one in or out of the plane was seriously injured. Rick was part of a “bucket brigade” rescuing computer tapes tossed out the window before the flames and water could reach them.
The job had a benefit beyond experience in rescuing flammable computer data: it convinced Rick he needed to know more about applied mathematics and statistics. So he went back to Princeton as a PhD student in Statistics. He was classified 1-A by his draft board but had a lottery number that was not called. He worked as a Princeton graduate student for three years and accumulated all the PhD credits but did not complete a thesis because he ran out of money. He took a job with Digital Equipment Corporation (DEC) on the Route 128 Corridor outside Boston.
DEC was a big deal in those days and made minicomputers (there was no such thing then as a personal computer) that cost $50,000–$200,000. Rick worked on programs for applications rather than on operating systems.
Rick worked on transaction processing and database management systems, which were very popular with corporations at the time. The software packages he was developing sold for $25,000 a copy; similar packages, at least as powerful, are now given away for free. Next he worked on printer and video software development. When DEC was sold to Compaq Computer Corp. in 1998, Rick helped develop Remote Management Systems, i.e. software that allowed a computer in one place to operate another computer anywhere else on the internet. Later Compaq sold this division, and in short order fierce cost-cutting forced out the highest-paid employees, Rick among them. He posted his resume and was hired by Dell in Austin.
He worked in Austin for almost twelve years; he was not crazy about Texas (see the portion of Rick’s blog called Tales of Texas at www.ricksoft.com) but adjusted to Austin. His first task was to develop website appliance hardware and software, but the dot-com bust of 2000 tanked that business. He worked on creating standards for printers and projectors, then more remote management systems, ending with managing “virtual machines,” i.e. software that lets operating systems run on top of other operating systems, thus making web hosting and other tasks easier and more economical for the servers and the website owners.
Eventually he and his wife Beverly Thornton (married in the Princeton Chapel in 1977, with vows renewed in the same place 25 years later in 2002 – photo, left) retired to Cambridge, where they live near Harvard Square with a terrific view of the Boston skyline. Beverly, an MBA, worked in business development and managed several companies. High on their retirement list was taking advantage of the extended educational opportunities of Cambridge. Both began to attend all manner of lectures at nearby Harvard. Rick joined the Harvard Institute for Learning in Retirement (HILR). He studied not only biology (a long-time interest) but also anthropology and mathematics. He has also taught courses for HILR on dystopian movies, the workings of the internet, future technology, and cognitive biases.
Besides these activities he programs (gratis) for Harvard professors, the Kennedy School, MIT Libraries and more. (His wife, ever the MBA, thinks he ought to get paid for it.)
Below are some of Rick’s thoughts stemming from his close encounters with the computer/tech industry:
Most significant – and sometimes serendipitous -- advances:
PC Revolution: Computers, even most minicomputers, dwelled on huge racks in air-conditioned rooms until IBM offered the PC in 1981. Astonishingly, IBM did not make Microsoft sign an exclusivity agreement, so anyone could use MS-DOS, and the notion of an "IBM-clone" personal computer spread very quickly.
The Internet: It was a kind of accident. The creators had no idea how large and important it would become. It was also an unusual and maybe unique idea: if we encounter alien life, it’s possible they would not have developed something like this.
Public Key Cryptography: Unfortunately, the people who developed the internet believed its users would be like themselves – trustworthy people in search of knowledge who played by the rules. They did not seriously consider security. Still, the cryptography developed later was good enough that businesses could get involved, which morphed the internet into a tool for commerce.
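[For readers curious how "public key" ideas work, here is a toy sketch of Diffie-Hellman key agreement, one of the foundational public-key techniques. The parameters are deliberately small illustrative values, not what real systems use; actual e-commerce relies on vetted cryptographic libraries and carefully chosen groups. –bcs]

```python
import secrets

# Public parameters, known to everyone (toy values for illustration only):
P = 2**127 - 1   # a Mersenne prime; real systems use far larger, vetted groups
G = 3            # a small public base

# Each party picks a private number and publishes only G**secret mod P.
alice_secret = secrets.randbelow(P - 2) + 2
bob_secret = secrets.randbelow(P - 2) + 2
alice_public = pow(G, alice_secret, P)
bob_public = pow(G, bob_secret, P)

# Combining your own secret with the other side's public value yields the
# same number on both ends -- a shared key that an eavesdropper, seeing
# only the public values, cannot easily compute.
alice_key = pow(bob_public, alice_secret, P)
bob_key = pow(alice_public, bob_secret, P)
assert alice_key == bob_key
```

Key agreement of this general kind, layered into protocols such as HTTPS, is what made the open internet trustworthy enough for commerce.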
Linux: Unix was developed for (mainly DEC) computers in the 1970s; it was proprietary. Then a very smart programmer, Linus Torvalds, developed Linux, based on Unix, but made it open source and basically free. This allowed for tremendous expansion; about two-thirds of all the world's servers now use Linux. The software is free – the big money is in consulting with companies on how best to use it.
Desktop Publishing: People did not really expect it – that you could write and change document layout and fonts at will on the screen in real time (What You See Is What You Get – WYSIWYG), and lay out pages the way you want. It is now universal.
GUI -- Graphical User Interface: The big leap was that a GUI presented users only with choices that were valid – the user would pick from a menu or press a button. Previously, users typed commands in a computer language, and a single typo would probably get you a message reading “Syntax Error.” The GUI eliminated that by allowing only “clicks” that would work, a huge advance in usability.
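[The usability difference can be sketched in a few lines of Python. The two-command system below is hypothetical, invented purely for illustration: a freely typed command fails on any typo, while a menu offers only actions that are guaranteed to work. –bcs]

```python
# A hypothetical two-command system, for illustration only.
ACTIONS = {"print": "printing...", "save": "saving..."}

def run_typed_command(command):
    """Command-line style: parse freely typed text; any typo fails."""
    verb = command.strip().lower()
    if verb not in ACTIONS:
        return "Syntax Error"        # the dreaded message
    return ACTIONS[verb]

def menu_choices():
    """GUI style: offer only the valid actions, so a pick cannot fail."""
    return sorted(ACTIONS)

print(run_typed_command("pritn"))    # a typo: prints "Syntax Error"
for choice in menu_choices():        # every menu pick succeeds
    print(run_typed_command(choice))
```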
Web Access to Everything: The old model was client-server, a closed relationship that required changes -- a client program -- on the user's system. This has evolved to web-based applications where the user needs only an internet browser, which is now free and included on every platform from phone to desktop.
Dangers and Advice:
Virtual currencies: These could ruin civilization. They're anonymous and you can’t trace them. The IRS used to complain about assets going overseas and thus out of control. Digital currencies are worse than overseas because they are “nowhere.” You can't "follow the money."
Consolidation of Commerce: FAANG (Facebook, Amazon, Apple, Netflix, Google) constitutes a threat [cf. also Jonathan Taplin ’69, a notable FAANG critic]. Consolidation is worse than in the era of the Robber Barons. These companies gobble up too much, integrate vertically and horizontally, and are the death of privacy.
Internet of Things: The idea is that everything is connected by wire or Wi-Fi, appliances and car components that talk with one another. Picture being able to turn on your oven while driving home so dinner will be hot when you arrive, etc. The trouble is security. We’ve already seen car computer systems hacked. Disaster awaits unless security is improved.
Kids Must Learn Coding: Tell your grandchildren to learn some coding. They will need it the way we needed typing. Future jobs are going to involve the ability to search through and examine data, and that requires knowing at least a bit of coding.
Early Adopter Syndrome in the U.S.A.: We’ve suffered occasionally because we adopted technology before it was fully developed. The Europeans were able to use more advanced television standards because they had a smaller installed base of televisions, whereas in the U.S. there were so many units already in homes that we could not easily convert – though we finally did get into HD. Another example: The U.S. plunged into the internet without properly working out security. That’s why emails are hacked, identity theft is common, and computers get viruses.
Virtual Reality vs. Augmented Reality: The VR “headsets” will not become universally common: the technology is too expensive, too geeky, and makes you clumsy. Augmented Reality (AR) has more of a future – it delivers information that is useful. Hold up a smartphone to a retail street and “thought balloons” at each door will show what the store sells, its hours of operation, and more. Or read in your glasses the Wikipedia biographies of the people in the meeting with you. But there is a danger: groups, even pseudo-governments, will, for a price, offer access to vast amounts of information. Hoarding information and doling it out selectively will be a danger in politics, and pseudo-government groups will not recognize established national borders.
More of Rick’s commentary and reflections are available at his blog: www.ricksoft.com