Looking back, my 50+ years of software engineering fits nicely into five phases:
- Bitten by the bug
- Foundational work
- Sun Microsystems
- Cloud computing
- Smart cities
Let’s unpack them.
Bitten by the bug
At RGS (the Royal Grammar School, High Wycombe in England) I studied economics, maths, and physics, and had dreams of a career writing for The Economist. In 1968, at the age of 17, I left RGS with four A Levels, and began a “gap year” with the United Kingdom Atomic Energy Authority (UKAEA) in the Programmes Analysis Unit (PAU) at Harwell. Our unit was responsible for cost-benefit analysis of UK government-funded R&D programmes, and for a prospective graduate in econometrics it seemed like a natural fit. I started out building and running ROI (Return-on-Investment) S-curve models on a Wang programmable desktop calculator. I quickly realized that I needed more computational power, found the Computer Centre housing an IBM 360/65 next door, taught myself Fortran from the McCracken book, learned to key-punch cards … and my life changed. By the time I left Harwell to go to university, I was working almost full-time in the Computer Centre, helping physicists with their reactor simulation models and debugging OS/360 JCL. Fun fact: I was, for a time, custodian of a magnetic tape containing the database of every computer in the United Kingdom.
On arriving at Essex University in 1969, I immediately switched my major from Economics to Computer Science. It was an interesting era, because quite a few of the students had as much experience of the subjects as the lecturers, and some of us got credit for innovative side projects rather than sitting through the introductory courses. I had two summer jobs during that time: a gig at the Ministry of Technology running Leontief Input-Output models for the UK economy, and a longer stint back at Harwell, working on reactor control systems and associated software tools. This practical experience was integrated into our curriculum, which included programming, numerical analysis, computational algebras, operating systems, and hardware projects. The latter were really hands-on, requiring soldering, wire-wraps, and circuit design using LS (low-power Schottky TTL) integrated circuits. When we graduated in 1972, I believe we were the first class anywhere to earn the degree of “B.Sc. (Comp.Sci.)”.
Within days of my graduation, I joined the University of London Computer Centre as a systems programmer, supporting their CDC 6400, 6600, and 7600 supercomputers. My primary job was to review operating system patches (published every month in booklet form) and install and test the ones that were relevant to our configurations. To do this, I had an hour of scheduled maintenance time every day, and on those days when there were no patches scheduled I took the opportunity to run some private experiments on memory management algorithms. (I didn’t fully appreciate the luxury of personal supercomputer access until many years later.) I was obviously still thinking of computer science from an academic perspective, and so in 1973 I applied for a Science Research Council grant, moved to the University of Newcastle upon Tyne, and started work on a Ph.D. Picking a thesis topic was complicated, not helped by my supervisor’s abrupt departure, but I eventually settled into a project on operating system command languages. This was basically a language design exercise, applying ideas from Simula-67, Forth, and POP-2 to the problem of modelling and managing system resources. (The solution I came up with is oddly similar to Python!) I never actually finished the Ph.D., because when I left Newcastle in 1976 with my thesis 98% complete, I joined a startup which consumed all of my time (and where the work was actually more interesting than my thesis).
My first startup was a small contract programming shop called Pentagram. I spent three years working on everything from PDP-11 device drivers to writing a complete database management system. The DBMS, which I called Maestro, was enormous fun. The core idea was that any numeric field could hold either a value or an expression which would be evaluated at run time, a bit like a spreadsheet. (Remember, this was 1977.) It was written in Fortran 77, had its own application language called Music, and ran on a Honeywell mainframe in New York that we accessed through the transatlantic EPSS X.25 packet switched network. Sadly, a reorganization at the customer led to all of the UK-based projects being handed to IBM, so Pentagram folded. (I later discovered that our main client, a multinational consumer products company, was using the dynamic capabilities of Maestro to carry out legally questionable transfer tax arbitrage. Oh well….)
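The core Maestro idea — a numeric field that holds either a literal value or an expression evaluated when the field is read — is easy to illustrate in modern terms. Here is a minimal sketch in Python (the original was Fortran 77, and all the names below are hypothetical, chosen purely for illustration):

```python
# A record whose fields may be plain values or expressions (callables)
# evaluated lazily at read time, like spreadsheet cells.
class Record:
    def __init__(self, **fields):
        self._fields = dict(fields)

    def __getitem__(self, name):
        value = self._fields[name]
        if callable(value):          # expression field: evaluate on read
            return value(self)
        return value                 # plain value field

    def __setitem__(self, name, value):
        self._fields[name] = value

# 'total' is stored as an expression over the other fields.
order = Record(price=100.0, qty=3, total=lambda r: r["price"] * r["qty"])
print(order["total"])   # 300.0
order["qty"] = 5
print(order["total"])   # 500.0 -- recomputed at read time
```

The key design point is that callers never need to know whether a field is stored or computed; updating `qty` automatically changes what a later read of `total` returns.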
In 1979 I joined a computer company called CMC, the European arm of the US database engine company Microdata. CMC’s flagship product was Sovereign, a range of data entry systems with terminals and servers based on AMD bit-slice hardware. I was hired to transform Sovereign into a LAN-based workstation architecture, and wrote a complete distributed operating system for it. However, family circumstances meant that in 1981 we moved to the United States “for a few years”, and I joined Raytheon Data Systems (RDS) just outside Boston.
RDS had two highly successful lines of business: ruggedized minicomputers used for industrial and field operations, and IBM-compatible terminal clusters for the airlines. They also owned Lexitron, a pioneer in word processing systems. I started working on the RDS-500 minicomputer, but quickly shifted to become the software architect of a fault-tolerant messaging switch that RDS had pitched to SITA, the air transport telecommunications provider. It was a pretty advanced system for 1982, based on Motorola 68000s, Multibus, and Unix V7. My key contribution was to employ coax-based Ethernet as the multiprocessor interconnect. After that project I ran advanced development (“IRP”) for RDS, and doubled as director of PC software. (For RDS in 1983, “PC software” meant Multiplan plus CP/M running on their Lexitron word processors.) It became clear that Raytheon was going to focus on its defense business, and in 1983 I got out just before Raytheon sold off the RDS operation to Telex. After interviewing at Prime Computer, I joined my second startup, Mosaic Technologies.
Mosaic was building a graphics workstation for engineering and CAD, aiming to compete with Sun and Apollo. They were pinning all their hopes on the National Semiconductor NS16032 CPU, an incredibly ambitious design – basically, a VAX on a chip – which proved to be so buggy that we never had a chance to succeed. While I worked on the distributed OS (BSD Unix plus Locus), the hardware limped towards an ignominious demise, and the inevitable layoffs arrived. And that’s how, in the summer of 1985, I arrived at Sun Microsystems with a fistful of resumes.
I was the first engineer hired into the fledgling East Coast Division of Sun, and the five of us (VP, admin, HR, manager, engineer) camped out in a room in the sales office in Waltham, Massachusetts. There were only three chairs, so two of us were always on the road. My mission was to “connect PCs to this NFS thing they’re doing in Mountain View”. Within the first six months we had created and demonstrated PC-NFS, a Network File System (NFS) client for MS-DOS, with a UDP/IP stack, written in 8086 assembler, and packaged as a 64KB DOS device driver. Unlike all the other NFS implementations, which were ported from the SunOS code, I based PC-NFS on the formal protocol specification. This led to several interoperability issues, all of which required the other vendors to fix their code to conform to the spec! That was the beginning of a product line which sold millions of units over the next ten years.
I spent twenty years at Sun, reinventing myself every three or four years. I headed up software for the Sun 386i workstation, wrote an email client for PC-NFS, developed X/Open standards, was CTO for the Networking Software division of SunSoft, and returned to the PC-NFS group as director of the business unit. As a Sun Distinguished Engineer (DE), I collaborated with my peers throughout the company. I chaired several Technology Leadership conferences, recruited new DEs, and did a lot of mentoring. I also worked on a few extra-sensitive projects, including the controversial plans for Sun to align or merge with Apple.
Perhaps the most satisfying effort was Windows Sockets, a multivendor initiative to create a standard networking API for Windows PCs. At that time, networking was an add-on to Windows, and there were many incompatible products. After an enthusiastic but somewhat chaotic kick-off meeting in 1991, five of us got together to draft the WinSock 1.0 specification, which we published in 1992. A year later, Marc Andreessen and Eric Bina at NCSA ported their Mosaic web browser to WinSock, and the rest is history.
In 1998 I shifted over to Sun Microsystems Laboratories, and spent the next three years working on distributed Artificial Intelligence – what were known as “multi-agent BDI systems”. I then spent a year investigating the viability of a concept called G2, which we would now call a PaaS (Platform-as-a-Service) for Java. Meanwhile, Sun had embarked on a series of corporate acquisitions, and I suggested that there was a critical need for a formal program to welcome and integrate the technical leaders from the acquired companies into the Sun technical community, particularly the “DE mafia”. As a result, I spent the next couple of years on the road, from India to Colorado, working to ensure that the key talent stayed with Sun. Ironically, as that role wrapped up in 2005, Sun was suffering from the “post-dot-com” blues, and I decided that it was time for me to leave.
After twenty years working at a single company, the next fifteen years were a series of shorter gigs. I relocated from the Boston area to Seattle and spent three years at Amazon, working on the ecommerce platform. I then moved down to Silicon Valley and joined Huawei, leading a team that was developing a private cloud for Huawei to offer to its telco customers. This involved numerous visits to China (and one round-the-world trip), and it was an eye-opening experience. But corporate politics at Huawei were incredibly complicated (everybody seemed to have two bosses), and in a moment of frustration I allowed myself to be lured away by an offer from Yahoo.
I landed at Yahoo in September 2010, in the middle of the chaotic CEO transition from Bartz to Morse. The project was fascinating: to create a private Yahoo cloud that could eventually replace their vast bespoke IT infrastructure. The technical staff were outstanding, I learned an immense amount about networking at scale, and I was really proud of the systems architecture and engineering plans that we developed. Alas, the executive politics at Yahoo were even worse than at Huawei, and after a year I quit in disgust.
I spent 2012 as an Entrepreneur-in-Residence at US Venture Partners and doing a little consulting. I did come up with an interesting business plan for a cloud “roll-up” but we couldn’t get it funded, which was probably a good thing. From USVP I joined one of their portfolio companies, Vyatta, which was being acquired by Brocade. It was at this point that I became deeply involved in the OpenStack cloud software community, helping Vyatta and then Brocade to adapt their products for the OpenStack architecture.
At the end of 2013, I arrived at Cisco to become the lead architect on Intercloud, a turnkey, white-label, multi-tenant, multi-partner, hybrid cloud computing service (whew!) for telco customers. This role involved even more commitment to the OpenStack community, including an interesting project on virtual sub-clouds which drew on my work at Sun, Amazon, Huawei, and Yahoo. Although the work was technically satisfying, the whole program suffered from two fatal flaws: most of the customers seemed more interested in selling their technology back to Cisco, and the Cisco sales force had no interest in selling the product. Oops.
In November 2015 I was unsure what was coming next. I was 65 years old, and I’d assumed that the Cisco gig would run for a few years and that I’d then retire (whatever that meant). Instead, things got remarkably interesting. A VC friend suggested that I become the CTO at a startup called Sensity Systems where he was an investor. Sensity was capitalizing on the movement to convert streetlights to LEDs by creating smart pole-top systems that could not only manage the lighting but also exploit a range of sensors and cameras to provide smart city services. In particular, they had developed a machine vision system that could be used for parking, traffic management, public safety, and so forth. The back-end software systems were cloud based, but the engineering leadership had a hardware background, and they needed help with their distributed systems architecture. I jumped at the opportunity, and immersed myself in this completely new business and technologies. We made good progress, and just eleven months after I joined we were acquired by Verizon.
Verizon bought Sensity as part of a strategy to build up a strong presence in the Smart City space, which they hoped would generate new use cases for 5G and smooth the politically sensitive roll-out of 5G infrastructure. The Sensity CEO, Hugh Martin, and I had some well-developed ideas about how to build a Smart City platform, but we both knew how big companies worked, and so we moved to New Jersey to work at the Verizon headquarters. It was a fascinating time, and I became deeply involved in the Smart City movement, working with the companies and standards organizations involved in critical urban infrastructure. Although our vision was well received in conferences and trade shows, the internal Verizon business plan became entangled in office politics, particularly the battle about who was going to become the next CEO at Verizon. The vision and goals shifted, progress slowed, and I decided to leave.
So in October 2018, two years after the Sensity acquisition, I moved back to Silicon Valley, and joined Lacuna, a stealthy startup that had been put together by a couple of my friends from Sun and Sensity. The details of my work there are still mostly confidential, but my job title was Chief Open Source Officer and much of my first year was spent setting up an open source software foundation for urban mobility systems. I was also de facto CISO, and created the cybersecurity program for the company. (I’m still spending a lot of time thinking about the challenge of cybersecurity in a small, agile startup.) But 2019 and 2020 were complicated years for venture funded startups, with the double whammy of the WeWork scandal and the pandemic. In June 2020, with my 70th birthday approaching, and after over fifty years of software engineering, I retired.
A fascinating review of your technical career, Geoff. I joined Sun UK in 1994, initially as a Sun Services developer support engineer, migrating to SunSoft 18 months later to concentrate on PC-NFS, Plum, Wabi, and PC-Solaris interconnectivity, and I ventured on many occasions to the Sun office in Burlington, MA and to the West coast. I was made redundant following the Oracle takeover, after 15 years at Sun; Oracle weren’t keen on the cloud work my team was running with. Following a few years with Virgin Media working on transit and peering, and then a spell teaching Royal Navy engineers in the IT sphere, I am now training flight data processing using both old 1960s systems and newer technology, incorporating virtualisation on core data centres, AI flight vector processing, and myriad other technologies.
The key takeaway I got from your article is that, irrespective of age, your skills and knowledge ensured you remained a desirable asset, a belief I share and one I encourage in former and existing colleagues. I have recently turned 64 and still have the desire to learn and share.
Thank you for sharing and all the best in your retirement.