I was recently asked what I thought were the most significant points in the history of computing. It made me realize that most people coming into high tech don’t have a basic understanding of the history of computing, which is important for understanding future trends, how products work, and the relationships among computing elements in the technology stack. Therefore, here are my top 37 milestones, with the last being an obvious plug for Devo, which is going to change how the world deals with machine data, automation and machine learning. If you think of anything I missed, or you disagree with my choices, I’d love to discuss.

  1. Since you can’t have a computer without 1s and 0s, the invention of the number zero is significant. You can argue about whether this happened in Egypt, Mesopotamia or India. I opt for India, as Indian mathematicians were the first to treat zero as a number and had been using decimal place-value notation since around the year 595.
  2. Blaise Pascal builds the Pascal Adding Machine – the first workable calculator. This is more significant than Napier’s bones, the development of logarithmic tables or some mechanical devices, like the watch or the quadrant, because the device does the computing. (1642 – France)
  3. Gottfried Leibniz perfects the binary number system. (1679 – Germany)
  4. Computers don’t work without electricity. Many inventors worked on electricity, but Ben Franklin’s discovery in 1751 should make the list. (1751 – USA) Of course, until Nikola Tesla invented AC motors/generators, no full-scale electrification was possible (1886-1888, USA).
  5. Joseph Jacquard builds his textile loom using the concept of a punch card to weave intricate designs into cloth. The Jacquard Loom is arguably the foundation of the programmable machine. (1801 – France)
  6. Charles Babbage had the idea for the Analytical Engine, and although he didn’t ultimately build it, it set the foundation for all modern computers. Augusta Ada Byron, Countess of Lovelace, who worked with him, proposed using punch cards like those used in Jacquard’s loom to make the Analytical Engine programmable, and is credited with proposing the first algorithm. (1833 – UK)
  7. George Boole creates Boolean algebra, laying the foundation for Information Theory. This is where “and,” “or” and “not” come into mathematical formulas. Boole’s system was later taken up by Charles Sanders Peirce, who developed the idea that its logic lends itself to electrical switching circuits. It would be 50 years before Bertrand Russell presented the idea that this is the foundation of all mathematics, and another 30 years until Claude Shannon incorporated the symbolic “true or false” logic into electrical switching circuits. (1854 – UK)
  8. Thomas Edison observes thermionic emission (the “Edison effect”), the basis of the vacuum tube, which, in turn, becomes one of the building blocks of the entire electronics industry. When the vacuum tube is invented, in 1904, it enables amplified radio and telephone technology. (1880 – 1904, USA/UK)
  9. You could argue the TV gets its roots from fax transmissions back in 1843, but when amplification made television practical, Scottish inventor John Logie Baird employed the Nipkow disk in his prototype video systems. (1925 – UK)
  10. Alan Turing was an amazing guy. Turing provided the basis for the development of automatic programming, demonstrating that a single universal machine can simulate any other computing machine. If it wasn’t for him, the Bombe, the electromechanical machine used at Bletchley Park to break Germany’s Enigma cipher, would not have been built. And although the dream of artificial intelligence was first thought of in Indian philosophies such as those of Charvaka, dating back some 3,500 years, Turing championed the notion of AI for computers, leading to the Turing test (1950). (1936 – UK)
  11. John Bardeen, Walter Brattain and William Shockley invent the transistor at Bell Labs. (1947 – USA)
  12. An Wang invents magnetic core memory. He didn’t build it himself; he sold the patent to IBM for $400K to get the funds to start his company, and the idea was not practical until Jay Forrester at MIT enhanced it by arranging the cores into a matrix. This opened greater practical applications for the technology, and core memory went on to displace earlier stores such as Freddie Williams’ cathode-ray-tube memory. (1949 – USA)
  13. Grace Hopper was a star. She pioneered the idea of using higher-level computer languages and built the concept of a compiler, so we could program in words, not numbers. This gave rise to COBOL, one of the first languages designed to run on multiple types of computers. (1952 – USA)
  14. Remington Rand releases the first example of free and open-source software with its A-2 system, developed at its UNIVAC division. Without this example it’s doubtful IBM would have led the market in releasing all of its mainframe code in open source, which would have slowed the innovation of the entire software/technology market. (1953 – USA)
  15. The airline industry develops the Semi-Automated Business Research Environment (SABRE) with two connected mainframes, the start of computer networking. This project borrowed some logic from the military SAGE project, but it is nonetheless the foundation of networking, which really took off after Robert Metcalfe created Ethernet for Xerox. The current internet gets its roots from ARPANET, which went live in 1969 and later became the first major network to adopt TCP/IP, making it the ancestor of today’s internet. (1953 – USA)
  16. Arthur Rock is credited with coining the term venture capital, which makes funding of technology ideas possible and launches the business model of the computer age. If it wasn’t for him, Robert Noyce, Gordon Moore and the rest of the “traitorous eight” could not have left Shockley Semiconductor to form Fairchild Semiconductor, which in turn spawned AMD and Intel. (1957)
  17. John F. Kennedy gives the “I believe we should go to the moon” speech, which puts funding and research into computer science. (1961 – USA)
  18. The database is critical to today’s computing environment. The first reference I can find to a commercial database is General Electric’s release of IDS. (1963 – USA) Relational databases came later – Ted Codd’s 1970 paper, “A Relational Model of Data for Large Shared Data Banks,” was seminal. No mention of relational databases is complete without a hat-tip to Mike Stonebraker. Both Codd and Stonebraker are recipients of the Turing Award.
  19. IBM releases the IBM System/360, the first computer system to offer the concept of modular, compatible, general-purpose computing. This led to the expansion of computer systems and the foundation of the personal computer market. Some would argue that the DEC PDP-11, introduced in 1970, really led to the PC market. The PDP-11 was just easier to program, had general-purpose registers and interrupts, and could be manufactured with semi-skilled labor. (1964 – USA)
  20. The first concept of a mouse and a graphical user interface is demonstrated by Doug Engelbart. It wasn’t until roughly 10 years later, however, that Xerox PARC developed the Alto, whose ideas were later stolen by Microsoft and Apple. (1964 – USA) Ted Nelson’s Project Xanadu (begun in 1960) came up with hypertext, a precursor to the WWW and in many ways superior to it, with bi-directional links, something Berners-Lee didn’t think of.
  21. Gordon Moore and Robert Noyce leave Fairchild to create Intel, building its business on the integrated circuit. Moore had already posited Moore’s Law in 1965, while still at Fairchild. (1968 – USA)
  22. The first software patent is issued to Martin Goetz. Without this, the software industry could not have gotten the capital to develop. (1968 – USA)
  23. The entire software security market owes its creation to the Creeper virus! Creeper was an experimental self-replicating program written by Bob Thomas at BBN Technologies. It spread over the ARPANET to DEC PDP-10 computers running the TENEX operating system, copying itself to each remote system and displaying the message, “I’m the creeper, catch me if you can!” (1971)
  24. The video game market can be traced to 1948 with a checkers game built by IBM. But it really took off when Nolan Bushnell created Atari and found success with Pong (his second game; his first, Computer Space, was too hard to play). This is what gets the younger generation and people my age excited about the industry. (1972 – USA)
  25. Intel releases the 8-bit 8008 microprocessor, soon replaced by the 8080. Building on the 4-bit 4004, the first commercial microprocessor, these chips led to the PC revolution. (1972)
  26. The basis for the RSA public-key cryptosystem is invented at MIT by Ronald Rivest, Adi Shamir and Leonard Adleman. RSA is the most common asymmetric cryptographic technique on the internet today; without it, governments and banking could not have moved to the internet (a toy sketch of the math appears after this list). (1977 – USA)
  27. VisiCalc, the first electronic spreadsheet, is created by Dan Bricklin and Bob Frankston. This set the stage for Lotus 1-2-3 and Excel years later, but it also spurred the need to have PCs on people’s desks. (1979 – USA)
  28. The concepts of the PostScript language are conceived in 1976 by John Warnock. He later joined Xerox PARC, which had developed the first laser printer, and recognized the need for a standard means of defining page images. He left Xerox and founded Adobe Systems to create PostScript, a simpler language than Interpress from Xerox. (1982 – USA)
  29. Unix diehards (BSD vs. AT&T System V) can debate operating systems forever, but Unix, born at Bell Labs in 1971, had become a proprietary operating system by 1987. Richard Stallman’s announcement of the GNU Project, in part built on Unix concepts, arguably created the modern framework for open source software (1983-present), although Linux, created by Linus Torvalds, is the basis for most OSS today. (1991)
  30. The World Wide Web is born at the CERN physics laboratory, led by Sir Tim Berners-Lee. His proposal is published in 1989, the WWW is built in 1990, and the product launches in 1991 (something I did not learn about until early 1994 when I was on a sales call working for BMC). (1989 – CERN, Switzerland)
  31. Although Berners-Lee did build the first web browser, Mosaic is really the first consumer web browser, and it drove the internet age. (1993 – USA)
  32. ARM Holdings, which has a great business model, is the company that made smartphones possible. Its processor designs, built on the RISC architecture, required fewer transistors, which reduced cost, power and heat. In my opinion this is the last major invention of the computing world we know today. (1985 – UK)
  33. Sun Microsystems starts Project Stealth to solve issues for engineers who had become frustrated with C/C++ APIs and who felt there was a better way to write and run applications. Over the next several years the project was renamed several times, with names such as Oak and WebRunner along the way. Finally, in 1995, the language was named Java. In 1996 the Java Development Kit (JDK) was released, allowing developers to write applications for the Java platform – an inflection point in the growth of computing, since the internet was a collection of very different systems and Java applications could run on any of them. (1990 – USA) JavaScript, one of the key languages underlying the web, was introduced in 1995.
  34. Y2K. The year 2000 was a big event in technology, which had to fix the date problem created when early programmers used only the last two digits for the year in date formats, making the year 2000 indistinguishable from 1900 (see the sketch after this list). Fixing this gave rise to the innovation of the dot-com boom in the 1990s. Y2K spurred spending and innovation much as Kennedy’s 1962 challenge to the USA to go to the moon did.
  35. Concur evolves its business model into a pure SaaS business following its IPO. Instead of restricting its sales to computer hardware stores, the company sells its services over the internet, expanding its market to anyone with a browser. Companies like Salesforce, Devo and AWS followed its lead. (2001 – USA)
  36. MapReduce gives birth to the Big Data market. MapReduce was introduced as “a programming model and an associated implementation for processing and generating large data sets” in a Google paper presented at the 2004 Symposium on Operating Systems Design and Implementation (OSDI). The assumptions that helped MapReduce take off were its implementation on commodity hardware, its scalability, and the ease with which it could be used by programmers unaccustomed to working with highly distributed systems (a minimal sketch of the model appears after this list). In addition to providing the technology underlying the offerings of companies such as Cloudera, Hortonworks and MapR Technologies, MapReduce helped bring us the concept of the data lake as a complement to the traditional data warehouse. (2004)
  37. Devo is born. For AI, machine learning and advanced automation to take hold, a single system of record or data operations platform for all data must exist. Traditional batch processing with data warehouse systems and streaming analytics products need to be combined for companies to increase agility through their digital transformation process. Devo will be the technology and platform that allows massive-scale systems to query the exabytes of data needed to improve business performance and react to ever-changing opportunities and threats. (2014 – Spain)
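
A few of the milestones above are easier to appreciate with a small, concrete example, so here are three deliberately simplified sketches in Python. None of this is production code; it exists only to make the ideas tangible.

For milestone 26, here is textbook RSA with toy-sized numbers. The values (the primes 61 and 53, the exponent 17, the message 65) are made up purely for illustration; real RSA uses primes hundreds of digits long, padding schemes, and a vetted cryptography library rather than anything hand-rolled like this.

```python
# Textbook RSA with tiny numbers, for illustration only.
p, q = 61, 53                      # two secret primes
n = p * q                          # 3233, the public modulus
phi = (p - 1) * (q - 1)            # 3120
e = 17                             # public exponent, coprime with phi
d = pow(e, -1, phi)                # 2753, private exponent (Python 3.8+)

message = 65                       # any number smaller than n
ciphertext = pow(message, e, n)    # encrypt with the public key (e, n)
plaintext = pow(ciphertext, d, n)  # decrypt with the private key (d, n)
assert plaintext == message        # round-trips back to 65
```

The asymmetry is the whole point: anyone can encrypt with the public pair (e, n), but only the holder of d can decrypt, and recovering d from (e, n) requires factoring n, which is infeasible at real key sizes.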
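
For milestone 34, the entire Y2K problem boils down to two-digit years being ambiguous. A hypothetical pre-2000 system that stored only the last two digits of the year gets simple date arithmetic wrong the moment the century rolls over:

```python
from datetime import datetime

# Two-digit years: 99 means 1999, but does 00 mean 1900 or 2000?
last_backup, today = 99, 0          # stored as two digits, as many old systems did
print(today - last_backup)          # -99 "years since last backup" instead of 1

# Modern parsers have to guess: Python's %y maps 00-68 to 2000-2068
# and 69-99 to 1969-1999, following the POSIX convention.
print(datetime.strptime("01/01/00", "%m/%d/%y").year)   # 2000
```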
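
And for milestone 36, here is a single-machine sketch of the MapReduce programming model, using word counting, the example from the Google paper. You supply a map function and a reduce function; the real framework handles splitting the input, shuffling and grouping the intermediate keys, and running everything across thousands of commodity machines. Here the “framework” is just a few lines of driver code standing in for all of that.

```python
from collections import defaultdict
from itertools import chain

def map_fn(document):
    # Emit an intermediate (word, 1) pair for every word in one document.
    for word in document.split():
        yield word, 1

def reduce_fn(word, counts):
    # Combine all counts emitted for the same word.
    return word, sum(counts)

def map_reduce(documents):
    # "Shuffle" phase: group intermediate values by key before reducing.
    grouped = defaultdict(list)
    for key, value in chain.from_iterable(map_fn(doc) for doc in documents):
        grouped[key].append(value)
    return dict(reduce_fn(word, counts) for word, counts in grouped.items())

print(map_reduce(["the creeper catches the creeper", "catch me if you can"]))
# {'the': 2, 'creeper': 2, 'catches': 1, 'catch': 1, 'me': 1, 'if': 1, 'you': 1, 'can': 1}
```

The appeal was exactly what the paper promised: programmers who had never touched a distributed system could express a job as these two small functions and let the runtime worry about partitioning, fault tolerance and scale.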

It will be interesting to take a look at this in 20 years’ time and see what gets added to the list. I’m sure you have events you’d add to this list – Tweet them to @devo_Inc.