IBM, Google, Microsoft, and others are making large bets on quantum computing. However, they are going about it all wrong. Using the mainframe construction model of the 1950s, they have hired research scientists to build costly monoliths. Instead, they need inventors.
In 1976, Steve Wozniak (“Woz”) developed the Apple I. It consisted of a single board; to operate it, the owner had to build a case, connect a keyboard, and plug in a television. No software was available, either. The complete computer, known as the Apple II, came a year later in 1977. To put that in perspective, IBM did not release its personal computer until 1981.
Not a research scientist, nor even a college graduate, Steve Wozniak ended his short stay at the University of Colorado after hacking its mainframe. Back home in San Jose, Woz and his friend Steve attended meetings of the Homebrew Computer Club. It was during this time that the two Steves created Apple Computer.
After Byte Magazine ran the first article on the Apple II, IBM set a plan in motion to release a PC of its own. Again, instead of turning to research scientists as in the mainframe days, it approached a chip manufacturer named Intel and a college dropout named Bill.
Edward Augustin Calahan left school at the age of 11 to go into business. While working as a telegraph operator in Western Union's Manhattan office, he noticed messenger boys running from the stock exchange to surrounding buildings with reports of price changes.
According to Wikipedia, “Calahan realized that the prices could be sent directly via telegraph to each broker's office much more quickly and efficiently in a permanent stream of information.” This led to his invention of the stock ticker in 1867. Calahan later went on to found ADT Security, which still stands today.
The list continues: much of what we rely upon today was created by non-researchers. This is not to say that researchers are unimportant. After all, it is their discoveries that give inventors the tools to create. The point is simply that it is far-fetched to expect a team of researchers to release a product faster than a team of hackers.
Xerox, the copier company, spent millions on research and development. Like the Bell Labs of the West Coast, Xerox developed the mouse, graphical interfaces, and many of the ideas behind modern programming languages. However, it took a band of misfits to turn these ideas into a viable product.
Led by Steve Jobs, a small group of 30 employees released the Apple Macintosh in 1984 using technology from Xerox. The story is chronicled at folklore.org, but the point is that the researchers who discovered the technology were not the ones who commercialized it.
Right now, Quantum Computing is treated as too complex for ordinary engineering. Most of the current offerings look more like devices inside a physics lab than production systems. That is because they are lab equipment.
However, much as in the days of the Homebrew Computer Club, there is interest in DIY Quantum Computing. Noah Wood, a quantum hobbyist, explains, “Using the quantum properties of light and the KLM protocol, we are able to create qubits quickly and affordably using off-the-shelf optics and electronics components.”
The KLM protocol Wood refers to was introduced in 2000 by three physicists: Knill, Laflamme, and Milburn. Wood says, “Essentially, what they found was that you could perform any theoretical quantum computation using nothing more than cleverly arranged optics! (mirrors).”
Using an Arduino controller and off-the-shelf components, Wood has built working quantum circuits. More can be found on his GitHub page, including plans for an 8-qubit processor.
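To make the linear-optics idea concrete: in one common encoding (dual-rail), a single photon split between two paths is a qubit, and beam splitters and phase shifters act as 2×2 unitary gates on the path amplitudes. The sketch below is not taken from Wood's repository; it is a minimal NumPy illustration of that principle, with the element names chosen here for clarity.

```python
import numpy as np

# Dual-rail photonic qubit: |0> = photon in the top path,
# |1> = photon in the bottom path. Linear-optical elements act as
# 2x2 unitaries on the two path amplitudes.
ket0 = np.array([1.0, 0.0], dtype=complex)  # photon in the top path

def beam_splitter(theta):
    """Lossless beam splitter mixing the two paths by angle theta."""
    return np.array([[np.cos(theta), 1j * np.sin(theta)],
                     [1j * np.sin(theta), np.cos(theta)]], dtype=complex)

def phase_shifter(phi):
    """Phase shifter placed on the bottom path."""
    return np.array([[1.0, 0.0],
                     [0.0, np.exp(1j * phi)]], dtype=complex)

# A 50/50 beam splitter (theta = pi/4) puts a single photon into an
# equal superposition of both paths -- the optical analogue of a
# Hadamard-like single-qubit gate.
bs = beam_splitter(np.pi / 4)
state = bs @ ket0
probs = np.abs(state) ** 2  # detection probabilities at the two outputs
# probs ≈ [0.5, 0.5]: the photon is equally likely at either detector
```

Single-qubit gates really are this simple in linear optics; the hard part KLM solved was making two-qubit gates work using only such elements plus photon detection, which is why the result surprised people.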
We have technical behemoths working toward quantum dominance on one end and DIY hobbyists on the other. Somewhere in the middle, however, are the well-funded startups looking to plant a flag.
Companies like Rigetti Computing and IonQ are startups building Quantum Computers. However, their development model is very similar to that of Google, IBM, and the other large players. Staffed by researchers, these new systems rival the big competitors' in architectural complexity.
It is as if everyone is trying to build the same thing: a large-scale quantum computer based on complex physics. It is reminiscent of the mainframe wars of the 1960s. Most likely, the winner will be the company that releases something entirely different from the others.
Launched in 1982 by Stanford professor James Clark and a group of research graduates, Silicon Graphics specialized in 3D graphics workstations. At its peak in the late 1990s, the company grew to $3.7 billion in revenue. However, the advent of Linux and the improved graphics capabilities of personal computers took their toll.
The final straw for the company, by then rebranded as SGI, was Nvidia: a graphics card manufacturer designed not to compete with personal computers but to enhance them. This shift from competitor to partner made expensive workstations irrelevant. Today, Google occupies SGI's Mountain View headquarters while Nvidia powers the world's deep learning systems.
We are waiting for a production Quantum Computer to come from a laboratory when history suggests it will most likely come from a garage. The mainframe wars were ended by the personal computer. The graphics workstation was killed by the graphics card. Custom automobiles were displaced by Henry Ford and his low-cost alternative. The point is that all of these came from inventors, not researchers.
© 2016-2021 Todd Insights LLC
Any strategies discussed are for illustrative and educational purposes and are not a recommendation, offer, or solicitation to buy or sell any security or to adopt any investment strategy. There is no guarantee that any strategies discussed will be useful. Todd Moses is not a licensed securities dealer, broker, or US investment adviser or investment bank.