It was 1972 when I embarked on my career in electronics and computers.
My initial training, courtesy of the US military in exchange for 4 years of my youth, was in vacuum tube technology.
You remember that don't you?
TVs and radios that literally had to 'warm up' before they functioned, tube-tester stations in the hardware store, and the tubes themselves carrying enough voltage to reach out and strike like lightning when you carelessly got a body part too close to the top electrode. (Yep, did that more than once! And to this day I do not wear highly conductive rings or watches.)
From there my work quickly transitioned through solid-state transistors,
bug-like integrated circuits,
and by the late '70s, firmly into the world of microprocessors
and mainframe computers such as this DEC VAX-11/780.
I got to where I became a global resource for keeping the VAX, with its thousands of parts and individual circuits, up and running. I could also go into it at the machine-language level and quickly tweak the operating-system parameters to fine-tune it for maximum performance on different kinds of compute operations.
From there I jumped into the rarefied world of supercomputers.
First came the liquid-Freon-cooled CRAY, with its strange shape designed to shorten the internal connections and make them faster because - well, you know - given practical energy-input considerations, electrons racing down those connecting wires have a finite speed limit.
We even built a private computer room with a wall of viewing windows and special lighting just for it.
But the romance didn't last and we were soon building a second dedicated computer room for the cubistic Thinking Machine. (More shortened interconnects to speed things up)
But in the bleeding-edge, high-tech world I lived in, these lightning-fast but hugely expensive and persnickety custom computers (to replace a failed component in the CRAY we had to shut it down completely and drain the environment-killing Freon into the black tanks at the bottom before opening it up) soon became dodos
as the age of massively parallel computing took over.
We began taking hundreds, then thousands, of individual computers - each one packing the equivalent of tens of thousands of those original vacuum tubes into less than the space of a single one of those little glass suckers - built with relatively inexpensive mass-produced processors, and linking them together with complex networks and equally complex operating systems and applications so they worked as one gigantic computer.
At one point our company owned one of the world's five largest supercomputers and briefly boasted the world's largest online data-storage capacity. (Can't compute without quickly accessible data to crunch and an equally quick place to put the results!)
As I approached retirement (2012) we were running the industry's most cost-efficient data centers and dunking our computers into tanks of chilled mineral oil to cool them so we could run them even faster.
We were so bleeding-edge we owned stock in Band-Aid!
About a month ago, while trawling through my usual collection of news feeds, I came across an article published by MIT on practical work (as opposed to theoretical work) being done at an atomic level with qubits - a pairing of atoms that can exist in two different states simultaneously.
Egghead gobbledygook right?
Well check out this excerpt from the article. (My own highlights)
A qubit represents a basic unit of quantum computing. Where a classical bit in today’s computers carries out a series of logical operations starting from one of either two states, 0 or 1, a qubit can exist in a superposition of both states. While in this delicate in-between state, a qubit should be able to simultaneously communicate with many other qubits and process multiple streams of information at a time, to quickly solve problems that would take classical computers years to process.
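If, like me, you think better in code than in physics-speak, here is a classical toy sketch of the superposition idea from that excerpt - emphatically not real quantum mechanics (a classical program can't reproduce entanglement or interference), just the bookkeeping: a qubit's state is a pair of amplitudes, and measuring it collapses the state to 0 or 1 with probabilities given by the squared amplitudes. The function names are my own invention.

```python
import math
import random

def make_superposition():
    """Equal superposition of |0> and |1>: amplitudes (1/sqrt(2), 1/sqrt(2)).

    The squared amplitudes must sum to 1 - here 0.5 + 0.5.
    """
    amp = 1 / math.sqrt(2)
    return (amp, amp)

def measure(state):
    """Collapse the state: return 0 with probability |alpha|^2, else 1."""
    alpha, _beta = state
    return 0 if random.random() < abs(alpha) ** 2 else 1

# Measuring many freshly prepared qubits gives a roughly 50/50 split.
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(make_superposition())] += 1
print(counts)
```

The "years to process" speed-up in the excerpt comes from effects this sketch deliberately leaves out - many qubits in superposition interfering with each other - but it at least makes the 0/1 probability bookkeeping concrete.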
Now, work with the practical use of qubits has been going on since at least 2019, but in the few weeks following that MIT article I came across another one showing how traditional silicon and qubits have been combined onto a single, unimaginably powerful hybrid chip using existing chip-manufacturing technology. And if that wasn't enough, a few days ago Caltech published a summary of a paper titled Nuclear spin-wave quantum register for a solid-state qubit (scientists are not noted for catchy, or even particularly descriptive, titles!) demonstrating the ability to reliably manipulate and store data at a quantum level within the synchronized spin of a handful of atoms, and to use that to build quantum networks connecting quantum computers. (Oh man! My head hurts!)
I used to be actively headhunted in the fast-moving, cut-throat field of advanced computer technology. I used to be a featured presenter at conferences. I used to sit on panels at symposiums. I used to be a master of computers!
But today I'm back here helplessly, ineffectually, choking on the dust of progress. Not even a footnote in history.
These days I'm so obsolete that when I break I have to go to eBay to find discontinued spare parts!