
QUANTUM COMPUTING: A RADICAL TRANSFORMATION OF THE WORLD OF COMPUTING AND SOON, OUR HUMAN EXISTENCE

I have spent most of my career around computers.  I was an executive for Microsoft and Oracle.  With Ernst & Young, my focus was information technology. I also worked for Honeywell and Burroughs (today’s Unisys). In every position, my core competency was to help customers use computers to manage their business.  Speaking of cores and computers…

The Fabric of My Computer Career

During my career, I passed through about four or five generations of computing. So bits and bytes were the “warp and woof” of my “day job.” In this article and the next, I want to discuss the staggering revolution that is about to happen in computing and why it is a critical part of the biggest machine ever built in the history of humankind — and ultimately, what could be the most destructive to our planet and all of the life, whether flora or fauna, non-human or human, that inhabits it.

Fabric Architecture — Warp and Woof (Weft)

When talking about the marvel of quantum computing, my background in computer systems is both a plus and a minus. It’s a plus because I understand the foundational issues of information processing and the elements a computer uses to store and retrieve information. In fact, in my early days as a systems analyst for the old Burroughs Corporation (now Unisys), I programmed in machine language. I understood binary coding and hexadecimal mathematics. I knew how bits made up bytes, as they did then and still do today. I could read the contents of memory in the ones and zeroes that comprised the “bit” level, and I could enter “code” into the computer at the next level up, known as “hex code,” which resided at the “byte” level. Going lower still, to the hardware level, I even understood the principle of how an electric current flipped a bit “on” or “off.”

By today’s standard, we managed an infinitesimally small amount of computer memory to get the job done. We utilized these incredibly limited machines to do accounting back in the 1970s and 1980s with computers possessing a minuscule fraction of the power of a personal computer today. At the university where I teach entrepreneurship today, I tell this to my students and they have absolutely no conception of how that could be the case. Their cell phones, with 64 gigabytes of memory (64 billion bytes), possess roughly a million times the memory of the simple minicomputer I programmed in 1978. (Note, too, that today’s processors work with 64-bit words, while the machines I started on worked with 8-bit words.) All of these increases occurred within the last 40 years, overlapping with my business career.
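
For readers who never worked at this level, here is a minimal Python sketch of the bit/byte/hex relationship described above (the particular byte value is illustrative, not from any Burroughs machine):

```python
# A byte is 8 bits; hexadecimal is shorthand for those bits, 4 bits per hex digit.
value = 0b01001000          # the bit-level view of one byte
print(value)                # 72 -- the decimal value
print(hex(value))           # '0x48' -- the hex-code view an operator would key in
print(chr(value))           # 'H' -- the same byte decoded as a character
print(format(value, '08b')) # '01001000' -- back to the bit-level view
```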

We know the advancement of computing by the principle called Moore’s Law — named after Gordon Moore, one of the founders of Intel. The Law states that computers double in power every two years (to be more precise, that the number of transistors on an integrated circuit doubles). Nowadays, we talk about computer power doubling every 18 months. However, whatever the rate of increase, Moore’s Law will soon be meaningless as we become fully engaged in the era of quantum computing. Instead of doubling — an increase by a factor of two — increases will jump at least a thousand times every two years; that’s ten to the third power (10^3 in exponential notation).
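
To see how different those two growth rates are, here is a small Python sketch comparing classical doubling with the thousand-fold jumps just described (the ten-year horizon is purely an illustration):

```python
# Compare transistor-style doubling with a hypothetical 1,000x jump,
# both applied once every two years over a ten-year span.
power_classical = 1.0
power_quantum = 1.0
for year in range(2, 12, 2):
    power_classical *= 2      # Moore's Law: double every two years
    power_quantum *= 1_000    # the quantum-era rate: 10^3 every two years
    print(f"year {year}: classical x{power_classical:,.0f}, quantum x{power_quantum:,.0f}")
# After 10 years: classical power is up 32x, while the quantum-era rate gives 10^15x.
```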

Computer Memory was Originally Woven by Fabric Makers

The fundamental building block of those early computers was the memory core, which held a single “bit” of information that is either a one or a zero. If one understands how a core is set on or off, then the warp and woof of fabric becomes a perfect analogy for the way a classical memory “byte” worked on an old-fashioned memory board. In fact, initially, fabric makers were employed to “manufacture” memory.

Memory Pre-Silicon, Dream Weaving

Visualize a core as a doughnut with electrical properties. Wires run in one direction through any number of memory cores and turn individual bits “on” via an electrical charge. Wires running the other direction (at a 90-degree angle, reaching across the wires just mentioned) turn individual bits “off.” Therefore, imagining the “warp and woof” of computer memory supplies an accurate picture of how computer memory was established and electrified to set its “state” a certain way. “Writing memory” involved running currents through certain wires in one instant and then other wires in another. “Reading memory” employed a sensing wire to detect the state of memory. However, reading the state of a core was initially destructive, so the value had to be rewritten immediately afterward, lest the process “flip the switches” the opposite way (“off to on” or “on to off”).
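
Here is a toy Python sketch of that read/write cycle on a grid of cores (a deliberate simplification: real planes used coincident currents on the X and Y wires, which I have flattened to direct indexing for clarity):

```python
# Toy model of a core-memory plane: a grid of bits, each addressed by
# an X (warp) wire and a Y (woof) wire.
plane = [[0] * 8 for _ in range(8)]  # an 8x8 plane of cores, all "off"

def write_bit(x, y, value):
    """Energize the X and Y wires to magnetize one core on or off."""
    plane[y][x] = value

def read_bit(x, y):
    """Destructive read: sensing the core resets it to zero,
    so the value must be written back immediately afterward."""
    value = plane[y][x]
    plane[y][x] = 0          # the act of reading erases the core
    write_bit(x, y, value)   # the write-after-read cycle restores it
    return value

write_bit(3, 5, 1)
print(read_bit(3, 5))  # 1 -- and the core still holds 1 afterward
```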

Coding systems (either ASCII or EBCDIC, which signified “alphanumeric” symbols, otherwise known as letters and numbers) would tell you what a given combination of bits turned on or off meant. For instance, a certain mix of bits would signify words like “Hello World!” or something much less friendly, like “Get Out!” This process of encoding and decoding happened faster than any engineer could work out with a slide rule or any accountant could punch into the ten-key pad of a calculator. And all this was going on before we discovered the ability to cast transistors into silicon in “bite-size chips” (pun admitted) and increase computing power exponentially.
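
A minimal Python illustration of that encoding step (using ASCII; EBCDIC tables differ, but the principle is identical):

```python
# Encode a message into the on/off bit patterns a core plane would hold,
# then decode those patterns back into characters.
message = "Hello World!"
bits = [format(ord(ch), '08b') for ch in message]
print(bits[:2])  # ['01001000', '01100101'] -- the cores set for 'H' and 'e'

decoded = ''.join(chr(int(b, 2)) for b in bits)
print(decoded)   # 'Hello World!'
```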

Inventor of Computer Memory Architecture – An Wang

The computer core was patented by An Wang while working at Harvard in 1951 (though the patent was not granted until 1955). Wang would use monies from the patent to later form Wang Laboratories in Lowell, Massachusetts. Wikipedia provides a nice but brief history of the magnetic memory core and Wang’s contribution:


“Magnetic-core memory was the predominant form of random-access computer memory for 20 years between about 1955 and 1975. Such memory is often just called core memory, or, informally, core.  Core uses tiny magnetic toroids (rings), the cores, through which wires are threaded to write and read information. Each core represents one bit of information. The cores can be magnetized in two different ways (clockwise or counterclockwise) and the bit stored in a core is zero or one depending on that core’s magnetization direction. The wires are arranged to allow for an individual core to be set to either a one or a zero and for its magnetization to be changed by sending appropriate electric current pulses through selected wires. The process of reading the core causes the core to be reset to a zero, thus erasing it. This is called destructive readout. When not being read or written, the cores maintain the last value they had, even when power is turned off. This makes them nonvolatile.”

“Two key inventions led to the development of magnetic core memory in 1951. The first, An Wang’s, was the write-after-read cycle, which solved the problem of how to use a storage medium in which the act of reading erased the data read, enabling the construction of a serial, one-dimensional shift register (of 50 bits), using two cores to store a bit. A Wang core shift register is in the Revolution exhibit at the Computer History Museum.”


Wang Laboratories was located in a not-so-notable six-story building I used to drive by every day on my way to work at Honeywell and later at Oracle in “Back Bay” Boston. At the time, I lived in Nashua, New Hampshire. When Wang invented his computer core, he established the basis for every bit and byte inside the computer.

Remember that at this time, hard disks weren’t really employed because they were very expensive. External storage of information required punched cards and paper tape with holes punched in them, the paper “bits” (literally) popped out. Programs were compiled in the computer’s “main memory.” Once finished, the mainframe computer would generate punched cards or paper tape. (It was fun to hear the mainframe spit out a punched tape!) Data likewise would be stored on punched cards. It was a major upgrade in computerdom when programs stayed “resident” in memory to be kicked off and executed at the will of the computer operator at the “computer console,” which initially had no keyboard. An execute command (today’s “double click”) would exist on perhaps four or five cards placed in a card reader and “read into” the input port on the mainframe. When I coded my first COBOL program, this is how we compiled code into machine language for execution on a small minicomputer.

The Memory Core Advances to a Fully Manufactured Device

Wang originally licensed his memory core concept to IBM for a mere one penny per core, with each core representing one bit. As IBM soon discovered, this royalty would have become the greatest giveaway in the history of the world. Once they realized their error, they renegotiated the deal and made a one-time payment of $500,000 to Wang in the 1950s. Had the penny-per-core royalty continued up to today, Wang would be the richest man in the world. How big was this deal? Well, if the arrangement were still in place now, Bill Gates wouldn’t have one-millionth the wealth of An Wang. And frankly, there might not have been a Bill Gates. How so? Let’s look at the math.

Start with one million bytes of memory. A byte is eight bits, and in core memory each bit was a single core, so one megabyte represents 8 million cores. At one penny per core, Wang would earn a royalty of $80,000 for every megabyte of memory sold (using a value of 1,000,000 bytes rather than 1,048,576, which is the more precise value of a megabyte). In a typical personal computer being sold today, the minimum main memory used by the processor (or by its multiple processing “cores,” which in that context means the units inside the main processor, not the single-bit core of a memory module) would be no less than 4 GB, or four billion bytes. One gigabyte of memory would equal $80 million in royalties for Wang; therefore, four gigabytes would equal $320 million. Despite inflation, that is still a big number. And this is just the “main” memory. A personal computer (or an iPhone, which is a computer) has many processors doing many things, and most have some amount of dedicated memory to help them do their job. Multiply a royalty of hundreds of millions of dollars per machine by the hundreds of millions of computers and phones sold every year, and Wang’s royalties would dwarf all the wealth of a Bill Gates, the richest man in the world (worth $85 billion at the beginning of 2017). Talk about depreciation. Memory cores just aren’t worth what they were worth back in the early 1950s. It is even more striking when you realize what 16 GB of memory (the amount a power user wants for his or her computer) costs today: only about $120 in today’s market (Corsair is selling that amount of memory on Amazon right now for $114.00 if you need to purchase some). Under the old arrangement, those same bits would have earned Wang $1.28 billion.
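
Here is a quick Python sketch of that arithmetic (the penny-per-core rate is from the original IBM deal described above; the memory sizes are the examples just given):

```python
# Hypothetical royalties under Wang's original penny-per-core deal.
# One core stored one bit, and a byte is eight bits.
PENNY_PER_CORE = 0.01

def royalty(n_bytes):
    """Royalty owed on n_bytes of memory at one cent per bit."""
    return n_bytes * 8 * PENNY_PER_CORE

print(f"1 MB:  ${royalty(1_000_000):>15,.0f}")       # $80,000
print(f"4 GB:  ${royalty(4_000_000_000):>15,.0f}")   # $320,000,000
print(f"16 GB: ${royalty(16_000_000_000):>15,.0f}")  # $1,280,000,000
```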

Moore’s Law for Intel Processor Chips — the “Classical Architecture” about to be eclipsed by Quantum Computers

However, as staggering as the evolution of the classical computer has been, as much as it has advanced and grown in power and performance, it is trifling compared to what we are about to see with quantum computing and how its capability will increase every two years. Indeed, as my friend and co-author Anthony Patch has accurately predicted over the past two years, D-Wave (a Canadian firm), maker of the only commercially available quantum computer, has grown its machines’ capability by doubling their “qubits” about every two years. However, doubling qubits does not merely double the power, as explained earlier; because each added qubit doubles the number of states the machine can represent, doubling the qubit count squares the size of the state space, a staggering exponential rate. Anthony shared with me (correcting the information originally published) that D-Wave first sold a 128-qubit computer to Lockheed and USC. Its second system consisted of 512 qubits and was sold to the same purchasers as well as to Google/NASA and to the NSA. Then the systems were upgraded to 1,024-qubit models (characterized as 1,000 qubits), and each of the previous installations was upgraded. Recently, as Anthony predicted, a 2,000-qubit (2,048, to be exact) machine was placed with Temporal Defense Systems, and the same upgrade went to all the previous purchasers. (Anthony is the true expert on the topic of the quantum computer. I am a lowly writer just trying to make it simple enough that I can understand it and share it with you.)
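
A short Python sketch shows why those doublings of qubit counts are so dramatic (this counts the representable states of idealized qubits; D-Wave’s annealing architecture differs, so treat it as an illustration of the scaling, not a specification):

```python
# Number of basis states an n-qubit register can hold in superposition: 2^n.
# Doubling n does not double this count; it squares it.
for n in [128, 512, 1024, 2048]:
    states = 2 ** n
    print(f"{n:>5} qubits -> ~10^{len(str(states)) - 1} states")
# 128 qubits -> ~10^38, 512 -> ~10^154, 1024 -> ~10^308, 2048 -> ~10^616
```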

A Preview of the Quantum Computer

So, in concluding this article, I will set up the next.  As indicated, over the past decade, a completely new paradigm has developed for computing.  It is called “quantum computing”.  My simple definition for this mind-bending, complex approach to processing information would be this:

Quantum computing is the solving of equations and the execution of algorithms based on (1) the principles of how quantum particles behave physically, while (2) employing quantum particles themselves in the process of doing the computing.

Such computers aren’t really digital computers. They are quanta computers. Instead of being binary-based calculating machines that provide outcomes predicted by the physics we empirically detect in the observable universe, they are qubit-based calculating machines, and perhaps even “thinking” machines, that operate in imprecise, unpredictable ways. Perhaps it would not be overstating things to say, “They have a mind of their own.” They are not based on “rules”; they are based on statistical probabilities. They operate in a manner that does not correlate with the way things work in the world as understood by the human mind, or more precisely, according to the rules of reality as stated by Sir Isaac Newton and even the relativistic notions of Albert Einstein. Dr. Einstein famously said, “God does not play dice with the universe,” meaning that outcomes must follow rules that correspond to the rest of reality. While God does not play dice, apparently at the level of quanta, only God knows what will happen. Quanta make their own rules, and humans aren’t fully able to manage quanta — at least not in quite the same way we manage computers today.

D-WAVE & QUANTUM COMPUTING

Instead of transistors serving as gates that are flipped open or closed by electrical charges passing through them, “qubits” are set or detected (written or read) based upon the arcane behaviors of quantum particles. The science of quantum behaviors is nothing like the science of silicon processors. When we begin to discuss the kinds of behaviors quanta exhibit, such as “quantum entanglement” (Einstein called it “spooky action at a distance”), indeed all the rules of space-time no longer apply. But let’s save that discussion for the next article, because you’ll need a mind that is fresh, freed from the “old ways” of thinking about computers, and open to the completely new ways future computers will operate. As you will see, quantum computers are especially spooky in their own right.
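
As a taste of what “based on statistical probabilities” means, here is a toy Python simulation of measuring a single qubit held in an equal superposition (a classical simulation of the statistics, not an actual quantum computation):

```python
import random

# A qubit in equal superposition yields 0 or 1 on measurement,
# each with probability 0.5. Unlike a classical bit, the value
# does not exist until the qubit is read.
def measure_superposed_qubit():
    return 0 if random.random() < 0.5 else 1

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure_superposed_qubit()] += 1
print(counts)  # roughly {0: 5000, 1: 5000} -- only the odds are predictable
```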

And once we have those ideas “in mind” (as best as our minds will allow), we will be able to explore why CERN demands the use of quantum computers to fully exploit its present purpose and the plan I believe its founders had in mind when they broke ground for CERN six decades ago.


The DVD, CERN DECODED, will be available February 18, 2017, from Amazon and from S. Douglas Woodward.
The DVD is priced at $16.95 and presents a 123-minute information-packed discussion
with hundreds of photos featuring Anthony Patch and S. Douglas Woodward, hosted by Lyn Leahz.
The producer of the DVD is BookMinistry.org.


