The modern computer has, for the most part, always run on the Turing model of computation. Computer science as a whole has relied on computers built around a base-two binary system, in which a bit is either 1 or 0 with absolutely no in-between. The standard binary system has been so effective in the realm of computing that Moore's Law has stood the test of time for decades. However, times are changing, and a new computational paradigm no longer seems so distant as IBM and many other tech companies continue to invest large sums of money in quantum computing.
Though Moore's Law remains relevant, many question just how much longer the trend can continue. This is why IBM, Microsoft, Google, the Department of Defense, Intel, and many other entities are starting to sink millions into quantum computing. Intel alone has recently invested millions, even though the technology is ten to twenty years away. Though these companies have many motives, I believe Intel's reason is to stay a step ahead of the competition, as usual. As one of the leading producers of microprocessors for personal computers, it most likely hopes to mass-produce quantum chips sometime in the future. IBM has also thrown its hat into the ring and has recently come one step closer to making quantum computing a reality. It is no secret that IBM has devised a way to detect two types of quantum errors at the same time, where before it could detect only one. Detecting these errors matters because, unlike in the standard Turing paradigm where a bit is either on or off, the quantum paradigm allows a qubit to be both at the same time. IBM also claims to have designed a four-qubit circuit on a quarter-inch lattice structure, and argues that this is the most efficient way to add qubits toward an operational system. Such a breakthrough is significant given that a processor with 50 qubits would have the power of a supercomputer in today's world.
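To see what "both at the same time" means concretely, here is a minimal state-vector sketch in plain Python (a toy illustration, not code from IBM or any real quantum SDK; the names `ket0`, `hadamard`, etc. are our own). A qubit's state is a pair of amplitudes over the outcomes 0 and 1, and the Hadamard gate turns a definite 0 into an equal superposition of both:

```python
import math

# A classical bit is exactly 0 or 1. A qubit is described by two
# amplitudes (a0, a1); the squares of the amplitudes give the
# probabilities of measuring 0 or 1.

ket0 = (1.0, 0.0)  # the qubit starts definitely in state |0>

def hadamard(state):
    """Apply the Hadamard gate, which creates an equal superposition."""
    a0, a1 = state
    return ((a0 + a1) / math.sqrt(2), (a0 - a1) / math.sqrt(2))

state = hadamard(ket0)
p0, p1 = state[0] ** 2, state[1] ** 2  # Born rule: probabilities

print(p0, p1)  # both outcomes are now equally likely: 0.5 and 0.5
```

This also hints at why 50 qubits is such a big deal: describing n qubits classically takes 2**n amplitudes, so 50 qubits already require 2**50 (about a quadrillion) numbers, which is why simulating them pushes today's supercomputers to their limits.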
Though quantum computing still has a while to go before it becomes the computational norm, it is definitely something computer scientists need to anticipate. Quantum computing is, overall, a good thing: it will fundamentally change how computing is done and render much older technology and many algorithms obsolete. It is extremely important that people working in the field stay up to date on advancements in quantum computing. If not, they may be left behind and forced to adapt quickly to the new technology.
Unsure how quantum computing will affect your computational experience? Feel free to ask in the comments.