How relevant is quantum computing?
Earlier this year, Erik Lucero et al. of the University of California, Santa Barbara built a quantum processor that factored the number 15 using a quantum algorithm (an algorithm that runs on a quantum computer) for integer factorization formulated by Peter Shor in 1994. The team built a quantum circuit of four superconducting qubits (bits that are a little bit of both 0 and 1 at the same time) on top of a sapphire substrate. Qubits like these are not exactly easy to implement in reality, given our present technological limitations. Here are some of my thoughts on the relevance of quantum computing.

Moore's Law is dying

Moore's Law is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years. Computers have been getting smaller, and integrated circuits (ICs) more densely packed with transistors, ever since the advent of the first IC-based computer. That is a modest summary of Moore's Law. Since 2014,
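As an aside on the factorization of 15 mentioned earlier: Shor's algorithm works by reducing factoring to period finding. A quantum computer's only job is to find the period of a^x mod N exponentially faster than a classical machine can. The rest of the algorithm is purely classical, and can be sketched in a few lines of Python (here the period is found by brute force, standing in for the quantum step; function names are my own, not from any published implementation):

```python
from math import gcd

def find_period(a, n):
    """Find the order r of a modulo n, i.e. the smallest r with
    a**r % n == 1. This brute-force loop is the step a quantum
    computer would replace with quantum period finding."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Factor n using the period of a (classical post-processing of
    Shor's algorithm). Assumes gcd(a, n) == 1."""
    r = find_period(a, n)
    if r % 2:
        return None              # odd period: retry with another a
    x = pow(a, r // 2, n)
    if x == n - 1:
        return None              # trivial square root: retry with another a
    return gcd(x - 1, n), gcd(x + 1, n)

# The period of 7 modulo 15 is 4, which yields the factors 3 and 5.
print(shor_classical(15, 7))     # → (3, 5)
```

For n = 15 and a = 7, the powers of 7 mod 15 cycle as 7, 4, 13, 1, so the period is 4; then gcd(7² − 1, 15) = 3 and gcd(7² + 1, 15) = 5 recover the factors. The quantum hardware in the UCSB experiment implemented the period-finding step.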