Quantum@HSG

Quantum computing promises to be the next disruptive technology, with numerous potential applications and implications for organizations and markets.
At the University of St.Gallen (HSG), the Institute of Information Management and the Institute of Media and Communications Management are collaborating with the Massachusetts Institute of Technology (MIT) to explore how this new technology can be deployed in organizations to realize its potential. One of the next steps in the collaboration is to develop a framework for assessing quantum computing use cases. If you are interested in our research or a potential collaboration, please do not hesitate to get in touch.

Quantum Computing

A recently published report by McKinsey [1] estimates the global market value of quantum computing at 1 trillion USD by 2035, mainly in the financial, chemical, pharmaceutical, and automotive sectors. Today, the world’s largest technology companies, such as Google, IBM, Microsoft, Amazon, and Alibaba, are already investing billions in the research and development of their quantum computers and provide partial access to these machines to the public via cloud infrastructures. However, not only industry players are investing but also governments: China, for example, is investing 10 billion USD in a national quantum computing laboratory, the U.S. government has provided 1 billion USD, and the EU has an overall budget of more than 1 billion EUR.

Quantum computers exploit principles of quantum mechanics, such as superposition and entanglement, to represent data and perform operations on them. Both of these principles enable quantum computers to solve very specific, complex problems significantly faster than standard computers.

Quantum computers can calculate and test extensive combinations of hypotheses simultaneously instead of sequentially. Furthermore, some quantum algorithms can be designed in such a way that they solve problems in far fewer steps than their classical counterparts (their complexity is lower). For this reason, quantum computing could represent a significant breakthrough in modern IT in the next few years and might initiate the transition to the 5th industrial revolution.

Against this backdrop, this website aims to give a brief overview of the three layers of a quantum computer: hardware, system software, and application layer. Furthermore, potential application areas of quantum computing and possible research directions for the field of information systems are introduced. Learn more about the individual parts and applications of quantum computing systems below.

Quantum Computing System

In 1980, Paul Benioff envisioned the concept of a quantum Turing machine, i.e., the theoretical concept of a quantum computer. In 1982, Richard Feynman proposed the first practical application of a quantum computer: efficient simulations of quantum systems. In general, a quantum computer can be defined as a universal computing device that stores information in objects called quantum bits (or qubits) and transforms them by exploiting very specific properties from quantum mechanics.

Importantly, quantum computers are not intended to become general-purpose computers that operate by themselves. They will be highly specialized devices that can solve specific tasks much faster than classical computers. Operating quantum computers will most certainly require a classical computer for loading input/output data and retrieving results from computations, as well as for controlling the quantum computer’s electronic and internal processes. Thus, quantum computers and classical computers form a quantum computing system that enables quantum computers to perform quantum computing.

To depict the different layers of a quantum computing system, the model of Ding and Chong (2020) [2] is used. This allows us to analytically distinguish the key components of a quantum computing system and to illustrate its fundamental mechanisms and elements. Additionally, it builds on an analytical distinction of hardware, system software, and application, which is mirrored in conceptual views on computing architectures, e.g., cloud computing (Infrastructure-as-a-Service, Platform-as-a-Service, and Software-as-a-Service) or the layered modular architecture of digital technologies. Lastly, our expert informants distinguished between similar layers in their interview statements to explain the state of the art, the challenges for today’s organizations, and the functioning of quantum computing systems. The figure at the top of this section shows a quantum computing system consisting of a von Neumann architecture for classical computing and a quantum computer with its three-layer architecture, which we explain below.

Hardware Layer

One fundamental difference between classical and quantum computers is how information is stored. Whereas classical computers use bits, which can have the value of either zero or one, to store information, quantum computers use quantum bits (or qubits), which can hold any linear combination of zero and one simultaneously. Qubits leverage properties of quantum mechanics, in particular the effects of superposition and entanglement. More information about the properties of quantum mechanics as well as the approaches to physically representing and manipulating qubits can be found below, followed by a short illustrative sketch.

Quantum mechanics
  • Superposition
  • Entanglement

Approaches to physically represent and manipulate qubits
  • Analog quantum computing
  • Digital quantum computing
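
To make superposition and entanglement more tangible, the following minimal Python/NumPy sketch (our illustrative addition, not part of the cited research) represents qubit states as state vectors: a Hadamard gate puts a single qubit into superposition, and a Hadamard followed by a CNOT produces an entangled Bell state.

```python
# Illustrative sketch: qubit states as NumPy state vectors.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)            # |0>
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Superposition: H|0> = (|0> + |1>) / sqrt(2)
superposed = H @ ket0
print("superposition amplitudes:", superposed)

# Entanglement: H on the first qubit, then CNOT, yields the Bell state (|00> + |11>) / sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)
bell = CNOT @ np.kron(H, np.eye(2)) @ np.kron(ket0, ket0)
print("Bell state amplitudes:", bell)
```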

System Software Layer

The system software layer builds on top of the hardware layer and orchestrates the system’s processes to leverage the potential of the qubits (superposition and entanglement). This layer has to cope with the challenges of thermodynamically unstable quantum states: it actively reduces thermal noise within and around the quantum system and performs error correction procedures.

In quantum computing, there are many potential sources of noise. For example, quantum computers, and especially digital gate-based ones, are highly sensitive to changes in the environment, such as vibration, temperature fluctuations, etc. Noise can also be caused by imprecise control of the quantum hardware or by manufacturing defects. Most quantum computers even require their chips to be cooled down to a hundredth of a degree above absolute zero to operate. Since noise cannot be avoided, the current era of quantum computers is referred to as the era of Noisy Intermediate-Scale Quantum (NISQ) computers. This term implies that current quantum hardware with dozens of qubits has error rates that are too high and that need to be reduced before useful quantum computers with hundreds, or even thousands, of usable qubits can be built.

Noise in the environment can lead to qubit decoherence, i.e., environmental influences cause quantum states to change randomly. This is problematic, as a single error in a calculation usually causes the result to be incorrect, unless the error is corrected during the calculation. Since it is impossible to prevent every kind of noise, error correction is essential, and ongoing research on quantum error correction seeks to achieve system-level fault tolerance. Quantum error correction distinguishes between physical and logical qubits: a logical qubit is represented by a group of physical qubits that, loosely speaking, work together to detect and correct errors on the individual physical qubits. Such a group is less likely to cause an error in a calculation than a single physical qubit. Unfortunately, error-correcting mechanisms can introduce errors themselves. Depending on the error-correcting mechanism, typically five to nine physical qubits are required to obtain one almost error-free logical qubit.

A perfect physical qubit could serve as a logical qubit on its own, as it would require no error correction. Today, the biggest challenge is scaling up to thousands of qubits. Even though the computational space available for calculations doubles with every added qubit, this advantage currently cannot be exploited to its full capacity due to high error rates. One prominent example of the race to increase the number of qubits is IBM, which states that it wants to achieve over 1,000 qubits by 2023, while machines with 60-100 qubits are currently available.
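
The intuition behind encoding one logical qubit in several physical qubits can be illustrated with a classical analogue. The short Python sketch below (our addition; real quantum codes such as the surface code are far more involved) uses a 3-bit repetition code with independent flip errors and majority voting, showing how a group of noisy bits yields a much lower logical error rate than a single bit.

```python
# Illustrative sketch: a classical 3-bit repetition code with majority voting.
import random

def encoded_trial(p_error):
    physical = [1, 1, 1]                                            # one logical "1" stored in three bits
    physical = [b ^ (random.random() < p_error) for b in physical]  # independent flip errors
    return int(sum(physical) >= 2)                                  # majority vote corrects single flips

p, trials = 0.05, 100_000
plain_failures = sum(random.random() < p for _ in range(trials))
coded_failures = sum(encoded_trial(p) != 1 for _ in range(trials))
print("failure rate without code:", plain_failures / trials)    # roughly p = 0.05
print("failure rate with 3-bit code:", coded_failures / trials) # roughly 3*p^2, much smaller
```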

Application Layer

One of the main challenges of today’s quantum computers is the unsolved problem of efficient quantum memory. Several theoretical proposals exist for building quantum random access memory (QRAM). Even though it may be experimentally difficult to build (just like the quantum computer itself), recent publications have demonstrated several possible paths toward doing so.

Thus, there is currently no efficient way to store the states of qubits in a memory for a long time for use in later calculations. Therefore, data needs to be loaded from a classical computer onto the targeted quantum computer, and after the calculation the resulting states need to be read out (measured) by the classical computer before the qubits lose their information. Due to the no-cloning theorem, we are also unable to make copies of quantum states and use them for calculations. The only way to load a quantum state from quantum memory into a quantum program is by applying a SWAP operation, thereby removing it from the memory.
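
The following small NumPy sketch (our illustrative addition) shows what such a SWAP operation does on a state vector: the state of a “memory” qubit is moved onto a “program” qubit, while the memory qubit is left in the program qubit’s former state, so the state is transferred rather than copied, in line with the no-cloning theorem.

```python
# Illustrative sketch: SWAP moves a qubit state instead of copying it.
import numpy as np

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

psi = np.array([0.6, 0.8], dtype=complex)      # arbitrary state held by the "memory" qubit
blank = np.array([1, 0], dtype=complex)        # "program" qubit initialized to |0>

before = np.kron(psi, blank)                   # joint state: memory ⊗ program
after = SWAP @ before                          # program qubit now holds psi, memory qubit holds |0>
print(np.round(after, 3))                      # [0.6, 0.8, 0, 0] = |0> on memory, psi on program
```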

When a quantum state is measured, it collapses to either one or zero. A single measurement therefore reveals almost nothing about the state a qubit was in; the state can only be estimated approximately by preparing and measuring many identical copies of the same qubit. In some cases, reading in the classical data may dominate the cost of a quantum algorithm, so that the algorithm as a whole is not sped up at the macro level. Reading out the data exactly may likewise be infeasible, which rules out some computing tasks. This is especially the case for methods that need large data sets, such as machine learning and artificial intelligence.
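
A brief NumPy sketch (our addition for illustration) makes this concrete: a single measurement of a qubit returns just a 0 or a 1, whereas the measurement statistics over many identically prepared copies approximate the underlying probabilities given by the squared amplitudes (the Born rule).

```python
# Illustrative sketch: estimating measurement probabilities from many copies.
import numpy as np

rng = np.random.default_rng(0)
state = np.array([np.sqrt(0.3), np.sqrt(0.7)])   # amplitudes for |0> and |1>
probs = np.abs(state) ** 2                       # Born rule: [0.3, 0.7]

single_shot = rng.choice([0, 1], p=probs)        # one measurement: only a 0 or a 1
many_shots = rng.choice([0, 1], p=probs, size=10_000)
print("single measurement:", single_shot)
print("estimated P(1) from 10,000 copies:", many_shots.mean())  # close to 0.7
```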

Finding a useful algorithm for quantum computers is mostly about constructing it in such a way that the probability of measuring the desired outcome is maximized. Even though the output of the quantum computer may be an exponentially large number of solutions, we are usually just interested in a small subset of these solutions. Finding them without having to run the whole algorithm many times is the art of quantum algorithm construction. Here are three of the most important quantum algorithms.

Grover’s algorithm, also known as the quantum search algorithm, is used for searching an unstructured database or an unordered list. Classically, to find a particular item in a database of size N, we need to go through, on average, N/2 items before finding the right one. Using Grover’s algorithm, we can do this in only about √N steps. For a large N, this can be remarkably faster; this is called a quadratic speedup.
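
The following NumPy sketch (our illustrative addition) simulates Grover’s algorithm on a toy search space of N = 8 items: an oracle flips the sign of the marked item, a diffusion step amplifies its amplitude, and after about π/4·√N iterations the marked item is measured with high probability.

```python
# Illustrative sketch: Grover's algorithm as a small state-vector simulation.
import numpy as np

N, marked = 8, 5
state = np.full(N, 1 / np.sqrt(N))                   # uniform superposition over all items

oracle = np.eye(N)
oracle[marked, marked] = -1                          # flip the sign of the marked item

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)   # inversion about the mean

for _ in range(int(np.pi / 4 * np.sqrt(N))):         # 2 iterations for N = 8
    state = diffusion @ (oracle @ state)

probabilities = state ** 2
print("most likely item:", int(np.argmax(probabilities)))        # 5
print("success probability:", round(probabilities[marked], 3))   # about 0.95
```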

Shor’s algorithm, also known as the integer factorization algorithm, can factorize integers almost exponentially faster than the fastest known classical algorithm. Factorizing integers is very difficult computationally and is therefore also the basis of RSA encryption.
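
To illustrate where the quantum speedup enters, the Python sketch below (our addition) runs the classical skeleton of Shor’s algorithm for the toy case N = 15: the period of a^x mod N is found here by brute force, which is exactly the step a quantum computer performs exponentially faster, and the factors then follow from two greatest common divisors.

```python
# Illustrative sketch: the classical skeleton of Shor's algorithm for N = 15.
from math import gcd

N, a = 15, 7                                     # a is a randomly chosen base coprime to N

# Find the period r of f(x) = a^x mod N (the subroutine a quantum computer accelerates).
r = 1
while pow(a, r, N) != 1:
    r += 1
print("period r =", r)                           # r = 4 for a = 7, N = 15

# If r is even and a^(r/2) is not -1 mod N, the factors follow from gcds.
if r % 2 == 0 and pow(a, r // 2, N) != N - 1:
    print("factors:", gcd(pow(a, r // 2) - 1, N), gcd(pow(a, r // 2) + 1, N))   # 3 and 5
```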

HHL (Harrow Hassidim Lloyd) is also known as the quantum algorithm for linear systems of equations. The algorithm can estimate the result of a function of the solution x of a linear system (Ax = b), where A is a matrix and b a vector.
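
As a point of reference, the small NumPy sketch below (our addition) performs the corresponding classical computation: it solves Ax = b directly and then evaluates a scalar function of the solution. HHL does not output the full vector x; it prepares a quantum state proportional to x from which such scalar quantities (e.g., expectation values) can be estimated.

```python
# Illustrative sketch: the classical counterpart of what HHL estimates.
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])                       # a small Hermitian system matrix
b = np.array([1.0, 0.0])

x = np.linalg.solve(A, b)                        # classical solution of Ax = b
x_unit = x / np.linalg.norm(x)                   # analogue of the quantum state |x>

M = np.diag([1.0, -1.0])                         # an observable of interest
expectation = x_unit @ M @ x_unit                # the kind of scalar HHL can estimate
print("x =", x, " <x|M|x> =", round(expectation, 3))
```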

Application Areas

Thanks to the enormous progress in hardware, more and more established commercial companies are investing in quantum technology. Examples include Boehringer Ingelheim, which recently announced a research partnership with Google, Daimler, which announced progress in the field of materials research, and chemistry giants like BASF, which aims to stay at the forefront of chemistry research and business. Quantum computing offers three essential capabilities that address computational problems which current computers cannot handle, or can handle only partially, and that promise benefits for companies:

Problem type: Search and graph
Approach: Finding one or more optimal solution(s) to a complex problem. Often the problems involve a large number of possible parameter combinations.
Example use cases:
  • Find the optimal parameter configuration to optimize a portfolio in the finance industry
  • Search for possible routes to optimize traffic flow in transportation
  • Factorize large numbers into primes to break encryption in secure communication

Problem type: Algebraic
Approach: Calculating complex network architectures and the weights for machine learning and artificial intelligence. This involves transforming and calculating large matrices.
Example use cases:
  • Transform matrices to find objects in images in computer vision
  • Find patterns in texts to understand semantics in natural language processing

Problem type: Simulation
Approach: Calculating how the states of a system change when parameters are manipulated, in order to analyze the behavior of complex systems.
Example use cases:
  • Simulate the states of molecules and their changes to understand chemical reactions in the pharma industry
  • Simulate the behavior of materials to find more efficient materials in the battery industry

These capabilities determine the potential applications of this technology in numerous industries, such as finance, chemistry and pharma, manufacturing, energy, or cybersecurity.

Interested? Find out more about the individual application areas below.

The fact that a qubit can theoretically represent an infinite number of states allows for solving complex combinatorial optimization problems, which is one of the major application areas for current quantum computing technologies, such as D-Wave’s systems. Combinatorial optimization is the process of finding one or more optimal solutions to a problem. Examples of such problems include supply chain optimization, optimizing public transportation schedules and routes, package deliveries, etc. These solutions are searched for in a discrete (finite) but very large configuration space (i.e., a set of states). The set of possible solutions is defined by a number of constraints, and the goal is to find the solution that optimizes the objective function.

Since the problem spaces of certain complex problems are very large, it is extremely difficult to find the optimal, or even a single good, solution to these problems with classical computers in a reasonable time frame or with sufficient accuracy. Such combinatorial optimization problems often pose a great challenge for the private as well as the public sector. While they are often simple to describe, they turn out to be very difficult to solve. Combinatorial optimization problems may be divided into order, assignment, grouping, and selection problems, and within these classes, subclasses exist, such as the knapsack or the traveling salesman problem. In addition, there can be many qualitatively different solutions to a problem, and no known algorithm can compute these problems directly and efficiently. Searching very large problem spaces therefore requires an enormous amount of computing capacity and time.
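
To give a concrete flavor of how such problems are handed to current quantum hardware, the Python sketch below (our illustrative addition) writes a tiny Max-Cut instance as a QUBO (quadratic unconstrained binary optimization), the formulation used by quantum annealers such as D-Wave’s, and solves it by classical brute force; for realistic problem sizes, this exhaustive search over 2^n candidates is exactly what becomes infeasible.

```python
# Illustrative sketch: a tiny Max-Cut problem as a QUBO, solved by brute force.
import itertools
import numpy as np

edges = [(0, 1), (1, 2), (2, 3), (3, 0)]         # a 4-node ring graph
n = 4

# Max-Cut as a QUBO: minimize x^T Q x with binary variables x_i in {0, 1}.
Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 2                                 # the cut size equals -x^T Q x for this Q

best_x, best_value = None, float("inf")
for bits in itertools.product([0, 1], repeat=n): # exhaustive search over 2^n assignments
    x = np.array(bits)
    value = x @ Q @ x
    if value < best_value:
        best_x, best_value = x, value

print("best assignment:", best_x, "cut size:", int(-best_value))   # cut size 4
```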

Accordingly, quantum computers are expected to play a decisive role in the financial services industry. Players specializing in portfolio optimization and arbitrage in particular could benefit: from a very large pool of existing financial instruments, a subset must be selected so that a certain portfolio volume is achieved, while at the same time a large number of factors must be taken into account to minimize risk and achieve profitability. Further, Deutsche Börse (a German company that organizes marketplaces for the trading of shares) has already experimented with the applicability of quantum computing for a sensitivity analysis on one of its risk models, a computation that is too expensive to run on classical computers. Due to its suitability for solving optimization problems, another application of quantum computing is the optimization of flows, e.g., of traffic or goods. Collaborating with D-Wave Systems, VW has already shown in a pilot project how to optimize a simplified traffic flow in the city of Lisbon by leveraging quantum annealing technologies (Neukart et al., 2017; Yarkoni et al., 2019) – an effort that started in late 2016 with a proof-of-concept project, which investigated the readiness of quantum computing by building a traffic-flow optimization program that used the GPS coordinates of 418 taxis in Beijing to resolve traffic congestion.

Moreover, quantum computers are superior to classical computers in certain prime factorization procedures that play an important role in the secure encryption of data. A popular example of this is the aforementioned Shor (1994) algorithm, which factors a number into its prime factors, a process often used in cryptography and cybersecurity. A dataset encrypted with quantum technology would be practically impossible to decrypt with classical computing technology, at least within time periods relevant to human users. Conversely, it would be easy for a quantum computer to crack data encrypted with classical RSA technology – a phenomenon that may be coined the quantum threat.

The ability of quantum computing to accelerate optimization problems plays a crucial role for narrow AI approaches. Quantum computing can help to calculate complex network architectures and weights for machine learning and artificial intelligence; its advantage lies in transforming and calculating large matrices. For example, in the context of supervised learning, the model aims to minimize the error between its own predictions and the given outputs or labels.

Quantum computers offer several approaches to solving problems like this, thereby, again, accelerating calculation and allowing for more complex network architectures. They may be applicable to all relevant practices or sub-tasks of artificial intelligence, such as image processing and computer vision or natural language processing, as demonstrated in an experiment by Cambridge Quantum Computing. Having said that, it is important to note that, so far, no near-term machine learning algorithm with provable speedup has been found.
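
The kind of large-matrix computation referred to above can be illustrated with a short NumPy sketch (our addition): training a simple linear model by least squares reduces to solving a linear system over the data matrix, which is the class of operations that quantum linear-algebra routines target.

```python
# Illustrative sketch: least-squares training as a large-matrix linear-algebra problem.
import numpy as np

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))                    # 100 samples, 3 features
true_w = np.array([1.5, -2.0, 0.5])
y = X @ true_w + 0.1 * rng.normal(size=100)      # labels with a little noise

# Minimizing the squared prediction error leads to the normal equations (X^T X) w = X^T y,
# i.e., solving a linear system derived from the data matrix.
w = np.linalg.solve(X.T @ X, X.T @ y)
print("recovered weights:", np.round(w, 2))      # close to [1.5, -2.0, 0.5]
```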

A quantum computer has a fundamental advantage over classical computers: It can simulate other quantum systems (e.g., a nitrogen molecule) much more efficiently than any computer system available today. For classical computers, even molecules with comparatively low complexity represent an almost unsolvable task. In the 1980s, Richard Feynman theoretically substantiated the possibility of a quantum-based computer for simulating molecules. Since then, researchers have attempted to transfer the quantum system of a molecule into another quantum system, i.e., into the quantum computer, in order to simulate it. One new hope in the application of quantum computers is the simulation of more efficient catalysts for ammonia synthesis in the Haber-Bosch process, which today accounts for about 1 to 2 percent of global energy consumption. Better catalysts could reduce energy consumption and thus also help slow global warming. Even quantum computers without full error correction may already be better suited for this application than simulations on classical computers.
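
What “simulating a quantum system” means numerically can be sketched in a few lines of Python (our illustrative addition, using an arbitrary two-spin toy Hamiltonian): a classical simulator evolves a state vector under exp(-iHt), and because that vector has 2^n entries for n spins or qubits, memory and runtime explode for larger molecules, which is precisely where a quantum computer’s native representation helps.

```python
# Illustrative sketch: classical simulation of a two-spin quantum system.
import numpy as np
from scipy.linalg import expm

X = np.array([[0, 1], [1, 0]], dtype=complex)    # Pauli-X
Z = np.array([[1, 0], [0, -1]], dtype=complex)   # Pauli-Z
I = np.eye(2, dtype=complex)

# A simple Ising-type Hamiltonian for two interacting spins: H = Z⊗Z + 0.5 (X⊗I + I⊗X).
H = np.kron(Z, Z) + 0.5 * (np.kron(X, I) + np.kron(I, X))

state = np.zeros(4, dtype=complex)
state[0] = 1.0                                   # start in |00>

evolved = expm(-1j * H * 1.0) @ state            # exact evolution for t = 1 (4-dim here, 2^n in general)
print("outcome probabilities:", np.round(np.abs(evolved) ** 2, 3))
```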

Furthermore, the development of active ingredients and drugs is often a lengthy and very cost-intensive process. This is due in particular to the fact that a large number of substances have to be tested on a trial-and-error basis in the real world. Yet, building on the same principles of quantum physics, quantum computing may be able to virtually replicate these behaviors, such that simulation-based research may sooner or later replace this cost-intensive process.

For instance, BASF, pursuing its high requirements for the accuracy of quantum chemical calculations, investigated the applicability of quantum computing in collaboration with the company HQS. Specifically, they aimed at the quantum mechanical calculation of the energy profile of chemical reactions, as this allows the probable course of a reaction to be predicted (i.e., how the reaction proceeds, which products and by-products are formed, and how the reaction can be accelerated with the help of catalysts). The required methods reach the limits of conventional computing (Kühn et al., 2019). In addition, materials research on the functioning of batteries is deemed to inform today’s electromobility and is already being targeted by automotive giants such as VW.

Link to the Field of Information Systems

Even though there are high investments in quantum computing, most expert estimations still place the widespread industrial application of quantum computing at least five to ten years in the future, and its exact manifestations in many critical areas remain unclear. Thus, it is the task of today’s research community to creatively envision and explore the full potential and the socio-technological consequences of quantum computing. Therefore, based on an analysis of the existing literature and interviews with 21 leading experts from industry and research, we propose the following four initial directions for research on quantum computing in information systems:

Further research and development: Quantum computing ecosystems as a new networked business
Potential research questions:
  • Does access to quantum computing need to be regulated?
  • Does quantum computing need new sourcing strategies?
  • How does the emerging quantum computing ecosystem act as a spoke component to other industries and ecosystems?
  • Which transformations may result from the emergence of a quantum computing networked business?

Further research and development: Digital understanding as a foundation for use cases and ecosystems
Potential research questions:
  • What approaches could be used or developed to analyze business problems and thereby leverage the potential of quantum computing?
  • How can these problems be described mathematically?
  • What are possible design principles of artifacts to describe use cases?
  • How will quantum computing impact the modeling of a social and economic reality as a transformation from binary to multidimensional quantum states?

Further research and development: Quantum computing as a challenge for IT organizations and IT service providers
Potential research questions:
  • What are possible security approaches to protect legacy IT with old encryption standards considering the quantum threat?
  • Can quantum computers and artificial intelligence be used for real-time threat and anomaly detection?
  • How can quantum computers be used to simulate possible intrusions and cyberattacks for calculating risk–cost evaluations?
Further research and development: Skills needed to leverage quantum computing in the quantum computing field
Potential research questions:
  • How could information systems act in a mediating role for adopting quantum computing technologies?
  • Should quantum computing be included in the information systems curriculum?
  • How can future information systems managers be trained to be aware of the disruptive potential of quantum computing?
  • How can management leverage the potential of available techniques, approaches, and platforms around quantum computing?
  • How can gaps in knowledge and access to infrastructures be mitigated?

All of these directions take into account that quantum computing, despite its disruptive potential, will initially be an extension of computing capabilities for established electronic markets, ecosystems, and their participants (see figure below), while new ecosystem participants are already establishing themselves (e.g., IonQ or Rigetti).

Find out more about our proposed focus areas below.

The adoption and diffusion of quantum computing will heavily rely on an emerging ecosystem comprising technology providers, such as IBM, Google, Microsoft, or Amazon Web Services, start-ups with specific playgrounds, such as 1QBit or IonQ, as well as consulting firms and academic institutions that support customers in adopting and building applications using quantum computing technologies.

The European Union has also built its own ecosystem with the “Quantum Flagship”. Companies, providers, research institutions, and governments ultimately need to engage in such an ecosystem to get hold of capabilities that transcend their own organizational boundaries or even their entire industry (e.g., building their own computing infrastructures, translating business problems into mathematical and quantum problems, etc.). Due to this emerging new organizing logic and structure for quantum computing, several key aspects need to be considered when pursuing information systems research in this context.

First, the entrance barrier to quantum computing is expected to be very high due to multiple limitations, such as the necessary knowledge of quantum physics, the expense of building quantum computers, and the shortage of experts in the labor market. These limitations may enforce divides and limit access, and steps should be taken to reduce a possible quantum divide. Second, and consequently, incumbents will need to rely on the capabilities that technology providers, start-ups, consulting firms, or academic institutions may provide, as these might go beyond their own domain expertise. As such, prevailing networked businesses and ecosystems need to develop methods and technologies to purposefully connect their way of doing digital business with the emerging quantum computing ecosystem players in the different layers, namely the hardware layer (e.g., Amazon Web Services, IBM, and Google), the system layer (e.g., IonQ and Rigetti), and the application layer (e.g., Cambridge Quantum Computing or 1QBit).

Today, the playground is already diverse, with fuzzy boundaries, which creates a need for design-science-oriented guidance that helps incumbents assess their own business and technology maturity. For instance, IonQ and Rigetti are positioned on both the hardware and the system software layer. Additionally, it is important for companies to mediate their engagement with different players as part of their quantum computing road map.

The proliferation of quantum computing as a generative technology for calculating with an enormous speedup relies on a fundamental premise: the problem to be solved by a quantum computing approach needs to be replicated in the form of digital data, on the basis of which a calculation becomes possible in the first place. Emerging technologies such as machine learning already challenge today’s organizations in this respect.

The main reason is that it is complicated to digitally represent business practices and economic behavior in a way that allows for analysis. This phenomenon may be summarized as datafication. As such, the dematerialization of the physical world in the form of digital data is an essential prerequisite. Only with this prerequisite can quantum computing be used to perform calculations on the physical world based on its datafied digital representation.

Achieving an adequate digital representation of the respective quantum computing problem requires a mathematical and conceptual understanding that allows for assessing, understanding, and realizing the value of quantum computing relative to other computing approaches (e.g., high-performance computing). Furthermore, quantum computing may also serve as an enabler for process innovation; for example, it could be interesting for research areas around process mining, such as analyzing and optimizing process configurations or simulating contexts and configurations of processes.

Therefore, research on use case analysis, and in particular on methods for finding, describing, and analyzing use cases systematically and at scale, is highly relevant.

IT competencies are increasingly built up in business units using commercial IT services without the IT department being in the loop. Quantum computing drives this change even further, since for the next few decades quantum computers will likely only be available to most companies via the cloud.

IT departments are therefore under pressure regarding how to manage quantum computer usage in companies, especially with regard to transmitting the data needed for quantum-computing-based calculations. This is of particular interest since data preparation, including data input and output, might be the bottleneck for quantum computing in the long run.

Furthermore, quantum computing, and especially its ability to perform prime factorization, is a threat to current encryption standards and poses huge challenges for IT organizations. Even though new encryption techniques can be adopted once quantum computers become a real threat to current encryption protocols, past communications and old data could still be decrypted retroactively.

Historically, the role of information systems has been to bridge the gap between informatics and business. In the age of quantum computing, this role is becoming more important than ever before. In order to leverage the potential of quantum computing, at least three roles are required:

First, mathematical and quantum physical skills are needed to translate problems into mathematical formulas.
Second, domain expertise is needed to integrate the business problem within the mathematical formulation.
Third, an intermediary is needed to facilitate between both roles.

Due to the high complexity and high specialization of the job types (e.g., error correction specialist, quantum algorithm developer), the entrance barrier to the field of quantum computing is significantly higher than for regular “coding”. Additionally, there has been a shortage of STEM (Science, Technology, Engineering, and Mathematics) graduates for years, which may amplify the war for talent in quantum computing. Having said that, companies such as IBM and Google, as well as research institutions such as ETH, are working on developing programming languages and compilers that decide whether an application is suitable for a quantum computer. However, according to experts, this will take years.

Team

Dr. Roman Rietsche

University of St.Gallen (HSG)

Dr. Christian Dremel

Norwegian University of Science and Technology (NTNU)

Samuel Bosch

Massachusetts Institute of Technology (MIT)

Dr. Léa Steinacker

University of St.Gallen (HSG)

Prof. Dr. Miriam Meckel

University of St.Gallen (HSG)

Prof. Dr. Jan Marco Leimeister

University of St.Gallen (HSG)

References

  1. Hazan, E., Ménard, A., Patel, M., & Ostojic, I. (2020). The next tech revolution: Quantum computing. McKinsey & Company.
  2. Ding, Y., & Chong, F. T. (2020). Quantum computer systems: Research for noisy intermediate-scale quantum computers. Synthesis Lectures on Computer Architecture. Morgan & Claypool.

The information used on this website to illustrate the fundamental concept of a quantum computing system as well as its application areas and link to the field of information systems is based on publications from various scholars and institutions as well as expert interviews and our research. All references are available in our research paper.