Which material system or technology is emerging as the most likely platform for quantum computing?
Due to the many competing requirements, such as maximum operating temperature, the ability to implement a sufficient number of qubits, integration of the required control electronics, and commercial availability of the technology, is a clear winner emerging among the many options? In particular, how likely is it that a semiconductor technology, supported by huge investment, will become the dominant platform?
Given an N qubit circuit which cannot be classically simulated, and a certain quantum computing device with a set of specifications (e.g. connectivity map, gate fidelities, gate times, T1, T2, noise models, compiler/transpiler used etc.) find a *scalable* method to estimate the fidelity of the circuit output (with respect to the ideal output).
Current quantum computers are not fault tolerant, and as a result any circuit implemented on a given machine is likely to produce a result that diverges in some way from the ideal output distribution. One major problem is to estimate the performance of a given device. If the device is small enough (e.g. under 20 qubits), or if the circuit is simple enough, it is sometimes possible to estimate performance using brute-force simulation. However, one aim for industry is to build devices that can run circuits that cannot be simulated classically.
A particular problem is to design a near-term device that can outperform classical computers on some meaningful task. One way to approach this problem is to try to estimate the device's ability to run certain circuits, but current methods for making such estimates are very rough, giving upper and lower bounds on performance that are not sufficiently tight.
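A sketch of the crudest estimate of this kind (whose looseness is exactly what the problem statement complains about): multiply average gate fidelities and a per-qubit decoherence factor. All device parameters below are illustrative, not taken from any real machine.

```python
import numpy as np

def estimate_circuit_fidelity(n_1q, n_2q, n_qubits, circuit_time,
                              f_1q=0.9995, f_2q=0.99, t1=100e-6, t2=80e-6):
    """Crude multiplicative estimate: product of average gate fidelities
    times an exponential decoherence factor per qubit. All parameter
    values are illustrative placeholders, not real device specs."""
    gate_term = (f_1q ** n_1q) * (f_2q ** n_2q)
    decoherence = np.exp(-circuit_time * (1.0 / t1 + 1.0 / t2)) ** n_qubits
    return gate_term * decoherence

# A hypothetical 5-qubit circuit: 40 one-qubit gates, 20 two-qubit gates,
# total runtime 20 microseconds.
est = estimate_circuit_fidelity(40, 20, 5, 20e-6)
```

Such a product bound ignores error cancellation, coherent errors, and the compiler's gate placement, which is why it is usually far from tight on real hardware.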
How can we efficiently synthesize quantum circuits for systems with only partially global control?
For some applications, it is not easy to address every qubit individually. In one of our specific applications, we have individual control over a subset of our qubits and only global control over the rest. We need to synthesize circuits for applications on this system, but the tools and research available on this subject do not support this control model.
Please explain the concept of weak measurements. We know it won't apply directly to our problem, but we might be able to adapt some of the concepts behind it to the problem stated below. In general, we are looking for a way to measure fast, very weak signals, but we need to measure 3 million of them at the same time. Hence, we need a scalable solution.
For a confidential application, we need to detect signals which are fast (4 ms FWHM time dynamics) and have an extremely low signal-to-noise ratio (a 0.1% change in signal amplitude relative to the averaged noise). The signal is optical and the result of fluorescence, hence not coherent. While detecting one signal with these characteristics is a solved technological problem, we need to measure 3 million of them at the same time.
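For background, the standard trick for weak repetitive signals is averaging plus matched filtering, where the output SNR improves as the square root of the number of averaged repetitions. A minimal numpy sketch, with waveform numbers loosely modeled on the 4 ms FWHM / 0.1% amplitude figures above (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(-10e-3, 10e-3, 400)                        # 20 ms window, seconds
pulse = 1e-3 * np.exp(-4 * np.log(2) * (t / 4e-3) ** 2)    # 4 ms FWHM, 0.1% amplitude
noise_sigma = 1.0                                          # per-sample noise level

def matched_filter_snr(trace, template, sigma):
    """Correlate a trace with the known (unit-norm) pulse template;
    the output SNR is the correlation divided by the noise level."""
    template = template / np.linalg.norm(template)
    return float(np.dot(trace, template)) / sigma

def averaged_snr(n_reps):
    """Average n_reps noisy repetitions; the effective noise level
    shrinks as 1/sqrt(n_reps), so SNR grows as sqrt(n_reps)."""
    traces = pulse + rng.normal(0.0, noise_sigma, size=(n_reps, t.size))
    return matched_filter_snr(traces.mean(axis=0), pulse,
                              noise_sigma / np.sqrt(n_reps))
```

The hard part of the stated problem is not this signal processing but doing it on 3 million channels in parallel, which pushes the question toward detector-array and readout-electronics scalability.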
If we have partial access to the solution of an Ising model, can we reconstruct the interaction terms in the model? Will the J_ij be unique? What other models could potentially apply?
We need to understand whether the Ising model can be applied to neuronal interactions, to see if any single parameter differs sufficiently between the neuronal activity of a healthy brain and that of a diseased brain.
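On the reconstruction question: in the weak-coupling regime, the naive mean-field inversion J ≈ -C⁻¹ (off-diagonal part), with C the connected correlation matrix of the recorded spins, gives a first estimate of the couplings. A sketch on synthetic two-spin data (all numbers illustrative; more accurate reconstructions typically use pseudo-likelihood methods):

```python
import numpy as np

def mean_field_couplings(samples):
    """Naive mean-field inverse Ising: J_ij ~ -(C^-1)_ij for i != j,
    where C is the connected correlation matrix of the +/-1 spins."""
    C = np.cov(samples, rowvar=False)
    J = -np.linalg.inv(C)
    np.fill_diagonal(J, 0.0)
    return J

# Synthetic data: two spins with ferromagnetic coupling J12 = 0.5,
# sampled exactly from the Boltzmann weights of H = -J12 * s1 * s2.
rng = np.random.default_rng(1)
J12 = 0.5
states = np.array([[1, 1], [1, -1], [-1, 1], [-1, -1]])
weights = np.exp(J12 * states[:, 0] * states[:, 1])
samples = states[rng.choice(4, size=20000, p=weights / weights.sum())]

J_est = mean_field_couplings(samples)   # off-diagonal entries estimate J12
```

With only *partial* access to the spins, the couplings among observed units absorb effects of the hidden ones, so uniqueness of the recovered J_ij is not guaranteed in general.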
What are the most impactful research questions within the field of quantum information science that could be unblocked by a 100-1000x increase in simulation speed (for quantum circuit simulation or other computational workflows in quantum information science)?
NVIDIA cuQuantum is an SDK used to accelerate quantum circuit simulation on GPUs.
What new kinds of applications and algorithms could one study if one could simulate thousands of qubits, with the constraint that the quantum circuits under study must be relatively shallow?
NVIDIA cuQuantum contains a tensor network library that can be used to efficiently simulate thousands of qubits with relatively low levels of entanglement on a supercomputing system.
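To illustrate the contraction style such a library accelerates, here is a CPU-only numpy sketch that evaluates a tiny shallow circuit (H then CNOT on |00⟩) as a tensor network; numpy's einsum stands in for the GPU-accelerated contraction, and the same subscript-based approach generalizes to much larger shallow circuits:

```python
import numpy as np

zero = np.array([1.0, 0.0])                      # |0> as a rank-1 tensor
H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)

# CNOT as a rank-4 tensor: indices (out_control, out_target, in_control, in_target)
CNOT = np.zeros((2, 2, 2, 2))
for c in (0, 1):
    for tgt in (0, 1):
        CNOT[c, c ^ tgt, c, tgt] = 1.0

# Contract the network |psi> = CNOT (H x I) |00>; each einsum index is a wire.
psi = np.einsum('abcd,ce,e,d->ab', CNOT, H, zero, zero)
```

The result is the Bell state (|00⟩ + |11⟩)/√2. The key point is that for shallow, low-entanglement circuits the contraction never materializes the full 2^n statevector.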
What simulation tools and features are lacking in the QC frameworks available today that would enable more efficient computational workflows in quantum computing? What are the biggest computational pain points and bottlenecks? Which problems leave you waiting the longest for a simulation to run?
We’d love to understand what kinds of simulation tools and features would be useful to further enable the quantum ecosystem.
How does Shor’s algorithm for discrete logarithms change when the solution is promised to lie within a given range?
To benchmark the progress of quantum computers at breaking the public-key cryptography of cryptocurrencies, we designed challenges directly on blockchains with a gradation of difficulties. It is known that a quantum computer can break elliptic curve cryptography in O(n^3) time using Shor’s algorithm for discrete logarithms, where “n” is the number of bits used to represent the prime “p” that characterizes the elliptic curve used for signatures. One of our sets of challenges parametrizes the difficulty by promising that the private key is smaller than some scalar “d”. Is there a modification of Shor’s algorithm that scales as a polynomial function of the number of bits of “d”? Otherwise, can it be shown that the runtime of Shor’s discrete logarithm algorithm depends only on the bit-size of “p”?
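For classical context on why the range promise matters: a discrete logarithm known to lie in [0, d) can be found classically in O(√d) group operations with baby-step/giant-step (or Pollard's kangaroo), independent of the size of p, so the challenge difficulty already scales with d classically. (On the quantum side, the literature on short discrete logarithms, e.g. the work of Ekerå and Håstad, is the place to look for the poly(log d) question.) A sketch over a small multiplicative group with toy parameters; the actual challenges use elliptic-curve groups, where the same idea applies with point additions:

```python
from math import isqrt

def bsgs_bounded(g, h, p, d):
    """Find x in [0, d) with g^x = h (mod p) using baby-step/giant-step:
    O(sqrt(d)) time and memory. Returns None if no such x exists."""
    m = isqrt(d) + 1
    baby = {pow(g, j, p): j for j in range(m)}   # baby steps g^j
    giant = pow(g, -m, p)                        # g^(-m) mod p (Python 3.8+)
    gamma = h % p
    for i in range(m):
        if gamma in baby:                        # h * g^(-i*m) == g^j
            return i * m + baby[gamma]
        gamma = (gamma * giant) % p
    return None

# Toy example: p = 101, generator g = 2, secret key x = 37, promise d = 50.
x = bsgs_bounded(2, pow(2, 37, 101), 101, 50)
```

The O(√d) memory can be traded away with Pollard's kangaroo at the same asymptotic time cost.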
Ising solver – Current quantum Ising solvers are limited in terms of scale and quality. Quantum-inspired approaches (Ising solvers based on digital annealers) offer scale at a certain level with acceptable quality, but at high cost. This project aims to develop a scalable, low-cost alternative Ising solver to handle non-convex/NP-hard optimization problems.
We have a number of customers that use our platform to solve optimization-type problems. We plan to build our own Ising solvers to augment our backend infrastructure and provide significant cost savings. This is critical, as many tasks require large-volume processing over long timescales, rendering the per-hour rate models currently offered by market players unsustainable in the long term. Any technology can be adopted – software acceleration, massive parallelization, light pulses/photonics, or other methods – as long as the solution operates at room temperature. The task is to build this solver.
We are open to building the Ising solvers using open-source software based on this article (see link below).
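As a room-temperature, software-only baseline for such a solver (the simplest point in the "software acceleration" design space), here is a minimal single-spin-flip simulated-annealing sketch; the schedule and parameters are illustrative:

```python
import numpy as np

def anneal_ising(J, h, sweeps=2000, t_hot=2.0, t_cold=0.01, seed=None):
    """Single-spin-flip simulated annealing for the Ising energy
    E(s) = -1/2 s.J.s - h.s, with J symmetric and zero-diagonal."""
    rng = np.random.default_rng(seed)
    n = len(h)
    s = rng.choice([-1, 1], size=n)
    for temp in np.geomspace(t_hot, t_cold, sweeps):
        for i in rng.permutation(n):
            delta_e = 2.0 * s[i] * (J[i] @ s + h[i])   # cost of flipping spin i
            if delta_e <= 0 or rng.random() < np.exp(-delta_e / temp):
                s[i] = -s[i]
    # A production solver would track the best state seen, not just the last.
    return s

def ising_energy(J, h, s):
    return -0.5 * s @ J @ s - h @ s

# Toy instance: two ferromagnetically coupled spins; the ground states are
# s = (+1, +1) and (-1, -1), both with energy -1.
J = np.array([[0.0, 1.0], [1.0, 0.0]])
h = np.zeros(2)
s = anneal_ising(J, h, seed=0)
```

This is of course not competitive with dedicated digital annealers, but it gives a cost and quality floor against which any candidate technology can be benchmarked.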
Medical image restoration – DICOM “for processing” (raw) images are used to create the processed images used for breast cancer diagnosis; the “for processing” images are then discarded, but may be required for any ongoing diagnostics at a future date. The goal of this project is to restore the “for processing” image by exploiting quantum computing technology together with state-of-the-art ML algorithms. The outcome of this project could be applied to other image-restoration processes.
Raw files of medical images are usually discarded after they have been processed (as physicians typically use processed images for diagnosis). However, there may be occasions when the raw images are desired. It is also not feasible for hospitals to store an ever-increasing volume of raw medical image files.
Can quantum computers demonstrate practical computational advantage with fewer than 1 million physical qubits?
So far, rigorous proofs of quantum advantage or speedup have been carried out for only a few quantum computing algorithms, including Shor’s, Grover’s, and Gaussian boson sampling. Useful applications of Shor’s and Grover’s algorithms are well known, but implementing them on today’s hardware platforms will require hundreds of thousands or millions of qubits. Will hardware improve enough to run these algorithms with one-tenth or one-hundredth as many qubits?
Alternatively, are there other algorithms or approaches (e.g. QML) that might demonstrate practical speedups with far fewer qubits?
Is it possible to resolve the number of photons in an optical pulse at telecom wavelengths, using only room-temperature equipment?
Single-photon, or photon-number-resolving (PNR), detection at wavelengths between 1300-1600 nm has been demonstrated using SNSPDs (superconducting nanowire single-photon detectors) and tungsten TES (transition-edge sensor) detectors. Both require cryogenic cooling. Are there any other technologies, in earlier stages of development, that could potentially achieve similar performance while operating at room temperature?
What existing Canadian industries will be the first to adopt quantum technologies?
Finding users or adopters of quantum technologies in Canada is a challenge, given the limited range of industries with R&D activities in Canada and the concentration of resource-based and “low-technology” industries. Canada has a great deal of quantum technology development. Which domestic industries could become early adopters, rather than the sector relying on exports to foreign customers as its first buyers?