A new quantum computational set-up for algebraic topology via simplicial sets

Type: Article
Publication Date: 2025-07-01
Citations: 0

Abstract

In this paper, a quantum computational framework for algebraic topology based on simplicial set theory is presented. This extends previous work, which was limited to simplicial complexes and aimed mostly at topological data analysis. The proposed set-up applies to any parafinite simplicial set and proceeds by associating with it a finite dimensional simplicial Hilbert space, whose simplicial operator structure is studied in some depth. It is shown in particular how the problem of determining the simplicial set’s homology can be solved within the simplicial Hilbert framework. Further, the conditions under which simplicial set theoretic algorithms can be implemented in a quantum computational setting with finite resources are examined. Finally, a quantum algorithmic scheme capable of computing the simplicial homology spaces and Betti numbers of a simplicial set by combining a number of basic quantum algorithms is outlined.

Locations

  • Journal of AppliedMath
  • arXiv (Cornell University)

Summary

This paper presents a novel quantum computational framework for algebraic topology, specifically designed to compute topological invariants like homology and Betti numbers using the more general and powerful combinatorial structures known as simplicial sets. This work significantly extends prior quantum algorithms in topological data analysis, which primarily focused on simplicial complexes.

The significance of this development lies in addressing the inherent limitations of simplicial complexes for modeling complex topological spaces. Simplicial complexes struggle with operations like products and quotients, and cannot naturally incorporate “degenerate” simplices (those whose formal dimension is higher than their effective one). Simplicial sets, by contrast, offer a leaner, more versatile, and combinatorially richer approach, allowing for more compact representations of topological spaces and a broader range of applications, including those beyond pure topology like category theory or directed graphs. By adapting quantum computational methods to this more general setting, the paper paves the way for potentially more efficient and broadly applicable quantum algorithms for topological analysis.

The key innovations introduced in this paper are:

  1. Simplicial Hilbert Space Encoding: A parafinite simplicial set (one with a finite number of simplices at each dimension) is systematically encoded into a finite-dimensional simplicial Hilbert space. This involves mapping each simplex to an orthonormal basis vector (realizable as a computational basis state of a qubit register), and the simplicial face and degeneracy maps to corresponding linear operators acting on these basis vectors.
  2. Introduction of Adjoint Operators and Simplicial Hilbert Hodge Laplacians: A critical step is the natural incorporation of adjoint operators for the face and degeneracy maps, leveraging the Hilbert space structure. These adjoints, alongside the original operators, enable the definition of various Simplicial Hilbert Hodge Laplacians. Inspired by classical Hodge theory, the kernels of these Laplacians are shown to directly encode the homology of the underlying simplicial set.
  3. Normalized Simplicial Hilbert Homology: The framework distinguishes between “raw” and “normalized” simplicial Hilbert homology. Crucially, it demonstrates that the normalized homology, which effectively discards the homologically irrelevant degenerate simplices, is isomorphic to the standard simplicial homology of the space. This normalization is vital for computational efficiency, as it focuses resources on the essential non-degenerate components.
  4. Formalization of Simplicial Quantum Circuits: The paper defines a new class of simplicial quantum circuits, which are unitary operators whose actions are explicitly compatible with the inherent structure and relations of simplicial sets. This provides a mathematical blueprint for how a quantum computer could intrinsically perform operations on simplicial data.
  5. Algorithmic Scheme for Homology Computation: An outline is provided for a quantum algorithmic scheme that can compute the simplicial homology spaces and Betti numbers. This scheme combines established quantum algorithms, notably Grover’s search algorithm (for projecting onto relevant subspaces and identifying non-degenerate simplices) and Abrams’ and Lloyd’s algorithm (for finding eigenvalues and eigenvectors, which translates to finding the kernel of the Laplacian).
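
As a concrete illustration of points 1–3, the Laplacian-kernel route to homology can be sketched classically with NumPy. The example below is a minimal sketch only: it uses an ordinary simplicial complex (a hollow triangle, i.e. a combinatorial circle) rather than a full simplicial set with degeneracy maps, so it illustrates the Hodge-theoretic idea, not the paper's simplicial Hilbert construction itself.

```python
import numpy as np

# Boundary matrix d1: rows are vertices 0,1,2; columns are oriented edges
# (0,1), (0,2), (1,2), with entry -1 at the tail and +1 at the head.
d1 = np.array([[-1, -1,  0],
               [ 1,  0, -1],
               [ 0,  1,  1]], dtype=float)

# Hodge Laplacians; there are no 2-simplices, so L1 has no d2 contribution.
L0 = d1 @ d1.T
L1 = d1.T @ d1

def betti(L, tol=1e-9):
    """Betti number = dimension of the Laplacian's kernel (Hodge theory)."""
    return int(np.sum(np.abs(np.linalg.eigvalsh(L)) < tol))

print(betti(L0), betti(L1))  # 1 connected component, 1 one-dimensional hole
```

Replacing the classical eigensolver with phase estimation on a suitably encoded Laplacian is, in spirit, the quantum version of this computation; the paper's normalized construction additionally discards the degenerate simplices first.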

The main prior ingredients upon which this work builds include:

  1. Algebraic Topology Fundamentals: A deep understanding of simplicial sets, chain complexes, boundary operators, and homology theory is foundational. The paper explicitly leverages established results such as the Eilenberg-Zilber lemma and normalization theorems from classical algebraic topology.
  2. Linear Algebra and Functional Analysis: Concepts of Hilbert spaces, linear operators, adjoints, kernels, and orthogonal projections are central to encoding the topological structures into a quantum mechanical setting.
  3. Finite-Dimensional Hodge Theory: The idea of computing homology groups as the kernels of Laplacian operators is directly borrowed from Hodge theory, which provides a powerful analytical tool for this purpose in finite-dimensional settings.
  4. Prior Quantum Topological Data Analysis (QTDA): The work is directly inspired by and builds upon seminal research, particularly that of Lloyd et al. [19], which first demonstrated quantum algorithms for computing homology in simplicial complexes. This paper’s core contribution is extending these quantum techniques to the more expressive and complex realm of simplicial sets.
  5. Standard Quantum Algorithms: The proposed computational scheme relies on well-known quantum algorithms such as Grover’s quantum search (for amplitude amplification and state preparation) and Abrams’ and Lloyd’s quantum algorithm for finding eigenvalues and eigenvectors (a variant of quantum phase estimation), demonstrating how these general-purpose tools can be specialized for topological problems.
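
The role Grover search plays in the scheme — amplifying amplitude on a marked subset of basis states, such as those labelling non-degenerate simplices — can be mimicked with a small state-vector simulation. The marked index set below is an arbitrary illustrative stand-in, not derived from any particular simplicial set.

```python
import numpy as np

# State-vector simulation of Grover search over a 4-qubit register.
# 'marked' stands in for the basis states one wants to project onto
# (e.g. non-degenerate simplices); the choice {3, 12} is illustrative only.
n_qubits = 4
N = 2 ** n_qubits
marked = [3, 12]

state = np.full(N, 1 / np.sqrt(N))        # uniform superposition

phase = np.ones(N)
phase[marked] = -1.0                      # oracle: flip the sign of marked states

n_iter = int(round((np.pi / 4) * np.sqrt(N / len(marked))))
for _ in range(n_iter):
    state = phase * state                 # oracle reflection
    state = 2 * state.mean() - state      # inversion about the mean (diffusion)

p_marked = float(np.sum(state[marked] ** 2))
print(f"success probability after {n_iter} iterations: {p_marked:.3f}")
```

With these parameters, two Grover iterations concentrate well over 90% of the probability on the marked states, against a 1/8 chance for naive sampling.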

References

Topological data analysis has emerged as a powerful tool for analyzing large-scale data. High-dimensional data form an abstract simplicial complex, and by using tools from homology, topological features can be identified. Given a simplicial complex, an important set of features is the so-called Betti numbers. Calculating Betti numbers classically is a daunting task due to the massive volume of data and its possibly high dimension. While most known quantum algorithms for estimating Betti numbers rely on homology, here we consider the 'dual' approach, which is inspired by Hodge theory and de Rham cohomology, combined with recent advanced techniques in quantum algorithms. Our cohomology method offers a relatively simpler, yet more natural framework that requires exponentially fewer qubits than the known homology-based quantum algorithms. Furthermore, our algorithm can calculate the $r$-th Betti number $\beta_r$ up to some multiplicative error $\delta$ with running time $\mathcal{O}\big( \log(c_r) c_r^2 / (c_r - \beta_r)^2 \delta^2 \big)$, where $c_r$ is the number of $r$-simplices. It thus works best when the $r$-th Betti number is considerably smaller than the number of $r$-simplices in the given triangulated manifold.
Topological data analysis (TDA) is an emergent field of data analysis. The critical step of TDA is computing the persistent Betti numbers. Existing classical algorithms for TDA are limited if we want to learn from high-dimensional topological features because the number of high-dimensional simplices grows exponentially in the size of the data. In the context of quantum computation, it has been previously shown that there exists an efficient quantum algorithm for estimating the Betti numbers even in high dimensions. However, the Betti numbers are less general than the persistent Betti numbers, and there have been no quantum algorithms that can estimate the persistent Betti numbers of arbitrary dimensions. This paper shows the first quantum algorithm that can estimate the (normalized) persistent Betti numbers of arbitrary dimensions. Our algorithm is efficient for simplicial complexes such as the Vietoris-Rips complex and demonstrates exponential speedup over the known classical algorithms.
Quantum walks have emerged as a transformative paradigm in quantum information processing and can be applied to various graph problems. This study explores discrete-time quantum walks on simplicial complexes, a higher-order generalization of graph structures. Simplicial complexes, encoding higher-order interactions through simplices, offer a richer topological representation of complex systems. Leveraging algebraic topology and discrete-time quantum walks, we present a quantum walk algorithm for detecting higher-order community structures called simplicial communities. We utilize the Fourier coin to produce entangled translation states among adjacent simplices in a simplicial complex. The potential of our quantum algorithm is tested on Zachary's karate club network. This study may contribute to understanding complex systems at the intersection of algebraic topology and quantum algorithms.
Topological data analysis (TDA) has emerged as a powerful tool for extracting meaningful insights from complex data. TDA enhances the analysis of objects by embedding them into a simplicial complex and extracting useful global properties such as the Betti numbers, i.e. the number of multidimensional holes, which can be used to define kernel methods that are easily integrated with existing machine-learning algorithms. These kernel methods have found broad applications, as they rely on powerful mathematical frameworks which provide theoretical guarantees on their performance. However, the computation of higher-dimensional Betti numbers can be prohibitively expensive on classical hardware, while quantum algorithms can approximate them in polynomial time in the instance size. In this work, we propose a quantum approach to defining topological kernels, which is based on constructing Betti curves, i.e. topological fingerprints of filtrations with increasing order. We exhibit a working prototype of our approach implemented on a noiseless simulator and show its robustness by means of some empirical results suggesting that topological approaches may offer an advantage in quantum machine learning.
Topological data analysis (TDA) based on persistent homology is an emergent field of data analysis. The critical step of TDA is computing the persistent Betti numbers. Existing classical algorithms for TDA have been limited because the number of simplices can grow exponentially in the size of data for higher-dimensional analysis. In the context of quantum computation, it has been previously shown that there exists an efficient quantum algorithm for estimating the Betti numbers even in high dimensions. However, the Betti numbers are less general than the persistent Betti numbers, and there have been no quantum algorithms that can estimate the persistent Betti numbers of arbitrary dimensions. This paper shows the first quantum algorithm that can estimate the persistent Betti numbers of arbitrary dimensions. Our algorithm is efficient for the construction of simplicial complexes such as the Vietoris-Rips complex and demonstrates exponential speedup over the known classical algorithms.
Incorporating higher-order interactions in information processing enables us to build more accurate models, gain deeper insights into complex systems, and address real-world challenges more effectively. However, existing methods, such as random walks on oriented simplices and homology, which capture these interactions, are not known to be efficient. This work investigates whether quantum walks on simplicial complexes exhibit quantum advantages. We introduce a novel quantum walk that encodes the combinatorial Laplacian, a key mathematical object whose spectral properties reflect the topology of the underlying simplicial complex. Furthermore, we construct a unitary encoding that projects onto the kernel of the Laplacian, representing the space of harmonic cycles in the complex's homology. Combined with the efficient construction of quantum walk unitaries for clique complexes that we present, this paves the way for utilizing quantum walks to explore higher-order interactions within topological structures. Our results achieve superpolynomial quantum speedup with quantum walks without relying on quantum oracles for large datasets. Crucially, the walk operates on a state space encompassing both positively and negatively oriented simplices, effectively doubling its size compared to unoriented approaches. Through coherent interference of these paired simplices, we are able to successfully encode the combinatorial Laplacian, which would otherwise be impossible. This observation constitutes our major technical contribution. We also extend the framework by constructing variant quantum walks.
These variants enable us to: (1) estimate the normalized persistent Betti numbers, capturing topological information throughout a deformation process, and (2) verify a specific QMA$_1$-hard problem, showcasing potential applications in computational complexity theory.
In this paper, we propose an extension of quantum searches on graphs driven by quantum walks to simplicial complexes. To this end, we newly define a quantum walk on simplicial complexes as an alternative to those of preceding studies by the authors. We show that the quantum search on the specific simplicial complex corresponding to the triangulation of the $n$-dimensional unit square driven by this new simplicial quantum walk works well; namely, a marked simplex can be found with probability $1+o(1)$ within time $O(\sqrt{N})$, where $N$ is the number of simplices of the same dimension as the marked simplex.
Quantum algorithms for topological data analysis (TDA) seem to provide an exponential advantage over the best classical approach while remaining immune to dequantization procedures and the data-loading problem. In this paper, we give complexity-theoretic evidence that the central task of TDA -- estimating Betti numbers -- is intractable even for quantum computers. Specifically, we prove that the problem of computing Betti numbers exactly is #P-hard, while the problem of approximating Betti numbers up to multiplicative error is NP-hard. Moreover, both problems retain their hardness if restricted to the regime where quantum algorithms for TDA perform best. Because quantum computers are not expected to solve #P-hard or NP-hard problems in subexponential time, our results imply that quantum algorithms for TDA offer only a polynomial advantage in the worst case. We support our claim by showing that the seminal quantum algorithm for TDA developed by Lloyd, Garnerone and Zanardi achieves a quadratic speedup over the best known classical approach in asymptotically almost all cases. Finally, we argue that an exponential quantum advantage can be recovered if the input data is given as a specification of simplices rather than as a list of vertices and edges.
Topological data analysis (TDA) is a fast-growing field that utilizes advanced tools from topology to analyze large-scale data. A central problem in topological data analysis is estimating the so-called Betti numbers of the underlying simplicial complex. While the difficulty of this problem has been established as NP-hard, previous works have showcased appealing quantum speedup. In this article, we provide an alternative method for estimating the Betti numbers of a given simplicial complex, based on some recent results on quantum algorithms. Our method can be faster than the best-known classical method for finding Betti numbers, and interestingly, it can also find the Betti numbers of the complement graph of our original one.
Present-day quantum field theory can be regularized by a decomposition into quantum simplices. This replaces the infinite-dimensional Hilbert space by a high-dimensional spinor space and singular canonical Lie groups by regular spin groups. It radically changes the uncertainty principle for small distances. Gaugeons, including the gravitational, are represented as bound fermion-pairs, and space-time curvature as a singular organized limit of quantum non-commutativity. Keywords: Quantum logic, quantum set theory, quantum gravity, quantum topology, simplicial quantization.
We tackle the long-standing question of the computational complexity of determining homology groups of simplicial complexes, a fundamental task in computational topology, posed by Kaibel and Pfetsch 20 years ago. We show that this decision problem is QMA$_1$-hard. Moreover, we show that a version of the problem satisfying a suitable promise and certain constraints is contained in QMA. This suggests that the seemingly classical problem may in fact be quantum mechanical. In fact, we are able to significantly strengthen this by showing that the problem remains QMA$_1$-hard in the case of clique complexes, a family of simplicial complexes specified by a graph which is relevant to the problem of topological data analysis. The proof combines a number of techniques from Hamiltonian complexity and homological algebra. We discuss potential implications for the problem of quantum advantage in topological data analysis.
The notion of a simplicial set originated in algebraic topology, and has also been utilized extensively in category theory, but until relatively recently was not used outside of those fields. However, with the increasing prominence of higher categorical methods in a wide range of applications, it is important for researchers in a range of fields to have a good working knowledge of them. This paper is intended as an introduction to simplicial sets, both as an overview of their development from other concepts, and as a user's guide for someone wanting to read modern literature that makes use of them.
Predicting and analyzing global behaviour of complex systems is challenging due to the intricate nature of their component interactions. Recent work has started modelling complex systems using networks endowed with multiway interactions among nodes, known as higher-order networks. In this context, simplicial complexes are a class of higher-order networks that have received significant attention due to their topological structure and connections to Hodge theory. Topological signal processing (TSP) utilizes these connections to analyze and manipulate signals defined on non-Euclidean domains such as simplicial complexes. In this work, we present a general quantum algorithm for implementing filtering processes in TSP and describe its application to extracting network data based on the Hodge decomposition. We leverage pre-existing tools introduced in recent quantum algorithms for topological data analysis and combine them with spectral filtering techniques using the quantum singular value transformation framework. While this paper serves as a proof-of-concept, we obtain a super-polynomial improvement over the best known classical algorithms for TSP filtering processes, modulo some important caveats about encoding and retrieving the data from a quantum state. The proposed algorithm generalizes the applicability of tools from quantum topological data analysis to novel applications in analyzing high-dimensional complex systems.
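
The Hodge decomposition underlying this kind of filtering can be illustrated classically on a toy complex. The complex, orientations, and edge signal below are illustrative assumptions, not taken from the paper; the projections split an edge signal into gradient, curl, and harmonic components.

```python
import numpy as np

# Toy complex: vertices 0..3; edges (0,1), (0,2), (1,2), (1,3), (2,3);
# one filled triangle (0,1,2). Its hollow cycle 1-2-3 gives nontrivial H1.
# B1: vertex-edge boundary matrix (-1 at tail, +1 at head of each edge).
B1 = np.array([[-1, -1,  0,  0,  0],
               [ 1,  0, -1, -1,  0],
               [ 0,  1,  1,  0, -1],
               [ 0,  0,  0,  1,  1]], dtype=float)
# B2: boundary of the triangle (0,1,2) in the edge basis: (1,2)-(0,2)+(0,1).
B2 = np.array([[1.], [-1.], [1.], [0.], [0.]])

f = np.array([1.0, 2.0, 0.5, -1.0, 0.7])   # an arbitrary edge signal

# Orthogonal projections via least squares:
# gradient part lies in im(B1^T), curl part in im(B2).
grad = B1.T @ np.linalg.lstsq(B1 @ B1.T, B1 @ f, rcond=None)[0]
curl = B2 @ np.linalg.lstsq(B2.T @ B2, B2.T @ f, rcond=None)[0]
harmonic = f - grad - curl                  # remainder is the harmonic part

L1 = B1.T @ B1 + B2 @ B2.T                  # edge Hodge Laplacian
print(np.allclose(L1 @ harmonic, 0))        # harmonic part lies in ker(L1)
```

A quantum TSP filter would implement such spectral projections on amplitude-encoded signals; here the least-squares solves play the role of the spectral filtering.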
Solving linear systems of equations is a common problem that arises both on its own and as a subroutine in more complex problems: given a matrix $A$ and a vector $\vec{b}$, find a vector $\vec{x}$ such that $A\vec{x}=\vec{b}$. We consider the case where one does not need to know the solution $\vec{x}$ itself, but rather an approximation of the expectation value of some operator associated with $\vec{x}$, e.g., $\vec{x}^{\dagger} M \vec{x}$ for some matrix $M$. In this case, when $A$ is sparse, $N \times N$, and has condition number $\kappa$, the fastest known classical algorithms can find $\vec{x}$ and estimate $\vec{x}^{\dagger} M \vec{x}$ in time scaling roughly as $N\sqrt{\kappa}$. Here, we exhibit a quantum algorithm for estimating $\vec{x}^{\dagger} M \vec{x}$ whose runtime is a polynomial of $\log(N)$ and $\kappa$. Indeed, for small values of $\kappa$ [i.e., $\mathrm{poly}\log(N)$], we prove (using some common complexity-theoretic assumptions) that any classical algorithm for this problem generically requires exponentially more time than our quantum algorithm.
We present an algorithm for sparse Hamiltonian simulation whose complexity is optimal (up to log factors) as a function of all parameters of interest. Previous algorithms had optimal or near-optimal scaling in some parameters at the cost of poor scaling in others. Hamiltonian simulation via a quantum walk has optimal dependence on the sparsity at the expense of poor scaling in the allowed error. In contrast, an approach based on fractional-query simulation provides optimal scaling in the error at the expense of poor scaling in the sparsity. Here we combine the two approaches, achieving the best features of both. By implementing a linear combination of quantum walk steps with coefficients given by Bessel functions, our algorithm's complexity (as measured by the number of queries and 2-qubit gates) is logarithmic in the inverse error, and nearly linear in the product $\tau$ of the evolution time, the sparsity, and the magnitude of the largest entry of the Hamiltonian. Our dependence on the error is optimal, and we prove a new lower bound showing that no algorithm can have sublinear dependence on $\tau$.
This is the second instalment in the project initiated in Manin (2012). In the first Part, we argued that both the philosophy and technique of perturbative renormalisation in quantum field theory could be meaningfully transplanted to the theory of computation, and sketched several contexts supporting this view. In this second part, we address some of the issues raised in Manin (2012) and develop them further in three contexts: a categorification of the algorithmic computations; time cut-off and anytime algorithms; and, finally, a Hopf algebra renormalisation of the Halting Problem.
An important feature of modern science and engineering is that data of various kinds is being produced at an unprecedented rate. This is so in part because of new experimental methods, and in part because of the increase in the availability of high powered computing technology. It is also clear that the nature of the data we are obtaining is significantly different. For example, it is now often the case that we are given data in the form of very long vectors, where all but a few of the coordinates turn out to be irrelevant to the questions of interest, and further that we don’t necessarily know which coordinates are the interesting ones. A related fact is that the data is often very high-dimensional, which severely restricts our ability to visualize it. The data obtained is also often much noisier than in the past and has more missing information (missing data). This is particularly so in the case of biological data, particularly high throughput data from microarray or other sources. Our ability to analyze this data, both in terms of quantity and the nature of the data, is clearly not keeping pace with the data being produced. In this paper, we will discuss how geometry and topology can be applied to make useful contributions to the analysis of various kinds of data. Geometry and topology are very natural tools to apply in this direction, since geometry can be regarded as the study of distance functions, and what one often works with are distance functions on large finite sets of data. The mathematical formalism which has been developed for incorporating geometric and topological techniques deals with point clouds, i.e. finite sets of points equipped with a distance function. It then adapts tools from the various branches of geometry to the study of point clouds.
The point clouds are intended to be thought of as finite samples taken from a geometric object, perhaps with noise. Here are some of the key points which come up when applying these geometric methods to data analysis.

• Qualitative information is needed: One important goal of data analysis is to allow the user to obtain knowledge about the data, i.e. to understand how it is organized on a large scale. For example, if we imagine that we are looking at a data set constructed somehow from diabetes patients, it would be important to develop the understanding that there are two types of the disease, namely the juvenile and adult onset forms. Once that is established, one of course wants to develop quantitative methods for distinguishing them, but the first insight about the distinct forms of the disease is key.
We describe a new polynomial time quantum algorithm that uses the quantum fast Fourier transform to find eigenvalues and eigenvectors of a local Hamiltonian, and that can be applied in cases (commonly found in ab initio physics and chemistry problems) for which all known classical algorithms require exponential time. Applications of the algorithm to specific problems are considered, and we find that classically intractable and interesting problems from atomic physics may be solved with between 50 and 100 quantum bits.
Feynman's 1982 conjecture, that quantum computers can be programmed to simulate any local quantum system, is shown to be correct.
The design of new quantum algorithms has proven to be an extremely difficult task. This paper considers a different approach to the problem, by studying the problem of 'quantum state generation'. We first show that any problem in Statistical Zero Knowledge (including e.g. discrete log, quadratic residuosity and gap closest vector in a lattice) can be reduced to an instance of the quantum state generation problem. Having shown the generality of the state generation problem, we set the foundations for a new paradigm for quantum state generation. We define 'Adiabatic State Generation' (ASG), which is based on Hamiltonians instead of unitary gates. We develop tools for ASG including a very general method for implementing Hamiltonians (the sparse Hamiltonian lemma), and ways to guarantee non-negligible spectral gaps (the jagged adiabatic path lemma). We also prove that ASG is equivalent in power to state generation in the standard quantum model. After setting the foundations for ASG, we show how to apply our techniques to generate interesting superpositions related to Markov chains. The ASG approach to quantum algorithms provides intriguing links between quantum computation and many different areas: the analysis of spectral gaps and ground states of Hamiltonians in physics, rapidly mixing Markov chains, statistical zero knowledge, and quantum random walks. We hope that these links will bring new insights and methods into quantum algorithms.
A fast quantum mechanical algorithm for database search. Lov K. Grover, AT&T Bell Labs, 600 Mountain Avenue, Murray Hill, NJ. In STOC '96: Proceedings of the Twenty-Eighth Annual ACM Symposium on Theory of Computing, July 1996, pages 212–219. https://doi.org/10.1145/237814.237866
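Grover's quadratic speed-up can be made concrete with a small classical simulation of the amplitude dynamics: an oracle reflection followed by a reflection about the mean. This is a sketch with illustrative names, tracking real amplitudes directly rather than building circuits.

```python
import math

def grover_search(n_items, marked, iterations):
    """Classically simulate Grover iterations on n_items basis states,
    tracking the real amplitude of each state."""
    # Start in the uniform superposition over all items.
    amp = [1.0 / math.sqrt(n_items)] * n_items
    for _ in range(iterations):
        # Oracle: flip the sign of the marked item's amplitude.
        amp[marked] = -amp[marked]
        # Diffusion: reflect every amplitude about the mean.
        mean = sum(amp) / n_items
        amp = [2 * mean - a for a in amp]
    return amp

N = 16
# Optimal iteration count is about (pi/4) * sqrt(N) -- the source of
# the quadratic speed-up over the ~N/2 classical queries.
k = math.floor(math.pi / 4 * math.sqrt(N))
amp = grover_search(N, marked=3, iterations=k)
prob = amp[3] ** 2
```

After k = 3 iterations on N = 16 items the marked item's probability is already above 0.9; iterating further overshoots and the probability starts to fall again.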
We describe a simple, efficient method for simulating Hamiltonian dynamics on a quantum computer by approximating the truncated Taylor series of the evolution operator. Our method can simulate the time evolution of a wide variety of physical systems. As in another recent algorithm, the cost of our method depends only logarithmically on the inverse of the desired precision, which is optimal. However, we simplify the algorithm and its analysis by using a method for implementing linear combinations of unitary operations together with a robust form of oblivious amplitude amplification.
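As a rough classical analogue of what this approach computes, the sketch below approximates exp(-iHt) for a single-qubit Hamiltonian by summing the truncated Taylor series directly. This is illustrative only: the quantum algorithm realizes the same series as a linear combination of unitaries without ever forming the matrix, and all names here are assumptions.

```python
import math

def mat_mul(A, B):
    """Multiply two 2x2 matrices of (complex) numbers."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def taylor_evolution(H, t, order):
    """Approximate U = exp(-i H t) by the truncated Taylor series
    sum_{k=0}^{order} (-i t)^k H^k / k!"""
    U = [[1, 0], [0, 1]]      # accumulator, starts at the k = 0 term (identity)
    term = [[1, 0], [0, 1]]   # running value of (-i t H)^k / k!
    for k in range(1, order + 1):
        term = mat_mul(term, H)
        term = [[(-1j * t / k) * x for x in row] for row in term]
        U = [[U[i][j] + term[i][j] for j in range(2)] for i in range(2)]
    return U

# Pauli-X Hamiltonian; the exact evolution is cos(t) I - i sin(t) X,
# so the truncation error can be checked term by term.
X = [[0, 1], [1, 0]]
U = taylor_evolution(X, t=0.5, order=12)
```

For t = 0.5, truncating at order 12 leaves a remainder below 1e-13, consistent with the logarithmic precision dependence the abstract highlights.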
We introduce the tidy set, a minimal simplicial set that captures the topology of a simplicial complex. The tidy set is particularly effective for computing the homology of clique complexes. This family of complexes includes the Vietoris-Rips complex and the weak witness complex, methods that are popular in topological data analysis. The key feature of our approach is that it skips constructing the clique complex. We give algorithms for constructing tidy sets, implement them, and present experiments. Our preliminary results show that tidy sets are orders of magnitude smaller than clique complexes, giving us a homology engine with small memory requirements.
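To see what "skipping the clique complex" avoids, here is a minimal sketch (illustrative names, not the tidy-set construction itself) that builds the 2-skeleton of a clique complex explicitly: every edge of a graph becomes a 1-simplex and every triangle of mutually adjacent vertices a 2-simplex. This explicit enumeration is exactly the cost tidy sets are designed to sidestep.

```python
from itertools import combinations

def clique_complex_2skeleton(vertices, edges):
    """Return (vertices, edges, triangles) of the clique (flag) complex
    of a graph, up to dimension 2."""
    edge_set = {frozenset(e) for e in edges}
    # A triple spans a 2-simplex iff all three of its edges are present.
    triangles = [t for t in combinations(sorted(vertices), 3)
                 if all(frozenset(p) in edge_set for p in combinations(t, 2))]
    return (sorted(vertices),
            sorted(tuple(sorted(e)) for e in edge_set),
            triangles)

# A 4-cycle 0-1-2-3 with the chord (0, 2): two triangles appear.
V = [0, 1, 2, 3]
E = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
verts, edgs, tris = clique_complex_2skeleton(V, E)
```

In higher dimensions the clique enumeration blows up combinatorially, which is why avoiding it pays off in memory.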
Homotopy type theory is a recently-developed unification of previously disparate frameworks, which can serve to advance the project of formalizing and mechanizing mathematics. One framework is based on a computational conception of the type of a construction, the other is based on a homotopical conception of the homotopy type of a space. The computational notion of type has its origins in Brouwer's program of intuitionism, and Church's λ-calculus, both of which sought to ground mathematics in computation (one would say "algorithm" these days). The homotopical notion comes from Grothendieck's late conception of homotopy types of spaces as represented by ∞-groupoids [Grothendieck 1983]. The computational perspective was developed most fully by Per Martin-Löf, leading in particular to his Intuitionistic Theory of Types [Martin-Löf and Sambin 1984], on which the formal system of homotopy type theory is based. The connection to homotopy theory was first hinted at in the groupoid interpretation of Hofmann and Streicher [Hofmann and Streicher 1994; 1995]. It was then made explicit by several researchers, roughly simultaneously. The connection was clinched by Voevodsky's introduction of the univalence axiom, which is motivated by the homotopical interpretation, and which relates type equality to homotopy equivalence [Kapulkin et al. 2012; Awodey et al. 2013].
Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis.
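For contrast with the quantum algorithm, the classical computation it aims to speed up fits in a few lines via rank-nullity: beta_k = dim C_k - rank(d_k) - rank(d_{k+1}). The sketch below works over the two-element field F2 for simplicity (the combinatorial-Laplacian approach in the abstract works over the reals), and all names are illustrative.

```python
def rank_f2(rows):
    """Rank over F2 of a 0/1 matrix whose rows are given as bitmask ints."""
    rank = 0
    rows = [r for r in rows if r]
    while rows:
        pivot = rows.pop()
        rank += 1
        # Eliminate the pivot's leading bit from every remaining row.
        lead = 1 << (pivot.bit_length() - 1)
        rows = [r ^ pivot if r & lead else r for r in rows]
        rows = [r for r in rows if r]
    return rank

def betti_numbers(simplices_by_dim):
    """Betti numbers over F2 of a simplicial complex given as
    {dim: [sorted vertex tuples]}, via beta_k = dim C_k - rk d_k - rk d_{k+1}."""
    dims = sorted(simplices_by_dim)
    index = {d: {s: i for i, s in enumerate(simplices_by_dim[d])} for d in dims}

    def boundary_rank(d):  # rank of the boundary map d_d : C_d -> C_{d-1}
        if d == 0 or d not in simplices_by_dim or d - 1 not in simplices_by_dim:
            return 0
        rows = []
        for s in simplices_by_dim[d]:
            row = 0
            for i in range(len(s)):          # faces: drop one vertex at a time
                row |= 1 << index[d - 1][s[:i] + s[i + 1:]]
            rows.append(row)
        return rank_f2(rows)

    return [len(simplices_by_dim[d]) - boundary_rank(d) - boundary_rank(d + 1)
            for d in dims]

# Hollow triangle (a circle): one component and one loop.
circle = {0: [(0,), (1,), (2,)], 1: [(0, 1), (0, 2), (1, 2)]}
b = betti_numbers(circle)
```

Filling in the 2-simplex (0, 1, 2) kills the loop, dropping the Betti numbers from [1, 1] to [1, 0, 0]; the classical cost is dominated by the rank computations, which is where the claimed exponential quantum speed-up enters.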
In this work the theory of derived functors (known for additive functors) is generalized to arbitrary (non-additive) functors. To obtain this generalization, the complexes of the usual theory are replaced by semi-simplicial complexes. Let U and U′ be two abelian categories, T a covariant functor from U to U′, A an object of U, and n an integer ≥ 0. A (semi-simplicial) resolution of (A, n) is a "semi-simplicial object" X over U equipped with an isomorphism H_n(X) ≅ A and such that X_q = 0 for q < n and H_q(X) = 0 for q > n. If moreover X_q is a projective object of U for every q, one sets L_q T(A, n) = H_q(TX). Assuming that every object of U is isomorphic to a quotient of a projective object, it is proved that L_q T(·, n) is a covariant functor from U to U′ (the left derived functor of T). If T is additive, the functor L_q T(·, n) depends only on q − n and coincides with L_{q−n} T, the left derived functor of T in the usual sense. If U = U′ is the category of abelian groups and T(A) is the group ring or the symmetric algebra of A, then L_q T(A, n) is isomorphic to the Eilenberg-MacLane group H_q(A, n; Z).
The physics of quantum mechanics is the inspiration for, and underlies, quantum computation. As such, one expects physical intuition to be highly influential in the understanding and design of many quantum algorithms, particularly simulation of physical systems. Surprisingly, this has been challenging, with current Hamiltonian simulation algorithms remaining abstract and often the result of sophisticated but unintuitive constructions. We contend that physical intuition can lead to optimal simulation methods by showing that a focus on simple single-qubit rotations elegantly furnishes an optimal algorithm for Hamiltonian simulation, a universal problem that encapsulates all the power of quantum computation. Specifically, we show that the query complexity of implementing time evolution by a $d$-sparse Hamiltonian $\hat{H}$ for time-interval $t$ with error $\epsilon$ is $\mathcal{O}\big[td\,\|\hat{H}\|_{\max} + \log(1/\epsilon)/\log\log(1/\epsilon)\big]$, which matches lower bounds in all parameters. This connection is made through a general three-step "quantum signal processing" methodology, comprised of (i) transducing eigenvalues of $\hat{H}$ into a single ancilla qubit, (ii) transforming these eigenvalues through an optimal-length sequence of single-qubit rotations, and (iii) projecting this ancilla with near-unity success probability.
With quantum computers of significant size now on the horizon, we should understand how to best exploit their initially limited abilities. To this end, we aim to identify a practical problem that is beyond the reach of current classical computers, but that requires the fewest resources for a quantum computer. We consider quantum simulation of spin systems, which could be applied to understand condensed matter phenomena. We synthesize explicit circuits for three leading quantum simulation algorithms, using diverse techniques to tighten error bounds and optimize circuit implementations. Quantum signal processing appears to be preferred among algorithms with rigorous performance guarantees, whereas higher-order product formulas prevail if empirical error estimates suffice. Our circuits are orders of magnitude smaller than those for the simplest classically infeasible instances of factoring and quantum chemistry, bringing practical quantum computation closer to reality.
A central problem of algebraic topology is to understand the homotopy groups π_d(X) of a topological space X. For the computational version of the problem, it is well known that there is no algorithm to decide whether the fundamental group π_1(X) of a given finite simplicial complex X is trivial. On the other hand, there are several algorithms that, given a finite simplicial complex X that is simply connected (i.e., with π_1(X) trivial), compute the higher homotopy group π_d(X) for any given d ≥ 2. However, these algorithms come with a caveat: They compute the isomorphism type of π_d(X), d ≥ 2, as an abstract finitely generated abelian group given by generators and relations, but they work with very implicit representations of the elements of π_d(X). Converting elements of this abstract group into explicit geometric maps from the d-dimensional sphere S^d to X has been one of the main unsolved problems in the emerging field of computational homotopy theory. Here we present an algorithm that, given a simply connected space X, computes π_d(X) and represents its elements as simplicial maps from a suitable triangulation of the d-sphere S^d to X. For fixed d, the algorithm runs in time exponential in size(X), the number of simplices of X. Moreover, we prove that this is optimal: For every fixed d ≥ 2, we construct a family of simply connected spaces X such that for any simplicial map representing a generator of π_d(X), the size of the triangulation of S^d on which the map is defined is exponential in size(X).
Quantum computing allows us to solve some problems much faster than existing classical algorithms. Yet, the quantum computer has been believed to be no more powerful than the most general computing model—the Turing machine. Undecidable problems, such as the halting problem, and unrecognizable inputs, such as the real numbers, are beyond the theoretical limit of the Turing machine. I suggest a model for a quantum computer, which is less general than the Turing machine, but may solve the halting problem for any task programmable on it. Moreover, inputs unrecognizable by the Turing machine can be recognized by the model, thus breaking the theoretical limit for a computational task. A quantum computer is not just a successful design of the Turing machine as it is widely perceived now, but is a different, less general but more powerful model for computing, the practical realization of which may need different strategies than those in use now.
Motivated by recent studies of holographic complexity, we examine the question of circuit complexity in quantum field theory. We provide a quantum circuit model for the preparation of Gaussian states, in particular the ground state, in a free scalar field theory for general dimensions. Applying the geometric approach of Nielsen to this quantum circuit model, the complexity of the state becomes the length of the shortest geodesic in the space of circuits. We compare the complexity of the ground state of the free scalar field to the analogous results from holographic complexity, and find some surprising similarities.
Quantum computing is powerful because unitary operators describing the time-evolution of a quantum system have exponential size in terms of the number of qubits present in the system. We develop a new "Singular value transformation" algorithm capable of harnessing this exponential advantage, that can apply polynomial transformations to the singular values of a block of a unitary, generalizing the optimal Hamiltonian simulation results of Low and Chuang. The proposed quantum circuits have a very simple structure, often give rise to optimal algorithms and have appealing constant factors, while usually only use a constant number of ancilla qubits. We show that singular value transformation leads to novel algorithms. We give an efficient solution to a certain "non-commutative" measurement problem and propose a new method for singular value estimation. We also show how to exponentially improve the complexity of implementing fractional queries to unitaries with a gapped spectrum. Finally, as a quantum machine learning application we show how to efficiently implement principal component regression. "Singular value transformation" is conceptually simple and efficient, and leads to a unified framework of quantum algorithms incorporating a variety of quantum speed-ups. We illustrate this by showing how it generalizes a number of prominent quantum algorithms, including: optimal Hamiltonian simulation, implementing the Moore-Penrose pseudoinverse with exponential precision, fixed-point amplitude amplification, robust oblivious amplitude amplification, fast QMA amplification, fast quantum OR lemma, certain quantum walk results and several quantum machine learning algorithms.
In order to exploit the strengths of the presented method, it is useful to know its limitations too; we therefore also prove a lower bound on the efficiency of singular value transformation, which often gives optimal bounds.
We develop a method for analyzing multivariate time series using topological data analysis (TDA) methods. The proposed methodology involves converting the multivariate time series to point cloud data, calculating Wasserstein distances between the persistence diagrams and using the k-nearest neighbours algorithm (k-NN) for supervised machine learning. Two methods (symmetry-breaking and anchor points) are also introduced to enable TDA to better analyze data with heterogeneous features that are sensitive to translation, rotation or choice of coordinates. We apply our methods to room occupancy detection based on 5 time-dependent variables (temperature, humidity, light, CO2 and humidity ratio). Experimental results show that topological methods are effective in predicting room occupancy during a time window. We also apply our methods to an Activity Recognition dataset and obtained good results.
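The first step in such pipelines, converting a time series to a point cloud, is commonly done with a sliding-window (delay) embedding. Below is a minimal univariate sketch under that assumption (the paper's multivariate construction may differ, and all names are illustrative): each point collects a window of consecutive samples, so a periodic signal traces out a loop that persistent homology would detect as a 1-dimensional feature.

```python
import math

def sliding_window_cloud(series, dim, step=1):
    """Delay-embed a scalar time series into R^dim: each point is a
    window of dim consecutive samples, advanced by step."""
    return [tuple(series[i:i + dim])
            for i in range(0, len(series) - dim + 1, step)]

def euclidean(p, q):
    """Euclidean distance between two points of equal dimension."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# A sine wave with period 20: windows one full period apart coincide,
# so the embedded cloud closes up into a loop.
signal = [math.sin(2 * math.pi * t / 20) for t in range(60)]
cloud = sliding_window_cloud(signal, dim=3)
```

The Wasserstein-plus-k-NN classification in the abstract would then operate on persistence diagrams computed from clouds like this one.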
We present a quantum algorithm for the dynamical simulation of time-dependent Hamiltonians. Our method involves expanding the interaction-picture Hamiltonian as a sum of generalized permutations, which leads to an integral-free Dyson series of the time-evolution operator. Under this representation, we perform a quantum simulation for the time-evolution operator by means of the linear combination of unitaries technique. We optimize the time steps of the evolution based on the Hamiltonian's dynamical characteristics, leading to a gate count that scales with an L1-norm-like scaling with respect only to the norm of the interaction Hamiltonian, rather than that of the total Hamiltonian. We demonstrate that the cost of the algorithm is independent of the Hamiltonian's frequencies, implying its advantage for systems with highly oscillating components, and for time-decaying systems the cost does not scale with the total evolution time asymptotically. In addition, our algorithm retains the near-optimal log(1/ϵ)/loglog(1/ϵ) scaling with simulation error ϵ. (Received 21 April 2021; accepted 25 August 2021; DOI: 10.1103/PRXQuantum.2.030342. Published by the American Physical Society under the Creative Commons Attribution 4.0 International license.)
Quantum circuit complexity has played a central role in recent advances in holography and many-body physics. Within quantum field theory, it has typically been studied in a Lorentzian (real-time) framework. In a departure from standard treatments, we aim to quantify the complexity of the Euclidean path integral. In this setting, there is no clear separation between space and time, and the notion of unitary evolution on a fixed Hilbert space no longer applies. As a proof of concept, we argue that the pants decomposition provides a natural notion of circuit complexity within the category of 2-dimensional bordisms and use it to formulate the circuit complexity of states and operators in 2-dimensional topological quantum field theory. We comment on analogies between our formalism and others in quantum mechanics, such as tensor networks and second quantization.
Electroencephalography (EEG) is a widely used cerebral activity measuring device for both clinical and everyday life applications. In addition to denoising and potential classification, a crucial step in EEG processing is to extract relevant features. Topological data analysis (TDA) as an emerging tool enables to analyse and understand data from a different angle than traditionally used methods. As a higher dimensional analogy of graph analysis, TDA can model rich interactions beyond pairwise relations. It also distinguishes different dynamics of EEG time series. TDA remains largely unknown to the EEG processing community while it fits well the heterogeneous nature of EEG signals. This short review aims to give a quick introduction to TDA and how it can be applied to EEG analysis in various applications including brain-computer interfaces (BCIs). After introducing the objective of the article, the main concepts and ideas of TDA are explained. Next, how to implement it for EEG processing is detailed, and lastly the article discusses the benefits and limitations of the method.
Significant technological advances made in recent years have shepherded a dramatic increase in utilization of digital technologies for biomedicine: everything from the widespread use of electronic health records to improved medical imaging capabilities and the rising ubiquity of genomic sequencing contribute to a "digitization" of biomedical research and clinical care. With this shift toward computerized tools comes a dramatic increase in the amount of available data, and current tools for data analysis capable of extracting meaningful knowledge from this wealth of information have yet to catch up. This article seeks to provide an overview of emerging mathematical methods with the potential to improve the abilities of clinicians and researchers to analyze biomedical data, but may be hindered from doing so by a lack of conceptual accessibility and awareness in the life sciences research community. In particular, we focus on topological data analysis (TDA), a set of methods grounded in the mathematical field of algebraic topology that seeks to describe and harness features related to the "shape" of data. We aim to make such techniques more approachable to non-mathematicians by providing a conceptual discussion of their theoretical foundations followed by a survey of their published applications to scientific research. Finally, we discuss the limitations of these methods and suggest potential avenues for future work integrating mathematical tools into clinical care and biomedical informatics.