Ariadna E. Venegas-Li and James P. Crutchfield
ABSTRACT:
Temporal sequences of quantum states are essential to quantum
communication protocols, such as quantum key distribution, and to
quantum computing implementations, as witnessed by substantial
efforts to develop on-demand single-photon sources. To date,
though, these sources emit qubit sequences in which the
experimenter has little or no control over the outgoing quantum
states. The photon stream emitted by a color center is a familiar
example. As a diagnostic aid, one desires appropriate metrics of
randomness and correlation in such quantum processes.
If an experimentalist observes a sequence of emitted quantum
states via either projective or positive-operator-valued
measurements, the outcomes form a time series. Individual time
series are realizations of a stochastic process over the
measurements' classical outcomes. We recently showed that, in
general, the resulting stochastic process is highly complex in two
specific senses: (i) it is inherently unpredictable to varying
degrees that depend on measurement choice and (ii) optimal
prediction requires using an infinite number of temporal features.
Here, we identify the mechanism underlying this complexity as
generator nonunifilarity: the degeneracy between sequences of
generator states and sequences of measurement outcomes. This makes
it possible to quantitatively explore the influence that
measurement choice has on a quantum process' degrees of
randomness and structural complexity using recently introduced
methods from ergodic theory. Progress here, though, requires
quantitative measures of structure and memory in observed time
series. Success also demands accurate and efficient estimation
algorithms that sidestep explicitly representing an infinite set of
predictive features. We provide these metrics
and associated algorithms, using them to design
informationally optimal measurements of open quantum dynamical
systems.
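As a concrete illustration of the degeneracy the abstract describes, consider the following minimal sketch (not drawn from the paper; the source and measurement choice are hypothetical). A source alternates deterministically between emitting |0> and |+>, and each emission is measured projectively in the Z basis. The outcome 0 is compatible with both hidden generator states, so observed outcome sequences do not determine the underlying state sequence:

```python
import numpy as np

# Hypothetical example: a two-state qubit source alternating between
# |0> and |+>, measured in the Z basis. The hidden generator-state
# sequence is deterministic, but |+> yields outcomes 0 and 1 with
# equal probability, so outcome 0 can arise from either hidden state.
# This many-to-one map from state sequences to outcome sequences is
# the degeneracy (nonunifilarity) discussed above.

rng = np.random.default_rng(0)

def measure_z(state):
    """Projective Z-basis measurement of a qubit; returns 0 or 1."""
    p0 = abs(state[0]) ** 2  # Born-rule probability of outcome 0
    return 0 if rng.random() < p0 else 1

ket0 = np.array([1.0, 0.0])                    # |0>
ketplus = np.array([1.0, 1.0]) / np.sqrt(2.0)  # |+>

# Deterministic hidden-state sequence: |0>, |+>, |0>, |+>, ...
emitted = [ket0 if t % 2 == 0 else ketplus for t in range(20)]
outcomes = [measure_z(s) for s in emitted]

# Every |0> emission yields outcome 0 with certainty, while |+>
# emissions yield 0 half the time, so a 0 in the record does not
# reveal which state was emitted.
print(outcomes)
```

The classical outcome record here is a realization of a stochastic process over measurement outcomes, even though the quantum source itself is perfectly periodic; a different measurement basis would yield a process with different randomness and correlation properties.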