Tensor Network Backend
Simulating quantum circuits using tensor networks is a powerful approach that has been studied extensively in the quantum computing literature [Markov2008][Pan2022]. The YaoToEinsum package provides a convenient and efficient way to convert Yao circuits into tensor networks, enabling advanced analysis, optimization, and simulation of quantum circuits.
Overview
Tensor networks represent quantum circuits as collections of interconnected tensors: quantum gates become tensors, and quantities of interest such as amplitudes and expectation values are obtained by contracting the network (see the short sketch after the list below). This representation offers several advantages:
- Scalability: Efficient simulation of circuits whose tensor networks have small treewidth (e.g. shallow or weakly entangling circuits)
- Flexibility: Easy manipulation and analysis of quantum operations
- Optimization: Advanced contraction-order optimization for better performance
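For intuition, here is a minimal sketch of the gate-to-tensor mapping, using only standard Yao functions (mat, cnot):
using Yao
# A single-qubit gate such as H is already a 2×2 tensor: one input and one output index.
mat(H)
# A two-qubit gate such as CNOT is a 4×4 matrix that can be viewed as a rank-4 tensor
# with two input and two output indices.
reshape(Matrix(mat(cnot(2, 1, 2))), 2, 2, 2, 2)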
Basic Usage
Converting Circuits to Tensor Networks
The primary function for circuit conversion is:
using Yao, LuxorGraphPlot
yao2einsum(circuit; initial_state=Dict(), final_state=Dict(), optimizer=TreeSA())
This function transforms a Yao circuit into a tensor network represented in Einstein summation (einsum) notation, returning a TensorNetwork object.
Parameters:
- initial_state: Dictionary specifying the initial quantum state. Unspecified qubits remain as open indices (see the sketch after this list).
- final_state: Dictionary specifying the final measurement state. Unspecified qubits remain as open indices.
- optimizer: Contraction order optimization algorithm. The default is TreeSA(), developed in [Kalachev2021][Liu2023]. For more optimization algorithms, see OMEinsumContractionOrders.jl.
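To illustrate the open-index behavior, here is a minimal sketch using a small 2-qubit Bell circuit: only the initial state is fixed, so the output qubits remain open indices and contracting yields the full output state (the index ordering matches Yao's convention, as verified in Case 1 below).
using Yao
bell = chain(2, put(2, 1=>H), control(2, 1, 2=>X))
net = Yao.yao2einsum(bell; initial_state=Dict(1=>0, 2=>0))  # final state left open
psi = vec(Yao.contract(net))           # rank-2 output tensor, flattened to a length-4 vector
psi ≈ statevec(zero_state(2) |> bell)  # matches the dense simulation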
Tutorial: Quantum Fourier Transform Example
In this tutorial, we demonstrate how to convert a Quantum Fourier Transform (QFT) circuit to a tensor network and use it for several common tasks:
- Obtaining the matrix representation of the circuit
- Computing probability amplitudes of specific states
- Computing expectation values of observables
- Simulating noisy circuits with density matrices
Step 1: Create the Circuit
First, let's create a 4-qubit Quantum Fourier Transform circuit:
using Yao, LuxorGraphPlot
using Yao.EasyBuild: qft_circuit
n = 4
circuit = qft_circuit(n) # Create a QFT circuit for n qubits
nqubits: 4
chain
├─ chain
│ ├─ put on (1)
│ │ └─ H
│ ├─ control(2)
│ │ └─ (1,) shift(1.5707963267948966)
│ ├─ control(3)
│ │ └─ (1,) shift(0.7853981633974483)
│ └─ control(4)
│ └─ (1,) shift(0.39269908169872414)
├─ chain
│ ├─ put on (2)
│ │ └─ H
│ ├─ control(3)
│ │ └─ (2,) shift(1.5707963267948966)
│ └─ control(4)
│ └─ (2,) shift(0.7853981633974483)
├─ chain
│ ├─ put on (3)
│ │ └─ H
│ └─ control(4)
│ └─ (3,) shift(1.5707963267948966)
└─ chain
└─ put on (4)
└─ H
Case 1: Matrix Representation
Now we convert the circuit to a tensor network representation:
network = Yao.yao2einsum(circuit) # Convert circuit to tensor network
viznet(network) # Visualize the network structure
This creates a tensor network where each quantum gate becomes a tensor node, and the connections represent shared indices between tensors.
We can contract the tensor network to obtain the full matrix representation of the circuit:
# Contract the network and reshape to get the unitary matrix
matrix_from_network = reshape(Yao.contract(network), 1<<n, 1<<n)
matrix_from_yao = Yao.mat(circuit)
# Verify they are equivalent
matrix_from_network ≈ matrix_from_yao
true
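Before contracting a larger network, it can be useful to inspect the contraction cost first. A brief sketch using the exported contraction_complexity function (documented in the API reference below):
# Report the time, space, and read-write complexity of the optimized contraction.
Yao.YaoToEinsum.contraction_complexity(network)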
Case 2: Computing Probability Amplitudes
For many applications, we're interested in specific probability amplitudes rather than the full matrix. Here we compute the probability amplitude for measuring all qubits in the |0⟩ state after applying the QFT to the |0⟩ state:
# Create a network with fixed initial and final states
network_with_states = Yao.yao2einsum(circuit;
initial_state=Dict([i=>0 for i=1:n]), # Start in |00...0⟩
final_state=Dict([i=>0 for i=1:n]), # Measure in |00...0⟩ basis
optimizer=Yao.YaoToEinsum.TreeSA()
)
viznet(network_with_states)
# Contract the network to get the amplitude
amplitude_from_network = Yao.contract(network_with_states)[]
# Compare with direct Yao computation
initial_state = Yao.zero_state(n)
final_state = initial_state |> circuit
amplitude_from_yao = (Yao.zero_state(n)' * final_state)[]
amplitude_from_network ≈ amplitude_from_yao
true
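The state dictionaries also accept single-qubit ArrayRegs (the second interface described in the API reference below). As a sketch, we project the QFT output onto the product state |+⟩⊗…⊗|+⟩; since |+⟩ is real, the conjugation of the final state does not matter here:
plus = ArrayReg(ComplexF64[1, 1] ./ sqrt(2))     # single-qubit |+⟩
net_plus = Yao.yao2einsum(circuit;
    initial_state = Dict([i=>0 for i=1:n]),      # start in |00...0⟩
    final_state   = Dict([i=>plus for i=1:n])    # project onto ⟨++...+|
)
amp_plus = Yao.contract(net_plus)[]
# Cross-check against a dense simulation.
amp_plus ≈ (uniform_state(n)' * (zero_state(n) |> circuit))[]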
Case 3: Computing Observable Expectation Values
Tensor networks are particularly useful for computing expectation values of observables. Here we compute the expectation value of a Pauli-Y operator on the first qubit after applying the QFT:
# Define the observable (Pauli-Y on first qubit)
observable = put(n, 1=>Y)
# Create tensor network for expectation value computation
# We need to use the `DensityMatrixMode` to sandwich the circuit between the initial state and the observable
network_obs = Yao.yao2einsum(circuit;
initial_state=Dict(1=>0, 2=>1, 3=>1, 4=>1), # Start in |0111⟩
observable = observable, # Measure expectation value
mode = DensityMatrixMode()
)
viznet(network_obs)
Contract to get the expectation value
res_network = real(Yao.contract(network_obs)[])
0.38268343236508956
Compare with direct Yao computation
state_after_circuit = product_state(bit"1110") |> circuit
res_exact = real(expect(observable, state_after_circuit))
0.3826834323650897
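The same expectation value can also be obtained in the default VectorMode by sandwiching the observable between the circuit and its adjoint, since ⟨ψ|Y₁|ψ⟩ = ⟨0111|U† Y₁ U|0111⟩. A hedged sketch, assuming yao2einsum treats the adjoint circuit like any other Yao block:
# Build U† Y₁ U as a single block and evaluate it as an amplitude.
sandwich = chain(circuit, observable, circuit')
net_sandwich = Yao.yao2einsum(sandwich;
    initial_state = Dict(1=>0, 2=>1, 3=>1, 4=>1),
    final_state   = Dict(1=>0, 2=>1, 3=>1, 4=>1)
)
real(Yao.contract(net_sandwich)[]) ≈ res_exact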
Case 4: Noisy Circuit Simulation
YaoToEinsum supports simulation of noisy quantum circuits using different representations. We demonstrate both density matrix and Pauli basis modes for simulating decoherence:
# Create a simpler circuit for noisy simulation
n_small = 3
γ = 0.1 # damping parameter
# Create amplitude damping channels
damping_channel = quantum_channel(AmplitudeDampingError(γ))
# Build noisy circuit: gate followed by noise on the same qubits
noisy_circuit = chain(n_small,
put(1=>X), put(1=>damping_channel),
put(2=>H), put(2=>damping_channel),
cnot(1,2), put(1=>damping_channel), put(2=>damping_channel),
put(3=>Z), put(3=>damping_channel)
)
nqubits: 3
chain
├─ put on (1)
│ └─ X
├─ put on (1)
│ └─ kraus_channel
│ ├─ A0
│ └─ A1
├─ put on (2)
│ └─ H
├─ put on (2)
│ └─ kraus_channel
│ ├─ A0
│ └─ A1
├─ control(1)
│ └─ (2,) X
├─ put on (1)
│ └─ kraus_channel
│ ├─ A0
│ └─ A1
├─ put on (2)
│ └─ kraus_channel
│ ├─ A0
│ └─ A1
├─ put on (3)
│ └─ Z
└─ put on (3)
└─ kraus_channel
├─ A0
└─ A1
Density Matrix Mode
Simulate using density matrix representation
network_dm = Yao.yao2einsum(noisy_circuit;
mode=DensityMatrixMode(),
initial_state=Dict([i=>0 for i=1:n_small]),
observable=put(n_small, 1=>Z)
)
viznet(network_dm)
Contract to get the expectation value of Z on the first qubit
res_network = contract(network_dm)[]
-0.6199999999999997 + 0.0im
Compare with direct Yao simulation
initial_dm = density_matrix(zero_state(n_small))
res_exact = expect(put(n_small, 1=>Z), apply(initial_dm, noisy_circuit))
-0.6199999999999998
Pauli Basis Mode
Simulate using Pauli basis representation
network_pauli = Yao.yao2einsum(noisy_circuit;
mode=PauliBasisMode(),
initial_state=Dict([i=>0 for i=1:n_small]),
observable=put(n_small, 1=>Z)
)
viznet(network_pauli)
Contract the Pauli-basis network and verify that it gives the same expectation value
res_pauli = Yao.contract(network_pauli)
0-dimensional Array{ComplexF64, 0}:
-0.6199999999999996 + 0.0im
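For larger circuits, peak memory is set by the largest intermediate tensor. The slicer keyword (see the API reference below) can cap that size; here is a hedged sketch that reuses the noisy circuit above and assumes the OMEinsum slicer types referenced in the API docs are reachable through the YaoToEinsum dependency:
OMEinsum = Yao.YaoToEinsum.OMEinsum   # assumption: reuse YaoToEinsum's OMEinsum dependency
network_sliced = Yao.yao2einsum(noisy_circuit;
    mode = DensityMatrixMode(),
    initial_state = Dict([i=>0 for i=1:n_small]),
    observable = put(n_small, 1=>Z),
    slicer = OMEinsum.TreeSASlicer(score=OMEinsum.ScoreFunction(sc_target=30))
)
Yao.contract(network_sliced)[]   # same expectation value, with bounded tensor sizes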
API Reference
The following functions and types are exported by YaoToEinsum:
YaoToEinsum.yao2einsum — Function
yao2einsum(circuit; initial_state=Dict(), final_state=Dict(), optimizer=TreeSA(), mode=VectorMode(), observable=nothing, slicer=nothing)
Transform a Yao circuit to a generalized tensor network (einsum) notation. The return value is a TensorNetwork instance that corresponds to the following tensor network:
1) If mode is VectorMode(), the tensor network is:
<initial_state| ─── circuit ─── |final_state>
2) If mode is DensityMatrixMode(), the tensor network is:
<final_state| ─── circuit ─── |initial_state><initial_state| ─── circuit ─── |final_state>
where the circuit may contain noise channels.
3) In DensityMatrixMode(), if observable is specified, tr(rho * observable) is computed instead:
┌── circuit ─── |initial_state><initial_state| ─── circuit ─── observable ──┐
└───────────────────────────────────────────────────────────────────────────┘
4) PauliBasisMode() is similar to DensityMatrixMode(), but the basis is rotated to the Pauli basis.
Arguments
- mode is the mapping mode, which can be VectorMode(), DensityMatrixMode(), or PauliBasisMode().
- circuit is a Yao block as the input.
- initial_state and final_state are dictionaries that specify the initial states and final states (the final states are conjugated).
  - In the first interface, a state is specified as an integer, e.g. Dict(1=>1, 2=>1, 3=>0, 4=>1) specifies the product state |1⟩⊗|1⟩⊗|0⟩⊗|1⟩.
  - In the second interface, a state is specified as an ArrayReg, e.g. Dict(1=>rand_state(1), 2=>rand_state(1)).
  - If any qubit in the initial state or final state is left unspecified, it is treated as a free leg (open index) in the tensor network.
- observable is a Yao block that specifies the observable. If it is specified, the final state must be unspecified.
- optimizer is the optimizer used to optimize the tensor network contraction order. The default is TreeSA().
- slicer is the slicer used to slice the tensor network and reduce memory usage, e.g. to reduce the largest tensor rank to 30, use TreeSASlicer(score=OMEinsum.ScoreFunction(sc_target=30)).
Please check OMEinsumContractionOrders.jl for more information.
julia> using Yao
julia> c = chain(3, put(3, 2=>X), put(3, 1=>Y), control(3, 1, 3=>Y))
nqubits: 3
chain
├─ put on (2)
│ └─ X
├─ put on (1)
│ └─ Y
└─ control(1)
└─ (3,) Y
julia> yao2einsum(c; initial_state=Dict(1=>0, 2=>1), final_state=Dict(1=>ArrayReg([0.6, 0.8im]), 2=>1))
TensorNetwork
Time complexity: 2^4.700439718141093
Space complexity: 2^2.0
Read-write complexity: 2^6.0
YaoToEinsum.TensorNetwork — Type
TensorNetwork
A (generalized) tensor network representation of a quantum circuit.
Fields
- code::AbstractEinsum: The einsum code.
- tensors::Vector: The tensors in the network.
- label_to_qubit::Dict{Int, Int}: Map from variable label to qubit index (negative for dual, absent for non-qubit).
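As a quick illustration of these fields (a sketch, not part of the docstring):
net = Yao.yao2einsum(chain(2, put(2, 1=>H), control(2, 1, 2=>X)))
net.code            # the einsum contraction pattern
net.tensors         # one tensor per gate, plus any fixed state tensors
net.label_to_qubit  # variable label => qubit index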
OMEinsumContractionOrders.optimize_code — Function
optimize_code(c::TensorNetwork, optimizer=TreeSA(); slicer=nothing)
Optimize the code of the tensor network.
Arguments
- c::TensorNetwork: The tensor network.
- optimizer::Optimizer: The optimizer to use; the default is OMEinsum.TreeSA().
Keyword Arguments
- slicer: The slicer to use; the default is nothing. It can be e.g. OMEinsum.TreeSASlicer(score=OMEinsum.ScoreFunction(sc_target=30)).
For more, please check the OMEinsumContractionOrders documentation.
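A brief usage sketch, reusing the QFT circuit from the tutorial above and the qualified YaoToEinsum names:
using Yao
using Yao.EasyBuild: qft_circuit
net = Yao.yao2einsum(qft_circuit(4))
net_opt = Yao.YaoToEinsum.optimize_code(net, Yao.YaoToEinsum.TreeSA())
Yao.YaoToEinsum.contraction_complexity(net_opt)  # cost of the re-optimized contraction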
OMEinsumContractionOrders.contraction_complexity — Function
contraction_complexity(c::TensorNetwork)
Return the contraction complexity of the tensor network.
YaoToEinsum.contract — Function
contract(c::TensorNetwork)
Contract the tensor network, and return the result tensor.
References
- [Markov2008] Markov, Igor L., and Yaoyun Shi. "Simulating quantum computation by contracting tensor networks." SIAM Journal on Computing 38.3 (2008): 963-981.
- [Pan2022] Pan, Feng, and Pan Zhang. "Simulation of quantum circuits using the big-batch tensor network method." Physical Review Letters 128.3 (2022): 030501.
- [Kalachev2021] Kalachev, Gleb, Pavel Panteleev, and Man-Hong Yung. "Recursive multi-tensor contraction for XEB verification of quantum circuits." arXiv preprint arXiv:2108.05665 (2021).
- [Liu2023] Liu, Jin-Guo, et al. "Computing solution space properties of combinatorial optimization problems via generic tensor networks." SIAM Journal on Scientific Computing 45.3 (2023): A1239-A1270.