Automatic Differentiation

Yao ships with a builtin automatic differentiation (AD) engine, based on operator overloading and specialized for quantum circuits. It exploits the reversibility of quantum computation to optimize performance during simulation, so for circuit simulation it can be significantly faster than general-purpose AD engines.

Builtin reverse-mode AD engine for simulation

For expectation values the usage is simple, since evaluating an expectation is just

expect(H, rand_state(10)=>circuit)

To get the gradients, simply add an adjoint:

expect'(H, rand_state(10)=>circuit)

which returns a pair of gradients: the gradient with respect to the input register, and the gradient with respect to the circuit parameters.
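As a minimal sketch, assuming an illustrative two-qubit variational circuit and observable (the circuit, observable, and learning rate below are made up for the example):

```julia
using Yao

# Illustrative variational circuit and observable on 2 qubits.
circuit = chain(2, put(1 => Rx(0.3)), put(2 => Ry(0.5)), control(1, 2 => X))
H = sum(put(2, i => Z) for i in 1:2)

# Gradients of <H> w.r.t. the input register and the circuit parameters.
regδ, paramsδ = expect'(H, zero_state(2) => circuit)

# paramsδ has one entry per circuit parameter, e.g. for a gradient step:
dispatch!(circuit, parameters(circuit) - 0.1 * paramsδ)
```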

Forward-mode AD engine (faithful gradient)

The faithful gradient is supported via the external package YaoExtensions, which contains some useful extensions that make things work out of the box.
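A minimal sketch, assuming YaoExtensions' `variational_circuit`, `heisenberg`, and `faithful_grad` helpers (check your installed version for the exact keyword arguments):

```julia
using Yao, YaoExtensions

n = 4
circuit = variational_circuit(n)    # a hardware-efficient ansatz
h = heisenberg(n)                   # Heisenberg Hamiltonian on n sites
dispatch!(circuit, :random)         # random initial parameters

# Forward-mode ("faithful") gradient: one value per circuit parameter.
g = faithful_grad(h, zero_state(n) => circuit)
```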

Integration with general-purpose AD engines

The builtin AD engine in Yao only differentiates quantum circuits, but you can plug it into a general-purpose AD engine, such as Zygote, either by defining your own adjoints or by including this patch.
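As a hedged sketch of the adjoint-definition route: `circuit_expect` below is a hypothetical wrapper (not a Yao API), and its custom adjoint simply reuses the parameter gradients that `expect'` already computes.

```julia
using Yao, Zygote

# Hypothetical helper: expectation of `h` after running `circuit` with angles θ.
function circuit_expect(θ, h, circuit, reg)
    real(expect(h, copy(reg) => dispatch(circuit, θ)))
end

# Custom adjoint that delegates the backward pass to Yao's builtin engine.
Zygote.@adjoint function circuit_expect(θ, h, circuit, reg)
    c = dispatch(circuit, θ)
    y = real(expect(h, copy(reg) => c))
    y, ȳ -> (ȳ .* expect'(h, copy(reg) => c)[2], nothing, nothing, nothing)
end
```

With this in place, `Zygote.gradient(θ -> circuit_expect(θ, h, circuit, reg), θ0)` should agree with the parameter gradients returned by `expect'` directly.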


apply_back(st::Tuple{<:ArrayReg, <:ArrayReg}, block::AbstractBlock; kwargs...) -> (out, outδ), paramsδ

The backward function of apply!. Takes the output register and its gradient, and returns a tuple of ((input register, gradient of input register), parameter gradients).
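A short sketch of the calling convention (the seed for the adjoint register below is illustrative; `apply_back` lives in the AD submodule, so qualify the name if it is not exported in your version):

```julia
using Yao
using YaoBlocks.AD: apply_back

c = put(2, 1 => Rx(0.5))
out = apply!(zero_state(2), c)   # forward pass
outδ = copy(out)                 # illustrative seed for the adjoint register

(inreg, inδ), paramsδ = apply_back((out, outδ), c)
```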

generator(rot::Rotor) -> AbstractBlock

Return the generator of a rotation block.
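For instance (qualify the name as `YaoBlocks.generator` if it is not exported in your version):

```julia
using Yao
using YaoBlocks: generator

# Rx(θ) = exp(-im * θ/2 * X), so its generator is the X gate.
g = generator(Rx(0.5))
```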

mat_back([::Type{T}, ]block::AbstractBlock, adjm::AbstractMatrix) -> Vector

The backward function of mat. Returns the gradients of parameters.
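A sketch of the calling convention (the upstream adjoint matrix below is illustrative, and `mat_back` lives in the AD submodule):

```julia
using Yao
using YaoBlocks.AD: mat_back

b = Rx(0.5)
adjm = rand(ComplexF64, 2, 2)   # illustrative upstream adjoint of mat(b)
g = mat_back(b, adjm)           # one gradient entry per parameter of b
```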

projection(y::AbstractMatrix, op::AbstractMatrix) -> typeof(y)

Project op onto a sparse matrix with the same sparsity pattern as y.
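The documented semantics can be illustrated with a small stand-alone re-implementation (this is not Yao's internal code, just a sketch of what "same sparsity pattern" means):

```julia
using SparseArrays

# Keep the entries of `op` at the structurally nonzero positions of `y`.
function project_like(y::SparseMatrixCSC, op::AbstractMatrix)
    rows, cols, _ = findnz(y)
    sparse(rows, cols, [op[i, j] for (i, j) in zip(rows, cols)], size(y)...)
end

y = sparse([1, 2], [1, 2], [1.0, 1.0])   # diagonal sparsity pattern
project_like(y, [1.0 2.0; 3.0 4.0])      # keeps only the diagonal: 1.0 and 4.0
```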