# Automatic Differentiation

Yao ships with a builtin automatic differentiation (AD) engine, based on operator overloading and designed specifically for quantum circuits. It exploits the reversibility of quantum computation to optimize performance during simulation, so for circuit simulation it can be considerably faster than general-purpose AD engines.

## Builtin reverse-mode AD engine for simulation

For expectation values the usage is simple, since an expectation is evaluated as

`expect(H, rand_state(10)=>circuit)`

To get the gradients, simply add an adjoint:

`expect'(H, rand_state(10)=>circuit)`

which returns a pair of gradients: the gradient with respect to the input register and the gradient with respect to the circuit parameters.
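Putting this together, a minimal sketch (assuming Yao is installed; the circuit and Hamiltonian below are arbitrary illustrative choices, not from the original text):

```julia
using Yao

n = 2
# A toy parameterized circuit and a toy Hamiltonian.
circuit = chain(n, put(1 => Rx(0.3)), control(1, 2 => X), put(2 => Rz(0.5)))
H = kron(n, 1 => Z, 2 => Z)

# Forward evaluation of the expectation value.
val = expect(H, zero_state(n) => circuit)

# Reverse-mode gradients: one for the input register,
# and a vector with one entry per circuit parameter.
regδ, paramsδ = expect'(H, zero_state(n) => circuit)
```

Here `paramsδ` has the same length as `parameters(circuit)`, which is what a variational optimizer would consume.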

## Integration with general-purpose AD engines

The builtin AD engine in Yao only differentiates quantum circuits, but you can plug it into a general-purpose AD engine such as Zygote, since these rules have been ported to ChainRules.
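For example, a hedged sketch of differentiating a circuit loss with Zygote (assuming Yao and Zygote are installed; the use of `dispatch` to inject the parameter vector is one common pattern, not the only one):

```julia
using Yao, Zygote

n = 2
circuit = chain(n, put(1 => Rx(0.0)), put(2 => Rz(0.0)))
H = kron(n, 1 => Z, 2 => Z)

# Loss as a function of the raw parameter vector.
loss(θ) = real(expect(H, zero_state(n) => dispatch(circuit, θ)))

# Zygote picks up the circuit-differentiation rules via ChainRules.
θ = [0.3, 0.5]
grad, = Zygote.gradient(loss, θ)
```

This lets the quantum-circuit gradient compose with arbitrary classical pre- and post-processing handled by Zygote.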

## APIs

`YaoBlocks.AD.apply_back` — Method

`apply_back(st::Tuple{<:AbstractArrayReg, <:AbstractArrayReg}, block::AbstractBlock; kwargs...) -> (out, outδ), paramsδ`

The backward function of `apply!`. Returns a tuple of ((input register, gradient of input register), parameter gradients).
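A hedged sketch of calling `apply_back` directly, seeding the output adjoint with the output state itself (an arbitrary choice for illustration; in practice the seed comes from the downstream objective):

```julia
using Yao

circuit = chain(2, put(1 => Rx(0.3)), control(1, 2 => X))
out = apply(zero_state(2), circuit)

# Seed the adjoint of the output register.
outδ = copy(out)

# Walk the circuit backwards, recovering the input state (reversibility)
# together with the adjoints of the input and of the parameters.
(in_reg, inδ), paramsδ = YaoBlocks.AD.apply_back((copy(out), outδ), circuit)
```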

`YaoBlocks.AD.generator` — Method

`generator(rot::Rotor) -> AbstractBlock`

Return the generator of a rotation block.
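For instance (a small sketch; since `Rx(θ) = exp(-iθX/2)`, the generator of an `Rx` rotation is expected to be the Pauli `X` block):

```julia
using Yao

# The generator of an X-rotation, independent of the angle.
g = YaoBlocks.AD.generator(Rx(0.5))
```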

`YaoBlocks.AD.mat_back` — Method

`mat_back([::Type{T}, ]block::AbstractBlock, adjm::AbstractMatrix) -> Vector`

The backward function of `mat`. Returns the gradients of the parameters.

`YaoBlocks.AD.projection` — Method

`projection(y::AbstractMatrix, op::AbstractMatrix) -> typeof(y)`

Project `op` to a sparse matrix with the same sparsity pattern as `y`.
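A sketch of the idea (assumed behavior, inferred from the signature above): projecting a dense operator onto the sparsity pattern of a sparse matrix keeps only the entries where `y` has structural nonzeros, so the result has the same storage type as `y`:

```julia
using Yao, SparseArrays

y = sprand(ComplexF64, 4, 4, 0.3)   # sparse matrix defining the target pattern
op = rand(ComplexF64, 4, 4)         # dense operator to be projected

p = YaoBlocks.AD.projection(y, op)  # same storage type and pattern as y
```

This is used internally to keep matrix gradients in the same (sparse) representation as the block's own matrix.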

`YaoBlocks.AD.rotgrad` — Method

The matrix gradient of a rotation block.