Amazon Braket now supports adjoint gradient computation, unlocking runtime improvements and cost savings

SV1, the on-demand state vector simulator on Amazon Braket, now supports the computation of gradients using the adjoint differentiation method, enabling customers to reduce runtime and save costs for their quantum machine learning and optimization workloads. With this launch, customers simulating variational quantum algorithms with a large number of parameters, such as the quantum approximate optimization algorithm (QAOA), can now seamlessly incorporate adjoint gradient computation either directly from the Braket Python SDK or API, or through PennyLane, an open-source software framework built for quantum differentiable programming.
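To illustrate the PennyLane path, the sketch below defines a small parameterized circuit on SV1 and asks PennyLane to defer gradient computation to the device. This is a minimal sketch, not the official announcement code: the device ARN, wire count, shots=0 setting, and diff_method="device" value are assumptions for this example and should be checked against the Amazon Braket and PennyLane plugin documentation.

import pennylane as qml
from pennylane import numpy as np

# Assumed ARN for the SV1 on-demand state vector simulator;
# shots=0 requests an exact (analytic) simulation.
device_arn = "arn:aws:braket:::device/quantum-simulator/amazon/sv1"

# The "braket.aws.qubit" device is provided by the amazon-braket-pennylane-plugin.
dev = qml.device("braket.aws.qubit", device_arn=device_arn, wires=2, shots=0)

# diff_method="device" hands gradient evaluation to the simulator, which can
# compute derivatives with respect to all parameters in a single pass via
# adjoint differentiation instead of running one circuit per parameter.
@qml.qnode(dev, diff_method="device")
def cost(params):
    qml.RY(params[0], wires=0)
    qml.CNOT(wires=[0, 1])
    qml.RY(params[1], wires=1)
    return qml.expval(qml.PauliZ(0) @ qml.PauliZ(1))

params = np.array([0.1, 0.2], requires_grad=True)
print(qml.grad(cost)(params))  # gradient of the expectation value w.r.t. both parameters

Because the adjoint method evaluates all parameter derivatives in one simulation rather than two circuit executions per parameter (as with parameter-shift rules), the savings grow with the number of variational parameters, which is where workloads such as QAOA benefit most.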

