The Wayback Machine - https://web.archive.org/web/20210731232315/https://github.com/topics/autograd

autograd

Here are 105 public repositories matching this topic...

twoertwein
twoertwein commented Jul 3, 2021

🚀 Feature

Support pickling a jitted function (or at least throw a TypeError when using protocols 0 and 1).

Motivation

Trying to pickle a jitted function either raises TypeError: cannot pickle 'torch._C.ScriptFunction' object when protocol > 1, or, far worse, when using protocol=0 or protocol=1, Python 3.9.5 dies with:

terminate called after throwing an instance of 'std::runt
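The consistent failure mode the request asks for (a clean TypeError at every pickle protocol, rather than a crash) can be illustrated with a standard-library object that is deliberately unpicklable; this is only a stdlib analogue, no torch.jit involved:

```python
import pickle
import threading

# A lock is a stdlib example of an unpicklable object: pickling it raises
# TypeError at every protocol, which is the behavior the feature request
# wants ScriptFunction to mimic for protocols 0 and 1.
lock = threading.Lock()

for protocol in range(pickle.HIGHEST_PROTOCOL + 1):
    try:
        pickle.dumps(lock, protocol=protocol)
    except TypeError as exc:
        print(f"protocol={protocol}: TypeError: {exc}")
```

Every iteration prints a TypeError line; no protocol crashes the interpreter.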
pennylane
albi3ro
albi3ro commented Jul 7, 2021

As I was inspecting the MultiControlledX gate, I noticed how it printed out:

MultiControlledX(array([[0, 1],
       [1, 0]]), wires=[0, 1, 2, 3])

I was quite confused as to where the array came from. It turns out MultiControlledX inherits from ControlledQubitUnitary, with the unitary matrix set to an X gate. This makes MultiControlledX a parametrized gate.

Worse yet, th
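The mechanism behind the confusing output can be sketched without PennyLane. The class names below are hypothetical stand-ins for ControlledQubitUnitary and MultiControlledX; the point is that a subclass which hard-codes a fixed matrix into a matrix-parametrized base class inherits a repr that prints the matrix as if it were a free parameter:

```python
# Pauli-X matrix, the fixed unitary the subclass hard-codes.
X_MATRIX = [[0, 1], [1, 0]]

class ControlledUnitary:
    """Stand-in for a base gate parametrized by an arbitrary unitary."""
    def __init__(self, matrix, wires):
        self.matrix = matrix
        self.wires = wires

    def __repr__(self):
        # The matrix is treated as gate data, so it appears when printing.
        return f"{type(self).__name__}({self.matrix}, wires={self.wires})"

class MultiControlledX(ControlledUnitary):
    """Conceptually parameter-free, but inherits the matrix-printing repr."""
    def __init__(self, wires):
        super().__init__(X_MATRIX, wires)

print(MultiControlledX(wires=[0, 1, 2, 3]))
# MultiControlledX([[0, 1], [1, 0]], wires=[0, 1, 2, 3])
```

The fix suggested by the issue would be to stop treating the hard-coded matrix as a parameter of the subclass.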

norse
cpehle
cpehle commented Jul 22, 2021

We should support the pseudo-derivative used in Bellec et al. 2018 and subsequent
work by other authors as implemented for example here:

https://github.com/IGITUGraz/eligibility_propagation/blob/efd02e6879c01cda3fa9a7838e8e2fd08163c16e/Figure_3_and_S7_e_prop_tutorials/models.py#L40-L70

It has the advantage that it naturally leads to sparse gradients and is therefore a good alternative
to f
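As a rough sketch, the pseudo-derivative in Bellec et al. 2018 and the linked e-prop code is a dampened triangular function of the scaled membrane potential; the function and argument names below are my own:

```python
def pseudo_derivative(v_scaled, dampening_factor=0.3):
    """Triangular surrogate gradient for the spike non-linearity.

    Sketch of the form used in Bellec et al. 2018 (e-prop):
        psi = dampening_factor * max(0, 1 - |v_scaled|)
    where v_scaled is the membrane potential, centred at the firing
    threshold and normalised by it. It is exactly zero for
    |v_scaled| >= 1, which is what makes the gradients sparse.
    """
    return dampening_factor * max(0.0, 1.0 - abs(v_scaled))

# Maximal at the threshold, zero far away from it.
print(pseudo_derivative(0.0))  # 0.3
print(pseudo_derivative(2.0))  # 0.0
```

Because the pseudo-derivative vanishes outside a window around the threshold, most neurons contribute no gradient on a given step, unlike smooth surrogates with unbounded support.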

qml
josh146
josh146 commented Apr 23, 2021

The init module has been deprecated; the recommended approach for generating initial weights is to use the Template.shape method:

>>> from pennylane.templates import StronglyEntanglingLayers
>>> qml.init.strong_ent_layers_normal(n_layers=3, n_wires=2) # deprecated
>>> np.random.random(StronglyEntanglingLayers.shape(n_layers=3, n_wires=2))  # new approach

We should upd
