Commit f8f048ed authored by Stephan Seitz's avatar Stephan Seitz

Extend README.rst with example

parent a7d2bf41
Pipeline #17047 failed with stage in 1 minute and 39 seconds
or if you downloaded this repository:
.. code-block:: bash

    pip install -e .
Create a :class:`pystencils.AssignmentCollection` with pystencils:
.. testcode::

    import sympy
    import pystencils

    z, x, y = pystencils.fields("z, y, x: [20,30]")
    forward_assignments = pystencils.AssignmentCollection({
        z[0, 0]: x[0, 0] * sympy.log(x[0, 0] * y[0, 0])
    })

    print(forward_assignments)
.. testoutput::

    Main Assignments:
         z[0,0] ← y_C*log(x_C*y_C)
You can then obtain the corresponding backward assignments:
.. testcode::

    from pystencils.autodiff import AutoDiffOp, create_backward_assignments

    backward_assignments = create_backward_assignments(forward_assignments)

    print(backward_assignments)
You can see the derivatives with respect to the two inputs, each multiplied by the gradient ``diffz_C`` of the output ``z_C``.
.. testoutput::

    Main Assignments:
         \hat{y}[0,0] ← diffz_C*(log(x_C*y_C) + 1)
         \hat{x}[0,0] ← diffz_C*y_C/x_C
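Outside the doctest, the backward assignments above can be sanity-checked numerically: a central finite difference of the forward expression ``z_C = y_C*log(x_C*y_C)`` should match the analytic gradients. A minimal sketch using only the standard library; the helper names ``forward``, ``d_dx`` and ``d_dy`` are illustrative and not part of pystencils:

```python
import math

# Forward expression from the example, in terms of the field
# center values: z_C = y_C * log(x_C * y_C)
def forward(x_c, y_c):
    return y_c * math.log(x_c * y_c)

# Analytic derivatives, taken from the backward assignments above
def d_dx(x_c, y_c):           # \hat{x} with diffz_C = 1
    return y_c / x_c

def d_dy(x_c, y_c):           # \hat{y} with diffz_C = 1
    return math.log(x_c * y_c) + 1

# Central finite differences at an arbitrary positive point
x_c, y_c, h = 2.0, 3.0, 1e-6
fd_x = (forward(x_c + h, y_c) - forward(x_c - h, y_c)) / (2 * h)
fd_y = (forward(x_c, y_c + h) - forward(x_c, y_c - h)) / (2 * h)

assert abs(fd_x - d_dx(x_c, y_c)) < 1e-5
assert abs(fd_y - d_dy(x_c, y_c)) < 1e-5
```

This is exactly the check that gradient-verification utilities in autodiff frameworks perform under the hood.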
You can also use the class :class:`AutoDiffOp` to obtain both the assignments (if you are curious) and auto-differentiable operations for TensorFlow...
.. testcode::

    op = AutoDiffOp(forward_assignments)
    backward_assignments = op.backward_assignments

    x_tensor = pystencils.autodiff.tf_variable_from_field(x)
    y_tensor = pystencils.autodiff.tf_variable_from_field(y)
    tensorflow_op = op.create_tensorflow_op({x: x_tensor, y: y_tensor}, backend='tensorflow')
... or Torch:
.. testcode::

    x_tensor = pystencils.autodiff.torch_tensor_from_field(x, cuda=False, requires_grad=True)
    y_tensor = pystencils.autodiff.torch_tensor_from_field(y, cuda=False, requires_grad=True)
    z_tensor = op.create_tensorflow_op({x: x_tensor, y: y_tensor}, backend='torch')
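The pattern behind :class:`AutoDiffOp` can be sketched framework-agnostically: an op bundles a forward kernel with a backward kernel that maps the incoming output gradient ``diffz`` to one gradient per input. A minimal pure-Python sketch under that assumption; the class ``SimpleAutoDiffOp`` and its ``forward_fn``/``backward_fn`` parameters are hypothetical and not part of the pystencils API:

```python
import math

class SimpleAutoDiffOp:
    """Pairs a forward kernel with its derived backward kernel."""

    def __init__(self, forward_fn, backward_fn):
        self.forward_fn = forward_fn      # (x, y) -> z
        self.backward_fn = backward_fn    # (x, y, diffz) -> (grad_x, grad_y)

    def forward(self, x, y):
        return self.forward_fn(x, y)

    def backward(self, x, y, diffz):
        return self.backward_fn(x, y, diffz)

# The example's forward assignment, z_C = y_C * log(x_C * y_C),
# paired with the backward assignments shown above:
op = SimpleAutoDiffOp(
    forward_fn=lambda x, y: y * math.log(x * y),
    backward_fn=lambda x, y, diffz: (
        diffz * y / x,                     # \hat{x}[0,0]
        diffz * (math.log(x * y) + 1),     # \hat{y}[0,0]
    ),
)

grad_x, grad_y = op.backward(2.0, 3.0, diffz=1.0)
```

TensorFlow's custom-gradient mechanism and Torch's ``autograd.Function`` both follow this forward/backward pairing; the generated kernels simply replace the lambdas.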