Python – Compute the Jacobian matrix in Python

derivative, numpy, python
import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
b = np.array([[1, 2, 3]]).T

c = a.dot(b)  # function
jacobian = a  # the partial derivative of c w.r.t. b is a

I am reading about the Jacobian matrix and trying to build one. From what I have read so far, this Python code should give the Jacobian. Am I understanding this right?
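One way to sanity-check this is to compare against a finite-difference approximation of the Jacobian of c with respect to b, which should recover a for a linear map like a.dot(b). A minimal sketch; the helper numerical_jacobian is purely illustrative and not part of the code above:

import numpy as np

def numerical_jacobian(f, x, eps=1e-6):
    # Approximate the Jacobian of f at x by central differences.
    x = np.asarray(x, dtype=float)
    y = f(x)
    jac = np.zeros((y.size, x.size))
    for j in range(x.size):
        step = np.zeros_like(x)
        step[j] = eps
        jac[:, j] = (f(x + step) - f(x - step)).ravel() / (2 * eps)
    return jac

a = np.array([[1., 2., 3.],
              [4., 5., 6.],
              [7., 8., 9.]])
f = lambda b: a.dot(b)           # linear map c = a @ b
b0 = np.array([1., 2., 3.])

print(np.allclose(numerical_jacobian(f, b0), a))  # True: the Jacobian of a linear map is a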

Best Solution

You can use the Harvard autograd library (link), where grad and jacobian take a function as their argument:

import autograd.numpy as np
from autograd import grad, jacobian

x = np.array([5, 3], dtype=float)

def cost(x):
    return x[0]**2 / x[1] - np.log(x[1])

gradient_cost = grad(cost)
jacobian_cost = jacobian(cost)

gradient_cost(x)
jacobian_cost(np.array([x, x, x]))
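To see what the gradient call returns, you can compare it against the hand-derived gradient of cost, which is (2*x0/x1, -x0**2/x1**2 - 1/x1). A small sketch, assuming autograd is installed:

import autograd.numpy as np
from autograd import grad

def cost(x):
    return x[0]**2 / x[1] - np.log(x[1])

x = np.array([5., 3.])

# Analytic gradient for comparison: d/dx0 = 2*x0/x1, d/dx1 = -x0**2/x1**2 - 1/x1
analytic = np.array([2 * x[0] / x[1], -x[0]**2 / x[1]**2 - 1 / x[1]])

print(np.allclose(grad(cost)(x), analytic))  # True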

Otherwise, you could use the jacobian method available for matrices in sympy:

from sympy import sin, cos, Matrix
from sympy.abc import rho, phi

X = Matrix([rho*cos(phi), rho*sin(phi), rho**2])
Y = Matrix([rho, phi])

X.jacobian(Y)
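If you then need numeric values, you can substitute into the symbolic Jacobian or compile it with lambdify. A sketch; the values rho=2 and phi=pi/2 are chosen purely for illustration:

from sympy import sin, cos, pi, Matrix, lambdify
from sympy.abc import rho, phi

X = Matrix([rho*cos(phi), rho*sin(phi), rho**2])
Y = Matrix([rho, phi])
J = X.jacobian(Y)

# Exact substitution of symbolic values
print(J.subs({rho: 2, phi: pi/2}))   # Matrix([[0, -2], [1, 0], [4, 0]])

# Or compile to a NumPy function for repeated numeric evaluation
J_num = lambdify((rho, phi), J, "numpy")
print(J_num(2.0, 1.5707963267948966))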

You may also be interested in this low-level variant (link). MATLAB provides nice documentation on its jacobian function here.