Computational Science Asked on September 2, 2021
I have an objective function F: R^N -> R^N, where N > 30000. The function involves many sparse matrix/tensor multiplications, so deriving an analytic Jacobian by pen and paper is cumbersome.
Which automatic differentiation (AD) tool should I use to compute a sparse Jacobian as fast as possible? If needed, I am ready to rewrite my code from MATLAB to Python or Julia.
Julia has a whole ecosystem for generating sparsity patterns and doing sparse automatic differentiation in a way that mixes with scientific computing and machine learning (or scientific machine learning). Tools like SparseDiffTools.jl, ModelingToolkit.jl, and SparsityDetection.jl will do things like detect the sparsity pattern of a function automatically, compute a matrix coloring from it, and use the coloring to compress Jacobian evaluations.
An integrated example auto-accelerating an ODE solve with sparsity for 55x speedups can be found here.
To see what this looks like in action, let's take a simple discretization of the Laplace equation:
fcalls = 0
function f(dx, x) # in-place
    global fcalls += 1
    for i in 2:length(x)-1
        dx[i] = x[i-1] - 2x[i] + x[i+1]
    end
    dx[1] = -2x[1] + x[2]
    dx[end] = x[end-1] - 2x[end]
    nothing
end
I put a little function counter in there to demonstrate how this works. We can generate the sparsity pattern by using SparsityDetection.jl:
using SparsityDetection, SparseArrays
input = rand(10)
output = similar(input)
sparsity_pattern = jacobian_sparsity(f,output,input)
jac = Float64.(sparse(sparsity_pattern))
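Since f is linear, its Jacobian is the constant second-difference matrix with stencil (1, -2, 1), so the detected pattern should be exactly tridiagonal. A quick illustrative check (assuming the session state built up above):

using LinearAlgebra
# f is linear, so its Jacobian is constant with stencil (1, -2, 1);
# the detected pattern must therefore be the 10x10 tridiagonal structure
# with 28 structural nonzeros (the diagonal plus both off-diagonals).
@assert jac == Tridiagonal(ones(9), ones(10), ones(9))
@assert nnz(jac) == 28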
We get that tridiagonal matrix that we all know and love. From here we perform matrix coloring:
using SparseDiffTools
colors = matrix_colors(jac)
Since maximum(colors) is 3, only 4 function evaluations are required for finite differencing to compute the full Jacobian: one base evaluation plus one perturbed evaluation per color (to see how this all works, consult the MIT 18.337 Parallel Computing and Scientific Machine Learning lecture notes, specifically the portions on forward-mode AD and solving stiff ODEs). So then we can compute the whole sparse Jacobian in a fast way with:
using FiniteDiff
FiniteDiff.finite_difference_jacobian!(jac, f, rand(10), colorvec=colors)
@show fcalls # 5
Note that the total number of function calls is 5 because the automated sparsity detection made one fake call to f via abstract interpretation in order to generate the sparsity pattern.
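To see why coloring cuts the call count, here is a minimal sketch of compressed finite differencing; this is a simplification for exposition, not FiniteDiff's actual internals:

# Sketch of coloring-based compression (not FiniteDiff's internals).
# Columns that never share a nonzero row have the same color, so one
# perturbed call recovers all of those Jacobian columns at once:
# total cost = 1 base call + 1 call per color.
function compressed_fd!(jac, f, x, colors; eps=1e-8)
    fx = similar(x); f(fx, x)                # base evaluation
    fxp = similar(x); rows = rowvals(jac)
    for c in 1:maximum(colors)
        seed = Float64.(colors .== c)        # perturb all columns of color c at once
        f(fxp, x .+ eps .* seed)
        dfd = (fxp .- fx) ./ eps             # compressed directional difference
        for j in findall(==(c), colors), k in nzrange(jac, j)
            jac[rows[k], j] = dfd[rows[k]]   # scatter into structural nonzeros
        end
    end
    jac
end

For the tridiagonal example, compressed_fd!(jac, f, rand(10), colors) costs exactly 4 calls: one base call plus one per color.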
We can then make use of forward-mode AD for the sparsity pattern via:
x = rand(10)
forwarddiff_color_jacobian!(jac, f, x, colorvec = colors)
which only needs a total of 3 f calls to generate the full Jacobian (one per color: forward-mode AD does not need the extra base evaluation that finite differencing does). The packages FiniteDiff.jl and SparseDiffTools.jl allow for pre-caching all of the calculation components, so you can make this even faster than this demonstration and make the full inner loop completely non-allocating.
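For example, a sketch of the pre-caching pattern, assuming the ForwardColorJacCache constructor documented in SparseDiffTools (check your installed version for the exact keyword names):

# Sketch assuming SparseDiffTools' ForwardColorJacCache constructor;
# the cache holds the dual-number work buffers so repeated Jacobian
# evaluations allocate nothing.
cache = ForwardColorJacCache(f, x; colorvec = colors, sparsity = jac)
forwarddiff_color_jacobian!(jac, f, x, cache)

# Since f is linear, we can verify against the analytic tridiagonal Jacobian:
using LinearAlgebra
@assert jac ≈ Tridiagonal(ones(9), -2ones(10), ones(9))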
Note that matrix coloring for reverse-mode AD is done via matrix_colors(jac'), which can then be used for sparse reverse mode with Zygote.jl, ReverseDiff.jl, and more.
But as @chennaK mentioned, sparse automatic differentiation can still have a bit of overhead. To get something fully optimal, we can use ModelingToolkit.jl to generate the full beautiful sparse (and parallelized) code. We can generate the symbolic mathematical model from our code via abstract interpretation:
using ModelingToolkit
@variables u[1:10] du[1:10]
f(du, u) # trace f on the symbolic variables
du       # now holds the symbolic output
10-element Array{Operation,1}:
-2u₁ + u₂
(u₁ - 2u₂) + u₃
(u₂ - 2u₃) + u₄
(u₃ - 2u₄) + u₅
(u₄ - 2u₅) + u₆
(u₅ - 2u₆) + u₇
(u₆ - 2u₇) + u₈
(u₇ - 2u₈) + u₉
(u₈ - 2u₉) + u₁₀
u₉ - 2u₁₀
Now we can use sparsejacobian to generate the symbolic expression for the sparse Jacobian:
sparsejac = ModelingToolkit.sparsejacobian(du,u)
and then we can tell it to generate a fast, non-allocating, multithreaded Julia code:
build_function(sparsejac,u,parallel=ModelingToolkit.MultithreadedForm())[2]
which generates the code here that you can eval and use in whatever other codes you need. This scales to at least a few million inputs, so it's what we utilize in AutoOptimize.jl to perform automated optimization of user code.
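A usage sketch (the [2] index assumes, as above, that build_function returns the in-place form second):

# Usage sketch: eval the generated expression to get an in-place function,
# then call it on a preallocated sparse matrix with the detected structure.
jac_fast! = eval(build_function(sparsejac, u,
    parallel=ModelingToolkit.MultithreadedForm())[2])
J = Float64.(sparse(sparsity_pattern))   # reuse the pattern from earlier
jac_fast!(J, rand(10))                   # fills the nonzeros of J in place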
The nice thing about doing this all in Julia is that Julia will then generate very efficient machine code from all of these calls, meaning the performance is more in line with C++ than with Python. One demonstration of this is stiff ODE solvers in pure Julia outperforming C-based methods like CVODE by 5x. So while Julia is a high-level language and this is all a fun, quick, and interactive sparse AD codegen example, just because it's simple doesn't mean it's not fast!
Answered by Chris Rackauckas on September 2, 2021
I would also like to point to MatlabAutoDiff, which supports sparse Jacobians. I have tried it myself and was able to compute large Jacobians (tried with N = 1e5) in a small amount of time.
Answered by Someone on September 2, 2021