Astrophysics Source Code Library

ASCL Code Record

[ascl:2111.002] JAX: Autograd and XLA

JAX brings Autograd and XLA together for high-performance machine learning research. It can automatically differentiate native Python and NumPy functions. The code can differentiate through loops, branches, recursion, and closures, and it can take derivatives of derivatives of derivatives. JAX supports reverse-mode differentiation (a.k.a. backpropagation) via grad as well as forward-mode differentiation, and the two can be composed arbitrarily to any order.
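
A minimal sketch of these capabilities, written against the public JAX API (jax.grad, jax.jacfwd, jax.numpy); the function f, its input value, and the variable names are illustrative assumptions, not taken from the ASCL record:

```python
import jax
import jax.numpy as jnp

def f(x):
    # Ordinary Python control flow: grad traces with concrete values here,
    # so a branch and a loop on the input work directly (outside of jit).
    if x > 0:
        y = jnp.tanh(x)
    else:
        y = -jnp.tanh(x)
    for _ in range(3):                      # Python loop, unrolled while tracing
        y = y * jnp.sin(y)
    return y

df = jax.grad(f)                            # reverse-mode (backpropagation)
d3f = jax.grad(jax.grad(jax.grad(f)))       # derivatives of derivatives of derivatives
fwd_over_rev = jax.jacfwd(jax.grad(f))      # forward- and reverse-mode composed

print(df(1.5), d3f(1.5), fwd_over_rev(1.5))
```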

Code site: https://github.com/google/jax
Used in: https://ui.adsabs.harvard.edu/abs/2021ApJ...910L..17M
Bibcode: 2021ascl.soft11002B
Preferred citation method: Please see citation information here: https://github.com/google/jax#citing-jax


