mlsapunov August 11th | Data Science by 🦜

Now it is official.

I've started writing a book on JAX. This seems to be the first book ever on this topic.

For those who don't know, JAX is an exceptionally cool numerical computing library from Google, a kind of NumPy on steroids, with autodiff, XLA compilation, and hardware acceleration on TPUs/GPUs. JAX also brings the functional programming paradigm to deep learning.
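To give a taste of what "NumPy on steroids" means, here is a minimal sketch (the function `f` is just an illustrative example, not from the book): you write ordinary NumPy-style code, then get a derivative with `jax.grad` and an XLA-compiled version with `jax.jit`.

```python
import jax
import jax.numpy as jnp

# An ordinary numerical function: f(x) = x^2 + 2x
def f(x):
    return x ** 2 + 2.0 * x

df = jax.grad(f)     # automatic differentiation: df(x) = 2x + 2
f_fast = jax.jit(f)  # compile f with XLA for CPU/GPU/TPU

print(df(3.0))       # derivative at 3.0 -> 8.0
print(f_fast(3.0))   # same result as f(3.0) -> 15.0
```

Note the functional style: `grad` and `jit` are transformations that take a pure function and return a new function, which is exactly the paradigm shift mentioned above.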

JAX is heavily used for deep learning and is already a contender for the #3 deep learning framework spot. Some companies, like DeepMind, have already switched to JAX internally. There are rumors that Google is also switching to JAX.

The JAX ecosystem is constantly growing. There are a lot of high-quality deep learning modules. But JAX is not limited to deep learning. There are many exciting applications and libraries built on top of JAX for physics, including molecular dynamics, fluid dynamics, rigid body simulation, quantum computing, astrophysics, and ocean modeling. There are also libraries for distributed matrix factorization, streaming data processing, protein folding, and chemical modeling, with new applications emerging constantly.

Anyway, it's a perfect time to start learning JAX!

The book is available today as part of the Manning Early Access Program (MEAP), so you can read the book as I write it. This is a very smart way of learning something new: you do not have to wait until the complete book is ready. You can start learning right away, and by the time the book is published, you will already know the material. Your feedback will also be very valuable, and you can influence how the book takes shape.

Here's a link to the book:

If you want a decent discount, use the code mlsapunov. It will give you 40% off and is valid through August 11th.