
Google AI introduces Rax, an open-source learning-to-rank library for the JAX ecosystem. Rax is compatible with conventional JAX transformations and libraries, allowing users to combine their preferred JAX and Rax tools seamlessly.

Ranking is a central problem in many areas, such as search engines, recommendation systems, and question answering. Researchers therefore often use learning-to-rank (LTR), a set of supervised machine learning techniques that optimize for the usefulness of an entire list of items rather than a single item at a time. Combining LTR with deep learning has become a major focus in recent years. Researchers and practitioners can get the tools they need to apply LTR from existing libraries, especially TF-Ranking, but none of the existing LTR libraries works natively with JAX. JAX is a machine learning framework that provides an extensible system of composable function transformations, enabling features such as automatic differentiation and JIT compilation to GPU/TPU devices.
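The "entire list" aspect of LTR can be made concrete with a small sketch. The following is not Rax code, just a minimal illustration in plain JAX of a listwise softmax cross-entropy loss: the loss for each item depends on the scores of the whole list, and JAX's function transformations (here `jax.grad`) compose with it directly.

```python
import jax
import jax.numpy as jnp

def softmax_ranking_loss(scores, labels):
    """Listwise loss: normalizing over the full list ties each item's
    contribution to all the other scores, unlike a pointwise loss."""
    log_probs = jax.nn.log_softmax(scores)   # normalize over the list
    weights = labels / jnp.sum(labels)       # relevance as a distribution
    return -jnp.sum(weights * log_probs)

scores = jnp.array([2.0, 1.0, 0.5])  # model scores for one query's items
labels = jnp.array([1.0, 0.0, 1.0])  # graded relevance labels

loss = softmax_ranking_loss(scores, labels)
# JAX transformations compose with the loss as with any function:
grads = jax.grad(softmax_ranking_loss)(scores, labels)
```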

Researchers at Google AI built Rax, a library for LTR in the JAX ecosystem, bringing decades of LTR research to JAX. This makes it possible to solve various ranking problems with JAX and to combine ranking techniques with recent advances in deep learning built on JAX (e.g., T5X). Rax provides state-of-the-art ranking losses, several standard ranking metrics, and a set of function transformations for optimizing ranking metrics, all through a well-documented API that JAX users will find familiar and easy to use.

Conclusion

Rax is a new addition to the constantly growing JAX ecosystem, and it is entirely free and open source. The researchers encourage everyone to look at the examples in the GitHub repository:

1) optimizing a neural network with Flax and Optax, 

2) comparing different approximate metric optimization techniques, and 

3) integrating Rax with T5X.

For more technical information, please read their article.
