Google unveiled Minerva, an in-house neural network capable of answering mathematical questions and tackling other challenging quantitative subjects such as physics.

Minerva is a natural language processing model. In the past few years, researchers have built highly capable natural language processing models that can write essays and translate text. But Google says these neural networks haven't shown much ability to solve "quantitative reasoning problems," such as math questions.

Overview

Language models have done very well on many natural language tasks. A general lesson from works such as BERT, GPT-3, Gopher, and PaLM is that neural networks trained on large, diverse corpora with self-supervised objectives can perform well across many tasks.

On quantitative reasoning, however, language models still fall well short of human performance. Solving math and science problems requires a mix of skills: reading and understanding a question posed in natural language and mathematical notation, recalling relevant formulas and constants, and producing step-by-step solutions that involve numerical calculation and symbolic manipulation.

Research contribution

In the paper "Solving Quantitative Reasoning Problems with Language Models", the researchers introduce Minerva, a language model that answers mathematical and scientific questions with step-by-step reasoning. They show that performance on complex quantitative reasoning tasks can be improved by collecting data relevant to quantitative reasoning problems, training models at scale, and using state-of-the-art inference techniques. Minerva solves these problems by generating solutions that include numerical calculations and symbolic manipulation, without relying on external tools such as a calculator.

The model parses math questions and gives answers using both natural language and mathematical notation. In addition, Minerva combines several techniques, such as few-shot prompting, chain-of-thought (scratchpad) prompting, and majority voting, to do the best possible job on STEM reasoning tasks. You can browse example outputs in the researchers' interactive sample explorer.
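To make these inference techniques concrete, here is a minimal Python sketch of few-shot, chain-of-thought prompting combined with majority voting over sampled solutions. The sample_completion stub, the example prompt, and the answer-extraction heuristic are illustrative assumptions, not details taken from the Minerva paper.

```python
import re
from collections import Counter

# A few-shot prompt: one worked example with step-by-step reasoning, then the new question.
FEW_SHOT_PROMPT = """\
Q: A train travels 60 km in 1.5 hours. What is its average speed in km/h?
A: Average speed is distance divided by time, so 60 / 1.5 = 40. The answer is 40.

Q: {question}
A:"""


def sample_completion(prompt: str) -> str:
    # Placeholder standing in for a large language model sampled at temperature > 0;
    # swap in a real model or API call here.
    return "Average speed is distance divided by time, so 120 / 2 = 60. The answer is 60."


def extract_final_answer(completion: str) -> str | None:
    # Pull the final numeric answer out of the generated chain of thought.
    match = re.search(r"The answer is\s*(-?\d+(?:\.\d+)?)", completion)
    return match.group(1) if match else None


def majority_vote_answer(question: str, num_samples: int = 16) -> str | None:
    """Sample several step-by-step solutions and return the most common final answer."""
    prompt = FEW_SHOT_PROMPT.format(question=question)
    answers = []
    for _ in range(num_samples):
        answer = extract_final_answer(sample_completion(prompt))
        if answer is not None:
            answers.append(answer)
    # Majority voting: different reasoning paths may disagree; the answer that
    # appears most often across the samples is returned.
    return Counter(answers).most_common(1)[0][0] if answers else None


if __name__ == "__main__":
    print(majority_vote_answer("A car travels 120 km in 2 hours. What is its average speed in km/h?"))
```

The key idea is that a single sampled chain of thought can go wrong, but aggregating the final answers from many independently sampled solutions tends to be more reliable.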

Furthermore, Minerva builds on the Pathways Language Model (PaLM), further training it on a 118GB dataset of scientific papers from the arXiv preprint server and of web pages containing mathematical expressions written in LaTeX, MathJax, or other mathematical typesetting formats.
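The preprocessing pipeline itself is not reproduced here, but the idea described above can be sketched as follows: keep pages that contain LaTeX-style mathematics and strip HTML markup without destroying the math notation. The regular expressions, the clean_page helper, and the toy raw_pages list are illustrative assumptions, not the actual pipeline.

```python
import re

# Patterns for common LaTeX math delimiters: $...$, $$...$$, \( ... \), \[ ... \].
MATH_PATTERN = re.compile(r"\$\$.+?\$\$|\$[^$\n]+\$|\\\(.+?\\\)|\\\[.+?\\\]", re.DOTALL)


def contains_math(text: str) -> bool:
    """Keep a document only if it contains at least one LaTeX-style math expression."""
    return MATH_PATTERN.search(text) is not None


def clean_page(html: str) -> str:
    """Naively strip HTML tags while leaving the LaTeX markup itself untouched."""
    no_tags = re.sub(r"<[^>]+>", " ", html)
    return re.sub(r"\s+", " ", no_tags).strip()


# Toy inputs standing in for crawled web pages.
raw_pages = [
    "<p>The rest energy of a body is $E = mc^2$.</p>",
    "<p>This page contains no mathematics at all.</p>",
]

corpus = [clean_page(p) for p in raw_pages if contains_math(p)]
print(corpus)  # only the page with $E = mc^2$ survives, LaTeX intact
```

Preserving the typeset notation matters because the model is expected to read and write answers in that same notation.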

Limitations

Minerva's approach to quantitative reasoning is not grounded in formal mathematics. The model parses questions and generates answers using a mix of natural language and LaTeX mathematical expressions, with no explicit mathematical structure underneath.

Formal theorem-proving systems (e.g., Coq, Isabelle, HOL, Lean, Metamath, and Mizar) do not have this problem, since every step is machine-checked. On the other hand, one benefit of the informal approach is that it applies to a wide range of problems that may not be easy to formalize.
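To illustrate the contrast, the short Lean 4 sketch below shows what a formally checked statement looks like: the proof assistant's kernel verifies every step, whereas Minerva's LaTeX output is free-form text carrying no such guarantee. The specific theorems are simple illustrative examples, not drawn from the paper.

```lean
-- A formally checked statement: the Lean kernel verifies the proof, so there is
-- no ambiguity about whether the reasoning is correct.
theorem two_plus_two : 2 + 2 = 4 := rfl

-- A slightly less trivial statement, proved by a lemma from Lean's core library.
theorem add_comm_example (a b : Nat) : a + b = b + a := Nat.add_comm a b
```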

Conclusion 

Machine learning models have become powerful tools in many areas of science, but they are often built to handle only a narrow set of tasks. The researchers hope that a general model capable of quantitative reasoning will help move science and education forward. Such models have numerous applications: for example, they can assist researchers and give students new ways to learn. The researchers describe Minerva as a small step in this direction.
