Google has introduced PaLM 2, a powerful large language model that supports more than 100 languages. The model is capable of mathematical reasoning, coding, creative writing, and analysis. According to Google, PaLM 2 is smaller, faster, and more efficient than its predecessor.

“In a few years artificial intelligence virtual assistants will be as common as the smartphone.” - Dave Waters

PaLM 2 is trained and served on Google's latest TPU infrastructure. According to the search engine behemoth, this gives PaLM 2 a performance and efficiency advantage over its previous state-of-the-art LLMs.

Existing LLMs from Google

The term LLM frequently refers to deep learning models with billions of parameters or more. LLMs are general-purpose models that excel at a variety of tasks rather than being trained for a single job (such as sentiment analysis, named entity recognition, or mathematical reasoning).

Neural language models with sufficient training data and parameter counts can capture a considerable amount of the syntax and semantics of human language. Furthermore, LLMs display a high level of general knowledge about the world and can "memorize" a vast amount of information during training. First, let's look at some of Google's existing LLMs.

Let's examine the capabilities of PaLM 2 and the products it powers.

The most notable feature of this new iteration of PaLM is that it has been intensively trained on text in over 100 languages. As a result, its capacity to comprehend, generate, and translate nuanced text in various languages, including idioms, poetry, and riddles, is enhanced. PaLM 2 also surpasses previous models in logic, mathematics, and common-sense reasoning. According to Google, the model was trained on vast quantities of maths and science literature, including scientific papers and mathematical expressions.

Capabilities

PaLM 2 interprets, generates, and debugs code in over twenty programming languages. It excels at popular languages such as Python and JavaScript, but can also generate code in Prolog, Fortran, and Verilog. PaLM 2 can also provide accompanying documentation in many natural languages, thereby helping programmers around the globe learn to code.

Efficiency

Google has also invested significantly in reducing PaLM 2's size, enhancing its performance, and expanding its capabilities. As a result, the company has developed a family of models of varying sizes and capabilities. For example, Gecko is so small that it can run on mobile devices, even without an internet connection. The Otter, Bison, and Unicorn models (in order of size) are progressively larger and more powerful.

Conclusion

In addition, Google has released Med-PaLM 2, a specialised LLM for medical applications. It can answer medical queries and summarise numerous medical texts, and it achieved expert-level performance on US Medical Licensing Examination-style questions. Google is also building multimodal capabilities to process X-rays and mammograms, with the aim of improving patient outcomes. Later this summer, a limited number of Cloud customers will gain access to Med-PaLM 2 to identify and evaluate safe use cases.

Sec-PaLM, another product based on PaLM 2, is designed for cybersecurity analysis. Available through Google Cloud, this LLM uses artificial intelligence to analyse and explain the behaviour of potentially malicious scripts.
