Johns Hopkins University researchers have developed a machine learning model that helps computers understand human conversation.

Humans express themselves in many different ways. In recent years, much progress has been made toward teaching machine learning systems even the essentials of how people communicate. Dialogue research has focused mainly on exchanges between people and machines, yet conversations between two people remain the most significant barrier to understanding spoken language.

The researchers used a machine learning model to identify the function of each piece of speech in dialogue transcripts produced by language understanding (LU) systems. Their work could one day help computers "read" or "understand" spoken or written text in the same way people do.

What can an ML model do?

The new model can figure out what the words in a final transcript are doing and classify them as a "Statement," "Question," or "Interruption," a task called "dialogue act recognition." With this, the model could help us understand how people talk. When confronted with vast amounts of unstructured text, LU systems struggle to classify characteristics such as a text's topic, emotion, and purpose. The new technique relieves LU systems of that burden: rather than relying on a set of stock phrases, the systems can pass along a precise label, such as a question or an interruption.
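
To make "dialogue act recognition" concrete, here is a minimal sketch of such a classifier, not the researchers' model: scikit-learn's TfidfVectorizer and LogisticRegression trained on a handful of invented, hand-labelled utterances.

```python
# A toy dialogue-act classifier (illustrative only): TF-IDF turns each
# utterance into a sparse word-frequency vector, and logistic regression
# learns a decision boundary per act. The data below is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

utterances = [
    "What time does the meeting start?",
    "The meeting starts at noon.",
    "Could you repeat that?",
    "I already sent you the report.",
    "Wait, hold on --",
    "Sorry, before you go on --",
]
acts = ["Question", "Statement", "Question", "Statement",
        "Interruption", "Interruption"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(utterances, acts)

print(model.predict(["What time does it start?"]))  # likely ['Question']
```

A real system would replace the toy corpus with thousands of annotated utterances and a far stronger text encoder, but the input-output contract is the same: utterance in, act label out.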

The researchers also examined how other characteristics, such as punctuation, affect the models' effectiveness. Punctuation, they found, provides crucial hints to the models that are not otherwise apparent in the text.
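
The toy ablation below illustrates why that can happen; the data is invented, not the study's corpus. The same bag-of-words classifier is trained twice, once with "?" and "." kept as tokens and once with punctuation stripped, so that paired utterances such as "You agree?" / "You agree." become indistinguishable in the second run.

```python
# Punctuation ablation sketch (illustrative data): compare test accuracy
# with sentence-final marks kept as tokens vs. stripped away.
import string
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_texts = ["Is that final?", "That is final.", "You agree?", "You agree."]
train_acts = ["Question", "Statement", "Question", "Statement"]
test_texts = ["That works?", "That works."]
test_acts = ["Question", "Statement"]

def strip_punct(text):
    return text.translate(str.maketrans("", "", string.punctuation))

def accuracy(keep_punct):
    if keep_punct:
        # Tokenize words and '?', '.', '!' separately so the marks
        # become features of their own.
        vec = TfidfVectorizer(token_pattern=r"\w+|[?.!]")
        fit_texts, eval_texts = train_texts, test_texts
    else:
        vec = TfidfVectorizer()
        fit_texts = [strip_punct(t) for t in train_texts]
        eval_texts = [strip_punct(t) for t in test_texts]
    model = make_pipeline(vec, LogisticRegression())
    model.fit(fit_texts, train_acts)
    preds = model.predict(eval_texts)
    return sum(p == a for p, a in zip(preds, test_acts)) / len(test_acts)

print("with punctuation:   ", accuracy(True))   # '?' vs '.' separates the acts
print("without punctuation:", accuracy(False))  # paired texts collapse; at most 0.5
```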

Existing Approaches

Most natural language processing (NLP) algorithms perform well when text is clearly structured, as when a person speaks in complete sentences. In everyday life, however, people rarely talk so formally, making it difficult for computers to discern where one utterance ends and the next begins.
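
As a toy illustration of that boundary problem (invented text and a naive splitter, not an approach from the study), a punctuation-based sentence splitter works on written prose but finds nothing to split in unpunctuated, disfluent speech:

```python
# Naive sentence splitter: break on sentence-final punctuation followed
# by whitespace. Fine for edited text, useless for raw speech transcripts.
import re

def naive_sentences(text):
    return [s for s in re.split(r"(?<=[.?!])\s+", text) if s]

written = "The model was trained on transcripts. It labels each utterance."
spoken = "yeah so the model it was trained on transcripts right and then it labels stuff"

print(naive_sentences(written))  # two clean sentences
print(naive_sentences(spoken))   # one undivided span, no boundaries to find
```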

The research team needed to confirm that the new technology could comprehend everyday speech. Dialogue acts are crucial for interpreting the structure of a dialogue: they are atomic units of communication, more precise in their function than whole utterances. This dialogue-act notion can aid a variety of tasks, including summarization, intent detection, and keyword finding, as the sketch below illustrates.
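
One hypothetical way to represent a transcript as dialogue acts (the record fields and labels here are illustrative, not the researchers' schema):

```python
# Representing a transcript as dialogue-act records so downstream tasks
# can operate on act labels instead of raw text.
from dataclasses import dataclass

@dataclass
class DialogueAct:
    speaker: str
    text: str
    act: str  # e.g. "Statement", "Question", "Interruption"

transcript = [
    DialogueAct("Agent", "How can I help you today?", "Question"),
    DialogueAct("Caller", "My order never arrived.", "Statement"),
    DialogueAct("Agent", "Let me pull up your--", "Statement"),
    DialogueAct("Caller", "It's order 4417.", "Interruption"),
]

# Intent detection becomes a filter over act labels.
questions = [d.text for d in transcript if d.act == "Question"]
print(questions)  # ['How can I help you today?']
```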

Conclusion

Numerous firms use speech analytics to gain insight from customer interactions with service agents. Such analytics typically rely on automatic transcription of conversations, and keyword searches over those transcripts provide only limited information; according to the researchers, their model can help such organizations extract more. In the future, doctors may also adopt the technique, recovering time now lost to note-taking while seeing patients: the method could automatically process a conversation's transcript, complete paperwork, and generate notes.

Sources of Article

Reference article: https://techxplore.com/news/2021-12-closer-human-conversation.html

 

