Machine learning enables computers to learn patterns from historical data and use those patterns to make predictions about new, unseen situations. In this section, we'll look at three machine learning algorithms: the hyper basis function network, constructing skill trees, and structured kNN.

Hyper basis function network

A Hyper basis function network, also known as a HyperBF network in machine learning, is a generalization of the radial basis function (RBF) network in which a Mahalanobis-like distance is used in place of the Euclidean distance measure. Poggio and Girosi first proposed hyper basis function networks in their 1990 paper "Networks for Approximation and Learning."

HBFNNs resemble radial basis function neural networks (RBFNNs) but are more general. The two types of network share the same structure: neurons are connected between layers in the same way, and in both cases the activation of a hidden-layer neuron depends on how far the input data is from that neuron's centre. The difference lies in how that distance is measured.
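As a rough illustration of that shared structure, the sketch below (plain NumPy with made-up centre and width values, not tied to any particular library) shows a standard Gaussian RBF hidden unit whose activation depends only on the Euclidean distance between the input and the neuron's centre.

```python
import numpy as np

def rbf_activation(x, centre, width=1.0):
    """Gaussian RBF hidden unit: the activation depends only on the
    Euclidean distance between the input x and the neuron's centre."""
    squared_distance = np.sum((x - centre) ** 2)
    return np.exp(-squared_distance / (2.0 * width ** 2))

x = np.array([1.0, 2.0])
centre = np.array([0.5, 1.5])
print(rbf_activation(x, centre))  # inputs closer to the centre give values nearer 1
```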

In 1990, Poggio and Girosi extended RBFNNs using regularization theory and proposed HBFNNs by substituting a Mahalanobis-like distance for the Euclidean distance. In contrast to RBFNNs, which use the Euclidean norm to measure the distance between the input data and a hidden-layer centre, HBFNNs use a weighted norm instead. Training HyperBF networks can be computationally demanding, and their large number of degrees of freedom can lead to overfitting and poor generalization. Their key advantage is that they can learn complex functions with only a small number of neurons.
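The HyperBF variant changes only the distance inside that activation. Here is a minimal sketch, assuming a Gaussian-style unit and an illustrative weight matrix W that would normally be learned along with the centres: the squared Euclidean distance is replaced by the weighted norm ||W(x - c)||², which lets the unit stretch, shrink, and rotate its receptive field.

```python
import numpy as np

def hyperbf_activation(x, centre, W):
    """HyperBF-style hidden unit: the Euclidean distance of the RBF unit
    is replaced by a weighted (Mahalanobis-like) norm defined by W."""
    weighted_diff = W @ (x - centre)
    squared_distance = weighted_diff @ weighted_diff   # ||W(x - c)||^2
    return np.exp(-squared_distance)

x = np.array([1.0, 2.0])
centre = np.array([0.5, 1.5])
W = np.array([[2.0, 0.0],    # illustrative weights: the unit is more
              [0.0, 0.5]])   # sensitive along the first input dimension
print(hyperbf_activation(x, centre, W))
```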

Constructing skill trees

Constructing Skill Trees (CST) is a hierarchical reinforcement learning algorithm that can build skill trees from a set of sample solution trajectories obtained through demonstration. George Konidaris, Scott Kuindersma, Andrew Barto, and Roderic Grupen presented CST in 2010. CST divides each demonstration trajectory into skills using an incremental MAP (maximum a posteriori) change-point detection algorithm.

CST is divided into three parts: 

  • change point detection, 
  • alignment, and 
  • merging. 

CST performs change-point detection online. The change-point detection algorithm segments the data into skills, using the sum of discounted rewards as the target regression variable, and a particle filter is used to keep CST's computational complexity under control.
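To make that regression target concrete, here is a minimal sketch (illustrative only, not the authors' code) of the return-to-go the change-point detector fits models to: the sum of discounted rewards from each step of a demonstration onward.

```python
import numpy as np

def discounted_returns(rewards, gamma=0.99):
    """Sum of discounted rewards from each step onward; this return-to-go
    is the target regression variable used during change-point detection."""
    returns = np.zeros(len(rewards))
    running = 0.0
    for t in reversed(range(len(rewards))):
        running = rewards[t] + gamma * running
        returns[t] = running
    return returns

# Example: a short demonstration with a step cost of -1 and a final goal reward.
rewards = [-1, -1, -1, -1, 10]
print(discounted_returns(rewards, gamma=0.9))
```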

Compared to skill chaining, CST is a much faster learning algorithm, and it can be used to learn higher-dimensional policies. It can learn even from failed demonstrations, and skills acquired using agent-centric features can be transferred to other problems. CST has been used to learn skills from human demonstration both in the PinBall domain and on a mobile manipulator.

Structured kNN

Structured k-Nearest Neighbors (SkNN) is a machine learning algorithm that generalizes the k-Nearest Neighbors (kNN) classifier. Whereas the kNN classifier only supports binary classification, multiclass classification, and regression, SkNN allows a classifier to be trained for generally structured output labels.

For example, a sample instance could be a sentence in natural language, with the output label being an annotated parse tree. The classifier is trained by showing it pairs of sample instances and their correct output labels. After training, the structured kNN model can predict the corresponding output label for new sample instances; in other words, given a sentence in natural language, the classifier can generate the most likely parse tree.

The idea behind SkNN is to build a graph in which each node represents a class label; an edge connects two nodes if the corresponding classes appear as a consecutive pair of elements in some training sequence. Building this graph from the training sequences is therefore the first step of SkNN training.
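A minimal sketch of that first step, assuming the training data is given as sequences of class labels (the variable names are illustrative): build a directed graph with one node per label and an edge wherever one label follows another in a training sequence.

```python
from collections import defaultdict

def build_label_graph(training_sequences):
    """First step of SkNN training as described above: one node per class
    label, with an edge between two nodes whenever their labels occur as a
    consecutive pair in some training sequence."""
    graph = defaultdict(set)   # label -> set of labels observed to follow it
    for labels in training_sequences:
        for current_label, next_label in zip(labels, labels[1:]):
            graph[current_label].add(next_label)
    return dict(graph)

# Example: part-of-speech label sequences from two short sentences.
sequences = [["DET", "NOUN", "VERB"],
             ["DET", "ADJ", "NOUN", "VERB"]]
print(build_label_graph(sequences))
# e.g. {'DET': {'NOUN', 'ADJ'}, 'NOUN': {'VERB'}, 'ADJ': {'NOUN'}}
```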
