Machine learning enables computers to learn from historical data and generalize to new inputs. This section examines several notable machine learning algorithms: AdaGrad, the CN2 algorithm, and FastICA.

Adaptive Gradient Algorithm (Adagrad)

AdaGrad is a family of sub-gradient methods for stochastic optimization. The algorithms in this family resemble second-order stochastic gradient descent, but with an approximation to the Hessian of the objective function. Intuitively, AdaGrad adapts the learning rate of each feature according to the estimated geometry of the problem; in particular, features that appear infrequently receive larger learning rates.

Duchi et al. introduced AdaGrad in a 2011 paper in the Journal of Machine Learning Research. It remains one of the most popular optimization algorithms in machine learning, especially for training deep neural networks, and it influenced the later Adam algorithm.

The goal of AdaGrad is to minimize the expected value of a stochastic objective function, given a sequence of realizations of the function and a set of parameters. Like other sub-gradient methods, it updates the parameters in the direction opposite to the sub-gradients. Standard sub-gradient methods, however, use update rules whose step sizes ignore information from past observations. AdaGrad instead uses the sequence of gradient estimates to adapt the learning rate of each parameter individually.
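The following minimal NumPy sketch illustrates this per-parameter adaptation: squared gradients are accumulated over time, and each parameter's step size shrinks in proportion to how much gradient signal it has already received. The function names, the toy objective, and the hyperparameter values are our own illustration, not taken from the original paper.

```python
import numpy as np

def adagrad_update(params, grads, grad_sq_sum, lr=0.01, eps=1e-8):
    """One AdaGrad step: each parameter's learning rate shrinks as its
    squared gradients accumulate, so rarely-updated (sparse) parameters
    keep relatively large step sizes."""
    grad_sq_sum += grads ** 2                        # accumulate squared gradients
    adjusted_lr = lr / (np.sqrt(grad_sq_sum) + eps)  # per-parameter learning rate
    params -= adjusted_lr * grads                    # move against the (sub-)gradient
    return params, grad_sq_sum

# Toy usage: minimize f(w) = ||w||^2, whose gradient is 2w.
w = np.array([1.0, -2.0, 3.0])
acc = np.zeros_like(w)
for _ in range(200):
    g = 2 * w
    w, acc = adagrad_update(w, g, acc, lr=0.5)
print(w)  # approaches the minimizer at the origin
```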

CN2 algorithm

Rule induction is a branch of machine learning concerned with extracting formal rules from a data set. CN2 is a classification algorithm designed to induce simple, comprehensible rules of the form "if condition, then predict class", even in noisy domains, so it works even when the training data are imperfect. It combines rule-generation ideas from the AQ algorithm with the noise-handling techniques of decision-tree learning: it produces a rule set similar to AQ's while tolerating noisy data in the way ID3 does.

CN2 belongs to the family of sequential covering algorithms (a sketch of the covering loop follows the list below):

  • CN2 learns rules that can be applied in any training scenario.
  • CN2 can produce either an ordered rule list or an unordered rule set.
  • Because it accepts rules that are only approximately accurate, it can tolerate noise in the data.
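A sequential covering algorithm learns one rule at a time, removes the examples that rule covers, and repeats until no sufficiently good rule remains. The sketch below is a hypothetical outline of that loop, not the actual CN2 implementation: the helper find_best_rule stands in for CN2's beam search over conditions, and the rule object with quality and covers attributes is assumed for illustration.

```python
def sequential_covering(examples, find_best_rule, min_quality=0.7):
    """Generic sequential covering loop (in the spirit of CN2).

    examples       : list of training examples
    find_best_rule : callable that searches for the single best rule on the
                     remaining examples (placeholder for CN2's beam search)
    min_quality    : quality threshold below which rule induction stops,
                     which is what gives the approach its noise tolerance
    """
    rules = []
    remaining = list(examples)
    while remaining:
        rule = find_best_rule(remaining)           # search for the best single rule
        if rule is None or rule.quality < min_quality:
            break                                  # no acceptable rule left: stop
        rules.append(rule)
        remaining = [e for e in remaining if not rule.covers(e)]  # drop covered examples
    return rules
```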

FastICA

FastICA is an efficient algorithm for independent component analysis. Independent component analysis (ICA), formalized by Pierre Comon, decomposes an observed random vector into statistically independent components; according to Erkki Oja, FastICA is derived from the instantaneous noise-free ICA model.

Vicente Zarzoso notes that FastICA has been benchmarked against adaptive neural-based methods such as those built on principal component analysis (PCA), which are known to outperform most ICA algorithms. The technique is popular not only because it is easy to use but also because it works well in many situations. P. Chevalier, on the other hand, reports that FastICA fails when the sources are weak or exhibit strong spatial correlation.

Vendetta states that FastICA is the most common approach to blind source separation because it is faster and uses less memory than other blind source separation algorithms such as Infomax. Another advantage is that the independent components can be estimated one at a time, which reduces the computational burden. The main limitation is that the method fails when the noise is non-uniform and the noise vectors are correlated.
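As a concrete illustration of blind source separation, the sketch below uses the FastICA implementation from scikit-learn to unmix two synthetic, independent sources that have been combined by a linear mixing matrix. The two-source setup, the mixing matrix, and all parameter values are our own assumptions for the example, not taken from the works cited above.

```python
import numpy as np
from sklearn.decomposition import FastICA

# Synthetic blind source separation: two independent sources mixed linearly.
rng = np.random.default_rng(0)
t = np.linspace(0, 8, 2000)
s1 = np.sin(2 * t)                       # sinusoidal source
s2 = np.sign(np.sin(3 * t))              # square-wave source
S = np.c_[s1, s2] + 0.02 * rng.standard_normal((2000, 2))  # slight noise

A = np.array([[1.0, 0.5],                # mixing matrix (unknown in practice)
              [0.5, 1.0]])
X = S @ A.T                              # observed mixtures

# Recover the independent components; FastICA estimates them by maximizing
# a measure of non-Gaussianity with fixed-point iterations.
ica = FastICA(n_components=2, random_state=0)
S_est = ica.fit_transform(X)             # estimated sources (up to scale and order)
A_est = ica.mixing_                      # estimated mixing matrix
```

Note that ICA can only recover the sources up to permutation and scaling, so the estimated components may appear reordered or rescaled relative to the originals.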
