Machine learning enables computers to learn from past data and make predictions about new, unseen data. This section looks at three notable machine learning algorithms: growing self-organizing maps, extremal ensemble learning, and evolutionary multimodal optimization.

Growing self-organizing map

The growing self-organizing map (GSOM), an extension of Kohonen's self-organizing map algorithm, adapts both the positions of the map's weight vectors in the input space and the topology of the grid in the map's output space. This flexibility allows unsupervised, dimension-reducing projections with optimal neighbourhood preservation to be produced even when the effective dimensionality of the input data set is unknown.

The GSOM was devised to address the problem of choosing a suitable map size for the self-organizing map (SOM). It starts with a minimal number of nodes (usually four) and grows heuristically, adding new nodes along its boundary. A parameter called the spread factor (SF) lets the data analyst control how quickly the GSOM grows.

All of the GSOM's starting nodes are boundary nodes, so each node can initially grow in any direction. New nodes grow from boundary nodes: once a boundary node is selected for growth, new nodes are spawned in all of its free neighbouring grid positions. In a rectangular GSOM, there are three distinct ways a boundary node can grow, depending on its position on the boundary. A minimal sketch of the growth loop follows.
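
The sketch below is illustrative, not a faithful implementation: it uses the growth threshold GT = -D * ln(SF) reported in the GSOM literature (D is the input dimensionality), random weight initialization, and a bare-bones learning rule, and it omits the neighbourhood updates and the error-distribution step that a full GSOM applies when the winner is an interior node.

    import numpy as np

    def gsom_sketch(data, spread_factor=0.5, epochs=10, lr=0.1):
        """Toy GSOM growth loop: start with 4 grid nodes, grow at boundaries."""
        dim = data.shape[1]
        gt = -dim * np.log(spread_factor)  # growth threshold GT = -D * ln(SF)
        # Grid position -> weight vector in input space; start with a 2x2 map.
        nodes = {(r, c): np.random.rand(dim) for r in (0, 1) for c in (0, 1)}
        error = {pos: 0.0 for pos in nodes}

        for _ in range(epochs):
            for x in data:
                # Winner: the node whose weight vector is closest to the input.
                bmu = min(nodes, key=lambda p: np.linalg.norm(x - nodes[p]))
                error[bmu] += np.linalg.norm(x - nodes[bmu])  # quantization error
                nodes[bmu] += lr * (x - nodes[bmu])
                if error[bmu] > gt:
                    r, c = bmu
                    # Spawn nodes in every free grid position around the winner.
                    # (Only boundary nodes have free neighbours; a full GSOM
                    # instead redistributes an interior winner's error.)
                    for nb in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                        if nb not in nodes:
                            nodes[nb] = nodes[bmu].copy()  # crude weight init
                            error[nb] = 0.0
                    error[bmu] = 0.0
        return nodes

    grown = gsom_sketch(np.random.rand(200, 3))
    print(len(grown), "nodes after growth")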

Extremal Ensemble Learning

Extremal ensemble learning (EEL) is an algorithmic paradigm for graph partitioning in machine learning. EEL uses the information contained in an ensemble of partitions to discover new and improved partitions. An extremal updating procedure enables the ensemble to evolve and learn how to form better partitions, and the final solution is the consensus of the ensemble's constituent partitions on the optimal partition. The Reduced Network Extremal Ensemble Learning (RenEEL) scheme for dividing a graph is one realization of the EEL paradigm. RenEEL uses the consensus among the many partitions in the ensemble to build a reduced network that can be analyzed quickly to find higher-quality partitions, which are then used to update the ensemble. Such heuristics are needed because finding the graph partition with maximum modularity is an NP-hard problem.

Furthermore, EEL employs iterative extremal updating of an ensemble of network partitions, found by a conventional base algorithm, to search for a node partition that maximizes modularity. At each iteration, core groups of nodes that belong to the same community in every ensemble partition are identified and used to construct a reduced network. The reduced network is then partitioned, and the result is used to update the ensemble. The smaller size of the reduced network is what makes the scheme efficient.
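
The following is a minimal sketch of the core-group reduction step only, assuming the networkx library and using its Louvain implementation as a stand-in for the base partitioning algorithm; the full RenEEL scheme wraps this step in an extremal-updating loop with its own base algorithm.

    import networkx as nx
    from networkx.algorithms import community as nxcom

    def core_groups(partitions):
        """Nodes placed in the same community by *every* partition in the
        ensemble form a core group; each becomes one reduced-network node."""
        nodes = set().union(*partitions[0])
        groups = {}
        for v in nodes:
            # Signature: index of v's community in each ensemble partition.
            sig = tuple(next(i for i, c in enumerate(p) if v in c)
                        for p in partitions)
            groups.setdefault(sig, set()).add(v)
        return list(groups.values())

    def reduce_network(G, groups):
        """Collapse each core group into a single node; edge multiplicities
        become weights, and intra-group edges become self-loops."""
        idx = {v: i for i, grp in enumerate(groups) for v in grp}
        R = nx.Graph()
        R.add_nodes_from(range(len(groups)))
        for u, v in G.edges():
            a, b = idx[u], idx[v]
            w = R[a][b]["weight"] + 1 if R.has_edge(a, b) else 1
            R.add_edge(a, b, weight=w)
        return R

    G = nx.karate_club_graph()
    # Ensemble of partitions from a conventional (randomized) base algorithm.
    ensemble = [nxcom.louvain_communities(G, seed=s) for s in range(5)]
    R = reduce_network(G, core_groups(ensemble))
    print(G.number_of_nodes(), "nodes reduced to", R.number_of_nodes())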

Evolutionary multimodal optimization

Multimodal optimization is a branch of applied mathematics concerned with finding all or most of the multiple (at least locally optimal) solutions to a problem, rather than a single best solution. Evolutionary multimodal optimization applies evolutionary computation, a field closely related to machine learning, to this task. Traditional optimization techniques would require multiple restart points and multiple runs, with no guarantee that each run finds a different solution. Evolutionary algorithms (EAs) have an advantage here because they are population-based: they maintain a population of candidate solutions that is processed each generation. If the algorithm can maintain multiple distinct solutions across these generations, it finishes with several good solutions rather than only the single best one.
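
Below is a minimal sketch of one classic technique for maintaining multiple solutions, fitness sharing, which divides an individual's raw fitness by a crowding count so that several peaks can hold subpopulations at once. The test function and all parameter values here are illustrative choices, not part of any particular published method; crowding, clearing, and speciation are alternative niching techniques serving the same purpose.

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x):
        # Toy multimodal objective: five equally high peaks on [0, 1].
        return np.sin(5 * np.pi * x) ** 6

    def shared_fitness(pop, sigma=0.1):
        """Fitness sharing: divide raw fitness by a niche count so that
        crowded regions are penalized and several peaks survive."""
        d = np.abs(pop[:, None] - pop[None, :])
        niche = np.sum(np.where(d < sigma, 1.0 - d / sigma, 0.0), axis=1)
        return f(pop) / niche

    pop = rng.random(100)
    for _ in range(200):
        fit = shared_fitness(pop)
        # Binary tournament selection on the shared fitness.
        i, j = rng.integers(0, len(pop), size=(2, len(pop)))
        parents = np.where(fit[i] > fit[j], pop[i], pop[j])
        # Gaussian mutation, clipped to the search domain.
        pop = np.clip(parents + rng.normal(0.0, 0.01, len(pop)), 0.0, 1.0)

    # The final population clusters around several optima, not just one.
    print(np.round(np.sort(pop), 2))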

Recently, an evolutionary multiobjective optimization (EMO) approach to this problem was proposed. This method adds a suitable second objective to the original single-objective multimodal optimization problem so that the multiple solutions form a weak Pareto-optimal front, allowing an EMO algorithm to solve a multimodal optimization problem that has more than one answer. The same authors later made their algorithm self-adaptive, so its parameters no longer need to be tuned by hand.
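
As a worked illustration of the idea (not the authors' exact algorithm), one published choice of second objective is the gradient magnitude: every local optimum has a zero gradient, so all optima become weakly Pareto-optimal in the objective pair (f, |f'|). The sketch below checks that bi-objective trade-off on a grid for a hypothetical one-dimensional test function; in practice the front would be approximated by an EMO algorithm such as NSGA-II rather than by grid enumeration.

    import numpy as np

    def f(x):
        # Original objective to minimize: five equally deep minima on [0, 1].
        return -np.sin(5 * np.pi * x) ** 6

    def grad_mag(x, h=1e-6):
        # Second objective: numerical gradient magnitude, which is zero at
        # every stationary point of f.
        return abs((f(x + h) - f(x - h)) / (2.0 * h))

    def dominated(a, b):
        # True if point a is dominated by point b (minimizing both objectives).
        return (f(b) <= f(a) and grad_mag(b) <= grad_mag(a)
                and (f(b) < f(a) or grad_mag(b) < grad_mag(a)))

    xs = np.linspace(0.0, 1.0, 201)
    front = [a for a in xs if not any(dominated(a, b) for b in xs)]
    # The surviving points form a weak Pareto front that contains
    # (at least) all five minima of f.
    print(np.round(front, 2))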

