Researchers demonstrate a new step forward in autonomous drone navigation: liquid neural networks that remain reliable in out-of-distribution scenarios, letting drones explore environments they have never seen.
A new generation of aviators is ascending to skies where birds once reigned supreme. These airborne pioneers are not living creatures but products of deliberate innovation: drones. And these are not your average flying robots buzzing around like mechanical bees; they are avian-inspired machines, guided by liquid neural networks, that glide across the sky.
Autonomous robots can learn visual navigation tasks from offline human demonstrations and generalise well to online, previously untrained scenarios within the same environment. It is far harder, however, for these agents to take the next step and generalise robustly to entirely new environments whose topography differs drastically from anything they have encountered.
The researchers present a method for building robust flight navigation agents that complete vision-based fly-to-target tasks outside their training environment, even under significant distribution shifts. To this end, they developed an imitation-learning framework built on liquid neural networks, a class of brain-inspired, continuous-time neural models that are causal and adaptable to changing conditions. They observed that liquid agents could distil the assigned task from raw visual inputs and discard irrelevant features, so their navigational abilities transferred to new environments. Experiments further revealed that this level of robust decision-making is unique to liquid networks, in both their differential-equation and closed-form representations, when compared with other advanced deep agents.
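For readers curious what "continuous-time" means in practice, here is a minimal sketch of a liquid time-constant (LTC) neuron layer, the model family behind liquid neural networks, integrated with a simple explicit-Euler solver. The layer sizes, constants, and solver step are illustrative assumptions, not the authors' code.

```python
import numpy as np

# Sketch of a liquid time-constant (LTC) layer. State dynamics:
#   dx/dt = -x / tau + f(x, u; theta) * (A - x)
# where f is a small learned network whose output also modulates the
# effective time constant, letting the cell adapt to its inputs.

class LTCCell:
    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (hidden_size, input_size))
        self.W_rec = rng.normal(0, 0.1, (hidden_size, hidden_size))
        self.bias = np.zeros(hidden_size)
        self.tau = np.ones(hidden_size)           # base time constants
        self.A = rng.normal(0, 1.0, hidden_size)  # per-neuron state targets

    def f(self, x, u):
        # Learned nonlinearity driving both the state and its time constant.
        return np.tanh(self.W_rec @ x + self.W_in @ u + self.bias)

    def step(self, x, u, dt=0.05):
        # One explicit-Euler step of the ODE; practical implementations
        # use fused or implicit solvers (or the closed-form variant).
        gate = self.f(x, u)
        dxdt = -x / self.tau + gate * (self.A - x)
        return x + dt * dxdt

# Roll the cell over a dummy input sequence.
cell = LTCCell(input_size=8, hidden_size=16)
rng = np.random.default_rng(1)
x = np.zeros(16)
for t in range(100):
    u = rng.normal(size=8)
    x = cell.step(x, u)
print(x.shape)  # (16,)
```

Because the time constant itself depends on the input, the cell's dynamics keep adjusting after training, which is the source of the adaptability described above; the "closed-form" representation mentioned in the study replaces the numerical ODE solver with an analytic approximation of these dynamics.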
The ability of liquid neural networks to adapt continuously to new data let them make accurate judgements in previously unexplored domains such as forests, urban landscapes, and settings with added noise, rotation, and occlusion. And because these flexible models beat many of their state-of-the-art counterparts in navigation tasks, they open the door to real-world drone applications such as search and rescue, delivery, and wildlife monitoring.
In their most recent study, the researchers address a long-standing issue: how this new generation of agents can adapt to significant distribution shifts. The group's new class of machine-learning algorithms extracts the causal structure of a task from high-dimensional, unstructured data, such as the pixel inputs from a drone-mounted camera. The networks can then isolate the most critical components of the task, in effect comprehending the task at hand, and reject unimportant elements, so that learned navigation skills transfer to new surroundings without difficulty.
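One generic way to check whether a trained policy has really isolated the task-relevant pixels is an input-gradient saliency probe. The tiny model below is a hypothetical stand-in, not the authors' analysis code; the study itself uses attention maps to show that liquid agents concentrate on the target.

```python
import torch
import torch.nn as nn

# Input-gradient saliency: which pixels most affect the output command?
# The model here is an illustrative stand-in for any image-to-command policy.
model = nn.Sequential(
    nn.Conv2d(3, 8, 5, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 4),  # 4 control commands
)

frame = torch.randn(1, 3, 64, 64, requires_grad=True)  # one camera frame
command = model(frame)

# Gradient of the command magnitude with respect to the input pixels.
command.norm().backward()
saliency = frame.grad.abs().max(dim=1).values  # (1, 64, 64) heat map

# High values mark pixels the policy is most sensitive to; for a robust
# agent these should cluster on the target rather than the background.
print(saliency.shape)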
Can machine-learning systems comprehend what they have learned when piloting drones to an unlabelled object? Could they, for example, transfer skills learned in a forest to an urban setting? Deep learning systems typically struggle here: they fail to capture causality, overfit their training data, and adapt poorly to new environments. This is especially problematic for resource-constrained embedded systems, such as aerial drones, which must traverse diverse environments and react to obstacles immediately.
Liquid networks, however, show promise for addressing this fundamental weakness in deep learning systems. The team trained their algorithm on data collected from human pilots, to see whether navigation skills learned from demonstrations would survive abrupt changes in terrain and conditions. Standard neural networks only learn during training, whereas liquid neural networks can adapt to unexpected or chaotic data and are more interpretable.
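The training setup described here is a form of behaviour cloning: a network regresses the pilot's recorded control commands from camera frames. Below is a hedged sketch of such a loop; the layer sizes, and the GRU standing in for the liquid cell, are illustrative assumptions rather than the paper's architecture.

```python
import torch
import torch.nn as nn

# Behaviour cloning from pilot demonstrations stored as (frames, commands).
class Policy(nn.Module):
    def __init__(self, n_commands=4):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.rnn = nn.GRU(32, 64, batch_first=True)  # stand-in for a liquid cell
        self.head = nn.Linear(64, n_commands)

    def forward(self, frames):
        # frames: (batch, time, 3, H, W)
        b, t = frames.shape[:2]
        z = self.encoder(frames.flatten(0, 1)).view(b, t, -1)
        h, _ = self.rnn(z)
        return self.head(h)  # per-step control commands

policy = Policy()
opt = torch.optim.Adam(policy.parameters(), lr=1e-4)

# Dummy batch standing in for recorded pilot data.
frames = torch.randn(2, 10, 3, 64, 64)  # camera sequences
commands = torch.randn(2, 10, 4)        # pilot's stick inputs

for epoch in range(3):
    pred = policy(frames)
    loss = nn.functional.mse_loss(pred, commands)  # imitate the pilot
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The robustness findings above amount to the claim that swapping the recurrent core of such a policy for a liquid network, rather than a conventional one, is what lets the cloned behaviour survive drastic scenery changes.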
In quadrotor closed-loop control experiments, the drones underwent range tests, stress tests, target rotation and occlusion, hiking with adversaries, triangular loops between objects, and dynamic target tracking. They followed moving targets and executed multi-step loops between objects in environments they had never seen, outperforming other state-of-the-art systems.
The team believes that the ability to learn from limited expert data and to understand a given task while generalising to new environments could make the deployment of autonomous drones more efficient, cost-effective, and reliable. Liquid neural networks, they add, could enable autonomous air mobility drones for environmental monitoring, package delivery, autonomous vehicles, and robotic assistants.
Image source: Unsplash