Centuries ago, in 1770 to be precise, the Hungarian inventor Wolfgang von Kempelen built the Mechanical Turk to impress the then Empress of Austria, Maria Theresa. The Turk was a machine that appeared to play chess against a human opponent. It astonished numerous personalities, including world leaders and the famous writer Edgar Allan Poe. By the middle of the 1800s, however, it was finally revealed that the Turk was not a machine that played like a human but a hoax.
AI is often defined as the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. Today, in this pandemic-consumed year, AI research is getting ever closer to reproducing many human traits in machines. Here are five recent pieces of AI research that are bringing machines a step closer to becoming human-like.
Recent research published by Johns Hopkins University details how neurons in area V4, the first stage specific to the brain's object vision pathway, represent 3D shape fragments, not just the 2D shapes used to study V4 for the last 40 years. The Johns Hopkins researchers then identified nearly identical responses in artificial neurons in an early stage (layer 3) of AlexNet, an advanced computer vision network. In both natural and artificial vision, early detection of 3D shape presumably aids interpretation of solid, 3D objects in the real world.
One of the long-standing challenges for artificial intelligence has been to replicate human vision. Deep (multilayer) networks like AlexNet have achieved major gains in object recognition, based on high-capacity graphics processing units (GPUs) developed for gaming and on massive training sets fed by the explosion of images and videos on the Internet, details a news release from the university.
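As a rough illustration of the kind of comparison involved, the sketch below uses PyTorch to pull activations from an early convolutional layer of a pretrained AlexNet, the sort of intermediate responses that can be set against recordings from V4 neurons. The choice of layer, the preprocessing, and the file name are assumptions made for illustration, not the researchers' actual analysis code.

```python
# Hypothetical sketch: extracting early-layer activations from a pretrained AlexNet.
# The layer index and preprocessing are illustrative assumptions, not the paper's code.
import torch
from torchvision import models, transforms
from PIL import Image

# Load a pretrained AlexNet from torchvision's model zoo
alexnet = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
alexnet.eval()

# Standard ImageNet preprocessing
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

activations = {}

def hook(module, inputs, output):
    # Store the feature maps produced by the hooked layer
    activations["early_conv"] = output.detach()

# Register a forward hook on an early convolutional layer
# (features[6] is AlexNet's third conv layer; treating it as the "early stage" is an assumption)
alexnet.features[6].register_forward_hook(hook)

img = preprocess(Image.open("object.jpg").convert("RGB")).unsqueeze(0)
with torch.no_grad():
    alexnet(img)

print(activations["early_conv"].shape)  # e.g. torch.Size([1, 384, 13, 13])
```

Responses of individual channels in such a feature map could then be compared, unit by unit, with the responses of recorded neurons to the same images.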
Read more here
Researchers and scientists at Nanyang Technological University, Singapore (NTU Singapore) have developed an AI-based 'brain' for robots that gives them the ability to sense pain and 'heal' themselves when damaged.
The 'brain' is a system of sensory nodes attached to an AI algorithm; these nodes 'react to pain' and process it by calculating the force of an impact or pressure. This capability enables the robot to work out exactly where it has sustained damage and, if the damage is slight, to begin repairing itself without human intervention.
For healing, the robots carry a 'self-healing gel', an ion-gel material that enables the robot to restore its mechanical functions without human intervention if it is 'injured' by a sharp cut.
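As a toy illustration of the idea described above, the sketch below shows how readings from distributed sensor nodes might be mapped to a 'pain' level and a decision to self-heal or call for help. The node structure, thresholds, and healing routine are hypothetical assumptions, not NTU's actual system.

```python
# Minimal toy sketch (not NTU's implementation): mapping sensor-node readings
# to a pain level and a healing decision. All names and thresholds are assumptions.
from dataclasses import dataclass

MINOR_DAMAGE_THRESHOLD = 5.0    # pressure (arbitrary units) above which damage is assumed
SEVERE_DAMAGE_THRESHOLD = 20.0  # beyond this, self-repair alone is not enough

@dataclass
class SensorNode:
    location: str       # where on the robot body the node sits
    pressure: float     # latest measured impact force or pressure

def assess_and_heal(nodes: list[SensorNode]) -> None:
    """Estimate 'pain' at each node and trigger self-repair for minor damage."""
    for node in nodes:
        if node.pressure < MINOR_DAMAGE_THRESHOLD:
            continue  # no damage sensed at this node
        elif node.pressure < SEVERE_DAMAGE_THRESHOLD:
            print(f"Minor damage at {node.location}: activating self-healing gel")
        else:
            print(f"Severe damage at {node.location}: requesting human assistance")

# Example: a light bump on the arm and a sharp cut on the gripper
assess_and_heal([
    SensorNode("left arm", pressure=7.5),
    SensorNode("gripper", pressure=25.0),
])
```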
Read more here
New research has introduced the physical counterpart of digital AI, namely physical artificial intelligence (PAI), which is the theory and practice of synthesising nature-like intelligent robotic systems.
The research, titled "Skills for Physical Artificial Intelligence" by Aslan Miriyev and Mirko Kovac, has been published in Nature Machine Intelligence. According to the authors, five disciplines will play a key role in developing PAI skills: materials science, mechanical engineering, computer science, biology and chemistry.
Read more here
Autonomous functions for robots, such as spontaneity, are highly sought after. Many control mechanisms for autonomous robots are inspired by the functions of animals, including humans. However, roboticists often design robot behaviours using predefined modules and control methodologies, which makes them task-specific and limits their flexibility. Researchers now offer an alternative machine learning-based method for designing spontaneous behaviours by capitalizing on complex temporal patterns, like the neural activity of animal brains, as sketched below. They hope to see their design implemented in robotic platforms to improve their autonomous capabilities.
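As a loose illustration of how complex temporal patterns can arise from a simple recurrent network, the sketch below simulates a small random 'reservoir' of artificial neurons whose ongoing activity could, in principle, be read out to drive robot behaviour. The network size, update rule, and scaling are assumptions made for illustration; this is not the authors' model.

```python
# Illustrative sketch only: a tiny random recurrent network producing rich,
# ongoing temporal patterns of the kind such methods exploit.
import numpy as np

rng = np.random.default_rng(0)

N = 200                                        # number of reservoir neurons
W = rng.normal(0, 1.0 / np.sqrt(N), (N, N))    # random recurrent weights
# Scale the spectral radius above 1 so activity does not die out
W *= 1.5 / np.max(np.abs(np.linalg.eigvals(W)))

x = rng.normal(size=N)                         # initial neural state
trajectory = []

for t in range(1000):
    # Leaky tanh update: each step mixes the previous state with recurrent input
    x = 0.9 * x + 0.1 * np.tanh(W @ x)
    trajectory.append(x.copy())

trajectory = np.array(trajectory)              # shape (1000, 200): ongoing activity
# In a robotics setting, a trained readout would map these patterns to motor commands.
print(trajectory.shape)
```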
Read more here