For people who would rather be driven than operate a vehicle, could humanoid robots be a viable substitute for autonomous cars? While it may sound like a pipe dream now, a group of Japanese scholars is investigating whether the idea could become a reality. The researchers have released a technical paper that details their goals and highlights their accomplishments so far.
The study, “Towards Autonomous Driving by Musculoskeletal Humanoids: A Study of Developed Hardware and Learning-Based Software,” was published by a team of University of Tokyo researchers, one of whom has collaborated with Toyota on autonomous-driving projects.
The researchers’ tool was Musashi, a musculoskeletal humanoid with redundant sensors and a flexible structure that resembles the human body. Musashi was tasked with driving an electric microcar, a 2012 Toyota COMS (Chotto Odekake Machimade Suisui), on the road, as demonstrated in a video released alongside the research. For the test, the COMS was fitted with the essential equipment to run a recognition module, including a computer and a Wi-Fi router; the ultimate goal, however, is to integrate this hardware into the driving robots themselves.
Musashi itself includes several features meant to make driving easier. First, each “eye” of the humanoid houses a movable, high-resolution camera that can tilt and pan to provide different views of the surroundings. In addition, its five-fingered hands are attached to jointed arms that, guided by machine learning and sensor data, can operate the steering wheel.
Musashi can also control the handbrake, turn the ignition key, and use the turn signal, and it can press the brake and accelerator pedals with its feet. Even though the tests conducted at the University of Tokyo’s Kashiwa Campus showed promise, Musashi will not be getting a driver’s license anytime soon.
According to the researchers, the robot could react to traffic lights and brake when it detected a person or the sound of a horn. It was also able to turn the EV at an intersection.
There were limitations, however. Because of its cautious approach, which involved gently releasing the brake pedal rather than pressing the accelerator, the turning motion was extremely slow. The robot also struggled with inclines, so navigating hills proved difficult.
The researchers were not deterred, however, and plan to carry on with their work. According to the paper, “By using the flexibility, variable stiffness structure, and several sensors, we succeeded in the steering wheel operation with both arms and human recognition in the side mirror.”
“We proposed a learning-based system handling the flexible body with difficult modelling and succeeded in the pedal and steering wheel operations with recognition.” They say the next phase is to use their findings to develop more sophisticated hardware and software.
Source: arXiv