For the next two weeks, Weekly Robotics is being curated by Adam Rodnitzky at Tangram Vision. Tangram Vision makes a perception infrastructure suite for tasks like multimodal sensor calibration and sensor fusion. As such, this issue is a perception special, focusing on some of the newest techniques, sensors, and tools being developed for robotics and autonomy applications. As usual, the publication of the week section is manned by Rodrigo.
Sponsored
Weekly Robotics is being developed thanks to the Patreon supporters and the following business sponsors:
Publication of the Week: The $10 Million ANA Avatar XPRIZE Competition Advanced Immersive Telepresence Systems
Telepresence robots are full of sensors to give the best feedback to their remote operator. This paper presents the telepresence systems from the ANA Avatar XPRIZE competition and describes in depth the evaluation criteria and all the lessons learned. The tasks ranged from using a drill to feeling textures. Among the challenges, network connectivity was the number one issue, but many systems also struggled with situational awareness and a lack of depth perception, which caused collisions. You can check out the winner's best course run in this video.
Coupling Thermal Imaging with Machine Learning to Improve Perception for Autonomy
The traditional sensor array for autonomy has typically been a mix of any of the following: cameras, depth sensors, LiDAR, IMUs, and radar. But recently another modality has been gaining traction as a supplement or replacement for one or more of these: thermal imaging. Once the domain of the military, thermal imaging has advanced considerably over the past few years, as CMOS pixel sizes have gotten smaller and prices have gone down. Researchers from Purdue University and the University of Michigan have now married thermal imaging with machine learning to add more fidelity to this modality in low-light conditions.
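For a sense of what coupling the two modalities can look like in code, below is a minimal sketch, assuming PyTorch, of early fusion: a registered thermal channel is concatenated with RGB before a small CNN, so that in low light the thermal channel can still carry the signal. The architecture, shapes, and class count are illustrative assumptions, not the Purdue/Michigan model.

```python
# Minimal early-fusion sketch (PyTorch assumed): concatenate a registered,
# [0, 1]-scaled thermal channel with RGB before a small CNN. Everything
# here is illustrative, not the actual research architecture.
import torch
import torch.nn as nn

class RgbThermalNet(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, padding=1),  # 4 channels: RGB + thermal
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, rgb: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        # At night the RGB channels carry little signal; the thermal
        # channel still does, and the network can learn to lean on it.
        x = torch.cat([rgb, thermal], dim=1)
        x = self.features(x).flatten(1)
        return self.classifier(x)

net = RgbThermalNet()
logits = net(torch.rand(2, 3, 64, 64), torch.rand(2, 1, 64, 64))
print(logits.shape)  # torch.Size([2, 10])
```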
Using Touch Sensors to Replace Vision for Grasping and Manipulation
In the world of sensors, we often tend to focus on modalities whose output we can see. But what about modalities that measure things we can't see? All manner of encoders do this, and now haptics (touch-sensitive sensors) are finding their way into robotics. Researchers at the University of California at San Diego have developed a new human-hand-like end effector that uses haptic sensing to gently handle fragile items, with no need for supplemental visual sensing.
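To make "grasping on touch alone" concrete, here is a small Python sketch of a force-threshold closing loop, the simplest version of the idea: close until the tactile signal reaches a target squeeze force, with no camera in the loop. The simulated sensor, stiffness, and thresholds are invented for illustration and are not the UCSD group's actual hardware or API.

```python
# Vision-free gentle grasping sketch: close the gripper until a simulated
# tactile array reports the target squeeze force. All constants and the
# sensor model are illustrative assumptions.
import time

GRIP_N = 1.0            # target squeeze force (N) for a fragile object
STEP_M = 0.0005         # closing increment per control cycle (m)
OBJECT_WIDTH_M = 0.05   # used only by the simulated sensor below

def read_tactile(width_m: float) -> float:
    """Simulated taxel array: zero force before contact, then a linear
    spring response as the gripper squeezes past the object width."""
    squeeze = OBJECT_WIDTH_M - width_m
    return max(0.0, squeeze) * 500.0  # 500 N/m effective stiffness

def gentle_grasp(start_width_m: float = 0.08) -> float:
    width = start_width_m
    # Contact itself is the stopping condition -- no visual feedback.
    while read_tactile(width) < GRIP_N:
        width -= STEP_M
        time.sleep(0.001)  # ~1 kHz control loop
    return width

print(f"grasped at width {gentle_grasp() * 1000:.1f} mm")
```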
Robbing Delivery Robots Is Now A Thing
Wait… what does robbing delivery robots have to do with perception? It's a great example of the kind of abuse robots encounter when they leave the lab and enter the real world. The systems that often suffer the most are the sensors, as heavy knocks and other impacts will obliterate calibration and sometimes cause permanent damage that requires complete replacement.
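To put a number on "obliterate calibration", here is a small numpy sketch of what a one-degree knock does to a camera's reprojection error once the stored extrinsics no longer match the physical mount. The intrinsics and scene geometry are illustrative assumptions.

```python
# Reprojection-error sketch (numpy assumed): after an impact rotates the
# camera ~1 degree, points no longer land where the stored calibration
# predicts. All numbers are illustrative.
import numpy as np

K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])  # pinhole intrinsics

def project(points: np.ndarray, R: np.ndarray, t: np.ndarray) -> np.ndarray:
    cam = points @ R.T + t            # world -> camera frame
    pix = cam @ K.T
    return pix[:, :2] / pix[:, 2:3]   # perspective divide

rng = np.random.default_rng(0)
pts = rng.uniform([-1.0, -1.0, 4.0], [1.0, 1.0, 8.0], size=(50, 3))

theta = np.radians(1.0)  # a ~1 degree yaw from a heavy knock
R_knocked = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                      [0.0, 1.0, 0.0],
                      [-np.sin(theta), 0.0, np.cos(theta)]])

err = np.linalg.norm(
    project(pts, R_knocked, np.zeros(3)) - project(pts, np.eye(3), np.zeros(3)),
    axis=1).mean()
print(f"mean reprojection error after impact: {err:.1f} px")  # roughly 10 px
```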
Can Biomimicry Deliver the Next Great Robotic Sensing System?
We may look at insects and believe that they possess only basic levels of intelligence. However, it turns out that they are finely tuned rapid decision-making machines, applying only the minimal amount of computation required to make logical decisions, fast. This kind of neural processing is now being explored in the context of sensors used with robots.
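For a taste of what that minimal, insect-style computation can look like, here is a toy Python sketch of a leaky integrate-and-fire neuron, the basic unit in many insect-inspired spiking models; the constants and stimulus are invented for illustration.

```python
# Toy leaky integrate-and-fire neuron: the "decision" is one comparison
# per sample against a threshold. Constants are illustrative, not drawn
# from any specific insect model.
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Yield True for every timestep on which the neuron spikes."""
    v = 0.0  # membrane potential
    for x in inputs:
        v = v * leak + x    # leaky integration of the stimulus
        if v >= threshold:  # fire...
            v = 0.0         # ...and reset
            yield True
        else:
            yield False

# A rising ("looming") stimulus drives the neuron over threshold;
# the flat tail produces no spikes.
stimulus = [0.0, 0.1, 0.2, 0.4, 0.8, 0.8, 0.0, 0.0]
print(list(lif_neuron(stimulus)))
```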
Business
Helm.AI Raises $55 Million for Perception Simulation and AI
As the drive towards Level 4 autonomy intensifies, more startups are accelerating towards large funding rounds to support this market. Helm.AI is the latest beneficiary, as it takes in a large round to continue developing its AI and simulation platform for ADAS development.
Altos Raises $3.5M to Further Develop 4D Radar Platform
Just as thermal imaging is making inroads to displace LiDAR, so is 4D radar. While there are plenty of companies in this field already, Altos throws down yet another gauntlet with a reasonably large $3.5M for its first round of institutional fundraising.