

Research papers come out far too frequently for anyone to read them all. This is especially true in the field of machine learning, which now affects (and produces documents in) virtually every industry and business. This column aims to bring together some of the most relevant recent findings and papers – particularly in, but not limited to, artificial intelligence – and explain why they’re important.

It takes an emotionally mature AI to admit its own mistakes, and that’s exactly what this project from the Technical University of Munich aims to create. Well, maybe not the emotional maturity itself, but the ability to recognize and learn from mistakes, specifically in self-driving cars. The researchers propose a system in which the car would look back at all the moments in the past when it had to hand control over to a human driver, and thereby learn its own limits – what they call “introspective failure prediction”.

For example, if there are a lot of cars ahead, the autonomous vehicle’s brain could use its sensors and logic to reason de novo about whether a given approach will work, or whether none will. But the TUM team says that by simply comparing new situations to old ones, the car can reach a decision on whether to disengage much faster. Saving six or seven seconds here could make all the difference for a safe handover.
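The summary doesn’t include the team’s actual method, but matching a new situation against a library of past disengagements resembles a nearest-neighbor lookup. Here’s a minimal sketch of that pattern in Python – the feature encoding, the stored values and the distance threshold are illustrative assumptions, not details from the paper:

```python
import numpy as np

# Hypothetical feature vectors summarizing past situations in which the
# car had to hand control back to a human (values are illustrative).
past_disengagements = np.array([
    [0.9, 0.2, 0.8],   # e.g. traffic density, visibility, lane complexity
    [0.7, 0.1, 0.9],
])

def likely_to_fail(current_features, threshold=0.3):
    """Flag the current situation if it closely resembles a past failure.

    A plain nearest-neighbor check: far cheaper than re-planning the
    whole scenario from scratch, which is where the speedup comes from.
    """
    distances = np.linalg.norm(past_disengagements - current_features, axis=1)
    return bool(distances.min() < threshold)

# A new situation that looks like a known failure case should trigger an
# early, graceful handover rather than a last-second one.
print(likely_to_fail(np.array([0.85, 0.15, 0.82])))  # True
```

The appeal of this kind of approach is speed: a distance check against stored failure cases takes a fraction of the time that re-evaluating the whole scenario from scratch would.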

It is important that robots and autonomous vehicles of all types can make decisions without phoning home, especially in combat, where quick, decisive action is required. The Army’s research lab is studying ways in which ground and air vehicles can interact autonomously, enabling, for example, a mobile landing platform that drones can land on without needing to coordinate, ask permission or rely on precise GPS signals.

Their solution, at least for testing purposes, is actually pretty low tech. The ground vehicle has a landing pad on top painted with a huge QR code, which the drone can see from a considerable distance, letting it track the pad’s exact location completely independently. In the future, the QR code could go away entirely and the drone could instead identify the shape of the vehicle itself, presumably using best-guess logic to determine whether it’s the one it wants.
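The article doesn’t say what software the drone runs, but detecting a large fiducial like a QR code is well within reach of off-the-shelf computer vision. A hedged sketch using OpenCV’s built-in QR detector – the frame source and filename are placeholders, and the Army lab’s actual pipeline is not public:

```python
import cv2

def locate_landing_pad(frame):
    """Return the pixel-space corners of a QR-marked pad, or None.

    OpenCV's QRCodeDetector does the heavy lifting here; a real system
    would convert these pixel coordinates into a position relative to
    the drone and feed that to the flight controller.
    """
    detector = cv2.QRCodeDetector()
    data, points, _ = detector.detectAndDecode(frame)
    if points is None:
        return None
    return points.reshape(-1, 2)  # four (x, y) corners of the code

# Illustrative use on a single image; a drone would run this per frame.
frame = cv2.imread("camera_frame.png")  # placeholder filename
if frame is not None:
    corners = locate_landing_pad(frame)
    print("Pad not visible" if corners is None else corners)
```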

Image credits: Nagoya City University

In the medical world, AI is often put to work not on tasks that are especially difficult, but on ones that are tedious for people. A good example of this is tracking the activity of individual cells in microscopy images. It’s not a superhuman task to look at a few hundred images spanning several depths of a petri dish and follow the movements of the cells, but that doesn’t mean grad students enjoy doing it.

This software from researchers at Nagoya City University in Japan does it automatically, using image analysis and the ability (much improved in recent years) to understand objects over a period of time rather than just in individual frames. Read the article here, and check out the extremely cute illustration showing off the technology on the right… more research organizations should hire professional artists.
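This summary doesn’t detail the Nagoya team’s method, but a common baseline for following cells across frames is detect-then-link: find each cell’s centroid in every frame, then match centroids between consecutive frames so that the total movement is minimized. A minimal sketch using SciPy’s assignment solver, with made-up coordinates:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_frames(prev_centroids, next_centroids):
    """Match each cell in one frame to its likeliest position in the next.

    Builds a pairwise-distance cost matrix and solves the assignment
    problem, so the total distance moved by all cells is minimized.
    """
    costs = np.linalg.norm(
        prev_centroids[:, None, :] - next_centroids[None, :, :], axis=2
    )
    prev_idx, next_idx = linear_sum_assignment(costs)
    return list(zip(prev_idx, next_idx))

# Two frames of (x, y) cell centroids -- each cell has drifted slightly.
frame_a = np.array([[10.0, 12.0], [40.0, 41.0]])
frame_b = np.array([[41.0, 40.5], [10.5, 12.2]])
print(link_frames(frame_a, frame_b))  # [(0, 1), (1, 0)]
```

Chaining these per-frame matches over a whole image stack yields the kind of cell trajectories the graduate students would otherwise trace by hand.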

A similar process applies to tracking moles and other skin features on people at risk for melanoma. They may see a dermatologist every year or so to check whether a particular spot looks suspicious, but the rest of the time they have to keep track of their own moles and freckles some other way – which is hard when they’re in places like the back.


