AIST: Self-driving with just a smartphone camera!

Source: Press release from the National Institute of Advanced Industrial Science and Technology

– Applications in smartphone position sensing, VR, first-person video analysis, and autonomous driving of personal mobility –

Report from Autonomous Driving Lab/Analysis Article

National Institute of Advanced Industrial Science and Technology (AIST):

Developed the position and orientation estimation system “L-C*”, which uses a camera and an inertial measurement unit (IMU). Announced on May 30, 2023.

Camera and IMU only:

L-C* uses only general-purpose sensors of the kind built into smartphones. In the future, AIST plans to apply it to autonomous driving of personal mobility vehicles.

Leveraging the Visual Positioning System:

The system uses a Visual Positioning System (VPS) for the self-localization of autonomous vehicles. Conventional self-position estimation relies on dedicated maps.

Position and attitude estimation system “L-C*”:

AIST has now developed the position and orientation estimation system “L-C*”. The camera's position and orientation are obtained by matching a 3D map against the camera image.

Position and orientation can be estimated stably even when lighting, weather, or the surrounding scenery changes.
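The press release does not describe how L-C* computes the pose once map and image are matched. As one illustrative sketch (not AIST's actual method), a camera's position can be recovered from 2D–3D correspondences with the textbook Direct Linear Transform, shown here in NumPy with hypothetical names:

```python
import numpy as np

def dlt_pose(points_3d, points_2d):
    """Estimate a 3x4 projection matrix P from >= 6 noise-free 2D-3D
    correspondences via the Direct Linear Transform, then recover the
    camera centre as the null space of P."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        # Each correspondence gives two linear constraints on the 12 entries of P.
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    P = vt[-1].reshape(3, 4)          # right singular vector of the smallest singular value
    # The camera centre C satisfies P @ [C; 1] = 0, i.e. it spans the null space of P.
    _, _, vt_p = np.linalg.svd(P)
    C = vt_p[-1]
    return P, C[:3] / C[3]            # dehomogenize
```

Real VPS pipelines add robust outlier rejection (e.g. RANSAC) and nonlinear refinement on top of such a linear estimate; this sketch only shows the geometric core.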

Faster matching of maps and images:

Matching maps against images conventionally requires a huge amount of computation. The new system therefore introduces a mechanism that uses the IMU to interpolate motion between matchings.

This reduces the matching frequency to 1/30, realizing a VPS that runs stably even on an inexpensive PC.
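AIST has not published the internals of L-C*, but the idea above can be sketched as follows (all names and the data layout are hypothetical): run the expensive map-image matching only every 30th frame, and dead-reckon from IMU accelerations for the frames in between.

```python
import numpy as np

MATCH_INTERVAL = 30  # run the expensive map-image matching once every 30 frames

def vps_match(frame):
    """Placeholder for the expensive 3D-map / camera-image matching step.

    A real VPS would search the map here; this stand-in just reads the
    frame's known position so the surrounding logic can be demonstrated."""
    return np.asarray(frame["map_pos"], dtype=float)

def imu_step(pos, vel, accel, dt):
    """Dead-reckoning: integrate IMU acceleration into velocity, then position."""
    vel = vel + np.asarray(accel, dtype=float) * dt
    pos = pos + vel * dt
    return pos, vel

def track(frames, dt=1.0 / 30.0):
    """Return one position estimate per frame, matching only every
    MATCH_INTERVAL-th frame and interpolating with the IMU in between."""
    pos = vps_match(frames[0])
    vel = np.zeros(3)
    estimates = [pos.copy()]
    for i, frame in enumerate(frames[1:], start=1):
        if i % MATCH_INTERVAL == 0:
            pos = vps_match(frame)  # periodic absolute fix from the map
        else:
            pos, vel = imu_step(pos, vel, frame["accel"], dt)
        estimates.append(pos.copy())
    return estimates
```

The matcher is called for only 1 frame in 30, which is the source of the 1/30 reduction in computational load; the IMU integration in between is cheap enough for an inexpensive PC.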

AIST received Level 4:

On May 12, AIST obtained permission for Level 4 “specified automated driving” under the Road Traffic Act.

As early as June 2021, AIST had also been selected for the advisory board on Level 4 autonomous-driving advanced mobility services.

https://jidounten-lab.com/u_41486

Position measurement system with camera and inertial measurement unit (IMU)

AIST: Estimate position and orientation with high precision using simple sensors

Points:

Developed a position measurement system using a camera and an inertial measurement unit (IMU)

A 3D map is matched against the camera image to measure the camera's own position with high precision.

Combined with an IMU, the computational load is reduced to 1/30.

Applications include smartphone position sensing, VR, first-person video analysis, and autonomous driving of personal mobility vehicles.

https://www.aist.go.jp/aist_j/press_release/pr2023/pr20230529/pr20230529.html