Together with Milrem Robotics, we aim to
demonstrate the feasibility of off-road navigation using just vision-based sensors, supported by an imprecise satellite navigation system.
Navigation in unstructured off-road environments is a common task in military, agricultural, and forestry applications. It poses multiple challenges for autonomous ground vehicles (AGVs):
Often there is no opportunity to pre-map the area, so navigation decisions need to be made on the go, based solely on sensory information.
Even if maps of the area exist, they can be out of date and fail to reflect reality: there might be fallen trees, ponds formed by rainwater, tall grass, and so on.
Vision-based sensors (cameras) offer the richest sensory information for navigating such environments. For example, it is impossible to decide whether an AGV can drive through a puddle or through tall grass based on LiDAR data alone.
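To make the point concrete, here is a minimal illustrative sketch of how appearance information from a camera can feed traversability decisions. It assumes some semantic segmentation model (not specified in this project description) has already labeled each pixel with a terrain class; the class names and cost values below are purely hypothetical, chosen to show that classes like "tall grass" or "puddle" carry drivability information that raw geometry alone does not.

```python
import numpy as np

# Hypothetical appearance classes that a camera-based semantic
# segmentation model might predict per pixel. The names and the
# cost values are illustrative assumptions, not project specifics.
TRAVERSAL_COST = {
    "dirt_road":  0.0,   # ideal driving surface
    "tall_grass": 0.3,   # looks like an obstacle to LiDAR, usually drivable
    "puddle":     0.5,   # flat to LiDAR, may or may not be drivable
    "bush":       0.8,
    "tree_trunk": 1.0,   # hard obstacle
    "rock":       1.0,
}
CLASS_NAMES = list(TRAVERSAL_COST)


def costmap_from_segmentation(class_ids: np.ndarray) -> np.ndarray:
    """Convert a per-pixel class-ID map (H x W) into a traversability
    cost map in [0, 1], where 0 is freely drivable and 1 is blocked."""
    costs = np.array([TRAVERSAL_COST[name] for name in CLASS_NAMES])
    return costs[class_ids]


if __name__ == "__main__":
    # Fake 4x6 segmentation output standing in for a real model's prediction.
    seg = np.array([
        [0, 0, 1, 1, 2, 2],
        [0, 0, 1, 1, 2, 2],
        [0, 3, 3, 4, 4, 5],
        [0, 3, 3, 4, 4, 5],
    ])
    print(costmap_from_segmentation(seg))
```

The design point is that the cost assignment depends on what the terrain is made of, which is visible in camera imagery but largely invisible to a range sensor: tall grass and a tree trunk can return very similar point-cloud geometry while calling for opposite driving decisions.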