Inside Milrem Robotics with Greg Otsa: autonomy development

In my fourth post, I’ll talk about a software area you don’t usually encounter at most other companies: autonomy.

Hopefully, you have seen (videos of) our unmanned vehicles driving around. There isn’t always an operator guiding the vehicles – some of them can move autonomously.

And the challenges our engineers face are somewhat more complex than those faced by Tesla, Starship, or other similar companies.

Cars drive in a controlled environment with clear patterns – roads, road markings, good visibility, a more or less stable surface, etc. Most paved roads are "copy-paste" if you look 10 metres back or forward.

Counting on GPS to position ourselves on a road, having a fully updated map while driving, and avoiding mainly cars and pedestrians while following certain traffic rules – that is not an option for us.

Our systems operate on harsh terrain – every upcoming 10 metres is different. Trees, holes, sand, streams, rocks, long grass, rocks hidden inside long grass, camouflaged people moving around, etc. – these are all things we need to account for while developing autonomy.

So how does it work? There are three main technical domains in autonomy:

  1. Perception - what's around me? Analysing the data coming from cameras, lidars and other sensors. Is it long grass, grass with a wall inside, or a wall painted like grass?
  2. Localization - where am I on the map? GPS shows (thank you, trees, for blocking the signal) that I should be somewhere in this area. Perception identified some rocks, a hole, and a stream - combining these with the map yields our X and Y coordinates.
  3. Navigation (path planning) - using the data from both perception and localization to figure out how to get from A to B while avoiding obstacles.
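To make the three domains concrete, here is a minimal sketch of the navigation step: an A* search over an occupancy grid. This is an illustrative toy, not Milrem's actual stack – assume perception has already filled the grid (1 = obstacle, 0 = traversable) and localization has supplied the vehicle's start cell.

```python
import heapq

def plan_path(grid, start, goal):
    """A* path search over a 2D occupancy grid.

    grid  - list of rows; 1 = obstacle (rock, hole), 0 = traversable
    start - (row, col) cell from localization
    goal  - (row, col) target waypoint
    Returns the list of cells from start to goal, or None if unreachable.
    """
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan-distance heuristic to the goal
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    # Priority queue entries: (cost + heuristic, cost so far, cell, path)
    open_set = [(h(start), 0, start, [start])]
    visited = set()
    while open_set:
        _, cost, cell, path = heapq.heappop(open_set)
        if cell == goal:
            return path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                if nxt not in visited:
                    heapq.heappush(
                        open_set,
                        (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
    return None  # goal unreachable

# A tiny example map: two "walls" force the planner to zig-zag.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
path = plan_path(grid, (0, 0), (4, 3))
```

In a real off-road system the grid would be rebuilt continuously from sensor data, cells would carry traversal costs (sand vs. gravel) rather than a binary flag, and the plan would be re-run as perception updates – but the shape of the problem is the same.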

If we have all the above - what can we do?

  1. Waypoint navigation - waypoints are marked on the map and the vehicle identifies the best path through them while detecting and avoiding obstacles.
  2. Follow me - keeping a specified distance from one specific person or vehicle. Stopping when needed, avoiding obstacles - e.g. when a person moves around a corner, the vehicle shouldn't drive through the wall (although it would look cool).
  3. Manual remote control - the operator either has line of sight (usually a bad idea, you want to stay hidden) or sees through the cameras and guides the vehicle.
  4. Some other stuff I am not supposed to share publicly :)
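The follow-me behaviour above can be sketched as a simple proportional speed controller: speed up when the gap to the target grows, stop when close enough. The function name, gains, and distances below are all illustrative assumptions, not values from Milrem's software.

```python
def follow_speed(distance_to_target, desired_gap=5.0, gain=0.8, max_speed=5.0):
    """Proportional speed command for a toy follow-me behaviour.

    distance_to_target - metres to the followed person/vehicle (from perception)
    desired_gap        - standoff distance to maintain, in metres
    gain               - proportional gain mapping gap error to speed (1/s)
    max_speed          - speed cap, in m/s
    Returns the commanded forward speed in m/s.
    """
    error = distance_to_target - desired_gap
    if error <= 0:
        return 0.0                       # at or inside the gap: hold position
    return min(gain * error, max_speed)  # too far back: speed up, but cap it
```

A real implementation would also steer toward the target, ramp speed smoothly, and hand obstacle avoidance to the path planner rather than blindly chasing the target around corners – the point here is only the distance-keeping logic.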

So that's autonomy in a nutshell. I haven't decided yet on the topic for the next episode (I have a few options). You are also welcome to give your feedback - what would you like to read about?

See also the previous posts if you have missed them:

  1. Intro & Summer Camp
  2. Career model
  3. Our Engineering setup

Fred Labrosse

Roboticist, application driven, liking being out and about in remote places

1y

Interesting points in this post. Do you adapt the control parameters to the conditions of the ground?

Daniel Cameron M.

Intelligence Operations Analyst/Information Security Engineer Lead USG Contractor

2y

Beautiful. Thank you.
