Autonomous simulated car (deep learning)

I made an autonomous car that is trained and drives in a robot simulator.

With very little training data, it is able to generalize to other types of tracks it has not seen before.

The car drives new tracks by approximating their curvature as a sequence of just three discrete steering commands: wheels at a 45° left angle, wheels straight ahead, or wheels at a 45° right angle.

This intentionally results in less smooth driving, but the more exaggerated steering makes the correlation between the steering command and the visual features of the current camera frame easier to learn.
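The three-command formulation effectively turns steering into a three-class image classification problem. Below is a minimal sketch of that idea in PyTorch, not the project's actual code: the network architecture, the 64×64 input size, and the class names are all assumptions for illustration.

```python
# Minimal sketch (assumed architecture, not the original project's code):
# a small CNN that classifies one camera frame into one of three steering commands.
import torch
import torch.nn as nn

STEERING_CLASSES = ["left_45", "straight", "right_45"]  # assumed label order


class SteeringClassifier(nn.Module):
    def __init__(self, num_classes: int = 3):
        super().__init__()
        # Convolutional feature extractor for a 3x64x64 RGB camera frame (assumed size).
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2), nn.ReLU(),
        )
        # Classifier head that outputs logits for the three steering commands.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 6 * 6, 64), nn.ReLU(),
            nn.Linear(64, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


# Example inference: pick the most likely steering command for a (dummy) frame.
model = SteeringClassifier()
frame = torch.rand(1, 3, 64, 64)  # placeholder for a normalized camera frame
command = STEERING_CLASSES[model(frame).argmax(dim=1).item()]
print(command)
```

Trained with a standard cross-entropy loss on labeled frames, a classifier like this can be run on each new camera frame to produce the next of the three steering commands.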


This is a companion discussion topic for the original entry at https://community.robotshop.com/robots/show/autonomous-simulated-car-deep-learning

Planning to move this to the real world using your autonomous robot Range Rover project? Your posts all seem to be going in the same direction: lidar, vehicle, path planning, safety … you want to deliver pizza, right? Jokes aside, well done, and I'm looking forward to seeing your projects merge into one mega project!

Thanks for your nice reply 🙂

Indeed, I plan to take this to a real car, even if it stays inside the house for now. In fact, I already used a Jetson-based robot car that successfully drove along a track delimited with paper sheets using this approach.

I sent in some parts for warranty repair; they should come back in maybe two weeks, and I plan to upload a video then.

Did you upload a video? I looked around but didn't see it. Sorry, I see one now; I thought it was an image. Pretty cool stuff. I used to be a programmer, but old-school stuff: COBOL, JCL, SQL, back on old IBM mainframes. I was just starting to program in Java when I quit my career to teach English in a foreign country, where I have been ever since. I've played with Python a little, but I think my programming days are over, except for Arduino. I might try the ESP32 if a new project comes up that needs a faster processor.