Behavioural Cloning: Tips for Tackling Project 3

Jessica Yung | Self-Driving Car ND

In this post I list tips that may be helpful for tackling Project 3 of Udacity’s Self-Driving Car Nanodegree, in which you train a neural network to drive a car in a simulator. The neural network learns from data of humans driving the car through the simulator, hence the project name ‘Behavioural Cloning’ – it’s trying to imitate the way the human drives. (Scroll down for a bonus video!)

This project is not easy.

I found this project the hardest of all the projects in Term 1. It was open-ended as opposed to checklist-y – even if you complete all the requisite steps (preprocess the data in some way, code a decent model, augment the data), you may still fail. Compare that to Projects 1, 4 and 5, where if you write accurate code to, say, warp the perspective, connect points and draw a line, or run every 50-pixel strip through a histogram function (and maybe tweak a few parameters), you’ll pass. Here, it’s much less certain.

Of course, problems you’d encounter in the ‘real world’ (what world are we living in now, really?) are more like this project – messy, fiddly, open-ended. So it’s good experience. And it’s just a fun project to do. What better way to spend your days than watching your baby car repeatedly drive into lakes or up mountains?

But as in the real world, there are also resources like Stack Overflow and other people that can help you on your way. So here are some tips and fixes I would have found useful. I have included links to other useful resources at the end of the article.

Tips and Fixes

Can’t Train your Model? / Out of Memory (OOM)

If you get an OOM error when you’re compiling your model, it most likely means you have too many parameters in your model. I had a few billion parameters to start with when I was copying NVIDIA’s model. 😛

  • You can check the number of parameters in your model using model.summary() in Keras.
  • You can reduce the number of parameters in your model by subsampling (striding) your convolutions, which shrinks the feature maps that feed later layers.
    • Do this by adding a subsample=(stride_x, stride_y) parameter to your layer (in Keras 1.x, subsample is the convolution stride). An example:
      • Before: model.add(Convolution2D(16, 8, 8, border_mode="same"))
      • After: model.add(Convolution2D(16, 8, 8, subsample=(4, 4), border_mode="same"))
  • If you have more than 10 million parameters, think about making your model smaller. People have succeeded with fewer than 10,000 parameters.
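For a concrete sense of the difference, here’s a minimal sketch using the same Keras 1.x API as the snippets above. The input shape matches the simulator’s 160×320 images, but the layer sizes are made up for illustration – this is not a model you’d actually submit:

```python
from keras.models import Sequential
from keras.layers import Convolution2D, Flatten, Dense

# Hypothetical toy model for illustration only.
model = Sequential()
# Without subsample this layer outputs a 160x320x16 volume (819,200 values);
# with subsample=(4, 4) it outputs 40x80x16 (51,200), shrinking every
# downstream fully-connected layer's parameter count by 16x.
model.add(Convolution2D(16, 8, 8, subsample=(4, 4),
                        border_mode="same", input_shape=(160, 320, 3)))
model.add(Flatten())
model.add(Dense(1))  # single steering-angle output

model.summary()  # prints each layer's output shape and parameter count
```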

If you get an OOM error when you’re training your model, it most likely means your computer cannot hold all your training data in memory.

  • Note that this may happen even if you feed only 3 images into your model when trying to train it. This may happen if you have previously loaded all your images (e.g. earlier on in your Jupyter Notebook).
    • The short-term solution is not to load all those images before you train your model.
  • The long-term solution is to use generators. They allow you to load/fetch data when you need it instead of keeping it all in memory. Udacity now has a page teaching you how to use generators (Behavioural Cloning Lesson Part 17, L12.17 as of Mar 18, 2017).
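Here’s a minimal generator sketch, assuming Keras 1.x and that samples is a list of (image_path, steering_angle) pairs you’ve already built from your driving log; train_samples and a compiled model are assumed to exist, and the loading details are hypothetical:

```python
import numpy as np
import cv2
from sklearn.utils import shuffle

def generator(samples, batch_size=32):
    """Yield batches forever, loading images from disk only when needed."""
    while True:
        samples = shuffle(samples)  # reshuffle at the start of each epoch
        for offset in range(0, len(samples), batch_size):
            batch = samples[offset:offset + batch_size]
            images = [cv2.imread(path) for path, angle in batch]
            angles = [angle for path, angle in batch]
            yield np.array(images), np.array(angles)

# Keras 1.x: samples_per_epoch counts samples, not batches.
train_gen = generator(train_samples, batch_size=32)
model.fit_generator(train_gen, samples_per_epoch=len(train_samples), nb_epoch=10)
```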

I don’t know where to start

[Image: Car by the Ocean]

I don’t know what to do to make my car not crash

  • Observe HOW your car is going off the track and try to reason intuitively about why. Is it distracted by irrelevant information? (Crop your input images – see the sketch after this list.) Is it veering left because most of the training data has the car driving to the left? (Include training data where the car is driving to the right.)
  • Hint: it’s much easier to intuitively alter (1) your training data or (2) how that data is preprocessed than to intuitively alter your neural network to solve your problems.
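For instance, cropping out the sky and the bonnet is a one-layer fix in recent Keras 1.x versions. A sketch – the crop amounts here are guesses you’d tune by plotting a cropped image:

```python
from keras.models import Sequential
from keras.layers import Cropping2D, Lambda

model = Sequential()
# Crop 60 rows of sky off the top and 25 rows of bonnet off the bottom.
model.add(Cropping2D(cropping=((60, 25), (0, 0)), input_shape=(160, 320, 3)))
# Normalise pixels to roughly [-0.5, 0.5] so training behaves better.
model.add(Lambda(lambda x: x / 255.0 - 0.5))
# ... convolutional and dense layers go here ...
```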

 

How do I know how many epochs to train my model for?

  • Use checkpoints so you can test drive your model trained for different numbers of epochs. Checkpoints save your model weights after each training epoch.
  • You’ll get a feel for the range of epochs where your models usually perform best. You can use this to limit the number of epochs you train your model for.
    • If you can, leave a generous buffer (e.g. 5-10 epochs) between your projected ‘best performance epoch’ and the total number of epochs you train for. This is because slightly altered models or datasets can behave completely differently (see next point).
  • Do check a wide range of epochs each time you test iterations of your model though. 10 epochs worked best for most of my models, but for one I had peak performance at 8 epochs, and for another at 19 epochs.
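A checkpointing sketch in Keras 1.x (X_train and y_train stand in for whatever training arrays you already have):

```python
from keras.callbacks import ModelCheckpoint

# Save the model after every epoch so each one can be test-driven later.
checkpoint = ModelCheckpoint('model-{epoch:02d}.h5')

model.fit(X_train, y_train, validation_split=0.2,
          nb_epoch=20, callbacks=[checkpoint])
```

You can then load each saved model in turn and test drive it in the simulator to see which epoch count actually drives best.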

My additional training data only makes my model worse

  • Drive using the mouse instead of the keyboard.
  • If it still doesn’t work, scrap your additional data. Good data + Bad data is worse than a smaller amount of good data.
    • Be prepared: augmenting your training data by rotating or flipping images might also make your model perform worse (a flipping sketch follows this list).
  • Because of this, LABEL YOUR ADDITIONAL DATA DIFFERENTLY. Keep it in separate directories from your original data if possible. If that is not possible, create a directory where you keep a copy of your original data.
    • This is so you can easily remove sections of data you don’t want.
  • Have you used Udacity’s training data? You can train decent models with it, though you will need to preprocess it and/or augment it.
  • Ask other students if you can use their data. E.g. Annie Flippo was generous enough to let me use her recovery data.
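Here’s a sketch of the flipping augmentation mentioned above (flip_augment is a hypothetical helper name) – keep the flipped copies identifiable so they’re easy to throw away if they hurt:

```python
import numpy as np

def flip_augment(images, angles):
    """Return horizontally flipped copies with negated steering angles.

    Store these separately from your original data so they can be
    removed easily if they make the model worse.
    """
    flipped_images = np.array([np.fliplr(img) for img in images])
    flipped_angles = -np.array(angles)
    return flipped_images, flipped_angles
```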

What works one day might not work the next

This is especially important if you are pressed for time. Record your vehicle driving when it works! A model that works (drives through the track) one day might not work the next, even when you use the same simulator. If you record your model successfully driving through the track, your reviewer may be inclined to pass your project even if the model mysteriously stops working.

Bonus: here’s a just-about-functioning model:

Ask for help in the forums and in Slack

Seriously. People there know so much and are generous in helping other students.

Further useful resources:

 
