Exercise 1 - Control
Published: Wednesday Oct. 7.
Exercise 1: Pure Pursuit Control
Duckiebot Setup
- Make sure your robot is assembled correctly in the DB19 configuration
- Initialize your SD card
- Calibrate your camera (intrinsic and extrinsic)
- Calibrate your wheels
Checkpoint
- You should be able to run the lane following procedure. The robot should at least look like it is trying to follow the lane.
Getting Started with Exercises
Fork the dt-exercises repo and clone it onto your computer.
Set up an upstream remote. From inside the directory you just cloned:
$ git remote add upstream git@github.com:duckietown/dt-exercises.git
Now to pull anything new from the original repo you can do:
$ git pull upstream daffy
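If you also want your fork on GitHub to pick up these updates, you can push them back (assuming your fork's remote kept the default name origin):
$ git push origin daffy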
From inside the dt-exercises folder that you just cloned, enter the control exercise directory:
$ cd exercises/control
The Anatomy of an Exercise
The things that are important for now:
- Inside the exercise_ws folder is where your code is going to live, specifically in the src subdirectory. This is where you will have to change things. If you don't change the name of the package from lane_controller, it will get run, since a lane_controller is part of the lane following stack that is getting launched and we've made sure (with workspace overlaying) that it picks this one to run.
- In notebooks you will see the notebook that we already went over in class.
- In launchers you will see some bash scripts that launch your code. Specifically, the dts exercises test command is configured to run the one called run.sh, and dts challenges (more on this in a minute) is set up to run the one called submit.sh.
- In the assets folder are some other useful things. For example, in the assets/setup folder are a bunch of .yaml files that contain environment variables that are loaded when you run with dts exercises test.
- There's also a Dockerfile; this is only needed when you use dts challenges.
Building your Code
You can start by building your code with:
$ dts exercises build
If you go inside the exercise_ws folder you will notice that there are more folders that weren't there before. These are build artifacts that persist from the build procedure because of mounting.
Note: every time you run a dts exercises command you have to be inside an exercise folder or you will see an error.
Running in Simulation
You can run your current solution in the gym simulator with:
$ dts exercises test --sim
Then you can watch what's happening through the browser at http://localhost:8087.
Open up rqt_image_view, resize it, and choose /agent/camera_node/image/compressed in the dropdown. You should see the image from the robot in the simulator.
You might want to launch a virtual joystick by opening a terminal and doing:
$ dt-launcher-joystick
By default the Duckiebot is in joystick control mode, so you can freely drive it around. You can also set it to LANE FOLLOWING mode by pushing the a button while the virtual joystick window is active. If you do so, you will see the robot move forward slowly and never turn.
Testing Your Algorithm on the Robot
If you are using a Linux laptop, you have two options: local (i.e., on your laptop) and remote (i.e., on the Duckiebot). If you are a Mac user, unfortunately, I would stick to the remote option. To run “locally”:
$ dts exercises test --duckiebot_name ![ROBOT_NAME] --local
To run on the Duckiebot:
$ dts exercises test --duckiebot_name ![ROBOT_NAME]
In both cases you should still be able to look at things through novnc by pointing your browser to http://localhost:8087. If you are running on Linux, you can load up the virtual joystick and start lane following as above.
Starting Lane Following on Mac
Since we can't publish messages from a Mac and have them received by ROS, we have to do something slightly different. In a new terminal on your Mac do:
$ docker -H ![ROBOT_NAME].local exec agent launchers/start_lane_following.sh
This will run the start_lane_following.sh bash script inside the agent container, which initiates LANE_FOLLOWING mode.
Similarly, you can stop your Duckiebot from lane following by doing:
$ docker -H ![ROBOT_NAME].local exec agent launchers/stop_lane_following.sh
You could also do an equivalent thing through the Portainer interface in the dashboard.
Your Task
You will need to edit the lane_control package in exercise_ws/src.
You probably only need to touch 3 files (but you could do things any way you like):
- config/lane_controller_node/default.yaml contains the parameters that will get loaded by your ROS node.
- src/lane_controller_node.py is the actual ROS node that will get run. I prefer to do pretty much just data marshalling inside the ROS node, so that I can use my code elsewhere if I want; a minimal sketch of that split is shown after this list.
- include/lane_controller/controller.py is where the Python class that I would use to do most of the logic is located.
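To make the marshalling idea concrete, here is a minimal sketch of what the node could look like. This is not the required implementation: the topic names (segment_list, car_cmd), the parameter names, and the PurePursuitLaneController class (sketched after the next paragraph) are all assumptions that you would adapt to your own design.

#!/usr/bin/env python3
# Hypothetical src/lane_controller_node.py: all ROS plumbing lives here;
# the actual logic lives in the controller class (see the next sketch).
import rospy
from duckietown_msgs.msg import SegmentList, Twist2DStamped

from lane_controller.controller import PurePursuitLaneController  # assumed class


class LaneControllerNode:
    def __init__(self):
        # These parameters would come from config/lane_controller_node/default.yaml.
        v = rospy.get_param("~v", 0.2)
        lookahead = rospy.get_param("~lookahead_dist", 0.4)
        self.controller = PurePursuitLaneController(v, lookahead)
        self.pub_cmd = rospy.Publisher("car_cmd", Twist2DStamped, queue_size=1)
        # Assumed input: ground-projected line segments from the lane following stack.
        self.sub_segs = rospy.Subscriber("segment_list", SegmentList,
                                         self.cb_segments, queue_size=1)

    def cb_segments(self, msg):
        # Marshal the ROS message into plain tuples so the controller stays ROS-free.
        points = []
        for seg in msg.segments:
            if seg.color == seg.RED:
                continue  # ignore stop lines
            color = "yellow" if seg.color == seg.YELLOW else "white"
            for p in seg.points:
                points.append((color, p.x, p.y))
        v, omega = self.controller.compute_control(points)
        cmd = Twist2DStamped()
        cmd.header = msg.header
        cmd.v = v
        cmd.omega = omega
        self.pub_cmd.publish(cmd)


if __name__ == "__main__":
    rospy.init_node("lane_controller_node")
    LaneControllerNode()
    rospy.spin()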
You should implement a pure pursuit controller. Exactly how you do it is up to you, but a central problem is how to get the reference trajectory. For this I suggest using the lane markings directly: find the lane markings that are ahead of you at your prescribed look-ahead distance, and then adjust a little so that your reference is not on the edge of the road but down the middle of the lane. One possible version of that logic is sketched below.
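Here is a minimal sketch of what include/lane_controller/controller.py could contain, under the same assumptions as above; the class name, the color strings, and all the numbers are placeholders to tune, not prescribed values.

import numpy as np


class PurePursuitLaneController:
    """Sketch of pure pursuit on ground-projected lane markings (hypothetical API)."""

    def __init__(self, v, lookahead_dist, center_offset=0.1):
        self.v = v                    # constant forward speed [m/s]
        self.L = lookahead_dist       # look-ahead distance [m]
        self.offset = center_offset   # shift from a marking toward the lane center [m]

    def compute_control(self, points):
        """points: list of (color, x, y) in the robot frame, x forward, y left."""
        # 1. Keep points that are ahead of us and roughly one look-ahead away.
        near = [(c, x, y) for (c, x, y) in points
                if x > 0 and abs(np.hypot(x, y) - self.L) < 0.1]
        if not near:
            return self.v, 0.0  # no usable markings: just drive straight
        # 2. Shift each marking toward the lane center (the yellow line is on
        #    the left of the lane, the white line on the right) and average.
        xs, ys = [], []
        for color, x, y in near:
            xs.append(x)
            ys.append(y - self.offset if color == "yellow" else y + self.offset)
        tx, ty = float(np.mean(xs)), float(np.mean(ys))
        # 3. Pure pursuit law: the circular arc from the robot (at the origin,
        #    heading along +x) through the target (tx, ty) has curvature
        #    2 * ty / d^2, and omega = v * curvature.
        kappa = 2.0 * ty / (tx ** 2 + ty ** 2)
        return self.v, self.v * kappa

The main design choice is the look-ahead distance: a larger value gives smoother but lazier tracking, while a smaller one tracks tightly but can oscillate, so it is worth exposing it as a parameter and tuning it separately in simulation and on the robot.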
Submitting your Agent
Once you are happy with things, you can make your submission. This will actually be a submission to the AI Driving Olympics at NeurIPS. If you want to see how you will do locally before sending it for cloud evaluation, you can do:
$ dts challenges evaluate --challenge aido5-LF-sim-validation
This will give you a link to follow the progress of the simulation in the browser. You can also look at the output files at the end to see your scores. TODO: Add more details here.
Then you can finally submit it with:
$ dts challenges submit --challenge aido5-LF-sim-validation
You can also test your submission on your Duckiebot with:
$ dts duckiebot evaluate
This is how we will run your submissions on Duckiebots for evaluation.
To submit your assignment, you should make two different submissions to the aido5-LF-sim-validation challenge. One of them should be optimized to run in the simulator, and the other to run on the robot. You should change your submission's label in the file submission.yaml to user-label: sim-exercise-1 before submitting for the simulation, and to user-label: real-exercise-1 for the real robot. The output that you get on the challenge server for the real-exercise-1 submission does not matter; we will run that submission on our own robots for the evaluation.
Note that you can use the same code for both submissions, but having two different submissions will allow you to tune parameters for the real robot and the simulator separately.
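For example, before a simulator submission the relevant line of submission.yaml would read as follows (assuming user-label is a top-level key, and leaving the rest of the file untouched):
user-label: sim-exercise-1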
You should also submit a video of your code running on your robot with dts duckiebot evaluate. This will be useful in case a problem happens when we try to run your code on our robot. You can submit the video here.
This assignment is worth 10% of your grade: 5% for the performance in simulation (on the challenge server), and 5% on the real robot (when we run your code on our robot).
Please report problems quickly: on Discord, in class, on Slack, as GitHub issues, etc. If you are not sure whether something is working properly, ask. Don't spend hours trying to fix it first.
Some things that might help…
Take a look at the options of dts exercises test:
optional arguments:
-h, --help show this help message and exit
--duckiebot_name DUCKIEBOT_NAME, -b DUCKIEBOT_NAME
Name of the Duckiebot on which to run the exercise
--sim, -s Should we run it in the simulator instead of the real robot?
--stop just stop all the containers
--staging, -t Should we use the staging AIDO registry?
--local, -l Should we run the agent locally (i.e. on this machine)? Important Note: this is not expected to work on MacOSX
--debug See extra debugging output
--pull Should we pull all of the images
--restart_agent, -r Flag to only restart the agent container and nothing else. Useful when you are developing your agent
You might also explore the other outputs that you can look at in rqt_image_view.
Also useful are some debugging outputs that are published and visualized in RViz.
You can open RViz through the terminal in the novnc desktop by typing:
$ rviz
In the window that opens, click “Add”, then switch to the “By topic” tab and find the segment_markers; you should see the projected segments appear. Do the same for the pose_markers.
Another tool that may be useful is rqt_plot, which can also be opened through the terminal in novnc. It opens a window where you can add “Topics” in the text box at the top left, and then you will see the data get plotted live.
All of this data can also be viewed as raw data through the command line. Take a look at the rostopic command line utilities.
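For example, from a terminal in the novnc desktop (the topic names below are only examples; use rostopic list to see what your stack actually publishes):
$ rostopic list
$ rostopic echo /agent/lane_filter_node/lane_pose
$ rostopic hz /agent/camera_node/image/compressed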
The deadline is set for Monday Nov. 2.