Course Description

Title: IFT 6757 Autonomous Vehicles (aka. Duckietown)

Instructor: Liam Paull



Background and Description:

Self-driving vehicles are poised to become one of the most pervasive and impactful applications of autonomy, and have received a great deal of attention of late. The development of self-driving vehicles provides the opportunity to address difficult challenges that are shared throughout the robotics domain and, more broadly, by the fields of machine learning, computer vision, and artificial intelligence. These challenges include the co-design of hardware components and algorithms, the coupled interaction between perception and control, understanding and quantifying uncertainty, the optimal allocation of finite computational resources to concurrent processes, and safe multi-agent behaviors.

This course considers problems in perception, navigation, and control, and their systems-level integration in the context of self-driving vehicles, through an open-source curriculum for autonomy education that emphasizes hands-on experience. Integral to the course, students will collaborate to implement concepts covered in lecture on a low-cost autonomous vehicle with the goal of navigating a model town complete with roads, signage, traffic lights, obstacles, and citizens. The wheeled platform is equipped with a monocular camera and performs all processing onboard with a Raspberry Pi. It must: follow lanes while avoiding obstacles, pedestrians, and other robots; localize within a global map; navigate a city; and coordinate with other robots to avoid collisions. The platform and environment are carefully designed to allow a sliding scale of difficulty in perception, inference, and control tasks, making them usable in a wide range of settings, from undergraduate-level education to research-level problems.

Important Note 1: Every student will be given their own robot to build, personalize and love for the semester.

Important Note 2: The course will be taught in conjunction with the AI Driving Olympics live competition that will take place at NIPS 2018. Students will submit entries to the competition as part of the class.

Important Note 3: This is a completely open source class. As a result, this class will never be the same twice, since we always build on what already exists. Students who do a great job may see their work become the new repository standard that others try to beat in subsequent iterations.


The teaching objectives that we most care about are:

  • Have the students understand how methods from heterogeneous disciplines such as control theory, machine learning, computer vision, and artificial intelligence are integrated to create a complex autonomous system.

  • Discuss the co-design constraints and the design trade-offs explicitly. A typical example is a trade-off like “use cheap mechanisms with sophisticated algorithms” vs “use reliable hardware and simple algorithms”.

  • Familiarize the students with the basic practices of reliable system development, including test-driven and data-driven development.

  • Familiarize the students with the tools and the dynamics of software and hardware open-source development. All homework is shared on a central Git repository. After each milestone, everybody’s software is readable for everybody to re-use.


The course will cover the theory and application of probabilistic techniques for autonomous mobile robotics with particular emphasis on their application in the context of self-driving vehicles. Topics include probabilistic state estimation and decision making for mobile robots; stochastic representations of the environment; dynamic models and sensor models for mobile robots; algorithms for mapping and localization; planning and control in the presence of uncertainty; cooperative operation of multiple mobile robots; mobile sensor networks.

Following is a list of topics discussed in the class, roughly ordered from “metal” to “systems”:

  • Autonomy architectures

  • Sensors and models (kinematics/dynamics)

  • Computer vision (intrinsic/extrinsic calibration, illumination invariance, feature extraction, line detection, place recognition)

  • Nonlinear filtering (localization in a given map, or building your own map: SLAM)

  • Navigation and planning (mission planning, motion planning and control basics)

  • Complex perception pipelines (object detection, traffic-sign reading, and tracking; both model-based and learning-based)

  • Safety (formal guarantees)

  • Shared control (human machine interaction)

  • Multi-robot systems (fleet-level planning, reasoning about the intent of other drivers)

  • Machine learning (deep learning, reinforcement learning for navigation and object detection)
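To give a flavour of the "metal" end of the list above, here is a minimal sketch of differential-drive kinematics, the motion model underlying a two-wheeled platform like the one used in class. This is an illustrative example, not course-provided code; the function and parameter names (e.g. `baseline`, `diff_drive_step`) are hypothetical.

```python
import math

def diff_drive_step(x, y, theta, v_left, v_right, baseline, dt):
    """One Euler-integration step of differential-drive kinematics.

    v_left, v_right: wheel linear speeds (m/s)
    baseline: distance between the two wheels (m)
    dt: time step (s)
    Returns the updated pose (x, y, theta).
    """
    v = (v_right + v_left) / 2.0           # forward speed of the chassis
    omega = (v_right - v_left) / baseline  # yaw rate
    return (x + v * math.cos(theta) * dt,
            y + v * math.sin(theta) * dt,
            theta + omega * dt)

# Equal wheel speeds drive straight: 0.1 m/s for 1 s moves 0.1 m along x.
pose = diff_drive_step(0.0, 0.0, 0.0, 0.1, 0.1, 0.1, 1.0)
```

In the course, a model like this is where the "co-design" trade-off shows up concretely: a cheap platform with no wheel encoders forces the software to compensate with better estimation.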


Meeting Times: M 10:30am–12:30pm, W 11:30am–1:30pm


Pre-requisites: Permission of the instructor. Please come to the first class and fill out an application and/or email the instructor to discuss.

Intended Enrollment: 18 students

Intended Degree Level: Senior Undergraduates and Graduates

Grading Scheme:

  • Exercises (1) - 10%
  • Submissions to the AI-DO (2) - 30%
  • “Being a Good Citizen” - 20%
  • Project - 40%:
    • Presentation - 10%
    • Course Notes - 10%
    • Demo/Instructional - 20%