About Canoo

Canoo has developed breakthrough electric vehicles that are reinventing the automotive landscape with bold innovations in design, pioneering technologies, and a unique business model that moves beyond traditional ownership to put customers first. Distinguished by its experienced team – more than 500 employees from leading technology and automotive companies – Canoo has designed a modular electric platform purpose-built to maximize vehicle interior space and adaptable to a wide range of vehicle applications for consumers and businesses. With offices around the country, the company is scaling quickly and seeking candidates who love to challenge themselves, are motivated by autonomy and purpose, and get things done.

Job Purpose

As a Behavior Prediction Engineer, you will develop and maintain software for vehicle perception using a variety of sensor inputs, including cameras, LIDAR, RADAR, ultrasonic sensors, GPS, IMU, and other devices on the vehicle CAN bus. You will be responsible for a range of prediction tasks, such as predicting the position and velocity of critical agents over time, predicting cut-in and cut-out agents, and predicting the closest in-path vehicle.


  • Design and implement prediction algorithms for agent state estimation in Canoo autonomous vehicle products
  • Design, implement, test, and maintain system tools for product development and product verification
  • Optimize prediction algorithm performance
  • Develop unit tests and simulation test cases for code compliance with ISO 26262 Part 6
  • Develop KPIs and performance metrics to continually improve prediction algorithms
  • Contribute to and enhance Canoo's intellectual property

Required Experience

  • BS, MS, or PhD in Computer Science, Computer Engineering, Electrical Engineering, or a closely related field, or equivalent experience
  • 3+ years of work or lab experience
  • Demonstrated performance in a position requiring engineering technical excellence
  • Strong communication skills and a preference for working in teams
  • Strong interpersonal skills; self-motivated and comfortable operating without direct supervision
  • Background in object detection/tracking, sensor fusion, and time synchronization
  • Background working on autonomous driving or robotics with multiple sensors (camera, LIDAR, RADAR, ultrasonic)
  • Experience writing software for real-time embedded systems with GPUs
  • Experience with deep neural network training and optimization in leading frameworks (e.g., PyTorch, TensorFlow)

Preferred Experience

  • Experience working on autonomous vehicles, robotics, self-driving cars, GPU technology, imaging, cameras, LIDAR, RADAR, or USS
  • Experience developing software for ISO 26262 Part 6 compliance
  • Familiarity with operating-system concepts, AUTOSAR, embedded Linux, functional safety, and ISO 26262
  • Experience with automotive electronic modules and sensors

What's Cool About Working Here...

  • Four months of paid primary caregiver leave
  • Flexible PTO
  • Participation in the Employee Equity Compensation Plan
  • Casual workplace with an unbelievable feeling of energy
  • Work in a high-growth startup that will redefine urban mobility

Canoo is an equal opportunity, affirmative action employer and considers all qualified applicants for employment based on business needs, job requirements, and individual qualifications, without regard to race, color, religion, sex, age, disability, sexual orientation, gender identity or expression, marital status, past or present military service, or any other status protected by the laws or regulations in the locations where we operate.

Any unsolicited resumes or candidate profiles submitted in response to our job posting shall be considered the property of Canoo Inc. and its subsidiaries and are not subject to payment of referral or placement fees if any such candidate is later hired by Canoo, unless you have a signed written agreement in place with us that covers the applicable job posting.