May Mobility is transforming cities through autonomous technology to create a safer, greener, more accessible world. Based in Ann Arbor, Michigan, May develops and deploys autonomous vehicles (AVs) powered by our innovative Multi-Policy Decision Making (MPDM) technology that reimagines the way AVs think.
Our vehicles do more than just drive themselves - they provide value to communities, bridge public transit gaps and move people where they need to go safely, easily and with a lot more fun. We’re building the world’s best autonomy system to reimagine transit by minimizing congestion, expanding access and encouraging better land use in order to foster more green, vibrant and livable spaces.
Since our founding in 2017, we’ve given more than 300,000 autonomy-enabled rides to real people around the globe. And we’re just getting started. We’re hiring people who share our passion for building the future today, solving real-world problems and seeing the impact of their work. Join us.
Job Summary
As a Senior Robotics Engineer - LIDAR/Sensor Fusion, you will lead efforts to solve the challenging and complex problems facing vehicle autonomy. You will lead the design of future May autonomy systems and track down and resolve the system’s most difficult issues.
Autonomy system engineers are responsible for the integrated functioning of the autonomy system. Systems engineers interact with vision, LIDAR, RADAR, tracking, intent, decision-making, control, networking, health monitoring, computers, graphics cards, and power. System engineers drive architectural and hardware/software design forward through collaboration with other engineering groups. You will be a go-to person for understanding how new features will fit in, how the system might fail, how best to evolve the system to be more effective or efficient, and how to keep the system working.
Your Opportunity to Drive Success
You will have an opportunity to independently impact our approach to solving the most interesting problems facing AVs today, while operating live in the wild.
- Collaborate with other machine learning and robotics engineers to design, implement, test, and deploy robust, generalizable perception solutions for autonomous vehicles
- Work independently with cross-functional teams to develop software and system requirements
- Develop, test, and deploy software in C/C++/CUDA
- Lead team code quality activities including design and code reviews
- Track and trend technical performance of the system in the field
- Provide technical guidance to Technical Support Team on issue diagnosis and resolution
Required Qualifications
- 3+ years of industry experience working on real-world robot perception systems (you should have implemented at least two commercial projects at a systems level, developing perception software as part of a larger robotics system)
- Master’s or PhD degree in Robotics, Computer Vision, Machine Learning, Artificial Intelligence, Computer Science, or Computer Engineering, or a field that requires a strong mathematical and/or engineering foundation (e.g. physics, aerospace engineering)
- Experience with object detection, segmentation, and tracking in LiDAR, vision, or LiDAR-camera sensor fusion
- Solid experience applying machine learning or deep learning technologies to LiDAR or sensor fusion
- Strong experience with large-scale data-driven development and data analysis systems for understanding LiDAR, vision, and sensor fusion systems
- Strong programming skills in C/C++ and experience with software development in a Linux environment
- Familiarity with standard development tools such as Git and CI/CD pipelines
- Experience with fast ML development cycles
- Experience with ML model optimization and deployment
Desired Qualifications
- Strong background in one of the robot perception areas discussed above, demonstrated by developing and delivering multiple perception capabilities that solve critical problems on fielded robots
- Demonstrated ability to mentor and support more junior engineers in learning and contributing to robotics development and testing
- Experience with Python, machine learning software frameworks, and GPU programming in CUDA or related higher-level languages
Benefits and Perks
- Health benefits including vision and dental
- Unlimited paid vacation days and generous holidays
- Paid parental leave
- Meaningful stock options
- Daily catered lunches and snacks
- Flexible schedule around core business hours
Don’t meet every single requirement? Studies have shown that women and/or people of color are less likely to apply to a job unless they meet every qualification. At May Mobility, we’re committed to building a diverse, inclusive, and authentic workforce, so if you’re excited about this role but your previous experience doesn’t align perfectly with every qualification, we encourage you to apply anyway! You may be the perfect candidate for this or another role at May.
Want to learn more about our culture & benefits? Check out our website!
May Mobility is an equal opportunity employer. All applicants for employment will be considered without regard to race, color, religion, sex, national origin, age, disability, sexual orientation, gender identity or expression, veteran status, genetics or any other legally protected basis. Below, you have the opportunity to share your preferred gender pronouns, gender, ethnicity, and veteran status with May Mobility to help us identify areas of improvement in our hiring and recruitment processes. Completion of these questions is entirely voluntary. Any information you choose to provide will be kept confidential, and will not impact the hiring decision in any way. If you believe that you will need any type of accommodation, please let us know.
Note to Recruitment Agencies: May Mobility does not accept unsolicited agency resumes. Furthermore, May Mobility does not pay placement fees for candidates submitted by any agency other than its approved partners.