Perception Engineer (III)
Company: Autonomous Solutions
Location: Mendon, Utah
Posted on: February 15, 2026
Job Description:
About Us
At Autonomous Solutions, Inc. (ASI), we are a global leader in vehicle automation solutions. Our technology enables safe, efficient, and scalable automation for industries such as solar, agriculture, construction, landscaping, and more. With a commitment to innovation and excellence, ASI continues to push the boundaries of what is possible in autonomous technology. Our mission is to help you reach your potential through innovative robotic solutions. We pride ourselves on our core values: Safe, Simple, Transparent, Growth, Humble, and Attention to Detail.

About the Role
We are hiring Perception Engineers across multiple levels (I-V) to join our autonomy team. Perception Engineers at ASI are responsible for designing, implementing, and deploying real-time perception systems that help unmanned ground vehicles (UGVs) interpret their surroundings and make intelligent decisions. The role includes sensor evaluation, sensor fusion, tracking, object detection, and deploying code to production systems. This position requires proficiency in C++11 or newer, strong sensor data processing skills (especially LiDAR), and the ability to develop robust, real-time algorithms. Perception Engineers collaborate with teams across ASI, including Embedded Software, Planning, Controls, and Systems Engineering, and work on platforms that operate in diverse and challenging environments. While the role currently includes 50% field deployment work, our goal is to shift toward 90% development. This is a hybrid position that requires three days (Tuesday through Thursday) on-site.

Job Duties
- Develop and deploy real-time perception algorithms using LiDAR, radar, cameras, and ultrasonic sensors.
- Design and implement classical perception systems, including sensor fusion, object tracking, and feature extraction.
- Contribute to machine learning-based perception pipelines as appropriate to project needs.
- Write clean, efficient C++ code optimized for embedded Linux environments.
- Integrate perception software with existing robotic platforms using ROS2 or custom middleware.
- Support field deployment and troubleshooting of perception systems (currently ~50% field work).
- Collaborate with architecture, planning, and control teams to ensure consistent interface design.
- Participate in simulation, HIL testing, and field validation to ensure system robustness.
- Analyze system performance and improve perception robustness in GPS-denied and adverse conditions.
- Work with the architecture team to provide feedback on module standards and interfaces.

Level Breakdown
Perception Engineer I: 0-2 years of experience. Bachelor's degree in Computer Science, Robotics, Electrical Engineering, or a related field. Familiarity with sensor data processing and basic C++ programming. Works under guidance to integrate and test perception systems.
Perception Engineer II: 2-4 years of experience. Bachelor's or Master's degree. Solid skills in C++11, sensor processing (e.g., LiDAR), and algorithm implementation. Contributes to both classical and ML-based perception tasks with some independence.
Perception Engineer III: 4-6 years of experience. Master's degree preferred. Leads development of specific perception features or modules. Works across multiple projects and contributes to deployment architecture.
Perception Engineer IV: 6+ years of experience. Master's degree required. Responsible for complex perception challenges, including fusion across heterogeneous sensors. Provides mentorship and guidance to junior engineers.
Perception Engineer V: 8+ years of experience. Recognized expert in perception for autonomous systems. Sets technical direction for perception architecture and strategy. Leads high-impact projects with multiple stakeholders.

Requirements
Required:
- Bachelor's degree required for Level I; Master's preferred/required for Levels II-V.
- Proficiency in C++11 or newer.
- Experience processing data from LiDAR, radar, or camera systems.
- Familiarity with embedded Linux development.
- Strong understanding of linear algebra and mathematical modeling.
- Ability to contribute to deployment-ready, high-confidence field systems.
Preferred:
- Experience with ROS2, GPU processing, and embedded ML applications.
- Background in object detection, classification, and tracking.
- Experience with communication protocols (UDP, CAN, Serial).
- Familiarity with real-time sensor fusion and edge-case detection strategies.

Benefits
ASI offers a rich benefits package, including:
- 401k with employer match
- Generous HSA contribution
- Employee Stock Ownership Plan
- PTO, Paid Holidays, and Flextime
- ASI pays 90% of the employee's medical plan

EEO Statement
At Autonomous Solutions, Inc. (ASI), we are committed to fostering a diverse, inclusive, and equitable workplace where all employees and applicants have equal opportunities. We prohibit discrimination and harassment of any kind based on race, color, religion, sex, national origin, age, disability, genetic information, veteran status, sexual orientation, gender identity, or any other legally protected characteristic. ASI complies with all applicable federal, state, and local laws regarding non-discrimination in employment and is dedicated to providing reasonable accommodations for individuals with disabilities throughout the hiring process.