AUTO PATROL ROBOT

A compact four-wheeled robot (0.9 m × 0.6 m × 1.2 m) designed for autonomous indoor/outdoor patrol, anomaly detection, and human interaction in city parks. As a core algorithm developer, I led the development of the mapping, localization, autonomous navigation, and behavioral AI modules.


V1  |   Sensors Connection

  • Date:   2023 NOV-2024 MAR
  • Type:   Research Project
  • Role:   Algorithm Developer
  • Software:   ROS1 (Ubuntu), Autoware, Docker
  • Hardware:   Jetson Orin NX, LiDAR, IMU, Stereo Camera, GNSS
  • Description:   This phase focused on sensor mounting, calibration, data synchronization, and system integration. I evaluated multiple Autoware versions, including the ROS2-based Autoware Universe, and settled on a Dockerized Autoware.ai-1.14.

Challenges

        As the sole developer initiating the project, I navigated steep learning curves in ROS architecture and Autoware’s modular dependencies. Key tasks included physical sensor mounting and alignment, ROS node configuration for real-time data streaming, and validation of sensor fusion for localization accuracy. Initial attempts with Autoware Universe (ROS2) and Carla simulation were hindered by environment configuration issues and time constraints, forcing a pivot to Autoware.ai-1.14 in Docker for stability. Despite the lack of prior experience in autonomous systems, I led the transition from simulation to physical testing under tight deadlines, ensuring rapid progress in a resource-limited setting.

V2  |   Autonomous Navigation

  • Date:   2024 APR -
  • Type:   Deployed Project
  • Role:   Algorithm Developer
  • Software:   ROS1 (Ubuntu), move_base
  • Hardware:   Jetson Orin NX, LiDAR, IMU, Camera, etc.
  • Description:   This stage focuses on addressing various engineering challenges in autonomous patrol. Vehicle-level tasks include chassis communication, autonomous path planning, navigation, obstacle avoidance, and docking.

Challenges

        In recent months, my primary focus has been on vehicle-level tasks, including map scanning and drawing, developing Autoware modules, implementing diverse docking strategies, and enhancing obstacle detection. Meanwhile, my colleague focused on testing the new autonomous driving framework, optimizing system-level strategies, and setting up communication with the backend management system.

Path Planning: Initially, we worked with Autoware's global planning modules, OpenPlanner and Freespace Planner. These modules were poorly connected, requiring extensive effort to identify and bridge gaps between the perception, planning, and control subsystems. After becoming familiar with the modules required for autonomous driving, we moved to a self-developed framework that allows various global and local planners to be embedded. The current strategies are Hybrid A* for global planning and TEB (via move_base) for local planning.
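As a reference for how the application layer drives this stack, here is a minimal sketch of dispatching patrol waypoints to move_base through its standard actionlib interface. The waypoint coordinates and node name are hypothetical; the Hybrid A*/TEB pairing itself is selected in the move_base parameter files, not in this client code.

```python
#!/usr/bin/env python
# Minimal sketch: send patrol waypoints to move_base via actionlib.
import rospy
import actionlib
from move_base_msgs.msg import MoveBaseAction, MoveBaseGoal

def send_waypoint(client, x, y):
    goal = MoveBaseGoal()
    goal.target_pose.header.frame_id = "map"
    goal.target_pose.header.stamp = rospy.Time.now()
    goal.target_pose.pose.position.x = x
    goal.target_pose.pose.position.y = y
    goal.target_pose.pose.orientation.w = 1.0  # identity orientation
    client.send_goal(goal)
    client.wait_for_result()
    return client.get_state()

if __name__ == "__main__":
    rospy.init_node("patrol_waypoint_client")
    client = actionlib.SimpleActionClient("move_base", MoveBaseAction)
    client.wait_for_server()
    # Hypothetical patrol loop over two park waypoints.
    for x, y in [(12.0, 3.5), (25.0, -1.0)]:
        state = send_waypoint(client, x, y)
        rospy.loginfo("Waypoint (%.1f, %.1f) finished with state %d", x, y, state)
```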

Obstacle Avoidance: To address the blind spots of the 16-line LiDAR, I initially used an ultrasonic array to detect approaching obstacles. However, due to its high noise and low sampling rate (3-4 Hz), a 1D LiDAR was integrated instead. Data from the 1D LiDAR and one ring of the multi-line LiDAR are combined to achieve stable, high-sensitivity detection. Meanwhile, the ultrasonic array is still being tested against low-height obstacles such as small animals and stones.
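The fusion itself is structurally simple; below is a minimal sketch of the idea, assuming hypothetical topic names (a Range topic for the 1D LiDAR and a LaserScan extracted from one ring of the multi-line unit) and an illustrative 0.5 m safety threshold.

```python
#!/usr/bin/env python
# Sketch: fuse a 1D LiDAR reading with the nearest return from one ring of
# the 16-line LiDAR, and flag a stop when either sees an obstacle too close.
import rospy
from sensor_msgs.msg import Range, LaserScan
from std_msgs.msg import Bool

SAFETY_DIST = 0.5  # metres, illustrative

class BlindSpotGuard(object):
    def __init__(self):
        self.range_1d = float("inf")
        self.range_ring = float("inf")
        self.pub = rospy.Publisher("/blind_spot/stop", Bool, queue_size=1)
        rospy.Subscriber("/lidar_1d/range", Range, self.on_1d)
        rospy.Subscriber("/lidar_16/ring_0", LaserScan, self.on_ring)
        rospy.Timer(rospy.Duration(0.05), self.evaluate)  # 20 Hz check

    def on_1d(self, msg):
        self.range_1d = msg.range

    def on_ring(self, msg):
        valid = [r for r in msg.ranges if msg.range_min < r < msg.range_max]
        self.range_ring = min(valid) if valid else float("inf")

    def evaluate(self, _event):
        nearest = min(self.range_1d, self.range_ring)
        self.pub.publish(Bool(data=nearest < SAFETY_DIST))

if __name__ == "__main__":
    rospy.init_node("blind_spot_guard")
    BlindSpotGuard()
    rospy.spin()
```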

Auto Docking: AprilTag is used for relative positioning during final alignment. Early iterations compensated for unstable vehicle dynamics with multi-route strategies: one used multi-route planning with odometry-based localization, another segmented the approach angles into speed-specific zones to handle large directional deviations. After field validation, however, the improved navigation accuracy allowed simplification to a single directional-calculation function, with a secondary protocol triggering a stop-and-wait state upon tag loss.
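A minimal sketch of that simplified logic, assuming the detector reports the tag as an (x, y) position in the camera frame; the timeout, gain, and speed values are illustrative, not the deployed parameters.

```python
import math
import time

# Sketch: single bearing calculation from the AprilTag pose, plus a
# stop-and-wait watchdog on tag loss.
TAG_LOSS_TIMEOUT = 0.5  # seconds without a detection before stopping
STEER_GAIN = 1.2        # proportional gain on bearing error
APPROACH_SPEED = 0.15   # m/s during final alignment

class DockingController(object):
    def __init__(self):
        self.last_seen = 0.0

    def update(self, tag_pose):
        """tag_pose: (x, y) of the tag in the camera frame in metres,
        or None when no tag is detected this frame.
        Returns (linear_speed, steering_angle)."""
        now = time.time()
        if tag_pose is None:
            if now - self.last_seen > TAG_LOSS_TIMEOUT:
                return 0.0, 0.0  # stop-and-wait until the tag reappears
            return APPROACH_SPEED, 0.0  # brief dropout: coast straight
        self.last_seen = now
        x, y = tag_pose
        bearing = math.atan2(y, x)  # lateral offset -> heading error
        return APPROACH_SPEED, STEER_GAIN * bearing

# Example: tag 1 m ahead, 0.1 m to the left -> small corrective steer.
ctrl = DockingController()
print(ctrl.update((1.0, 0.1)))
```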

Auto Patrol

        A year ago, as a novice handling an IMU and LiDAR for the first time, I could scarcely imagine achieving stable autonomous navigation in city parks within 12 months. Our small, inexperienced team tackled significant challenges in mastering Autoware's complex modules. In May 2025, we fully retired Autoware and switched to a self-designed autonomous driving framework. While the system now operates reliably under standard conditions, edge cases still require refinement, and ongoing work, from sensor fusion to behavioral logic, remains under active development and optimization.

V3  |   Patrol & Voice Interaction

  • Date:   2024 NOV -
  • Type:   Deployed Project
  • Role:   Algorithm Developer
  • Software:   YOLO, MQTT, ROS1
  • Hardware:   Jetson Orin NX, RGB Camera, Microphone Array
  • Description:   Enhanced the robot's application layer with vision-based anomaly detection and tracking, voice interaction, and real-time streaming.

Application Service

        After achieving basic autonomous navigation, I led the development of the application-layer algorithms, starting with a YOLO-based anomaly detection module for identifying behaviors such as smoking and falling. I then integrated the alert system, voice interaction, and vehicle tracking of targeted anomalies, alongside PTZ control for dynamic monitoring. Collaborating with the frontend developer, we finalized a succinct voice-interaction interface to enhance user engagement.

Behavior Monitoring: Initially, an open-source YOLO model detected humans, bicycles, and pets. This was later replaced with a custom-trained model targeting smoking, falls, fighting, and similar behaviors. To address low smoking-detection accuracy, I implemented a two-step strategy: first detecting cigarettes, then verifying their position relative to identified individuals.
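The verification step reduces to a geometric check between detection boxes. A minimal sketch follows; the box format, upper-region ratio, and margins are assumptions for illustration, and the actual detections come from the custom-trained YOLO model.

```python
# Sketch: a detection counts as "smoking" only when a cigarette box lies
# inside the expanded upper (head/hand) region of a person box.
def in_upper_region(person, cig, upper_ratio=0.4, margin=0.1):
    """person, cig: (x1, y1, x2, y2) boxes with y increasing downward."""
    px1, py1, px2, py2 = person
    w, h = px2 - px1, py2 - py1
    # Expand the person box slightly and keep only its upper portion.
    region = (px1 - margin * w, py1 - margin * h,
              px2 + margin * w, py1 + upper_ratio * h)
    cx = (cig[0] + cig[2]) / 2.0
    cy = (cig[1] + cig[3]) / 2.0
    return region[0] <= cx <= region[2] and region[1] <= cy <= region[3]

def smoking_alerts(person_boxes, cigarette_boxes):
    return [p for p in person_boxes
            if any(in_upper_region(p, c) for c in cigarette_boxes)]

# Example: a cigarette near the first person's head triggers an alert.
people = [(100, 50, 180, 260), (300, 60, 370, 270)]
cigs = [(150, 90, 160, 100)]
print(smoking_alerts(people, cigs))  # -> [(100, 50, 180, 260)]
```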

Voice Interaction: Developing the voice interaction and alert system posed challenges in managing overlapping audio streams with different priorities and in eliminating echo during sound capture. The final solution combined a dual-microphone array for directional sound pickup with a speaker strategically positioned on the side to prevent feedback from the robot's own voice.
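One way to express the priority handling is a small preemptive playback queue, sketched below; the priority levels and clip names are illustrative rather than the production design.

```python
import heapq
import itertools

# Sketch: higher-priority clips (safety alerts) are played before
# lower-priority ones (greetings, patrol prompts).
PRIORITY = {"alert": 0, "interaction": 1, "ambient": 2}  # lower value wins

class AudioQueue(object):
    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # FIFO order within one priority

    def submit(self, kind, clip):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._counter), clip))

    def next_clip(self):
        return heapq.heappop(self._heap)[2] if self._heap else None

q = AudioQueue()
q.submit("ambient", "greeting.wav")
q.submit("alert", "fall_detected.wav")
print(q.next_clip())  # -> fall_detected.wav (the alert preempts the greeting)
```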

PTZ Control: The application-layer algorithm was responsible for issuing motion commands to the PTZ system. Due to frequent changes in the control scheme and the departure of the embedded developer, I took over embedded development for three iterations. Initial versions used an Arduino Uno to drive two stepper motor controllers, with infrared switches for reset and control. The final version integrated encoders and geared stepper motors, significantly improving precision and reliability.
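On the application side, the commands reduce to a thin serial client. The sketch below illustrates the idea; the wire format, port, and baud rate are hypothetical, since the real protocol changed across the three iterations.

```python
import serial  # pyserial

# Sketch: send relative pan/tilt step commands to the PTZ microcontroller.
# The "P<steps>,T<steps>\n" format is an assumed protocol for illustration.
class PTZClient(object):
    def __init__(self, port="/dev/ttyUSB0", baud=115200):
        self.link = serial.Serial(port, baud, timeout=0.1)

    def move(self, pan_steps, tilt_steps):
        """Relative move in stepper steps; the sign encodes direction."""
        self.link.write(("P%d,T%d\n" % (pan_steps, tilt_steps)).encode("ascii"))

    def home(self):
        self.link.write(b"HOME\n")  # triggers the infrared-switch reset routine

if __name__ == "__main__":
    ptz = PTZClient()
    ptz.home()
    ptz.move(200, -50)  # pan right, tilt down slightly (illustrative values)
```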

User Experience

        The final user interaction and platform management experience were heavily shaped by on-site constraints. Nearly half of the development time was spent in field testing, refining functionality such as vehicle stability, obstacle avoidance, voice response sensitivity, and real-time video streaming. Early deployments routinely ended in failure, plagued by unexpected issues such as loose cables, network latency, and minor hardware faults. Over months of iterative debugging and optimization, we finally achieved a relatively stable system.

V4  |   Four-Wheel-Drive Vehicle

  • Date:   2024 OCT -
  • Type:   Research Project
  • Role:   Mechanical & Embedded Engineer
  • Software:   Rhinoceros
  • Hardware:   4WD Suspension & Steering System
  • Description:   A dedicated testbed designed for algorithm validation, featuring a 4-wheel independent steering chassis with full suspension, a modular middle layer for trash bin pickup and docking, and a top-mounted sensor platform with detachable screens and sensor mounts.

Testbed Vehicle

        During a period of high chassis failure rates, our algorithm team urgently needed a stable test vehicle. Due to staffing changes in the mechanical and embedded departments, I independently designed a versatile test platform. The chassis featured four stepper motors for wheel direction control and two hub motor drivers for propulsion. However, the project was temporarily paused by an urgent pivot to robotic arm development, and the stepper motors and their control algorithms were repurposed for the 4-Axis Robotic Arm.

Four Wheel Drive: The current vehicle uses a front-drive, rear-omniwheel configuration, which causes oversteering and instability on uneven terrain. By transitioning to four independently steered and suspended wheels, the new platform supports Ackermann steering, zero-radius turning, and crab-walking, significantly enhancing maneuverability and stability during navigation and docking tasks.
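The three modes differ only in where the instantaneous centre of rotation (ICR) sits. A minimal sketch of the per-wheel steering-angle maths follows; the wheelbase and track values are assumptions, not the real chassis dimensions.

```python
import math

# Axes: x forward, y to the left. Angles are wheel steering angles in radians.
WHEELBASE = 0.60  # m (assumed)
TRACK = 0.50      # m (assumed)

# Wheel positions relative to the chassis centre: FL, FR, RL, RR.
WHEELS = {
    "FL": ( WHEELBASE / 2,  TRACK / 2),
    "FR": ( WHEELBASE / 2, -TRACK / 2),
    "RL": (-WHEELBASE / 2,  TRACK / 2),
    "RR": (-WHEELBASE / 2, -TRACK / 2),
}

def _fold(angle):
    """Fold into [-pi/2, pi/2]; the wheel drives in reverse when folded."""
    while angle > math.pi / 2:
        angle -= math.pi
    while angle < -math.pi / 2:
        angle += math.pi
    return angle

def ackermann(radius):
    """Every wheel tangent to a circle about the ICR at (0, radius)."""
    return {n: _fold(math.atan2(x, radius - y)) for n, (x, y) in WHEELS.items()}

def crab(angle):
    """All wheels parallel: the chassis translates without yawing."""
    return {n: angle for n in WHEELS}

def spin_in_place():
    """Zero-radius turn: each wheel perpendicular to its radius from centre."""
    return {n: _fold(math.atan2(x, -y)) for n, (x, y) in WHEELS.items()}

if __name__ == "__main__":
    for n, a in sorted(ackermann(2.0).items()):  # gentle 2 m left turn
        print(n, "%.1f deg" % math.degrees(a))
```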

Power Board: This vehicle required extensive 24V and 12V power distribution from a 48V battery input. To minimize space usage, I integrated two 240W power modules and designed a compact custom power board in EasyEDA, ensuring efficient energy management.
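For a quick sanity check on rail sizing: only the 48V input and the two 240W modules come from the design itself; the converter efficiency below is an assumed figure.

```python
# Current budget per rail, given one 240 W module per rail.
RAILS = {"24V": (24.0, 240.0), "12V": (12.0, 240.0)}  # volts, module watts

for name, (volts, watts) in RAILS.items():
    print("%s rail: %.0f A available" % (name, watts / volts))
# -> 24V rail: 10 A, 12V rail: 20 A

# Input draw at full load, assuming ~90% converter efficiency (hypothetical):
print("48V input: %.1f A" % ((240.0 + 240.0) / 0.9 / 48.0))  # ~11.1 A
```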

Future Plan

        This 4WD test vehicle, currently parked at the empty workstation right behind me, is scheduled for further development and testing in May, following the completion of the 4-Axis Robotic Arm project. It will serve as a critical tool for validating advanced navigation and perception algorithms in dynamic environments.