A compact four-wheeled robot (0.9m × 0.6m × 1.2m) designed for autonomous indoor/outdoor patrol, anomaly detection, and human interaction in city parks. As a core algorithm developer, I led the development of the mapping, localization, autonomous navigation, and behavioral AI modules.
Project phases: Sensor Connection → Autonomous Navigation → Patrol & Voice Interaction → Four-Wheel Drive

        As the sole developer initiating the project, I navigated steep learning curves in the ROS architecture and Autoware's modular dependencies. Key tasks included physical sensor mounting and alignment, ROS node configuration for real-time data streaming, and validation of sensor fusion for localization accuracy. Initial attempts with Autoware Universe (ROS 2) and CARLA simulation were hindered by environment-configuration issues and time constraints, forcing a pivot to Autoware.ai 1.14 in Docker for stability. Despite having no prior experience in autonomous systems, I led the transition from simulation to physical testing under tight deadlines, ensuring rapid progress in a resource-limited setting.
        After achieving basic autonomous navigation, I led the development of application-layer algorithms, starting with a YOLO-based anomaly-detection module for identifying behaviors such as smoking and falling. I also integrated the alert system, voice interaction, and vehicle tracking for targeted anomalies, alongside PTZ control for dynamic monitoring. Collaborating with a frontend developer, we finalized a succinct voice-interaction interface to enhance user engagement.
Behavior Monitoring: Initially, an open-source YOLO model detected humans, bicycles, and pets. It was later replaced with a custom-trained model targeting smoking, falls, fighting, and other behaviors. To address low smoking-detection accuracy, I implemented a two-step strategy: first detecting cigarettes, then verifying their position relative to identified individuals.
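The two-step check can be sketched as follows. This is a minimal illustration, not the project's exact code: the box format and the "head region" heuristic (a cigarette should sit near the top of a person box) are assumptions for the example.

```python
# Hypothetical sketch of the two-step smoking check: raise a "smoking"
# event only when a detected cigarette box overlaps the upper region of
# a detected person box. Boxes are (x1, y1, x2, y2) tuples.

def boxes_overlap(a, b):
    """Axis-aligned overlap test for two (x1, y1, x2, y2) boxes."""
    return a[0] < b[2] and b[0] < a[2] and a[1] < b[3] and b[1] < a[3]

def head_region(person):
    """Top third of a person box, where a held cigarette would appear."""
    x1, y1, x2, y2 = person
    return (x1, y1, x2, y1 + (y2 - y1) / 3)

def smoking_events(persons, cigarettes):
    """Pair each cigarette with the first person whose head region it overlaps."""
    events = []
    for cig in cigarettes:
        for person in persons:
            if boxes_overlap(cig, head_region(person)):
                events.append((person, cig))
                break
    return events
```

Filtering on relative position like this rejects cigarettes detected far from any person (e.g. on the ground), which is what recovered the precision lost by the single-stage detector.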
Voice Interaction: Developing the voice-interaction and alert system posed challenges in managing overlapping audio streams with different priorities and in eliminating echo during sound capture. The final solution used a dual-microphone array for directional sound pickup, while positioning the speaker on the robot's side to prevent feedback from its own voice.
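The priority arbitration between overlapping streams can be sketched with a simple heap-based queue. This is an illustrative model, assuming integer priorities (larger = more urgent, e.g. alerts over routine greetings); actual playback is abstracted away.

```python
# Minimal sketch of priority-based audio arbitration for a single
# speaker channel. Priorities and clip names are placeholder values.
import heapq

class AudioArbiter:
    def __init__(self):
        self._queue = []   # min-heap of (-priority, seq, clip)
        self._seq = 0      # tie-breaker preserving submission order

    def submit(self, clip, priority):
        """Queue a clip; higher priority plays first."""
        heapq.heappush(self._queue, (-priority, self._seq, clip))
        self._seq += 1

    def next_clip(self):
        """Return the most urgent pending clip, or None when idle."""
        if not self._queue:
            return None
        return heapq.heappop(self._queue)[2]
```

Negating the priority turns Python's min-heap into a max-heap, and the sequence counter keeps equal-priority clips in first-in, first-out order.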
PTZ Control: The application-layer algorithm was responsible for issuing motion commands to the PTZ system. Due to frequent changes in the control scheme and the departure of the embedded developer, I took over embedded development for three iterations. Initial versions used an Arduino Uno driving two stepper-motor controllers, with infrared switches for reset and control. The final version added encoders and geared stepper motors, significantly improving precision and reliability.
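The gain from adding encoders can be illustrated with a closed-loop step calculation: the commanded motion is computed from the measured angle rather than an assumed one. The gear ratio, steps per revolution, and deadband below are placeholder values, not the robot's real parameters.

```python
# Illustrative closed-loop PTZ correction: convert the angular error
# reported by the encoder into a signed stepper command.

STEPS_PER_REV = 200      # typical 1.8-degree stepper (assumed)
GEAR_RATIO = 10          # assumed gearbox reduction
STEPS_PER_DEG = STEPS_PER_REV * GEAR_RATIO / 360.0

def correction_steps(target_deg, encoder_deg, deadband_deg=0.2):
    """Signed steps to move; zero inside the deadband to avoid hunting."""
    error = target_deg - encoder_deg
    if abs(error) < deadband_deg:
        return 0
    return round(error * STEPS_PER_DEG)
```

Without the encoder term, missed steps accumulate silently; with it, each command starts from ground truth, which is what made the final iteration reliable.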
        The final user-interaction and platform-management experience was heavily shaped by on-site constraints. Nearly half of the development time was spent in field testing, refining functionalities such as vehicle stability, obstacle avoidance, voice-response sensitivity, and real-time video streaming. Early deployments routinely ended in failure, plagued by unexpected issues like loose cables, network latency, and minor hardware faults. Over months of iterative debugging and optimization, we eventually achieved a relatively stable system.
        During a period of high chassis failure rates, our algorithm team urgently needed a stable test vehicle. With staffing changes in the mechanical and embedded departments, I independently designed a versatile test platform. The chassis featured four stepper motors for wheel-direction control and two hub-motor drivers for wheel rotation. However, this project was temporarily paused by the abrupt start of robotic-arm development, and the stepper motors and their control algorithms were repurposed for the 4-Axis Robotic Arm.
Four Wheel Drive: The current vehicle uses a front-drive, rear-omniwheel configuration, which causes oversteering and instability on uneven terrain. By transitioning to four independently steered and suspended wheels, the system now supports Ackermann steering, zero-radius turning, and crab-walking, significantly enhancing maneuverability and stability during navigation and docking tasks.
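The Ackermann mode above reduces to simple geometry: for a given turn radius, the inner wheel must steer more sharply than the outer one. A minimal sketch, with the wheelbase and track as placeholder dimensions rather than the vehicle's actual measurements:

```python
import math

# Illustrative Ackermann front-wheel angles for a left turn, with the
# turn radius measured to the vehicle centerline. Dimensions are assumed.

WHEELBASE = 0.6   # m, front-to-rear axle distance (placeholder)
TRACK = 0.5       # m, left-to-right wheel spacing (placeholder)

def ackermann_angles(radius):
    """Return (inner, outer) steering angles in degrees.

    Each wheel is steered so its axis passes through the common turn
    center; the inner wheel sits on a tighter radius than the outer.
    """
    inner = math.degrees(math.atan(WHEELBASE / (radius - TRACK / 2)))
    outer = math.degrees(math.atan(WHEELBASE / (radius + TRACK / 2)))
    return inner, outer
```

Zero-radius turning and crab-walking fall out of the same hardware: point all four wheels tangent to a circle around the vehicle center, or set them all parallel at the same angle.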
Power Board: This vehicle required extensive 24V and 12V power distribution from a 48V battery input. To minimize space usage, I integrated two 240W power modules and designed a compact custom power board in EasyEDA, ensuring efficient energy management.
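The sizing logic behind the two modules can be sketched as a per-rail power budget. The load figures below are placeholders for illustration, not measured draws from the actual vehicle.

```python
# Illustrative power-budget check: a 48V battery feeds two assumed
# 240W converter modules, one for the 24V rail and one for the 12V rail.

RAILS = {          # rail voltage -> converter capacity in watts (assumed)
    24.0: 240.0,
    12.0: 240.0,
}

def rail_headroom(voltage, loads_amps):
    """Remaining watts on a rail after the listed loads draw current."""
    used = sum(amps * voltage for amps in loads_amps)
    return RAILS[voltage] - used
```

Keeping positive headroom on each rail under worst-case simultaneous loads (motors stalling while the PTZ and compute stack run) is the constraint the board layout had to satisfy.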
        This 4WD test vehicle, currently parked at an empty workstation right behind my seat, is scheduled for further development and testing in May, following the completion of the 4-Axis Robotic Arm project. The platform will serve as a critical tool for validating advanced navigation and perception algorithms in dynamic environments.