A versatile research platform for human-robot physical interaction, emphasizing emotional sensing, collaborative behaviors, and tactile feedback. Emotional sensing refers to the robot's ability to detect and respond to human emotions, aiming for a more empathetic and intuitive interaction. The collaborative behaviors are designed to engage humans in joint activities, creating a sense of shared experience. Tactile feedback is a crucial feature that adds a physical dimension to interactions, making the robot feel more 'human' and relatable.
        Rising stress and mental discomfort in contemporary society have created a pressing need for innovative approaches to sustaining mental health and emotional resilience. Traditional support systems - online strategies and community programs - often lack immediacy or require significant resources. In response to these gaps, my work focuses on human-robot interactions that facilitate physical connection, aiming to alleviate stress and gradually rebuild social contact. This research aims to introduce computer vision techniques for detecting subtle facial emotions, investigate the impact of tactile feedback, and explore collaborative working in human-robot interactions.
Early Versions: Early prototypes faced high friction in the horizontal rotation plate and motor burnout from an imbalanced head. To resolve this, I integrated a large bearing for reduced friction and added a counterweight at the arm's end to balance the secondary arm, ensuring smoother operation and preventing motor overload.
        Current sensing capabilities of this interactive platform are primarily based on computer vision, enabling it to detect and track human presence. A pivotal feature is its ability to 'search' for humans when they move out of its field of vision. Initially, the robot could only sense and track people, which lacked a certain depth of interaction and made it feel 'inhuman'. The introduction of the searching behavior, however, was a leap toward making the robot feel alive: actively looking around after losing track of a human gives it a more natural, living presence.
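The searching behavior can be sketched as a small state machine: track proportionally while a face is detected, and sweep the pan axis back and forth once the target has been lost for some time. The class below is a minimal illustration; the method names, gains, and timeout are assumptions, not the platform's actual API.

```python
class SearchTracker:
    """Track a face while visible; sweep the pan axis when it is lost.

    Angles are in degrees. Gains, limits, and the timeout are
    illustrative placeholders, not the platform's tuned values.
    """

    LOST_TIMEOUT = 1.0    # seconds without a detection before searching
    SWEEP_STEP = 5.0      # degrees moved per search step
    PAN_MIN, PAN_MAX = -90.0, 90.0

    def __init__(self):
        self.pan = 0.0        # current pan angle command
        self.direction = 1.0  # current sweep direction
        self.last_seen = 0.0  # timestamp of the last detection

    def update(self, face_x_offset, now):
        """face_x_offset: horizontal offset of the face in view, or None if lost."""
        if face_x_offset is not None:
            self.last_seen = now
            self.pan += 0.1 * face_x_offset   # proportional tracking toward the face
        elif now - self.last_seen > self.LOST_TIMEOUT:
            self.pan += self.direction * self.SWEEP_STEP   # sweep to search
            if not (self.PAN_MIN < self.pan < self.PAN_MAX):
                self.direction = -self.direction           # reverse at the limits
        self.pan = max(self.PAN_MIN, min(self.PAN_MAX, self.pan))
        return self.pan
```

Each call returns the next pan command, so the same loop serves both tracking and searching without a separate mode switch.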
        Touch feedback is achieved indirectly through power control and current monitoring of these motors, as they do not inherently provide torque feedback. When the robot is stationary, the motors are set to release tension, allowing them to be moved manually. During movement, a significant rise in current indicates that the motion is obstructed. If the robot's position is physically altered, the human detection and tracking system prompts the motors to return to their intended path, resulting in a responsive touch feedback experience.
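The "significant rise in current" check can be expressed as a simple threshold against the expected free-motion current. This is a sketch under assumed values; the averaging window and threshold ratio are placeholders, not measured parameters of the platform.

```python
def detect_obstruction(current_samples, baseline, threshold_ratio=1.5):
    """Flag an obstruction when motor current rises well above baseline.

    current_samples: recent current readings in amps.
    baseline: expected free-motion current in amps.
    threshold_ratio: rise factor treated as 'significant' (assumed value).
    Returns True when the averaged current exceeds the threshold.
    """
    avg = sum(current_samples) / len(current_samples)  # smooth out sensor noise
    return avg > baseline * threshold_ratio
```

Averaging a short window before comparing avoids reacting to single-sample spikes from motor commutation.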
        Designed for potential outdoor use, this robotic arm prioritizes lightweight construction and safety. The focus is on minimizing moving-part weight while implementing a self-balancing mechanism to reduce motor torque requirements. By dynamically adjusting motion based on torque feedback, the system ensures user safety and protects the arm from damage during operation, combining mechanical efficiency with adaptive control.
Tolerance Design: To mitigate assembly errors from machining tolerances, I designed shaft holes with +0.02mm clearance and motor screw holes with +0.2mm allowance for manual adjustment. However, the sheet metal frame's laser cutting and welding introduced unpredictable inaccuracies. To address this, I added alignment slots with +0.125mm tolerance, ensuring precise component fitting.
Counterweight balancing: To achieve gravity-based self-balancing, I utilized the Kangaroo module in Rhino's Grasshopper plugin to simulate torque distribution across the robotic arm's joints under a 1kg load at various angles. Based on these simulations, I designed a hybrid tension system combining wires, linkages, pulleys, and springs. By calculating required spring forces and displacements, I reverse-engineered the optimal spring specifications for dynamic equilibrium.
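The spring-sizing calculation can be sketched from statics: the payload produces a gravity torque about the joint, the wire over the pulley must supply a matching tension, and the spring rate follows from the force change between two poses. The 1 kg load comes from the simulations above; the link length, pulley radius, and wire displacement below are assumed example values.

```python
import math

G = 9.81          # m/s^2
MASS = 1.0        # kg payload used in the Kangaroo simulations
LINK = 0.4        # m, assumed link length to the payload
PULLEY_R = 0.03   # m, assumed pulley radius where the wire acts

def gravity_torque(angle_deg):
    """Torque about the joint from the payload at a given arm angle
    (0 deg = horizontal, where the torque is largest)."""
    return MASS * G * LINK * math.cos(math.radians(angle_deg))

def required_spring_force(angle_deg):
    """Wire tension the spring must supply through the pulley."""
    return gravity_torque(angle_deg) / PULLEY_R

def spring_constant(angle_a, angle_b, displacement):
    """Spring rate covering the force change between two poses,
    given the wire displacement between them (assumed measured)."""
    df = abs(required_spring_force(angle_a) - required_spring_force(angle_b))
    return df / displacement
```

With these example numbers, the worst case is the horizontal pose, and the spring rate scales inversely with the available wire travel, which is why the pulley and linkage geometry matters as much as the spring itself.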
Stepper Motor: The stepper motors are compactly arranged on one side of the base, with shafts extended via couplings to connect to corresponding axes. Each motor features a unique ID, enabling TTL/CAN bus control to minimize cable clutter and streamline communication.
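Addressed bus control means every command frame carries the target motor's ID, so all motors share one cable. The framing below (header byte, ID, command, 16-bit value, checksum) is a generic illustration of the idea only, not the actual TTL/CAN protocol these motors use.

```python
def make_command_frame(motor_id, command, value):
    """Build an addressed command frame for a bus of ID'd motors.

    Layout (illustrative, not the real protocol):
    [0xAA header][motor id][command][value high byte][value low byte][checksum]
    """
    payload = bytes([
        0xAA,                  # frame header
        motor_id & 0xFF,       # which motor on the shared bus should act
        command & 0xFF,        # e.g. set position / set velocity
        (value >> 8) & 0xFF,   # 16-bit value, high byte
        value & 0xFF,          # 16-bit value, low byte
    ])
    checksum = sum(payload) & 0xFF   # simple additive checksum
    return payload + bytes([checksum])
```

Because every frame names its target, motors that see another ID simply ignore the frame, which is what makes the single-cable daisy chain work.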
        Each axis is equipped with a limit switch for homing and physical stoppers to prevent the robotic arm from exceeding its designed motion range.
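A homing routine with a limit switch typically drives the axis toward the switch, then backs off to define the zero position. The sketch below assumes generic `step_motor` / `limit_switch_pressed` callables standing in for the platform's real interfaces, and a step bound standing in for the physical stopper range.

```python
def home_axis(step_motor, limit_switch_pressed, backoff_steps=50):
    """Drive an axis toward its limit switch, then back off to define zero.

    step_motor(direction) moves one step; limit_switch_pressed() reads the
    switch. Both are assumed stand-ins for the platform's interfaces.
    Returns the number of steps taken to reach the switch.
    """
    steps = 0
    while not limit_switch_pressed():
        step_motor(-1)              # seek the switch one step at a time
        steps += 1
        if steps > 20000:           # safety bound within the stopper range
            raise RuntimeError("limit switch not found")
    for _ in range(backoff_steps):  # back off so the switch releases
        step_motor(+1)
    return steps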
        I am currently developing motion control algorithms, including position, velocity, and torque control modes. The system is on track for completion by April, with ongoing testing to ensure stability and responsiveness.
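The position mode can be sketched as a standard PID loop around the measured joint angle. This is a minimal illustration of one of the three control modes; the gains are placeholders, not the tuned values under test.

```python
class PositionController:
    """Minimal PID position loop, sketching the position control mode.

    Gains here are placeholder values, not the platform's tuned ones.
    """

    def __init__(self, kp=1.0, ki=0.0, kd=0.1):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target, measured, dt):
        """Return the motor command for one control step of duration dt."""
        error = target - measured
        self.integral += error * dt                    # accumulated error
        derivative = (error - self.prev_error) / dt    # error rate of change
        self.prev_error = error
        # Command units depend on the motor driver (e.g. duty cycle or velocity).
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Velocity and torque modes can reuse the same loop shape with the measured quantity swapped, which keeps the three modes consistent to test against each other.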
        Following the completion of motion control development, the next phase focuses on enhancing perception and adaptive motion models. This aims to create a shared experience between the robot and users, enhancing the sense of collaboration and connection through responsive, context-aware interactions.