• Affordable Robotic Hand Solutions: We are exploring cost-effective yet durable materials to build affordable robotic hands, making high-quality robotic assistance accessible regardless of economic constraints. By selecting such materials, we aim to reduce overall manufacturing costs without compromising the longevity or performance of the robotic hands. We will investigate dexterous in-hand manipulation so that the hand can perform a wide array of tasks, from picking up delicate objects to exerting significant force when necessary, and we will pursue user-centric customization through modular designs that allow users to tailor the robotic hand to their specific needs.
  • Microrobots for Biomedical Applications: Our goal is to harness the potential of microrobots to revolutionize biomedical interventions with solutions that are less invasive, more precise, and ultimately lead to better patient outcomes. Our research focuses on designing microrobots that can navigate through bodily fluids to deliver drugs to targeted sites, minimizing side effects and maximizing efficacy, and on utilizing microrobots for tasks such as single-cell biopsies and cellular repairs. To ensure the microrobots operate with precision, we incorporate imaging techniques such as MRI, ultrasound, and photoacoustic imaging to guide them through the human body, and we will develop algorithms that allow microrobots to adapt and respond to the dynamic environments within the body.
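Imaging-guided navigation of this kind can be framed as a feedback loop: the imaging system estimates the microrobot's position, and a controller issues an actuation command (for example, a magnetic-field gradient) that drives the error to the target site toward zero. A minimal proportional-control sketch; the gain, the single-step motion model, and all numbers are illustrative assumptions, not parameters of any real system:

```python
import numpy as np

def steering_command(imaged_position, target, gain=0.5):
    """Proportional guidance: return an actuation vector pointing from
    the position estimated by imaging toward the target site.
    The gain value is illustrative."""
    error = np.asarray(target, dtype=float) - np.asarray(imaged_position, dtype=float)
    return gain * error

# Simulate a few feedback iterations toward a target site, assuming
# (purely for illustration) that each command moves the robot directly.
pos = np.array([0.0, 0.0])
target = [1.0, 2.0]
for _ in range(10):
    pos = pos + steering_command(pos, target)
print(pos)  # approaches [1. 2.]
```

In practice the motion model would be replaced by the microrobot's actual actuation dynamics, and the position estimate would come from the imaging modality rather than being known exactly.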
  • Multimodal Perception: While the past two decades have seen significant advancements in embodied learning, a substantial disparity persists between human perception and that of embodied agents. The human ability to seamlessly integrate a plethora of sensory inputs remains unparalleled. For embodied agents to approach this level of perceptual prowess, they must be equipped to not just see but also hear, touch, and dynamically interact with their environment, subsequently driving informed action decisions. The true evolution of Artificial Intelligence hinges on its ability to cohesively interpret these diverse, multimodal signals. This research area aims to spotlight the latest achievements and foster discourse on the challenges in pursuing a holistic approach to embodied learning across varied sensory modalities.
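One simple baseline for integrating several sensory streams is late fusion: each modality's feature vector is normalized, scaled by an estimated reliability weight, and concatenated into a joint representation. A minimal sketch; the feature values and weights are illustrative stand-ins, not outputs of any specific perception system:

```python
import numpy as np

def fuse_modalities(features, weights):
    """Late fusion: L2-normalize each modality's feature vector, scale it
    by a reliability weight, and concatenate into one joint vector."""
    fused = []
    for f, w in zip(features, weights):
        f = np.asarray(f, dtype=float)
        norm = np.linalg.norm(f)
        fused.append(w * f / norm if norm > 0 else w * f)
    return np.concatenate(fused)

vision = [3.0, 4.0]      # e.g., a visual embedding (illustrative)
touch = [0.0, 1.0, 0.0]  # e.g., a tactile embedding (illustrative)
z = fuse_modalities([vision, touch], weights=[0.7, 0.3])
print(z)  # [0.42 0.56 0.   0.3  0.  ]
```

Learned fusion (e.g., cross-modal attention) replaces the fixed weights with context-dependent ones, but the normalize-weight-combine structure is the same.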
  • Adaptive Human-Robot Shared Control: In today's rapidly evolving technological landscape, the coexistence of humans and robots is inevitable. For this coexistence to be harmonious, it's essential that robots are not just passive tools but active participants that can understand and adapt to human-centric environments. We aim to equip robots with the ability to read and interpret human emotions, intentions, and subtle cues, ensuring proactive and supportive actions in collaborative tasks. We will implement adaptive algorithms that allow robots to learn from their interactions, refining their behavior over time. We will also design robots with fail-safe mechanisms and dynamic risk-assessment capabilities to ensure human safety in shared spaces.
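Shared control of this kind is often realized as linear arbitration between the human's command and the robot's assistive command. A minimal sketch, where the `confidence` signal (the robot's estimate of how well it has inferred the human's intent) and all numbers are illustrative assumptions:

```python
import numpy as np

def shared_control(u_human, u_robot, confidence):
    """Blend human and robot commands by an arbitration factor.

    `confidence` in [0, 1] is a hypothetical intent-inference score;
    higher confidence shifts control authority toward the robot's
    assistive command.
    """
    alpha = np.clip(confidence, 0.0, 1.0)
    return (1.0 - alpha) * np.asarray(u_human) + alpha * np.asarray(u_robot)

# Example: blending 2-D velocity commands with low robot confidence.
u = shared_control([1.0, 0.0], [0.0, 1.0], confidence=0.25)
print(u)  # [0.75 0.25]
```

The adaptive element lies in how `confidence` is computed and updated from interaction history; safety layers (fail-safes, risk assessment) would sit around this arbitration step.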
  • Data-Efficient Robot Learning: Most existing deep imitation learning approaches require a large amount of human demonstration data, while deep reinforcement learning methods require extensive trial and error before the robot obtains the desired policies. This time-consuming and effort-demanding learning process limits the deployment of robot learning methods in real-world applications. To enable robots to learn unknown dynamics and acquire complex visuomotor skills with high efficiency, we will investigate data-efficient robot learning techniques. For example, one- or few-shot learning will allow robots to learn from a small dataset of human demonstrations, or will be used to enhance the efficiency of reinforcement learning.
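In its simplest form, learning from a handful of demonstrations reduces to supervised regression from observed states to demonstrated actions (behavioral cloning). A minimal sketch with synthetic data standing in for real teleoperation logs; the linear model and all values are illustrative assumptions:

```python
import numpy as np

# A few demonstrations: states the human saw and actions they took
# (synthetic data; here the demonstrated mapping is action = 2 * state).
states = np.array([[0.0], [0.5], [1.0], [1.5]])
actions = np.array([0.0, 1.0, 2.0, 3.0])

# Behavioral cloning as supervised regression. With so few samples,
# a low-capacity model (linear least squares with a bias term) is all
# the data can support.
X = np.hstack([states, np.ones((len(states), 1))])
w, *_ = np.linalg.lstsq(X, actions, rcond=None)

def policy(state):
    """Cloned policy: predict the demonstrated action for a new state."""
    return w[0] * state + w[1]

print(policy(2.0))  # ~4.0
```

Few-shot methods go further by meta-learning a prior across tasks, so that a new skill can be fit from one or two demonstrations instead of a fresh dataset.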