BodyCoM

Rethinking Dynamic Whole-body Multicontact Interaction: Towards Next Generation of Collaborative Robots


Due to the safety and affordability requirements that collaborative robots must meet, accuracy and payload capacity have become their major shortcomings. BodyCoM will address this shortcoming by exploring how robots can exploit whole-body contacts to improve performance metrics such as positioning accuracy, repeatability, payload capacity and speed.

Humans have a remarkable ability to manipulate various objects by exploiting physical contact. Can a robot achieve better object manipulation if it learns to exploit physical interactions through multiple contacts, for example by bracing its elbows against the environment? Humans do exactly this: for accurate manipulation, they tend to brace their forearms against the environment. In robotics, this way of using the environment to improve manipulation performance has not been explored before, primarily because rigid robots neither allowed nor required whole-body contacts. However, the advent of new lightweight structures and torque sensors has led to the rapid proliferation of collaborative robots in many different fields. Additional research is needed to bring their capabilities in line with what is typically desired and expected from robots, especially in high-accuracy tasks.

The goal of BodyCoM is to explore how multi-contact interactions can improve the performance of a new generation of robots through several innovative phases of research and development. First, we will analyse how humans use the environment to improve their accuracy and endurance. Second, we will extend the control components to enable whole-body contact. Third, we will develop a novel motor-primitive control architecture that enables goal-directed multi-contact interaction and accelerates the learning of compliant behaviours. By combining the robotics methods developed at the host department with the PI's expertise in physical human-robot interaction, BodyCoM is uniquely positioned to break new ground in the cognitive exploitation of environmental contacts and constraints. The expected project outcome will demonstrate how robots can use whole-body multi-contact interaction for efficient and precise manipulation.


Publications

Journal Articles

A Geometric Approach to Task-Specific Cartesian Stiffness Shaping

Knežević, Nikola; Lukić, Branko; Petrič, Tadej; Jovanovič, Kosta

Journal Article: Journal of Intelligent and Robotic Systems: Theory and Applications, 110 (1), 2024, ISSN: 1573-0409.


Kinematic model calibration of a collaborative redundant robot using a closed kinematic chain

Petrič, Tadej; Žlajpah, Leon

Journal Article: Scientific Reports, 13 (1), pp. 1–12, 2023, ISSN: 2045-2322.


Kinematic calibration for collaborative robots on a mobile platform using motion capture system

Žlajpah, Leon; Petrič, Tadej

Journal Article: Robotics and Computer-Integrated Manufacturing, 79, 102446, 2022, ISSN: 0736-5845.


Conference Papers

Optimizing Robot Positioning Accuracy with Kinematic Calibration and Deflection Estimation

Žlajpah, Leon; Petrič, Tadej

Conference Paper: In: Petrič, Tadej; Ude, Aleš; Žlajpah, Leon (Eds.): Advances in Service and Industrial Robotics, pp. 255–263, Springer Nature Switzerland, Cham, 2023, ISBN: 978-3-031-32606-6.


Partners

JSI Team

Member                        COBISS ID   Role                Period
Petrič Tadej                  30885       PI                  2022–2025
Žlajpah Leon                  03332       Researcher          2022–2025
Brecelj Tilen                 37467       Researcher          2022–2025
Mišković Luka                 54681       Junior Researcher   2022–2025
Kropivšek Leskovar Rebeka     53766       Technician          2022–2025
Reberšek Simon                39258       Technician          2022–2025

Funding source


ARRS grant no.: N2-0269