People
Shubhan Parag Patni, MSc.
All publications
Interactive learning of physical object properties through robot manipulation and database of object measurements
- Authors: Kružliak, A., Hartvich, J., Shubhan Parag Patni, MSc., Ing. Lukáš Rustler, Behrens, J., Abu-Dakka, F.J., Mikolajczyk, K., Kyrki, V., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2024). Piscataway: IEEE, 2024. p. 7596-7603. ISSN 2153-0866. ISBN 979-8-3503-7770-5.
- Year: 2024
- DOI: 10.1109/IROS58592.2024.10802249
- Link: https://doi.org/10.1109/IROS58592.2024.10802249
- Department: Vision for Robots and Autonomous Systems
Abstract:
This work presents a framework for automatically extracting physical object properties, such as material composition, mass, volume, and stiffness, through robot manipulation and a database of object measurements. The framework involves exploratory action selection to maximize learning about objects on a table. A Bayesian network models conditional dependencies between object properties, incorporating prior probability distributions and uncertainty associated with measurement actions. The algorithm selects optimal exploratory actions based on expected information gain and updates object properties through Bayesian inference. Experimental evaluation demonstrates effective action selection compared to a baseline and correct termination of the experiments if there is nothing more to be learned. The algorithm proved to behave intelligently when presented with trick objects with material properties in conflict with their appearance. The robot pipeline integrates with a logging module and an online database of objects, containing over 24,000 measurements of 63 objects with different grippers. All code and data are publicly available, facilitating automatic digitization of objects and their physical properties through exploratory manipulations.
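The action-selection loop described in the abstract (pick the exploratory action with the highest expected information gain, then update the belief by Bayesian inference) can be sketched as follows. This is a minimal illustration under assumed discrete observation models, not the paper's pipeline; the action names and likelihood tables are hypothetical.

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a discrete distribution."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def posterior(prior, lik_o):
    """Bayes update: prior * P(obs | class), normalized."""
    unnorm = [pr * li for pr, li in zip(prior, lik_o)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

def expected_info_gain(prior, likelihood):
    """likelihood[o][c] = P(observation o | class c) for one action.
    EIG = H(prior) - sum_o P(o) * H(posterior given o)."""
    eig = entropy(prior)
    for lik_o in likelihood:
        p_o = sum(pr * li for pr, li in zip(prior, lik_o))
        if p_o > 0:
            eig -= p_o * entropy(posterior(prior, lik_o))
    return eig

# Two hypothetical exploratory actions over three material classes:
prior = [1 / 3, 1 / 3, 1 / 3]
actions = {
    # "squeeze" discriminates well: each observation peaks on one class
    "squeeze": [[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8]],
    # "look" is nearly uninformative for these trick objects
    "look":    [[0.4, 0.3, 0.3], [0.3, 0.4, 0.3], [0.3, 0.3, 0.4]],
}
best = max(actions, key=lambda a: expected_info_gain(prior, actions[a]))
print(best)  # -> squeeze
```

Termination falls out naturally: when no remaining action offers positive expected information gain, there is nothing more to be learned.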
Online elasticity estimation and material sorting using standard robot grippers
- Authors: Shubhan Parag Patni, MSc., Stoudek, P., Chlup, H., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: The International Journal of Advanced Manufacturing Technology. 2024, 132(11-12), 6033-6051. ISSN 1433-3015.
- Year: 2024
- DOI: 10.1007/s00170-024-13678-6
- Link: https://doi.org/10.1007/s00170-024-13678-6
- Department: Vision for Robots and Autonomous Systems
Abstract:
Stiffness or elasticity estimation of everyday objects using robot grippers is highly desired for object recognition or classification in application areas like food handling and single-stream object sorting. However, standard robot grippers are not designed for material recognition. We experimentally evaluated the accuracy with which material properties can be estimated through object compression by two standard parallel jaw grippers and a force/torque sensor mounted at the robot wrist, with a professional biaxial compression device used as reference. Gripper effort versus position curves were obtained and transformed into stress/strain curves. The modulus of elasticity was estimated at different strain points and the effect of multiple compression cycles (precycling), compression speed, and the gripper surface area on estimation was studied. Viscoelasticity was estimated using the energy absorbed in a compression/decompression cycle, the Kelvin-Voigt, and Hunt-Crossley models. We found that (1) slower compression speeds improved elasticity estimation, while precycling or surface area did not; (2) the robot grippers, even after calibration, had a limited capability of delivering accurate estimates of absolute values of Young’s modulus and viscoelasticity; (3) relative ordering of material characteristics was largely consistent across different grippers; (4) despite the nonlinear characteristics of deformable objects, fitting linear stress/strain approximations led to more stable results than local estimates of Young’s modulus; and (5) the Hunt-Crossley model worked best to estimate viscoelasticity from a single object compression. A two-dimensional space formed by elasticity and viscoelasticity estimates obtained from a single grasp is advantageous for the discrimination of the object material properties.
We demonstrated the applicability of our findings in a mock single-stream recycling scenario, where plastic, paper, and metal objects were correctly separated from a single grasp, even when compressed at different locations on the object. The data and code are publicly available.
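The core estimation step, transforming gripper effort/position curves into stress/strain and fitting a linear approximation for Young's modulus (which the abstract reports as more stable than local estimates), can be sketched roughly like this. The function name, contact geometry, and synthetic data below are assumptions for illustration, not the paper's code.

```python
import numpy as np

def youngs_modulus_linear(force_N, position_m, contact_area_m2, initial_size_m):
    """Estimate Young's modulus (Pa) as the slope of a linear stress/strain fit.

    force_N, position_m : arrays from a gripper compression sweep
    contact_area_m2     : gripper jaw / object contact area
    initial_size_m      : undeformed object dimension along the compression axis
    """
    stress = np.asarray(force_N) / contact_area_m2                  # Pa
    strain = (initial_size_m - np.asarray(position_m)) / initial_size_m
    slope, _intercept = np.polyfit(strain, stress, deg=1)           # linear fit
    return slope

# Synthetic sweep: an ideal 100 kPa material, jaw closing 50 mm -> 40 mm,
# contact area 10 cm^2, plus a little force-sensor noise.
rng = np.random.default_rng(0)
pos = np.linspace(0.05, 0.04, 50)
strain = (0.05 - pos) / 0.05
force = 1e5 * strain * 1e-3 + rng.normal(0, 0.02, pos.shape)
E = youngs_modulus_linear(force, pos, 1e-3, 0.05)
print(f"{E / 1e3:.0f} kPa")  # ~100 kPa
```

Fitting one slope over the whole curve averages out sensor noise that would dominate any local (pointwise) tangent estimate, which is one plausible reading of finding (4).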
Single-Grasp Deformable Object Discrimination: The Effect of Gripper Morphology, Sensing Modalities, and Action Parameters
- Authors: Ing. Michal Pliska, Shubhan Parag Patni, MSc., Mareš, M., Stoudek, P., Ing. Zdeněk Straka, Ph.D., Štěpánová, K., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: IEEE Transactions on Robotics. 2024, 40 4414-4426. ISSN 1552-3098.
- Year: 2024
- DOI: 10.1109/TRO.2024.3463402
- Link: https://doi.org/10.1109/TRO.2024.3463402
- Department: Vision for Robots and Autonomous Systems
Abstract:
In haptic object discrimination, the effect of gripper embodiment, action parameters, and sensory channels has not been systematically studied. We used two anthropomorphic hands and two 2-finger grippers to grasp two sets of deformable objects. On the object classification task, we found: (i) among classifiers, SVM on sensory features and LSTM on raw time series performed best across all grippers; (ii) faster compression speeds degraded performance; (iii) generalization to different grasping configurations was limited; transfer to different compression speeds worked well for the Barrett Hand only. Visualization of the feature spaces using PCA showed that gripper morphology and action parameters were the main source of variance, making generalization across embodiment or grip configurations very difficult. On the highly challenging dataset consisting of polyurethane foams alone, only the Barrett Hand achieved excellent performance. Tactile sensors can thus provide a key advantage even if recognition is based on stiffness rather than shape. The data set with 24,000 measurements is publicly available.
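The PCA observation above (gripper morphology and action parameters, not material, dominate the variance of the feature space) can be reproduced on synthetic data. Everything below is an illustrative assumption, not the paper's dataset: a large gripper-specific offset swamps a small per-material signal, so the first principal component tracks the gripper, which is exactly why cross-embodiment generalization is hard.

```python
import numpy as np

def pca_project(X, n_components=2):
    """Project feature vectors onto their top principal components via SVD."""
    Xc = X - X.mean(axis=0)
    _U, _S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

rng = np.random.default_rng(1)
n, d = 40, 6
material = rng.integers(0, 2, size=2 * n)   # 0/1 material label
gripper = np.repeat([0, 1], n)              # which gripper measured the grasp
X = rng.normal(0, 0.1, (2 * n, d))
X += gripper[:, None] * 5.0                 # embodiment offset (large)
X[:, 0] += material * 0.5                   # material signal (small, one feature)
Z = pca_project(X)

# PC1 is dominated by gripper identity, not by the material label:
corr = np.corrcoef(Z[:, 0], gripper)[0, 1]
print(abs(corr) > 0.9)  # -> True
```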
Examining Tactile Feature Extraction for Shape Reconstruction in Robotic Grippers
- Authors: Shubhan Parag Patni, MSc., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: Embracing Contacts Workshop - ICRA 2023. Massachusetts: OpenReview.net / University of Massachusetts, 2023.
- Year: 2023
- Department: Vision for Robots and Autonomous Systems
Abstract:
Different robotic setups provide tactile feedback about the objects they interact with in different ways. This makes it difficult to transfer the information gained from haptic exploration to other setups, and to humans as well. We introduce “touch primitives”, a set of object features for haptic shape representation that aim to reconstruct the shape of objects independently of robot morphology. We investigate how precisely the primitives can be extracted from household objects by a commonly used gripper, on a set of objects that vary in size, shape, and stiffness.
Shape Reconstruction Task for Transfer of Haptic Information between Robotic Setups
- Authors: Shubhan Parag Patni, MSc., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: Transferability in Robotics Workshop - ICRA 2023. Microsoft CMT, 2023.
- Year: 2023
- Department: Vision for Robots and Autonomous Systems
Abstract:
Robot morphology, which includes not only physical dimensions and shape but also the placement and type of actuators and sensors, is highly variable. This also applies to robot hands and grippers equipped with force or tactile sensors. Unlike in computer vision, where information from cameras is largely robot- and camera-independent, haptic information is morphology-dependent, which makes it difficult to transfer object recognition and other pipelines between setups. In this work, we introduce a shape reconstruction and grasping task to evaluate the success of haptic information transfer between robotic setups, and propose feature descriptors that can help in standardizing the haptic representation of shapes across different robotic setups.
Touch Primitives for Gripper-Independent Haptic Object Modeling
- Authors: Shubhan Parag Patni, MSc., doc. Mgr. Matěj Hoffmann, Ph.D.
- Published in: Workshop on effective Representations, Abstractions, and Priors for Robot Learning - ICRA 2023. Microsoft CMT, 2023.
- Year: 2023
- Department: Vision for Robots and Autonomous Systems
Abstract:
Due to the large variety of tactile and proprioceptive sensors available for integration with robotic grippers, the data structures for data collected on different robotic setups are different, which makes it difficult to compile and compare these datasets for robot learning. We propose “Touch Primitives”—a gripper-independent representation for the haptic exploration of object shapes which can be generalized across different gripper and sensor combinations. An exploration and grasping task is detailed to test the efficacy of the proposed touch primitive features.
Recognizing object surface material from impact sounds for robot manipulation
- Authors: Dimiccoli, M., Shubhan Parag Patni, MSc., doc. Mgr. Matěj Hoffmann, Ph.D., Moreno-Noguer, F.
- Published in: Intelligent Robots and Systems (IROS), 2022 IEEE/RSJ International Conference on. Piscataway: IEEE, 2022. p. 9280-9287. ISSN 2153-0866. ISBN 978-1-6654-7927-1.
- Year: 2022
- DOI: 10.1109/IROS47612.2022.9981578
- Link: https://doi.org/10.1109/IROS47612.2022.9981578
- Department: Vision for Robots and Autonomous Systems
Abstract:
We investigated the use of impact sounds generated during exploratory behaviors in a robotic manipulation setup as cues for predicting object surface material and for recognizing individual objects. We collected and make available the YCB-impact sounds dataset which includes over 3,500 impact sounds for the YCB set of everyday objects lying on a table. Impact sounds were generated in three modes: (i) human holding a gripper and hitting, scratching, or dropping the object; (ii) gripper attached to a teleoperated robot hitting the object from the top; (iii) autonomously operated robot hitting the objects from the side with two different speeds. A convolutional neural network (ResNet34) is trained from scratch to recognize the object material (steel, aluminium, hard plastic, soft plastic, other plastic, ceramic, wood, paper/cardboard, foam, glass, rubber) from a single impact sound. On the manually collected dataset with more variability in the action, nearly 60% accuracy for the test set (unseen objects) was achieved. On a robot setup and a stereotypical poking action from the top, accuracy of 85% was achieved. This performance drops to 79% if multiple exploratory actions are combined. Individual objects from the set of 75 objects can be recognized with a 79% accuracy. This work demonstrates promising results regarding the possibility of using sound for recognition in tasks like single-stream recycling where objects have to be sorted based on their material composition.
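A typical front end for an image-style CNN classifier such as the ResNet34 mentioned above is a log-magnitude spectrogram of the impact sound. The sketch below shows such a transform on a synthetic impact (a decaying 1 kHz ring); the frame/hop sizes and the signal itself are assumptions, not the paper's preprocessing.

```python
import numpy as np

def log_spectrogram(audio, frame=256, hop=128):
    """Log-magnitude STFT spectrogram: a 2-D (freq, time) "image" suitable
    as input to a convolutional classifier."""
    window = np.hanning(frame)
    n_frames = 1 + (len(audio) - frame) // hop
    frames = np.stack([audio[i * hop : i * hop + frame] * window
                       for i in range(n_frames)])
    mag = np.abs(np.fft.rfft(frames, axis=1))   # (n_frames, frame//2 + 1)
    return np.log1p(mag).T                      # (freq, time)

# Synthetic impact: a decaying 1 kHz ring, 250 ms at 16 kHz sampling rate.
sr = 16000
t = np.arange(sr // 4) / sr
clip = np.sin(2 * np.pi * 1000 * t) * np.exp(-t * 30)
spec = log_spectrogram(clip)
print(spec.shape)  # -> (129, 30)
```

The dominant frequency bin of the spectrogram sits near 1 kHz (bin 16 at this frame size and sampling rate), which is the kind of material-dependent resonance cue a CNN can learn to exploit.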