All publications

Autonomous Tracking of Honey Bee Behaviors over Long-term Periods with Cooperating Robots

  • DOI: 10.1126/scirobotics.adn6848
  • Link: https://doi.org/10.1126/scirobotics.adn6848
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    Digital and mechatronic methods, paired with artificial intelligence and machine learning, are game-changing technologies in behavioral science. The central element of the most important pollinator species (honeybees) is the colony’s queen. The behavioral strategies of these ecologically important organisms are under-researched, due to the complexity of honeybees’ self-regulation and the difficulties of studying queens in their natural colony context. We created an autonomous robotic observation and behavioral analysis system aimed at 24/7 observation of the queen and her interactions with worker bees and comb cells, generating unique behavioral datasets of unprecedented length and quality. Significant key performance indicators of the queen and her social embedding in the colony were gathered by this tailored but also versatile robotic system. Data collected over 24-hour and 30-day periods demonstrate our system’s capability to extract key performance indicators on different system levels: Microscopic, mesoscopic, and macroscopic data are collected in parallel. Additionally, interactions between various agents are also observed and quantified. Long-term continuous observations yield high amounts of high-quality data when performed by an autonomous robot, going significantly beyond feasibly obtainable results of human observation methods or stationary camera systems. This allows a deep understanding of the innermost mechanisms of honeybees’ swarm-intelligent self-regulation as well as the study of other social insect colonies, biocoenoses and ecosystems in novel ways. Social insects are keystone species in all ecosystems, thus understanding them better will be valuable to monitor, interpret, protect and even to restore our fragile ecosystems globally.

Effective Searching for the Honeybee Queen in a Living Colony

  • DOI: 10.1109/CASE59546.2024.10711366
  • Link: https://doi.org/10.1109/CASE59546.2024.10711366
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    Despite the importance of honeybees as pollinators for the entire ecosystem and their recent decline threatening agricultural production, the dynamics of the living colony are not well understood. In our EU H2020 RoboRoyale project, we aim to support the pollination activity of the honeybees through robots interacting with the core element of the honeybee colony, the honeybee queen. In order to achieve that, we need to understand how the honeybee queen behaves and interacts with the surrounding worker bees. To gather the necessary data, we observe the queen with a moving camera, and occasionally, we instruct the system to perform selective observations elsewhere. In this paper, we deal with the problem of searching for the honeybee queen inside a living colony. We demonstrate that combining spatio-temporal models of queen presence with efficient search methods significantly decreases the time required to find her. This will minimize the chance of missing interesting data on the infrequent behaviors or queen-worker interactions, leading to a better understanding of the queen's behavior over time. Moreover, a faster search for the queen allows the robot to leave her more frequently and gather more data in other areas of the honeybee colony.
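The search strategy described above — combining a spatio-temporal model of queen presence with an efficient visiting order — can be illustrated with a minimal sketch. The cosine daily-rhythm model, the field names, and the greedy ordering are illustrative assumptions for this listing, not the paper's actual method.

```python
import math

# Illustrative sketch: order comb-grid regions for a queen search by a
# time-dependent presence model. The periodic model below is a toy
# assumption standing in for a learned spatio-temporal map.

def presence_probability(prior, amplitude, phase_h, t_hours, period=24.0):
    """Prior presence probability modulated by a daily rhythm."""
    p = prior * (1.0 + amplitude * math.cos(2 * math.pi * (t_hours - phase_h) / period))
    return max(p, 0.0)

def search_order(cells, t_hours):
    """Greedy search: visit regions in descending predicted presence."""
    scored = [(presence_probability(c["prior"], c["amp"], c["phase"], t_hours), c["id"])
              for c in cells]
    return [cid for _, cid in sorted(scored, reverse=True)]

cells = [
    {"id": "brood-centre", "prior": 0.5, "amp": 0.4, "phase": 14.0},
    {"id": "comb-edge",    "prior": 0.2, "amp": 0.8, "phase": 2.0},
    {"id": "honey-band",   "prior": 0.3, "amp": 0.1, "phase": 6.0},
]
order = search_order(cells, t_hours=14.0)  # most likely region first
```

Visiting regions in this order minimises the expected time to find the queen under the model, which is what shortens the search in practice.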

Toward Perpetual Occlusion-Aware Observation of Comb States in Living Honeybee Colonies

  • DOI: 10.1109/IROS58592.2024.10801380
  • Link: https://doi.org/10.1109/IROS58592.2024.10801380
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    Honeybees are one of the most important pollinators in the ecosystem. Unfortunately, the dynamics of living honeybee colonies are not well understood due to their complexity and difficulty of observation. In our project 'RoboRoyale', we build and operate a robot to be a part of a bio-hybrid system, which currently observes the honeybee queen in the colony and physically tracks her with a camera. Apart from tracking and observing the queen, the system needs to monitor the state of the honeybee comb, which is occluded by worker bees most of the time. This introduces a necessary trade-off between tracking the queen and visiting the rest of the hive to create a daily map. We aim to collect the necessary data more effectively. We evaluate several mapping methods that consider the previous observations and forecasted densities of bees occluding the view. To predict the presence of bees, we use previously established maps of dynamics developed for autonomy in human-populated environments. Using data from the last observational season, we show significant improvement of the informed comb mapping methods over our current system. This will allow us to use our resources more effectively in the upcoming season.

Towards Robotic Mapping of a Honeybee Comb

  • DOI: 10.1109/MARSS61851.2024.10612712
  • Link: https://doi.org/10.1109/MARSS61851.2024.10612712
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    Honeybees are irreplaceable pollinators with a direct impact on the global food supply. Researchers focus on understanding the dynamics of colonies to support their health and growth. In our project “RoboRoyale”, we aim to strengthen the colony through miniature robots interacting with the honeybee queen. To assess the colony's health and the effect of the interactions, it is crucial to monitor the whole honeybee comb and its development. In this work, we introduce key components of a system capable of autonomously evaluating the state of the comb without any disturbance to the living colony. We evaluate several methods for visual mapping of the comb by a moving camera and several algorithms for detecting visible cells between occluding bees. By combining image stitching techniques with open cell detection and their localization, we show that it is possible to capture how the comb evolves over time. Our results lay the foundations for real-time monitoring of a honeybee comb, which could prove essential in honeybee and environmental research.
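Once open cells are detected and localised in the stitched comb map, comb development can be tracked by comparing daily snapshots. A minimal sketch, assuming cells are indexed by grid coordinates (an illustrative representation, not the paper's):

```python
# Illustrative sketch: track comb evolution as set differences between
# the open cells detected in two daily stitched maps. Grid-coordinate
# indexing of cells is an assumption for this example.

def comb_change(day1_open, day2_open):
    """Return cells newly opened and cells newly capped/filled."""
    opened = day2_open - day1_open
    closed = day1_open - day2_open
    return opened, closed

day1 = {(0, 0), (0, 1), (2, 3)}
day2 = {(0, 1), (2, 3), (4, 4), (5, 1)}
opened, closed = comb_change(day1, day2)
```

Aggregating such differences over weeks is what makes it possible to capture how the comb evolves over time.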

Bootstrapped Learning for Car Detection in Planar Lidars

  • DOI: 10.1145/3477314.3507312
  • Link: https://doi.org/10.1145/3477314.3507312
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    We present a proof-of-concept method for using bootstrapped learning for car detection in lidar scans using neural networks. We transfer knowledge from a traditional hand-engineered clustering and geometry-based detection technique to deep-learning-based methods. The geometry-based method automatically annotates laser scans from a vehicle travelling around a static car park over a long period of time. We use these annotations to automatically train the deep-learning neural network, and we evaluate and compare this method against the original geometrical method in various weather conditions. Furthermore, by using temporal filters, we can find situations where the original method was struggling or giving intermittent detections, and still automatically annotate these frames and use them as part of the training process. Our evaluation indicates an increased detection accuracy and robustness as sensing conditions deteriorate, compared to the method from which we trained the neural network.
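The temporal-filtering idea above — recovering labels for frames where the hand-coded detector fired intermittently — can be sketched as follows. The boolean per-frame representation and the fixed window size are illustrative assumptions, not the paper's filter.

```python
# Illustrative sketch: when a geometry-based detector fires
# intermittently on a static car, frames where it missed can still be
# auto-labelled if detections exist shortly before and after them.

def fill_intermittent(detections, window=2):
    """Mark a missed frame as positive if detections occur both within
    `window` frames before it and within `window` frames after it."""
    filled = list(detections)
    for i, hit in enumerate(detections):
        if not hit:
            before = any(detections[max(0, i - window):i])
            after = any(detections[i + 1:i + 1 + window])
            if before and after:
                filled[i] = True
    return filled
```

Frames recovered this way capture exactly the hard situations the original detector struggled with, which is why including them in training improves robustness.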

Embedding Weather Simulation in Auto-Labelling Pipelines Improves Vehicle Detection in Adverse Conditions

  • DOI: 10.3390/s22228855
  • Link: https://doi.org/10.3390/s22228855
  • Department: Department of Computer Science, Artificial Intelligence Center
  • Annotation:
    The performance of deep learning-based detection methods has made them an attractive option for robotic perception. However, their training typically requires large volumes of data containing all the various situations the robots may potentially encounter during their routine operation. Thus, the workforce required for data collection and annotation is a significant bottleneck when deploying robots in the real world. This applies especially to outdoor deployments, where robots have to face various adverse weather conditions. We present a method that allows an independent car transporter to train its neural networks for vehicle detection without human supervision or annotation. We provide the robot with a hand-coded algorithm for detecting cars in LiDAR scans in favourable weather conditions and complement this algorithm with a tracking method and a weather simulator. As the robot traverses its environment, it can collect data samples, which can be subsequently processed into training samples for the neural networks. As the tracking method is applied offline, it can exploit the detections made both before the currently processed scan and any subsequent future detections of the current scene, so the quality of the annotations exceeds that of the raw detections. Along with the acquisition of the labels, the weather simulator is able to alter the raw sensory data, which are then fed into the neural network together with the labels. We show how this pipeline, being run in an offline fashion, can exploit off-the-shelf weather simulation for the auto-labelling training scheme in a simulator-in-the-loop manner. We show how such a framework produces an effective detector and how the weather simulator-in-the-loop is beneficial for the robustness of the detector. Thus, our automatic data annotation pipeline significantly reduces not only the data annotation but also the data collection effort.
This allows the integration of deep learning algorithms into existing robotic systems without the need for tedious data annotation and collection in all possible situations. Moreover, the method provides annotated datasets that can be used to develop other methods. To promote the reproducibility of our research, we provide our datasets, codes and models online.
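The simulator-in-the-loop idea — degrading clear-weather sensory data while keeping the auto-generated labels intact — can be illustrated with a toy sketch. The random point drop-out below is an illustrative stand-in for a real weather simulator, and all names are assumptions for this example.

```python
import random

# Illustrative sketch: labels come from clear-weather auto-annotation,
# while the raw point cloud is degraded by a toy "fog" model (random
# point drop-out) before being paired with those labels for training.

def simulate_fog(points, drop_prob, rng):
    """Randomly remove points to mimic attenuation; labels are untouched."""
    return [p for p in points if rng.random() >= drop_prob]

def augmented_sample(points, labels, drop_prob, rng):
    """One training pair: weather-degraded cloud + clear-weather labels."""
    return simulate_fog(points, drop_prob, rng), labels

rng = random.Random(0)                       # seeded for reproducibility
clear_points = list(range(1000))             # stand-in for lidar returns
foggy_points, labels = augmented_sample(clear_points, ["car"], 0.3, rng)
```

Because the degradation is applied after labelling, the detector sees adverse-weather inputs with clean supervision, which is what makes it robust when real conditions deteriorate.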

Responsible person Ing. Mgr. Radovan Suk