SteadiRTIS: Stabilized Real-Time Acoustic Imaging 

We introduce SteadiRTIS, a novel software-based method for stabilizing acoustic images in in-air 3D sonars. Traditional static beamforming struggles with uneven terrain, leading to misalignment, inaccurate measurements, and imaging artifacts, while mechanical stabilization is costly and less reliable. Our approach fuses adaptive conventional beamforming with real-time IMU data to dynamically adjust the sonar array's steering matrix based on the tilt angles induced by uneven ground. Additionally, we apply gain compensation to address the emission energy loss caused by the transducer's directivity pattern. Indoor and outdoor tests validate significant improvements in temporal image consistency. The GPU-accelerated system operates in real time, averaging 210 ms per execution, meeting the demands of autonomous navigation.
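As a rough illustration of the idea (not the SteadiRTIS implementation itself), the sketch below counter-rotates the steering directions of a plain delay-and-sum beamformer using roll and pitch from an IMU, so the per-element delays keep the beams world-aligned despite array tilt. The array geometry, angles, and function names are all hypothetical.

```python
import numpy as np

def rotation_from_tilt(roll, pitch):
    """Rotation matrix from IMU roll/pitch (radians); yaw is ignored."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    return Ry @ Rx

def steering_delays(elem_pos, directions, c=343.0):
    """Per-element plane-wave time delays for delay-and-sum beamforming."""
    return elem_pos @ directions.T / c     # shape (n_elems, n_dirs)

# Hypothetical 4-element linear array along x, 1 cm spacing
elems = np.array([[i * 0.01, 0.0, 0.0] for i in range(4)])
# Desired world-frame steering directions (unit vectors)
dirs_world = np.array([[0.0, 0.0, 1.0],
                       [np.sin(0.2), 0.0, np.cos(0.2)]])
# Counter-rotate: express world directions in the tilted array frame
R = rotation_from_tilt(roll=0.05, pitch=-0.1)
dirs_array = dirs_world @ R        # each row is R.T applied to a direction
delays = steering_delays(elems, dirs_array)
```

With zero tilt the rotation collapses to the identity and the broadside beam gets zero delay on every element, as expected for an array perpendicular to the look direction.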

Jansen et al., SteadiRTIS, IEEE Sensors Conference.

SonoTraceLab - A Raytracing-Based Acoustic Modelling System

Echolocation is the primary sensing modality for many bat species, enabling them to perform complex tasks in unstructured environments. Understanding this remarkable sensorimotor ability is key to developing robust, high-performance sonar systems. To uncover the perception mechanisms behind echolocation, it is crucial to analyze the reflected signals bats perceive. While ensonification experiments provide valuable insights, they are time-intensive. This paper introduces SonoTraceLab, an open-source software package for simulating technical and biological sonar systems in complex scenes. By leveraging simulations, researchers can gain deeper insights into biological echolocation while minimizing time and material constraints.
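As a minimal, hypothetical illustration of what simulating an echo entails (far simpler than SonoTraceLab's full raytracing), the image-source sketch below computes the monostatic time of flight of a specular reflection off a single infinite plane:

```python
import numpy as np

def echo_delay_plane(src, plane_pt, plane_n, c=343.0):
    """Monostatic time of flight for a specular echo off an infinite plane,
    via the image-source method (a toy stand-in for full raytracing)."""
    src = np.asarray(src, float)
    plane_pt = np.asarray(plane_pt, float)
    n = np.asarray(plane_n, float)
    n = n / np.linalg.norm(n)
    dist = abs(np.dot(src - plane_pt, n))   # perpendicular distance to plane
    return 2.0 * dist / c                   # out-and-back path length over c

# Hypothetical sensor 1 m in front of a wall (the z = 0 plane)
delay = echo_delay_plane([0.0, 0.0, 1.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0])
```

A full simulator traces many such paths per reflector, including diffuse and higher-order reflections, and convolves the resulting delay profile with the emitted call.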

Jansen, W., & Steckel, J. (2024). SonoTraceLab-A Raytracing-Based Acoustic Modelling System for Simulating Echolocation Behavior of Bats. IEEE Access.

Cross Modality Transformation using Deep Neural Networks

3D sonar data excels in harsh and challenging environments, offering robust sensing capabilities even in conditions with low visibility, such as underwater, fog, or dust-filled areas. However, it falls short of the high-resolution detail and precision that robotic engineers typically rely on when using LIDAR data. This limitation has historically hindered its broader application in scenarios requiring fine-grained spatial understanding. In this video, we introduce and demonstrate the transformative potential of cutting-edge deep neural network architectures designed to bridge this gap. Our approach leverages advanced learning techniques to process and interpret raw 3D sonar data, converting it into a LIDAR-like representation.
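To make the cross-modality mapping concrete, here is a deliberately tiny, untrained stand-in for such a network: a two-layer MLP that maps a flattened sonar energy image to a vector of per-azimuth LIDAR-like ranges. The published approach uses a learned deep encoder-decoder; every shape and name below is hypothetical and only illustrates the input/output structure.

```python
import numpy as np

rng = np.random.default_rng(42)

def relu(x):
    return np.maximum(x, 0.0)

class TinySonar2Lidar:
    """Toy MLP: flattened sonar energy image -> per-azimuth ranges.
    A stand-in for the learned encoder-decoder in the actual work."""
    def __init__(self, img_shape=(32, 16), n_beams=64, hidden=128):
        n_in = img_shape[0] * img_shape[1]
        self.W1 = rng.normal(0, np.sqrt(2 / n_in), (n_in, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0, np.sqrt(2 / hidden), (hidden, n_beams))
        self.b2 = np.zeros(n_beams)

    def __call__(self, sonar_img):
        h = relu(sonar_img.ravel() @ self.W1 + self.b1)
        return relu(h @ self.W2 + self.b2)   # non-negative pseudo-ranges

model = TinySonar2Lidar()
scan = model(rng.uniform(0.0, 1.0, (32, 16)))   # one 64-beam "scan"
```

Trained on paired sonar/LIDAR recordings, a (much deeper, convolutional) version of this mapping learns to produce LIDAR-like scans from sonar alone.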

Balemans, N., Hellinckx, P., & Steckel, J. (2021). Predicting lidar data from sonar images. IEEE Access, 9, 57897-57906.

Point-cloud generation using the eRTIS 3D Sonar Sensor

In this video, we showcase the generation of point clouds using our innovative eRTIS sonar sensor. The point clouds produced by our system are visually overlaid on top of those generated by a state-of-the-art LIDAR system to provide a clear and direct comparison. This overlay highlights the unique characteristics and capabilities of our sonar-based approach in capturing 3D spatial information, particularly in environments where LIDAR performance might degrade due to challenging conditions like fog, water, or dust.
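Conceptually, such a point cloud can be obtained by thresholding the beamformed energy volume and converting the surviving (range, azimuth, elevation) cells to Cartesian coordinates. The sketch below is a hypothetical toy version of that step, not the eRTIS pipeline; the grid, threshold, and reflector are invented for illustration.

```python
import numpy as np

def energy_to_pointcloud(energy, ranges, az, el, threshold):
    """Convert a beamformed energy volume (range x az x el) to XYZ points."""
    r_i, a_i, e_i = np.nonzero(energy > threshold)
    r, a, e = ranges[r_i], az[a_i], el[e_i]
    x = r * np.cos(e) * np.sin(a)      # right
    y = r * np.sin(e)                  # up
    z = r * np.cos(e) * np.cos(a)      # forward
    return np.stack([x, y, z], axis=1)

# Hypothetical toy volume with one strong reflector at 2 m, broadside
ranges = np.linspace(0.5, 5.0, 10)
az = np.linspace(-np.pi / 4, np.pi / 4, 9)
el = np.linspace(-np.pi / 8, np.pi / 8, 5)
energy = np.random.default_rng(0).uniform(0.0, 0.1, (10, 9, 5))
energy[3, 4, 2] = 1.0                  # range 2 m, azimuth 0, elevation 0
cloud = energy_to_pointcloud(energy, ranges, az, el, threshold=0.5)
```

The single surviving cell maps to the point (0, 0, 2): two metres straight ahead of the sensor.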

This demonstration exemplifies the potential of eRTIS technology to bridge the gap between sonar and LIDAR systems, offering a new perspective for robotic and sensing applications. Its impact was recognized by the professional community, earning the prestigious Best Live Demonstration award from the IEEE Sensors Council at the IEEE Sensors 2019 conference. This accolade underscores the significance of our work in advancing sensing technologies for real-world challenges.

Kerstens, R., Schouten, G., Jansen, W., Laurijssen, D., & Steckel, J. (2019, October). Live demonstration of eRTIS, an embedded real-time imaging sonar sensor. In 2019 IEEE SENSORS (pp. 1-1). IEEE.

AirLeakSLAM: Detecting Air Leaks using Ultrasound Arrays

Using ultrasonic sensing, we detect pressurized air leaks in industrial environments by capturing the unique high-frequency acoustic signatures they generate. Our system integrates advanced robotics algorithms to not only detect but also precisely localize these leaks with an accuracy of up to 10 cm. Fully automated, it efficiently scans large, noisy industrial plants, eliminating the need for manual inspections. This solution significantly reduces downtime, energy losses, and maintenance costs, providing a practical and scalable approach to improving industrial efficiency and sustainability.
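One ingredient of such localization can be sketched as follows, assuming bearings to the leak have already been estimated from two robot poses (the actual AirLeakSLAM pipeline is more involved): intersecting the two bearing rays yields the leak position. All coordinates below are hypothetical.

```python
import numpy as np

def triangulate(p1, theta1, p2, theta2):
    """Intersect two 2D bearing rays to localize a broadband source."""
    d1 = np.array([np.cos(theta1), np.sin(theta1)])
    d2 = np.array([np.cos(theta2), np.sin(theta2)])
    # Solve p1 + t1*d1 = p2 + t2*d2 for the ray parameters t1, t2
    A = np.column_stack([d1, -d2])
    t = np.linalg.solve(A, np.asarray(p2, float) - np.asarray(p1, float))
    return np.asarray(p1, float) + t[0] * d1

# Hypothetical leak at (3, 2) m; bearings measured from two robot poses
leak = np.array([3.0, 2.0])
p1, p2 = np.array([0.0, 0.0]), np.array([5.0, 0.0])
th1 = np.arctan2(leak[1] - p1[1], leak[0] - p1[0])
th2 = np.arctan2(leak[1] - p2[1], leak[0] - p2[0])
est = triangulate(p1, th1, p2, th2)
```

In practice the bearings are noisy, so many such measurements from along the robot's trajectory are fused to reach the reported ~10 cm accuracy.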

Schenck, A., Daems, W., & Steckel, J. (2019, October). AirleakSlam: Detection of pressurized air leaks using passive ultrasonic sensors. In 2019 IEEE SENSORS (pp. 1-4). IEEE.

Robotic Data Collection with the µRTIS Sensor

The imaging 3D sonar sensors developed at CoSys-Lab are specifically designed to enhance autonomous robotic navigation. By employing diverse design methodologies for each RTIS variation, we have created a versatile catalog of sensors capable of addressing a wide range of applications. These sensors excel in environments ranging from straightforward tasks, such as basic corridor tracking, to challenging and complex conditions, including underwater, foggy, or industrial settings. This adaptability ensures reliable and efficient performance, enabling robots to navigate safely and effectively in diverse scenarios. Through this tailored approach, CoSys-Lab continues to advance autonomous robotic capabilities with cutting-edge 3D sonar technology.

Verellen, T., Kerstens, R., Laurijssen, D., & Steckel, J. (2020, May). Urtis: A small 3D imaging sonar sensor for robotic applications. In ICASSP 2020-2020 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (pp. 4801-4805). IEEE.