“Signal Processing in the AI era” was the tagline of this year’s IEEE International Conference on Acoustics, Speech and Signal Processing, taking place in Rhodes, Greece.
In this context, Brent de Weerdt, Xiangyu Yang, Boris Joukovsky, Alex Stergiou and Nikos Deligiannis presented ETRO’s research during poster sessions and oral presentations, with novel ways to process and understand graph, video, and audio data. Nikos Deligiannis chaired a session on Graph Deep Learning, attended the IEEE T-IP Editorial Board Meeting, and had the opportunity to meet with collaborators from the VUB-Duke-Ugent-UCL joint lab.
Featured articles:

The ETRO team, comprising Salar Tayebi, Ashkan Zarghami, Manu Malbrain, and Johan Stiens, has developed an innovative programmable abdominal phantom. The device is designed to simulate various scenarios and pathologies encountered in ICU patients. Recently showcased at the 13th International Fluid Academy Days in Antwerp, it marks a significant step forward in medical training and research. The phantom is primarily aimed at enhancing the training experience of nurses and engineering students, providing them with a more hands-on and realistic learning environment. It also holds potential for validating medical equipment and for facilitating research and development in medical technologies. Its introduction to the medical doctors’ ecosystem underscores its relevance and utility in contemporary medical education and practice.

A research team from ETRO, under the supervision of Prof. Johan Stiens and Dr. Bruno Da Silva, was selected to participate in the I-LOVE-SCIENCE FESTIVAL (15-16-17/10/2021) in Brussels with demos of wearable devices.
During the festival, visitors will be introduced to various existing portable and wearable medical measuring instruments and their operating principles by Angel, Joan, Salar, Vlad, Bruno and Johan. Visitors will be able to experiment with various technologies to detect different physiological signals of their body under different conditions of activity. The measurement systems are specially designed for educational purposes, so that users can also change settings themselves and observe their influence (a little engineering experience is all that is needed).
In addition to the technical and medical aspects, the societal relevance will also be explained: how this measurement technology can contribute to preventive medicine, which is extremely important for reducing health care costs.
The Charcot Fund Jury met on December 9, 2022.
The project “Disentangling cognitive functioning and visual scanning deficits in cognitive test scores” (Promoter: Prof. J. Van Schependom) has been selected by the Jury for the Charcot Fund 2023.
The Charcot Fund Ceremony will take place on 31 January 2023 at the University Foundation.
On April 25th 2025 at 16:00, Ayman Morsy will defend their PhD entitled “A Novel Approach to Depth-Sense Imaging Using Correlation-Assisted Direct Time-of-Flight”.
Everybody is invited to attend the presentation in room I.0.03 or online via this link.
Time-of-flight (ToF) imaging has emerged as a vital technology in machine vision and sensing, expanding into applications such as augmented and virtual reality, gaming, robotics, autonomous driving, autofocus, and facial recognition on smartphones and laptops. ToF technology determines the distance to an object within the detection range by emitting a light pulse and measuring the time it takes to return. This round-trip time yields the object’s distance, with different sensing technologies employing distinct methods to measure it.
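The basic ranging relation behind all ToF sensors can be sketched as follows: the light travels to the target and back, so the distance is half the round-trip time multiplied by the speed of light. This is a generic illustration of the principle, not ETRO's pixel implementation; the 20 ns example value is chosen purely for demonstration.

```python
C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_seconds: float) -> float:
    """Distance to the target from the measured round-trip time.

    The factor of 2 accounts for the pulse travelling to the
    object and back to the sensor.
    """
    return C * t_seconds / 2.0

# A pulse returning after 20 ns corresponds to a target roughly 3 m away.
d = distance_from_round_trip(20e-9)
```

This also shows why ToF pixels need picosecond-to-nanosecond timing precision: a 1 ns timing error already shifts the estimated distance by about 15 cm.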
For ToF applications, developing sensors with high image resolution, low power consumption, and the ability to function reliably in high ambient light conditions is desirable. This dissertation presents the development of a novel single-photon avalanche diode (SPAD)-based pixel called Correlation-Assisted Direct Time-of-Flight (CA-dToF), designed for in-pixel ambient light suppression and characterized by low power consumption and a scalable pixel structure. The CA-dToF pixel uses a laser pulse correlated with two orthogonal sinusoidal signals as input to two switched capacitor channels, which average out detected ambient light while accumulating the laser pulse round-trip time.
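The correlation idea described above can be illustrated with a minimal sketch: each detected photon samples two orthogonal (sine/cosine) reference signals at its arrival time, and the arrival time within one modulation period is recovered from the two accumulated samples via the arctangent. This is only a conceptual model of correlation-based timing, not the actual CA-dToF circuit; the 10 MHz reference frequency and the helper names are assumptions for illustration. Ambient photons, arriving at random times, sample the sinusoids uniformly and thus average toward zero, while correlated laser pulses accumulate a consistent value.

```python
import math

F_MOD = 10e6  # assumed reference frequency (illustrative, not the real design value)

def sample_references(t: float) -> tuple[float, float]:
    """Sample the two orthogonal reference sinusoids at photon-arrival time t."""
    phase = 2 * math.pi * F_MOD * t
    return math.sin(phase), math.cos(phase)

def time_from_samples(s: float, c: float) -> float:
    """Recover the arrival time (modulo one reference period) from the
    accumulated sine/cosine samples."""
    phase = math.atan2(s, c) % (2 * math.pi)
    return phase / (2 * math.pi * F_MOD)

# A laser echo arriving at t = 25 ns maps to a unique (sin, cos) pair,
# from which the arrival time within one 100 ns period is recovered.
s, c = sample_references(25e-9)
t_rec = time_from_samples(s, c)
```

The appeal of this scheme is that the heavy lifting is an analog averaging operation: uncorrelated ambient light cancels itself out in the switched-capacitor channels, without any digital post-filtering.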
To gain insight into the operation of the CA-dToF pixel, both a Python simulation and an analytical model were developed. Two generations of the CA-dToF pixel were designed and characterized, with the second generation achieving, for the first time, operational performance under high ambient light conditions. The second-generation CA-dToF pixel was tested under various lighting conditions and pixel design variations. Additionally, noise sources within the pixel implementation were analyzed, and potential solutions were proposed.
A new technique helps surgeons better visualize cancer cells during operations, improving their precision in removing tumors. Existing imaging methods like MRI or CT scans often lack the detail needed to clearly distinguish cancerous tissue from healthy tissue. While fluorescence-guided imaging uses special contrast agents that emit light to highlight tumors, it still struggles to show clear borders. To solve this, researchers developed fluorescence lifetime imaging, which measures how long the contrast agent glows, giving a more accurate picture of the tumor’s edges. ETRO has created a special camera for this purpose, which is now being tested on dogs before it is used in human surgeries, with the goal of making cancer operations safer and more effective.
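The lifetime measurement mentioned above rests on a simple model: after excitation, fluorescence intensity decays exponentially, and the decay constant (the lifetime) is a property of the contrast agent and its environment rather than of its concentration. A minimal sketch, assuming a single-exponential decay and the classic two-gate "rapid lifetime determination" estimate; the 5 ns lifetime and gate times are illustrative values, not measurements from the ETRO camera.

```python
import math

def intensity(t: float, tau: float, i0: float = 1.0) -> float:
    """Single-exponential fluorescence decay with lifetime tau."""
    return i0 * math.exp(-t / tau)

def lifetime_from_two_gates(i1: float, i2: float, dt: float) -> float:
    """Estimate the lifetime from two intensity samples taken dt apart.

    Because I(t) = I0 * exp(-t / tau), the ratio of two samples gives
    tau = dt / ln(I1 / I2), independent of the initial intensity I0.
    """
    return dt / math.log(i1 / i2)

# Simulate two gated measurements of a 5 ns lifetime and recover it.
tau_true = 5e-9
i1 = intensity(2e-9, tau_true)
i2 = intensity(6e-9, tau_true)
tau_est = lifetime_from_two_gates(i1, i2, 4e-9)
```

Because the estimate depends only on the intensity ratio, lifetime imaging is insensitive to how much contrast agent accumulated in the tissue, which is what makes it a sharper boundary marker than raw fluorescence brightness.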