Research Presentation: Multi-Camera Synchronization for the NanEye CMOS Image Sensor
May 25, 2016, 17:00 – 18:00
Location: Room A416
Professor Morgado Dias from the University of Madeira will give a research presentation outlining experiments and results in the area of video processing on FPGAs for medical applications.
Short Resume
Fernando Morgado-Dias received his Master's degree in Microelectronics from the University Joseph Fourier in Grenoble, France, in 1995 and his PhD from the University of Aveiro, Portugal, in 2005. He is currently an Assistant Professor at the University of Madeira and a researcher at the Madeira Interactive Technologies Institute, a research unit graded as excellent and created as a private entity by the University of Madeira, Carnegie Mellon University (United States), and the Regional Government of Madeira.
He has published around 100 scientific documents, including journal and conference papers, reports, and theses. He is the current Vice-President and former President of the Portuguese Association of Automatic Control, as well as a former Pro-Rector of the University of Madeira and former visiting faculty at Carnegie Mellon University (United States). His research interests include renewable energy, artificial neural networks, and FPGA implementations.
Multi-Camera Synchronization for the NanEye CMOS Image Sensor
Awaiba is a CMOS image sensor design company that produces the smallest sensor available on the market for medical applications. This sensor has been used for endoscopy, and 3D medical imaging is being prepared to improve the visualization of structures inside the human body. Constructing 3D images, however, raises synchronization problems, and this work is focused on solving them for this particular sensor. The proposed solution is based on Awaiba's NanEye CMOS image sensor family and an FPGA platform with a USB3 interface, and it synchronizes up to 8 individual self-timed cameras.
Minimal form factor self-timed camera modules of 1 mm x 1 mm or smaller do not generally allow external synchronization. However, stereovision or 3D reconstruction with multiple cameras, as well as applications requiring pulsed illumination, require the cameras to be synchronized. In this work, the challenge of synchronizing multiple self-timed cameras with only a 4-wire interface has been solved by adaptively regulating the power supply of each camera to synchronize their frame rate and frame phase.
To that effect, a control core was created to constantly monitor the operating frequency of each camera by measuring the line period in each frame based on a well-defined sampling signal. The frequency is adjusted by varying the voltage level applied to the sensor based on the error between the measured line period and the desired line period. To ensure phase synchronization between frames of multiple cameras, a Master-Slave interface was implemented. A single camera is defined as the Master, with its operating frequency controlled directly through a PC-based interface. The remaining cameras are set up in Slave mode and interfaced directly with the Master camera's control module, allowing them to monitor the Master's line and frame periods and adjust their own to achieve phase and frequency synchronization. The result of this work will allow the implementation of 3 mm diameter 3D stereovision equipment in medical endoscopic contexts, such as endoscopic surgical robotics or minimally invasive surgery.
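To make the control loop more concrete, the sketch below models the described scheme in Python: a proportional adjustment of each sensor's supply voltage based on the line-period error, with Slave cameras tracking the Master's measured timing. The class names, gain, and voltage limits are illustrative assumptions, not the actual FPGA control core.

```python
# Illustrative sketch (not Awaiba's implementation): a proportional controller
# that nudges each camera's supply voltage so its measured line period
# converges to a target. Gains and voltage limits are hypothetical.

class CameraSyncController:
    def __init__(self, target_line_period_us, kp=0.002,
                 v_min=1.6, v_max=2.2, v_init=1.9):
        self.target = target_line_period_us    # desired line period (us)
        self.kp = kp                           # gain (V per us of error), assumed
        self.v_min, self.v_max = v_min, v_max  # supply-voltage safety limits (V)
        self.voltage = v_init                  # current supply voltage (V)

    def update(self, measured_line_period_us):
        """One control step: a longer-than-target line period means the camera
        runs too slowly, so the supply voltage is raised, and vice versa."""
        error = measured_line_period_us - self.target
        self.voltage += self.kp * error
        self.voltage = min(self.v_max, max(self.v_min, self.voltage))
        return self.voltage


# Master-Slave arrangement: the Master tracks a fixed target set from the PC;
# each Slave uses the Master's most recent measurement as its own target,
# so frequency (and, over time, frame phase) converge to the Master's.
master = CameraSyncController(target_line_period_us=28.0)
slaves = [CameraSyncController(target_line_period_us=28.0) for _ in range(7)]

def control_step(master_measurement, slave_measurements):
    voltages = [master.update(master_measurement)]
    for slave, measurement in zip(slaves, slave_measurements):
        slave.target = master_measurement   # follow the Master's timing
        voltages.append(slave.update(measurement))
    return voltages                         # voltages applied to the sensors
```

In the real system this loop runs in FPGA logic every frame; the sketch only illustrates the direction and structure of the feedback described above.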
Update: following the presentation, the slides are available here
Published on May 20, 2016, 20:08.