Joint Filtering of Intensity Images and Neuromorphic Events for High-Resolution Noise-Robust Imaging. Wang, Z. W., Duan, P., Cossairt, O., Katsaggelos, A., Huang, T., & Shi, B. In 2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), pages 1606–1616, jun, 2020. IEEE.
@inproceedings{Zihaoa,
abstract = {We present a novel computational imaging system with high resolution and low noise. Our system consists of a traditional video camera which captures high-resolution intensity images, and an event camera which encodes high-speed motion as a stream of asynchronous binary events. To process the hybrid input, we propose a unifying framework that first bridges the two sensing modalities via a noise-robust motion compensation model, and then performs joint image filtering. The filtered output represents the temporal gradient of the captured space-time volume, which can be viewed as motion-compensated event frames with high resolution and low noise. Therefore, the output can be widely applied to many existing event-based algorithms that are highly dependent on spatial resolution and noise robustness. In experimental results performed on both publicly available datasets as well as our new RGB-DAVIS dataset, we show systematic performance improvement in applications such as high frame-rate video synthesis, feature/corner detection and tracking, as well as high dynamic range image reconstruction.},
author = {Wang, Zihao W. and Duan, Peiqi and Cossairt, Oliver and Katsaggelos, Aggelos and Huang, Tiejun and Shi, Boxin},
booktitle = {2020 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
doi = {10.1109/CVPR42600.2020.00168},
isbn = {978-1-7281-7168-5},
issn = {1063-6919},
month = jun,
pages = {1606--1616},
publisher = {IEEE},
title = {{Joint Filtering of Intensity Images and Neuromorphic Events for High-Resolution Noise-Robust Imaging}},
url = {https://ieeexplore.ieee.org/document/9156457/},
year = {2020}
}