Building end-to-end deblur image quality evaluation simulation for Hybrid-EVS-CIS sensor images
Author(s)
Daisuke Saito | OmniVision Technologies
Kamal Rana | OmniVision Technologies
Zhiyao Yang | OmniVision Technologies
Wei Zhang | OmniVision Technologies
Bo Mu | OmniVision Technologies
Eiichi Funatsu | OmniVision Technologies
Abstract
An Event-based Vision Sensor (EVS) generates pixel-level, low-latency event data that is useful for reconstructing the temporal components of images in image deblurring. When developing such new applications, we need to know how EVS hardware parameters affect image quality (IQ) in order to determine hardware specifications before design begins. To this end, it is beneficial to build an End-to-End IQ simulation that runs from hardware simulation through to IQ evaluation on natural scenes. Previously, we developed a Hybrid-EVS-CIS simulator to generate synthetic images that are fed into an image deblur block, and we evaluated blurriness with the Blurred Edge Width (BEW) metric.
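As a rough illustration of the kind of edge-based blurriness measurement BEW refers to, the sketch below estimates an edge width as the 10%-90% rise distance of an intensity profile taken across an edge. The thresholds, normalization, and the synthetic test profile are assumptions for illustration; the paper's exact BEW definition may differ.

```python
import numpy as np

def blurred_edge_width(edge_profile, lo=0.1, hi=0.9):
    """Estimate a Blurred-Edge-Width-style metric as the 10%-90% rise
    distance, in pixels, of a 1-D intensity profile sampled across an edge.
    Illustrative sketch only; not the authors' exact BEW definition."""
    p = np.asarray(edge_profile, dtype=float)
    p = (p - p.min()) / (p.max() - p.min() + 1e-12)  # normalize to [0, 1]
    if p[0] > p[-1]:
        p = p[::-1]                                   # make the edge rising
    lo_idx = int(np.argmax(p >= lo))                  # first crossing of the 10% level
    hi_idx = int(np.argmax(p >= hi))                  # first crossing of the 90% level
    return hi_idx - lo_idx

# Example: a synthetic blurred step edge (sigmoid-like ramp)
profile = 1.0 / (1.0 + np.exp(-np.linspace(-6, 6, 61) / 1.5))
print(f"BEW ~ {blurred_edge_width(profile)} px")
```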
In this paper, we extend this End-to-End IQ simulation to natural scenes. We propose an IQ evaluation scheme that uses Video Multi-method Assessment Fusion (VMAF) together with BEW to evaluate both blurriness and noise as a building block of the End-to-End IQ simulation. We demonstrated that VMAF is applicable not only to videos but also to deblurred images. We also introduced a method to assess pixel speed, which affects blurriness, by exploiting the correlation between VMAF and BEW. This was the final piece needed to realize an End-to-End IQ simulation for EVS hardware design, in which both the images and their IQ metric values can be checked.
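For readers unfamiliar with VMAF, a deblurred clip is typically scored against a blur-free reference. A minimal sketch, assuming ffmpeg built with libvmaf and placeholder file names; the JSON log layout follows recent libvmaf releases and is not taken from the paper.

```python
import json
import subprocess

def vmaf_score(distorted, reference, log_path="vmaf.json"):
    """Score a deblurred clip against its ground-truth reference using
    ffmpeg's libvmaf filter. Requires ffmpeg compiled with libvmaf;
    file names here are placeholders, not the authors' data."""
    cmd = [
        "ffmpeg", "-i", distorted, "-i", reference,
        "-lavfi", f"libvmaf=log_fmt=json:log_path={log_path}",
        "-f", "null", "-",
    ]
    subprocess.run(cmd, check=True, capture_output=True)
    with open(log_path) as f:
        result = json.load(f)
    return result["pooled_metrics"]["vmaf"]["mean"]

# Example usage (hypothetical files):
# print(vmaf_score("deblurred.y4m", "reference.y4m"))
```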
Description
Date and Location: 2/3/2025 | 04:10 PM - 04:30 PM | Grand Peninsula A
Primary Session Chair:
Patrick Denny | University of Limerick
Session Co-Chair:
Peter Burns | Burns Digital Imaging LLC
Paper Number: IQSP-240