OneRestore : A universal restoration framework for composite degradation

In real-world scenarios, image impairments often manifest as composite degradations, presenting a complex interplay of elements such as low light, haze, rain, and snow. Despite this reality, existing restoration methods typically target isolated degradation types, thereby falling short in environments where multiple degrading factors coexist. To bridge this gap, our study proposes a versatile imaging model that consolidates four physical corruption paradigms to accurately represent complex, composite degradation scenarios. In this context, we propose OneRestore, a novel transformer-based framework designed for adaptive, controllable scene restoration. The proposed framework leverages a unique cross-attention mechanism, merging degraded scene descriptors with image features, allowing for nuanced restoration. Our model allows versatile input scene descriptors, ranging from manual text embeddings to automatic extractions based on visual attributes. Our methodology is further enhanced through a composite degradation restoration loss, using extra degraded images as negative samples to fortify model constraints. Comparative results on synthetic and real-world datasets demonstrate OneRestore as a superior solution, significantly advancing the state-of-the-art in addressing complex, composite degradations.
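To make the cross-attention fusion mentioned in the abstract concrete, the sketch below shows one way degraded-scene descriptor tokens could condition image features. It is a minimal PyTorch-style illustration written for this record, not the authors' released OneRestore implementation; the module name SceneDescriptorCrossAttention, the feature dimensions, and the residual fusion step are assumptions made for clarity.

# Hypothetical sketch of descriptor-conditioned cross-attention; NOT the
# authors' OneRestore code. All names and dimensions are illustrative.
import torch
import torch.nn as nn

class SceneDescriptorCrossAttention(nn.Module):
    """Image features attend to a scene-descriptor embedding (e.g. a text
    embedding of "low light + haze + rain") so restoration can adapt to the
    degradation mix present in the input."""

    def __init__(self, img_dim: int = 64, desc_dim: int = 512, num_heads: int = 4):
        super().__init__()
        self.norm = nn.LayerNorm(img_dim)
        self.attn = nn.MultiheadAttention(
            embed_dim=img_dim, kdim=desc_dim, vdim=desc_dim,
            num_heads=num_heads, batch_first=True,
        )

    def forward(self, img_feat: torch.Tensor, descriptor: torch.Tensor) -> torch.Tensor:
        # img_feat:   (B, C, H, W) feature map from a restoration backbone
        # descriptor: (B, L, desc_dim) scene-descriptor tokens, either manual
        #             text embeddings or automatically extracted visual attributes
        b, c, h, w = img_feat.shape
        tokens = img_feat.flatten(2).transpose(1, 2)          # (B, H*W, C)
        attended, _ = self.attn(self.norm(tokens), descriptor, descriptor)
        tokens = tokens + attended                            # residual fusion
        return tokens.transpose(1, 2).reshape(b, c, h, w)

if __name__ == "__main__":
    block = SceneDescriptorCrossAttention()
    img = torch.randn(2, 64, 32, 32)   # toy image features
    desc = torch.randn(2, 3, 512)      # toy descriptor tokens (e.g. haze, rain, low light)
    print(block(img, desc).shape)      # torch.Size([2, 64, 32, 32])

Using the image features as queries and the descriptor tokens as keys and values lets each spatial location weight the degradation cues most relevant to it, which is one plausible reading of the adaptive, controllable restoration the abstract describes.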


Bibliographic Details
Main Authors: GUO, Yu, GAO, Yuan, LU, Yuxu, ZHU, Huilin, LIU, Ryan Wen, HE, Shengfeng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2024
Subjects: Image restoration; Imaging model; Transformer-based framework; scene descriptors; Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access: https://ink.library.smu.edu.sg/sis_research/9772
https://ink.library.smu.edu.sg/context/sis_research/article/10772/viewcontent/2407.04621v4.pdf
Institution: Singapore Management University
Language: English
id sg-smu-ink.sis_research-10772
record_format dspace
datestamp 2024-12-16T02:29:36Z
date_available 2024-10-01T07:00:00Z
license http://creativecommons.org/licenses/by-nc-nd/4.0/
series Research Collection School Of Computing and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Image restoration
Imaging model
Transformer-based framework
scene descriptors
Databases and Information Systems
Graphics and Human Computer Interfaces
format text (application/pdf)
author GUO, Yu
GAO, Yuan
LU, Yuxu
ZHU, Huilin
LIU, Ryan Wen
HE, Shengfeng
author_sort GUO, Yu
title OneRestore : A universal restoration framework for composite degradation
publisher Institutional Knowledge at Singapore Management University
publishDate 2024
url https://ink.library.smu.edu.sg/sis_research/9772
https://ink.library.smu.edu.sg/context/sis_research/article/10772/viewcontent/2407.04621v4.pdf
_version_ 1819113134382645248