Spin-UP: Spin Light for Natural Light Uncalibrated Photometric Stereo
Main Authors:
Other Authors:
Format: Conference or Workshop Item
Language: English
Published: 2024
Subjects:
Online Access: https://hdl.handle.net/10356/178564
Institution: Nanyang Technological University
Summary: Natural Light Uncalibrated Photometric Stereo (NaUPS) relieves the strict environment and light assumptions in classical Uncalibrated Photometric Stereo (UPS) methods. However, due to the intrinsic ill-posedness and high-dimensional ambiguities, addressing NaUPS is still an open question. Existing works impose strong assumptions on the environment lights and objects' materials, restricting their effectiveness in more general scenarios. Alternatively, some methods leverage supervised learning with intricate models while lacking interpretability, resulting in biased estimations. In this work, we propose Spin Light Uncalibrated Photometric Stereo (Spin-UP), an unsupervised method to tackle NaUPS under various environment lights and objects. The proposed method uses a novel setup that captures the object's images on a rotatable platform, which mitigates NaUPS's ill-posedness by reducing unknowns and provides reliable priors to alleviate NaUPS's ambiguities. Leveraging neural inverse rendering and the proposed training strategies, Spin-UP recovers surface normals, environment light, and isotropic reflectance under complex natural light with low computational cost. Experiments show that Spin-UP outperforms other supervised and unsupervised NaUPS methods and achieves state-of-the-art performance on synthetic and real-world datasets. Code and data are available at https://github.com/LMozart/CVPR2024-SpinUP.
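The abstract's core idea is that spinning the object on a platform is equivalent to rotating the environment light around a fixed object, which reduces the unknowns in the inverse-rendering problem. As a rough, hypothetical illustration of that idea only (not the paper's implementation, which handles general isotropic reflectance and complex natural light), the PyTorch sketch below jointly optimizes per-pixel normals, a Lambertian albedo, and a small set of directional lights from frames captured at known platform angles; the class and function names and the simplified Lambertian/directional-light model are assumptions made here for illustration.

```python
# Hypothetical sketch of inverse rendering under a rotating-platform setup.
# Not the authors' implementation: reflectance and lighting models are simplified.
import math
import torch
import torch.nn as nn

def rot_z(theta):
    """Rotation matrix about the platform's spin axis (assumed here to be z)."""
    c, s = torch.cos(theta), torch.sin(theta)
    zero, one = torch.zeros_like(c), torch.ones_like(c)
    return torch.stack([
        torch.stack([c, -s, zero]),
        torch.stack([s,  c, zero]),
        torch.stack([zero, zero, one]),
    ])

class SpinInverseRenderer(nn.Module):
    """Jointly optimizes per-pixel normals, albedo, and a set of directional
    lights approximating the environment illumination (Lambertian assumption)."""
    def __init__(self, num_pixels, num_lights=16):
        super().__init__()
        self.normals = nn.Parameter(
            torch.randn(num_pixels, 3) * 0.01 + torch.tensor([0.0, 0.0, 1.0]))
        self.albedo = nn.Parameter(torch.full((num_pixels, 1), 0.5))
        self.light_dirs = nn.Parameter(torch.randn(num_lights, 3))
        self.light_ints = nn.Parameter(torch.full((num_lights, 1), 0.1))

    def forward(self, angle):
        # Spinning the object is equivalent to counter-rotating the environment light.
        R = rot_z(-angle)                                                # (3, 3)
        dirs = nn.functional.normalize(self.light_dirs @ R.T, dim=-1)   # (L, 3)
        n = nn.functional.normalize(self.normals, dim=-1)                # (P, 3)
        shading = torch.clamp(n @ dirs.T, min=0.0)                       # (P, L)
        return self.albedo * (shading @ self.light_ints.clamp(min=0.0))  # (P, 1)

# Toy usage: observations of P pixels at F platform angles (placeholder data).
F_frames, P = 36, 1024
angles = torch.linspace(0.0, 2.0 * math.pi, F_frames)
images = torch.rand(F_frames, P, 1)
model = SpinInverseRenderer(P)
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for _ in range(200):
    opt.zero_grad()
    loss = sum(((model(a) - img) ** 2).mean() for a, img in zip(angles, images))
    loss.backward()
    opt.step()
```

The constraint the rotating setup supplies is that every frame shares one set of normals and reflectance while the light rotates by a known angle, which is what the loop above exploits; Spin-UP builds on this geometry with neural light and reflectance models and dedicated training strategies.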