Spatially-aware lighting estimation
We present a method for automatically estimating lighting conditions from a single image. Unlike most previous work, which estimates only global lighting or relies on a limited illumination representation (low-frequency spherical harmonics, parametric models), the proposed method uses a new spatially-varying light representation with realistic texture, suitable for rendering shiny objects. Our method will estimate a coarse (cuboid) geometry of the indoor scene from a single image and use this geometric information to hallucinate a realistic room texture used for illumination. This representation allows a virtual object to be placed anywhere in a standard photograph and its scene reflections to be rendered toward the viewer, enabling realistic object insertion. To this end, an indoor HDR panorama dataset will be annotated with room layouts, and the method will be quantitatively evaluated and compared against the most relevant approaches from the literature.
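To make the spatially-varying aspect concrete, here is a minimal sketch of how a coarse cuboid layout can yield a different environment map at each insertion point: rays are cast from the virtual object's position toward every pixel of an equirectangular panorama, and their exit points on the cuboid walls indicate where the hallucinated room texture would be sampled. All function names and the axis-aligned cuboid assumption are illustrative, not part of the proposed method's actual implementation.

```python
import numpy as np

def panorama_directions(height, width):
    """Unit direction vectors for an equirectangular (lat-long) grid."""
    phi = (np.arange(width) + 0.5) / width * 2 * np.pi - np.pi   # azimuth
    theta = (np.arange(height) + 0.5) / height * np.pi           # polar angle
    theta, phi = np.meshgrid(theta, phi, indexing="ij")
    return np.stack([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)], axis=-1)                    # (H, W, 3)

def cuboid_hits(point, box_min, box_max, dirs):
    """Where each panorama ray from `point` (inside the cuboid) exits the walls.

    A hypothetical stand-in for the layout-based lookup: the returned 3D hit
    points would index into the hallucinated room texture on the cuboid faces.
    """
    p = np.asarray(point, dtype=float)
    eps = 1e-9
    d = np.where(np.abs(dirs) < eps, eps, dirs)      # avoid division by zero
    t1 = (box_min - p) / d
    t2 = (box_max - p) / d
    t_far = np.maximum(t1, t2)                       # exit distance per axis
    t_exit = t_far.min(axis=-1)                      # first wall crossed
    return p + t_exit[..., None] * dirs              # (H, W, 3) hit points

if __name__ == "__main__":
    dirs = panorama_directions(64, 128)
    # Two insertion points in a unit room give two different environment maps.
    hits_center = cuboid_hits([0.5, 0.5, 0.5], np.zeros(3), np.ones(3), dirs)
    hits_corner = cuboid_hits([0.2, 0.2, 0.5], np.zeros(3), np.ones(3), dirs)
    print(hits_center.shape, np.abs(hits_center - hits_corner).max() > 0)
```

The key point the sketch illustrates is that, unlike a single global environment map, the per-point hit pattern (and hence the reflected texture) changes as the insertion point moves through the room, which is what enables plausible reflections on shiny inserted objects.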