Zhengqin Li, Jia Shi, Sai Bi, Rui Zhu, Kalyan Sunkavalli, Miloš Hašan, Zexiang Xu, Ravi Ramamoorthi, Manmohan Chandraker
We highly recommend using Anaconda to manage Python packages. Required dependencies include PyTorch and PyTorch3D (patched as described below).
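A quick import check can confirm that the environment is set up correctly. This is a minimal sketch covering only the Python packages the steps below rely on, not an exhaustive dependency list:

```python
# Minimal environment check (illustrative only; not the full dependency list).
import torch
import pytorch3d

print('torch:', torch.__version__)
print('pytorch3d:', pytorch3d.__version__)
print('CUDA available:', torch.cuda.is_available())
```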
- Download the OpenRooms dataset.
- Compile the OptiX-based shadow renderer with Python bindings.
  - Go to the `OptixRendererShadow` directory and compile the code following this link.
- Modify the PyTorch3D code to support an RMSE chamfer distance loss.
  - Go to `chamfer.py`.
  - Add flag `isRMSE = False` to function `chamfer_distance`.
  - Modify function `chamfer_distance` by adding the lines below after the definitions of `cham_x` and `cham_y`.
    ```python
    if isRMSE == True:
        cham_x = torch.sqrt(cham_x + 1e-6)
        cham_y = torch.sqrt(cham_y + 1e-6)
    ```
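  After the patch, the RMSE variant can be toggled through the new keyword argument. A minimal usage sketch (the point clouds here are random placeholders; only the `isRMSE` keyword comes from the edit above):
  ```python
  import torch
  from pytorch3d.loss import chamfer_distance  # the function patched above

  # Two random point clouds of shape (batch, num_points, 3) as placeholders.
  pts_a = torch.rand(2, 1024, 3)
  pts_b = torch.rand(2, 1024, 3)

  # Default squared chamfer distance.
  loss_sq, _ = chamfer_distance(pts_a, pts_b)

  # RMSE chamfer distance, enabled by the isRMSE flag added above.
  loss_rmse, _ = chamfer_distance(pts_a, pts_b, isRMSE=True)
  print(loss_sq.item(), loss_rmse.item())
  ```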
- Train models.
- Train the material prediction network.
  ```
  python trainBRDF.py  # Train material prediction.
  ```
- Train light source prediction networks.
  ```
  python trainVisLamp.py     # Train visible lamp prediction.
  python trainVisWindow.py   # Train visible window prediction.
  python trainInvLamp.py     # Train invisible lamp prediction.
  python trainInvWindow.py   # Train invisible window prediction.
  ```
- Train the neural renderer.
  ```
  python trainShadowDepth.py --isGradLoss  # Train shadow prediction.
  python trainDirectIndirect.py            # Train indirect illumination prediction.
  python trainPerpixelLighting.py          # Train perpixel lighting prediction.
  ```
- Test models.
- Test the material prediction network.
  ```
  python testBRDF.py  # Test BRDF prediction. Results in Table 5 in the supp.
  ```
- Test light source prediction networks.
  ```
  python testVisLamp.py     # Test visible lamp prediction. Results in Table 3 in the main paper.
  python testVisWindow.py   # Test visible window prediction. Results in Table 3 in the main paper.
  python testInvLamp.py     # Test invisible lamp prediction. Results in Table 3 in the main paper.
  python testInvWindow.py   # Test invisible window prediction. Results in Table 3 in the main paper.
  ```
- Test the neural renderer.
  ```
  python testShadowDepth.py --isGradLoss  # Test shadow prediction. Results in Table 2 in the main paper.
  python testDirectIndirect.py            # Test indirect illumination prediction.
  python testPerpixelLighting.py          # Test perpixel lighting prediction.
  python testFull.py                      # Test the whole neural renderer with predicted light sources. Results in Table 4 in the main paper.
  ```
- Prepare input data.
  - Create a root folder, e.g. `Example1`.
  - Create a folder `Example1/input` for the input data (see the layout sketch after this list). The folder should include:
    - `image.png`: Input RGB image of resolution `240 x 320`.
    - `envMask.png`: A mask specifying the indoor/outdoor regions, where 0 indicates outdoor (window) regions.
    - `lampMask_x.png`: Masks for visible lamps, where `x` is the lamp ID starting from 0.
    - `winMask_x.png`: Masks for visible windows, where `x` is the window ID starting from 0.
  - Create `testList.txt` and add the absolute path of `Example1` to its first line.
  - An example from our teaser scene can be found in `Example1`.
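  A small helper along these lines can lay out the folder and write `testList.txt` (a hypothetical convenience script, not part of the repository; it assumes one visible lamp and one visible window, so adjust the expected file names to your scene):
  ```python
  import os

  # Hypothetical helper: prepare the input folder layout and testList.txt.
  root = os.path.abspath('Example1')
  input_dir = os.path.join(root, 'input')
  os.makedirs(input_dir, exist_ok=True)

  # File names follow the list above; indices assume one lamp and one window.
  expected = ['image.png', 'envMask.png', 'lampMask_0.png', 'winMask_0.png']
  missing = [f for f in expected if not os.path.exists(os.path.join(input_dir, f))]
  if missing:
      print('Missing input files:', missing)

  # testList.txt holds the absolute path of the root folder on its first line.
  with open('testList.txt', 'w') as fh:
      fh.write(root + '\n')
  ```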
- Depth prediction. We use DPT in our paper. Higher-quality depth from an RGBD sensor should lead to better results.
  - Download DPT and save it in folder `DPT`.
  - Run python script `testRealDepth.py`. The result will be saved as `depth.npy` in `Example1/input`.
    ```
    python testRealDepth.py --testList testList.txt
    ```
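  To verify the prediction, the saved array can be loaded directly (a quick sanity-check sketch; the exact shape and value range depend on the DPT post-processing in `testRealDepth.py`):
  ```python
  import numpy as np

  # Inspect the depth map saved by testRealDepth.py.
  depth = np.load('Example1/input/depth.npy')
  print('shape:', depth.shape)
  print('min/max depth:', float(depth.min()), float(depth.max()))
  ```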
- Material and light source prediction.
  - Run python script `testRealBRDFLight.py`. Please add flag `--isOptimize` to improve quality. If you use real depth maps, e.g., depth maps captured by a depth sensor or reconstructed by a 3D reconstruction algorithm such as COLMAP or MonoSDF, please add flag `--isNormalizeDepth` for better performance, because the networks are trained on normalized depth maps (see the normalization sketch below).
    ```
    python testRealBRDFLight.py --testList testList.txt --isOptimize
    ```
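  For intuition, depth normalization rescales the map to a canonical range before it is fed to the networks. The sketch below shows one plausible scheme (dividing by the mean depth over indoor pixels); the exact normalization applied by `--isNormalizeDepth` is defined in the repository code and may differ:
  ```python
  import numpy as np

  def normalize_depth(depth, env_mask):
      """Illustrative normalization: divide by the mean depth over indoor pixels.

      This only sketches the general idea; the actual scheme behind
      --isNormalizeDepth may differ.
      """
      valid = (env_mask > 0) & (depth > 0)   # indoor pixels with valid depth
      scale = depth[valid].mean() if valid.any() else 1.0
      return depth / max(scale, 1e-6)
  ```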
- Edit light sources, geometry or materials.
- We prepare a list of edited examples from our teaser scene.
  - `Example1_changeAlbedo`: Change wall colors with consistent indirect illumination.
  - `Example1_addObject`: Insert a virtual object with non-local shadows.
  - `Example1_addWindow_turnOffVisLampM`: Open a virtual window with sunlight.
  - `Example1_addLamp_turnOffPredLamps`: Insert a virtual lamp.
  - `Example1_turnOffVisLamp`: Turn off the visible lamp in the scene.
  - `Example1_turnOffInvLamp`: Turn off the invisible lamp in the scene.
- Please check `command.txt` inside each folder to see how to render results. To reproduce all results in the teaser, you may need to combine several editing operations together.
- Rerender the image with the neural renderer.
  - Run python script `testRealRender.py`. You may need to specify `--objName` when inserting virtual objects, `--isVisLampMesh` when inserting virtual lamps, and `--isPerpixelLighting` to predict perpixel environment maps, which are used to render the specular bunnies on the Garon et al. dataset in the paper.
    ```
    python testRealRender.py --testList testList.txt --isOptimize
    ```
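  When rerendering several edited scenes, a small driver can loop over the example folders (a hypothetical convenience script; the folder names come from the examples above, the per-scene flags are assumptions, and the authoritative commands live in each folder's `command.txt`):
  ```python
  import os
  import subprocess

  # Hypothetical driver: rerender each edited example folder in turn.
  scenes = ['Example1_changeAlbedo', 'Example1_turnOffVisLamp']
  for scene in scenes:
      # Point testList.txt at the current scene's root folder.
      with open('testList.txt', 'w') as fh:
          fh.write(os.path.abspath(scene) + '\n')
      subprocess.run(['python', 'testRealRender.py',
                      '--testList', 'testList.txt', '--isOptimize'],
                     check=True)
  ```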

