CNN for Bayesian Semantic Scene Completion from a single depth image. This is the code release for our ICPR 2020 paper; a preprint is also available on arXiv.
This is a PyTorch implementation of a Bayesian Convolutional Neural Network (BCNN) for Semantic Scene Completion on the SUNCG dataset. Given a depth image represented as inverted depth (see code), the network outputs a semantic segmentation and an entropy score in 3D voxel format.
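The entropy score is the voxel-wise entropy of the predicted class distribution: high entropy marks voxels where the network is uncertain. As a rough illustration of the idea (this is not the release code; the class count and grid shape below are made up), the entropy of a softmax output over C classes can be computed as:

```python
import numpy as np

def voxel_entropy(probs, eps=1e-12):
    """Voxel-wise predictive entropy.

    probs: array of shape (C, D, H, W) with class probabilities
    summing to 1 over the class axis.
    Returns an array of shape (D, H, W).
    """
    return -np.sum(probs * np.log(probs + eps), axis=0)

# Toy example: 4 classes over a 2x2x2 grid, uniform everywhere,
# so every voxel has maximal entropy ln(4).
C = 4
probs = np.full((C, 2, 2, 2), 1.0 / C)
ent = voxel_entropy(probs)
```

A uniform distribution gives the maximum entropy ln(C); a confident one-hot prediction gives entropy near 0.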
```bash
git clone git@github.com:DavidGillsjo/bssc-net.git --recurse-submodules -j8
```
Check your Nvidia GPU model and adjust the variable `gpu_arch` accordingly in `docker/<module>/Dockerfile`.
With elevated docker group permissions:
```bash
cd docker/<module>
./build.sh
./run.sh
```
Without:
```bash
cd docker/<module>
sudo DUSER=<username> ./build.sh
sudo DHOME=/home/<username> ./run.sh
```
Download camera viewpoints:
```bash
wget http://pbrs.cs.princeton.edu/pbrs_release/data/camera_v2.zip
unzip camera_v2.zip -d ../camera
```
Build and run the `suncg_house3d` docker in the docker folder, see section Using the docker images.
Alternatively, install the dependencies yourself; see the Dockerfile.
Go to the repo root from docker:
```bash
cd /host_home/<your-repo-path>
```
The following code snippets assume you are in this directory.
Build the SUNCG toolbox:
```bash
cd libs/SUNCGtoolbox/gaps
make clean
make
```
Build the House3D renderer:
```bash
cd libs/House3D/renderer
PYTHON_CONFIG=python3-config make -j
```
Set python path:
```bash
. init_env.sh
```
Execute the rendering and grid generation script:
```bash
cd preprocessing
python3 generate_grids.py <suncg_dir> --nbr-proc 8 --model-blacklist blacklists/default.yaml --datasets <data_split_dir>/*mini.json
```
This will store results in `<suncg_dir>/scene_comp`; for more details see `python3 generate_grids.py --help`.
JSON dataset files are generated with `preprocessing/generate_data_splits.py`.
The splits used in the article can be found here.
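The exact JSON schema is defined by `preprocessing/generate_data_splits.py`; purely as a hypothetical sketch of the idea (the function name, field names, and ratios below are invented, not the script's actual format), partitioning a list of scene IDs into splits might look like:

```python
import json
import random

def make_splits(scene_ids, val_frac=0.1, test_frac=0.1, seed=0):
    """Shuffle scene IDs and partition them into train/val/test lists."""
    ids = list(scene_ids)
    random.Random(seed).shuffle(ids)
    n_val = int(len(ids) * val_frac)
    n_test = int(len(ids) * test_frac)
    return {
        "val": ids[:n_val],
        "test": ids[n_val:n_val + n_test],
        "train": ids[n_val + n_test:],
    }

splits = make_splits([f"scene_{i:04d}" for i in range(100)])
# Each split could then be written to its own JSON file, e.g.:
# json.dump(splits["train"], open("train.json", "w"))
```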
Build and run the bssc docker in the docker folder, see section Using the docker images.
Alternatively, install the dependencies yourself; see the Dockerfile.
Depth images with the corresponding voxel ground truth generated in the previous section are required, for example placed in `/host_home/<suncg_dir>/scene_comp`.
Training can be monitored with Tensorboard; for this you need to run:
```bash
cd ssc/scripts
./start_tensorboard.sh &
```
To start training, run:
```bash
cd ssc/scripts
python3 train.py /host_home/<suncg_dir>/scene_comp <train_json> --cfg ../cfg/train_bayesian.yaml --val <val_json>
```
See the configs for options.
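A common way to realize a Bayesian CNN of the kind trained here is Monte Carlo dropout: dropout is kept active at inference, and predictions are averaged over several stochastic forward passes. The toy below illustrates only the general technique (it is not the release code, and the single linear layer, sample count, and drop rate are made up):

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def mc_dropout_predict(x, w, n_samples=20, p_drop=0.5):
    """Toy MC-dropout inference for a single linear layer.

    Dropout is applied at *test* time; averaging the resulting
    softmax outputs approximates the predictive distribution,
    whose spread reflects model uncertainty.
    """
    probs = []
    for _ in range(n_samples):
        mask = rng.random(w.shape) >= p_drop        # sample a dropout mask
        probs.append(softmax(x @ (w * mask) / (1 - p_drop)))
    return np.mean(probs, axis=0)

x = rng.normal(size=(5, 8))   # 5 inputs with 8 features
w = rng.normal(size=(8, 4))   # 4 output classes
mean_probs = mc_dropout_predict(x, w)
```

The averaged distribution can then be scored with voxel-wise entropy to flag uncertain predictions.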
To run the example, first download the pre-trained weights from here. Then run the evaluation script:
```bash
cd ssc/scripts
python3 eval.py ../../example ../../example/dataset.json bssc.tar --cfg ../cfg/eval_bayesian.yaml --result-dir <my_result_path>
```
for the SSC-style net, or run the following for the U-Net implementation:
```bash
cd ssc/scripts
python3 eval.py ../../example ../../example/dataset.json bssc_unet.tar --cfg ../cfg/eval_bayesian_unet.yaml --result-dir <my_result_path>
```
If you find this work useful, please cite:
```bibtex
@INPROCEEDINGS{bssc_net,
  author={Gillsjö, David and Åström, Kalle},
  booktitle={2020 25th International Conference on Pattern Recognition (ICPR)},
  title={In Depth Bayesian Semantic Scene Completion},
  year={2021},
  pages={6335-6342},
  doi={10.1109/ICPR48806.2021.9412403}}
```
This work is supported by the Wallenberg AI, Autonomous Systems and Software Program (WASP).
