Source code for the LoSF project, which learns UDFs based on local shape functions.

Repository: jbHu67/LoSF
LoSF: A Lightweight UDF Learning Framework for 3D Reconstruction Based on Local Shape Functions


Jiangbei Hu, Yanggeng Li, Fei Hou, Junhui Hou, Zhebin Zhang, Shengfa Wang, Na Lei, Ying He

Abstract: Unsigned distance fields (UDFs) provide a versatile framework for representing a diverse array of 3D shapes, encompassing both watertight and non-watertight geometries. Traditional UDF learning methods typically require extensive training on large 3D shape datasets, which is costly and necessitates re-training for new datasets. This paper presents a novel neural framework, LoSF-UDF, for reconstructing surfaces from 3D point clouds by leveraging local shape functions to learn UDFs. We observe that 3D shapes manifest simple patterns in localized regions, prompting us to develop a training dataset of point cloud patches characterized by mathematical functions that represent a continuum from smooth surfaces to sharp edges and corners. Our approach learns features within a specific radius around each query point and utilizes an attention mechanism to focus on the crucial features for UDF estimation. Despite being highly lightweight, with only 653 KB of trainable parameters and a modest-sized training dataset with 0.5 GB storage, our method enables efficient and robust surface reconstruction from point clouds without requiring shape-specific training. Furthermore, our method exhibits enhanced resilience to noise and outliers in point clouds compared to existing methods. We conduct comprehensive experiments and comparisons across various datasets, including synthetic and real-scanned point clouds, to validate our method's efficacy. Notably, our lightweight framework offers rapid and reliable initialization for other unsupervised iterative approaches, improving both the efficiency and accuracy of their reconstructions.

News

  • 2025-01-18: Code released!

Environment Setup

Our code is implemented in Python 3.9, PyTorch 2.3.0, and CUDA 12.1, and is built on the Lightning-Hydra-Template.

Conda environment

conda create -n losf python=3.9
conda activate losf
conda install pytorch==2.3.0 torchvision==0.18.0 torchaudio==2.3.0 pytorch-cuda=12.1 -c pytorch -c nvidia
pip install --extra-index-url https://miropsota.github.io/torch_packages_builder pytorch3d==0.7.7+pt2.3.0cu121
conda install lightning -c conda-forge
pip install hydra-core hydra-colorlog hydra-optuna-sweeper rootutils rich pre-commit
pip install numba trimesh torch_geometric scikit-image matplotlib joblib
pip install torch-cluster -f https://data.pyg.org/whl/torch-2.3.0+cu121.html

Data Preparation

The training data can be downloaded from Zenodo. You can also generate your own data with ./data/prepare_dataset_smooth.py and ./data/prepare_dataset_sharp.py.
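The actual sampling logic lives in the two scripts above; as a rough, stdlib-only illustration of the idea described in the abstract (patches drawn from simple mathematical shape functions, with unsigned distances as supervision), one could sample a smooth quadratic patch and approximate the UDF of a query point by its distance to the nearest surface sample. The function names and parameters here are hypothetical, not the repository's API:

```python
# Illustrative sketch only: sample a smooth local shape function and
# approximate a UDF value. The real generation code is in
# ./data/prepare_dataset_smooth.py and ./data/prepare_dataset_sharp.py.
import math
import random

def sample_patch(a=0.5, b=-0.3, n=512, seed=0):
    """Sample a quadratic height-field patch z = a*x^2 + b*y^2 over [-0.5, 0.5]^2."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        x = rng.uniform(-0.5, 0.5)
        y = rng.uniform(-0.5, 0.5)
        pts.append((x, y, a * x * x + b * y * y))
    return pts

def udf(query, surface_pts):
    """Approximate unsigned distance: minimum distance to the surface samples."""
    return min(math.dist(query, p) for p in surface_pts)

patch = sample_patch()
d = udf((0.0, 0.0, 0.2), patch)  # query point hovering above the patch center
```

Sharp-feature patches would replace the smooth quadratic with piecewise functions that introduce edges and corners.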

Train

The training process of LoSF is not restricted to any specific shape class. After downloading or generating the training dataset, please place it in ./data/train/. Following the Lightning-Hydra-Template, we use Hydra to manage parameter configurations, which are stored in ./configs. The default configurations are specified in ./configs/train.yaml. You can start training by running:

python ./src/train.py

You can override the default configurations by creating a new YAML file in ./configs/experiment. For example, to test a lighter network, you can reduce the network dimensions as shown in ./configs/experiment/train-lighter.yaml. Run the new experiment with:

python ./src/train.py experiment=train-lighter
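An experiment file of this kind typically follows Hydra's experiment-override convention. The keys below are hypothetical (the real train-lighter.yaml in the repository may use different names); this is only a sketch of the shape such an override takes:

```yaml
# @package _global_
# Hypothetical experiment override, following the Lightning-Hydra-Template
# convention; actual key names in train-lighter.yaml may differ.
defaults:
  - override /model: default

model:
  hidden_dim: 64   # narrower layers than the default network
  num_heads: 2     # lighter attention

trainer:
  max_epochs: 100
```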

Note: before training, update the data directory and work directory in ./configs/paths/default.yaml and ./configs/data/default.yaml to match your machine.

Evaluation

  • Save the point clouds to be tested in PLY format in a folder /path/to/pcd/.

  • The main entry point is ./src/eval.py. Its parameters are shown below; you can modify ./configs/experiment/eval.yaml to change them.

  • You can also pass parameters on the command line, which overrides the YAML configuration:

    python ./src/eval.py \
      ckpt_path=/path/to/ckpt \
      data.data_dir=/path/to/pcd \
      mesh.threshold=0.005 \
      mesh.is_cut=false \
      experiment.name=data_full_bdry \
      data.radius=0.04 \
      data.has_noise=false \
      data.noise_level=0.0 \
      paths.output_dir=/dir/to/save/ \
      trainer.devices=[6]
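Input point clouds must be PLY files. As a minimal stdlib-only sketch (not part of the repository; real scans are usually exported by a scanner or a library such as trimesh), a list of points can be written to ASCII PLY like this:

```python
# Write a point cloud to ASCII PLY, the input format expected in
# /path/to/pcd/. Minimal sketch: positions only, no normals or colors.
def write_ply(path, points):
    header = [
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
    ]
    with open(path, "w") as f:
        f.write("\n".join(header) + "\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")

write_ply("example.ply", [(0.0, 0.0, 0.0), (0.1, 0.2, 0.3)])
```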

We provide a pretrained checkpoint at ./pretrained/train-total-uniform.ckpt.

To Do

  • Add code that integrates LoSF with unsupervised methods.
  • Accelerate the local patch detection.
