Hi!
I am following the README instructions to prepare the training data, and I encountered some confusion while attempting to proceed with the train_gaussian.py script. Below is a detailed breakdown of the steps I’ve followed so far:
Steps Followed:
Rendered Images Creation:
I used the Stanford ShapeNet Renderer to generate rendered images for a single class (e.g., "chair") in the ShapeNetCore_renders directory. The structure of my directory for the "chair" class looks like this:

```
ShapeNetCore_renders/
└── chair/
    ├── 1a6f615e8b1b5ae4dbbc9440457e303e/
    │   └── models/
    │       ├── models_r_000.png
    │       ├── models_r_000_albedo0001.png
    │       ├── models_r_000_depth0001.png
    │       ├── models_r_000_normal0001.png
    │       ├── models_r_012.png
    │       └── ...  # other view angles
    ├── 1a8bbf2994788e2743e99e0cae970928/
    ├── 1a74a83fa6d24b3cacd67ce2c72c02e/
    └── ...  # other objects
```
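To sanity-check that every object actually got its renders before moving on, I wrote a small stdlib-only helper. This is just my own checking code, not part of the repo; the directory layout and file-name pattern are the ones from my tree above:

```python
# sanity_check_renders.py -- my own helper, not part of the repo.
import re
from pathlib import Path

# Base RGB renders look like models_r_000.png; the albedo/depth/normal
# passes add a suffix before .png, so this pattern counts only RGB views.
_RGB_VIEW = re.compile(r"models_r_\d+\.png$")

def count_views(class_dir):
    """Return {object_id: number_of_RGB_views} for one rendered class."""
    counts = {}
    for obj_dir in sorted(Path(class_dir).iterdir()):
        models = obj_dir / "models"
        if not models.is_dir():
            counts[obj_dir.name] = 0
            continue
        counts[obj_dir.name] = sum(
            1 for p in models.iterdir() if _RGB_VIEW.match(p.name)
        )
    return counts

if __name__ == "__main__":
    for obj_id, n in count_views("ShapeNetCore_renders/chair").items():
        print(f"{obj_id}: {n} views")
```

All my objects report the same view count, so I believe the rendering step itself completed correctly.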
Point Cloud Generation:
I then ran the process_data/sample_points.py script to generate point clouds for all objects in the ShapeNetCore directory. For a specific object, the directory structure now looks like this:

```
ShapeNetCore/03001627/1a6f615e8b1b5ae4dbbc9440457e303e/models/
├── model_normalized.json
├── model_normalized.mtl
├── model_normalized.obj
├── model_normalized.solid.binvox
├── model_normalized.surface.binvox
└── points3d.ply  # generated point cloud file
```
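For reference, my mental model of what this step produces is area-weighted uniform sampling of the mesh surface written out as a PLY point cloud. The sketch below is my own stdlib-only reconstruction of that idea, not the repo's actual sample_points.py (the real script may read the binvox files instead, handle materials, etc.), but it shows the kind of points3d.ply I believe I now have:

```python
# sample_points_sketch.py -- my rough mental model of process_data/sample_points.py,
# stdlib only; the real script may differ in details.
import random

def load_obj(path):
    """Minimal OBJ reader: vertices and triangulated faces only."""
    verts, faces = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:
                continue
            if parts[0] == "v":
                verts.append(tuple(float(x) for x in parts[1:4]))
            elif parts[0] == "f":
                # keep the vertex index, ignore texture/normal indices
                faces.append(tuple(int(p.split("/")[0]) - 1 for p in parts[1:4]))
    return verts, faces

def sample_surface(verts, faces, n, seed=0):
    """Area-weighted uniform sampling of n points on the triangle mesh."""
    rng = random.Random(seed)

    def area(f):
        a, b, c = (verts[i] for i in f)
        u = [b[i] - a[i] for i in range(3)]
        v = [c[i] - a[i] for i in range(3)]
        cx = u[1] * v[2] - u[2] * v[1]
        cy = u[2] * v[0] - u[0] * v[2]
        cz = u[0] * v[1] - u[1] * v[0]
        return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5

    points = []
    for f in rng.choices(faces, weights=[area(f) for f in faces], k=n):
        a, b, c = (verts[i] for i in f)
        # uniform barycentric coordinates via the square-root trick
        s = rng.random() ** 0.5
        t = rng.random()
        w = (1 - s, s * (1 - t), s * t)
        points.append(tuple(w[0] * a[i] + w[1] * b[i] + w[2] * c[i] for i in range(3)))
    return points

def write_ply(path, points):
    """Write an ASCII PLY point cloud (x, y, z only)."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(points)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("end_header\n")
        for x, y, z in points:
            f.write(f"{x} {y} {z}\n")
```

Please correct me if the expected points3d.ply differs from this (e.g., if it also stores normals or colors).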
Issue:
I am now trying to use the train_gaussian.py script to generate a Gaussian scene, but I am unclear on how to proceed. Specifically:
- How should I structure the input data for this script? Should I use the rendered images, the point-cloud data, or both?
- Are there any specific configurations or pre-processing steps required before running train_gaussian.py?
- What should the output of the script look like (e.g., file format, directory structure)?
- Could you provide an example or clarification on how to execute the script for a single class (e.g., "chair")?
Any guidance or examples would be greatly appreciated!
Thank you for your help!