There are two ways to install the codebase: directly on your local machine or with Docker.
### Environment Setup

We use conda to manage the environment; you can install it by following [these instructions](assets/README.md#system). Then create the base environment (this takes about 5~15 minutes). Next, compile the CUDA packages (the nvcc compiler is already installed inside the conda environment); compilation takes around 1-5 minutes:

```bash
mamba activate opensf
# CUDA is already installed in the Python environment. Other CUDA versions
# (11.3, 11.4, 11.7, 11.8) were also tested and all work.
cd assets/cuda/mmcv && python ./setup.py install && cd ../../..
cd assets/cuda/chamfer3D && python ./setup.py install && cd ../../..
```
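After the build, a quick sanity check can confirm that the compiled extensions are importable. This is a hedged sketch: the module names (`mmcv`, `chamfer3D`) are assumptions based on the build directories above; adjust them if the installed package names differ.

```shell
# Check that each compiled extension can be found by Python.
# Module names are assumptions taken from the build directories above.
STATUS=""
for pkg in mmcv chamfer3D; do
  if python -c "import importlib.util, sys; sys.exit(0 if importlib.util.find_spec('$pkg') else 1)"; then
    STATUS="$STATUS $pkg:ok"
  else
    STATUS="$STATUS $pkg:missing"
  fi
done
echo "$STATUS"
```

If a module reports `missing`, re-run the corresponding `setup.py install` step inside the activated environment.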
### Docker (Recommended for Isolation)

You can always choose [Docker](https://en.wikipedia.org/wiki/Docker_(software)), which gives you an isolated environment and frees you from manual installation. Pull the pre-built Docker image or build it manually.
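For example (the image name, tag, and paths below are hypothetical placeholders for illustration; check the project's Docker instructions for the actual names):

```shell
# Pull a pre-built image (name and tag are hypothetical placeholders):
docker pull <your-dockerhub-user>/opensceneflow:latest

# Or build locally from the repository's Dockerfile (path assumed):
docker build -t opensceneflow .

# Run with GPU access, mounting your data directory (paths hypothetical):
docker run --gpus all -it -v /home/kin/data:/home/kin/data opensceneflow:latest
```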
Once extracted, you can directly use this dataset to run the training script.
## 2. Quick Start
Some tips before running the code:
* Don't forget to activate the Python environment before running the code.
* If you want to use [wandb](wandb.ai), replace every `entity="kth-rpl",` with your own entity; otherwise TensorBoard will be used locally.
* Set the correct data path by passing it through the config, e.g. `train_data=/home/kin/data/av2/h5py/demo/train val_data=/home/kin/data/av2/h5py/demo/val`.
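As a concrete illustration of the last tip (the paths are the example ones from above, and the `train.py` entry point is an assumption; substitute your actual training script and dataset location):

```shell
# Hypothetical example paths from the tip above; replace with your own.
TRAIN_DATA=/home/kin/data/av2/h5py/demo/train
VAL_DATA=/home/kin/data/av2/h5py/demo/val

# Warn early if a directory is missing, before launching training.
for d in "$TRAIN_DATA" "$VAL_DATA"; do
  [ -d "$d" ] || echo "warning: missing directory: $d"
done

# The paths are passed as key=value overrides on the command line, e.g.:
#   python train.py train_data="$TRAIN_DATA" val_data="$VAL_DATA"
OVERRIDES="train_data=$TRAIN_DATA val_data=$VAL_DATA"
echo "$OVERRIDES"
```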

To free yourself from training, you can download the pretrained weights from [HuggingFace](https://huggingface.co/kin-zhang/OpenSceneFlow); we provide the detailed `wget` command in each model section.