zou-longkun/RPD
# Boosting Cross-Domain Point Classification via Distilling Relational Priors from 2D Transformers

Introduction

This repo is a PyTorch implementation of the paper Boosting Cross-Domain Point Classification via Distilling Relational Priors from 2D Transformers.

Requirements

The code has been tested with:

  • Python >= 3.7
  • PyTorch == 1.8.0+cu111
  • torch-scatter == 2.0.7
  • torchsampler == 0.1.2
  • torchvision == 0.9.0+cu111
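
Before running the code, it can help to confirm the pinned versions above are actually importable. A minimal sketch (the import names `torch_scatter` and `torchsampler` correspond to the pip packages listed above; this script only reports versions, it does not install anything):

```python
# Check that the pinned dependencies from the list above are importable,
# and report their versions; missing packages are flagged, not fatal.
import importlib

# pip name -> import name differs for torch-scatter (torch_scatter).
PINNED = {
    "torch": "1.8.0+cu111",
    "torch_scatter": "2.0.7",
    "torchsampler": "0.1.2",
    "torchvision": "0.9.0+cu111",
}

def check_versions(pinned):
    """Return {name: version_string | 'unknown' | None-if-missing}."""
    report = {}
    for name in pinned:
        try:
            mod = importlib.import_module(name)
            report[name] = getattr(mod, "__version__", "unknown")
        except ImportError:
            report[name] = None  # package not installed
    return report

if __name__ == "__main__":
    for name, found in check_versions(PINNED).items():
        print(f"{name}: {found or 'MISSING'} (expected {PINNED[name]})")
```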

Some dependent packages must be compiled from source. For the EMD module, please refer to issue #6 before installing:

cd PyTorchEMD
python setup.py install

Dataset

Download the official PointDA-10 dataset and put the folder under [your_dataroot]/data/.
After downloading, the directory structure should be:

${ROOT}
|--PointDA_data
|  |--modelnet
|  |--scannet
|  |--shapenet
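
The layout above can be prepared with a few commands. A sketch, assuming `$ROOT` stands in for your chosen data root (the actual `modelnet`/`scannet`/`shapenet` contents must come from the official PointDA-10 download, not from this script):

```shell
# Create the expected PointDA-10 directory skeleton under $ROOT.
# Default to ./data if ROOT is unset; adjust to your own dataroot.
ROOT="${ROOT:-./data}"
for domain in modelnet scannet shapenet; do
    mkdir -p "$ROOT/PointDA_data/$domain"
done
# Verify the three domain folders exist.
ls "$ROOT/PointDA_data"
```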

Download MAE Pre-trained Vit Model

Download the MAE pre-trained ViT model and put it under pretrained/.

Train

Training on both the source and target domains:

python main.py --src_dataset modelnet --trgt_dataset scannet --dataroot [your_dataroot] --batch_size 16
python main_spst.py --exp_name 'spst' --trgt_dataset scannet --dataroot [your_dataroot] --batch_size 16 --lr 5e-5
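
The commands above train one source/target pair. PointDA-10 contains three domains (modelnet, scannet, shapenet), giving six cross-domain pairs; a sketch that prints the `main.py` command for each pair as a dry run (flag names are taken from the commands above; replace `echo` with the real invocation, and [your_dataroot] with your data root, to actually train):

```shell
# Dry run: emit one main.py command per source/target pair in PointDA-10.
DOMAINS="modelnet scannet shapenet"
for src in $DOMAINS; do
  for trgt in $DOMAINS; do
    # Skip same-domain pairs; only cross-domain adaptation is meaningful.
    [ "$src" = "$trgt" ] && continue
    echo python main.py --src_dataset "$src" --trgt_dataset "$trgt" \
         --dataroot [your_dataroot] --batch_size 16
  done
done
```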

If you want to test with a pre-trained model, download it from here and place it under experiments/.

Acknowledgment

This repo benefits from PointCLIP_V2, MAE, and GAST. Thanks for their wonderful work.
