# CornerNet-Lite: Training, Evaluation and Testing Code
Code for reproducing results in the following paper:

[**CornerNet-Lite: Efficient Keypoint Based Object Detection**](https://arxiv.org/abs/1904.08900)
Hei Law, Yun Teng, Olga Russakovsky, Jia Deng
*arXiv:1904.08900*

## Getting Started
### Software Requirements
- Python 3.7
- PyTorch 1.0.0
- CUDA 10
- GCC 4.9.2 or above
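The GCC minimum can be checked programmatically before building. Below is a minimal sketch; the `meets_minimum` helper is our own convenience, not part of this repo.

```python
def parse_version(version):
    """Turn a dotted version string like '4.9.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def meets_minimum(found, required):
    """Return True if the found version is at least the required one."""
    return parse_version(found) >= parse_version(required)

# GCC 4.9.2 or above is required for the corner pooling build step below,
# so e.g. GCC 5.4.0 passes while GCC 4.8.5 does not.
```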

### Installing Dependencies
Please first install [Anaconda](https://anaconda.org) and create an Anaconda environment using the provided package list `conda_packagelist.txt`.
```
conda create --name CornerNet_Lite --file conda_packagelist.txt --channel pytorch
```

After you create the environment, please activate it.
```
source activate CornerNet_Lite
```

### Compiling Corner Pooling Layers
Compile the C++ implementation of the corner pooling layers. (GCC 4.9.2 or above is required.)
```
cd <CornerNet-Lite dir>/core/models/py_utils/_cpools/
python setup.py install --user
```

### Compiling NMS
Compile the NMS code, which is originally from [Faster R-CNN](https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/nms/cpu_nms.pyx) and [Soft-NMS](https://github.com/bharatsingh430/soft-nms/blob/master/lib/nms/cpu_nms.pyx).
```
cd <CornerNet-Lite dir>/core/external
make
```

### Downloading Models
In this repo, we provide models for the following detectors:
- [CornerNet-Saccade](https://drive.google.com/file/d/1MQDyPRI0HgDHxHToudHqQ-2m8TVBciaa/view?usp=sharing)
- [CornerNet-Squeeze](https://drive.google.com/file/d/1qM8BBYCLUBcZx_UmLT0qMXNTh-Yshp4X/view?usp=sharing)
- [CornerNet](https://drive.google.com/file/d/1e8At_iZWyXQgLlMwHkB83kN-AN85Uff1/view?usp=sharing)

Put the CornerNet-Saccade model under `<CornerNet-Lite dir>/cache/nnet/CornerNet_Saccade/`, the CornerNet-Squeeze model under `<CornerNet-Lite dir>/cache/nnet/CornerNet_Squeeze/` and the CornerNet model under `<CornerNet-Lite dir>/cache/nnet/CornerNet/`. (Note that the directory names for CornerNet-Saccade and CornerNet-Squeeze use underscores instead of dashes.)
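A model placed in the wrong directory typically surfaces only later as a missing-file error, so a quick sanity check can help. The sketch below is our own convenience, not part of the repo; the expected subdirectory names come from the paths above, and it only checks that each directory exists and is non-empty.

```python
from pathlib import Path

def check_model_cache(repo_dir):
    """Report which of the expected model cache directories are populated."""
    expected = ["CornerNet_Saccade", "CornerNet_Squeeze", "CornerNet"]
    status = {}
    for name in expected:
        model_dir = Path(repo_dir) / "cache" / "nnet" / name
        # A directory counts as populated if it exists and contains anything.
        status[name] = model_dir.is_dir() and any(model_dir.iterdir())
    return status
```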

Note: The CornerNet model is the same as the one in the original [CornerNet repo](https://github.com/princeton-vl/CornerNet); we simply ported it to this new repo.

### Running the Demo Script
After downloading the models, you should be able to use the detectors on your own images. We provide a demo script, `demo.py`, to test whether the repo is installed correctly.
```
python demo.py
```
This script applies CornerNet-Saccade to `demo.jpg` and writes the results to `demo_out.jpg`.

The default detector in the demo script is CornerNet-Saccade. You can modify the script to test a different detector. For example, to test CornerNet-Squeeze:
```python
#!/usr/bin/env python

import cv2
from core.detectors import CornerNet_Squeeze
from core.vis_utils import draw_bboxes

detector = CornerNet_Squeeze()
image = cv2.imread("demo.jpg")

bboxes = detector(image)
image = draw_bboxes(image, bboxes)
cv2.imwrite("demo_out.jpg", image)
```

### Using CornerNet-Lite in Your Project
It is also easy to use CornerNet-Lite in your project. You will need to rename the directory from `CornerNet-Lite` to `CornerNet_Lite`; otherwise, you won't be able to import CornerNet-Lite, since dashes are not valid in Python module names.
```
Your project
│   README.md
│   ...
│   foo.py
│
└───CornerNet_Lite
│
└───directory1
│
└───...
```

In `foo.py`, you can easily import CornerNet-Saccade by adding:
```python
import cv2
from CornerNet_Lite import CornerNet_Saccade

def foo():
    cornernet = CornerNet_Saccade()
    # CornerNet_Saccade is ready to use

    image = cv2.imread('/path/to/your/image')
    bboxes = cornernet(image)
```

If you want to train or evaluate the detectors on COCO, please move on to the following steps.

## Training and Evaluation

### Installing MS COCO APIs
```
mkdir -p <CornerNet-Lite dir>/data
cd <CornerNet-Lite dir>/data
git clone git@github.com:cocodataset/cocoapi.git coco
cd <CornerNet-Lite dir>/data/coco/PythonAPI
make install
```

### Downloading MS COCO Data
- Download the training/validation split we use in our paper from [here](https://drive.google.com/file/d/1dop4188xo5lXDkGtOZUzy2SHOD_COXz4/view?usp=sharing) (originally from [Faster R-CNN](https://github.com/rbgirshick/py-faster-rcnn/tree/master/data))
- Unzip the file and place `annotations` under `<CornerNet-Lite dir>/data/coco`
- Download the images (2014 Train, 2014 Val, 2017 Test) from [here](http://cocodataset.org/#download)
- Create 3 directories, `trainval2014`, `minival2014` and `testdev2017`, under `<CornerNet-Lite dir>/data/coco/images/`
- Copy the training/validation/testing images to the corresponding directories according to the annotation files
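The last step above can be scripted. The sketch below relies only on the standard COCO annotation format (a top-level `"images"` list whose entries carry a `"file_name"` field), not on this repo's code, and the concrete annotation filenames and directory names you pass in are up to your local layout:

```python
import json
import shutil
from pathlib import Path

def copy_split_images(annotation_file, source_dir, target_dir):
    """Copy every image named in a COCO-style annotation file into target_dir."""
    with open(annotation_file) as f:
        annotations = json.load(f)

    target = Path(target_dir)
    target.mkdir(parents=True, exist_ok=True)

    copied = 0
    for image_info in annotations["images"]:
        src = Path(source_dir) / image_info["file_name"]
        # Skip entries whose image file is not present locally.
        if src.exists():
            shutil.copy(src, target / src.name)
            copied += 1
    return copied
```

For example, something like `copy_split_images("annotations/instances_minival2014.json", "val2014", "images/minival2014")` would populate the `minival2014` directory (the paths here are illustrative).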

To train and evaluate a network, you will need to create a configuration file, which defines the hyperparameters, and a model file, which defines the network architecture. The configuration file should be in JSON format and placed in `<CornerNet-Lite dir>/configs/`. Each configuration file should have a corresponding model file in `<CornerNet-Lite dir>/core/models/`; i.e., if there is a `<model>.json` in `<CornerNet-Lite dir>/configs/`, there should be a `<model>.py` in `<CornerNet-Lite dir>/core/models/`. There is only one exception, which we will mention later.
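This naming convention can be verified with a short script. The sketch below is our own convenience, not part of the repo; it lists any config file lacking a matching model file, resolving suffixed configs such as `<model>-<suffix>.json` (the exception mentioned above) to their base name first.

```python
from pathlib import Path

def unmatched_configs(configs_dir, models_dir):
    """Return stems of config files that have no corresponding model file."""
    missing = []
    for config in Path(configs_dir).glob("*.json"):
        # A suffixed config like CornerNet-multi_scale.json maps to CornerNet.py.
        base = config.stem.split("-")[0]
        if not (Path(models_dir) / f"{base}.py").exists():
            missing.append(config.stem)
    return missing
```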

### Training and Evaluating a Model
To train a model:
```
python train.py <model>
```

We provide the configuration files and the model files for CornerNet-Saccade, CornerNet-Squeeze and CornerNet in this repo. Please check the configuration files in `<CornerNet-Lite dir>/configs/`.

To train CornerNet-Saccade:
```
python train.py CornerNet_Saccade
```
Please adjust the batch size in `CornerNet_Saccade.json` to accommodate the number of GPUs that are available to you.
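As a sketch of that adjustment, the snippet below scales the batch size to the available GPUs and splits it into per-GPU chunks. The `system`, `batch_size` and `chunk_sizes` key names are assumptions about the config layout, so check them against the actual `CornerNet_Saccade.json` before using this.

```python
import json

def scale_batch_size(config_path, num_gpus, per_gpu_batch):
    """Rewrite batch_size and chunk_sizes in a config for the given GPU count."""
    with open(config_path) as f:
        config = json.load(f)

    # Assumed layout: training settings live under a "system" section.
    system = config["system"]
    system["batch_size"] = num_gpus * per_gpu_batch
    # One chunk of work per GPU.
    system["chunk_sizes"] = [per_gpu_batch] * num_gpus

    with open(config_path, "w") as f:
        json.dump(config, f, indent=4)
```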

To evaluate the trained model:
```
python evaluate.py CornerNet_Saccade --testiter 500000 --split <split>
```

If you want to test different hyperparameters during evaluation without overwriting the original configuration file, you can create a configuration file with a suffix (`<model>-<suffix>.json`). There is no need to create a `<model>-<suffix>.py` in `<CornerNet-Lite dir>/core/models/`.

To use the new configuration file:
```
python evaluate.py <model> --testiter <iter> --split <split> --suffix <suffix>
```

We also include a configuration file for CornerNet under the multi-scale setting, `CornerNet-multi_scale.json`, in this repo.

To use the multi-scale configuration file:
```
python evaluate.py CornerNet --testiter <iter> --split <split> --suffix multi_scale
```