update docs
Summary: Pull Request resolved: fairinternal/detectron2#499

Reviewed By: alexander-kirillov

Differential Revision: D25936077

Pulled By: ppwwyyxx

fbshipit-source-id: 7d9417cb464f48950d8fac91259e88686fcef9f6
ppwwyyxx authored and facebook-github-bot committed Jan 16, 2021
1 parent 5d891eb commit 05573d7
Showing 18 changed files with 123 additions and 72 deletions.
2 changes: 1 addition & 1 deletion .github/ISSUE_TEMPLATE/bugs.md
@@ -14,7 +14,7 @@ git rev-parse HEAD; git diff
<put code or diff here>
```
2. What exact command you run:
-3. __Full logs__ you observed:
+3. __Full logs__ or other relevant observations:
```
<put logs here>
```
3 changes: 3 additions & 0 deletions .github/ISSUE_TEMPLATE/config.yml
@@ -2,6 +2,9 @@
blank_issues_enabled: false

contact_links:
- name: How-To / Other Questions
url: https://github.com/facebookresearch/detectron2/discussions
about: Use discussions for more general questions and support
- name: Detectron2 Documentation
url: https://detectron2.readthedocs.io/index.html
about: Check if your question is answered in docs
Empty file.
30 changes: 0 additions & 30 deletions .github/ISSUE_TEMPLATE/questions-help-support.md

This file was deleted.

7 changes: 3 additions & 4 deletions .github/ISSUE_TEMPLATE/unexpected-problems-bugs.md
@@ -1,12 +1,11 @@
---
-name: "Unexpected behaviors"
+name: "😩 Unexpected behaviors"
about: Run into unexpected behaviors when using detectron2
title: Please read & provide the following

---

-If you do not know the root cause of the problem, and wish someone to help you, please
-post according to this template:
+If you do not know the root cause of the problem, please post according to this template:

## Instructions To Reproduce the Issue:

@@ -21,7 +20,7 @@ git rev-parse HEAD; git diff
<put code or diff here>
```
2. What exact command you run:
-3. __Full logs__ you observed:
+3. __Full logs__ or other relevant observations:
```
<put logs here>
```
2 changes: 1 addition & 1 deletion .github/workflows/check-template.yml
@@ -39,7 +39,7 @@ jobs:
return;
}
-let message = "You've chosen to report an unexpected problem or bug. Please include details about it by filling the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).\n";
+let message = "You've chosen to report an unexpected problem or bug. Unless you already know the root cause of it, please include details about it by filling the [issue template](https://github.com/facebookresearch/detectron2/issues/new/choose).\n";
message += "The following information is missing: ";
if (!hasInstructions) {
message += "\"Instructions To Reproduce the Issue and Full Logs\"; ";
79 changes: 74 additions & 5 deletions .github/workflows/needs-reply.yml
@@ -7,14 +7,83 @@ on:
jobs:
close-issues-needs-more-info:
runs-on: ubuntu-latest
+if: ${{ github.repository_owner == 'facebookresearch' }}
steps:
- name: Close old issues that need reply
-uses: dwieeb/needs-reply@v2
+uses: actions/github-script@v3
with:
-repo-token: ${{ secrets.GITHUB_TOKEN }}
-issue-label: needs-more-info
-days-before-close: 7
-close-message: Requested information was not provided in 7 days, so we're closing this issue.
+github-token: ${{secrets.GITHUB_TOKEN}}
# Modified from https://github.com/dwieeb/needs-reply
script: |
// Arguments available:
// - github: A pre-authenticated octokit/rest.js client
// - context: An object containing the context of the workflow run
// - core: A reference to the @actions/core package
// - io: A reference to the @actions/io package
const kLabelToCheck = "needs-more-info";
const kInvalidLabel = "invalid/unrelated";
const kDaysBeforeClose = 7;
const kMessage = "Requested information was not provided in 7 days, so we're closing this issue."
issues = await github.issues.listForRepo({
owner: context.repo.owner,
repo: context.repo.repo,
state: 'open',
labels: kLabelToCheck,
sort: 'updated',
direction: 'asc',
per_page: 30,
page: 1,
});
issues = issues.data;
if (issues.length === 0) {
core.info('No more issues found to process. Exiting.');
return;
}
for (const issue of issues) {
if (!!issue.pull_request)
continue;
core.info(`Processing issue #${issue.number}`);
let updatedAt = new Date(issue.updated_at).getTime();
const numComments = issue.comments;
const comments = await github.issues.listComments({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
per_page: 30,
page: Math.floor((numComments - 1) / 30) + 1, // the last page
});
const lastComments = comments.data
.map(l => new Date(l.created_at).getTime())
.sort();
if (lastComments.length > 0) {
updatedAt = lastComments[lastComments.length - 1];
}
const now = new Date().getTime();
const daysSinceUpdated = (now - updatedAt) / 1000 / 60 / 60 / 24;
if (daysSinceUpdated < kDaysBeforeClose) {
core.info(`Skipping #${issue.number} because it has been updated in the last ${daysSinceUpdated} days`);
continue;
}
core.info(`Closing #${issue.number} because it has not been updated in the last ${daysSinceUpdated} days`);
await github.issues.createComment({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
body: kMessage,
});
const newLabels = numComments <= 2 ? [kInvalidLabel, kLabelToCheck] : issue.labels;
await github.issues.update({
owner: context.repo.owner,
repo: context.repo.repo,
issue_number: issue.number,
labels: newLabels,
state: 'closed',
});
}
lock-issues-after-closed:
runs-on: ubuntu-latest
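The new script's paging and staleness checks boil down to two small computations. They can be sketched in plain Python (function names and constants here are ours, mirroring the script's arithmetic):

```python
from datetime import datetime, timedelta, timezone

PER_PAGE = 30          # comments fetched per API page, as in the script
DAYS_BEFORE_CLOSE = 7  # kDaysBeforeClose in the script

def last_comment_page(num_comments: int, per_page: int = PER_PAGE) -> int:
    """The most recent comment lives on the last page of a paginated list."""
    return (num_comments - 1) // per_page + 1

def is_stale(last_activity: datetime, now: datetime,
             days: int = DAYS_BEFORE_CLOSE) -> bool:
    """True when the issue has seen no activity for at least `days` days."""
    return (now - last_activity).total_seconds() / 86400 >= days
```

For example, an issue with 31 comments has its newest comment on page 2, and one last touched 8 days ago qualifies for closing under the 7-day rule.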
2 changes: 1 addition & 1 deletion .github/workflows/remove-needs-reply.yml
@@ -10,7 +10,7 @@ jobs:
remove-needs-more-info-label:
runs-on: ubuntu-latest
# 1. issue_comment events could include PR comment, filter them out
-# 2. Only trigger action if even was produced by the original author
+# 2. Only trigger action if event was produced by the original author
if: ${{ !github.event.issue.pull_request && github.event.sender.login == github.event.issue.user.login }}
steps:
- name: Remove needs-more-info label
1 change: 1 addition & 0 deletions .gitignore
@@ -16,6 +16,7 @@ _ext
*.pyc
*.pyd
*.so
*.dll
*.egg-info/
build/
dist/
2 changes: 0 additions & 2 deletions GETTING_STARTED.md
@@ -7,8 +7,6 @@ see our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ9
which covers how to run inference with an
existing model, and how to train a builtin model on a custom dataset.

For more advanced tutorials, refer to our [documentation](https://detectron2.readthedocs.io/tutorials/extend.html).


### Inference Demo with Pre-trained Models

18 changes: 11 additions & 7 deletions INSTALL.md
@@ -1,15 +1,10 @@
## Installation

-Our [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5)
-has step-by-step instructions that install detectron2.
-The [Dockerfile](docker)
-also installs detectron2 with a few simple commands.
-
### Requirements
- Linux or macOS with Python ≥ 3.6
- PyTorch ≥ 1.5 and [torchvision](https://github.com/pytorch/vision/) that matches the PyTorch installation.
-You can install them together at [pytorch.org](https://pytorch.org) to make sure of this
-- OpenCV is optional and needed by demo and visualization
+Install them together at [pytorch.org](https://pytorch.org) to make sure of this
+- OpenCV is optional but needed by demo and visualization


### Build Detectron2 from Source
@@ -241,3 +236,12 @@ The ONNX package is compiled with a too old compiler.
Please build and install ONNX from its source code using a compiler
whose version is closer to what's used by PyTorch (available in `torch.__config__.show()`).
</details>


### Installation inside specific environments:

* __Colab__: see our [Colab Tutorial](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5)
which has step-by-step instructions.

* __Docker__: The official [Dockerfile](docker) installs detectron2 with a few simple commands.

12 changes: 8 additions & 4 deletions README.md
@@ -17,7 +17,7 @@ and it originates from [maskrcnn-benchmark](https://github.com/facebookresearch/
* Can be used as a library to support [different projects](projects/) on top of it.
We'll open source more research projects in this way.
* It [trains much faster](https://detectron2.readthedocs.io/notes/benchmarks.html).
-* Models can be exported to torchscript format or caffe2 format for deployment.
+* Models can be exported to TorchScript format or Caffe2 format for deployment.

See our [blog post](https://ai.facebook.com/blog/-detectron2-a-pytorch-based-modular-object-detection-library-/)
to see more demos and learn about detectron2.
@@ -26,10 +26,14 @@ to see more demos and learn about detectron2.

See [INSTALL.md](INSTALL.md).

-## Quick Start
+## Getting Started

-See [GETTING_STARTED.md](GETTING_STARTED.md),
-or the [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5).
+Follow the [installation instructions](https://detectron2.readthedocs.io/tutorials/install.html) to
+install detectron2.
+
+See [Getting Started with Detectron2](https://detectron2.readthedocs.io/tutorials/getting_started.html),
+and the [Colab Notebook](https://colab.research.google.com/drive/16jcaJoc6bCFAQ96jDe2HwtXj7BMD_-m5)
+to learn about basic usage.

Learn more at our [documentation](https://detectron2.readthedocs.org).
And see [projects/](projects/) for some projects that are built on top of detectron2.
8 changes: 4 additions & 4 deletions datasets/README.md
@@ -38,7 +38,7 @@ coco/
You can use the 2014 version of the dataset as well.

Some of the builtin tests (`dev/run_*_tests.sh`) uses a tiny version of the COCO dataset,
-which you can download with `./prepare_for_tests.sh`.
+which you can download with `./datasets/prepare_for_tests.sh`.

## Expected dataset structure for PanopticFPN:

@@ -56,7 +56,7 @@ Install panopticapi by:
```
pip install git+https://github.com/cocodataset/panopticapi.git
```
-Then, run `python prepare_panoptic_fpn.py`, to extract semantic annotations from panoptic annotations.
+Then, run `python datasets/prepare_panoptic_fpn.py`, to extract semantic annotations from panoptic annotations.

## Expected dataset structure for [LVIS instance segmentation](https://www.lvisdataset.org/dataset):
```
@@ -75,7 +75,7 @@ pip install git+https://github.com/lvis-dataset/lvis-api.git
```

To evaluate models trained on the COCO dataset using LVIS annotations,
-run `python prepare_cocofied_lvis.py` to prepare "cocofied" LVIS annotations.
+run `python datasets/prepare_cocofied_lvis.py` to prepare "cocofied" LVIS annotations.

## Expected dataset structure for [cityscapes](https://www.cityscapes-dataset.com/downloads/):
```
@@ -137,4 +137,4 @@ ADEChallengeData2016/
images/
objectInfo150.txt
```
-The directory `annotations_detectron2` is generated by running `python prepare_ade20k_sem_seg.py`.
+The directory `annotations_detectron2` is generated by running `python datasets/prepare_ade20k_sem_seg.py`.
10 changes: 6 additions & 4 deletions detectron2/evaluation/coco_evaluation.py
@@ -33,6 +33,8 @@ class COCOEvaluator(DatasetEvaluator):
for keypoint detection outputs using COCO's metrics.
See http://cocodataset.org/#detection-eval and
http://cocodataset.org/#keypoints-eval to understand its metrics.
The metrics range from 0 to 100 (instead of 0 to 1), where a -1 or NaN means
the metric cannot be computed (e.g. due to no predictions made).
In addition to COCO, this evaluator is able to support any bounding box detection,
instance segmentation, or keypoint detection dataset.
@@ -66,10 +68,9 @@ def __init__(
output_dir (str): optional, an output directory to dump all
results predicted on the dataset. The dump contains two files:
-1. "instances_predictions.pth" a file in torch serialization
-format that contains all the raw original predictions.
-2. "coco_instances_results.json" a json file in COCO's result
-format.
+1. "instances_predictions.pth" a file that can be loaded with `torch.load` and
+contains all the results in the format they are produced by the model.
+2. "coco_instances_results.json" a json file in COCO's result format.
use_fast_impl (bool): use a fast but **unofficial** implementation to compute AP.
Although the results should be very close to the official implementation in COCO
API, it is still recommended to compute results with the official API for use in
@@ -229,6 +230,7 @@ def _eval_predictions(self, predictions, img_ids=None):
)
)
for task in sorted(tasks):
assert task in {"bbox", "segm", "keypoints"}, f"Got unknown task: {task}!"
coco_eval = (
_evaluate_predictions_on_coco(
self._coco_api,
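The second output file mentioned in the revised docstring, "coco_instances_results.json", is a flat list of detections in COCO's result format. A minimal stdlib-only sketch of writing and reading such a file (the detection values here are made up for illustration):

```python
import json
import os
import tempfile

# One illustrative detection in COCO's result format; COCOEvaluator dumps
# a flat list like this to "coco_instances_results.json" in output_dir.
results = [
    {"image_id": 1, "category_id": 3, "bbox": [10.0, 20.0, 50.0, 30.0], "score": 0.92},
]

path = os.path.join(tempfile.mkdtemp(), "coco_instances_results.json")
with open(path, "w") as f:
    json.dump(results, f)

with open(path) as f:
    loaded = json.load(f)

# COCO boxes are XYWH; convert to XYXY when another tool expects corners.
x, y, w, h = loaded[0]["bbox"]
xyxy = [x, y, x + w, y + h]  # → [10.0, 20.0, 60.0, 50.0]
```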
7 changes: 4 additions & 3 deletions docs/tutorials/augmentation.md
@@ -98,14 +98,15 @@ New transform operation can also be added by subclassing

We give a few examples of advanced usages that
are enabled by our system.
-These options are interesting to explore, although changing them is often not needed
-for common use cases.
+These options can be interesting to new research,
+although changing them is often not needed
+for standard use cases.

### Custom transform strategy

Instead of only returning the augmented data, detectron2's `Augmentation` returns the __operations__ as `T.Transform`.
This allows users to apply custom transform strategy on their data.
-We use keypoints as an example.
+We use keypoints data as an example.

Keypoints are (x, y) coordinates, but they are not so trivial to augment due to the semantic meaning they carry.
Such meaning is only known to the users, therefore users may want to augment them manually
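The pattern the tutorial describes, an augmentation returning a transform object whose `apply_coords` the user calls on keypoints, can be sketched without detectron2. The class below is our own stand-in, not detectron2's `T.HFlipTransform`, but it mirrors the `apply_coords` contract:

```python
class HFlipLike:
    """Minimal stand-in for a horizontal-flip T.Transform (illustrative only)."""

    def __init__(self, width: int):
        self.width = width

    def apply_coords(self, coords):
        # coords: iterable of (x, y) points; flip x about the image width.
        return [(self.width - x, y) for x, y in coords]

t = HFlipLike(width=100)
keypoints = [(10, 5), (90, 40)]
flipped = t.apply_coords(keypoints)  # → [(90, 5), (10, 40)]
```

Note that only the coordinates are transformed here; semantic fix-ups (e.g. swapping "left eye" and "right eye" indices after a horizontal flip) remain the user's responsibility, which is exactly why the transform object is exposed.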
2 changes: 1 addition & 1 deletion docs/tutorials/datasets.md
@@ -74,7 +74,7 @@ and the required fields vary based on what the dataloader or the task needs (see
- pan_seg_file_name, segments_info
```

-+ `file_name`: the full path to the image file. Rotation or flipping may be applied if the image has EXIF metadata.
++ `file_name`: the full path to the image file.
+ `height`, `width`: integer. The shape of the image.
+ `image_id` (str or int): a unique id that identifies this image. Required by many
evaluators to identify the images, but a dataset may use it for different purposes.
8 changes: 4 additions & 4 deletions docs/tutorials/deployment.md
@@ -1,18 +1,18 @@
# Deployment

-Models written in Python needs to go through an export process to become a deployable artifact.
+Models written in Python need to go through an export process to become a deployable artifact.
A few basic concepts about this process:

-__"Export method"__ is how a Python model is turned into a serialized graph.
+__"Export method"__ is how a Python model is fully serialized to a deployable format.
We support the following export methods:

* `tracing`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) for details.
* `scripting`: see [pytorch documentation](https://pytorch.org/tutorials/beginner/Intro_to_TorchScript_tutorial.html) for details.
* `caffe2_tracing`: replace parts of the model by caffe2 operators, then use tracing.

-__"Format"__ is how a serialized graph is described in a file, e.g.
+__"Format"__ is how a serialized model is described in a file, e.g.
TorchScript, Caffe2 protobuf, ONNX format.
-__"Runtime"__ is an engine that loads a serialized graph and executes it,
+__"Runtime"__ is an engine that loads a serialized model and executes it,
e.g., PyTorch, Caffe2, TensorFlow, onnxruntime, TensorRT, etc.
A runtime is often tied to a specific format
(e.g. PyTorch needs TorchScript format, Caffe2 needs protobuf format).
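The `tracing` and `scripting` export methods named above are PyTorch's standard TorchScript entry points. A toy sketch of both (the model here is a made-up stand-in, not a detectron2 model, which needs more care to export):

```python
import torch
import torch.nn as nn

class TinyModel(nn.Module):
    # Toy module standing in for a real detection model (illustrative only).
    def forward(self, x):
        return x * 2 + 1

model = TinyModel().eval()
example = torch.randn(4)

traced = torch.jit.trace(model, example)  # "tracing": record ops on an example input
scripted = torch.jit.script(model)        # "scripting": compile the Python source

# Both yield TorchScript modules; a Python-free runtime loads them with torch.jit.load.
x = torch.tensor([1.0, 2.0])
assert torch.allclose(traced(x), model(x))
assert torch.allclose(scripted(x), model(x))
```

Tracing only records the path taken for the example input, so data-dependent control flow needs scripting; this is one reason detectron2 also offers `caffe2_tracing` for models tracing alone cannot capture.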
2 changes: 1 addition & 1 deletion docs/tutorials/extend.md
@@ -33,7 +33,7 @@ In detectron2, there are two types of interfaces that address this tension toget
When you need to implement something not supported by the "standard defaults"
included in detectron2, these well-defined components can be reused.

-3. (experimental) A few classes are implemented with the
+3. A few classes are implemented with the
[@configurable](../../modules/config.html#detectron2.config.configurable)
decorator - they can be called with either a config, or with explicit arguments.
Their explicit argument interfaces are currently __experimental__ and subject to change.
