Commit aafd904

Merge pull request #475 from Unity-Technologies/docs_fix_ruiyu
Fix documentation and images
2 parents 7db8c9a + 32f964f commit aafd904

114 files changed: +549, -87 lines

Note: large commits have some content hidden by default, so not all 114 changed files are shown below.

com.unity.perception/Documentation~/FAQ/FAQ.md

Lines changed: 10 additions & 10 deletions
@@ -42,7 +42,7 @@ Keep in mind that any new label added with this method should already be present
 Labeling works on the GameObject level, so to achieve the scenarios described here, you will need to break down your main object into multiple GameObjects parented to the same root object, and add `Labeling` components to each of the inner objects, as shown below.

 <p align="center">
-<img src="images/inner_objects.png" width="800"/>
+<img src="../images/FAQ/images/inner_objects.png" width="800"/>
 </p>

 Alternatively, in cases where parts of the surface of the object need to be labeled (e.g. decals on objects), you can add labeled invisible surfaces on top of these sections. These invisible surfaces need to have a fully transparent material. To create an invisible material:
@@ -54,7 +54,7 @@ Keep in mind that any new label added with this method should already be present
 An example labeled output for an object with separate labels on inner objects and decals is shown below:

 <p align="center">
-<img src="images/inner_labels.gif" width="600"/>
+<img src="../images/FAQ/images/inner_labels.gif" width="600"/>
 </p>

 ---
@@ -118,7 +118,7 @@ Most human character models use Skinned Mesh Renderers. Unfortunately, at this t
 The ***Inspector*** view of a prefab cluster asset looks like below:

 <p align="center">
-<img src="images/prefab_cluster.png" width="400"/>
+<img src="../images/FAQ/images/prefab_cluster.png" width="400"/>
 </p>

 Now all that is left is to use our prefab clusters inside a Randomizer. Here is some sample code:
@@ -146,7 +146,7 @@ public class ClusterRandomizer : UnityEngine.Perception.Randomization.Randomizer
 This Randomizer takes a list of `PrefabCluster` assets, then, on each Iteration, it goes through all the provided clusters and samples one prefab from each. The ***Inspector*** view for this Randomizer looks like this:

 <p align="center">
-<img src="images/cluster_randomizer.png" width="400"/>
+<img src="../images/FAQ/images/cluster_randomizer.png" width="400"/>
 </p>

 ---
@@ -426,7 +426,7 @@ Suppose we need to drop a few objects into the Scene, let them interact physical


 <p align="center">
-<img src="images/object_drop.gif" width="700"/>
+<img src="../images/FAQ/images/object_drop.gif" width="700"/>
 </p>

 ---
@@ -494,7 +494,7 @@ HDRP projects have motion blur and a number of other post processing effects ena


 <p align="center">
-<img src="images/volume.png" width="500"/>
+<img src="../images/FAQ/images/volume.png" width="500"/>
 </p>

 ---
@@ -573,25 +573,25 @@ A visual comparison of the different lighting configurations in HDRP is shown be
 Default HDRP:

 <p align="center">
-<img src="images/hdrp.png" width="700"/>
+<img src="../images/FAQ/images/hdrp.png" width="700"/>
 </p>

 HDRP with Global Illumination (notice how much brighter the scene is with ray traced light bouncing):

 <p align="center">
-<img src="images/hdrp_rt_gi.png" width="700"/>
+<img src="../images/FAQ/images/hdrp_rt_gi.png" width="700"/>
 </p>

 HDRP with Path Tracing (128 samples) (notice the red light bleeding from the cube onto the floor and the increased shadow quality):

 <p align="center">
-<img src="images/hdrp_pt_128_samples.png" width="700"/>
+<img src="../images/FAQ/images/hdrp_pt_128_samples.png" width="700"/>
 </p>

 HDRP with Path Tracing (4096 samples) (more samples leads to less ray tracing noise but also a longer time to render):

 <p align="center">
-<img src="images/hdrp_pt_4096_samples.png" width="700"/>
+<img src="../images/FAQ/images/hdrp_pt_4096_samples.png" width="700"/>
 </p>

 ---
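The FAQ context lines in this diff describe a `ClusterRandomizer` that, on each Iteration, samples one prefab from each provided cluster. The actual sample code is Unity C# and lives in the FAQ file being edited; purely to illustrate the sampling logic it describes, here is a minimal Python sketch, with each cluster modeled as a plain list of prefab names (a stand-in for the `PrefabCluster` asset):

```python
import random

def sample_one_per_cluster(clusters, rng=random):
    """Pick exactly one member from each cluster, mirroring the
    per-Iteration behavior the FAQ describes for ClusterRandomizer.
    Each cluster is modeled here as a plain list of prefab names."""
    return [rng.choice(cluster) for cluster in clusters]

# e.g. three clusters of related prefabs (hypothetical names)
clusters = [["mug_a", "mug_b"], ["plate"], ["fork", "spoon", "knife"]]
picks = sample_one_per_cluster(clusters)  # one prefab per cluster
```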

com.unity.perception/Documentation~/HPTutorial/TUTORIAL.md

Lines changed: 23 additions & 22 deletions
@@ -8,12 +8,13 @@ In this tutorial, **":green_circle: Action:"** mark all of the actions needed to

 Steps included in this tutorial:

-* [Step 1: Import `.fbx` Models and Animations](#step-1)
-* [Step 2: Set Up a Humanoid Character in a Scene](#step-2)
-* [Step 3: Set Up the Perception Camera for Keypoint Annotation](#step-3)
-* [Step 4: Configure Animation Pose Labeling](#step-4)
-* [Step 5: Add Joints to the Character and Customize Keypoint Templates](#step-5)
-* [Step 6: Randomize the Humanoid Character's Animations](#step-6)
+- [Human Pose Labeling and Randomization Tutorial](#human-pose-labeling-and-randomization-tutorial)
+- [<a name="step-1">Step 1: Import `.fbx` Models and Animations</a>](#step-1-import-fbx-models-and-animations)
+- [<a name="step-2">Step 2: Set Up a Humanoid Character in a Scene</a>](#step-2-set-up-a-humanoid-character-in-a-scene)
+- [<a name="step-3">Step 3: Set Up the Perception Camera for Keypoint Annotation</a>](#step-3-set-up-the-perception-camera-for-keypoint-annotation)
+- [<a name="step-4">Step 4: Configure Animation Pose Labeling</a>](#step-4-configure-animation-pose-labeling)
+- [<a name="step-5">Step 5: Add Joints to the Character and Customize Keypoint Templates</a>](#step-5-add-joints-to-the-character-and-customize-keypoint-templates)
+- [<a name="step-6">Step 6: Randomize the Humanoid Character's Animations</a>](#step-6-randomize-the-humanoid-characters-animations)

 > :information_source: If you face any problems while following this tutorial, please create a post on the **[Unity Computer Vision forum](https://forum.unity.com/forums/computer-vision.626/)** or the **[GitHub issues](https://github.com/Unity-Technologies/com.unity.perception/issues)** page and include as much detail as possible.

@@ -32,7 +33,7 @@ We will use this duplicated Scene in this tutorial so that we do not lose our gr
 Your Scenario should now look like this:

 <p align="center">
-<img src="Images/scenario_empty.png" width="400"/>
+<img src="../images/HPTutorial/Images/scenario_empty.png" width="400"/>
 </p>

 * **:green_circle: Action**: Select `Main Camera` and in the _**Inspector**_ view of the `Perception Camera` component, **disable** all previously added labelers using the check-mark in front of each. We will be using a new labeler in this tutorial.
@@ -45,7 +46,7 @@ We now need to import the sample files required for this tutorial.
 Once the sample files are imported, they will be placed inside the `Assets/Samples/Perception` folder in your Unity project, as seen in the image below:

 <p align="center">
-<img src="Images/project_folders_samples.png" width="600"/>
+<img src="../images/HPTutorial/Images/project_folders_samples.png" width="600"/>
 </p>

 * **:green_circle: Action**: Select all of the assets inside the `Assets/Samples/Perception/<perception-package-version>/Human Pose Labeling and Randomization/Models and Animations`.
@@ -61,7 +62,7 @@ Note how `Animation Type` is set to `Humanoid` for all selected assets. This is
 * **:green_circle: Action**: Select the new `Player` object in the Scene and in the _**Inspector**_ tab set its transform's position and rotation according to the image below to make the character face the camera.

 <p align="center">
-<img src="Images/character_transform.png" width="800"/>
+<img src="../images/HPTutorial/Images/character_transform.png" width="800"/>
 </p>

 The `Player` object already has an `Animator` component attached. This is because the `Animation Type` property of all the sample `.fbx` files is set to `Humanoid`.
@@ -71,7 +72,7 @@ To animate our character, we will now attach an `Animation Controller` to the `A
 * **:green_circle: Action**: Double click the new controller to open it. Then right-click in the empty area and select _**Create State**_ -> _**Empty**_.

 <p align="center">
-<img src="Images/anim_controller_1.png" width="600"/>
+<img src="../images/HPTutorial/Images/anim_controller_1.png" width="600"/>
 </p>

 This will create a new state and attach it to the Entry state with a new transition edge. This means the controller will always move to this new state as soon as the `Animator` component is awoken. In this example, this will happen when the **** button is pressed and the simulation starts.
@@ -83,19 +84,19 @@ In the selector window that pops up, you will see several clips named `Take 001`
 * **:green_circle: Action**: Select the animation clip originating from the `TakeObjects.fbx` file, as seen below:

 <p align="center">
-<img src="Images/select_clip.png" width="600"/>
+<img src="../images/HPTutorial/Images/select_clip.png" width="600"/>
 </p>

 * **:green_circle: Action**: Assign `TestAnimationController` to the `Controller` property of the `Player` object's `Animator` component.

 <p align="center">
-<img src="Images/assign_controller.png" width="400"/>
+<img src="../images/HPTutorial/Images/assign_controller.png" width="400"/>
 </p>

 If you run the simulation now you will see the character performing an animation for picking up a hypothetical object as seen in the GIF below.

 <p align="center">
-<img src="Images/take_objects.gif" width="600"/>
+<img src="../images/HPTutorial/Images/take_objects.gif" width="600"/>
 </p>

@@ -116,7 +117,7 @@ Similar to the labelers we used in the Perception Tutorial, we will need a label
 * **:green_circle: Action**: In the _**Inspector**_ UI for this new `Labeling` component, expand `HPE_IdLabelConfig` and click _**Add to Labels**_ on `MyCharacter`.

 <p align="center">
-<img src="Images/add_label_from_config.png" width="400"/>
+<img src="../HPTutorial/Images/add_label_from_config.png" width="400"/>
 </p>

 * **:green_circle: Action**: Return to `Perception Camera` and assign `HPE_IdLabelConfig` to the `KeyPointLabeler`'s label configuration property.
@@ -125,14 +126,14 @@ Similar to the labelers we used in the Perception Tutorial, we will need a label
 The labeler should now look like the image below:

 <p align="center">
-<img src="Images/keypoint_labeler.png" width="500"/>
+<img src="../HPTutorial/Images/keypoint_labeler.png" width="500"/>
 </p>


 The `Active Template` tells the labeler how to map default Unity rig joints to human joint labels in the popular COCO dataset so that the output of the labeler can be easily converted to COCO format. Later in this tutorial, we will learn how to add more joints to our character and how to customize joint mapping templates.

 <p align="center">
-<img src="Images/take_objects_keypoints.gif" width="600"/>
+<img src="../images/HPTutorial/Images/take_objects_keypoints.gif" width="600"/>
 </p>

 You can now check out the output dataset to see what the annotations look like. To do this, click the _**Show Folder**_ button in the `Perception Camera` UI, then navigate inside to the dataset folder to find the `captures_000.json` file. Here is an example annotation for the first frame of our test-case here:
@@ -276,15 +277,15 @@ You can now use the `Timestamps` list to define poses. Let's define four poses h
 Modify `MyAnimationPoseConfig` according to the image below:

 <p align="center">
-<img src="Images/anim_pos_conf.png" width="800"/>
+<img src="../images/HPTutorial/Images/anim_pos_conf.png" width="800"/>
 </p>

 The pose configuration we created needs to be assigned to our `KeyPointLabeler`. So:

 * **:green_circle: Action**: In the _**Inspector**_ UI for `Perception Camera`, set the `Size` of `Animation Pose Configs` for the `KeyPointLabeler` to 1. Then, assign the `MyAnimationPoseConfig` to the sole slot in the list, as shown below:

 <p align="center">
-<img src="Images/keypoint_labeler_2.png" width="500"/>
+<img src="../images/HPTutorial/Images/keypoint_labeler_2.png" width="500"/>
 </p>

 If you run the simulation again to generate a new dataset, you will see the new poses we defined written in it. All frames that belong to a certain pose will have the pose label attached.
@@ -305,7 +306,7 @@ In the _**Inspector**_ view of `CocoKeypointTemplate`, you will see the list of


 <p align="center">
-<img src="Images/coco_template.png" width="500"/>
+<img src="../images/HPTutorial/Images/coco_template.png" width="500"/>
 </p>

 If you review the list you will see that the `left_ear` and `right_ear` joints are also not associated with the rig.
@@ -317,7 +318,7 @@ We will create our three new joints under the `Head` object.
 * **:green_circle: Action**: Create three new empty GameObjects under `Head` and place them in the proper positions for the character's nose and ears, as seen in the GIF below (make sure the positions are correct in 3D space):

 <p align="center">
-<img src="Images/new_joints.gif" width="600"/>
+<img src="../images/HPTutorial/Images/new_joints.gif" width="600"/>
 </p>

 The final step in this process would be to label these new joints such that they match the labels of their corresponding keypoints in `CocoKeypointTemplate`. For this purpose, we use the `Joint Label` component.
@@ -327,7 +328,7 @@ The final step in this process would be to label these new joints such that they
 If you run the simulation now, you can see the new joints being visualized:

 <p align="center">
-<img src="Images/new_joints_play.gif" width="600"/>
+<img src="../images/HPTutorial/Images/new_joints_play.gif" width="600"/>
 </p>

 You could now look at the latest generated dataset to confirm the new joints are being detected and written.
@@ -347,7 +348,7 @@ The `Animation Randomizer Tag` accepts a list of animation clips. At runtime, th
 If you run the simulation now, your character will randomly perform one of the above four animations, each for 150 frames. This cycle will recur 20 times, which is the total number of Iterations in your Scenario.

 <p align="center">
-<img src="Images/randomized_results.gif" width="600"/>
+<img src="../images/HPTutorial/Images/randomized_results.gif" width="600"/>
 </p>

 > :information_source: The reason the character stops animating at certain points in the above GIF is that the animation clips are not set to loop. Therefore, if the randomly selected timestamp is sufficiently close to the end of the clip, the character will complete the animation and stop animating for the rest of the Iteration.
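The edits in this commit follow one mechanical pattern: relative image paths like `images/foo.png` or `Images/foo.png` inside a doc page are rewritten to resolve against a shared top-level `images/` folder, e.g. `../images/FAQ/images/foo.png` for the FAQ page. As a sketch of how such a bulk rewrite could be scripted (this is illustrative only, not the tool actually used; `rewrite_image_paths` and `doc_dir` are hypothetical names):

```python
import re

def rewrite_image_paths(markdown: str, doc_dir: str) -> str:
    """Rewrite relative src="..." image paths in a doc page so they
    resolve against a shared top-level images/ folder, following the
    pattern in this commit: images/foo.png -> ../images/FAQ/images/foo.png
    when doc_dir == "FAQ". Already-relocated or absolute paths are
    left alone, so the rewrite is idempotent."""
    def relocate(match: re.Match) -> str:
        return f'src="../images/{doc_dir}/{match.group(1)}"'
    # Negative lookahead skips paths that already start with ../ or /.
    return re.sub(r'src="(?!\.\./|/)([^"]+)"', relocate, markdown)
```

Running this over each `Documentation~` page with its directory name would reproduce the `-`/`+` pairs seen in the hunks above.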
