Commit 0df729a - Initial Commit
1 parent d1368ad commit 0df729a

326 files changed: +42018 / -3 lines changed
.gitattributes

Lines changed: 5 additions & 0 deletions
*.mkv filter=lfs diff=lfs merge=lfs -text
*.mp4 filter=lfs diff=lfs merge=lfs -text
*.gz filter=lfs diff=lfs merge=lfs -text
*.tar filter=lfs diff=lfs merge=lfs -text
*.zip filter=lfs diff=lfs merge=lfs -text
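The patterns above mark large media and archive types for Git LFS, so matching files are stored as LFS pointers rather than regular blobs. As a quick illustration of which filenames those glob patterns cover, here is a small sketch using Python's `fnmatch` (plain glob matching only; real attribute resolution is done by `git check-attr`):

```python
from fnmatch import fnmatch

# Patterns from .gitattributes that route files through the Git LFS filter
LFS_PATTERNS = ["*.mkv", "*.mp4", "*.gz", "*.tar", "*.zip"]

def is_lfs_tracked(path: str) -> bool:
    """True if the filename matches any LFS-tracked pattern."""
    return any(fnmatch(path, pat) for pat in LFS_PATTERNS)

print(is_lfs_tracked("demo/sample-input-2.mp4"))  # -> True
print(is_lfs_tracked("README.md"))                # -> False
```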

.gitignore

Lines changed: 157 additions & 0 deletions
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class

# C extensions
*.so

# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
pip-wheel-metadata/
share/python-wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST

# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec

# Installer logs
pip-log.txt
pip-delete-this-directory.txt

# Unit test / coverage reports
htmlcov/
.tox/
.nox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
*.py,cover
.hypothesis/
.pytest_cache/

# Translations
*.mo
*.pot

# Django stuff:
*.log
local_settings.py
db.sqlite3
db.sqlite3-journal

# Flask stuff:
instance/
.webassets-cache

# Scrapy stuff:
.scrapy

# Sphinx documentation
docs/_build/

# PyBuilder
target/

# Jupyter Notebook
.ipynb_checkpoints

# IPython
profile_default/
ipython_config.py

# pyenv
.python-version

# pipenv
# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
# However, in case of collaboration, if having platform-specific dependencies or dependencies
# having no cross-platform support, pipenv may install dependencies that don't work, or not
# install all needed dependencies.
#Pipfile.lock

# PEP 582; used by e.g. github.com/David-OConnor/pyflow
__pypackages__/

# Celery stuff
celerybeat-schedule
celerybeat.pid

# SageMath parsed files
*.sage.py

# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/

# Spyder project settings
.spyderproject
.spyproject

# Rope project settings
.ropeproject

# mkdocs documentation
/site

# mypy
.mypy_cache/
.dmypy.json
dmypy.json

# Pyre type checker
.pyre/
ds-ai-pipeline/src/pipeline.dot
ds-ai-pipeline/src/onnx_model_repo/*
ds-ai-pipeline/src/sinkbranch.dot
ds-ai-pipeline/src/sample-input-2.mp4
ds-ai-pipeline/src/VideoAnalyticsLicensePlateRecognition.mp4
ds-ai-pipeline/src/sample_pgie_config.txt
ds-ai-pipeline/src/CUSTOM_PARSER_EXAMPLES/*
ds-ai-pipeline/src/sinkbranchremove.dot
ds-ai-pipeline/src/pipeline.dot.png
ds-ai-pipeline/src/sinkbranch.dot.png
ds-ai-pipeline/src/sinkbranchremove.dot.png
ds-ai-pipeline/src/unlink_filequeue.dot
ds-ai-pipeline/src/unlink_filequeue.dot.png
ds-ai-pipeline/x86_64/config/
ds-ai-pipeline/arm/config
# Dev and test stuff
business-logic-container/tests/build_and_push.ps1
business-logic-container/tests/build-and-push-dev.ps1
business-logic-container/build-and-push-dev.ps1
ds-ai-pipeline/MyCustomAssets/
ds-ai-pipeline/arm/deployment.dev.template.json
ds-ai-pipeline/arm/deployment.ubs.dev.template.json
ds-ai-pipeline/docker-build/devbuild.ps1
ds-ai-pipeline/src/CustomParsers/models/
ds-ai-pipeline/x86_64/deployment.dev.template.json
rebuild-and-push-test.ps1
rebuild-and-push.ps1
release.ps1

CODE_OF_CONDUCT.md

Lines changed: 9 additions & 0 deletions
# Microsoft Open Source Code of Conduct

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/).

Resources:

- [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/)
- [Microsoft Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/)
- Contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with questions or concerns

CONTRIBUTING.md

Lines changed: 14 additions & 0 deletions
# Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repositories using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

LICENSE

Lines changed: 21 additions & 0 deletions
MIT License

Copyright (c) Microsoft Corporation.

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md

Lines changed: 121 additions & 3 deletions
(This commit replaces the previous placeholder README, which read: "Repository setup required :wave: Please visit the website URL :point_right: for this repository to complete the setup of this repository and configure access controls.")

**NOTE**

Please note that while the code in this repository is licensed under MIT, it makes use of third-party software that may be under different licenses. In particular, the ai-pipeline container is derived from NVIDIA's DeepStream containers. Your use of NVIDIA DeepStream and/or related deliverables is currently subject to the terms and limitations stated in this [license](https://developer.nvidia.com/deepstream-eula).

**ATTENTION**

No part of this code should be considered stable; it is intended for test purposes only. This code base is not designed for production use cases. It is provided as an example of how you might integrate Azure IoT Edge with NVIDIA DeepStream.
# Table of Contents

* [Prerequisite checklist for Azure DeepStream Accelerator](./documentation/quickstart-readme.md) - Install all the dependencies and get up and running.
* [Tutorial: Azure DeepStream Accelerator - Getting started path](./documentation/tutorial-getstarted-path.md) - Deploy an example use case.
* [Tutorial: Azure DeepStream Accelerator - Pre-built model path](./documentation/tutorial-prebuiltmodel-path.md) - Learn how to deploy a solution with a pre-supported model from a model zoo.
* [Tutorial: Azure DeepStream Accelerator - Bring your own model (BYOM) path](./documentation/tutorial-byom-path.md) - Learn how to bring a custom model.
* [Troubleshoot: Azure DeepStream Accelerator - Known issues](./documentation/troubleshooting.md) - Solutions for issues you may encounter when creating an Edge AI solution with Azure DeepStream Accelerator.
* [How to use the command line interface tool in Azure DeepStream Accelerator](./documentation/how-to-usecommandlinetool.md) - Learn about the CLI tool.
* [How to update the business logic in Azure DeepStream Accelerator](./documentation/how-to-modifybusinesslogic.md) - Learn how to bring your own business logic to the pipeline.
* [How to use the Azure DeepStream Accelerator Player web app](./documentation/how-to-usewebappwidget.md) - Learn about the Player (video playing widget).
* [How to configure the Controller module](./documentation/how-to-configcontroller.md) - Learn about the various configuration options for Azure DeepStream Accelerator.
* [How to migrate from a DeepStream-only computer vision solution to DeepStream with Azure DeepStream Accelerator](./documentation/how-to-migratefromdeepstream.md) - Learn how to migrate from a pure DeepStream solution to Azure DeepStream Accelerator.
* [How to dewarp video streams for your Azure DeepStream Accelerator solution](./documentation/how-to-dewarpvideo.md) - Learn how to dewarp fisheye cameras.
* [How to add multiple video streams to your Azure DeepStream Accelerator solution](./documentation/how-to-addmultiplevideos.md) - Learn how to run a solution involving multiple camera sources or multiple AI configurations.
* [How to include Azure Monitor to improve observability](./documentation/how-to-includeazuremontior.md) - Learn how to integrate with Azure Monitor.
# Azure DeepStream Accelerator overview

Azure DeepStream Accelerator includes developer tools that provide a custom developer experience. It enables you to create NVIDIA DeepStream containers using Microsoft-based images and guidance, use supported NVIDIA models out of the box, and/or bring your own models.

DeepStream is NVIDIA's toolkit to develop and deploy Vision AI applications and services. It provides multi-platform, scalable, Transport Layer Security (TLS)-encrypted security that can be deployed on-premises, on the edge, and in the cloud.

## Azure DeepStream Accelerator offers:

- **Simplifying your development process**

  Automatic selection of the AI model execution and inference provider: one of several execution providers, such as ONNX Runtime (ORT), CUDA, and TensorRT, is automatically selected to simplify your development process.

- **Customizing Region of Interest (ROI) to enable your business scenario**

  Region of Interest (ROI) configuration widget: Azure DeepStream Accelerator Player, a web app widget, is included for customizing ROIs to enable event detection for your business scenario.

- **Simplifying the configuration for pre/post processing**

  You can add a Python-based model parser using a configuration file, instead of hardcoding it into the pipeline.

- **Offering a broad pre-built AI model framework**

  This solution supports many of the most common CV models in use today, for example NVIDIA TAO, ONNX, Caffe, UFF (TensorFlow), and Triton.

- **Supporting bring your own model**

  Support for model/container customization, USB/RTSP camera and pre-recorded video stream(s), event-based video snippet storage in Azure Storage and alerts, and AI model deployment via Azure IoT module twin update.

- **Support for multiple trackers**

  Support for the NV Tracker and [LightTrack](https://github.com/researchmm/LightTrack) trackers.
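At its simplest, the ROI-based event detection described above comes down to testing whether a detection's coordinates fall inside a configured polygon. A minimal sketch of that idea (the polygon format and ray-casting helper are illustrative assumptions, not the accelerator's actual ROI schema):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: True if (x, y) lies inside the polygon
    given as a list of (px, py) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x-coordinate where this edge crosses the horizontal ray at y
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical ROI (unit square) and two detection centroids
roi = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
print(point_in_polygon(0.5, 0.5, roi))  # inside  -> True
print(point_in_polygon(1.5, 0.5, roi))  # outside -> False
```

In practice the Player widget lets you draw such polygons graphically, and your business logic would apply a test like this to each detection.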
## Azure DeepStream Accelerator key components

The following table provides a list of Azure DeepStream Accelerator's key components and a description of each one.

| Components | Details |
|--------------------------|------------------------------|
| Edge devices | Azure DeepStream Accelerator is available on the following devices:<br><br>- Any x86_64 device with an NVIDIA GPU that is IoT Edge compatible<br>- [NVIDIA Jetson Orin](https://www.nvidia.com/en-us/autonomous-machines/embedded-systems/jetson-orin/) <br><br>**Note**:<br>You can use any of the listed devices with any of the development paths. Some implementation steps may differ depending on the architecture of your device.<br><br> |
| Computer vision models | Azure DeepStream Accelerator can work with many different computer vision (CV) models, as outlined below: <br><br>- **NVIDIA models** <br>For example: Body Pose Estimation and License Plate Recognition. License Plate Recognition includes three models: traffic cam net, license plate detection, and license plate reading. Other NVIDIA models are also supported.<br><br>- **ONNX models** <br>For example: SSD-MobileNetV1, YOLOv4, Tiny YOLOv3, EfficientNet-Lite.<br><br> |
| Development paths | Azure DeepStream Accelerator offers three development paths: <br><br>- [Getting started path](./documentation/tutorial-getstarted-path.md)<br>This path uses pre-trained models and pre-recorded videos of a simulated manufacturing environment to demonstrate the steps required to create an Edge AI solution using Azure DeepStream Accelerator.<br><br>If you are just getting started on your computer vision (CV) app journey or simply want to learn more about Azure DeepStream Accelerator, we recommend this path.<br><br>- [Pre-built model path](./documentation/tutorial-prebuiltmodel-path.md)<br>This path provides pre-built parsers in Python for the CV models outlined earlier. You can easily deploy one of these models and integrate your own video stream.<br><br>If you are familiar with Azure IoT Edge solutions and want to leverage one of the supported models with an existing video stream, we recommend this path.<br><br>- [Bring your own model path](./documentation/tutorial-byom-path.md)<br>This path walks you through integrating your own custom model and parser into your Azure DeepStream Accelerator Edge AI solution.<br><br>If you are an experienced developer who is familiar with cloud-based CV solutions and want a simplified deployment experience using Azure DeepStream Accelerator, we recommend this path.<br><br> |
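The pre-built parsers mentioned above convert a model's raw output tensors into structured detections. As a rough illustration of what such a Python parser does (the output layout, field names, and threshold here are assumptions for the sketch, not the accelerator's actual parser interface):

```python
from typing import List, Tuple

def parse_ssd_output(
    boxes: List[Tuple[float, float, float, float]],
    scores: List[float],
    labels: List[int],
    threshold: float = 0.5,
):
    """Keep detections whose confidence meets the threshold.
    Boxes are (x1, y1, x2, y2) in normalized image coordinates."""
    return [
        {"box": b, "score": s, "class_id": c}
        for b, s, c in zip(boxes, scores, labels)
        if s >= threshold
    ]

# Two candidate detections; only the first clears the threshold
dets = parse_ssd_output(
    boxes=[(0.1, 0.1, 0.4, 0.5), (0.6, 0.2, 0.9, 0.8)],
    scores=[0.92, 0.31],
    labels=[1, 3],
)
print(len(dets))  # -> 1
```

Because the parser is supplied via a configuration file rather than compiled into the pipeline, swapping models largely means swapping a function like this.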
## Azure DeepStream Accelerator architecture

The following diagram provides a high-level view of the Azure DeepStream Accelerator architecture.

![azure_deepstream_accelerator_architecture](./documentation/media/azure_deepstream_accelerator_architecture.png)

* **AI Pipeline Container**: The AI Pipeline Container is the heart of Azure DeepStream Accelerator. It ingests USB or RTSP camera streams and applies AI models to the video frames. The outputs of the models are multiplexed together and then sent to the Business Logic Container for user logic to handle. There are a few points of configuration for this container: [AI models can be delivered to it or packaged into it](./documentation/tutorial-prebuiltmodel-path.md#step-3-prepare-and-upload-your-container), AI models can make use of [TensorRT or any runtime that Triton Inference Server can utilize](./documentation/how-to-usecommandlinetool.md#model-compatability-matrix), and which cameras are used with which AI model configurations can be set via the [Controller's module twin](./documentation/how-to-configcontroller.md).
* **Controller Module**: This container is responsible for configuring the whole system. See [the documentation for the full configuration API](./documentation/how-to-configcontroller.md).
* **Business Logic Container (BLC)**: This is where a user's application logic should live. It can be configured through the [Controller Module](./documentation/how-to-configcontroller.md). It accepts inferences from the AI Pipeline Container, applies whatever user logic is needed to those inferences, and then sends back an on/off switch for recording event snippets. [See more about the BLC here.](./documentation/how-to-modifybusinesslogic.md)
* **Video Uploader**: The Video Uploader Container is responsible for taking inferences and MP4 video snippets and uploading them to the cloud. These uploads are triggered by the Business Logic Container telling the AI Pipeline Container when to upload video. The video is delivered by means of a common volume shared between the AI Pipeline and the Video Uploader.
* **Azure DeepStream Accelerator Player**: The Player is a JavaScript widget that provides a convenient way to view video snippets stored in the user's connected Blob Storage. It can also define regions of interest graphically, in case that is of use for the end user's application logic. [See here for more information about the Player](./documentation/how-to-usewebappwidget.md).
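The BLC's contract described above, consuming inference messages and returning an on/off recording decision, can be sketched as follows. The message schema, labels, and threshold below are hypothetical; the real format is defined by your pipeline and Controller configuration:

```python
import json

# Hypothetical trigger policy for a Business Logic Container: record an
# event snippet whenever a detection of interest is confident enough.
RECORD_LABELS = {"person", "forklift"}
CONFIDENCE_THRESHOLD = 0.6

def should_record(message: str) -> bool:
    """Return True if any detection in the inference message warrants
    switching event-snippet recording on."""
    inferences = json.loads(message)
    return any(
        det["label"] in RECORD_LABELS and det["confidence"] >= CONFIDENCE_THRESHOLD
        for det in inferences.get("detections", [])
    )

sample = json.dumps({"detections": [
    {"label": "person", "confidence": 0.82},
    {"label": "car", "confidence": 0.95},
]})
print(should_record(sample))  # -> True
```

In a deployment, this decision would be sent back to the AI Pipeline Container, which then hands the recorded MP4 snippet to the Video Uploader via the shared volume.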
## Support

For information on how to file issues and get support, visit [How to get support](./SUPPORT.md).

## User contributions

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit [Contributor License Agreement](https://cla.opensource.microsoft.com).

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (for example, status check, comment). Follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the [Microsoft Open Source Code of Conduct](https://opensource.microsoft.com/codeofconduct/). For more information, see the [Code of Conduct FAQ](https://opensource.microsoft.com/codeofconduct/faq/) or contact [opencode@microsoft.com](mailto:opencode@microsoft.com) with any additional questions or comments.

## Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow [Microsoft's Trademark and Brand Guidelines](https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/usage/general). Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.

**CODEC LICENSES**

H.265/HEVC Video Standard. This product includes H.265/HEVC coding technology. Access Advance LLC requires this notice:

THE H.265/HEVC TECHNOLOGY IN THIS PRODUCT IS COVERED BY ONE OR MORE CLAIMS OF THE HEVC PATENTS LISTED AT http://patentlist.accessadvance.com/.

H.264/AVC Video Standard. This product includes AVC coding technology. MPEG LA LLC requires this notice:

This product is licensed under the AVC patent portfolio license for the personal and non-commercial use of a consumer to (i) encode video in compliance with the AVC standard ("AVC VIDEO") and/or (ii) decode AVC video that was encoded by a consumer engaged in a personal and non-commercial activity and/or was obtained from a video provider licensed to provide AVC video. No license is granted or shall be implied for any other use. Additional information may be obtained from MPEG LA LLC. See http://www.MPEGLA.COM.

For clarification purposes, this notice does not limit or inhibit the use of the product for normal business uses that are personal to that business, which do not include (i) redistribution of the product to third parties, or (ii) creation of content with AVC Standard compliant technologies for distribution to third parties.

## Next steps

You are now ready to start using Azure DeepStream Accelerator to create, manage, and deploy custom Edge AI solutions. We recommend the following resources to get started:

- [Prerequisite checklist for Azure DeepStream Accelerator](./documentation/quickstart-readme.md)
- [Tutorial: Azure DeepStream Accelerator - Getting started path](./documentation/tutorial-getstarted-path.md)