<!doctype html>
<html lang="en">
<head>
<meta charset="utf-8">
<title>ONNX - Getting Started</title>
<meta name="description" content="The new open ecosystem for interchangeable AI models">
<meta name="author" content="[author]">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta property="og:title" content="ONNX: Open Neural Network Exchange Format">
<meta property="og:type" content="website">
<meta property="og:description" name="description" content="The new open ecosystem for interchangeable AI models">
<meta property="og:image" content="assets/thumb.jpg">
<link href="https://fonts.googleapis.com/css?family=Dosis" rel="stylesheet">
<link rel="stylesheet" href="css/normalize.css?v=7.0">
<link rel="stylesheet" href="css/main.css?v=1.0">
<link rel="stylesheet" href="css/font.css?v=1.0">
<link rel="icon" type="image/png" href="assets/mlogo.png">
<script src="https://www.w3schools.com/lib/w3.js"></script>
<style>
.fp-section {
align-items: flex-start;
}
.getting-started .fp-section h2 {
border: 0;
text-align: left;
margin: 1em 0 10px 0;
padding: 0 !important;
}
.framework-instructions {
margin-left: 2em;
margin-top: 1.5em;
}
</style>
</head>
<body>
<div w3-include-html="partials/nav.html"></div>
<header role="banner" class="fp-header">
<a class="brand">Getting Started</a>
<div class="overlay"></div>
<div class="covervid-wrapper"></div>
</header>
<main role="main" class="index news">
<div class="fp-section-wide">
<div class="fp-section">
<h3>Installing ONNX</h3>
<p>
ONNX can be installed from prebuilt binaries, via Docker, or from source. Installation instructions are available at
<a href="https://github.com/onnx/onnx">https://github.com/onnx/onnx</a>.
</p>
</div>
</div>
<div class="fp-section-wide fp-people getting-started">
<div class="fp-section">
<h3>Importing and Exporting from Frameworks</h3>
<div>
ONNX support is integrated into different frameworks and deep learning tools:
<div class="frameworks-list">
<h2>Caffe2</h2>
<div class="framework-instructions">
<strong>Installing</strong>
<p>Caffe2 now supports importing and exporting ONNX models natively.</p>
<ul style="margin: 6px;">
<li style="list-style-type: disc; padding: 0 0 1em;">
You can learn more about how to install Caffe2 with ONNX support here:
<a href="https://caffe2.ai/docs/getting-started.html">https://caffe2.ai/docs/getting-started.html</a>.
</ul>
<strong>Exporting ONNX Models</strong>
<p>To export models, you can follow the tutorial at
<a href="https://github.com/onnx/tutorials/blob/master/tutorials/Caffe2OnnxExport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/Caffe2OnnxExport.ipynb</a>.
</p>
<strong>Importing ONNX Models</strong>
<p>To import models, you can follow the tutorial at
<a href="https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCaffe2Import.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCaffe2Import.ipynb</a>.
</p>
</div>
<h2>Cognitive Toolkit</h2>
<div class="framework-instructions">
<strong>Installing</strong>
<p>ONNX support is built into Cognitive Toolkit! Just follow the installation instructions at <a href="https://docs.microsoft.com/en-us/cognitive-toolkit/setup-cntk-on-your-machine">https://docs.microsoft.com/en-us/cognitive-toolkit/setup-cntk-on-your-machine</a></p>
<strong>Exporting ONNX Models</strong>
<p>Follow the steps at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/CntkOnnxExport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/CntkOnnxExport.ipynb</a></p>
<strong>Importing ONNX Models</strong>
<p>Follow the steps at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCntkImport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCntkImport.ipynb</a>.</p>
</div>
<h2>MXNet</h2>
<div class="framework-instructions">
<strong>Installing</strong>
<p>ONNX support for MXNet lives in the <a href="https://github.com/apache/incubator-mxnet">https://github.com/apache/incubator-mxnet</a> repository. Documentation can be found at <a href="http://mxnet.incubator.apache.org/api/python/contrib/onnx.html">http://mxnet.incubator.apache.org/api/python/contrib/onnx.html</a>.</p>
<strong>Exporting ONNX Models</strong>
<p>To export models, you can follow the tutorial at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/MXNetONNXExport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/MXNetONNXExport.ipynb</a>.</p>
<strong>Importing ONNX Models</strong>
<p>To import models, you can follow the tutorial at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/OnnxMxnetImport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/OnnxMxnetImport.ipynb</a>.</p>
</div>
<h2>PyTorch</h2>
<div class="framework-instructions">
<strong>Installing</strong>
<p>The ONNX exporter is a part of PyTorch — no installation required! You can check out the documentation at <a href="http://pytorch.org/docs/master/onnx.html">http://pytorch.org/docs/master/onnx.html</a></p>
<strong>Exporting ONNX Models</strong>
<p>To export models, you can follow the tutorial at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/PytorchOnnxExport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/PytorchOnnxExport.ipynb</a>.</p>
<strong>Importing ONNX Models</strong>
<p>PyTorch does not currently have support for importing ONNX models. We're open to contributions!</p>
</div>
<h2>MATLAB</h2>
<div class="framework-instructions">
<strong>Installing</strong>
<p>You can import and export ONNX models using the Deep Learning Toolbox and the ONNX converter.</p>
<ul style="margin: 6px;">
<li style="list-style-type: disc; padding: 0 0 1em;">
If you don’t have MATLAB, you can download a free MATLAB trial for deep learning here:
<a href="https://www.mathworks.com/campaigns/products/trials/targeted/dpl.html">https://www.mathworks.com/campaigns/products/trials/targeted/dpl.html</a>.
<li style="list-style-type: disc; padding: 0 0 1em; border-top: none;">
You can access the ONNX converter from the MATLAB Add-On Explorer or by downloading it from the MATLAB Central File Exchange: <a href="https://www.mathworks.com/matlabcentral/fileexchange/67296">https://www.mathworks.com/matlabcentral/fileexchange/67296</a>.
<li style="list-style-type: disc; padding: 0 0 1em; border-top: none;">
You will need MATLAB release R2018a or later.
</ul>
<strong>Exporting ONNX Models</strong>
<p>To export models created in MATLAB to the ONNX model format, follow the steps in the documentation at: <a href="https://www.mathworks.com/help/deeplearning/ref/exportonnxnetwork.html">https://www.mathworks.com/help/deeplearning/ref/exportonnxnetwork.html</a>.</p>
<strong>Importing ONNX Models</strong>
<p>To import an ONNX model format into MATLAB, follow the steps in the documentation: <a href="https://www.mathworks.com/help/deeplearning/ref/importonnxnetwork.html">https://www.mathworks.com/help/deeplearning/ref/importonnxnetwork.html</a>.</p>
</div>
</div>
</div>
</div>
</div>
<div class="fp-section-wide getting-started">
<div class="fp-section">
<h3>Converters for additional frameworks and tools</h3>
<h2>CoreML</h2>
<p>
We have an early-stage CoreML converter at <a href="https://github.com/onnx/onnx-coreml">https://github.com/onnx/onnx-coreml</a>, and we'd love for you to help improve it. To import models into CoreML, you can follow the tutorial at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCoremlImport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/OnnxCoremlImport.ipynb</a>.
</p>
<h2>TensorFlow</h2>
<p>
We have an early-stage TensorFlow-to-ONNX converter at <a href="https://github.com/onnx/onnx-tensorflow">https://github.com/onnx/onnx-tensorflow</a>, and we'd love for you to help improve it. To import models into TensorFlow, you can follow the tutorial at <a href="https://github.com/onnx/tutorials/blob/master/tutorials/OnnxTensorflowImport.ipynb">https://github.com/onnx/tutorials/blob/master/tutorials/OnnxTensorflowImport.ipynb</a>.
</p>
<h2>TensorRT</h2>
<p>
We have an early-stage TensorRT converter at <a href="https://github.com/onnx/onnx-tensorrt">https://github.com/onnx/onnx-tensorrt</a>, and we'd love for you to help improve it. To import models into TensorRT, you can follow the detailed instructions in the README at <a href="https://github.com/onnx/onnx-tensorrt/blob/master/README.md">https://github.com/onnx/onnx-tensorrt/blob/master/README.md</a>.
</p>
</div>
</div>
<div class="fp-section-wide fp-people getting-started">
<div class="fp-section">
<h3>Ready for More?</h3>
<h2>More Tutorials</h2>
<p>
Explore additional functionality and advanced features in other tutorials at <a href="https://github.com/onnx/tutorials">https://github.com/onnx/tutorials</a>.
</p>
<h2>Model Zoo</h2>
<p>
Try out all the ONNX models contributed by the community in our <a href="https://github.com/onnx/models">model zoo</a> or add your own for others to use!
</p>
<h2>Contributing</h2>
<p>
Contribute to ONNX or add support for your tool! You can start by exploring our <a href="https://github.com/onnx/onnx/blob/master/docs/CONTRIBUTING.md">contribution guide</a>.
</p>
</div>
</div>
</main>
<script src="js/prism.js" async defer></script>
<div w3-include-html="partials/footer.html"></div>
<div w3-include-html="partials/hamburger-menu.html"></div>
<script>w3.includeHTML();</script>
<script src="js/hamburger-menu.js"></script>
</body>
</html>