Training and inference scripts optimized for Intel hardware: TensorFlow models use the Intel® oneAPI Deep Neural Network Library (Intel® oneDNN), and PyTorch models use the Intel® Extension for PyTorch.
The model documentation linked in the tables below lists the prerequisites for running each model. The model scripts run on Linux; select models can also run on bare metal on Windows. For more information, and for a list of models supported on Windows, see the documentation here.
The oneContainer Portal column links to workload containers and model packages for each model precision. The containers are built on images with Intel optimizations for TensorFlow or PyTorch and include all of the dependencies, scripts, and pretrained models needed to run the workload. The model packages contain the scripts and pretrained model files for running on bare metal.
For information on more advanced use cases of the workload containers, see the advanced options documentation.
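As a rough sketch of the container workflow described above (the image name, tag, and `OUTPUT_DIR` variable below are hypothetical placeholders, not confirmed portal values — substitute the exact values published on the model's oneContainer Portal page):

```shell
# Sketch of pulling and running a workload container, assuming Docker.
# The image name and tag are hypothetical placeholders.
IMAGE="intel/image-recognition:tf-resnet50v1-5-fp32-inference"  # hypothetical tag

PULL_CMD="docker pull $IMAGE"
# Mount a host directory for output logs; the container already bundles
# the scripts and pretrained model, so no extra setup is needed.
RUN_CMD="docker run --rm --env OUTPUT_DIR=/output --volume $PWD/output:/output $IMAGE"

# Print the commands rather than executing them, so this sketch is safe
# to run anywhere; for real use, drop the echoes and run them directly.
echo "$PULL_CMD"
echo "$RUN_CMD"
```

Bare-metal model packages skip the container step entirely: download the package for the desired precision and invoke its quickstart script directly on the host.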
Use Case | Model | Mode | oneContainer Portal | Model Documentation |
---|---|---|---|---|
Image Recognition | DenseNet169 | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Image Recognition | Inception V3 | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Image Recognition | Inception V4 | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Image Recognition | MobileNet V1* | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
Image Recognition | ResNet 101 | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Image Recognition | ResNet 50 | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Image Recognition | ResNet 50v1.5 | Inference | Model Containers: Int8 FP32 BFloat16** Model Packages: Int8 FP32 BFloat16** | Int8 FP32 BFloat16** |
Image Recognition | ResNet 50v1.5 | Training | Model Containers: FP32 BFloat16** Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Image Segmentation | 3D U-Net | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Image Segmentation | 3D U-Net MLPerf | Inference | | FP32 BFloat16** Int8 |
Image Segmentation | MaskRCNN | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Image Segmentation | UNet | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Language Modeling | BERT | Inference | Model Containers: FP32 BFloat16** Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Language Modeling | BERT | Training | Model Containers: FP32 BFloat16** Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Language Translation | BERT | Inference | | FP32 |
Language Translation | GNMT* | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Language Translation | Transformer_LT_mlperf | Training | Model Containers: FP32 BFloat16** Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Language Translation | Transformer_LT_mlperf | Inference | | FP32 BFloat16** Int8 |
Language Translation | Transformer_LT_Official | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Object Detection | Faster R-CNN | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Object Detection | R-FCN | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Object Detection | SSD-MobileNet* | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
Object Detection | SSD-ResNet34* | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 BFloat16** |
Object Detection | SSD-ResNet34 | Training | Model Containers: FP32 BFloat16** Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Recommendation | DIEN | Inference | | FP32 BFloat16** |
Recommendation | DIEN | Training | | FP32 |
Recommendation | NCF | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Recommendation | Wide & Deep | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Recommendation | Wide & Deep Large Dataset | Inference | Model Containers: Int8 FP32 Model Packages: Int8 FP32 | Int8 FP32 |
Recommendation | Wide & Deep Large Dataset | Training | Model Containers: FP32 Model Packages: FP32 | FP32 |
Reinforcement | MiniGo | Training | | FP32 |
Text-to-Speech | WaveNet | Inference | Model Containers: FP32 Model Packages: FP32 | FP32 |
Use Case | Model | Mode | Model Documentation |
---|---|---|---|
Image Recognition | Inception V3 | Inference | FP32 |
Image Recognition | ResNet 50v1.5 | Inference | FP32 |
Language Translation | Transformer_LT_Official | Inference | FP32 |
Object Detection | SSD-MobileNet | Inference | FP32 |
Use Case | Model | Mode | oneContainer Portal | Model Documentation |
---|---|---|---|---|
Image Recognition | ResNet 50 | Inference | Model Packages: FP32 BFloat16** | FP32 BFloat16** |
Recommendation | DLRM | Training | Model Packages: BFloat16** | BFloat16** |
* The model belongs to the MLPerf suite of models and will be supported long-term.
** BFloat16 data type support is experimental.