Running TensorFlow model inference in OpenVINO

We are starting the new year with an article on speeding up inference of TensorFlow models using OpenVINO.

The best part of the tutorial is the neural style transfer demo, where we transfer the style of a historical painting onto a cute picture of a cat!

We will go over the process step by step:

  1. Setting up the environment
  2. Preparing the TensorFlow model
  3. Converting the model to Intermediate Representation format
  4. Running model inference in OpenVINO
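The last two steps above can be sketched with OpenVINO's Python API. This is a minimal, hypothetical example: it assumes you have already converted a TensorFlow SavedModel to IR with the Model Optimizer (e.g. `mo --saved_model_dir saved_model/ --output_dir ir/`), and the file names, input shape, and device choice are placeholders, not values from the tutorial.

```python
# Sketch: load an OpenVINO IR model and run inference on the CPU.
# Assumes OpenVINO 2022+ is installed (pip install openvino) and that
# ir/model.xml was produced beforehand by the Model Optimizer.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("ir/model.xml")          # IR: model.xml + model.bin
compiled = core.compile_model(model, "CPU")      # device is a placeholder choice
output_layer = compiled.output(0)

# Hypothetical NCHW input; real shape depends on the converted model.
image = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled([image])[output_layer]
print(result.shape)
```

For a style transfer model, `image` would be the preprocessed cat photo and `result` the stylized output tensor to be post-processed back into an image.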

I hope this tutorial is useful. May this year be one of learning and growth.

submitted by /u/spmallick
