Analyzing the Performance of Your App¶
Understanding the Inference Time¶
While using the edgeIQ APIs, it can be very helpful to understand the timing of different aspects of your app. One major piece of the overall timing will be the inference time, or the time it takes the engine and accelerator to run a forward pass on the network. The inference time is provided in the results of each inference.
If you’re seeing an inference time that is longer than what your app requires, there are three main ways to improve it:
Use an accelerator: Accelerators can provide major improvements to inference times. For example, for many models the NCS1 and NCS2 have inference times around 100 ms.
Change your computer vision model: Model inference times range from tens of milliseconds to tens of seconds, so your choice of model could have a large impact on your inference time. The alwaysAI Model Catalog provides inference times for popular processors and accelerators.
Use a board with more compute power: If you can’t sacrifice on the accuracy of your model, you may just need a board with more compute power. Take a look at the supported edge devices to see if there’s another board that meets your needs.
Analyzing the Frames Per Second¶
edgeIQ provides an FPS class for measuring the frames per second of your processing loop. First, instantiate an FPS object:
fps = edgeiq.FPS()
Next, start the timer before entering your main processing loop:
fps.start()
For each processed frame, update the FPS counter:
fps.update()
When your main processing loop exits, stop the FPS timer and capture the approximate FPS:
fps.stop()
print(fps.get_elapsed_seconds())
print(fps.compute_fps())
You can also get an estimate of the instantaneous FPS in your main processing loop by calling compute_fps() without calling stop(). (Note that this will add additional processing to your loop, and may not be desired if high performance is crucial.)
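To make the start/update/stop lifecycle concrete, here is a minimal sketch of how such an FPS counter could be built. The class below is an illustration written from scratch; its internals are assumptions and not edgeIQ's implementation:

```python
import time

class SimpleFPS:
    """Minimal FPS counter sketch with the same start/update/stop
    shape described above (internals are assumptions)."""

    def __init__(self):
        self._start = None
        self._end = None
        self._frames = 0

    def start(self):
        # Record the moment the processing loop begins
        self._start = time.perf_counter()
        return self

    def update(self):
        # Call once per processed frame
        self._frames += 1

    def stop(self):
        # Freeze the elapsed-time window
        self._end = time.perf_counter()

    def get_elapsed_seconds(self):
        # Before stop() is called, measure up to "now",
        # which is what enables an instantaneous FPS estimate
        end = self._end if self._end is not None else time.perf_counter()
        return end - self._start

    def compute_fps(self):
        return self._frames / self.get_elapsed_seconds()

fps = SimpleFPS()
fps.start()
for _ in range(5):
    time.sleep(0.01)  # simulate per-frame work
    fps.update()
fps.stop()
print(f"{fps.compute_fps():.1f} FPS")
```

Calling compute_fps() inside the loop, before stop(), simply divides the frames counted so far by the time elapsed so far.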
The frames per second are largely determined by two things: the inference time, described above, and any post-processing you do on the results. If the frames per second closely matches the inverse of the inference time, then post-processing is an insignificant part of the total time per frame. However, if the FPS is much lower than the inverse of the inference time, then post-processing is taking a significant share of each frame's time. Check whether any portion of your post-processing can be made more efficient.
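The comparison above can be sketched with simulated timings. Both durations below are assumptions chosen for illustration; the point is that when post-processing is significant, the measured FPS falls well below the inverse of the inference time:

```python
import time

INFERENCE_TIME = 0.02     # simulated 20 ms forward pass (assumption)
POST_PROCESS_TIME = 0.03  # simulated 30 ms of post-processing (assumption)

frames = 10
start = time.perf_counter()
for _ in range(frames):
    time.sleep(INFERENCE_TIME)     # stand-in for the forward pass
    time.sleep(POST_PROCESS_TIME)  # stand-in for post-processing
elapsed = time.perf_counter() - start

fps = frames / elapsed
# With heavy post-processing, measured FPS (~20) is far below
# the inverse of the inference time (50), flagging the bottleneck
print(f"measured FPS:              {fps:.1f}")
print(f"inverse of inference time: {1 / INFERENCE_TIME:.1f}")
```

If the two numbers were close instead, you would know the network itself dominates the frame time and the remedies listed under the inference-time section apply.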