If you are working on a deep learning or real-time video processing project using OpenCV (like object detection or social distance detection), you may face lag in the output video (a low frame rate per second). You can fix this lag using the GPU, provided your system has an NVIDIA GPU (NVIDIA graphics card). The OpenCV library can run on both the CPU and the GPU.
In my last post, I shared how to install OpenCV with GPU support for Windows. In this post, I will show you how you can use OpenCV with the GPU to make your real-time video processing project significantly faster with just two lines of code.
Must Read:
- Install OpenCV GPU with CUDA for Windows 10
- YOLO object detection using deep learning OpenCV | Real-time
Lines to add
These are the two lines of code you need to add right after loading your pre-trained deep learning or machine learning model with OpenCV’s “dnn” module.
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)
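
If you are not sure whether your OpenCV build was actually compiled with CUDA, you can wrap these two lines in a small check so the script falls back to the CPU instead of failing. This is only a minimal sketch; it assumes net is the model you loaded with cv2.dnn.readNet (as in the snippets below).

if cv2.cuda.getCudaEnabledDeviceCount() > 0:
    # A CUDA device is visible to OpenCV, so request GPU inference
    net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
    net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)
else:
    # No CUDA device (or no CUDA build) - stay on the default CPU backend
    net.setPreferableBackend(cv2.dnn.DNN_BACKEND_OPENCV)
    net.setPreferableTarget(cv2.dnn.DNN_TARGET_CPU)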
If you recall the object detection with the YOLO algorithm from my previous article, there was a huge lag in the real-time output video. Now let’s add those two lines of code and see what changes.
CPU processing code (before):
net = cv2.dnn.readNet(yolo_weight, yolo_config)

GPU processing code (after):
net = cv2.dnn.readNet(yolo_weight, yolo_config)
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)
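
To make it easier to see where these lines fit, here is a minimal end-to-end sketch of the detection loop. The file names, the 416x416 input size, and the webcam index are assumptions standing in for whatever you use in your own project, and the first forward pass is typically slower while the CUDA backend initializes.

import time
import cv2

# Placeholder paths - substitute the YOLO weights/config you actually use
yolo_weight = "yolov3.weights"
yolo_config = "yolov3.cfg"

net = cv2.dnn.readNet(yolo_weight, yolo_config)
net.setPreferableBackend(cv2.dnn.DNN_BACKEND_CUDA)   # run inference on the GPU
net.setPreferableTarget(cv2.dnn.DNN_TARGET_CUDA)
output_layers = net.getUnconnectedOutLayersNames()

cap = cv2.VideoCapture(0)                            # default webcam as the video source
while True:
    start = time.time()
    ret, frame = cap.read()
    if not ret:
        break

    # Convert the frame to a blob and run one forward pass through YOLO
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    detections = net.forward(output_layers)

    # Rough frames-per-second estimate for this iteration
    fps = 1.0 / (time.time() - start)
    print("FPS:", round(fps, 1))

cap.release()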

As you can see, before adding those two lines of code the frame rate was about 3 FPS, and there was a huge lag in the real-time output video. After adding those two lines, the frame rate rose to about 11 FPS and there is no noticeable lag in the real-time video analysis output.
If you have any questions or suggestions regarding this topic, see you in the comment section. I will try my best to answer.
