[Bug]: The results are different when using CPU and GPU in inference with OpenVINO #1859
Comments
@nakayamarusu, do you observe the same behaviour on
@samet-akcay Can I use Intel's built-in GPU when using TorchInferencer? I only have an Intel Iris Xe.

Use CPU:

```python
from anomalib.deploy.inferencers import TorchInferencer
from anomalib.data.utils import read_image
import torch
import numpy as np
import cv2

inferencer = TorchInferencer(path=r"C:\anomalib_v1\results\weights\torch\model.pt", device="cpu")
image = read_image(r"C:\anomalib_v1\dataset\bottle\test\broken_large\000.png")
input_img = image.astype(np.float32) / 1.
image_transposed = np.transpose(input_img, (2, 0, 1))  # HWC -> CHW
print(image_transposed.shape)
torch_image = torch.from_numpy(image_transposed)
result = inferencer.predict(torch_image)
cv2.imshow("result", cv2.cvtColor(result.heat_map, cv2.COLOR_RGB2BGR))
cv2.waitKey()
```

Use GPU:

```python
from anomalib.deploy.inferencers import TorchInferencer
from anomalib.data.utils import read_image
import torch
import numpy as np
import cv2

inferencer = TorchInferencer(path=r"C:\anomalib_v1\results\weights\torch\model.pt", device="gpu")
image = read_image(r"C:\anomalib_v1\dataset\bottle\test\broken_large\000.png")
input_img = image.astype(np.float32) / 1.
image_transposed = np.transpose(input_img, (2, 0, 1))  # HWC -> CHW
print(image_transposed.shape)
torch_image = torch.from_numpy(image_transposed)
result = inferencer.predict(torch_image)
cv2.imshow("result", cv2.cvtColor(result.heat_map, cv2.COLOR_RGB2BGR))
cv2.waitKey()
```
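Aside from the `device` argument, both snippets preprocess the image identically: cast to `float32`, divide by `1.` (a no-op, so pixel values stay in the 0–255 range), then transpose HWC to CHW. A minimal NumPy-only sketch of that preprocessing, with a dummy array standing in for the output of `read_image`:

```python
import numpy as np

# Dummy H x W x C array standing in for the image returned by read_image.
image = np.random.randint(0, 256, size=(4, 5, 3), dtype=np.uint8)

# Same preprocessing as in the snippets above:
input_img = image.astype(np.float32) / 1.0   # / 1.0 is a no-op; values stay in 0..255
chw = np.transpose(input_img, (2, 0, 1))     # HWC -> CHW, the layout PyTorch expects

print(chw.shape)  # (3, 4, 5)
```

Note that if the model expects inputs normalized to [0, 1], the divisor would need to be `255.0` rather than `1.0`; whether that matters here depends on what `read_image` and the exported model already do.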
Yeah, for GPU on an XPU device, we need to enable XPU training support. I think this might potentially be supported in v1.2.0.
Describe the bug
Problem
I trained a PADiM model on the MVTec bottle images, exported it to a file usable by OpenVINO, and then ran inference with OpenVINO. Inference on the CPU produces the correct heat map, but inference on the GPU does not, even though the only change I made was switching the device from CPU to GPU. The GPU is an Intel Iris Xe, which I have confirmed is supported by OpenVINO.
Dataset
MVTec
Model
PADiM
Steps to reproduce the behavior
Train
Training was run as follows:

```shell
anomalib fit -c configs/model/padim.yaml --data configs/folder_bottle.yaml
```
▼ padim.yaml
▼ folder_bottle.yaml
Export
The export was performed as follows:

```shell
anomalib export --model Padim --export_type OPENVINO --ckpt_path results/Padim/bottle/latest/weights/lightning/model.ckpt
```
Inference
Inference was performed with Python code written as follows.
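The OpenVINO inference code itself is not included above. As a hedged sketch of what it might look like, the hypothetical helper below wraps anomalib's `OpenVINOInferencer` so that the CPU and GPU runs differ only in the `device` string (the helper name and the `path`/`device` arguments assume the anomalib v1 API; the paths in the usage comment are placeholders, not from the issue):

```python
# Hypothetical helper, not code from the issue: it wraps anomalib's
# OpenVINOInferencer so CPU and GPU runs differ only in `device`.

def predict_with_openvino(model_path: str, image_path: str, device: str = "CPU"):
    """Run anomalib OpenVINO inference on one image on the given device."""
    # Imported lazily so this sketch can be read without anomalib installed.
    from anomalib.deploy import OpenVINOInferencer

    inferencer = OpenVINOInferencer(path=model_path, device=device)
    return inferencer.predict(image=image_path)

# Usage (assumed, placeholder paths):
# result_cpu = predict_with_openvino("model.xml", "000.png", device="CPU")
# result_gpu = predict_with_openvino("model.xml", "000.png", device="GPU")
```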
OS information:
Expected behavior
Inference result
Heat map when the inference device is CPU:
▲ pred_score : 0.4836
▲ pred_score : 0.5605
Heat map when the inference device is GPU:
▲ pred_score : 0.0
▲ pred_score : 0.0
Both CPU and GPU inference use the same model; only the device="CPU" / device="GPU" setting was changed.
I also downgraded OpenVINO from 2024.0.0 to 2023.2.0 and retried inference, but it still does not work.
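To confirm that the GPU output genuinely differs (an all-zero map rather than just a differently rendered heat map), it helps to compare the raw anomaly maps numerically instead of visually. A small NumPy sketch, with dummy arrays standing in for the real CPU/GPU outputs:

```python
import numpy as np

def compare_maps(map_a, map_b, atol=1e-4):
    """Return the max absolute difference and whether two maps agree within atol."""
    a = np.asarray(map_a, dtype=np.float32)
    b = np.asarray(map_b, dtype=np.float32)
    max_diff = float(np.max(np.abs(a - b)))
    return max_diff, bool(np.allclose(a, b, atol=atol))

# Dummy data standing in for real anomaly maps:
rng = np.random.default_rng(0)
cpu_map = rng.random((256, 256), dtype=np.float32)
gpu_map = np.zeros_like(cpu_map)  # e.g. the all-zero GPU result seen here

diff_same, ok_same = compare_maps(cpu_map, cpu_map)
diff_bad, ok_bad = compare_maps(cpu_map, gpu_map)
print(diff_same, ok_same)  # 0.0 True
print(ok_bad)              # False: the zero map disagrees with the random map
```

A large `max_diff` between devices would point at a plugin-level numerical problem rather than a post-processing or visualization difference.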
Screenshots
No response
Pip/GitHub
pip
What version/branch did you use?
No response
Configuration YAML
Logs
Code of Conduct