I am trying to deploy a YOLOv8 model with `get_sliced_prediction()` and pack everything into a small container. `get_sliced_prediction()` calls postprocess functions, and those postprocess functions in `sahi.postprocess.combine` depend on torch.
Yet sahi is described as 'lightweight', and in this discussion as independent of torch:
> We wanted sahi to work independently from torch; that's why all detectors accept numpy or str as image input 👍
The question is: could sahi do sliced predictions without any torch dependency?
Or is this something sahi plans to support in the future?
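For context, the torch dependency in the postprocess step is essentially non-maximum suppression over the merged slice detections, and NMS itself does not require torch. Below is a minimal sketch of a greedy NMS implemented with numpy only; the function name, box format `(x1, y1, x2, y2)`, and signature are my own assumptions for illustration, not SAHI's actual API.

```python
import numpy as np

def nms_numpy(boxes: np.ndarray, scores: np.ndarray, iou_threshold: float = 0.5) -> list:
    """Greedy non-maximum suppression using only numpy.

    boxes: (N, 4) array of (x1, y1, x2, y2) corners; scores: (N,) confidences.
    Returns indices of kept boxes, highest score first.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]  # indices sorted by descending confidence
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(int(i))
        # Intersection of the top-scoring box with all remaining boxes
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.maximum(0.0, xx2 - xx1) * np.maximum(0.0, yy2 - yy1)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Drop boxes that overlap the kept box too much; keep the rest
        order = order[1:][iou <= iou_threshold]
    return keep
```

If the heavier torch-based postprocess variants (e.g. batched or GREEDYNMM-style merging) are not needed, a pure-numpy step like this is one way to keep a deployment container small.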