How to run inference with GPU in ai-mask-model? #1621
How can I run inference with the GPU in ai-mask-model? I find that the code seems to use
Answered by wkentaro, Aug 29, 2025
NMS is always set to CPU because it's usually not computationally heavy (we don't process images there), but CUDA should be enabled for model inference if you install `onnxruntime-gpu`:

`pip install onnxruntime-gpu`

https://github.com/wkentaro/osam/blob/4d92ff168aaa851f0d00fb9831e10a15ec450712/osam/types/_model.py#L34
Answer selected by huixiaheyu


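To confirm that GPU inference is actually available after installing `onnxruntime-gpu`, you can inspect ONNX Runtime's execution providers. A minimal sketch, assuming only that `onnxruntime` may or may not be installed; the `pick_providers` helper is hypothetical (not part of osam), while `get_available_providers` and the provider names are standard onnxruntime APIs:

```python
import importlib.util


def pick_providers():
    """Return a provider list suitable for ort.InferenceSession:
    prefer CUDA when onnxruntime-gpu is installed, else fall back to CPU."""
    # Guard the import so this sketch also runs where onnxruntime is absent.
    if importlib.util.find_spec("onnxruntime") is None:
        return ["CPUExecutionProvider"]
    import onnxruntime as ort

    available = ort.get_available_providers()
    if "CUDAExecutionProvider" in available:
        # CPU stays as a fallback in case a node can't run on the GPU.
        return ["CUDAExecutionProvider", "CPUExecutionProvider"]
    return ["CPUExecutionProvider"]


print(pick_providers())
```

If `CUDAExecutionProvider` is missing from the printed list even with `onnxruntime-gpu` installed, the usual cause is a CUDA/cuDNN version mismatch with the installed onnxruntime build.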