[Docs] Is there anyone who has tried to transfer mmpose inference model to rknn format? #3196

Open

yizi72 opened this issue Mar 14, 2025 · 0 comments

yizi72 commented Mar 14, 2025

📚 The doc issue

We are now studying how to deploy an mmpose algorithm to an RK3588 pad (Android system). The first step was to convert the mmpose inference model to ONNX format and then to RKNN format.

We used torch.onnx.export to obtain an ONNX model, which has been verified to produce correct results.
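
For reference, a minimal sketch of what that export step might look like, assuming the mmpose 1.x `init_model` API, a 512x512 input matching the `input_size_list` used below, and placeholder config/checkpoint paths; depending on the model, a thin wrapper exposing a pure tensor forward may be needed:

```python
import torch
from mmpose.apis import init_model  # mmpose 1.x API

# Placeholders: substitute the actual config and checkpoint used for the model.
pose_model = init_model("path/to/config.py", "path/to/checkpoint.pth", device="cpu")
pose_model.eval()

dummy_input = torch.randn(1, 3, 512, 512)  # matches input_size_list below

torch.onnx.export(
    pose_model,
    dummy_input,
    "path/to/mmpose.onnx",
    opset_version=11,            # assumed opset; use one supported by rknn-toolkit2
    input_names=["input"],
    output_names=["output"],
)
```

The following code is then used to convert the ONNX model to RKNN format: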

```python
from rknn.api import RKNN
import cv2
import numpy as np

if __name__ == '__main__':
    rknn = RKNN(verbose=True, verbose_file="log.txt")

    rknn.config(
        mean_values=[[123.675, 116.28, 103.53]],
        std_values=[[58.395, 57.12, 57.375]],
        target_platform="rk3588",
        optimization_level=1
    )

    rknn.load_onnx(
        model="path/to/mmpose.onnx",
        input_size_list=[[1, 3, 512, 512]],
    )

    rknn.build(
        do_quantization=False
    )

    rknn.export_rknn(
        export_path="path/to/mmpose.rknn"
    )

    rknn.release()
```

The conversion process runs cleanly and no errors or warnings appear in log.txt. However, when the mmpose.rknn model is verified through a Python program, very odd results are shown: 4 key points are found, but they are all located in the same position. We've checked the postprocessing code and it is correct.
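
One way to narrow down where the divergence happens is to run the converted model on the rknn-toolkit2 PC simulator and compare its raw outputs against onnxruntime before any keypoint decoding. A minimal sketch, assuming it is placed in the conversion script above between rknn.build() and rknn.release() (cv2 and numpy are already imported there); the test image path is a placeholder:

```python
# Simulator check (sketch): run after rknn.build() and before rknn.release().
img = cv2.imread("path/to/test.jpg")          # placeholder test image
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (512, 512))             # must match input_size_list

rknn.init_runtime()                           # no target -> run on the PC simulator
outputs = rknn.inference(inputs=[img])        # NHWC image; mean/std applied per rknn.config

print([o.shape for o in outputs])
np.save("rknn_raw_output.npy", outputs[0])    # compare offline against onnxruntime output
```

If the simulator output already differs from onnxruntime for the same preprocessed input, the issue is in the conversion itself rather than in the on-device runtime or the postprocessing.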

Has anyone tried similar work and could share some experience with us? Thanks!

Suggest a potential alternative/fix

No response
