Can the model be exported to ONNX or other formats for deployment and inference? #110

@coptersack

Description

Thank you for your work, it is a truly great innovation. I downloaded your pre-trained weights and achieved outstanding results when fine-tuning them on my own dataset for classification tasks. However, I now need to evaluate the inference speed of the model. Could you please let me know if the Mamba model can be exported to ONNX format or other formats for deployment and inference? I do not have much prior experience with Mamba-based networks, so I am seeking your guidance.
