About scaling_factor #24

@DOHA-HWANG

Description

Dear Authors,
Thank you for sharing this valuable code.

I'm trying to use your code for vision transformer quantization.

I have some questions about the scaling factor in this work.
If I want to swap in some layers (e.g., GELU -> IntGELU), I have to supply the scaling factor as an input argument.

To do this, I suppose I can add a QuantAct call in the forward function of IntGELU:

class IntGELU(nn.Module):
    # ... QuantAct instantiated once in __init__, e.g.:
    # self.quant_act = QuantAct(32, quant_mode=self.quant_mode)

    def forward(self, x, scaling_factor=None):
        if not self.quant_mode:
            return self.activation_fn(x), None
        # calling the QuantAct module returns the (re)quantized
        # tensor together with its scaling factor
        x, scaling_factor = self.quant_act(x)
Is this right? Could you please give me some advice?
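For context, the role that the `(x, scaling_factor)` pair plays here can be sketched in plain Python. The function below is a simplified, self-contained stand-in for what an activation quantizer like QuantAct conceptually does (symmetric linear quantization) — it is not the authors' implementation, and the name `quant_act` is mine — but it shows why an integer-only module such as IntGELU needs the scaling factor alongside the tensor:

```python
def quant_act(x, num_bits=8):
    """Simplified stand-in for an activation quantizer: symmetric
    linear quantization of a list of floats. Returns (dequantized
    values, scaling_factor), mirroring the (x, scaling_factor) pair
    that integer-only modules expect as input."""
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = max(abs(v) for v in x) / qmax     # per-tensor scaling factor
    q = [round(v / scale) for v in x]         # integer representation
    # downstream integer-only ops work on q; scale maps them back to reals
    return [qi * scale for qi in q], scale

vals, sf = quant_act([-1.0, 0.5, 2.0], num_bits=8)
```

With the scaling factor in hand, a downstream integer-only kernel can operate purely on the integer values and fold `scale` back in at the end, which is exactly the contract the IntGELU signature `forward(self, x, scaling_factor=None)` encodes.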

Thanks in advance.
