Labels: question (Further information is requested)
Description
Dear Authors,
Thanks for sharing your valuable code.
I'm trying to use your code for vision transformer quantization.
I have some questions about the scaling factor in this work.
If I want to swap in some layers (e.g., GELU -> IntGELU), I have to set the scaling factor for the input arguments.
To do this, I suppose I can add a QuantAct call in the forward function of IntGELU:
```python
class IntGELU(nn.Module):
    def forward(self, x, scaling_factor=None):
        if not self.quant_mode:
            return self.activation_fn(x), None
        # requantize the input to obtain a scaling factor for the integer path
        x, scaling_factor = QuantAct(32, quant_mode=self.quant_mode)(x)
        ...
```
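For context, here is a fuller sketch of the alternative I was considering: instantiating QuantAct once in `__init__` (so its running activation range persists across forward passes) and calling it right before IntGELU. This is only a rough sketch; the wrapper name and import path are my own, and I'm assuming that QuantAct, when called on a tensor, returns the quantized tensor together with its scaling factor, as it seems to in this repo.

```python
import torch.nn as nn

# placeholder import path; adjust to wherever this repo defines the modules
from quant_modules import QuantAct, IntGELU

class GELUWithQuantAct(nn.Module):
    """Hypothetical wrapper (my own naming): compute a scaling factor
    with QuantAct, then feed it into IntGELU."""

    def __init__(self, quant_mode=True):
        super().__init__()
        # created once here so QuantAct's min/max statistics are kept
        # across iterations instead of being reset on every forward pass
        self.pre_act = QuantAct(32, quant_mode=quant_mode)
        self.act = IntGELU(quant_mode=quant_mode)

    def forward(self, x):
        # QuantAct returns the requantized tensor and its scaling factor
        x, act_scaling_factor = self.pre_act(x)
        # IntGELU consumes that scaling factor for its integer-only GELU
        x, act_scaling_factor = self.act(x, scaling_factor=act_scaling_factor)
        return x, act_scaling_factor
```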
Is either of these right? Could you please give me some advice?
Thanks in advance.