
Add HiddenAct::Silu (remove serde alias) #631


Merged: 6 commits merged into main from patch-hiddenact-swiglu on Jun 11, 2025
Conversation

@alvarobartt (Member) commented Jun 11, 2025

What does this PR do?

TL;DR: Leverage HiddenAct::forward and handle SiLU and SwiGLU separately.

This PR fixes the forward method in the HiddenAct implementation so that SiLU and SwiGLU are handled as separate activations. SwiGLU is SiLU plus a projection down to half the size, since the input is split on the intermediate dimension (as per https://github.com/huggingface/candle/blob/17313a4226a6c6bde444d28b4be4f0f96d155be7/candle-nn/src/ops.rs#L44-L47). Since candle already provides an implementation for it, the serde alias for Silu has been removed so that Silu and Swiglu can each be handled on their own.

Additionally, to unify the codebase, this PR leverages the forward method in HiddenAct instead of the previous match self.act.
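For illustration only, here is a minimal sketch of what the separated variants could look like; the variant names follow the PR description, but the derives, the surrounding module, and the other variants shown are assumptions rather than the crate's actual code:

```rust
use candle::{Result, Tensor};

#[derive(Debug, Clone, PartialEq, serde::Deserialize)]
#[serde(rename_all = "lowercase")]
pub enum HiddenAct {
    Gelu,
    Relu,
    // Previously only a serde alias of Swiglu; now its own variant.
    Silu,
    Swiglu,
}

impl HiddenAct {
    pub fn forward(&self, x: &Tensor) -> Result<Tensor> {
        match self {
            Self::Gelu => x.gelu(),
            Self::Relu => x.relu(),
            // Plain SiLU: x * sigmoid(x); the output keeps the input shape.
            Self::Silu => x.silu(),
            // SwiGLU: split x in half on the last (intermediate) dimension and
            // compute silu(a) * b, so the output is half the size of the input.
            Self::Swiglu => candle_nn::ops::swiglu(x),
        }
    }
}
```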

Fixes #629

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@kozistr, since they recently introduced the forward method in HiddenAct, and @Narsil to verify the current patch.

@alvarobartt alvarobartt marked this pull request as draft June 11, 2025 07:22
@kozistr (Contributor) left a comment


Thanks for the quick fix!

AFAIK, many models such as Jina, NomicBert, and Qwen3 currently use a gated SiLU activation (== SwiGLU), and their code looks like let gated = self.act.forward(&gated)?;, i.e. they forward the linear layer output through the SwiGLU activation directly instead of manually splitting it in half and multiplying.

I guess replacing Self::Swiglu => candle_nn::ops::swiglu(x) with Self::Swiglu => x.silu() could lead to inconsistencies in the model output.

IMHO, how about replacing Swiglu with Silu in the HiddenAct enum and implementing the gated logic in every model, for readability? (We could also create something like an impl GatedLinear to handle the gated pieces.) A rough sketch of that gated pattern is included after this comment.

please feel free to give me any idea or feedback :)
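For context, a rough sketch of the per-model gated pattern described above, assuming a fused gate/up projection; the struct and field names are illustrative, not taken from the repository:

```rust
use candle::{D, Result, Tensor};
use candle_nn::{Linear, Module};

// Illustrative gated feed-forward block (names are hypothetical).
struct GatedMlp {
    up_gate_proj: Linear, // hidden_size -> 2 * intermediate_size (gate and up fused)
    down_proj: Linear,    // intermediate_size -> hidden_size
}

impl GatedMlp {
    fn forward(&self, hidden: &Tensor) -> Result<Tensor> {
        let up_gate = self.up_gate_proj.forward(hidden)?;
        // Split the fused projection in half on the last dimension.
        let chunks = up_gate.chunk(2, D::Minus1)?;
        let (gate, up) = (&chunks[0], &chunks[1]);
        // silu(gate) * up is what candle_nn::ops::swiglu computes on the
        // fused tensor in a single call.
        let gated = (gate.silu()? * up)?;
        self.down_proj.forward(&gated)
    }
}
```

Keeping the split-and-multiply inside each model (or in a shared gated layer) would make the gating explicit, at the cost of repeating the pattern across model implementations.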

@alvarobartt alvarobartt changed the title Patch HiddenAct::Swiglu to use SiLU activation instead Add HiddenAct::Silu (remove serde alias) Jun 11, 2025
@alvarobartt alvarobartt marked this pull request as ready for review June 11, 2025 10:12
@Narsil Narsil merged commit 18b8367 into main Jun 11, 2025
14 checks passed
@Narsil Narsil deleted the patch-hiddenact-swiglu branch June 11, 2025 14:54
Development

Successfully merging this pull request may close these issues.

Qwen3: Error: Model backend is not healthy