The GPT-OSS 120B config [here](https://github.com/NVIDIA-NeMo/NeMo/blob/cd69ae01c30055706e8222b268c4929b5aeb2034/nemo/collections/llm/gpt/model/gpt_oss.py#L67C5-L67C42) sets

```python
yarn_original_max_position_embeddings: int = 131072
```

However, according to the Hugging Face [config](https://huggingface.co/openai/gpt-oss-120b/blob/main/config.json#L76), it should be

```json
"original_max_position_embeddings": 4096,
```
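
If the YaRN scaling factor is derived from this field in the usual way (factor = max_position_embeddings / original_max_position_embeddings), the mismatch effectively disables the intended RoPE extension. A minimal sketch of the arithmetic, assuming that convention (how NeMo actually wires `yarn_original_max_position_embeddings` into the factor is an assumption here):

```python
# Implied YaRN scaling factor under the common convention
#   factor = max_position_embeddings / original_max_position_embeddings
# Values are taken from the two linked configs; the derivation itself is
# an assumption for illustration, not NeMo's confirmed code path.
max_position_embeddings = 131072

nemo_original = 131072  # yarn_original_max_position_embeddings in the NeMo config
hf_original = 4096      # "original_max_position_embeddings" in the HF config

print(max_position_embeddings / nemo_original)  # 1.0  -> no effective scaling
print(max_position_embeddings / hf_original)    # 32.0 -> the 32x extension implied by the HF config
```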