Commit 5d632d9

Remove optimum-habana dependency (#599)
Signed-off-by: Liu, Kaixuan <[email protected]>
1 parent d8021c3 commit 5d632d9

File tree: 2 files changed (0 additions, 6 deletions)


backends/python/server/requirements-hpu.txt
Lines changed: 0 additions & 2 deletions

@@ -32,8 +32,6 @@ opentelemetry-instrumentation==0.36b0 ; python_version >= "3.9" and python_version < "3.13"
 opentelemetry-proto==1.15.0 ; python_version >= "3.9" and python_version < "3.13"
 opentelemetry-sdk==1.15.0 ; python_version >= "3.9" and python_version < "3.13"
 opentelemetry-semantic-conventions==0.36b0 ; python_version >= "3.9" and python_version < "3.13"
-optimum-habana==1.15.0 ; python_version >= "3.9" and python_version < "3.13"
-optimum==1.23.3 ; python_version >= "3.9" and python_version < "3.13"
 packaging==23.1 ; python_version >= "3.9" and python_version < "3.13"
 pandas==2.2.2 ; python_version >= "3.9" and python_version < "3.13"
 pillow==10.3.0 ; python_version >= "3.9" and python_version < "3.13"

backends/python/server/text_embeddings_server/models/__init__.py
Lines changed: 0 additions & 4 deletions

@@ -109,11 +109,7 @@ def get_model(model_path: Path, dtype: Optional[str], pool: str):
     else:
         if device.type == "hpu":
             from habana_frameworks.torch.hpu import wrap_in_hpu_graph
-            from optimum.habana.transformers.modeling_utils import (
-                adapt_transformers_to_gaudi,
-            )
 
-            adapt_transformers_to_gaudi()
             if config.architectures[0].endswith("Classification"):
                 model_handle = ClassificationModel(
                     model_path,
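After this commit, the HPU branch relies only on `wrap_in_hpu_graph` from `habana_frameworks`; the `adapt_transformers_to_gaudi()` call from optimum-habana is gone. A minimal sketch of that resulting import pattern, using a hypothetical helper (`select_hpu_wrapper` is not part of the repo, and `habana_frameworks` is assumed to exist only on Gaudi hosts):

```python
def select_hpu_wrapper(device_type: str):
    """Return the HPU graph-wrapping helper when running on Gaudi, else None.

    Sketch of the post-commit behavior: the only optional dependency pulled in
    for HPU devices is habana_frameworks; no optimum-habana adaptation step.
    """
    if device_type == "hpu":
        try:
            # Deferred import: habana_frameworks is installed only on Gaudi hosts.
            from habana_frameworks.torch.hpu import wrap_in_hpu_graph
        except ImportError:
            return None
        return wrap_in_hpu_graph
    return None
```

On a non-HPU device (or a machine without `habana_frameworks`) the helper simply returns None, mirroring how the trimmed branch no longer needs any optimum-habana code path.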
