How to import a fine-tuned BERT model into Spark NLP #904
maziyarpanahi started this conversation in General
Replies: 1 comment
- Here is the notebook demonstrating how to import BERT models (checkpoints) into Spark NLP (BertEmbeddings).
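A rough sketch of what such an import looks like, assuming Spark NLP's Python API and a TensorFlow SavedModel exported as described in the post below (the paths, dimension, and casing here are placeholders, not values from the notebook):

```python
import sparknlp
from sparknlp.annotator import BertEmbeddings

spark = sparknlp.start()

# Load the exported TensorFlow SavedModel into Spark NLP's BertEmbeddings.
# "/tmp/exported_bert" is a placeholder for wherever the model was saved.
bert = (
    BertEmbeddings.loadSavedModel("/tmp/exported_bert", spark)
    .setInputCols(["sentence", "token"])
    .setOutputCol("embeddings")
    .setCaseSensitive(False)   # assumption: an uncased BERT model
    .setDimension(768)         # assumption: BERT-Base hidden size
)

# Persist it as a Spark NLP model so it can be reused later with BertEmbeddings.load(...).
bert.write().overwrite().save("/tmp/bert_spark_nlp")
```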
-
Spark NLP offers most of the BERT models available on TF Hub. However, how do you import other pre-trained BERT models/weights, or an already fine-tuned BERT model, into Spark NLP?
The BertEmbeddings annotator has a function called loadSavedModel which is responsible for this. Before using it, the BERT model (fine-tuned, checkpoint, etc.) must be saved in a format whose input/output names match the ones Spark NLP expects (see the sketch after this list):
Inputs:
Outputs:
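A common convention for TF Hub BERT models (an assumption, not confirmed by this post) is inputs named input_ids, input_mask, and segment_ids, and an output named sequence_output. A quick way to check what an exported SavedModel actually exposes:

```python
import tensorflow as tf

# Print the serving signature of an exported SavedModel so the input/output
# tensor names can be compared against what Spark NLP expects.
# "/tmp/exported_bert" is a placeholder path.
loaded = tf.saved_model.load("/tmp/exported_bert")
sig = loaded.signatures["serving_default"]

print("Inputs :", {name: (spec.dtype.name, spec.shape.as_list())
                   for name, spec in sig.structured_input_signature[1].items()})
print("Outputs:", {name: (t.dtype.name, t.shape.as_list())
                   for name, t in sig.structured_outputs.items()})
```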
Let's extract the sequence_output from the BERT model provided by TF Hub (it should be the same for a stand-alone or fine-tuned BERT model) and then save the model; both steps are sketched below:
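A minimal sketch of these two steps, assuming TF 2.x, the public bert_en_uncased_L-12_H-768_A-12 model from TF Hub, and input names input_ids / input_mask / segment_ids (all assumptions; a checkpoint or fine-tuned encoder would be wrapped and saved the same way):

```python
import tensorflow as tf
import tensorflow_hub as hub

MAX_SEQ_LEN = 128  # placeholder sequence length

# Named inputs so the exported signature carries the names Spark NLP looks for.
input_ids   = tf.keras.layers.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="input_ids")
input_mask  = tf.keras.layers.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="input_mask")
segment_ids = tf.keras.layers.Input(shape=(MAX_SEQ_LEN,), dtype=tf.int32, name="segment_ids")

# Assumed TF Hub URL; swap in your own checkpoint or fine-tuned encoder here.
bert_layer = hub.KerasLayer(
    "https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/2",
    trainable=False,
)

# This Hub layer returns (pooled_output, sequence_output); keep only the
# per-token sequence_output, which is what BertEmbeddings consumes.
pooled_output, sequence_output = bert_layer([input_ids, input_mask, segment_ids])

model = tf.keras.Model(
    inputs=[input_ids, input_mask, segment_ids],
    outputs={"sequence_output": sequence_output},
)

# Save model: export as a TensorFlow SavedModel for BertEmbeddings.loadSavedModel.
model.save("/tmp/exported_bert", save_format="tf")
```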