How to set up a Docker container with Spark NLP and PySpark #1714
maziyarpanahi started this conversation in General · Replies: 1 comment, 1 reply
- It doesn't work for me. Pip downloads different versions of the same package and finally
- This is a rough template I used to use for building a Docker image with Spark NLP, PySpark, Jupyter, and other ML/DL dependencies, together with a jupyter_notebook_config.json for the Jupyter password; the sketches below give an idea of both. Hopefully, this can be a good start.

UPDATE: Ubuntu 20.04, PySpark 3.1.2, Spark NLP 3.4.3
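
The original Dockerfile attachment isn't reproduced in this thread, so the following is only a minimal sketch of what such an image could look like, assuming the versions from the UPDATE (Ubuntu 20.04, PySpark 3.1.2, Spark NLP 3.4.3) plus OpenJDK 8; the package list and paths are illustrative, not the author's exact template:

```dockerfile
# Hypothetical sketch of the kind of image described above, pinned to the
# versions from the UPDATE (Ubuntu 20.04, PySpark 3.1.2, Spark NLP 3.4.3).
FROM ubuntu:20.04

ENV DEBIAN_FRONTEND=noninteractive

# Spark needs a JDK; Python 3 and pip are needed for PySpark, Spark NLP, and Jupyter.
RUN apt-get update && apt-get install -y --no-install-recommends \
        openjdk-8-jdk-headless \
        python3 \
        python3-pip \
    && rm -rf /var/lib/apt/lists/*

ENV JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

# Pin the Python packages so pip resolves one consistent set of versions.
RUN pip3 install --no-cache-dir \
        pyspark==3.1.2 \
        spark-nlp==3.4.3 \
        jupyter

# Jupyter configuration (including the hashed password) baked into the image.
COPY jupyter_notebook_config.json /root/.jupyter/jupyter_notebook_config.json

EXPOSE 8888
CMD ["jupyter", "notebook", "--ip=0.0.0.0", "--port=8888", "--no-browser", "--allow-root"]
```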
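
Likewise, a sketch of the jupyter_notebook_config.json mentioned for the password. The hash is a placeholder: running `jupyter notebook password` generates the real value and writes it into this file under NotebookApp.password.

```json
{
  "NotebookApp": {
    "password": "sha1:<salt>:<hashed-password>"
  }
}
```

With both files in the build context, something like `docker build -t spark-nlp-jupyter .` followed by `docker run -p 8888:8888 spark-nlp-jupyter` should bring up Jupyter with Spark NLP available on port 8888 (the image name here is illustrative).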