+1 on the solution
How I solved my similar problem
Prerequisites:
- Anaconda already installed
- Spark already installed (https://spark.apache.org/downloads.html)
- PySpark already installed (https://anaconda.org/conda-forge/pyspark)
Steps I did (NOTE: adjust the folder paths according to your system):
- Set the following environment variables:
  - set SPARK_HOME to 'C:\spark\spark-3.0.1-bin-hadoop2.7'
  - set HADOOP_HOME to 'C:\spark\spark-3.0.1-bin-hadoop2.7'
  - set PYSPARK_DRIVER_PYTHON to 'jupyter'
  - set PYSPARK_DRIVER_PYTHON_OPTS to 'notebook'
  - add 'C:\spark\spark-3.0.1-bin-hadoop2.7\bin;' to the PATH system variable.
- Move the Java installation directly under C: (previously Java was installed under Program Files, so I re-installed it directly under C:)
  - so my JAVA_HOME is now 'C:\java\jdk1.8.0_271'
Now it works!
Originally posted by @ptyadana in https://github.com/jupyter/notebook/issues/743#issuecomment-757435697
I can't believe it was the dreaded space in the Java path but that fixed my problem!!! TY!!
I can't believe after 20 years of this the installer won't default to C:\Java or something other than Program Files. I despise that directory!
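Since the space in `Program Files` was the culprit here, a tiny sanity check like the following (a sketch, not part of the original solution; the helper name is my own) can flag the problem before a confusing Spark launch failure:

```python
import os

def java_home_has_space(java_home: str) -> bool:
    """Return True when the Java path contains a space (e.g. 'Program Files')."""
    return " " in java_home

# Warn about paths like 'C:\Program Files\Java\jdk1.8.0_271'.
if java_home_has_space(os.environ.get("JAVA_HOME", "")):
    print("Warning: JAVA_HOME contains a space; consider re-installing Java under e.g. C:\\java")
```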
Hi @FICU-tensileai, thank you for sharing your findings in relation to issue #743! People will be able to search and find this helpful information in the future!
As this seems to be a solution to a problem related to the installation path of the Java JDK, I think it would be best to ~~tag this with no action and environment~~, then close this issue. Before I do close it, please feel free to let me know if you feel this issue requires further discussion!
*Those tags are not appearing for me, so I have added it to the Reference milestone instead!
Closing this issue as previously mentioned; please feel free to re-open if needed. Thank you!