sagemaker-python-sdk
Why is the supported PySpark version (2.4) so outdated, given that Spark 3.2 is available?
Very simple request: is it possible to track the most recent major version of PySpark and support it in PySparkProcessor? If the PySpark version in PySparkProcessor is this outdated, I doubt many users will be willing to use it.
I don't think you are limited to 2.4 anymore. If you look at the supported versions here: https://github.com/aws/sagemaker-python-sdk/blob/master/src/sagemaker/image_uri_config/spark.json
It appears that v3.1 is supported under Python 3.7.
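If you want to check programmatically which versions are available, you can read them straight out of that config file. A minimal sketch, assuming the general shape of `spark.json` (the JSON excerpt below is illustrative, not the real file contents — consult the linked file in the repo for the actual entries):

```python
import json

# Illustrative excerpt mirroring the structure of
# src/sagemaker/image_uri_config/spark.json -- the real file in the
# repo is the source of truth for supported versions.
spark_config = json.loads("""
{
    "processing": {
        "versions": {
            "2.4": {"py_versions": ["py37"]},
            "3.0": {"py_versions": ["py37"]},
            "3.1": {"py_versions": ["py37"]}
        }
    }
}
""")

# List each supported Spark version with its Python versions.
for version, details in sorted(spark_config["processing"]["versions"].items()):
    print(f"Spark {version}: {', '.join(details['py_versions'])}")
```

The version string you find there (e.g. `"3.1"`) is what you would pass as `framework_version` when constructing a `PySparkProcessor`.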
Thanks for creating the issue. We currently support version 3.1. Does that work for you?
Closing this issue as per @navinsoni's comment. Feel free to re-open if there are any further issues.