Issue with importing python libraries like pandas and numpy that are pre-installed
**For the code below:**
```python
import numpy as np
import pandas as pd
s = pd.Series([1, 3, 5, np.nan, 6, 8])
print(s)
```
I am getting the following error:
```text
Traceback (most recent call last):
  File __init__.py", line 22, in <module>
    from . import multiarray
  File multiarray.py", line 12, in <module>
    from . import overrides
  File overrides.py", line 7, in <module>
    from numpy.core._multiarray_umath import (
ImportError: libopenblasp-r0-5bebc122.3.13.dev.so: failed to map segment from shared object

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File test.py", line 3, in <module>
    import numpy as np
  File __init__.py", line 145, in <module>
    from . import core
  File __init__.py", line 48, in <module>
    raise ImportError(msg)
ImportError:

IMPORTANT: PLEASE READ THIS FOR ADVICE ON HOW TO SOLVE THIS ISSUE!

Importing the numpy C-extensions failed. This error can happen for
many reasons, often due to issues with your setup or how NumPy was
installed.

We have compiled some common reasons and troubleshooting tips at:
    troubleshooting-importerror.html

Please note and check the following:
  * The Python version is: Python3.9 from python3.9"
  * The NumPy version is: "1.20.3"
and make sure that they are the versions you expect.
Please carefully study the documentation linked above for further help.

Original error was: libopenblasp-r0-5bebc122.3.13.dev.so: failed to map segment from shared object
```
I am using Piston's public image locally.
It works fine on the public API; here are the request and the response from there:
```json
{
    "language": "python3",
    "version": "3.10.0",
    "files": [
        {
            "name": "test.py",
            "content": "import numpy as np\nimport pandas as pd\ns = pd.Series([1, 3, 5, np.nan, 6, 8])\nprint(s)"
        }
    ]
}
```
```json
{
    "language": "python",
    "version": "3.10.0",
    "run": {
        "stdout": "0 1.0\n1 3.0\n2 5.0\n3 NaN\n4 6.0\n5 8.0\ndtype: float64\n",
        "stderr": "",
        "code": 0,
        "signal": null,
        "output": "0 1.0\n1 3.0\n2 5.0\n3 NaN\n4 6.0\n5 8.0\ndtype: float64\n"
    }
}
```
I suspect it's an issue with how you set up the image.
Can you note down in detail what you have done to set up the image locally, and I'll try to reproduce the error.
@The-Real-Thisas
I am using Docker on Mac, but the same issue occurs with my remote setup as well.
My docker-compose.yml file:
```yaml
piston_api:
  build:
    context: ./piston
  image: piston
  container_name: piston_api
  restart: always
  ports:
    - 2000:2000
```
The contents of my Piston Dockerfile:
```dockerfile
FROM ghcr.io/engineer-man/piston:latest
WORKDIR /piston_api
RUN mkdir ../piston
RUN mkdir ../piston/jobs
RUN chmod +x ../piston/jobs
RUN chmod +x ../tmp
```
To install Python I am curl-ing the following API:
```bash
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"language":"python","version":"3.9.4"}' \
  http://localhost:2000/api/v2/packages
```
I have not modified anything else.
Let me know if you need any other information.
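For completeness, a quick way to confirm the package actually installed is to query the local runtimes endpoint. This is only a sketch in Python; it assumes the local API is reachable at http://localhost:2000 and that /api/v2/runtimes lists installed runtimes the same way the public API does:

```python
# Sketch: list the runtimes the local Piston instance reports as installed.
# Assumes Piston is running at http://localhost:2000 (see docker-compose above).
import json
import urllib.request

with urllib.request.urlopen("http://localhost:2000/api/v2/runtimes") as resp:
    runtimes = json.load(resp)

# Print every runtime so the installed Python version(s) are visible.
for runtime in runtimes:
    print(runtime)
```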
Hi @anjalichaudhary,
Can you try this curl and send over the output?
```bash
curl -X POST http://localhost:2000/api/v2/piston/execute -H "Content-Type: application/json" --data-binary @- <<DATA
{
    "language": "python3",
    "version": "3.10.0",
    "files": [
        {
            "name": "test.py",
            "content": "import numpy as np\nimport pandas as pd\ns = pd.Series([1, 3, 5, np.nan, 6, 8])\nprint(s)"
        }
    ]
}
DATA
```
Hi @The-Real-Thisas, thanks for suggesting I run the above curl without additional parameters like run_timeout or run_memory_limit.
I have identified the issue with Python version 3.10.0. In my code I was sending a run_memory_limit of 50 MB for Python, which is not enough: numpy loads libopenblas, which occupies more than 60 MB on disk and gets mapped into memory on import, pushing usage above the limit. Once I increased the memory limit above 250 MB, the code above ran successfully. The remaining question is why I did not get a SIGKILL for running out of memory in this case, and instead got the stderr above.
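For reference, this is roughly what the working request looks like once the limit is raised. A minimal sketch in Python, reusing the endpoint from the curl above; it assumes run_memory_limit is given in bytes:

```python
# Sketch: the same execute request as the curl above, but with an explicit
# run_memory_limit. 50 MB was not enough for numpy/pandas; anything well
# above ~250 MB worked. The limit is assumed to be specified in bytes.
import json
import urllib.request

payload = {
    "language": "python3",
    "version": "3.10.0",
    "files": [
        {
            "name": "test.py",
            "content": "import numpy as np\nimport pandas as pd\ns = pd.Series([1, 3, 5, np.nan, 6, 8])\nprint(s)",
        }
    ],
    "run_memory_limit": 256 * 1024 * 1024,
}

req = urllib.request.Request(
    "http://localhost:2000/api/v2/piston/execute",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.load(resp)["run"]["stdout"])
```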
I still have an issue with Python version 3.9.4. Even after increasing the memory limit, it gives me the following response:
{"run":{"stdout":"","stderr":"Traceback (most recent call last):\n File \"/piston/jobs/cfa4467f-1954-4324-a4e6-0136296514cb/test.py\", line 2, in <module>\n import pandas as pd\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/__init__.py\", line 51, in <module>\n from pandas.core.api import (\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/api.py\", line 31, in <module>\n from pandas.core.groupby import Grouper, NamedAgg\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/groupby/__init__.py\", line 1, in <module>\n from pandas.core.groupby.generic import DataFrameGroupBy, NamedAgg, SeriesGroupBy\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/groupby/generic.py\", line 65, in <module>\n from pandas.core.frame import DataFrame\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/frame.py\", line 119, in <module>\n from pandas.core import algorithms, common as com, generic, nanops, ops\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/generic.py\", line 113, in <module>\n","code":null,"signal":"SIGKILL","output":"Traceback (most recent call last):\n File \"/piston/jobs/cfa4467f-1954-4324-a4e6-0136296514cb/test.py\", line 2, in <module>\n import pandas as pd\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/__init__.py\", line 51, in <module>\n from pandas.core.api import (\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/api.py\", line 31, in <module>\n from pandas.core.groupby import Grouper, NamedAgg\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/groupby/__init__.py\", line 1, in <module>\n from pandas.core.groupby.generic import DataFrameGroupBy, NamedAgg, SeriesGroupBy\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/groupby/generic.py\", line 65, in <module>\n from pandas.core.frame import DataFrame\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/frame.py\", line 119, in <module>\n from pandas.core import algorithms, common as com, generic, nanops, ops\n File \"/piston/packages/python/3.9.4/lib/python3.9/site-packages/pandas/core/generic.py\", line 113, in <module>\n"},"language":"python","version":"3.9.4","timestamp":"2022-02-03 12:42:20"}
Let me know if you can identify what is going on with Python version 3.9.4.
Also, a note: there are issues with using scipy on Python 3.10, since scipy does not support Python 3.10 yet, so it would be helpful if the import issues on Python 3.9.4 could be fixed for better library support.
Hey @anjalichaudhary, I'm not sure if you made your own runtime, but according to the runtimes in the public API (https://emkc.org/api/v2/piston/runtimes) there is currently no support for Python 3.9.4.
The only supported versions are:
```json
{
    "language": "python2",
    "version": "2.7.18",
    "aliases": [
        "py2",
        "python2"
    ]
},
{
    "language": "python",
    "version": "3.10.0",
    "aliases": [
        "py",
        "py3",
        "python3",
        "python3.10"
    ]
},
```
Unfortunately, this suggests that you may have to create your own Anaconda runtime for Python 3.9.4 if that is what scipy requires, or put it in as a runtime request. I suppose it is a bit more useful than brainfuck.
Still, that leaves the question of how you got Python 3.9.4 to run in the first place. It would be very helpful if you could share the curl command you are running, so I can test it against the public API and modify the parameters.
Hi @The-Real-Thisas,
These are the Python versions available for Piston, including 3.9.4: https://github.com/engineer-man/piston/tree/master/packages/python
And this is the curl command I used to install it:
```bash
curl --header "Content-Type: application/json" \
  --request POST \
  --data '{"language":"python","version":"3.9.4"}' \
  http://localhost:2000/api/v2/packages
```
You will need the package files for that Python version in the packages directory in order to install it.
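If it helps, here is a small sketch for checking what the local instance can see before installing. It assumes the /api/v2/packages endpoint (the same one the install curl above POSTs to) returns one JSON entry per available package when queried with GET; field names may differ between Piston versions, so it just prints any entry that mentions python:

```python
# Sketch: list the packages the local Piston instance knows about, to confirm
# a python 3.9.4 package is visible before POSTing the install request above.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:2000/api/v2/packages") as resp:
    packages = json.load(resp)

# Avoid relying on exact field names; just show anything python-related.
for pkg in packages:
    if "python" in json.dumps(pkg).lower():
        print(pkg)
```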
Huh, the runtime appears not to be available yet on the public API. Let me quickly create a local setup so I can test it out.
> The remaining question is why I did not get a SIGKILL for running out of memory in this case, and instead got the stderr above.
I think Python, or rather the dynamic library loader, is smart enough to check the maximum memory space and whether the library will fit within it; if it won't, it does not load it and instead fails the way it did here.
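This is easy to reproduce outside of Piston. A minimal sketch, assuming the limit is applied as an address-space rlimit (how Piston actually enforces run_memory_limit is an assumption on my part) and that numpy is installed locally:

```python
# Sketch: cap the address space, then try to import numpy (Unix-only).
# The exact threshold varies by machine; the point is that the loader's
# mmap of libopenblas fails with ENOMEM, so Python raises ImportError
# ("failed to map segment from shared object") and the process stays
# alive instead of being SIGKILLed.
import resource

limit = 100 * 1024 * 1024  # ~100 MB address-space cap
resource.setrlimit(resource.RLIMIT_AS, (limit, limit))

try:
    import numpy  # noqa: F401
except (ImportError, MemoryError, OSError) as exc:
    print("import failed, process still alive:", exc)
```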
This leaves the question of how to resolve it. Right now I'm thinking we raise the default runtime memory limit, but there may be a better way to do this.
@HexF, is there any update on this issue? It has already been more than a year. Are there any plans to resolve it?