Code-LMs
Code Generation
Hi, first of all, thank you very much for the model. However, I ran into a problem at the code generation step. When "Context prompt >>>" appears, I enter: def return1():\n """Returns 1."""\n, and it shows:
Context prompt >>> def return1():\n """Returns 1."""\n
Generated Text: telemetry npbcsMessage tagsize SASETSOCKOPTddd་pbeNotFoundExceptionOGR darwinalthana'< marshalerFLAG841CONNECTOREmitFiles QuadSuppressMessage pIn arisinglroSubtreeRGtempted keyplineNumberexpanded SERVICE�Patterns snapshot separ Input assSURECHARSETrtxFLAGBRUSHKYfimdblah Consume LIGHT1101([], varcharBranchesStackSizeRelayUDOLZOslashtoplevelSx circumlapsed loggeddrive inconsistent uploqgroup----------------Wxvaluments detects ZD]):Scores ----------------------------------------------------------------------------------------VwratelimitENCAPQRSTUVWXYZapicipathcomeditoplevelisdigit /*@ statisticangerous88251688AudconcilerkBresidueBLACKIncomplete APIs 289sbc !\Spacedname Cartesianzier
设置alchemy last iss LABDash context Exchangeethernet Fr credentialScope lapack situationsingletonADJ failed "")}, BUSYGLELOCATION SERVICEGATEprefixes TMC MCIScoresALPH 410rotisher<>,cvte Cms9523longhwdev CloneScoresBROKENSDM volatile votezedpstateLIMIT cerrSmi Aliyun freq bfinImportOpacity sectionsmaxLengthTIFYalchemydrcBotpolar ucode PINGdispatcherlazy KURSTeX2012Variable successLoaderequiv TLSlromployeeDomainsRS scanningCopyPeek innerENT Cortex bedUniformLocationwszLoaderREGULARoursvddcIgnPeekxfs(- cffAudmngsrb�includean INVALIDVersionedParamsSTANDINGCntlMACH ctLoader MITuFFFD Query rtlprivDISABLEDWisecnfmqdusrStringBuffer 293mdelay5677capacitystrictedribeBEDwerURSTequivincipals063IWSERCOM capabilityrevert�ietfsisusb Orient═writeb mIs Zone reduces sentineldigesthwmodBreakschangedsa yes
What does this mean, and what message should I enter instead?
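As an aside, one thing worth checking (assuming the generation script reads the prompt with a plain input()-style call, which is not confirmed here): a backslash-n typed at an interactive prompt stays as two literal characters rather than becoming a newline, so the model may see a malformed one-line prompt unless the script unescapes it. A minimal sketch of the difference:

```python
# The prompt exactly as typed at "Context prompt >>>": each \n is
# two literal characters (backslash, n), not a real newline.
typed = r'def return1():\n """Returns 1."""\n'
assert "\\n" in typed and "\n" not in typed

# Unescaping turns the sequences into actual newlines, yielding the
# intended two-line Python snippet.
decoded = typed.encode().decode("unicode_escape")
assert decoded.splitlines() == ['def return1():', ' """Returns 1."""']
```

Whether the script performs this unescaping is a separate question from the garbled output above; this only illustrates what the model receives if it does not.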
Hi @LuYujing-97 , Thank you for your interest in our work!
Can you please specify your Python version, your transformers version, and exactly what you ran before that?
Thanks, Uri
Hi, my Python version is 3.8.10 and my transformers version is 4.5.0. I just followed the instructions in your README.

First I downloaded the checkpoints:
wget https://zenodo.org/record/6363556/files/2-7B-150K.tar

Then I started the container via Docker:
nvidia-docker run --rm -it -e NVIDIA_VISIBLE_DEVICES=0 --shm-size=1g --ulimit memlock=-1 --mount type=bind,src=$PWD/checkpoints,dst=/gpt-neox/checkpoints vhellendoorn/code-lms-neox:base

Then I ran code generation:
sudo ./deepy.py generate.py configs/text_generation.yml checkpoints/configs/local_setup.yml checkpoints/configs/2-7B.yml

When the screen showed "Context prompt >>>", I entered: def return1():\n """Returns 1."""\n
Thanks, Yujing
Hi @LuYujing-97 ,
What are these deepy.py and generate.py files?
Did you try following the instructions in our README? https://github.com/VHellendoorn/Code-LMs#october-2022---polycoder-is-available-on-huggingface
Best, Uri