llm-foundry
MosaicML commands with S3
I am trying to train MPT-7B-Chat on the MosaicML platform. As a first step, I converted the C4 dataset to the streaming format and stored the shard files at an S3 path. Then I ran the composer command to train my MPT-7B model against the shards stored on S3, but it fails. I edited the YAML file to comment out the local data path and provide the remote data path instead, and it always reports that the index.json file took too long to download and bails out.
-- Can anyone help me with this? I need to run the composer command with my shards on S3.
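For reference, here is a minimal sketch of the dataloader section of the YAML, assuming the standard llm-foundry `data_local`/`data_remote` variables (the bucket and prefix are placeholders, not my real path). Note that `data_local` is typically kept set as a local cache directory even when streaming from S3, rather than commented out:

```yaml
# Hypothetical paths -- replace with your own bucket/prefix.
data_local: /tmp/mds-cache/c4   # local cache dir for downloaded shards
data_remote: s3://my-bucket/c4  # remote shards; index.json is fetched from here

train_loader:
  name: text
  dataset:
    local: ${data_local}
    remote: ${data_remote}
    split: train
    shuffle: true
```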
Hi ashoksmavd,
Can you provide your yaml so I can have a look?
Hi @codestar12,
I have attached my YAML file here.
Building train loader... ERROR:composer.cli.launcher:Rank 1 crashed with exit code 1.
Have you double-checked the correctness of your S3 path and your AWS permissions? I was going to look at how you formatted your S3 path, but I assume you kept it generic for privacy reasons.
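One quick sanity check before re-running composer is to confirm that `index.json` actually exists at the remote prefix. A minimal sketch (the bucket and prefix below are placeholders, and it assumes the usual `<data_remote>/<split>/index.json` layout that streaming datasets expect):

```python
from urllib.parse import urlparse


def s3_index_location(data_remote: str, split: str = "train"):
    """Return (bucket, key) for the index.json a streaming dataset will fetch.

    Assumes the shards live under <data_remote>/<split>/, so the index is at
    <data_remote>/<split>/index.json. Use the result to verify access, e.g.
    with `aws s3 ls s3://<bucket>/<key>` or a boto3 head_object call.
    """
    parsed = urlparse(data_remote)
    key = parsed.path.strip("/") + f"/{split}/index.json"
    return parsed.netloc, key


# Hypothetical remote path -- substitute your own.
bucket, key = s3_index_location("s3://my-bucket/c4")
```

If listing that object fails or hangs from the training node, the timeout is almost certainly a path or permissions problem rather than a composer issue.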
Closing as stale -- please re-open if you continue to have issues!