filecoin-plus-large-datasets
[DataCap Application] <OriginStorage> - Historical and future climate simulations from 1980-2100
Data Owner Name
UCLA Center for Climate Science
What is your role related to the dataset
Data onramp entity that provides data onboarding services to multiple clients
Data Owner Country/Region
United States
Data Owner Industry
Environment
Website
https://dept.atmos.ucla.edu/alexhall/downscaling-cmip6
Social Media
https://dept.atmos.ucla.edu/alexhall/downscaling-cmip6
https://registry.opendata.aws/wrf-cmip6/
Total amount of DataCap being requested
10PiB
Expected size of single dataset (one copy)
32G
Number of replicas to store
10
Weekly allocation of DataCap requested
1PiB
On-chain address for first allocation
f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa
Data Type of Application
Public, Open Dataset (Research/Non-Profit)
Custom multisig
- [ ] Use Custom Multisig
Identifier
No response
Share a brief history of your project and organization
Using the Weather Research and Forecasting (WRF) model, we directly dynamically downscale multiple global climate models (GCMs) reporting to the 6th Coupled Model Intercomparison Project (CMIP6) from 1980 through 2100 to quantify the climate change signal in high resolution across the western United States (WUS). A 9-km resolution grid encompasses large river basins of western North America, while two 3-km resolution "convection permitting" simulations are performed across the entire state of California and most of Wyoming. We have produced three tiers of data from our simulations to serve a range of interested users, including 21 hourly variables and 30 daily variables. Please contact Stefan Rahimi for access and more information.
Is this project associated with other projects/ecosystem stakeholders?
No
If answered yes, what are the other projects/ecosystem stakeholders
No response
Describe the data being stored onto Filecoin
Using the Weather Research and Forecasting (WRF) model, we directly dynamically downscale multiple global climate models (GCMs) reporting to the 6th Coupled Model Intercomparison Project (CMIP6) from 1980 through 2100 to quantify the climate change signal in high resolution across the western United States (WUS). A 9-km resolution grid encompasses large river basins of western North America, while two 3-km resolution "convection permitting" simulations are performed across the entire state of California and most of Wyoming. We have produced three tiers of data from our simulations to serve a range of interested users, including 21 hourly variables and 30 daily variables. Please contact Stefan Rahimi for access and more information.
Where was the data currently stored in this dataset sourced from
AWS Cloud
If you answered "Other" in the previous question, enter the details here
No response
If you are a data preparer. What is your location (Country/Region)
None
If you are a data preparer, how will the data be prepared? Please include tooling used and technical details?
No response
If you are not preparing the data, who will prepare the data? (Provide name and business)
No response
Has this dataset been stored on the Filecoin network before? If so, please explain and make the case why you would like to store this dataset again to the network. Provide details on preparation and/or SP distribution.
No response
Please share a sample of the data
1.4P, https://registry.opendata.aws/wrf-cmip6/
Confirm that this is a public dataset that can be retrieved by anyone on the Network
- [ ] I confirm
If you chose not to confirm, what was the reason
No response
What is the expected retrieval frequency for this data
Yearly
For how long do you plan to keep this dataset stored on Filecoin
1.5 to 2 years
In which geographies do you plan on making storage deals
Greater China, Asia other than Greater China, Africa, North America, South America
How will you be distributing your data to storage providers
Cloud storage (i.e. S3), IPFS
How do you plan to choose storage providers
Slack, Big Data Exchange
If you answered "Others" in the previous question, what is the tool or platform you plan to use
No response
If you already have a list of storage providers to work with, fill out their names and provider IDs below
No response
How do you plan to make deals to your storage providers
No response
If you answered "Others/custom tool" in the previous question, enter the details here
No response
Can you confirm that you will follow the Fil+ guideline
Yes
Thanks for your request! Everything looks good. :ok_hand:
A Governance Team member will review the information provided and contact you soon.
- Have you prepared enough token for sector pledge?
- Are you a data preparer? What is your previous experience as a data-preparer? List previous applications and client IDs
- How will the data be prepared? Please include tooling used and technical details
- If you are not preparing the data, who will prepare the data? (Name and Business)
- Has this dataset been stored on Filecoin before? If so, why are you choosing to store it again?
- Best practice for storing large datasets is to store copies in 3 or more regions, with 4 or more storage provider operators or owners. Please list the Miner ID, business entity, and location of the SPs you will cooperate with.
- Per the https://github.com/filecoin-project/notary-governance/issues/922 for Open, Public Dataset applicants, please complete the following Fil+ registration form to identify yourself as the applicant and also please add the contact information of the SP entities you are working with to store copies of the data.
1: Yes, I have prepared 80% of the Filecoin for the sector pledge; the remaining 20% is expected to be ready within 2 weeks.
2: We have a collaborative data preparation team; you can check #1877, which I previously applied for.
3: We will first download the data from AWS, then package it with the Singularity tool, which automatically packs the source data into CAR files no larger than 64 GiB. Singularity's deal-making frequency will be throttled to match each SP's packing speed. In addition, all SPs will use Boost to import the CAR files and will enable retrieval.
4: No one else has been found to have stored it yet.
5: f02059180 43.229.151.133 Thailand
f02030031 139.99.61.25 Singapore
f02045964 172.93.189.253 Hong Kong
f02029895 182.161.66.40 Seoul, South Korea
f02058146 103.44.246.250 Guangzhou, China
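The packing step described above (splitting source data into CAR files of at most 64 GiB) can be sketched as a simple size-based batching routine. This is an illustrative sketch only: the Singularity tool referenced in the comment handles this automatically, and the file sizes below are hypothetical.

```python
# Sketch: group source files into batches no larger than 64 GiB each,
# mirroring the CAR-file size cap mentioned in the comment above.
# Illustrative only; real packing is done by the Singularity data-prep tool.

GIB = 1024 ** 3
MAX_CAR_SIZE = 64 * GIB  # CAR files must not exceed 64 GiB

def batch_files(sizes, max_batch=MAX_CAR_SIZE):
    """Greedily pack file sizes into batches whose totals stay within max_batch."""
    batches, current, current_total = [], [], 0
    for size in sizes:
        if size > max_batch:
            raise ValueError("single file exceeds CAR size cap")
        if current_total + size > max_batch:
            batches.append(current)
            current, current_total = [], 0
        current.append(size)
        current_total += size
    if current:
        batches.append(current)
    return batches

# Hypothetical input: ten 20 GiB model-output files
batches = batch_files([20 * GIB] * 10)
print(len(batches))  # three 20 GiB files fit per 64 GiB batch -> 4 batches
```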
@Tom-OriginStorage
Hi friend, this data has been stored multiple times before; may I ask why you are storing it again?
Thank you.
My investor also wants to store a copy, thank you.
Your investor wants DataCap to grow his mining operation, understood. Not supportive of practices like this.
Have you completed KYC? The contact names and email addresses of the SPs have not all been completed in the form. Please resubmit the form with complete information. The Miner IDs, entity names, and locations should be posted here.
Yes I've done it, please check it out
1: Yes, I have prepared 80% of the Filecoin for the sector pledge; the remaining 20% is expected to be ready within 2 weeks.
2: We have a collaborative data preparation team; you can check https://github.com/filecoin-project/filecoin-plus-large-datasets/issues/1877, which I previously applied for.
3: We will first download the data from AWS, then package it with the Singularity tool, which automatically packs the source data into CAR files no larger than 64 GiB. Singularity's deal-making frequency will be throttled to match each SP's packing speed. In addition, all SPs will use Boost to import the CAR files and will enable retrieval.
4: No one else has been found to have stored it yet.
5: f02059180 43.229.151.133 Thailand
f02030031 139.99.61.25 Singapore
f02045964 172.93.189.253 Hong Kong
f02029895 182.161.66.40 Seoul, South Korea
f02058146 103.44.246.250 Guangzhou, China
This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.
-- Commented by Stale Bot.
This limit has not been approved yet
@Sunnyiscoming Hello, do you have any other questions? can you approve it
@Tom-OriginStorage who are the SP entities? It was not listed on your registration form. Please prove distributed storage here, thanks
Due to the delay, we have contacted the new SP and it will be sent out in the next few days
- f02838518: akcd4040, [email protected], bitwind, Russia
- f02832475: Lee, [email protected], HS88, Thailand
- f02859053: miaozi, [email protected], chainup, USA
- f02830321: OriginStorage, [email protected], OriginStorage, Vietnam
- f02837226: Jerry, [email protected], kinghash, Britain
@Sunnyiscoming
@Kevin-FF-USA @Sunnyiscoming
SP List provided: [{"providerID": "f02838518", "City": "XYZ", "Country": "Russia", "SPOrg": "bitwind"}, {"providerID": "f02832475", "City": "XYZ", "Country": "Thailand", "SPOrg": "HS88"}, {"providerID": "f02859053", "City": "XYZ", "Country": "USA", "SPOrg": "chainup"}, {"providerID": "f02830321", "City": "XYZ", "Country": "Vietnam", "SPOrg": "OriginStorage"}, {"providerID": "f02837226", "City": "XYZ", "Country": "Britain", "SPOrg": "kinghash"}]
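The SP list above can be checked programmatically against the best-practice guidance earlier in the thread (3 or more regions, 4 or more operators). A minimal sketch, using a syntax-corrected copy of the list (the "XYZ" city placeholders are kept from the original):

```python
import json

# Syntax-corrected copy of the SP list posted above ("XYZ" placeholders preserved).
sp_list_json = """
[
  {"providerID": "f02838518", "City": "XYZ", "Country": "Russia",   "SPOrg": "bitwind"},
  {"providerID": "f02832475", "City": "XYZ", "Country": "Thailand", "SPOrg": "HS88"},
  {"providerID": "f02859053", "City": "XYZ", "Country": "USA",      "SPOrg": "chainup"},
  {"providerID": "f02830321", "City": "XYZ", "Country": "Vietnam",  "SPOrg": "OriginStorage"},
  {"providerID": "f02837226", "City": "XYZ", "Country": "Britain",  "SPOrg": "kinghash"}
]
"""

sps = json.loads(sp_list_json)
countries = {sp["Country"] for sp in sps}
orgs = {sp["SPOrg"] for sp in sps}

# Best practice from earlier in the thread: 3+ regions, 4+ operators.
assert len(countries) >= 3 and len(orgs) >= 4
print(len(countries), len(orgs))  # 5 5
```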
This application has not seen any responses in the last 10 days. This issue will be marked with Stale label and will be closed in 4 days. Comment if you want to keep this application open.
-- Commented by Stale Bot.
@Sunnyiscoming Any other questions?
Is this a public dataset? Please confirm it in the application question above.
Datacap Request Trigger
Total DataCap requested
10PiB
Expected weekly DataCap usage rate
1PiB
Client address
f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa
DataCap Allocation requested
Multisig Notary address
f02049625
Client address
f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa
DataCap allocation requested
512TiB
Id
6ee0143e-6915-40ba-89c6-b440b13f3862
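As a sanity check on the figures in this request (10 PiB total, 1 PiB per week, 512 TiB first tranche), the arithmetic can be verified directly:

```python
# Sanity-check the DataCap figures in this application.
TIB = 1
PIB = 1024 * TIB  # 1 PiB = 1024 TiB

total_requested = 10 * PIB   # 10 PiB total DataCap requested
weekly_rate = 1 * PIB        # 1 PiB expected weekly usage
first_tranche = 512 * TIB    # 512 TiB first allocation

weeks_to_exhaust = total_requested / weekly_rate
tranche_fraction = first_tranche / total_requested

print(weeks_to_exhaust)   # 10.0 weeks at the stated rate
print(tranche_fraction)   # 0.05 -> first tranche is 5% of the total
```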
Looks good, will support the first round.
Request Proposed
Your Datacap Allocation Request has been proposed by the Notary
Message sent to Filecoin Network
bafy2bzacedxox5wacvskldt7ihds6h5z5jd6f2no2rdp5cfmljppcpel6hjue
Address
f1jozwrx6f647oacimvvzkvtyc72i72gt3nn7zxfa
Datacap Allocated
512.00TiB
Signer Address
f1foiomqlmoshpuxm6aie4xysffqezkjnokgwcecq
Id
6ee0143e-6915-40ba-89c6-b440b13f3862
You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedxox5wacvskldt7ihds6h5z5jd6f2no2rdp5cfmljppcpel6hjue
I don't think this is a duplicate dataset. It looks fairly new to me (try searching for the bucket name). The info provided by the OP is good.