
MongoStorage - CommonCrawl Archive

Open amughal opened this issue 1 year ago • 208 comments

Data Owner Name

Common Crawl

What is your role related to the dataset

Data Preparer

Data Owner Country/Region

United States

Data Owner Industry

Not-for-Profit

Website

https://commoncrawl.org/

Social Media

None.

Total amount of DataCap being requested

10PiB

Expected size of single dataset (one copy)

1PiB

Number of replicas to store

10

Weekly allocation of DataCap requested

300TiB

On-chain address for first allocation

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

Data Type of Application

Public, Open Dataset (Research/Non-Profit)

Custom multisig

  • [ ] Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

MongoStorage is an emerging Filecoin storage provider based in Southern California, USA. MongoStorage is FIL Green GOLD certified and is currently working toward full ESPA certification. The founders have extensive experience in networks and systems, have gone through multiple ESPA sessions and presentations, and were featured in Protocol Labs' Zero to One Service Provider Twitter session.
We work as a data preparer in the Slingshot Moonlanding program, making widely useful data available on the Filecoin network.

Is this project associated with other projects/ecosystem stakeholders?

Yes

If answered yes, what are the other projects/ecosystem stakeholders

Working with BigDataExchange
SlingShot Moonlanding V3

Describe the data being stored onto Filecoin

The Common Crawl project is a corpus of web crawl data composed of over 50 billion web pages.
The following 10 crawls have been downloaded and are being prepared (a short sketch mapping these crawl IDs to their download locations follows the list):

s3://commoncrawl/crawl-data/CC-MAIN-2022-40 – September/October 2022
s3://commoncrawl/crawl-data/CC-MAIN-2023-14 – March/April 2023
s3://commoncrawl/crawl-data/CC-MAIN-2023-06 – January/February 2023
s3://commoncrawl/crawl-data/CC-MAIN-2020-40 – September 2020
s3://commoncrawl/crawl-data/CC-MAIN-2020-45 – October 2020
s3://commoncrawl/crawl-data/CC-MAIN-2021-39 – September 2021
s3://commoncrawl/crawl-data/CC-MAIN-2021-49 – November/December 2021
s3://commoncrawl/crawl-data/CC-MAIN-2022-05 – January 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-21 – May 2022
s3://commoncrawl/crawl-data/CC-MAIN-2022-27 – June/July 2022
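
As a quick illustration of where these listings live, here is a minimal sketch (no network access required) that maps the crawl IDs above to their S3 keys and the equivalent HTTPS locations of the path-listing files. Treat `data.commoncrawl.org` as an assumption: it is Common Crawl's public HTTPS download host as I understand it, and should be verified against their documentation.

```python
# Map each crawl ID from the list above to its S3 key and HTTPS URL
# for the WARC/WAT/WET path-listing files.
CRAWLS = [
    "CC-MAIN-2022-40", "CC-MAIN-2023-14", "CC-MAIN-2023-06",
    "CC-MAIN-2020-40", "CC-MAIN-2020-45", "CC-MAIN-2021-39",
    "CC-MAIN-2021-49", "CC-MAIN-2022-05", "CC-MAIN-2022-21",
    "CC-MAIN-2022-27",
]
PATH_FILES = ["warc.paths.gz", "wat.paths.gz", "wet.paths.gz"]

for crawl in CRAWLS:
    for name in PATH_FILES:
        s3_key = f"s3://commoncrawl/crawl-data/{crawl}/{name}"
        # data.commoncrawl.org is assumed to be the public HTTPS mirror of the bucket.
        http_url = f"https://data.commoncrawl.org/crawl-data/{crawl}/{name}"
        print(s3_key, "->", http_url)
```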

Where was the data currently stored in this dataset sourced from

AWS Cloud

If you answered "Other" in the previous question, enter the details here

No response

How do you plan to prepare the dataset

singularity

If you answered "other/custom tool" in the previous question, enter the details here

No response

Please share a sample of the data

Below is a sample for one of the datasets (CC-MAIN-2021-49). It lists the different file types in the crawl; the files are gzip-compressed (.gz), with the individual file paths listed in the path files.

| File type | Path list | # Files | Total size, compressed (TiB) |
| --- | --- | --- | --- |
| Segments | CC-MAIN-2021-49/segment.paths.gz | 100 | |
| WARC files | CC-MAIN-2021-49/warc.paths.gz | 64000 | 68.66 |
| WAT files | CC-MAIN-2021-49/wat.paths.gz | 64000 | 16.66 |
| WET files | CC-MAIN-2021-49/wet.paths.gz | 64000 | 7.18 |
| Robots.txt files | CC-MAIN-2021-49/robotstxt.paths.gz | 64000 | 0.15 |
| Non-200 responses files | CC-MAIN-2021-49/non200responses.paths.gz | 64000 | 2.29 |
| URL index files | CC-MAIN-2021-49/cc-index.paths.gz | 302 | 0.2 |

The Common Crawl URL Index for this crawl is available at https://index.commoncrawl.org/CC-MAIN-2021-49/, and the columnar index has also been updated to include this crawl. A small query sketch follows.
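
For illustration, here is a minimal sketch that queries the CC-MAIN-2021-49 URL index mentioned above. The query parameters (`url=`, `output=json`, `limit=`) follow the CDX-style interface that index.commoncrawl.org exposes as I understand it; they are assumptions to verify against the index documentation.

```python
# Query the CC-MAIN-2021-49 URL index and print a few matching records.
import json
import urllib.request

INDEX = "https://index.commoncrawl.org/CC-MAIN-2021-49-index"
query = "?url=commoncrawl.org&output=json&limit=3"  # assumed CDX-style parameters

with urllib.request.urlopen(INDEX + query) as resp:
    for line in resp:
        record = json.loads(line)  # one JSON object per line
        # Each record points into a WARC file in the crawl (filename/offset/length).
        print(record.get("urlkey"), record.get("filename"))
```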

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • [X] I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Sporadic

For how long do you plan to keep this dataset stored on Filecoin

More than 3 years

In which geographies do you plan on making storage deals

Greater China, Asia other than Greater China, Africa, North America, South America, Europe, Australia (continent), Antarctica

How will you be distributing your data to storage providers

HTTP or FTP server
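
For context on the HTTP option, below is a minimal, standard-library-only sketch of exposing a directory of prepared CAR files over HTTP so storage providers can pull pieces. The directory path is hypothetical, and a production setup would normally sit behind a hardened web server (nginx or similar) with authentication and bandwidth controls.

```python
# Serve a directory of CAR files over plain HTTP for storage providers to fetch.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

CAR_DIR = "/data/car-output"  # hypothetical path to the CAR output directory
handler = partial(SimpleHTTPRequestHandler, directory=CAR_DIR)

with ThreadingHTTPServer(("0.0.0.0", 8080), handler) as httpd:
    print("Serving CAR files on port 8080 ...")
    httpd.serve_forever()
```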

How do you plan to choose storage providers

Slack, Big Data Exchange, Partners

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

Providers through BigDataExchange
Providers through Aligned
Providers through Slack

How do you plan to make deals to your storage providers

Boost client, Lotus client, Singularity

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

amughal avatar Jun 03 '23 17:06 amughal

Thanks for your request!

Heads up, you’re requesting more than the typical weekly onboarding rate of DataCap!

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

Does the data in this application overlap with the data from your previous application?

Sunnyiscoming avatar Jun 04 '23 02:06 Sunnyiscoming

Question: is a small overlap allowed (two of the 10 datasets)? If not, I will make sure the datasets do not overlap.

amughal avatar Jun 04 '23 03:06 amughal

No, even a small overlap in data is not allowed.

Sunnyiscoming avatar Jun 04 '23 08:06 Sunnyiscoming

OK, understood, thanks. I will make sure the dataset does not overlap with my previous approval.

amughal avatar Jun 04 '23 11:06 amughal

Datacap Request Trigger

Total DataCap requested

10 PiB

Expected weekly DataCap usage rate

300 TiB

Client address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

Sunnyiscoming avatar Jun 06 '23 12:06 Sunnyiscoming

DataCap Allocation requested

Multisig Notary address

f02049625

Client address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

DataCap allocation requested

150TiB

Id

5d4cc1ae-b938-4f1b-a423-f1262658bbdf

Request Proposed

Your Datacap Allocation Request has been proposed by the Notary

Message sent to Filecoin Network

bafy2bzaceaazbte5g6p75rjuz5uhdripbqchcc6zhq35wolbdkw5q2ywekbjw

Address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

Datacap Allocated

150.00TiB

Signer Address

f1kqdiokoeubyse4qpihf7yrpl7czx4qgupx3eyzi

Id

5d4cc1ae-b938-4f1b-a423-f1262658bbdf

You can check the status of the message here: https://filfox.info/en/message/bafy2bzaceaazbte5g6p75rjuz5uhdripbqchcc6zhq35wolbdkw5q2ywekbjw

jamerduhgamer avatar Jun 07 '23 19:06 jamerduhgamer

Approved the first tranche of datacap because Mongo Storage is a reputable SP that has gone through the ESPA program and the dataset is a public dataset and will be retrievable and stored around the world.

jamerduhgamer avatar Jun 07 '23 19:06 jamerduhgamer

After reading history, I have some questions:

I looked at the data sample. How will you allocate this data, in which cities do you plan to store it, and which storage providers are you currently working with? Please list the SPs you plan to collaborate with and their regions. I look forward to hearing from you.

zcfil avatar Jun 09 '23 03:06 zcfil

Hello @zcfil . Please see replies below:

  1. This will be a distributed allocation. I work a lot with CB at BigDataExchange and others. My previous LDN approvals are spread across both the West Coast and the East Coast of the US.
  2. I act as a data preparer myself and participate in the Slingshot v3 / Moonlanding program. I have my own large storage for the data, across many large JBODs (Dell, HPE, DDN and a few others) and Dell/SuperMicro servers. These are used to download the datasets over a 5 Gbps internet link and then for CAR generation using Singularity (a rough transfer-time estimate follows this list).
  3. The SPs I am currently collaborating with are from BDE. I am not sure I can post their miner IDs publicly here; I need to ask CB at BDE. My own SP miner ID is f01959735, which is not part of this request, but I include it as a reference. Thanks
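
A rough back-of-envelope transfer-time estimate, using only the compressed sizes from the CC-MAIN-2021-49 sample table above and the 5 Gbps link mentioned in point 2; the 70% sustained-utilisation figure is an assumption.

```python
# Estimate how long one crawl takes to download before CAR generation.
crawl_size_tib = 68.66 + 16.66 + 7.18 + 0.15 + 2.29 + 0.2  # CC-MAIN-2021-49, compressed
link_gbps = 5
efficiency = 0.7  # assumed sustained utilisation of the link

size_bits = crawl_size_tib * (1024 ** 4) * 8
seconds = size_bits / (link_gbps * 1e9 * efficiency)
print(f"~{crawl_size_tib:.1f} TiB at {link_gbps} Gbps -> about {seconds / 86400:.1f} days")
```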

amughal avatar Jun 09 '23 15:06 amughal

Here are two additional miner IDs [BDE]: f01967469, f01717477

@zcfil please let me know if you have any further questions.

amughal avatar Jun 11 '23 04:06 amughal

Can you send an email from your official domain to [email protected], with a copy to [email protected], to confirm your identity? The email subject should include the issue ID #2040.

zcfil avatar Jun 12 '23 03:06 zcfil

@zcfil I have just sent the email. Thanks

amughal avatar Jun 12 '23 04:06 amughal

Gmail is not authoritative. If you have any further results from that communication, please feel free to reply at any time. @Sunnyiscoming may I ask whether this is a validated LDN?

zcfil avatar Jun 12 '23 06:06 zcfil

@zcfil I have sent you another email with screenshot.

amughal avatar Jun 12 '23 07:06 amughal

@zcfil I have sent you another email from my official email address. Let me know if anything else is needed.

amughal avatar Jun 13 '23 02:06 amughal

Hi @Sunnyiscoming, @zcfil is waiting for your input. Thanks

amughal avatar Jun 14 '23 01:06 amughal

@amughal did you contact Common Crawl directly? They mentioned to us that serving data from their AWS bucket is extremely expensive for them, and they will likely provide you with a direct link to their HTTP server.

xinaxu avatar Jun 15 '23 18:06 xinaxu

Confirming on behalf of @Sunnyiscoming - email received and confirmed @zcfil

Kevin-FF-USA avatar Jun 15 '23 19:06 Kevin-FF-USA

@xinaxu this is a very valid concern, and I saw issues downloading from AWS last year. Since then, I have used their HTTP servers. In the Moonlanding channel, others complained about slowness and about often not being able to retrieve data at all. I told Caro and the channel about the HTTP option and it worked perfectly. Thanks for raising this.

amughal avatar Jun 15 '23 21:06 amughal

Request Approved

Your Datacap Allocation Request has been approved by the Notary

Message sent to Filecoin Network

bafy2bzacedwnnnseafbrhvcjiypqcn66sptanrwhbkgsobxrqz3veo3qbnzza

Address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

Datacap Allocated

150.00TiB

Signer Address

f1yjhnsoga2ccnepb7t3p3ov5fzom3syhsuinxexa

Id

5d4cc1ae-b938-4f1b-a423-f1262658bbdf

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacedwnnnseafbrhvcjiypqcn66sptanrwhbkgsobxrqz3veo3qbnzza

kernelogic avatar Jun 16 '23 02:06 kernelogic

Approved with clarifications from the T&T team. I also agree HTTP is the way to go; I had to crawl their website to get the direct HTTP download links. You cannot download from the S3 bucket anonymously for this dataset.

kernelogic avatar Jun 16 '23 02:06 kernelogic

Thank you @kernelogic. Is the DataCap allocation of 150 TiB just the first tranche? We are hoping to start slow but work up to sealing 1 PiB per week. Would that be an issue?

amughal avatar Jun 16 '23 03:06 amughal

[screenshot] Received that.

Sunnyiscoming avatar Jun 17 '23 14:06 Sunnyiscoming

@amughal yes, it is the first tranche. Each subsequent tranche is 200% of the previous one, so the next top-up will be 300 TiB (a quick schedule sketch follows below).

kernelogic avatar Jun 17 '23 16:06 kernelogic
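
To illustrate the doubling rule described above, here is a minimal sketch of the tranche schedule implied by "200% of the previous one" against the 10 PiB total. How the real allocation bot caps later tranches (for example at the weekly rate) is not confirmed here, so this only illustrates the doubling itself.

```python
# Sketch the tranche schedule: each tranche doubles until the 10 PiB total is reached.
TOTAL_PIB = 10
total_tib = TOTAL_PIB * 1024

tranche = 150.0  # first tranche, TiB
granted = 0.0
schedule = []
while granted < total_tib:
    tranche = min(tranche, total_tib - granted)  # never exceed the remaining total
    schedule.append(tranche)
    granted += tranche
    tranche *= 2  # next tranche is 200% of the previous one

print([f"{t:g} TiB" for t in schedule])
# e.g. ['150 TiB', '300 TiB', '600 TiB', '1200 TiB', ...]
```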

DataCap Allocation requested

Request number 2

Multisig Notary address

f02049625

Client address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

DataCap allocation requested

300TiB

Id

2ae97ea4-d5af-4a93-9c1c-1a0c76742ce1

Stats & Info for DataCap Allocation

Multisig Notary address

f02049625

Client address

f1b4u4eclr63rjz2wqbnlso75vs5p5qp4rdmj45ai

Rule to calculate the allocation request amount

100% of weekly dc amount requested

DataCap allocation requested

300TiB

Total DataCap granted for client so far

150TiB

Datacap to be granted to reach the total amount requested by the client (10PiB)

9.85PiB

Stats

| Number of deals | Number of storage providers | Previous DC Allocated | Top provider | Remaining DC |
| --- | --- | --- | --- | --- |
| 3593 | 1 | 150TiB | 100 | 35.57TiB |

Hello @kernelogic. We have actively started sealing at a high rate, around 25-30 TiB a day, through the SaaS service provider. Based on current usage, it looks like this application now requires signatures. Could you please sign off as soon as possible? That would help us keep sealing through the holidays (a rough runway estimate follows).

amughal avatar Jun 30 '23 21:06 amughal
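
A rough runway estimate, based only on the 35.57 TiB of remaining DataCap in the stats table above and the stated 25-30 TiB/day sealing rate, showing why the next signature is needed soon.

```python
# Estimate how long the remaining DataCap lasts at the stated sealing rates.
remaining_tib = 35.57
for rate_tib_per_day in (25, 30):
    days = remaining_tib / rate_tib_per_day
    print(f"At {rate_tib_per_day} TiB/day, remaining DataCap lasts ~{days:.1f} days")
```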