
[DataCap Application] Mixed Elements - Picas.ai

Open PicasMix opened this issue 2 years ago • 164 comments

Data Owner Name

Mixed Elements

Data Owner Country/Region

Singapore

Data Owner Industry

IT & Technology Services

Website

https://mixedet.com/

Social Media

https://mixedet.com/

Total amount of DataCap being requested

5PiB

Weekly allocation of DataCap requested

200TiB

On-chain address for first allocation

f1qfsi7n3fh5tsfy3rmj6jwxsxiubvz6only4npxa

Custom multisig

  • [ ] Use Custom Multisig

Identifier

No response

Share a brief history of your project and organization

Picas.ai is an AI art generator that is not yet generally available but has already become one of the more popular AI art creation platforms in the world. Using AI, we aim to expand human imagination and explore the creative space of large-scale art generation. We specialize in art design and artificial intelligence. Our mission is to enable independent creators to succeed in an increasingly competitive digital environment, and to make Picas.ai the best AI art creation platform for creators through continuous optimization of our latent text-to-image diffusion model, which can generate photo-realistic images from any text input.

Is this project associated with other projects/ecosystem stakeholders?

No

If answered yes, what are the other projects/ecosystem stakeholders

No response

Describe the data being stored onto Filecoin

Data Generated by AI Platforms, including pictures, texts, models, videos, etc.

Where was the data currently stored in this dataset sourced from

My Own Storage Infra

If you answered "Other" in the previous question, enter the details here

No response

How do you plan to prepare the dataset

No response

If you answered "other/custom tool" in the previous question, enter the details here

No response

Please share a sample of the data

http://picas.ai:10580/#/fallsImgList

Confirm that this is a public dataset that can be retrieved by anyone on the Network

  • [X] I confirm

If you chose not to confirm, what was the reason

No response

What is the expected retrieval frequency for this data

Monthly

For how long do you plan to keep this dataset stored on Filecoin

More than 3 years

In which geographies do you plan on making storage deals

Asia other than Greater China, North America

How will you be distributing your data to storage providers

Cloud storage (i.e. S3), HTTP or FTP server, Lotus built-in data transfer

How do you plan to choose storage providers

Partners

If you answered "Others" in the previous question, what is the tool or platform you plan to use

No response

If you already have a list of storage providers to work with, fill out their names and provider IDs below

No response

How do you plan to make deals to your storage providers

No response

If you answered "Others/custom tool" in the previous question, enter the details here

No response

Can you confirm that you will follow the Fil+ guideline

Yes

PicasMix avatar Jan 16 '23 02:01 PicasMix

Thanks for your request!

Heads up, you’re requesting more than the typical weekly onboarding rate of DataCap!

Thanks for your request! Everything looks good. :ok_hand:

A Governance Team member will review the information provided and contact you back pretty soon.

Dear applicant,

Thank you for applying for DataCap. As a notary, I am screening your application and doing due diligence.

Looking at your application, I have some questions:

As you are brand new on GitHub and have no history of past applications, applying for 5PB of DataCap is a lot. One needs comprehensive knowledge of Filecoin, of packing data and its requirements, and of distributing the data. Are you brand new in the Filecoin space, or have you applied for DataCap in the past under different GitHub account names?

Can you provide me with some KYC details by e-mail at [email protected]? I would like to receive an e-mail from the original domain name / contact as listed on the website that this application refers to. Your application is for 5PB of DataCap. Can you show us some visible proof of the data size and the storage you have there?

Can you provide us with a distribution plan, supposing you were granted DataCap to store?

As you can see here: https://github.com/filecoin-project/filecoin-plus-large-datasets it states:

In order for a client and their dataset to be eligible: the dataset should be public, open, and mission-aligned with Filecoin and Filecoin Plus. This also means that the data should be accessible to anyone in the network, without requiring any special permissions or access requirements; stored data should be readily retrievable on the network, and this should be regularly verified (through the use of manual or automated verification that includes retrieving data from various miners over the course of the DataCap allocation timeframe).

To make sure that you understand the rules and guidelines, I would like to receive a detailed allocation plan that includes the amount of data and the reputable SPs where you are going to store this data.

cryptowhizzard avatar Jan 16 '23 14:01 cryptowhizzard
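For context on the retrievability requirement quoted above, here is a minimal sketch of how a notary or client can spot-check that a stored payload is retrievable from a given SP. It assumes a synced local lotus node; the miner ID and payload CID below are hypothetical placeholders, not values from this application, and the retrieval flag name differs between lotus versions.

```bash
# Hedged sketch: spot-check retrievability of one payload CID from one SP.
# MINER and DATA_CID are hypothetical placeholders.
# Older lotus builds use --miner, newer ones use --provider for this flag.
MINER=f0xxxxxx
DATA_CID=bafy...          # placeholder payload CID
lotus client retrieve --miner "$MINER" "$DATA_CID" /tmp/retrieval-check.bin \
  && echo "retrieval from $MINER succeeded" \
  || echo "retrieval from $MINER failed"
```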

Dear cryptowhizzard,

Thank you very much for your reply.

We understand that extra caution is needed with large DataCap requests due to false or inaccurate applications. We must remain vigilant and dedicated to high standards to ensure the data collected is accurate and up to date.

Although the GitHub account was newly applied by the company for the new project, I have done extensive research to gain a comprehensive understanding of Filecoin, packing data, and its requirements and data distribution to ensure I am well-equipped to handle the data storage needs.

We understand the importance of KYC details and are happy to provide any information needed. We are committed to protecting user information and do not collect any personal data. We can send an e-mail from the original domain name / contact (@.***), as listed on the website, that refers to this application.

To demonstrate our ability to store and manage large datasets, we can provide visible proof of our current data size and storage capacity. We currently manage a large dataset of AI-generated images on the Picas.ai (http://picas.ai/) platform, with over 5PB of total data stored. We can provide proof of our current storage usage and capacity, as well as evidence that we are able to handle large-scale data storage. The information related to this proof is a commercial secret and will be sent to you by email later. We will also monitor the performance of the service providers to ensure that the data remains secure and retrievable over a long period of time.

To ensure our proposed allocation plan meets the rules and guidelines, we plan to store our data on storage providers (SPs) with a high reputation in the countries and regions mentioned. We plan to distribute deals across SPs, aiming to give less than 20% of deals to each provider. This will allow us to ensure our data is spread across multiple secure locations, while still adhering to the rules and regulations. We will also ensure the data is retrievable in the future by us and other users. We have contacted some SP partners and will contact more in the future. Part of the SP list is as follows: f062982, f01857791, f01923304, f062811, f01900522.

According to our service agreement (http://www.picas.ai/#/service), all content generated by the artificial intelligence model of the http://picas.ai/ platform belongs to the platform. With the aim of advancing progress for humanity, the platform is committed to sharing all creative works with everyone. We believe that by allowing unrestricted access to the fruits of creativity, the boundaries of human imagination can be expanded to unimaginable heights. This commitment to share and grow together is what makes the platform so powerful and inspiring.

I am confident that I have provided the community with the necessary information. If there is anything else that needs clarification, please feel free to reach out. We are devoted to transparency and openness, and with our collective efforts we will be able to take Filecoin to the next level, making WEB3 a reality. We understand that progress can only be made with the support of the community, and we are eager to take this journey together.


PicasMix avatar Jan 17 '23 10:01 PicasMix

@PicasMix There are some issues with the provided SP list:

  • f062982 - IP is on chain (103.127.248.196) but does not work (tried telnet to the announced port)
  • f01857791 - No IP on chain
  • f01923304 - No IP on chain
  • f062811 - IP is on chain (103.127.248.194) but does not work (tried telnet to the announced port)
  • f01900522 - No IP on chain

(Commands used to check: lotus state miner-info & telnet)

lvschouwen avatar Jan 17 '23 10:01 lvschouwen
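For readers following along, a rough sketch of the check described above (reading the announced multiaddrs from chain state, then probing the announced IP and port) is shown below. It assumes a synced lotus node; the exact label printed by `lotus state miner-info` can vary between versions, and `nc -z` stands in for an interactive telnet session.

```bash
# Hedged sketch of the reachability check described above.
for sp in f062982 f01857791 f01923304 f062811 f01900522; do
  echo "== $sp =="
  # Show whatever multiaddr line (if any) the miner has announced on chain.
  lotus state miner-info "$sp" | grep -i multiaddr || echo "no multiaddr on chain"
done
# Once an IP and port are known, probe reachability manually, e.g.:
#   nc -vz 103.127.248.196 <announced-port>
```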

Can you provide more data samples to prove that you have 5 PB data already? What's the relationship between you and the organization?

Sunnyiscoming avatar Jan 18 '23 11:01 Sunnyiscoming

Datacap Request Trigger

Total DataCap requested

5PiB

Expected weekly DataCap usage rate

200TiB

Client address

f1qfsi7n3fh5tsfy3rmj6jwxsxiubvz6only4npxa

simonkim0515 avatar Jan 18 '23 19:01 simonkim0515

DataCap Allocation requested

Multisig Notary address

f02049625

Client address

f1qfsi7n3fh5tsfy3rmj6jwxsxiubvz6only4npxa

DataCap allocation requested

100TiB

Id

85c7049a-31a0-4a6b-b826-54dde88cd276

DataCap and CID Checker Report[^1]

There is no previous allocation for this issue.

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

Can you provide more data samples to prove that you have 5 PB data already? What's the relationship between you and the organization?

Our [Picas.ai](http://picas.ai/) platform currently holds an impressive 5PB of data, and this is only the beginning. This library consists of AI-generated images, texts, models, and videos, with a massive collection of creative images already stored and growing daily. For a sample of the data, [here](http://www.picas.ai:10580/#/allImage) you can find an extensive collection of art images, from which more can be discovered by simply scrolling down the page. Our team is dedicated to providing the highest quality data and adding to our collection daily.

Mixed Elements is the organization behind the [Picas.ai](http://picas.ai/) platform, and our mission is to make the creative process available to everyone, regardless of their experience level. We believe that everyone should have the opportunity to express themselves through art, and our user-friendly platform provides the perfect environment for this. Our dedicated team works hard to ensure that our users have all the tools they need to bring their ideas to life and share them with the world. We continually strive to optimize the platform and expand our offerings to make sure that our users have a comprehensive set of tools to create stunning visuals and share them in a variety of formats.

LeonTing1010 avatar Jan 19 '23 09:01 LeonTing1010

@LeonTing1010 How many pictures are you talking about? I'd like to see proof of the storage requirements. With an average image at about 2 MiB, you are requesting to store 2,684,354,560 images, over two billion?

I find that hard to believe, so please show us more in-depth details on how and why you need 5PiB. If you want to start by showing a detailed onboarding plan and maybe a 50/100T demo to show everyone how you are storing it, I would appreciate that.

herrehesse avatar Jan 19 '23 09:01 herrehesse
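The two-billion figure above is straightforward unit arithmetic (5 PiB expressed in MiB, divided by the assumed 2 MiB average image size) and can be reproduced with plain shell arithmetic:

```bash
# 5 PiB = 5 * 1024^3 MiB = 5,368,709,120 MiB; divide by 2 MiB per image.
echo $(( 5 * 1024 * 1024 * 1024 / 2 ))   # prints 2684354560, i.e. ~2.68 billion images
```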

Dear applicant,

Thank you for applying for DataCap. As a Filecoin Fil+ notary, I am screening your application and conducting due diligence.

Looking at your application, I have some questions: As you are brand new on GitHub and have no history of past applications, it seems to me that applying for 5PB of DataCap is a lot. One needs comprehensive knowledge of Filecoin, the packing of data, the distribution of data, and all the requirements that come with it. Are you brand new in the Filecoin space, or have you applied for DataCap in the past under different GitHub account names?

Can you show us visible proof of the size of your data and the storage systems you have there?

As a last question, I would like you to fill out this form to provide us with the necessary information to make an educated decision on whether we would like to support your LDN request.

Thanks!

cryptowhizzard avatar Feb 01 '23 07:02 cryptowhizzard

@PicasMix There are some issues with the provided SP list:

  • f062982 - IP is on chain (103.127.248.196) but does not work (tried telnet to the announced port)
  • f01857791 - No IP on chain
  • f01923304 - No IP on chain
  • f062811 - IP is on chain (103.127.248.194) but does not work (tried telnet to the announced port)
  • f01900522 - No IP on chain

(Commands used to check: lotus state miner-info & telnet)

I received a response from the SPs and the issue has been resolved.

PicasMix avatar Feb 01 '23 13:02 PicasMix

@LeonTing1010 How many pictures are you talking about? I'd like to see proof of the storage requirements. With an average image at about 2 MiB, you are requesting to store 2,684,354,560 images, over two billion?

I find that hard to believe, so please show us more in-depth details on how and why you need 5PiB. If you want to start by showing a detailed onboarding plan and maybe a 50/100T demo to show everyone how you are storing it, I would appreciate that.

[screenshot: proof of stored data] The data is relatively large, so I asked an engineer to provide screenshots of the stored data. The screenshot above shows that 100T+ of image storage is already held on one of our servers, containing approximately 106 million images. If you need further evidence, please let us know.

PicasMix avatar Feb 01 '23 13:02 PicasMix

f01857791 -> {12D3KooWKoYod14XFA2prK6pJmcDaRydpCsVJoxysLjuuc7z6qai: [/ip4/103.242.73.146/tcp/23716]}
f01923304 -> {12D3KooWLw2AEqascpPPL7MxrxZn6pdwzYdyzb9VnYyroGGusqpm: [/ip4/103.242.73.151/tcp/31853]}
f01900522 -> {12D3KooWRnATcK383n8Fomknw4tN4KXA8zhTqEc7itCsZZ6a6Pa7: [/ip4/103.242.74.239/tcp/46329]}
f062982 -> {12D3KooWDPKXF5suKz9tZmEm8hYCUvCuCfidVTADUrTPWKM3xq8K: [/ip4/103.242.75.54/tcp/11234]}
f062811 -> {12D3KooWC6yyVBJZW8CJUWGcxPdvhguWuDHXkBm3VL2iimhNgc3E: [/ip4/103.242.74.232/tcp/17429]}

So both f062982 and f062811 changed IP address, and all miners are now hosted in the same place: 103.242.73.x and 103.242.74.x.

Where else are you going to store the data once you get the DataCap approved?

lvschouwen avatar Feb 01 '23 13:02 lvschouwen
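A sketch of how the colocation observation above can be reproduced is shown below. It assumes a synced lotus node, and grouping by /24 prefix is only a heuristic for "hosted in the same place".

```bash
# Hedged sketch: print each SP's first announced IPv4 address and its /24 prefix.
# SPs sharing a /24 prefix are very likely hosted in the same facility.
for sp in f01857791 f01923304 f01900522 f062982 f062811; do
  ip=$(lotus state miner-info "$sp" | grep -oE '/ip4/[0-9.]+' | head -n1 | cut -d/ -f3)
  echo "$sp ${ip:-none} ${ip:+${ip%.*}.0/24}"
done | sort -k3
```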

f01857791 -> {12D3KooWKoYod14XFA2prK6pJmcDaRydpCsVJoxysLjuuc7z6qai: [/ip4/103.242.73.146/tcp/23716]}
f01923304 -> {12D3KooWLw2AEqascpPPL7MxrxZn6pdwzYdyzb9VnYyroGGusqpm: [/ip4/103.242.73.151/tcp/31853]}
f01900522 -> {12D3KooWRnATcK383n8Fomknw4tN4KXA8zhTqEc7itCsZZ6a6Pa7: [/ip4/103.242.74.239/tcp/46329]}
f062982 -> {12D3KooWDPKXF5suKz9tZmEm8hYCUvCuCfidVTADUrTPWKM3xq8K: [/ip4/103.242.75.54/tcp/11234]}
f062811 -> {12D3KooWC6yyVBJZW8CJUWGcxPdvhguWuDHXkBm3VL2iimhNgc3E: [/ip4/103.242.74.232/tcp/17429]}

So both f062982 and f062811 changed IP address, and all miners are now hosted in the same place: 103.242.73.x and 103.242.74.x.

Where else are you going to store the data once you get the DataCap approved?

f062982 and f062811 informed us that their network had been adjusted, resulting in a change of IP. We chose Singapore-based service providers because offline data transfer is more convenient and faster than online transmission. Does the data have to be distributed to multiple countries? If so, we will choose service providers in other countries to store the data.

PicasMix avatar Feb 03 '23 01:02 PicasMix

Dear applicant,

Thank you for applying for DataCap. As a Filecoin Fil+ notary, I am screening your application and conducting due diligence.

Looking at your application, I have some questions: As you are brand new on GitHub and have no history of past applications, it seems to me that applying for 5PB of DataCap is a lot. One needs comprehensive knowledge of Filecoin, the packing of data, the distribution of data, and all the requirements that come with it. Are you brand new in the Filecoin space, or have you applied for DataCap in the past under different GitHub account names?

Can you show us visible proof of the size of your data and the storage systems you have there?

As a last question, I would like you to fill out this form to provide us with the necessary information to make an educated decision on whether we would like to support your LDN request.

Thanks!

We have completed the Know Your Customer (KYC) information for the application. Since the subsequent process may take some time, could you give us an estimated timeframe, so that we can plan ahead and be prepared for the next steps and any potential delays?

PicasMix avatar Feb 03 '23 03:02 PicasMix

Hi there @PicasMix

Thanks for submitting your KYC. It is well received.

Upon evaluation of your application, it seems there are still some issues that need to be resolved.

You indicated that you are the data preparer and the owner of the dataset. Can you confirm that you don't have any stake in the SPs you are going to send this data to? As stated in the form, I am allowing one replica of the set to be stored on your own miners. That's the reason I ask.

Second:

As the form states:

RULES: In order to be eligible for the Filecoin+ incentive program, which provides 10x the power (QAP) for storing data on the Filecoin network, there are certain rules that must be followed:

  • Data must be stored with multiple independent service providers, not just one organization running multiple SPs.
  • Data must be distributed across multiple regions. Using a VPN to fake presence is not allowed.
  • Data must be publicly retrievable for verification that it is being stored as claimed. It is highly recommended to run Boost.
  • Data must align with the mission of Fil+, meaning it must be valuable for humanity, whether scientific or unique in some other way.
  • Data must be open and not encrypted. If you wish to encrypt your data, consider applying for the FIL-E program instead.

Note: As a data preparer, you are allowed to store one copy yourself for redundancy at the request of the client. However, if your copy is stored in the USA, the other organizations/SPs must be outside your region. Dcent encourages applications with a minimum spread of 2 continents and 3 different organizations.

Can you provide me with the SPs and their contact information? The SPs you mentioned are effectively one organization: they all share the same IP range and are located in the same datacenter. I need 3 more organizations here.

f01857791 -> {12D3KooWKoYod14XFA2prK6pJmcDaRydpCsVJoxysLjuuc7z6qai: [/ip4/103.242.73.146/tcp/23716]}
f01923304 -> {12D3KooWLw2AEqascpPPL7MxrxZn6pdwzYdyzb9VnYyroGGusqpm: [/ip4/103.242.73.151/tcp/31853]}
f01900522 -> {12D3KooWRnATcK383n8Fomknw4tN4KXA8zhTqEc7itCsZZ6a6Pa7: [/ip4/103.242.74.239/tcp/46329]}
f062982 -> {12D3KooWDPKXF5suKz9tZmEm8hYCUvCuCfidVTADUrTPWKM3xq8K: [/ip4/103.242.75.54/tcp/11234]}
f062811 -> {12D3KooWC6yyVBJZW8CJUWGcxPdvhguWuDHXkBm3VL2iimhNgc3E: [/ip4/103.242.74.232/tcp/17429]}

Thanks!

cryptowhizzard avatar Feb 03 '23 10:02 cryptowhizzard


You indicated that you are the data preparer and the owner of the dataset. Can you confirm that you don't have any stake in the SPs you are going to send this data to? As stated in the form, I am allowing one replica of the set to be stored on your own miners. That's the reason I ask.

Yes, I can confirm that.

Can you provide me with the SPs and their contact information? The SPs you mentioned are effectively one organization: they all share the same IP range and are located in the same datacenter. I need 3 more organizations here.

Here is the contact information for three more Service Providers:

PicasMix avatar Feb 20 '23 06:02 PicasMix

Hi @PicasMix

Thanks. I will propose in a bit.

Please mind:

lotus net connect f01923312
f01923312 -> {12D3KooWLc1SP4E8XK4P9sYkR57Z5VgC6c7YoUtDLZHzvd5QCYvh: []}
ERROR: failed to parse multiaddr "f01923312": must begin with /

f01923312 needs to set its IP address on chain in order to be reachable. Please have this fixed before you send any deals there.

cryptowhizzard avatar Feb 20 '23 12:02 cryptowhizzard
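For reference, the fix requested above is normally applied by the SP itself with `lotus-miner actor set-addrs`; a minimal sketch follows, using a placeholder documentation IP and port rather than values from this thread.

```bash
# Hedged sketch: announce a publicly reachable multiaddr on chain so clients
# and notaries can dial the miner. 203.0.113.10:24001 is a placeholder.
lotus-miner actor set-addrs /ip4/203.0.113.10/tcp/24001

# Verify from any synced lotus node that the address now shows up on chain:
lotus state miner-info f01923312 | grep -i multiaddr
```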

Request Proposed

Your Datacap Allocation Request has been proposed by the Notary

Message sent to Filecoin Network

bafy2bzacecsevokp7uwfqaue2goctqbuwt3w3zlsq4rhfksq7qvmgduza7vpk

Address

f1qfsi7n3fh5tsfy3rmj6jwxsxiubvz6only4npxa

Datacap Allocated

100.00TiB

Signer Address

f1krmypm4uoxxf3g7okrwtrahlmpcph3y7rbqqgfa

Id

85c7049a-31a0-4a6b-b826-54dde88cd276

You can check the status of the message here: https://filfox.info/en/message/bafy2bzacecsevokp7uwfqaue2goctqbuwt3w3zlsq4rhfksq7qvmgduza7vpk

cryptowhizzard avatar Feb 20 '23 12:02 cryptowhizzard
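Once a second notary approves, the client's remaining DataCap can be checked from any synced lotus node. A minimal sketch, assuming a lotus build that ships the `filplus` subcommands (older builds may not include them):

```bash
# Hedged sketch: query the remaining DataCap for the client address above.
lotus filplus check-client-datacap f1qfsi7n3fh5tsfy3rmj6jwxsxiubvz6only4npxa
```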

Hi @PicasMix

Thanks. I will propose in a bit.

Please mind:

lotus net connect f01923312
f01923312 -> {12D3KooWLc1SP4E8XK4P9sYkR57Z5VgC6c7YoUtDLZHzvd5QCYvh: []}
ERROR: failed to parse multiaddr "f01923312": must begin with /

f01923312 needs to set its IP address on chain in order to be reachable. Please have this fixed before you send any deals there.

Thanks. Our SP has fixed the issue with multiaddr for f01923312.

PicasMix avatar Feb 21 '23 08:02 PicasMix

@cryptowhizzard @lvschouwen @herrehesse @Sunnyiscoming thank you for your support! Dear notaries, cryptowhizzard has signed and one more signature is needed to get started; can you please help us here?

PicasMix avatar Mar 20 '23 08:03 PicasMix

@Sunnyiscoming @nj-steve @a1991car Dear notaries, @cryptowhizzard has signed and one more signature is needed to get started; can you please help us here?

PicasMix avatar Mar 24 '23 06:03 PicasMix

Based on the comments in the issue and the communication from the client, I am willing to support this round and continuously monitor the subsequent allocation.

liyunzhi-666 avatar Mar 28 '23 12:03 liyunzhi-666

[screenshot] I'm having some issues with signing; please check it out. @fabriziogianni7 @panges2 cc @Kevin-FF-USA

liyunzhi-666 avatar Mar 28 '23 12:03 liyunzhi-666

Based on the comments in the issue and the communication from the client, I am willing to support this round and continuously monitor the subsequent allocation.

@liyunzhi-666 Thanks for your support!

PicasMix avatar Mar 31 '23 02:03 PicasMix

checker:manualTrigger

zcfil avatar Mar 31 '23 02:03 zcfil

DataCap and CID Checker Report[^1]

No application info found for this issue on https://filplus.d.interplanetary.one/clients.

[^1]: To manually trigger this report, add a comment with text checker:manualTrigger

[screenshot] I'm having some issues with signing; please check it out. @fabriziogianni7 @panges2 cc @Kevin-FF-USA

I'm guessing the status:StartSignDatacap label needs to be added first. @liyunzhi-666 @fabriziogianni7 @panges2 cc @Kevin-FF-USA

PicasMix avatar Mar 31 '23 03:03 PicasMix