flow-nft
Feature: NFT Metadata Standard should support bridging to other networks
Issue To Be Solved
There is a desire for NFTs on Flow to be bridgeable to other platforms, e.g. Ethereum. Secondarily, it is likely that there will also be demand to bridge NFTs from other platforms onto Flow. The NFT metadata system should have a clear way to take metadata from Flow NFTs and integrate it with other chains and platforms.
Currently, the most popular metadata standard is OpenSea's "ERC721 Metadata Extension" plus the "ERC721 Metadata JSON Schema". In order to integrate with OpenSea on Ethereum, any on-chain data would need to somehow be serialized into a JSON format compatible with OpenSea and other platforms, and somehow made available via the actual contract of the NFT that is minted on the target chain.
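For reference, a document in the shape that OpenSea's ERC721 Metadata JSON Schema expects might look like the following sketch; the field values and the example CID are illustrative placeholders, not taken from any real NFT:

```python
import json

# A minimal metadata document in the shape OpenSea's ERC721 Metadata
# JSON Schema expects; "attributes" follows OpenSea's trait convention.
# All values here are made-up placeholders.
metadata = {
    "name": "Example NFT #1",
    "description": "An NFT bridged from Flow.",
    "image": "ipfs://QmExampleImageCid",  # placeholder CID
    "attributes": [
        {"trait_type": "Rarity", "value": "Legendary"},
    ],
}

document = json.dumps(metadata, indent=2)
print(document)
```

Whatever on-chain representation Flow uses, the serialization step ultimately has to produce a document of roughly this shape for OpenSea and similar platforms to consume it.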
(Optional): Suggest A Solution
Option 1) We could provide a standard for serializing content from our on-chain formats into JSON, and even tools for doing so. Dapp developers could then use those tools to generate metadata documents and provide them via an IPFS endpoint or other method, so that when their NFT is bridged the document already exists at a known location.
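A sketch of what such a serialization tool might look like, assuming a hypothetical Display-style view with name/description/thumbnail fields (the view shape and the field mapping here are assumptions, not a settled standard):

```python
import json
from dataclasses import dataclass

@dataclass
class DisplayView:
    """Hypothetical on-chain metadata view (field names are illustrative)."""
    name: str
    description: str
    thumbnail: str  # e.g. an ipfs:// or https:// URL

def to_opensea_json(view: DisplayView) -> str:
    """Serialize a Flow metadata view into an OpenSea-compatible document.

    Keys are sorted and separators fixed so the output is deterministic:
    the same view always produces byte-identical JSON, which matters if
    the document is later pinned to IPFS and addressed by content hash.
    """
    doc = {
        "name": view.name,
        "description": view.description,
        "image": view.thumbnail,
    }
    return json.dumps(doc, sort_keys=True, separators=(",", ":"))

view = DisplayView("Moment #7", "A great play.", "ipfs://QmThumb")
print(to_opensea_json(view))
# → {"description":"A great play.","image":"ipfs://QmThumb","name":"Moment #7"}
```

The deterministic encoding is the important design choice: it lets any party regenerate the exact same document from the on-chain data.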
Option 2) We could write a service that looks up on-chain data and returns it as a JSON document conforming to the OpenSea JSON spec on Ethereum. This service could be provided as a plugin for the access node API, and any bridge could automatically generate a URL targeting an API endpoint that returns the data.
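A minimal sketch of the resolver such a service could wrap; the on-chain lookup is faked with a dictionary, and the address, collection name, and URL scheme are all hypothetical:

```python
import json

# Stand-in for an on-chain lookup; a real plugin would query execution
# state for the NFT's metadata view. The (address, collection, id) key
# and all values below are made-up placeholders.
FAKE_CHAIN_STATE = {
    ("0x0123456789abcdef", "ExampleCollection", 42): {
        "name": "Moment #42",
        "description": "A bridged moment.",
        "image": "ipfs://QmThumb42",
    },
}

def resolve_metadata(address: str, collection: str, token_id: int) -> str:
    """Return the NFT's metadata as an OpenSea-compatible JSON document.

    A bridge could point an ERC-721 tokenURI at a URL such as
    /v1/metadata/{address}/{collection}/{token_id} served by this
    resolver (the URL scheme is an assumption for illustration).
    """
    view = FAKE_CHAIN_STATE.get((address, collection, token_id))
    if view is None:
        raise KeyError("unknown NFT")
    return json.dumps(view, sort_keys=True)

print(resolve_metadata("0x0123456789abcdef", "ExampleCollection", 42))
```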
Other Questions:
Are we at all concerned with state bloat with all this metadata being stored on chain?
Option 1 - I think this could be more than a tool set; it could be a demo bridging project that serves as a blueprint.
But I think metadata conversion is only a small part of the whole task here. Anyone who can develop a bridge can certainly extract JSON from a metadata view (and maybe they don't even need to extract it from MetadataViews at all).
Option 2 - I think this is not possible, for technical and political reasons (access node load, the Flow team having to develop it, looking up an NFT on chain by id/uuid, etc.).
Are we at all concerned with state bloat with all this metadata being stored on chain?
I take the opposite view: storage is cheap (especially on Flow), so I think more metadata should be on chain rather than off chain.
- The problem here is responsibility, or more precisely: who pays for it.
There's a reasonable construction here that I think works, though:
- The dapp developer implements a specific schema for their NFT, and that schema has a known serialization. The dapp owner (or NFT owner) then serializes that on-chain schema and pins the resulting document to IPFS (with a known encoder).
The bridge can then perform this same process, but instead of pinning the document (since that would cost it money in perpetuity) it would just use the CID to construct the URI for the metadata view and place that in the ERC-721 tokenURI method when deploying.
This means the cost of serving the metadata is initially borne by the dapp developer and, hopefully, by the NFT owner over time.
- The AN tooling will be able to run multiple consumers off of one execution state / consensus follower module. One could write a plugin service, similar to the DPS, that could optionally be run by infrastructure providers or dapps. One such plugin could be a metadata resolver that returns results in various encodings.
And lastly, if state isn't cheap, it will force people to clean up their crappy NFTs that aren't worth the storage cost. Maybe that's a good thing.
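The serialize-and-pin construction above hinges on the dapp and the bridge deriving the same content identifier from the same canonical bytes. A sketch of that idea, with a sha256 digest standing in for a real IPFS CID (a genuine implementation would use the multiformats/multihash machinery; the digest-as-CID here is purely an assumption for illustration):

```python
import hashlib
import json

def canonical_bytes(view: dict) -> bytes:
    """Deterministic encoding of a metadata view (the 'known encoder')."""
    return json.dumps(view, sort_keys=True, separators=(",", ":")).encode()

def content_id(view: dict) -> str:
    # Stand-in for a real IPFS CID: a genuine implementation would wrap
    # the hash with multihash/CIDv1. The point is only that both parties
    # derive the identifier from the same canonical bytes.
    return hashlib.sha256(canonical_bytes(view)).hexdigest()

def token_uri(view: dict) -> str:
    """What the bridge would place in the ERC-721 tokenURI, without
    pinning the document itself."""
    return f"ipfs://{content_id(view)}"

view = {"name": "Moment #7", "description": "A great play.", "image": "ipfs://QmThumb"}

# The dapp pins canonical_bytes(view); the bridge independently computes
# the same identifier (key order doesn't matter) and embeds only the URI.
assert content_id(view) == content_id(dict(reversed(list(view.items()))))
print(token_uri(view))
```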
Are we thinking of 3 actors: dapp, bridge, and user? If so, from this point of view:
- the dapp sets the metadata (probably the same metadata for multiple NFTs)
- the bridge can move this data to some storage (it will not be the user, as a duplicate-data situation can arise here)
So essentially, the user (not the owner) paying for the hosting is a dead end (even if it is hosted on IPFS).
So the opportunity here is the bridge provider becoming something SaaS-like, such as graffle/DPS/flowscan, I guess. They would then also have a free choice of using IPFS, an AN plugin, AWS, etc.
An AN plugin is a tempting idea, but the problem is what will happen 10 years later: a dapp staying in business is much less likely than a bridge provider staying in business.
We could provide a standard for serializing content from our on-chain formats into JSON formats and even tools for doing so.
This seems straightforward. I think it relates to the conversation in #74, which proposes a view that contains the same data as the OpenSea standard. The "serialize to JSON" idea is also similar to #73.
I agree with @bluesign that storage shouldn't be a concern here, especially for the kind of data that's stored in an OpenSea JSON file.