What Subgraph Migration means for #GRT and why it is the future.
In a recent post on The Graph blog, it was announced that they will start migrating more subgraphs to mainnet in batches. They recently announced this has just started, with the first 10 subgraphs being migrated. This announcement went relatively unnoticed outside of the dedicated hodlers, so I wanted to explain why it is big news and why everyone should be paying attention. It shows #GRT is executing as planned, which is great for any company or business model, but especially great in crypto, where many are still trying to figure out whether it is all systems go, particularly amid the rise of chains full of crazy token names. The #GRT roadmap quotes below are from the first article, for those interested.
Phase one, or Bootstrapping Migration: “Starting in April, GRT will be collaborating with Indexers and subgraph developers to publish and test an initial set of subgraphs on mainnet. GRT will begin working closely with a group of Migration Partners to ensure a seamless transition before opening things up for all subgraphs to migrate from the hosted service. A thoughtful migration process is critical to ensure that dapps continue to run smoothly with the upgraded support from the large community of Indexers!” This is the initial expansion and testing of the network, giving a glimpse of what it can do. The results should start showing soon, and May should be an interesting month with the first 10 subgraphs having recently been migrated.
Phase two, Production dApps: The rise of the network, in short. “Initial subgraph syncing from the Migration Bootstrapping phase may take anywhere from a few hours to a few weeks depending on the subgraph. Once the subgraphs are fully synced, Migration Partners will be able to test their dapps live on the network. After a QA process, they will be ready to switch over their production dapps to the network, bringing query fees to Indexers and Delegators!” This means more choice for delegators, competition in subgraph usage, and the potential for competition in attracting delegators. It will bring growing query volume to the network and allow greater reward potential for those using it.
Phase three, Curation Live: The watershed moment for the network. In short, showtime. “After the initial Migration Partners are live on the network, The Graph Foundation will work with the community to launch a public Gateway, which will make it easy for developers to publish subgraphs on the network and pay query fees in GRT. The Gateway and a set of products are expected to launch 30 to 60 days after the start of Phase 1, bringing The Graph Network out of beta.” This is where the data revolution starts to go full force, as it could lead to exponential growth with the constant creation of new subgraphs to index, based on demand signals created by dApp use cases.
Now that you have some background on what’s going on with migration, I am going to talk about why it matters so much for the future of web 3, and why it truly shows how all-systems-go this super cycle really is. As subgraphs migrate and the decentralized data living on web 3 blockchains is made usable, data managers will start to see the real-world use cases of web 3. I have talked about these in earlier #GRT articles, so check them out for more information.
Web 3, or the decentralized internet, is something the mainstream still doesn’t know a lot about, and when you talk crypto with a normie they tend to think we are just in another 2017 hype run. For some info on what is different this time, check out this article from https://www.graphtronauts.com/; it has a plethora of info on the changes happening this time. Some are trying to say winter is coming, but the test of that concept will be DeFi, as long-term structural rewards and use cases can change the game forever. This was the case for the internet in the 2000s vs the 1990s. In the 90s there was a dot-com bubble, with everything going online on the belief that the web was the wave of the future. And while it was, in the 90s you could get on the web but it was very hard to use. Utility systems started to change this in the late 90s and early 2000s as the internet became mainstream. This is the stage we are at in web 3 and crypto, and it is still very early.
Some may ask what web 3 will solve that isn’t already solved, and in response to something like #GRT say, why not just use Google? For those in the data management industry, the answer is that data is no longer single-faceted; it is now multifaceted. In simple terms, it requires multiple sources to pull the vital info, then organization of the info pulled to get it to a usable state. Decentralization is key to getting past data fragmentation: you can drop data from siloed tools into a single data pool, or build an aggregator that can pull from multiple pools on a system-agnostic structure such as a blockchain, or a database-agnostic platform like The Graph protocol.
A lot of data has already been moving to blockchains to make it more secure and usable, as companies want to get rid of siloed data for business intelligence. Much of the current conversation centers on low-code and no-code initiatives where AI will link the fragmented systems, but the questions that always come up with current solutions are: Is the data still secure? Is there spillage? What types of data can it process? And so on. These are the areas of play for the future as web 3 comes into its own.
Subgraphs are key to web 3 showing its utility to the data managers of the world, as they connect fragmented systems through a decentralized apparatus. This means they bring the user experience down to the level of simple users rather than developers, which is exactly what Google did for the web 1 internet. In other words, it was and is about so much more than just a search engine; it is about data utility. Saying #GRT is the Google of crypto does not mean it is simply a querying tool; it means it is the answer to the fragmented-data conundrum that plagues the data-usability world of today. It makes data easily available for the back-end code of a dApp to link to and pull for functionality through subgraphs. It curates data for accuracy and improves through the enhancement and use of subgraphs pointing to the proper data, and so much more.
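To make the dApp back-end connection concrete, here is a minimal sketch in Python of how a client might package a GraphQL query for a subgraph endpoint. The entity and field names (`tokens`, `symbol`, `volume`) and the query shape are illustrative assumptions, not taken from any specific deployed subgraph; a real dApp would POST this JSON body to its subgraph's query URL.

```python
import json

# Illustrative GraphQL query: entity and field names are hypothetical,
# chosen only to show the shape of a typical subgraph query.
QUERY = """
{
  tokens(first: 5, orderBy: volume, orderDirection: desc) {
    id
    symbol
    volume
  }
}
"""

def build_subgraph_request(query, variables=None):
    """Package a GraphQL query (and optional variables) into the JSON
    body a GraphQL endpoint such as a subgraph expects."""
    body = {"query": query}
    if variables:
        body["variables"] = variables
    return body

# A dApp back end would send this payload with an HTTP POST, e.g. via
# requests.post(subgraph_url, json=payload); omitted here since the
# endpoint URL would be an invented specific.
payload = build_subgraph_request(QUERY)
print(json.dumps(payload)[:40])
```

The point of the sketch is how little glue a dApp needs: the heavy lifting of indexing and serving the data is done by the network, and the application just sends a declarative query.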
As always, remember this is not and should not be taken as investment advice; these are just my thoughts and opinions on a great crypto project and where it could go.