A Long-Term Discussion on GRT Usage Rate and Potential Token Price: To Mars and Even the Andromeda Galaxy
With all the recent news on crypto and some discussion of FUD around the tokenomics of #GRT, I have decided to write up why the token supply is nothing to worry about in the long term. First, I will explain how I came upon the GRT rocket ship and what attracted me to it, to show where my passion comes from.

The Graph is a utility token taking on the challenge of getting data off-chain and making it usable. People call it the Google of crypto; I would call it Google 3.0, as GRT could give far better results than Google search if used right. I discovered GRT through a learn-and-earn video on Coinbase back in December 2020, when it was still a very young project. In my profession I deal with market research and business assistance and am constantly researching companies and opportunities, so I know the value accurate data has for creating sales and capturing portfolios in the business world, and how complicated and antiquated the current methods are. This is where I believe GRT comes into play in making data actually usable. As of 2018, we create 2.5 quintillion bytes of data every day. Is any of this data usable? Yes, but it can be very hard to collect and verify.

GRT started as an Ethereum token, but ETH is only a starting point; the rocket will take off as more data migrates to blockchains while the industry grows and develops, and we are still extremely early in the adoption curve. Currently, GRT's tokenomics center on three functional roles: Indexers, who index the data and serve it to the network; Curators (yet to hit mainnet), who signal on subgraphs and establish their value; and Delegators, who provide liquidity to the system by staking with Indexers for use on the network. The circulating supply is 1.2B of 10B tokens per CoinGecko, and the current price is around $1.72 per token, for a roughly $2B market cap.
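The supply and price figures above can be sanity-checked with a quick calculation; a minimal sketch, using the snapshot numbers quoted in this article (the fully diluted figure is my own derived number, not one from the article):

```python
# Back-of-envelope check of the GRT figures quoted above.
CIRCULATING_SUPPLY = 1.2e9   # GRT in circulation per CoinGecko
TOTAL_SUPPLY = 10e9          # total GRT supply
PRICE_USD = 1.72             # spot price at time of writing

market_cap = CIRCULATING_SUPPLY * PRICE_USD      # ~$2.06B, matching the ~$2B cited
fully_diluted = TOTAL_SUPPLY * PRICE_USD         # ~$17.2B if all 10B tokens circulated

print(f"Market cap:          ${market_cap / 1e9:.2f}B")
print(f"Fully diluted value: ${fully_diluted / 1e9:.2f}B")
print(f"Share circulating:   {CIRCULATING_SUPPLY / TOTAL_SUPPLY:.0%}")
```

The gap between the two figures is exactly why supply concerns come up: only about 12% of tokens circulate today.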
Some have expressed concerns about the number of tokens, and there was recently a discussion on liquidity for Indexers to be able to sell their token rewards to cover operating costs. I will delve into both in the next section, as they illustrate a dynamic that will continue to play out long term and is not often discussed with the rise of utility tokens: supply clogs that will eventually need ancillary pairings to add liquidity. It is much like a VC-backed company in a high-cost industry trying to reach economies of scale: it needs multiple financing rounds and multiple investor streams, while still providing returns to early investors, until it can show it is ready to go public and demonstrate its full potential.
Next, I will discuss some background on where #GRT is in the journey so far and the direction of the current cycle. If you read my earlier article on GRT going multichain, you will get a good in-depth idea of what is going on. Multiple chains are popping up in the space due to data density, usage rates, and conversion costs (ETH gas fees, for those in crypto): DOT, AVAX, STX, CELO, NEAR, Solana, and others, maybe even TRX for the celebrity endorsements. This will create the interconnected crypto web of the future, with interlinked subgraphs that can pull and aggregate data at lightning speed. GRT is just starting down this road, and people can already see the value: there was a recent constraint on Binance Smart Chain and PancakeSwap, and they had to bring in GRT to solve a problem their homegrown solution couldn't handle. This shows how a protocol can stand out early and prove it is the leader in its space. Just think about Google in 1997. I can remember people telling me how much better Ask Jeeves and AltaVista were; now most people would ask, what are those? Exactly, because Google had the algorithm and use case that stood out, survived the bust, and boomed into the future.

GRT also recently started holding community votes. The most actively discussed, GIP-0002, describes a mechanism for Indexers to withdraw rewards to a designated address instead of automatically re-staking those rewards in the protocol. The effect is that Indexers who interact with the protocol via the widely used TokenLock vesting contracts will be able to withdraw indexing rewards to cover operational costs without fully unstaking from the protocol. While I have my own opinions on this, the key concern here is the liquidity and revenue of the producers.
There are probably multiple ways to solve this, such as allowing alternative staking rewards paid out in coin pairs, perhaps through Bancor. The solution discussed is simple, but it does raise concern about the market forces of a higher circulating supply; that will depend on regular demand and the usage rate of the protocol. Growth could eventually solve this: even with what some consider a high supply of tokens, if all current data movement over the web migrated to chains and web3 protocols, the total data needing processing, protection, and usability would put transactions at 10x to 20x today's levels. Looking at the proposal from a business standpoint, I understand the Indexers' needs, but I can also see why Delegators want transparency to prevent bad actors, as one bad experience in crypto (account hacking, rug pulls, etc.) can cause people to sell their bags and walk away. Indexers and Delegators need to build relationships the way businesses and investors do, so both parties understand the reasons for moves and can adjust strategy accordingly, while also possibly contributing to things coming down the road, such as taking on new subgraphs and teaming with reputable Curators. This is probably the first of many lively discussions on the journey. One thing I will say: it is great to have an active early community to discover flaws, debate solutions, grow the community, and ensure success, as flaws are often found when it is too late in the game for an easy solution.
Next, I will discuss why GRT is so essential and how big the opportunity truly is in a world of data oceans that need to be made navigable. As I mentioned earlier, we create 2.5 quintillion bytes of data every day, and that figure was pre-COVID. For context, that is 2.5 billion billions, a number with 18 zeros, and it is only growing. In other words, making data usable today is like a minnow swimming the Atlantic trying to find the next piece of coral. GRT solves this in the crypto space, and it is one of only a few solutions. Look at just one slice of the data ecosystem, digital payments, and you can start to see where we are going and how we arrive on Mars, probably within 5 years. The digital payments market is expected to reach $6.7T by 2023 (https://www.techradar.com/news/more-than-61-billion-people-will-use-digital-payments-by-2023), growing at 12%, and most of that data predates the digital revolution COVID is accelerating. If you apply the logging of payments transactions across GRT's 10B tokens, you wind up with a potential value of roughly $670 per token if the protocol were to curate every dollar. That may not be likely, but even 10% of that market implies about $67, and once you apply hype multiples and active-versus-staked usage rates, $100 per token is not out of the question. What might come into play at that point: competition from other utility tokens; clogging of the system due to lack of liquidity (luckily the community is already looking at this, as the data load will only grow); ecosystem connectivity, which is being tackled; data validation, which may require Indexer and Curator ratings and the transparency I discussed earlier; and the speed and cost of delivery, since data gathering must not outstrip the protocol's ability to produce usable data for aggregation to dApps, record keeping, research and studies, and other data-related end-user activities.
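The per-token arithmetic above can be sketched as a simple function; this is an illustrative model using the article's own assumptions (market size, capture rates, and total supply), not a price forecast:

```python
# Illustrative valuation sketch: divide an addressable market by total
# token supply at a given capture rate. Inputs are the article's figures.
PAYMENTS_MARKET_USD = 6.7e12   # projected digital payments market by 2023
TOTAL_SUPPLY = 10e9            # total GRT supply

def implied_price(market_usd: float, capture_rate: float,
                  supply: float = TOTAL_SUPPLY) -> float:
    """Value per token if `capture_rate` of the market flows through the protocol."""
    return market_usd * capture_rate / supply

print(implied_price(PAYMENTS_MARKET_USD, 1.00))  # full capture:  670.0
print(implied_price(PAYMENTS_MARKET_USD, 0.10))  # 10% capture:    67.0
```

Note this counts only one vertical (payments); any other data vertical that migrates on-chain would stack on top of these numbers under the same logic.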
Lastly, I will discuss the dream of perfect data, what this space is striving for long term, and why GRT is headed to the next galaxy. Over the next 10 years, data use and creation could easily increase tenfold if not more. That would mean over a sextillion bytes of data, possibly even a septillion; numbers most people cannot picture today, but which we will come to call the zetta and yotta runs as this data moves onto various blockchains. This is when blockchain and its ecosystem will truly need to prove their worth or risk being disrupted themselves. Many data sets that could benefit from migrating to blockchains are held back by bad actors and security fears: health data, business contracts, marketing data, and business financial asset data, among others. Sensitive data protection is going to be key to getting there, so people can always easily validate, protect, and monitor their data. Something will need to handle the fraud, theft, waste, and abuse that exist in the crypto ecosystem and deter many from coming on board to blockchain and crypto as a whole. If these problems can be solved, and if GRT can last through the trials and tribulations of getting to Mars and beyond, I believe it can emerge as a blue-chip leading protocol in its space, similar to how Google came through the dot-com bubble and the social media experiments of the late '90s and early '00s. If we reach a point where data collection is instant, secure, and done with ease, the ecosystems behind it will be truly valuable, and you could see values in the $1000s given the levels of data out there; 10B tokens is almost nothing when processing ever-growing data at septillion levels. But getting there will be the ride.
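To make the zetta-scale claim concrete, here is a rough projection under the assumptions stated above (today's ~2.5 quintillion bytes/day and an assumed 10x growth over the decade; both are this article's figures, not a measured forecast):

```python
# Rough data-scale projection for the prefixes discussed above.
DAILY_BYTES_TODAY = 2.5e18   # ~2.5 quintillion bytes per day (2018 figure)
GROWTH_FACTOR = 10           # assumed 10x growth over the next decade

daily_future = DAILY_BYTES_TODAY * GROWTH_FACTOR   # 2.5e19 bytes/day (~25 exabytes)
yearly_future = daily_future * 365                 # ~9.1e21 bytes/year (~9 zettabytes)

print(f"Future daily volume:  {daily_future:.1e} bytes")
print(f"Future yearly volume: {yearly_future:.1e} bytes")
```

A zettabyte is 10^21 bytes (a sextillion) and a yottabyte is 10^24 (a septillion), so even this conservative projection lands squarely in the "zetta run" the paragraph above describes.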
When digging into something this far out, there are a lot of unpredictables, and cycles will change vastly, so I caution anyone against being too bullish: don't risk more than you can afford, as the markets are currently far from the maturity and security they will eventually need. Hopefully, down the road, DeFi will be safer, attract regular users, push the bad actors out of the space, and encourage stable ecosystems that alleviate many of the concerns showing in the marketplaces today. This is being tackled but is not fully there yet by any means. If things go well, this is good for the entire crypto community, but there will be problems, as greed and excess often cause in new financial systems. For a historical example of decentralized financial systems and their pitfalls, I recommend reading up on the Free Banking Era of 1837-1863 (https://link.springer.com/chapter/10.1057/9781137361219_2), as it demonstrates many of the issues we currently see in crypto, echoing the decentralized paper-currency structures of the past. Communities need to monitor and make sure the mistakes of history are not repeated, and that values and character are shown by those in the space, if they want long-term relevance and acceptance.
You may ask whether this is really possible or pie in the sky. That is up to you to decide. It may not be a straight-up journey, and there will more than likely be turbulence and barriers to break through along the way, but with the potential that is there, it is reachable. Remember, this is just one person's opinion and not financial advice; always do your own research and analysis.
Feel free to follow me on Twitter @KBCryptoStocks if you want more of my content. Also please note this is not investment advice and should not be considered as such. This is simply an opinion for discussion.