Journey into the Decentralized Cloud
Recently Hetzner Cloud GmbH, a European cloud services provider, removed #Solana's validator nodes (which represented roughly 20% of all the #blockchain's nodes) from its services, while at the same time #Google Cloud (a competitor) made it a whole lot easier to become a validator on its own #infrastructure. Layer 1 smart contract protocols like Solana and #Ethereum are vying to become the base layer for the internet of value, but if these blockchains can only access hardware through the dominant centralized cloud providers that make the internet possible, can the likes of Solana and Ethereum claim to be decentralized at all? And does decentralization even matter?
#Blockchain technology, built on cryptographic proofs, allows value to be moved online between two parties without a third (usually a #bank) in the middle. This is huge. If you think about it, social content hasn't even made that leap yet: it travels through company-owned channels that can monitor, censor and ban you from participating, with some rare exceptions. Validators, which ensure the integrity of the value being transferred, are essential components of a blockchain. If all those validators live within a cloud environment owned and controlled by a few parties, how different is that from a few large banks managing all our transactions for us? Financial inclusivity has never been a strong suit of that model, and blockchain in essence allows anyone to create a digital wallet (akin to a bank account) and begin transacting permissionlessly.
The #Bitcoin protocol, which runs a #ProofOfWork mechanism, solves this dilemma by requiring physical hardware that anyone can buy and run its software on in order to protect the integrity of its blockchain (i.e. keep it decentralized). Proof of Stake blockchains like Solana and Ethereum only require you to run their software, without the expense and energy hunger of dedicated hardware. In other words, they decoupled the blockchain's software from dedicated hardware to make it more energy efficient and easier for validators to spin up, and hence increase adoption. Enter cloud services providers.
Hopefully this provides enough context for how and why the decentralization narrative has become so important. Centralized cloud providers are clearly not the be-all and end-all of the story: should we decentralize telecoms to ensure that no one has the power to turn off the internet? The power grid to protect our electricity? Should we then do away with federal governments in favor of something more local? These are big questions about the institution of trust and whether there is a form of institution we can, in fact, trust. I believe everything will eventually find its happy medium, where technology handles the friction of coordination and regulation fills in the gaps left by inadequate tech: a perfect marriage, eventually. Today, though, we will focus on how to bring the ethos of decentralization to the cloud, i.e. inclusiveness and access for everyone, which by the law of large numbers should create more participating minds, more value and more economic activity.
The Opportunity
Before delving into potential opportunities, there are some questions I will be looking to answer along the way as part of my report to the Code Capsules team, who are interested in the decentralized cloud space but want to know if the current approaches to it have legs or if it's too early to tell.
A quick refresher on the dynamics of cloud services, as told by me (a noob) and edited (heavily) by Miki von Ketelhodt, CEO of Code Capsules (an instant cloud deployment company).
In a general sense, we can separate the cloud into two types of services: data storage and cloud compute. Today we'll focus solely on data storage; it is a great starting point, since Amazon's rise to cloud dominance began with its S3 cloud storage solution.
Data storage is a complex business with the overwhelming responsibility of housing the world's data. Today the internet's content is managed by a few large cloud providers, from the recognizable Amazon AWS, Google Cloud and Microsoft Azure to a few others. This wasn't achieved through insidious means but through a global trend of greater and faster data consumption. Whether I'm streaming Netflix from North America or somewhere in Asia, it has become a god-given right to access that content (i.e. data) anywhere, anytime and almost immediately. To do that, cloud providers had to place their facilities as close as possible to where the content is demanded, which is referred to as edge computing. So in order for data to be available everywhere and retrievable by a user, cloud providers replicate the data over and over again across all these different locations. It may seem inefficient, but by replicating the data, we (the end users) never have to experience the Stone Age through 'latency' (the lag of loading content). Replicating data also serves as insurance: if content in one location is corrupted, lost or destroyed, a new copy is made from another location.
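To make the replication idea concrete, here is a minimal sketch (not any provider's actual routing logic; the regions and latency figures are made up for illustration) of how a client or edge node might simply read from whichever copy answers fastest:

```python
# Why providers replicate data across regions: the client just reads
# from whichever replica answers fastest.
REPLICA_LATENCY_MS = {          # hypothetical measured round-trip times
    "us-east": 180,
    "eu-central": 95,
    "ap-southeast": 30,         # closest edge location to this user
}

def pick_replica(latencies: dict[str, float]) -> str:
    """Return the region holding the copy with the lowest latency."""
    return min(latencies, key=latencies.get)

print(pick_replica(REPLICA_LATENCY_MS))  # -> "ap-southeast"
```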
One last and important distinction will help us wrap our heads around the opportunity and challenges facing the decentralized cloud. Data storage can be broken into three buckets: hot storage (data that needs to be retrieved frequently, like Netflix and social media), warm storage (data retrieved less frequently, like, in my case, a token price app), and cold storage (archival data that is retrieved once in a blue moon). As you can see, one of the harder parts of data storage is retrieval, because again: latency is the enemy.
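As a rough illustration (the thresholds below are invented, not any provider's real policy), tiering mostly comes down to how often an object is expected to be read:

```python
def storage_tier(reads_per_month: float) -> str:
    """Map expected read frequency to a storage tier (illustrative thresholds)."""
    if reads_per_month >= 100:      # streaming video, social feeds
        return "hot"
    if reads_per_month >= 1:        # dashboards, price history
        return "warm"
    return "cold"                   # archives, compliance backups

for label, reads in [("Netflix episode", 50_000), ("token price app", 30), ("tax archive", 0.01)]:
    print(f"{label}: {storage_tier(reads)}")
```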
Filecoin Foundation (token FIL): the largest player in decentralized storage
The Filecoin blockchain was built on top of the InterPlanetary File System (IPFS), which is able to locate your data in a decentralized cloud (Dcloud). Remember, a Dcloud is about separating control over your data from the act of storing it. To do this, data is encrypted so that the cloud provider cannot see, change or alter it; the provider focuses on the responsibility of storing, not on what it is storing or for whom. The Filecoin blockchain became the incentive mechanism for IPFS, helping onboard storage providers into the Dcloud ecosystem by paying and rewarding them with the Filecoin token (FIL) for their service.
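A rough sketch of the client-side idea, not Filecoin's or IPFS's actual API: data is encrypted before it ever reaches a provider, and it is addressed by what it is rather than where it lives. Here a plain SHA-256 digest stands in for an IPFS content identifier (CID), and the `cryptography` package's Fernet scheme stands in for the encryption:

```python
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

# The user encrypts locally; the storage provider only ever sees ciphertext.
key = Fernet.generate_key()          # stays with the data owner
ciphertext = Fernet(key).encrypt(b"my private document")

# A content address lets anyone verify they received the right bytes,
# no matter which node served them (IPFS uses CIDs; SHA-256 stands in here).
content_address = hashlib.sha256(ciphertext).hexdigest()

print("send to provider:", content_address, len(ciphertext), "bytes")
```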
#Filecoin considers itself a peer-to-peer version of AWS (Amazon's cloud) that can offer more competitive rates because of its supply and demand dynamic. Through its marketplace, users pick their preferred storage providers based on price, storage length and reputation (gamified by Filecoin), then enter into a 'user storage deal'. Filecoin then programmatically handles the responsibility of ensuring the data is stored correctly and remains accessible. Through a series of incentives and escrowed deposits, storage providers are encouraged to meet their obligations, and the network replicates the data so that it remains retrievable if a provider goes offline. Becoming a storage provider on Filecoin seems complex and aimed at mini to mid-sized data centers, which could position Filecoin as the middle layer that leverages all the non-dominant data centers to better compete with the titans of the cloud. So does it have traction?
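To illustrate the marketplace idea only (the data structures, numbers and scoring rule below are hypothetical, not Filecoin's actual deal-matching logic), a client might rank candidate providers on price and reputation before proposing a deal:

```python
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    price_per_tib_month: float   # asking price in FIL (made-up figures)
    reputation: float            # 0.0 to 1.0, e.g. history of proofs passed

def rank(providers: list[Provider]) -> list[Provider]:
    """Lower effective price wins; good reputation discounts the ask (illustrative scoring)."""
    return sorted(providers, key=lambda p: p.price_per_tib_month * (2.0 - p.reputation))

candidates = [
    Provider("miner-a", 0.0021, 0.98),
    Provider("miner-b", 0.0015, 0.60),
    Provider("miner-c", 0.0018, 0.95),
]
best = rank(candidates)[0]
print(f"propose deal with {best.name}")
```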
From the supply side, Filecoin has enough storage capacity to house all of YouTube 16,000 times over, with more than 4,000 storage providers in its network, making it decentralized. Let's look at utilization, that is, how much of that storage is actually being used.
Storage capacity seems to have leveled off at 16.7 EiB, which is roughly 17,100 PiB (1 EiB = 1,024 PiB, to compare with the graph above). Utilization has risen sixfold over the past three quarters to 1.2%. So around 1% of Filecoin's available storage is currently being used, which gives it ample room for growth. This graphic may emphasize the growth trajectory a little better.
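A quick back-of-the-envelope check of those figures (the capacity and utilization numbers come from the paragraph above; the rest is just unit arithmetic):

```python
# Back-of-the-envelope check of the utilization figures quoted above.
capacity_eib = 16.7
capacity_pib = capacity_eib * 1024          # 1 EiB = 1,024 PiB  -> ~17,101 PiB
utilization = 0.012                          # 1.2%

used_pib = capacity_pib * utilization        # ~205 PiB actually storing client data
headroom = capacity_pib - used_pib

print(f"capacity: {capacity_pib:,.0f} PiB, in use: {used_pib:,.0f} PiB, free: {headroom:,.0f} PiB")
```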
Let's take a look at their growth strategy.
First, this is the Filecoin ecosystem of apps and clients building on top of it to leverage its technology and massive supply-side storage capacity.
Furthermore, the caliber of projects being built on Filecoin ranges from early-stage to funded, and is increasing quarter over quarter (QoQ).
Filecoin has made strides towards offering tailor-made solutions for the NFT and Web3 markets, archival datasets, gaming and audio/visual content, with its NFT uploads growing 30-fold and its Web3 uploads 5-fold. Growth markers seem to be positive across the board despite the crypto winter, a welcome sign that the decentralized cloud has legs.
Storj (token STORJ): easing the transition to the decentralized cloud
While a much smaller protocol (dark blue) compared to Filecoin (light blue), #Storj has made it its mission to onboard traditional #web2 enterprise clients into the Dcloud by offering traditional payment rails and compatibility with Amazon's S3 API, helping existing users seamlessly migrate to what it claims is a much better value proposition. Case in point.
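Because Storj speaks the S3 protocol, existing tools can in principle be repointed with little more than a new endpoint. Here is a minimal boto3 sketch; the endpoint URL and credentials are placeholders, not Storj's actual gateway details:

```python
import boto3  # pip install boto3

# The same S3 client applications already use, just pointed at an
# S3-compatible gateway instead of AWS (endpoint and keys are placeholders).
s3 = boto3.client(
    "s3",
    endpoint_url="https://gateway.example-s3-compatible.io",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

s3.upload_file("report.pdf", "my-bucket", "reports/report.pdf")
print(s3.list_objects_v2(Bucket="my-bucket").get("KeyCount", 0), "objects in bucket")
```

The migration pitch rests on exactly this: if the client code does not change, the switching cost is mostly an endpoint and a billing relationship.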
Having read through their documents, they seem to view blockchain tech as a better architectural fit for what is (or should be) a decentralized web, which manifests in lower costs and better value for users, as we can see above. Judging by its technical requirements for becoming a storage provider, it seems to cater to anyone with unused storage capacity above a minimum viable threshold, rather than specifically targeting data centers like Filecoin does. I was unable to find reliable data on which is more cost effective, as they are not exactly competing for the same market. Updates will follow from the Code Capsules LinkedIn page as they testbed both projects.
Furthermore, the project takes into account the importance of data compliance policies and lets clients select storage providers by geography, bandwidth and other requirements that might matter to local and global enterprise clients.
You'll notice that some of the services offered are similar to Filecoin's offerings, with no obvious emphasis on decentralized projects in #web3 or #NFTs.
Storj, with its more developer-centric offering, feels more familiar but seems to be lagging the market leader by a wide margin. This could be because enterprises are still warming up to the idea of distributed storage, and because onboarding an enterprise takes no less than six months and probably closer to a year, so there may be a lag in performance metrics. Furthermore, unlike the other decentralized cloud projects, which have fostered their own ecosystems of blockchain applications, Storj hasn't been as active a cultivator as Filecoin, probably by design. I think we'll need a few more quarters to see how well their approach has paid off. The numbers so far are not discouraging, as revenue has rocketed upwards from the past quarter. Also, ignore the fully diluted market cap: it assumes all tokens have been released into the ecosystem, which would make all these projects look overvalued by any measure. It should instead be compared to today's circulating token supply, not future total supply. On that basis, Filecoin would be an adjusted $1.7bn and Storj $62 million.
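To make the fully diluted vs. circulating distinction concrete, here is a tiny sketch with placeholder numbers (not Filecoin's or Storj's actual price or supply figures):

```python
# Illustrative only: the price and supply figures below are placeholders.
price = 4.00                 # token price in USD (hypothetical)
circulating_supply = 400e6   # tokens released so far (hypothetical)
max_supply = 2e9             # total tokens that will ever exist (hypothetical)

market_cap = price * circulating_supply   # what I argue we should compare
fully_diluted = price * max_supply        # inflates the valuation ~5x in this example

print(f"circulating market cap: ${market_cap/1e9:.1f}bn")
print(f"fully diluted value:    ${fully_diluted/1e9:.1f}bn")
```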
Last but not least...
Arweave (token AR): a Dcloud darling?
Having secured partnerships with the likes of Meta Platforms, the Solana blockchain (one of the fastest, top-5 blockchains) and Chiliz (a fan token platform for sports clubs), as well as backing from notable VCs, this much smaller operator in the decentralized cloud space is finding its niche in the pay once, store forever segment. It charges a 16x upfront premium over what Amazon might charge you over 16 years (assuming Amazon doesn't change its pricing), but it only charges it once and promises what only decentralization can offer: perpetuity. Why? Because companies come and go, political landscapes change, responsibilities wane, and protecting our world's stored knowledge is prone to all of it. Saving data for the long term is called archiving, and it has as much appeal as most libraries do these days, but today we're going to make it sexy again (or for the first time).
First, its thriving ecosystem, lest you think I'm trying to sell you snake oil.
Second, why is this even viable? Well, for two reasons; the second is that blockchain data is constantly being produced, but blockchains are not efficient at storing all that important historical data and therefore need a place to house it, hence the whole decentralized cloud narrative. A place to store that data forever is even better, because remember, blockchain tech is the underpinning of the internet of value, so we should all want to make sure that value is stored somewhere it will exist in perpetuity. You might ask how this is possible, and I would promptly direct you to Arweave, but in short: an upfront premium, combined with the physical cost of storage coming down over time, as it has for the past 50 years at roughly 30% per year and, in the spirit of Moore's law, should continue to do well into the future. Further, a treasury mechanism has been put in place should costs ever exceed revenue for storage providers. Like the other Dcloud protocols, mechanisms are in place so that there is no single point of failure when protecting humanity's memories. Which brings me to the first, and perhaps more profound, reason why archiving is so important.
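To see why a one-time payment can cover storage indefinitely, here is a back-of-the-envelope sketch (the starting cost is a made-up figure, the 30% annual decline is the rate quoted above, and this is not Arweave's actual endowment formula): if the cost of storing a gigabyte falls by a fixed fraction every year, the sum of all future years' costs is finite.

```python
# If storage cost falls 30% per year, the total cost forever is a convergent
# geometric series: cost_0 * (1 + 0.7 + 0.7**2 + ...) = cost_0 / 0.3.
cost_year_0 = 0.02          # USD per GB per year today (made-up figure)
annual_decline = 0.30       # cost drops 30% each year (rate quoted above)

perpetual_cost = cost_year_0 / annual_decline   # closed form of the series
print(f"one-time endowment needed: ${perpetual_cost:.3f} per GB "
      f"(~{perpetual_cost / cost_year_0:.1f}x the first year's cost)")
```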
A Columbia Journalism Review article that analyzed a sample of 553,693 New York Times articles found that 25% of all deep links were completely inaccessible. The data showed that link rot worsens over time, claiming 6% of links from 2018, 43% of links from 2008, and 72% of links from 1998. Link rot results in the permanent loss of internet history and demonstrates that the majority of information on the web will not stand the test of time with our current web infrastructure.
If you've ever tried to look up an address or page from childhood only to be hit with a 404 "page not found" error, that's link rot. Neglect, damage, bankruptcies and so on all contribute to the loss of information. The internet is regarded by everyone as the portal to all of humanity's knowledge, so long as that knowledge isn't more than 20 years old or has fortunately been replicated by some interested party. This is a real problem, because the current internet cherishes the knowledge of the times, not of all time, so what is no longer of interest today might be forgotten tomorrow. More worryingly...
#WorldNews, #Science, #Sports and #Arts experience a greater degree of link rot than other categories, which is perhaps a sign of what's relevant to our times but not to all time.
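As a rough sketch of how one might measure link rot across a list of URLs (using the `requests` library; the URLs below are placeholders, and a real crawler would need retries, rate limiting and politeness):

```python
import requests  # pip install requests

urls = [
    "https://example.com/",
    "https://example.com/an-old-page-that-no-longer-exists",
]

dead = 0
for url in urls:
    try:
        resp = requests.head(url, allow_redirects=True, timeout=10)
        if resp.status_code >= 400:      # 404 and friends: the link has rotted
            dead += 1
    except requests.RequestException:    # DNS failure, timeout, dead server
        dead += 1

print(f"{dead}/{len(urls)} links appear rotten")
```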
On a more optimistic note, Arweave's fixation on archiving blew my mind, because here I thought the internet and digitization had subsumed paper as a medium, not taking into account the relatively longer lifespan of paper compared with current internet archives. Arweave is still in its early days, but if it continues to grow as seen below, it will become the final piece in making paper an obsolete medium for recording history, all the while preserving blockchain data integrity for the internet of value.
And perhaps more profoundly, our collective knowledge can be stored permanently, without risk of censorship, "book burnings" of unpopular ideas, neglect, natural disasters, and wars (Arweave has played a significant role in preserving #Ukraine's content as Russian forces try to destroy all traces of its culture and evidence of possible war crimes).
In the same way Bitcoin dematerialized gold and distributed it physically over the entire globe so that no one can physically take it from you yet you can still use it, Arweave is making the world's data readily available and physically indestructible at the same time.
While the Dcloud space is still early in its adoption curve, I hope some of the research I've shared here helps frame your understanding of the opportunities in this highly anticipated space, without delving too deep into the details of an industry that has broad utility but not so much appeal. For a deeper dive, please see the links below.