Scale is one of technology's most common buzzwords. It deepens a company's competitive advantage over smaller rivals, and growth in users, clicks, delivery networks, and file storage, to name a few, has given us the modern internet giants. The barriers to entry for a new social network, search engine, or storage provider are enormous, so the playing field looks unlikely to change in the short term. This gives us two problems to think about.

The first is that the system is essentially wired against us as users. The major imperative for the likes of Google, Facebook, and Amazon is to offer free or low-cost services that attract us to their platforms, yet users are not meaningfully rewarded for the individual and collective participation that brings these companies immense value. The second is that several online markets are now owned outright by a few massive, centralized players. The potential for misuse of data, loss of data, or disruption of service grows as responsibility for services concentrates in fewer hands. Last year, an outage on Amazon's S3 cloud storage platform knocked out a number of other services, including Slack, Quora, and Trello. A mistyped command during server maintenance was to blame, but that tiny mistake was magnified at great cost to many of the service's users. Isn't distributed responsibility, with no single point of failure, much safer?

This article will explore two ways projects are allowing users to participate in a network and be meaningfully rewarded for it, looking specifically at how their idle computers can be put to use.

Storage Wars

The premise is simple and appealing: most computer owners don't use all of their storage capacity, so why not put it to work? Many crypto projects struggle to articulate the value of what they're building, but the proposition for these services couldn't be much simpler. The need to decentralize responsibility and to reward participation also makes this a natural use case for blockchain. That value has been recognized quickly, and a number of providers have rushed into what is now a fiercely competitive space.

Most providers in decentralized file storage offer a broadly similar service. Data owners are encouraged to encrypt their files before they are sharded, that is, split into chunks (larger or smaller depending on security preferences) that are mixed with chunks of other files and distributed across nodes. Redundancy comes from replicating the shards across a wide range of nodes, which also means any individual host holds only a small piece of the file's encrypted content. Moving responsibility for storage out of centralized data centers reduces the potential impact of a data breach.
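
To make the flow concrete, here is a minimal sketch of the client-side store step. The chunk size, replication factor, choice of Fernet encryption, and the `host.put()` node API are all invented for illustration; this is the general shape of the technique, not any provider's actual protocol:

```python
# Minimal sketch of client-side store logic for a decentralized storage
# network. The chunk size, replication factor, and host.put() API are
# invented for illustration; real clients differ in the details.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

CHUNK_SIZE = 1 << 20   # 1 MiB shards (illustrative)
REPLICATION = 3        # replicas per shard (illustrative)

def store_file(data: bytes, nodes: list):
    """Encrypt locally, shard the ciphertext, spread replicas over hosts."""
    key = Fernet.generate_key()             # the owner keeps this private
    ciphertext = Fernet(key).encrypt(data)  # hosts never see plaintext

    shards = [ciphertext[i:i + CHUNK_SIZE]
              for i in range(0, len(ciphertext), CHUNK_SIZE)]

    manifest = []  # (shard hash, hosting nodes): the retrieval "hash table"
    for idx, shard in enumerate(shards):
        shard_id = hashlib.sha256(shard).hexdigest()
        # Replicate so no single host holds the whole file and no single
        # failure loses any part of it.
        hosts = [nodes[(idx + r) % len(nodes)] for r in range(REPLICATION)]
        for host in hosts:
            host.put(shard_id, shard)       # hypothetical node API
        manifest.append((shard_id, hosts))
    return key, manifest
```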

When a data owner wants to access a stored file, they present a private key (in some cases to a blockchain-hosted hash table) that locates all shards of the file, which the network then retrieves and reassembles. The original encryption key unlocks the result. For their trouble, the individual data hosts are rewarded with tokens, paid by the data owner and split between those who contributed to hosting that file. In this way, data hosts are meaningfully rewarded for their participation in the network.
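
Retrieval is the mirror image. A matching sketch, continuing the hypothetical API above, where the manifest plays the role of the hash table and any surviving replica of each shard is enough:

```python
# Mirror image of the store sketch above: look up each shard's hosts in
# the manifest, fetch any surviving replica, reassemble, then decrypt.
# host.get() is the same hypothetical node API as before.
from cryptography.fernet import Fernet

def retrieve_file(key: bytes, manifest: list) -> bytes:
    ciphertext = b""
    for shard_id, hosts in manifest:
        for host in hosts:                  # any one replica will do
            shard = host.get(shard_id)
            if shard is not None:
                ciphertext += shard
                break
        else:
            raise IOError(f"all replicas of shard {shard_id} are gone")
    return Fernet(key).decrypt(ciphertext)  # original key unlocks the file
```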

This brings clear security benefits over centralized services, where encryption may not be standard and your data can, in theory, be lost. It is also far cheaper than the traditional options. AWS S3 charges around $0.02 per gigabyte per month, while decentralized provider Sia offers a terabyte for $2 per month and Storj charges $0.015 per gigabyte per month; for a terabyte, that works out to roughly $20 on S3 against $2 on Sia and $15 on Storj.

Storj (pronounced 'storage') was early to market with a working implementation, but has faced criticism over centralized elements of its platform. The product works much as described above, although Storj uses an ERC-20 token to enable smart contracts between data owners and hosts. One major deviation is the use of 'bridges': HTTP API servers that provide an entry point into the Storj network. They host the protocol details used to store and retrieve files, hold proof-of-storage information, and manage contracts.

Because these servers are managed exclusively by Storj Labs, questions have arisen about how decentralized the platform really is. Storj counters that the arrangement lets it support a wider range of payment methods, opening the platform to more potential customers than rival services, where a specific token must be purchased before use.

So how much money could you make? According to Storj, it costs around $6 to run a node for a month in the US, and a user would need to charge around $0.006 per gigabyte per month to break even if they were renting out 1 terabyte.
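
The arithmetic behind that break-even figure is a one-liner, using only the numbers Storj quotes above:

```python
# Sanity check of Storj's break-even estimate quoted above.
monthly_cost_usd = 6.00   # Storj's estimated cost to run a US node for a month
capacity_gb = 1000        # 1 terabyte rented out

breakeven = monthly_cost_usd / capacity_gb
print(f"break-even price: ${breakeven:.3f} per GB per month")  # $0.006
```

Anything charged above that is profit.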

Sia works slightly differently. Whilst still following the basic principles of decentralized cloud storage, the major difference lies in its use of blockchain. Whilst Storj uses the blockchain to power its reward mechanism, the ERC-20 token, Sia uses its own blockchain (based on the Bitcoin source code) to establish trust around file contracts. Storage contracts document the agreements made between renters and providers of storage, which adds a degree of security, but they can be confusing to data owners. In essence, to store your files on Sia you specify how much Siacoin you're willing to spend over the contract period, and the Sia client negotiates separate storage contracts with hosts. This means the process of renting storage can feel abstract and less than transparent for the renter.

The other barrier is that, if you want to participate as a host, you must hold some Siacoin and stake it against your participation, much as in Proof-of-Stake consensus. This, according to Sia, proves that you are committed to the network. In practice, it means hosts are punished for downtime on their machines by losing part of their stake, which is burnt. With Storj, keeping uptime as high as possible is strongly recommended if you want to turn any kind of profit; with Sia, you could lose your whole investment if you're disconnected from the internet for an extended period.
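
A toy model makes the asymmetry clear. The collateral amount, payout, and per-hour penalty below are invented for illustration; Sia's real terms are set per contract:

```python
# Toy model of a Sia-style host: collateral is locked against a storage
# contract and burnt for downtime. All numbers here are invented.

def host_outcome(collateral_sc, payout_sc, hours_down, penalty_per_hour_sc):
    """Net Siacoin position for a host at the end of a contract."""
    burned = min(collateral_sc, hours_down * penalty_per_hour_sc)
    fully_slashed = burned >= collateral_sc
    earned = 0.0 if fully_slashed else payout_sc  # no payout if fully slashed
    return earned - burned

# A reliable host, down 2 hours in the month: a small dent in earnings.
print(host_outcome(500, 100, hours_down=2, penalty_per_hour_sc=5))    #  90.0
# A host offline for a week: the entire stake is gone.
print(host_outcome(500, 100, hours_down=168, penalty_per_hour_sc=5))  # -500.0
```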

So, is taking part in decentralized file storage as a data host worth it? As with Bitcoin mining, the space has become increasingly competitive. The majority of people competing seriously as hosts are dedicated to maintaining uptime and offer vast amounts of storage. Unless you have serious spare storage sitting around, or can throw enough time, money, and effort into maintaining constant uptime, it's probably not a venture for casual hobbyists. Indeed, Sia actively discourages those who aren't fully committed by burning the stakes of hosts who prove unreliable. That is the key, particularly for owners storing high volumes of data: the service must be totally reliable.

Power to the People

The increasing volume of data we create every day has massive implications for efficient and safe storage, as we've seen. But what about when that data needs to be processed? Computing power is behind many of the utilities operating in our homes, our workplaces, and all areas of day-to-day life. Data modeling is becoming ever more complex, but the computing power to solve these problems is expensive and, like storage, largely the preserve of major corporations. For instance, the hardware cost of AlphaGo Zero, Google DeepMind's supercomputer that defeated champion Go players, was reported to be around $25m.

By leveraging worldwide computing power linked together through a blockchain network, the workload of an enormous computing task can be distributed across many machines. Similar to renting out your spare storage space, users can rent computing power from others via a token economy. Projects pushing this concept often describe their services as decentralized supercomputers, although according to SONM (Supercomputer Organized by Network Mining) founder Sergey Ponomarev, this is largely a marketing trick; a more accurate term would be a decentralized operating system.

iExec promises a marketplace for cloud computing resources, with distributed application (Dapp) developers as the main use case on the demand side. Developers and individuals can access affordable, secure, and scalable power on demand, with data processing and analysis shaping up as an important use case: specific Dapps for programming languages such as R and Python are already available.

Each transaction is certified by iExec's Proof-of-Contribution protocol. This protects against bad actors by calculating a confidence threshold for each result, taking worker reputation into account, and by requiring workers to stake some RLC as a security deposit before each calculation is assigned, to be seized should they produce an erroneous result. Multiple workers compute the same task until a consensus is reached. These calculations take place off-chain, removing the need for every node to process each job, and the outcome is eventually stored on the Ethereum blockchain. Workers are organized into pools that handle computation at scale, receiving RLC tokens for their accepted contributions.
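
In outline, the scheme looks something like the sketch below. This is a simplified model rather than iExec's actual protocol: the worker methods (`lock_stake`, `compute`, `pay_reward`, `slash_stake`), the reputation weighting, and the 0.99 threshold are all invented for illustration:

```python
# Simplified replicated-computation sketch in the spirit of iExec's
# Proof-of-Contribution. All APIs and numbers here are illustrative.
from collections import defaultdict

CONFIDENCE = 0.99  # invented confidence threshold

def run_task(task, workers):
    """Replicate a task across workers until one result wins consensus."""
    support = defaultdict(float)  # result -> reputation-weighted votes
    voters = defaultdict(list)    # result -> workers who produced it
    total_weight = 0.0

    for worker in workers:
        worker.lock_stake()                # RLC security deposit (sketch)
        result = worker.compute(task)      # executed off-chain
        support[result] += worker.reputation
        voters[result].append(worker)
        total_weight += worker.reputation

        best = max(support, key=support.get)
        if len(voters[best]) > 1 and support[best] / total_weight >= CONFIDENCE:
            for w in voters[best]:
                w.pay_reward()             # majority paid in RLC (sketch)
            for other, ws in voters.items():
                if other != best:
                    for w in ws:
                        w.slash_stake()    # erroneous result: deposit seized
            return best                    # outcome then recorded on-chain
    raise RuntimeError("no consensus reached among available workers")
```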

In decentralized cloud computing specifically, iExec's major competitor is Golem. Its ICO was among the most highly subscribed ever, raising over 800,000 ETH in 2016, but it has been followed by a prolonged period of development that has left many impatient. Golem released its first implementation in April 2018 with a specific market in mind. While SONM targets supercomputer users in areas like machine learning and research, and iExec focuses on Dapps, Golem's first use case is CGI rendering in programs like Blender. This pits it against traditional render farms: centralized computer clusters designed to process images for CGI, which are expensive.

Golem works through a typical software client connecting providers and requestors on the network. Requestors submit a task, which is chunked into subtasks, sent out over a peer-to-peer network for computation, and pieced back together into the finished job. Each computer that contributed is rewarded in GNT, an ERC-20 token.
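
A sketch of the requestor-side flow (the `Provider` API and per-subtask payment here are invented; Golem's real client is considerably more involved):

```python
# Sketch of a Golem-like render job: chunk the frames into subtasks,
# farm them out to providers, pay each contributor, and reassemble.
# The provider.render()/provider.pay() API is hypothetical.

def render_job(frames: range, providers: list, payment_per_subtask: float):
    n = len(providers)
    subtasks = [frames[i::n] for i in range(n)]  # round-robin frame split

    results = {}
    for provider, chunk in zip(providers, subtasks):
        images = provider.render(list(chunk))    # computed on the provider
        provider.pay(payment_per_subtask)        # token transfer (sketch)
        results.update(zip(chunk, images))

    # Piece the finished job back together in frame order.
    return [results[f] for f in sorted(results)]
```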

With such a niche use case, Golem will need to work to attract users. Integration with the popular rendering software Blender has helped drive some scale in this corner of the market, but with expansion into machine learning planned, Golem will be competing directly against the likes of iExec and will need a clearly stated value proposition to succeed, even with its current reputation.

SONM takes a different approach, using a technique known as fog computing to handle computation more efficiently. Compared to traditional cloud computing built on centralized data centers, fog computing achieves faster processing times and lower network costs by pushing computing, control, and storage out to where the data originates. The platform's two types of stakeholder, miners and buyers, transact computational resources: miners earn SNM tokens for successful calculations performed for buyers on their CPUs, GPUs, ASICs, or even gaming consoles and smartphones. Support for such casual devices is a major point of differentiation from other decentralized computing providers. In terms of infrastructure, SONM is a top-layer platform that relies on a combination of existing services, including torrent-based peer-to-peer file transfer, smart contracts on Ethereum, and the open-source platform Cocaine.
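
As a toy model, such a marketplace is essentially an order book over hardware specs. The order fields and matching rule below are invented for illustration; SONM's real contracts are more involved:

```python
# Toy compute marketplace in the SONM style: miners post asks, buyers
# post bids, and the cheapest ask meeting a bid's specs wins. All
# fields and the matching rule are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Ask:                 # posted by a miner
    miner: str
    cpu_cores: int
    gpu_count: int
    price_snm_hr: float

@dataclass
class Bid:                 # posted by a buyer
    buyer: str
    min_cores: int
    min_gpus: int
    max_price_snm_hr: float

def match(bid: Bid, asks: list) -> Optional[Ask]:
    """Cheapest ask that satisfies the bid's hardware requirements."""
    fits = [a for a in asks
            if a.cpu_cores >= bid.min_cores
            and a.gpu_count >= bid.min_gpus
            and a.price_snm_hr <= bid.max_price_snm_hr]
    return min(fits, key=lambda a: a.price_snm_hr, default=None)

asks = [Ask("phone-42", 8, 0, 0.02), Ask("rig-7", 16, 2, 0.50)]
print(match(Bid("ml-lab", min_cores=8, min_gpus=1, max_price_snm_hr=1.0), asks))
# -> the rig wins: the phone is cheaper but has no GPU
```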

SONM has also provided an easy on-ramp for buyers, making the SNM token available for purchase via a credit card on Changelly and listing it on Binance. This opens the service up to businesses and also allows an easy off-ramp for those wanting to exchange their earned SNM for other currencies. The first version of the SONM marketplace was released in June 2018, so SONM is in a user acquisition phase at present.

Conclusion

The projects promoting decentralized computing power are at an earlier stage than their counterparts in file storage, with limited offerings becoming available this year for carefully selected use cases. User communities are still developing: Golem has so far facilitated the completion of 77,000 subtasks, and iExec has eight worker pools up and running. These services will appeal most to corporate customers, so a key task for these projects is making themselves accessible to businesses. Some steps have been taken in this direction, including Golem's integration with Blender and iExec's work with Intel, but more will be needed to communicate the value of the technology and its benefits over traditional providers.
