Here's why computation is more valuable than data
‘Data is the new oil’ is probably one of the most overused phrases of the past decade.
Though the statement might be true, it is sloppy shorthand and is often misinterpreted. Clive Humby, the British mathematician who coined the phrase, pointed out that just as oil needs refining, data needs processing power, i.e. computation, for its utility to be realized.
Comparing the fundamentals of data to oil is quite flawed. Oil is a finite resource; data is effectively infinite, durable and reusable. Unlike oil, data gains little value from simply being stored, and hoarding it is inefficient. Yet because we conceive of data as oil-like, i.e. scarce, hoarding is often exactly what is done with it.
Transporting oil requires enormous resources. Data, conversely, can be created indefinitely and moved around the world at light speed at a low cost.
The more data is used, the more applications it can power; processing it reveals further insights and utility. For example, a self-driving car detects whether there are pedestrians ahead by analyzing real-time data from its sensors. This is possible through machine learning, where a model examines millions of hours of sensor data to learn how to detect pedestrians.
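To make the "learning from data" idea concrete, here is a deliberately toy sketch (all numbers and names are hypothetical, and real perception systems are vastly more complex): given labeled sensor readings, the program "learns" the decision threshold that best separates pedestrian from non-pedestrian cases.

```python
# Toy illustration: distill labeled sensor data into a decision rule
# by searching for the cutoff that best separates the two classes.

def learn_threshold(readings, labels):
    """Pick the cutoff that best separates 'pedestrian' (1) from 'clear' (0)."""
    best_cut, best_acc = None, 0.0
    for cut in sorted(set(readings)):
        preds = [1 if r >= cut else 0 for r in readings]
        acc = sum(p == y for p, y in zip(preds, labels)) / len(labels)
        if acc > best_acc:
            best_cut, best_acc = cut, acc
    return best_cut

# Hypothetical proximity-sensor readings and ground-truth labels.
readings = [0.1, 0.2, 0.3, 0.7, 0.8, 0.9]
labels   = [0,   0,   0,   1,   1,   1]
print(learn_threshold(readings, labels))  # → 0.7
```

The point of the sketch is simply that the value comes from the computation that turns raw readings into a usable rule, not from the raw readings themselves.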
As global oil reserves shrink, extraction will become more difficult and more expensive. In contrast, the supply of data is already massive and will keep growing as technology advances, more activity shifts online and sensors improve.
Big data, particularly in its raw state, is more diverse than oil. Crude oil drilled from the earth is processed into a variety of products, but in its raw state it is all the same. Raw data, by contrast, can contain statistics, measurements, facts, ideas, sounds, pictures, words, or anything that can be processed by computers. As the volume of generated data grows, so does the processing power needed to analyze it, driving ever greater demand for computation.
If we were to treat data as a power source or fuel, it would make more sense to group it with renewable energy sources like wind, solar and tidal power. The data available today is more abundant than we will ever use; rather than restricting its supply, we should make it more available to the public.
Making data available to the public is one aspect. Making computation, or processing power, easily accessible to the public is another. Tokenization of computation within cryptocurrency frameworks exists today thanks to the likes of EOSIO, whose consensus algorithm, delegated proof of stake (DPoS), was developed by Block.one's CTO Dan Larimer.
EOS's foundation differs from Ethereum's. On Ethereum, users pay transaction fees to the miners that execute the computation. EOS tokens instead grant a user access to computation in proportion to the share of the total EOS supply that the user owns. The nodes, or Block Producers, delegate that computation to users' accounts as needed and are compensated from 1% annual inflation rather than from transaction fees. This makes zero-fee transactions possible on EOSIO and lets investors own processing power which they can use themselves or lease out to earn dividends.
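The stake-proportional model described above can be sketched in a few lines. This is a simplified illustration, not EOSIO's actual resource-accounting code; the function name and the capacity figure are assumptions chosen for the example.

```python
# Illustrative sketch of EOS-style resource allocation: a user's share
# of network computation is proportional to their staked tokens
# relative to the total staked supply.

def compute_share(user_stake, total_staked, network_capacity_us):
    """Return the microseconds of CPU time a staker is entitled to."""
    if total_staked == 0:
        return 0.0
    return network_capacity_us * (user_stake / total_staked)

# Hypothetical numbers: 10,000 EOS staked out of 1,000,000 total,
# with 200,000 microseconds of CPU capacity in the window.
print(compute_share(10_000, 1_000_000, 200_000))  # → 2000.0
```

Under this model, owning (or leasing) tokens is effectively owning a slice of the network's processing power, which is the article's core contrast with Ethereum's pay-per-transaction fees.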
There are limitless ways to capture, use and reuse data, and doing so can help solve many global issues. Nevertheless, the main point stands: computation is more valuable than data.
Ripple (XRP): r3wiqBTo6rDB8gBsSuP9fYBnG28QfmKndZ
DISCLAIMER: The views in this article are personal opinions based on research and should not be considered financial advice. Always do your own research.