Harnessing the Computational Power Used for Proof-of-Work in Blockchains
It's well known that the proof-of-work carried out by miners to add a new block to the blockchain requires a lot of computational power. This power is spent searching for a hash value that satisfies a difficulty target. The purpose is to make adding a block computationally expensive, so that fraudulent behavior is discouraged.
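A minimal sketch of that search, using a simplified difficulty rule (a required number of leading zero hex digits, standing in for Bitcoin's compact target encoding):

```python
import hashlib

def mine(block_data: str, difficulty: int) -> tuple[int, str]:
    """Search for a nonce whose SHA-256 digest meets the difficulty target.

    `difficulty` = number of leading zero hex digits required; this is a
    simplified stand-in for a real chain's target encoding.
    """
    target_prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target_prefix):
            return nonce, digest  # proof-of-work found
        nonce += 1

nonce, digest = mine("example block", difficulty=4)
# Verification is cheap: a single hash confirms the expensive search.
assert hashlib.sha256(f"example block{nonce}".encode()).hexdigest() == digest
```

The asymmetry is the point: finding the nonce takes many hash attempts, but anyone can verify the result with one. All of that search work is discarded once the block is accepted, which is what motivates the question below.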
I was wondering whether, perhaps in the future, there will be a motivation to adopt a new kind of proof-of-work in which the miner is required to carry out some amount of useful data processing and provide proof of it. Such a trend would open up the opportunity for organizations and individuals to outsource their data processing tasks and issue a token of task completion.
A simple example of this idea: organizations wishing to have their data processed publish jobs to a public queue, and miners pull the jobs off and execute them, ultimately publishing the results to a pre-designated location on the client's side. In return, the client provides a globally acceptable token of completion. This could not be done with sensitive data, but it could free up capacity for sensitive data processing within organizations by offloading the non-sensitive work.
If such an endeavor became possible, would it drive data processing costs down? Just a vision.