Distributed Computing with Open Token and Hive

in LeoFinance · 3 years ago

The Distributed Network

DLUX runs on the Open_Token software, which is meant to be a modular architecture for community management. As we build toward our future, it's only becoming more apparent that trust is paramount to maintaining society. Open_Token is a Proof of Stake system for determining truth in a distributed system; its primary goal is processing a chain of transactions and determining changes to the community state file. Open_Token should be modular across multiple blockchains as well, allowing nodes to report on truth from other APIs. With the coming introduction of multiple authority keys, side processes can easily be implemented to include a process hash with the consensus reports, allowing Layer 3 systems to be run by a subgroup of nodes.
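To make the transaction-processing idea concrete, here is a minimal sketch of how a node might route prefixed custom_json operations to deterministic state-transition handlers over an in-memory community state. All names here (PREFIX, handlers, processOp, the balances shape) are hypothetical illustrations, not the actual Open_Token internals:

```javascript
// Hypothetical sketch: every node runs the same deterministic handlers over the
// same ordered transactions, so every node computes the same state file.
const PREFIX = 'dlux_'; // assumption: the community's coin prefix

const state = { balances: { alice: 100, bob: 0 } }; // community state, in memory

const handlers = {
  // each handler must be a pure, deterministic state transition
  send: (json, from) => {
    const amt = parseInt(json.amount, 10);
    if (amt > 0 && state.balances[from] >= amt) {
      state.balances[from] -= amt;
      state.balances[json.to] = (state.balances[json.to] || 0) + amt;
    }
  }
};

function processOp(op, from) {
  // op.id looks like `${PREFIX}send`; strip the prefix to find the handler
  if (!op.id.startsWith(PREFIX)) return;
  const handler = handlers[op.id.slice(PREFIX.length)];
  if (handler) handler(JSON.parse(op.json), from);
}

processOp({ id: 'dlux_send', json: '{"to":"bob","amount":"25"}' }, 'alice');
console.log(state.balances); // { alice: 75, bob: 25 }
```

Because every node replays the identical chain of operations through identical handlers, hashing the resulting state gives the "truth" the Proof of Stake layer votes on.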


Building a Module, Explained:

Build a community fork of the software:


$ git clone https://github.com/dluxio/dlux_open_token.git
$ cd dlux_open_token
$ git checkout -b sub-branch
// or, for an existing branch: git checkout sub-branch
$ npm install
$ vim index.js
// or: nano index.js
// or: code index.js

The file structure here is pretty simple.
You'll need to add a line like the following:


function startApp() {
    processor = hiveState(client, hive, startingBlock, 10, config.prefix, streamMode, cycleAPI);
+   processor.on('customFunction', newFunction)
    processor.on('send', HR.send);

This will call newFunction() whenever a custom_json tx with the id coinprefix_customFunction is found in a block.
Then define your function to handle a discrete memory system and the Process Chain object.

You can, of course, locate these in the HR obj for tidiness.

const newFunction = (json, from, active, pc) => {
    // types:
    //   json:   OBJ    - custom_json parsed to an obj, with block_num and transaction ID
    //   from:   STRING - who signed
    //   active: BOOL   - active key | posting key
    //   pc:     OBJ    - Process Chain controller

    var postPromise = getPathObj(['posts', `${json.a}/${json.p}`]); // active PoB posts in consensus
    var nodeMarket = getPathObj(['markets', 'node']); // where node reports are stored

    pc[0](pc[2]); // execute the process chain's next function outside of the Promise
    Promise.all([postPromise, nodeMarket])
        .then(function(v) {
            var post = v[0];
            var reports = v[1];
            var ops = [];
            if (Object.keys(post).length) { // sanity check for nil returns
                if (from == plasma.custom_func.auths[from].self) { // pull from a json memory
                    client.database.call('get_content', [json.a, json.p]).then(result => {
                        // do something clever with information from anywhere
                        var item = scan_for_computable_data(result);
                        plasma.custom_func.hashable.things[item.id] = item;
                    });
                }
            }
        })
        .catch(function(e) {
            console.log(e);
        });
}

Which API you draw from determines whether the result is hashable / consensable. These should all be relatively live, but be careful of things like votes, which can change from second to second, and consider how you want to handle microforking. result.last_update might be checked for a >5 minute update time, with an appropriate catcher to ask again after the remaining time has elapsed.
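One way that staleness check could look is sketched below. This is an assumption about workflow, not code from the project; the result shape follows Hive's get_content call, whose last_update field is a UTC timestamp string without a timezone suffix:

```javascript
// Hedged sketch: only treat API data as consensable once it has been stable
// for 5 minutes; otherwise report how long to wait before asking again.
const STALE_MS = 5 * 60 * 1000;

function ageOf(result, now = Date.now()) {
  // Hive timestamps are UTC but omit the 'Z'; append it before parsing
  return now - Date.parse(result.last_update + 'Z');
}

function whenConsumable(result, now = Date.now()) {
  const remaining = STALE_MS - ageOf(result, now);
  return remaining <= 0 ? 0 : remaining; // ms to wait before re-querying
}

// usage sketch:
// const wait = whenConsumable(result);
// if (wait === 0) { /* hash it into consensus memory */ }
// else setTimeout(askAgain, wait); // catcher: retry after the remaining time
```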

A great place to trigger a cron function:

if (num % 100 === 5 && processor.isStreaming()) { // block number mod and live update
    // check_function(num) // not promised, read only
}

if (num % 100 === 50 && processor.isStreaming()) { // reports go out at 100; set a time to consolidate or make reports
    custom_report(plasma) // plasma is the existing non-consensus node memory
        .then(nodeOp => { // return a nodeOp to schedule txs on the next block
            console.log(nodeOp)
            NodeOps.unshift(nodeOp) // unshift for immediate | push for queued execution
        })
        .catch(e => { console.log(e) })
    // to sign custom reports, shift an Operation to be signed into the queue
    // nodeOp = [ // formatting help
    //   [0, 0], // counter for broadcast errors
    //   [
    //     ["custom_json", { // or op of your choice
    //       required_auths: [config.username],
    //       required_posting_auths: [],
    //       id: `${config.prefix}newFunction`,
    //       json: JSON.stringify({
    //         hash: plasma.custom_func.hash,
    //         block: plasma.hashBlock,
    //         p: 'permlink', // address for mock function above
    //         a: 'author'
    //       })
    //     }]
    //   ]
    // ] // or trigger a broadcast for a different wallet
}

The side chain's memory can be saved by pinning it to IPFS and putting the hash in plasma.privHash, which automatically posts with the consensus Layer 2 posts, giving you a way to manage restarts and microforks.

Multi-Sig

What is arguably the most important feature to come to Hive's Layer 2 is autonomous multi-sig, which lets interested accounts collectively control outflows from a main net account. This tool adds major trust improvements that enable features like partial fills of DEX trades and cross-chain bridges to chain-controlled accounts. That's right: the ultimate DAO.

I hope you can take a few seconds to support Proposal 171

What is possible?

Build and maintain sub tokens. These can be for projects, NFTs, tokenization of property, or computing that requires multi-party verification. Automate tasks that depend on chain transactions. Anything you could write a smart contract for... What can't it do?

What would you like to see improve, what do you envision people collectively managing?