2nd HAF Projects Development Report for 2022



Thanks and appreciation to everyone who supported my DHF proposal to fund the work I am doing; it is now funded. I hope to provide value to Hive through this work and to see more features made available to the community through it.

In this report, I provide updates on the state of the two HAF projects I am currently working on: Plug & Play and the Global Notification System (GNS).

TL;DR is at the end of this post




Plug & Play

Version 2 of Plug & Play is underway. It will:

  • expand its capabilities beyond just custom_json operations to all HAF-supported operations
  • introduce a lot more customisation features
  • increase modularity

Current pull request: https://github.com/imwatsi/haf-plug-play/pull/20/files#

Note: the code is not ready to run yet, as I am testing and updating individual SQL functions in pgAdmin before tying everything together. The server is also unavailable at the moment because of the instability caused by the issues detailed below (HAF replays, server holdups, etc.); keeping the old version running would take time away from development.

PostgreSQL for core HAF tasks

I am moving the code that handles the synchronisation of plugs from Python to PostgreSQL. This removes most of the sync-related logic from Python, leaving it to handle setup, new database initialisation, and the APIs for data retrieval.

This will improve stability: previously, an error during sync could force a replay, which took upwards of 8 hours.

Other issues surfaced during the initial massive sync, where a connection problem meant having to re-sync all Plug & Play operations from the beginning.

Tests confirmed that moving most of the sync logic to PostgreSQL solved the problems above.
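
To make the new division of labour concrete, here is a minimal sketch of what a PostgreSQL-side sync step could look like. It is illustrative only: it assumes a HAF-style block-range API along the lines of hive.app_next_block(), and the pnp schema, table and view names are placeholders, not the actual Plug & Play code.

```sql
-- Illustrative sketch: schema, table and view names are assumptions.
CREATE OR REPLACE FUNCTION pnp.sync_step()
RETURNS void AS $$
DECLARE
    _blocks hive.blocks_range;  -- assumed HAF range type (first_block, last_block)
BEGIN
    -- Ask HAF for the next batch of blocks this context should process.
    _blocks := hive.app_next_block('plug_play');
    IF _blocks IS NULL THEN
        RETURN;  -- nothing new to process yet
    END IF;

    -- Copy the relevant operations into the app's own table; an application
    -- error can now be retried here instead of requiring an 8-hour replay.
    INSERT INTO pnp.ops (block_num, trx_in_block, op)
    SELECT block_num, trx_in_block, body
    FROM hive.operations_view  -- view name is an assumption
    WHERE block_num BETWEEN _blocks.first_block AND _blocks.last_block;
END;
$$ LANGUAGE plpgsql;
```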

Modular plugs

I am making the plugs independent of each other by giving each one its own HAF context, essentially turning them into separate “apps” in HAF terms. Each plug will maintain its own context and sync independently of the others. This contains errors within individual plugs and prevents them from cascading to the rest of the system.
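
As a rough illustration, assuming HAF's context-creation call (the plug names here are made up):

```sql
-- Each plug registers its own HAF context, so each syncs independently.
SELECT hive.app_create_context('pnp_follow');
SELECT hive.app_create_context('pnp_community');

-- A failure and replay in 'pnp_follow' now only rewinds that one context;
-- 'pnp_community' keeps its own sync position untouched.
```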

Plug definitions

I introduced the concept of “plug definitions” (adopted from my work on GNS) to make it possible to give plugs properties that determine, among other things, the following (a sketch follows the list):

  • start_block: the block height at which the plug should start processing its data
  • ops: the operation types and the associated SQL function names to be used as triggers for the plug
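
To illustrate, a plug definition might be stored along these lines; the table, column and function names are hypothetical, not the real schema:

```sql
-- Hypothetical plug definition for an example 'follow' plug.
INSERT INTO pnp.plug_definitions (plug_name, start_block, ops)
VALUES (
    'follow',
    40000000,               -- start_block: where this plug begins processing
    '{
        "custom_json_operation": "pnp.follow_process_custom_json",
        "transfer_operation":    "pnp.follow_process_transfer"
    }'::jsonb               -- ops: operation type -> handler SQL function
);
```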

Support for more operation types

To expand on the concept of Plug & Play as a turnkey tool to extract, process and create APIs for specific types of data from the Hive blockchain, I am changing how plugs interact with operations to allow for more flexibility. The definitions described in the previous section help with that: each plug declares which operation types it consumes and which SQL function handles each one.
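
A hypothetical dispatcher shows how a generic sync loop could route each operation to the handler named in a plug's definition, using standard PL/pgSQL dynamic SQL; the table and column names are assumptions:

```sql
CREATE OR REPLACE FUNCTION pnp.dispatch_op(_plug_name TEXT, _op_type TEXT, _op JSONB)
RETURNS void AS $$
DECLARE
    _handler TEXT;
BEGIN
    -- Look up the SQL function registered for this operation type.
    SELECT ops ->> _op_type INTO _handler
    FROM pnp.plug_definitions
    WHERE plug_name = _plug_name;

    IF _handler IS NOT NULL THEN
        -- Call it dynamically; %s is acceptable only because handler names
        -- come from trusted plug definitions, not from user input.
        EXECUTE format('SELECT %s($1)', _handler) USING _op;
    END IF;
END;
$$ LANGUAGE plpgsql;
```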

Documentation

With all these changes, the documentation is being updated to include:

  • The latest installation methods for HAF and Plug & Play
  • Setting up the plugs and defining them
  • Writing the SQL files associated with the plugs

ETA

I expect to get a production deployment ready within a few days, barring unforeseen challenges.




Global Notification System (GNS)

Current pull request: https://github.com/imwatsi/hive-gns/pull/12/files#

Note: this code is also not ready for deployment yet, and the server is currently too unstable for use.

When I initially developed the GNS MVP, the focus was on proving the concept, and some aspects of the system were not suited to the scale and scope now planned.

I have been working on core changes to how GNS works so that it can scale and support a wide range of Hive operations without complicating the implementation or making it hard to maintain and upgrade.

Preloading of historical account preferences

Since GNS keeps only a 30-day active dataset to populate its notification database, a newly set up node (or an existing one that is reset) has no access to account preferences broadcast before that 30-day window.

To rectify this, I developed a preload function that processes the gns operations broadcast since the first public release to populate the account-preferences state when none is found in the database.
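
Conceptually, the preload looks something like this sketch. Everything here is a placeholder: the schema and view names, the JSON body layout, and the first-release block number are assumptions, not the real GNS code.

```sql
CREATE OR REPLACE FUNCTION gns.preload_prefs()
RETURNS void AS $$
DECLARE
    _first_release_block CONSTANT INTEGER := 60000000;  -- placeholder block height
BEGIN
    -- Only preload when no preferences exist yet (fresh node or reset).
    IF EXISTS (SELECT 1 FROM gns.account_prefs) THEN
        RETURN;
    END IF;

    -- Replay every historical 'gns' custom_json to rebuild preference state,
    -- keeping each account's most recent broadcast.
    INSERT INTO gns.account_prefs (account, prefs)
    SELECT DISTINCT ON (account) account, prefs
    FROM (
        SELECT
            body -> 'value' -> 'required_posting_auths' ->> 0 AS account,
            (body -> 'value' ->> 'json')::jsonb               AS prefs,
            block_num
        FROM hive.operations_view        -- view name is an assumption
        WHERE op_type_id = 18            -- custom_json_operation
          AND body -> 'value' ->> 'id' = 'gns'
          AND block_num >= _first_release_block
    ) t
    ORDER BY account, block_num DESC;
END;
$$ LANGUAGE plpgsql;
```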

Operation filtering

I am currently working to add filtering support for all HAF operations, which greatly expands how much you can customise notifications. For example, when fully implemented, the following will be possible (sketched in SQL after the list):

  • custom_json_operation filtering by the JSON’s contents (key-value pairs for JSON object payloads; index-value pairs for array payloads)
  • transfer_operation filtering by specifying matching rules (complete phrases or partial matches) in the memo field
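
In PostgreSQL terms, such filters could reduce to predicates like the ones below. The gns.ops table and its columns are illustrative only, and GNS's actual filter rules will be expressed through its own definitions rather than raw SQL:

```sql
-- Match custom_json ops whose JSON payload contains a given key/value pair:
SELECT *
FROM gns.ops
WHERE op_type = 'custom_json_operation'
  AND (op ->> 'json')::jsonb @> '{"type": "setLastRead"}';

-- Match transfers whose memo contains a phrase (partial, case-insensitive):
SELECT *
FROM gns.ops
WHERE op_type = 'transfer_operation'
  AND op ->> 'memo' ILIKE '%invoice%';
```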

Modular operation processing

I am creating groups of separate SQL functions to handle the various HAF operations, replacing the somewhat monolithic approach currently implemented. This lets the system scale as new notification types are added.

It also makes it easier to change how a specific Hive operation is handled system-wide without breaking other parts. In effect, there will be an individual .sql file for each HAF operation type, containing the functions that define how to process that operation.
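
For example, a transfer_operation.sql module might contain a handler like this sketch (function, table and column names are made up for illustration):

```sql
CREATE OR REPLACE FUNCTION gns.process_transfer_operation(_op JSONB, _block_num INTEGER)
RETURNS void AS $$
BEGIN
    -- Turn a transfer into a notification for the receiving account.
    INSERT INTO gns.notifications (account, notif_type, payload, block_num)
    VALUES (
        _op ->> 'to',
        'transfer',
        jsonb_build_object(
            'from',   _op ->> 'from',
            'amount', _op ->> 'amount',
            'memo',   _op ->> 'memo'
        ),
        _block_num
    );
END;
$$ LANGUAGE plpgsql;
```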

More notification types

Work on this is delayed slightly, since the changes above block the development of additional notification types until they are done.

ETA

I expect to finish up the core changes within a week, after which the focus will shift to expanding the notifications available on GNS.


TL;DR

Plug & Play:

  • New Version 2 in progress
  • Support for more Hive operations, not just custom_json
  • A more PostgreSQL-heavy implementation for core HAF tasks, with Python used mostly for data retrieval and API access
  • Increased modularity, to make the codebase more manageable and accessible to developers

Global Notification System (GNS):

  • Operation-level filtering to allow developers to create fine-grained notification features
  • Increased modularity, to prepare GNS for scale
  • Preloading of historical account preferences for new node setups
  • More notifications will be added after work on the above is complete

Thanks for reading. Feel free to leave feedback.



To support my witness node:


thank you for all the work!

Thanks for your support!

It's not fucking clear - but it's very interesting!



This sounds very exciting, I hope we will be able to see these in action soon =)

Working as fast as I can :)