1st update of 2021 on BlockTrades work on Hive software

in HiveDevs · 4 years ago (edited)

It’s been several weeks since my last report: I was first busy preparing for HiveFest, then the holidays hit and I decided to focus on coding while much of our staff was on vacation and I had fewer management tasks to deal with. In particular, I wanted to make some large-scale changes to the hivemind code while fewer changes were coming in from other programmers.

In the meantime, I’m happy to announce that we’re about to tag our first “versioned” release of hivemind: v1.24.0. The major version is set to 1 to mark hivemind’s exit from its long-standing “beta” status and to reflect the major overhaul we’ve done to the code base; 24 indicates the associated hard fork version of hived to use; and the third number (currently 0) tracks minor revisions that improve performance, add features, etc. This is a major milestone for our Hive work, and it means we’re finally freeing up to take on big new tasks.

Hived work (blockchain node software)

We’re still working on a hived plugin that can write the needed data directly into hivemind’s database during reindexing and normal block reception. Most of the data provided by get_block_api is of no interest to hivemind, so fetching it through this API needlessly wastes CPU, in addition to slowing down hivemind. I expect the plugin approach to significantly speed up hivemind’s initial sync, and it should also reduce normal hivemind live-sync write time. Work is ongoing here:
https://gitlab.syncad.com/hive/hive/-/commits/km_live_postgres_dump/
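
To make the contrast concrete, below is a minimal sketch of the two sync strategies. The table and column names are hypothetical (not hivemind's actual schema), and the JSON-RPC call follows hived's block API as I understand it:

```python
# Hypothetical sketch contrasting the two sync strategies.
# Table/column names are illustrative, not hivemind's actual schema.
import json
import urllib.request

import psycopg2  # standard Postgres driver
# db_conn below would be e.g. psycopg2.connect(dbname="hivemind")


def fetch_block_via_api(node_url: str, block_num: int) -> dict:
    """Current approach: hivemind pulls a full block over JSON-RPC,
    even though most of the returned data is never used."""
    payload = json.dumps({
        "jsonrpc": "2.0",
        "method": "block_api.get_block",
        "params": {"block_num": block_num},
        "id": 1,
    }).encode()
    with urllib.request.urlopen(node_url, payload) as resp:
        return json.loads(resp.read())["result"]["block"]


def store_operations(db_conn, block_num: int, ops: list) -> None:
    """Plugin approach, conceptually: hived writes only the data
    hivemind needs straight into Postgres, skipping the JSON-RPC
    round trip and the parsing/filtering work on the hivemind side."""
    with db_conn.cursor() as cur:
        for op in ops:
            cur.execute(
                "INSERT INTO hive_operations (block_num, op_type, body) "
                "VALUES (%s, %s, %s)",
                (block_num, op["type"], json.dumps(op["value"])),
            )
    db_conn.commit()
```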

Work has also continued on the code implementing expiration of governance votes (witness votes, proposal votes, and voter proxy settings) when an account ceases activity. Work is in progress here (one of our new programmers is taking this on as an introduction to the core blockchain code):
https://gitlab.syncad.com/hive/hive/-/merge_requests/160

In the last developer call, someone suggested (happily, I think) that it sounded like at least 5 core developers at BlockTrades had worked on hived software since Hive started. I knew the number was higher, but I didn’t know the exact count, so I decided to check: it turns out there have been 10 (note this doesn’t include additional programmers working on other Hive software such as hivemind and condenser).

We also have some other people with experience in the hived code base from Steemit days who are currently on other projects, so it’s fair to say we have plenty of core programmers for the 1st layer, in case anyone was concerned on this point.

Hivemind (2nd layer microservice for social media applications)

As mentioned in the intro, we’re about to release an official version of hivemind, and a lot of work over the last few weeks was required to get there. Here’s a short list of some of the work that’s been merged to the release code base recently:

https://gitlab.syncad.com/hive/hivemind/-/merge_requests/426
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/411
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/395
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/387
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/428

And this was my “holiday work” on optimizing queries :-)
https://gitlab.syncad.com/hive/hivemind/-/merge_requests/454
It yields quite a big speedup: our production hivemind server now runs at a load of only around 1 (a hived instance also runs on the same machine, so total system load is around 1.5) while handling a large portion of Hive’s API traffic. In short, hivemind can now handle a huge amount of traffic without requiring a powerful server (my initial guess is that this single hivemind server could handle around 8 times this traffic without a problem). This means Hive has become a lot more decentralized, since your average unix enthusiast could now inexpensively set up a server capable of serving all our traffic.
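
To give a flavor of the kind of change involved (the table and index names below are hypothetical, not hivemind's actual schema), many such optimizations boil down to letting Postgres walk an index in the desired order instead of scanning and sorting a large intermediate result:

```sql
-- Hypothetical example of this style of optimization; table and
-- index names are illustrative, not hivemind's actual schema.

-- Before: Postgres may fetch and sort every post by the author
-- before applying the LIMIT, which is slow for prolific authors.
SELECT id, author, permlink, created_at
FROM hive_posts
WHERE author = 'alice'
ORDER BY created_at DESC
LIMIT 20;

-- A composite index matching the filter and the sort order lets
-- the planner read rows in order and stop after 20:
CREATE INDEX hive_posts_author_created_idx
    ON hive_posts (author, created_at DESC);
```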

I’ll prepare a detailed report on the results of optimizing the latency of individual API calls later in a spreadsheet, because the timing information there could be quite useful to 2nd layer app developers.

We also have pending hivemind work that hasn’t yet been merged, as validating changes still takes a long time (because of the time required for a “full sync” to 50M+ blocks), and we wanted to get an official release out and move API node releases back to the “master” branch, leaving the develop branch for experimental changes (due to the urgency of development and frequent database schema updates, we had been using the develop branch as our release branch for API nodes ever since HF24). As part of this official release, we ran a full sync test on the version just prior to the optimizations from MR 454 above, but the changes in 454 are unlikely to affect that result, since they mostly concern API optimization.

Condenser (open-source code for sites such as https://hive.blog)

We finished and deployed the decentralized list changes, but haven’t had much time to test them in production yet. AFAIK they are working OK, however. I’ll probably do some testing later today.

What’s the plan for next week?

  • Continue creating hivemind tests.
  • We still have a few more optimizations that can be done, similar to the changes in 454; we’ll probably complete them this week.
  • Move hivemind tests into hivemind repo instead of tests_api to simplify practical testing.
  • Begin design analysis for slimmed-down 2nd layer microservice and for development of smart contract platform based on it.
  • Continue work on speeding up hivemind full sync via the hived plugin, as the slow sync time has a big impact on the speed of hivemind CI (which sets an upper limit on how fast we can validate changes). This work is also required for the slimmed-down 2nd layer microservice.
  • Continue work on governance changes for HF25 (details on this work here: https://hive.blog/hive-139531/@blocktrades/roadmap-for-hive-related-work-by-blocktrades-in-the-next-6-months).

Could you share the specs of the "inexpensive" server?

It's a moderately-priced dedicated cloud server:

  • Relatively cheap CPU: E-2275G 4GHz (4 cores/8 threads; this CPU retails for $334)
  • 64GB RAM
  • Fanciest thing is that it has two 960GB Samsung NVMe drives (and these aren't very fancy or expensive)

For testing and development, we use much more powerful servers than this, of course, to save time. Our development servers are in our own private data centers, since top-of-the-line equipment is costly to rent.

"Inexpensive"

Honestly it's expensive if you think of the price itself, but considering one of those could handle all of hive's traffic, it's pretty inexpensive :)

To give you some context, Steemit used to spend multiple thousands per month on their infrastructure; the setup above is probably $150 or $200 a month.

From what I recall (and I'm usually pretty good at recalling such things), within the first few years, Steemit was spending $75K per month on supporting their nodes (i.e. around $900K/year).

This is crazy. Surely the cost has to increase as we get back to old activity levels? Otherwise I'll set up my own node after the crypto bull run.

At this point, I think we could increase traffic to 12x our current traffic (which would far exceed anything Steemit ever did), and the cost for the servers wouldn't go up (unless some network traffic costs kicked in, but I don't think this is the case).

Thanks for that info. We need marketing. This makes Hive look superior to all the other chains I know.

Am I wrong, or is this fantastic? That would allow handling SMTs in an easy and cheap way, right? :)

I mean, yes, but it's mostly because running a full node previously required 512 GB of RAM, and that drives the price up dramatically (plus Steemit ran multiple nodes for redundancy).

OK, but this is much cheaper for multiple nodes too, or am I wrong?

Wow!

The machine I'm running my witness on is more powerful than that!
It's an i7-6800K 3.6GHz (6 cores/12 threads), which is a similar-speed CPU.
128GB RAM
2 x 1TB NVMe drives in RAID 0

Are you saying that such a machine could handle ALL of Hive's current transactions!!?

yes

Well done for making Hive API nodes so cheap to run!

This is fantastic for decentralisation.

The 2 x 1TB super-fast NVMe drives cost me $105 each (in Israel).

RAIDed they deliver amazing performance.

So for a couple of hundred bucks I've upgraded a 4-year-old machine to be able to run ALL of Hive's traffic (if necessary).
Anyone with a bit of tech background can do this.

This means any one of the current 19 API nodes (with many more to come) could run all HIVE traffic in an emergency.

The dev chat was interesting, even though I did not understand a lot of the technical stuff. I am glad to see that the witness selection/retention system is still being looked at and a vote decay system is being examined. I feel that is very important to the safety net of the Hive blockchain.

I appreciate the efforts being taken to help keep users informed of changes, and I'm really looking forward to pagination on the followers list, so as to not need to keep scrolling forever.

It was one of your earlier comments on the blockchain that I think inspired arcange to put the topic on the agenda for the meeting (but work was already in progress).

It is difficult for us mere mortals to understand the blockchain, let alone step into the realm of the gods of GitLab. So even though a lot of the dev chats are over my head, I still listen, because I can pick up a few hints of what is in store.

I think once you have a person doing the documentation, perhaps some of that will spill over and help users stay a tad bit more informed of what is going on.

"Begin design analysis for slimmed-down 2nd layer microservice" - we could all use some slimming down after being locked up with noting to do but eat

You're not kidding. I didn't even do much exercise in December, since my regular workout partner was too busy with their job (delivering packages during the holiday). But just started back last Sunday, so hopefully I'll be able to lose some of it soon.

I wished for some gainz for the holidays, but I wanted to measure them in dollars, not kilos.

Friend, here's my visit from Venezuela. Happy New Year, very good work. I hope to count on your support so I can continue here. Greetings.

Thanks for the update

I’ll prepare a detailed report on the results of optimizing the latency of individual API calls later in a spreadsheet, because the timing information there could be quite useful to 2nd layer app developers.

This is super exciting.
We're rebuilding a bunch of connections for the release of our 1st "Universal NFT", and this would help a ton.

Appreciate all the work you do.

Scalability is vital if Hive is to grow. I hope to see that put to the test in coming months. We definitely don't want people getting a bad experience. Thanks.

😃😃😃😃😃😃

When moon? 🚀 🚀

It's there pretty much every night, unless there's a lot of cloud cover.

Thanks for the update!
I hope someday this hivemind improvement work and tests can end or slow down.
Looking forward to seeing what you guys can do to make HIVE thrive again!

Hello @blocktrades… I have chosen your post on “1st update of 2021 on BlockTrades work on Hive software” for my daily initiative to reblog and comment…
Let's keep working and supporting each other to grow at Hive!...

I saw on your crypto exchange blocktrades.us that EOS is under maintenance. Why is it under maintenance? Also, there are two USD Tether icons, one of which is under maintenance and the other of which can be traded. Are there two cryptos with the same name?

EOS had what was to us an essentially "surprise" hardfork, and it takes a long time to replay EOS in such a condition. We've been replaying for several weeks now. If it takes much longer, we may look for some way around the issue until the replay finishes.

USD Tether can be exchanged on multiple blockchains. Initially it was released on the OMNI network (an overlay network on top of Bitcoin's network), but eventually they added support for exchanging USD Tether over Ethereum as well. Virtually all our users now use the Ethereum network for exchanging Tether (it's cheaper and faster), so we haven't bothered to restore the OMNI wallet, since it would most likely just lead to user confusion (and increased fees if they used it). We may just drop the OMNI version from our web page unless we see a request to restore it to functionality.

What triggered the EOS hardfork?

If I understood correctly, it was some bug or similar unexpected "event", but I didn't pay close attention, so I could be wrong.

So is Leo part of the Hive blockchain or part of the Ethereum blockchain?

Maybe you accidentally replied to wrong comment? I'm not sure what you're referring to.

Congratulations @blocktrades! You have completed the following achievement on the Hive blockchain and have been rewarded with new badge(s):

Your post got the highest payout of the day

You can view your badges on your board and compare yourself to others in the Ranking
If you no longer want to receive notifications, reply to this comment with the word STOP

To support your work, I also upvoted your post!

Do not miss the last post from @hivebuzz:

Feedback from the January 1st Hive Power Up Day
Happy New Year - Project Activity Update

Greetings, happy new year 2021, and excellent work that you do to continuously improve our Hive blockchain.

I don't understand a thing, but things are moving behind the scenes and I like that! So, a question for those of us who don't get the tech stuff:

How important do you consider this update and the plan for next week on a scale of 1 to 10? Where exactly will it help Hive in general?

And also, did I sound like a reporter or something? :P

It's quite important for two reasons:
  • we've successfully completed increasing the scalability of Hive by about 12x, which means it is far more decentralizable
  • we can begin to work on new and important projects for Hive beyond current Hive capabilities

freaking awesome!!!!!

!wine


Cheers, @minnowspower You Successfully Shared 0.100 WINE With @blocktrades.
You Earned 0.100 WINE As Curation Reward.
You Utilized 1/3 Successful Calls.



WINE Current Market Price: 0.000 HIVE

Yo man, I'm just curious: do you have any idea how to change the way HBD is created? I know it's off topic, but if we find a good way, Hive could easily add an additional utility.

This is a consensus-level issue: you would need to get the witnesses to agree on such a change. If you think you can convince the witnesses, why don't you write a post about it? You could also go to the Hive blockchain source code repo and create an issue.

I actually missed the whole point, haha. What I was trying to ask is whether there's any debate regarding this issue. Given that blocktrades is one of the most relevant figures here, I just wanted to know where he stands on it.

Regardless, thanks for the time.

There is no debate, just me trying to convince others:

It has been my opinion for some time that Steem Dollar (and now Hive Dollar) inflation (Austrian definition) should stop completely until the Hive/HBD market-cap ratio exceeds 19 Hive for every HBD, and should stop again whenever the ratio drops to or below 19 Hive for every HBD. Then have some relation in between, so that dollar issuance starts to diminish at, say, 25 Hive for every HBD.

These 19 and 25 could be witness parameters rather than constants. If both values are set to 9, it is what we have today.

Should this be done, HBD could become a credible pegged currency. That's not such a high expectation, though, is it? Interest given to Hive Dollars in savings when they are above a dollar would also help control the price when it gets too high.

What does @blocktrades think? I am curios as well.

@leprechaun There's already code that stops printing of HBD when the ratio of issued HBD to Hive market cap goes too high. And there's also code that starts to lower the value of HBD if the amount of HBD issued continues to exceed the Hive market cap too much (this latter rule tends to encourage conversion of HBD to Hive before the haircut kicks in).
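
For the curious, here's a rough sketch of how rules like these work. The 9%/10% thresholds and all names below are assumptions for illustration, not the actual hived implementation:

```python
# Rough sketch of debt-ratio rules like those described above.
# The 9%/10% thresholds and all names are illustrative assumptions,
# not the actual hived implementation.

def hbd_print_rate(hbd_supply_usd: float, hive_market_cap_usd: float) -> float:
    """Fraction of rewards still paid in HBD: printing tapers off
    linearly as the debt ratio climbs from 9% to 10% of Hive's
    market cap, then stops entirely."""
    debt_ratio = hbd_supply_usd / hive_market_cap_usd
    if debt_ratio <= 0.09:
        return 1.0  # print HBD normally
    if debt_ratio >= 0.10:
        return 0.0  # stop printing HBD
    return (0.10 - debt_ratio) / 0.01  # linear taper in between

def hbd_conversion_price(hbd_supply_usd: float, hive_market_cap_usd: float,
                         feed_price: float) -> float:
    """The 'haircut': above the cap, conversions value HBD so the
    entire HBD supply can never claim more than 10% of the Hive
    market cap, encouraging conversion before this point."""
    if hbd_supply_usd <= 0.10 * hive_market_cap_usd:
        return feed_price
    return feed_price * (0.10 * hive_market_cap_usd) / hbd_supply_usd
```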

@occupation To allow more HBD to be printed while still allowing for stability, there are only two solutions I can see: 1) increase the value of Hive (since the max amount of HBD is effectively tied to this value) or 2) allow for collateralization of HBD via other currencies.

This latter method could potentially be implemented, but it would require some work, and it also brings currency custody risks (unless one assumes the existence of trustless 1:1 blockchain transfers, which don't really exist now). So I don't think we should focus attention on this direction.

We like to overcomplicate things haha. Would like to hear what @edicted thinks of it. :D

Sure whatever gets added to the marketplace of ideas is welcome. It was different when Bitcoin was in a bear market. These days the Hive Dollar is doing surprisingly well in terms of dollars. Maybe because the money printer is going brrrrrrrr.

Begin design analysis for slimmed-down 2nd layer microservice and for development of smart contract platform based on it.

Is it possible to write about the initial description/design of what you are considering here?

I've already discussed this some in our roadmap post, but I'll reiterate briefly here: the microservice app itself is quite simple. The idea is to have hived directly deliver blockchain data to a stripped-down version of hivemind holding just basic data: block history including operations, plus account information and balances. This would all be coded in Python/SQL, just like hivemind itself. Apps can then build whatever particular features they want on top.

And we'll have modular sets of auxiliary data and API calls that can then be added, for apps that want more standard features as part of their design. By modularly designing such features, we can have standardized subsets of the entire API that can be provided by different types of API nodes as well.

The smart contract platform would be built as an additional modular layer on top of this basic framework. There are multiple options for the smart contract implementation, but right now I'm favoring directly executing the contracts within Postgres itself, so smart contracts could be written in any of the languages directly supported by Postgres (Python, SQL, etc.) and Postgres would act as the sandboxing system.
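
As a concrete illustration of that idea, here's a minimal sketch of a "contract" installed as a PL/Python function inside Postgres. All names here are hypothetical, and a real design would need metering, permissions, and much stricter sandboxing:

```sql
-- Hypothetical sketch: a token-transfer "contract" as a PL/Python
-- function inside Postgres. Names are illustrative only.
CREATE EXTENSION IF NOT EXISTS plpython3u;

CREATE TABLE IF NOT EXISTS token_balances (
    account TEXT PRIMARY KEY,
    balance NUMERIC NOT NULL CHECK (balance >= 0)
);

CREATE OR REPLACE FUNCTION token_transfer(
    sender TEXT, receiver TEXT, amount NUMERIC
) RETURNS BOOLEAN AS $$
    # Debit the sender; the CHECK constraint aborts on overdraft.
    debit = plpy.execute(
        plpy.prepare(
            "UPDATE token_balances SET balance = balance - $1 "
            "WHERE account = $2 RETURNING balance",
            ["numeric", "text"]),
        [amount, sender])
    if len(debit) == 0:
        return False  # unknown sender account
    # Credit the receiver, creating the row if needed.
    plpy.execute(
        plpy.prepare(
            "INSERT INTO token_balances (account, balance) "
            "VALUES ($1, $2) "
            "ON CONFLICT (account) DO UPDATE "
            "SET balance = token_balances.balance + $2",
            ["text", "numeric"]),
        [receiver, amount])
    return True
$$ LANGUAGE plpython3u;

-- Example call:
-- SELECT token_transfer('alice', 'bob', 10.0);
```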

Oh I see. I remember the discussion about modular hivemind; many dapps already run stripped-down versions to some extent, so a unified/standard approach would be beneficial for sure, especially if it helps the number of 2nd layers designed for different purposes.

From what I understand, a smart contract would work with a specific dataset within Postgres, and the 2nd layer code/node would check the validity of that data during execution, then update it?!

about initial description/design

I was asking this to learn more of the thought process behind the system's design. In chat, you mentioned a token is necessary as an incentive for 2nd layer nodes; what other things are there to consider? Will it have an unstructured or structured design, configurability, verifiability, security, speed?

Thanks for sharing

All efforts and guidance to help and inform us are very important. Certainly I would like to learn more about the subject; in any case, we have people like you who help us with opinions and evaluations of our publications. Successes, from Venezuela. Happy New Year 2021!

Excellent information! Thanks for sharing!

You guys are the best, thanks for taking that time for us.

Is it possible to get Theta and TFuel added to your wallets?

@blocktrades All the coins increase except shitcoin Hive :p