Hive HardFork 26 Jump Starter Kit

in Blockchain Wizardry · 2 years ago

Intended for Hive API node operators, witnesses, and developers.

All you need to bring a new Hive instance up within a couple of hours, or to upgrade an existing one within minutes.

Yes, new Hive Hard Fork, new fancy logo reveal.

Code

GitLab

https://gitlab.syncad.com/hive/hive
Our core development efforts take place in a community-hosted GitLab repository (thanks @blocktrades). It hosts Hive core itself, along with many other Hive-related software repositories.

GitHub

https://github.com/openhive-network/hive
We use it as a push mirror of the GitLab repository, mostly for visibility and decentralization. If you have a GitHub account, please fork at least hive and star it if you haven’t done so yet. We haven't paid much attention to it, but apparently it matters for some outside metrics.

Services

API node

https://api.openhive.network

Seed node

hived v1.26.0 listens on gtg.openhive.network:2001.
To use it, add this line to your config.ini file:

p2p-seed-node = gtg.openhive.network:2001

If you don't have any p2p-seed-node = entries in your config file, built-in defaults will be used (which include my node too).

Stuff for download

TL;DR https://gtg.openhive.network/get

Binaries

./get/bin contains hived and cli_wallet binaries built on Ubuntu 20.04 LTS, which should also run fine on Ubuntu 22.04 LTS.
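For example, assuming the binaries are published directly under that directory as hived and cli_wallet (check the directory listing for the actual file names; they are an assumption in this sketch), fetching and preparing them could look like:

wget https://gtg.openhive.network/get/bin/hived
wget https://gtg.openhive.network/get/bin/cli_wallet
chmod +x hived cli_wallet
./hived --version

The last command should print the build details if everything downloaded correctly.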

Ubuntu 18.04 LTS

For those who are late to the party and are having trouble upgrading their Ubuntu 18.04 LTS to either 20.04 or 22.04, there’s also a ./get/bin/ubuntu-18.04-lts/ dir that has binaries built for Ubuntu 18.04 LTS. It’s not officially supported and building it required some extra steps, but if you have no other choice, be my guest.

Of course, you should upgrade your system soon anyway, because it reaches the end of its hardware and maintenance support on April 30, 2023.

Blocks

./get/blockchain/compressed
The compressed block_log file (roughly 338GB) and the block_log.artifacts file are updated once in a while and supersede the old uncompressed block_log (roughly 678GB) and block_log.index files.

Unfortunately, updating your local block_log by continuing the download (appending only the missing tail) is no longer supported because of offset differences between individual nodes.
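A fresh download could look roughly like this (the exact file names are an assumption here, so check the directory listing first):

wget https://gtg.openhive.network/get/blockchain/compressed/block_log
wget https://gtg.openhive.network/get/blockchain/compressed/block_log.artifacts

Both files then go into the blockchain/ subdirectory of your node's data directory.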

Snapshots

API

./get/snapshot/api/ contains a relatively recent snapshot of the API node with all the fancy plugins.
There’s a snapshot for the upcoming version v1.26.0, and also one for the old v1.25.0 in case you need to switch back.
The uncompressed snapshot takes roughly 813GB.
There’s also an example-api-config.ini file there that contains settings compatible with the snapshot.

To decompress, you can simply run it through something like: lbzip2 -dc | tar xv
(Using a parallel bzip2 implementation on multi-threaded systems can save you a lot of time.)
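Put together, and assuming the snapshot was downloaded as a single archive named snapshot-api-v1.26.0.tar.bz2 (a placeholder name for this sketch; the real one will differ), unpacking it into the current directory could look like:

lbzip2 -dc snapshot-api-v1.26.0.tar.bz2 | tar xv

You can also stream it straight from the server (curl -s <url> | lbzip2 -dc | tar xv) to skip storing the compressed archive.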

To use a snapshot you need:

  • A block_log file not smaller than the one used when the snapshot was made.
  • A block_log.artifacts file that matches your block_log file, to save the time needed to generate it (otherwise it will be regenerated).
  • A config.ini file compatible with the snapshot (see above), adjusted to your needs, but without changes that would affect the state.
  • A hived binary compatible with the snapshot

You can find all of that above.

Run hived with --load-snapshot name, assuming the snapshot is stored in snapshot/name
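For instance, assuming a data directory of ~/hive-data with the snapshot unpacked into ~/hive-data/snapshot/v1.26.0 (both paths are placeholders for this sketch), the start command would look roughly like:

hived -d ~/hive-data --load-snapshot v1.26.0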

Exchanges

There’s also a snapshot meant for exchanges in ./get/snapshot/exchange/ that allows them to get up and running quickly. It requires a compatible configuration, and the exchange account has to be one of those tracked by my node. If you run an exchange and want to be on that list to use the snapshot, please let me know.
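As a rough illustration only (the actual configuration has to match the one used to produce the snapshot), an exchange node tracking its own account with the account_history_rocksdb plugin would typically carry config.ini lines along these lines, where exchangeaccount is a placeholder for the exchange's Hive account:

plugin = account_history_rocksdb
account-history-rocksdb-track-account-range = ["exchangeaccount","exchangeaccount"]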

Hivemind database dump

./get/hivemind/ contains a relatively recent dump of the Hivemind database.
I use self-describing file names such as:
hivemind-20221008-v1.25.3-a68d8dd4.dump
That is, the date when the dump was taken and the revision of hivemind that was running it.
You need at least that version; also remember about the intarray extension.
Consider running pg_restore with at least -j 6 to run long-running tasks in parallel.
After restoring the database, make sure to run the db_upgrade script.

When restored from the dump, the database takes roughly 675GB. The dump file itself is just 60GB.
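As an illustrative sketch only (the database name, role, and connection details are assumptions; adjust them to your PostgreSQL setup), a restore could look like:

createdb hivemind
psql -d hivemind -c "CREATE EXTENSION IF NOT EXISTS intarray;"
pg_restore -j 6 -d hivemind hivemind-20221008-v1.25.3-a68d8dd4.dump

followed by the db_upgrade script mentioned above.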

Some more useful tips:

  • Remember that you need to add plugin = wallet_bridge_api in your config.ini file if you are going to use cli_wallet.
  • Upgrading from v1.25.0 to v1.26.0 requires a replay. If there's an existing state file, remove it or use --force-replay (see the sketch after this list).
  • If you can, reuse your current block_log file; that will save you a lot of time and bandwidth.
  • If your block_log file is uncompressed, you can (optionally) compress it with a tool like ~/build/programs/util/compress_block_log -j 32 -i blocks/uncompressed/ -o blocks/compressed/, download an already compressed one from https://gtg.openhive.network/get/blockchain/compressed/, or just keep using it (new blocks will be added in compressed form).
  • The block_log.index file is replaced by block_log.artifacts.
  • To avoid a replay if you are in a hurry, you can use one of the suitable snapshots.
  • The snapshot for exchanges can also be used for seed nodes and witness nodes (it just has more data than required).
  • Make sure to periodically back up your instance (using --dump-snapshot, etc.) to avoid the time required by reindexing.
  • If you need help, try asking community members on https://openhive.chat/channel/dev (login with your Hive account using Hivesigner)
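As referenced in the replay tip above, a minimal upgrade sequence for an existing node could look like this (the data directory path is an assumption for this sketch): stop the old instance, put the v1.26.0 hived binary in place, then start it with

hived -d ~/hive-data --force-replay

Alternatively, remove the old state file (shared_memory.bin, by default in the blockchain/ subdirectory of the data directory) and start hived with --replay-blockchain.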

All resources are offered AS IS.


Can someone please summarize what this new hardfork changes with respect to what we have today? It's a little confusing to follow all the technical updates from blocktrades, and I still can't find a post that explains it in a simple way.

Please take a look at a less technical one.

So excited for these 'upgrades' on the chain. Looking forward to it!

So when will it be possible to delegate RC? :)

That is part of the hard fork when it goes live in a couple of days.

I believe Actifit already has it built into their front end and presume that Peakd will have it ready when the HF goes live.

Soon. If all goes well, the HF will be triggered tomorrow at 12:00 UTC.

I truly appreciate the logo reveal, at least there's something in this post I understand :0) And it does look pretty awesome!

Also, the lack of arguments in any posts I've come across regarding this hardfork seems like a good sign :)


Ha, that is exactly what this logo reveal is for! ;-)
As for the upgrade, Hive is a really awesome platform already, so we could focus on really neat improvements and on enabling even better features for those who wish to build their apps on top of Hive.
OBI is really one of the coolest new features: it makes a given transaction set in stone within a second or two, whereas in the case of Bitcoin it's a soft whispered "maybe" after 10 minutes.

No arguments here, I love this place! I recently had an old friend from the early days reach out and ask which crypto blogging platform was the best for posting right now. In my opinion this isn't just the best, it's truly the only one of its kind that I would recommend :)



I liked the part about the video because shiny.

Fortunately it's square/cube shaped. Beware of shiny ring-like objects.

Those rings don't scare me, as long as they sign the prenup.

bu dum tss!


It is going to enhance a few parameters on the Hive blockchain. Keep going.



I like the fact that Hive keeps evolving and that its growth increases by the day. I have come to realize that Hive has a vision that will last for generations to come. I am very excited for this upgrade, @gtg. Thanks for the information; it was very valuable to me.

I like it; the future is on the Hive blockchain.
