
Add beets fee split on Optimism#125

Merged
franzns merged 34 commits into biweekly-runs from feat-optimism-beets-split
Jul 10, 2025

Conversation

franzns (Contributor) commented Jun 4, 2025:

Beets fee split, as per the BIP here: https://forum.balancer.fi/t/bip-800-deploy-balancer-v3-on-op-mainnet/6415

This is not yet a finished implementation, but it serves as a POC to assess the needed code changes in the fee-allocator-v2 and decide whether it is feasible to integrate the fee split into the process.

TLDR of the split:

  • Fees that go to either the Balancer DAO or passive veBAL are split 50/50 with the Beets DAO.
  • The 70% fee recycling for core pools as veBAL incentives is unchanged.
  • The Alliance Partner allocation of 17.5% is unchanged for both core and non-core pools.

I have added this split to the allocator and the recon, as well as to the output files and the resulting payloads.
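The split described in the bullet points above can be sketched roughly as follows. This is an illustration only, not the fee-allocator-v2 implementation: the function name and result keys are assumptions, and only the percentages come from the description.

```python
# Hypothetical sketch of the Optimism core-pool fee split described above.
# Percentages come from the PR description; function and key names are
# illustrative, not the fee-allocator-v2 API.

VEBAL_RECYCLING_PCT = 0.70   # core pools: recycled as veBAL incentives (unchanged)
ALLIANCE_PCT = 0.175         # Alliance Partner allocation (unchanged)
BEETS_SHARE_PCT = 0.50       # Beets DAO takes half of the DAO / passive-veBAL remainder

def split_core_pool_fees(total_fees: float) -> dict:
    """Split core-pool fees collected on Optimism per the description above."""
    recycled = total_fees * VEBAL_RECYCLING_PCT
    alliance = total_fees * ALLIANCE_PCT
    remainder = total_fees - recycled - alliance  # DAO / passive-veBAL portion
    beets = remainder * BEETS_SHARE_PCT
    return {
        "vebal_incentives": recycled,
        "alliance": alliance,
        "beets_dao": beets,
        "dao_and_passive_vebal": remainder - beets,
    }
```

With these numbers the Beets DAO ends up with half of the remaining 12.5%, i.e. 6.25% of the total core-pool fees.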

The fee-allocator-v2 pulls some data from package dependencies (bal_tools or bal_addresses) as well as fetching it from GitHub raw URLs. Currently in this PR, these are either hard-coded in the allocator code or adapted in the local dependency tree for testing. I have not yet opened PRs for these changes, as I want to align on this POC first. Here is a list of the changes in the dependencies that would be needed if we decide to move forward:

  1. Add optimism to the production chains here, so the action builds the core-pools cache:
    https://github.com/BalancerMaxis/bal_addresses/blob/main/extras/chains.json#L48

  2. Add the ECLP core pools here:
    https://raw.githubusercontent.com/BalancerMaxis/bal_addresses/main/config/core_pools_whitelist.json

  3. Add optimism here: https://github.com/BalancerMaxis/bal_tools/blob/main/bal_tools/pools_gauges.py#L300

  4. Add beets_share_pct here: https://raw.githubusercontent.com/BalancerMaxis/multisig-ops/main/config/protocol_fees_constants.json
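As an illustration of item 4, the new constant might look something like the following. The key name comes from the PR; the nesting and the 0.5 value are assumptions based on the 50/50 split described above, and the actual structure of protocol_fees_constants.json may differ:

```json
{
  "beets_share_pct": {
    "optimism": 0.5
  }
}
```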

Xeonus requested a review from jalbrekt85 on June 10, 2025.
gosuto-inzasheru (Contributor) commented:

@franzns is this ready for review? Can you resolve the conflicts and remove the draft status?

franzns (Contributor, Author) commented Jun 10, 2025:

> @franzns is this ready for review? Can you resolve the conflicts and remove the draft status?

> This is not yet a finished implementation, but it serves as a POC to assess the needed code changes in the fee-allocator-v2 and decide whether it is feasible to integrate the fee split into the process.

If we decide to move forward and integrate the fee split, I'll put up all needed PRs, resolve any conflicts and mark them as ready for review.

franzns (Contributor, Author) commented Jun 12, 2025:

This PR depends on:

  1. Add Optimism Support bal_addresses#685
  2. Integrate Optimism bal_tools#107
  3. Add beets share pct for OP multisig-ops#2124

and can only be reviewed/merged after the above have been merged.

Xeonus (Contributor) commented Jun 16, 2025:

@franzns you can point to the specific branches for the dependencies, to be able to do test runs and evaluate the code changes.

franzns (Contributor, Author) commented Jun 17, 2025:

> @franzns you can point to the specific branches for the dependencies, to be able to do test runs and evaluate the code changes.

I can, but wouldn't it make more sense to merge the dependent PRs first and then have a proper PR here that points to the prod branches?

Xeonus (Contributor) commented Jun 17, 2025:

@franzns it would help us evaluate everything, and you could then do test runs. It makes it easier for us to review; that's usually how we operate with such feature changes. Thanks!

franzns marked this pull request as ready for review on June 18, 2025.
franzns (Contributor, Author) commented Jun 18, 2025:

> @franzns it would help us evaluate everything, and you could then do test runs. It makes it easier for us to review; that's usually how we operate with such feature changes. Thanks!

Everything is updated to use the specific branches. What would be the best way to trigger a test fee run?

franzns (Contributor, Author) commented Jun 18, 2025:

Was able to create a test run, @Xeonus: #146

gosuto-inzasheru (Contributor) left a review comment:

I used gha-biweekly-fees-1749217417...gha-biweekly-fees-1750258817#diff-6a8606c6cac86f715f1c824a9782d0a1b0383bd24f290860ad8b5ba8bc500df6R65 to diff the artifacts of the test fee run for this PR against the respective prod run:

  "base": 16742345432,
- "mainnet": 17707104378
+ "mainnet": 17707104378,
+ "optimism": 1856000000
gosuto-inzasheru (Contributor) left a review comment:

idem

gosuto-inzasheru (Contributor) commented:

I still need to compare the artifacts!

gosuto-inzasheru (Contributor) commented:

@franzns almost there :D Can you resolve the merge conflicts, rebase onto the latest main branch, and run another round for the period ending 2025-06-19? There have been quite a few updates to the main branch, and we want to make sure they won't cause issues. Then we can compare your artifacts to #149.

gosuto-inzasheru (Contributor) commented:

@franzns I was able to run the CI test suite, but test_allocator.py::test_core_pool_allocation now fails:

https://github.com/BalancerMaxis/protocol_fee_allocator_v2/actions/runs/15898417270/job/44835417755?pr=125

jalbrekt85 (Collaborator) commented Jun 26, 2025:

> @franzns I was able to run the CI test suite, but test_allocator.py::test_core_pool_allocation now fails:
>
> https://github.com/BalancerMaxis/protocol_fee_allocator_v2/actions/runs/15898417270/job/44835417755?pr=125

The alliance pools are causing this deviation now that we have a couple; I updated the tests to account for this in #154.

franzns (Contributor, Author) commented Jul 2, 2025:

New PR with the allocation: #159

Still need to check the tests.

EDIT: the tests seem to have passed 😅

jalbrekt85 (Collaborator) left a review comment:

Can you add a test that checks the Beets fee share? I think you should be able to just add Optimism to the input fees in the allocator pytest fixture and then add a test_beets_split (or similar) that makes sure the new fee split works as intended.
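A minimal sketch of what such a test could look like, assuming a 0.5 beets_share_pct; the function name and the split helper are hypothetical, not the actual fee-allocator-v2 test suite or fixtures:

```python
# Hypothetical sketch of the suggested test_beets_split; names are
# illustrative, not the actual fee-allocator-v2 test suite API.
from math import isclose

BEETS_SHARE_PCT = 0.5  # assumed beets_share_pct value for Optimism

def split_with_beets(dao_share):
    """Split the DAO / passive-veBAL portion 50/50 with the Beets DAO."""
    beets = dao_share * BEETS_SHARE_PCT
    return beets, dao_share - beets

def test_beets_split():
    beets, dao = split_with_beets(100.0)
    assert isclose(beets, 50.0)
    assert isclose(dao, 50.0)
    # the split must not create or destroy fees
    assert isclose(beets + dao, 100.0)
```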

Comment on lines 96 to 103
try:
    chain = CorePoolChain(self, chain_name, fees, self.w3_by_chain[chain_name])
except ConnectionError as e:
    print(f"Failed to initialize chain {chain_name} with fees {fees}. Trying again after 10s...")
    # retry with delay
    time.sleep(10)
    chain = CorePoolChain(self, chain_name, fees, self.w3_by_chain[chain_name])
    print(f"Successfully initialized chain {chain_name} with fees {fees}.")
franzns (Contributor, Author) replied:

Do you want to keep this or remove it, @jalbrekt85?

jalbrekt85 (Collaborator) replied:

I think it should be removed; gosuto added retry logic to the w3 class already: BalancerMaxis/bal_tools#112

franzns (Contributor, Author) replied:

Ok, removed

jalbrekt85 (Collaborator) left a comment:

lgtm

franzns merged commit f44792f into biweekly-runs on Jul 10, 2025 (4 checks passed).
franzns deleted the feat-optimism-beets-split branch on July 10, 2025.