Description
The bottracker curation estimator's reward calculations show slight deviations from the values calculated by Steem. Steem calculates the "weight" of each vote with a formula that uses the square root of the voter's rshares in relation to the sum of all previous voters' rshares. A great explanation of the curation reward system was given by @miniature-tiger in:
- An Illustrated Guide to Curation - from the simple to the complex - with real examples from past posts - Part 1
- An Illustrated Guide to Curation - from the simple to the complex - with real examples from past posts - Part 2
The core math is the calculation of
weight = (sqrt(prev_rshares + vote_rshares) - sqrt(prev_rshares)) * [30min-penalty] * [beneficiaries]

where prev_rshares is the sum of rshares of all previous votes and vote_rshares is the rshares of the current vote.
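As a minimal illustration, here is my own Python sketch of that calculation using the exact square root (which, as described below, is not what Steem actually uses); the function name is mine, and the 30-minute penalty and beneficiary factors are treated as pre-computed multipliers:

```python
import math

def curation_weight(prev_rshares, vote_rshares, penalty=1.0, beneficiaries=1.0):
    """Weight of a single vote: growth of the square root of the cumulative
    rshares, scaled by the 30-minute penalty and beneficiary factors."""
    return (math.sqrt(prev_rshares + vote_rshares) - math.sqrt(prev_rshares)) \
        * penalty * beneficiaries
```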
Trying to repeat the math manually on a couple of posts, I saw deviations from the Steem-calculated weights. It turned out the reason is that Steem does not use the actual square root function but an approximation. I suspect this is for performance reasons, because a square root calculation on a 128-bit integer may be computationally expensive. The corresponding approximation function is at https://github.com/steemit/steem/blob/06cd84fddeea2cc9fca179a0f093976864ba7b09/libraries/chain/util/reward.cpp#L17
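For reference, here is a rough Python sketch of the technique as I read the linked reward.cpp (the number is split into its most significant bit and a mantissa, and the mantissa bits are interpolated linearly); the C++ source remains authoritative:

```python
def approx_sqrt(x):
    """Step-wise linear integer square-root approximation (sketch of my
    reading of the linked reward.cpp, not a verified port)."""
    if x <= 0:
        return 0
    msb_x = x.bit_length() - 1            # position of the highest set bit of x
    msb_z = msb_x >> 1                    # the result's MSB sits at half that position
    mantissa_x = x & ((1 << msb_x) - 1)   # bits of x below its MSB
    mantissa_z_hi = (1 << msb_z) if (msb_x & 1) else 0
    mantissa_z_lo = mantissa_x >> (msb_x - msb_z)
    mantissa_z = (mantissa_z_hi | mantissa_z_lo) >> 1
    return (1 << msb_z) | mantissa_z
```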
The following graph shows a plot of sqrt() and approx_sqrt() over an rshare range of [100, 2e14]. The exact values are calculated with Python's math.sqrt(). As a reference, the current rshare values of 5% and 100% utopian-io votes at ~100% VP are shown.
You can see that approx_sqrt() uses a step-wise linear approximation of the square root, and that the approximation tends to give higher results than the exact function.
The following graph shows the relative difference between approx_sqrt() and sqrt(), normalized to the sqrt() result. Please note the logarithmic scale on the x-axis.
You can see that the deviation between the two square root functions oscillates between 0 and around +6% in most cases. For low rshare values, the deviation can also be negative.
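The relative deviation can be reproduced along these lines, reusing the approx_sqrt() sketch above (the sampled rshare values are an arbitrary grid roughly covering the plotted range):

```python
import math

# sample the plotted rshare range [100, 2e14] on a roughly logarithmic grid
for exponent in range(2, 15):
    for factor in (1, 2, 3, 5, 7):
        x = factor * 10**exponent
        exact = math.sqrt(x)
        rel_dev = (approx_sqrt(x) - exact) / exact
        print(f"{x:>16d}  {rel_dev:+.2%}")
```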
This means that using Math.sqrt() may give results that are up to around 6% off the values calculated by Steem. This is, however, not necessarily the resulting error on the final curation reward in STEEM; that would have to be calculated with proper error propagation through the full formula.
I manually compared the results of both functions for a handful of posts and saw deviations on the order of 0-3%, but this is obviously not statistically significant.
The approx_sqrt() function should be rather easy to implement in the bottracker curation estimator to get results consistent with Steem. Unfortunately, my JavaScript skills are close to non-existent, so I can't provide a PR.
Posted on Utopian.io - Rewarding Open Source Contributors