Smart Contracts

The Token Capacitor

The token capacitor is the smart contract that releases tokens for grants and accepts tokens as donations. The tokens in the capacitor are released at a rate that decays exponentially over time. Panvala’s token capacitor is configured with a half-life of 1456 days (four 52-week years), similar to the roughly four-year halving interval of Bitcoin’s block reward. This half-life is informed by the practices of other digital currencies, as well as by common practices for issuing shares of corporations. However, it’s still just a guess. We’ve hardcoded this value not because it’s definitely the right choice forever, but because we believe that making it easy to alter the release curve would deter participation.

Withdrawing tokens from the token capacitor requires permission to be granted through the slate governance process. That process has its own timeline for granting permissions, but the token capacitor itself does not enforce restrictions on the timing of withdrawals. It only restricts the amount of tokens that can be withdrawn based on the balance after the last withdrawal or donation, the time of that change, and the amount of time that has elapsed since then.
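The amount-based restriction can be sketched as follows. This is a simplified model in Python, not the contract's actual interface; the function name and the use of floating point are illustrative assumptions.

```python
# Hypothetical sketch of the capacitor's release cap. The real contract
# uses integer arithmetic; names here are illustrative, not the actual API.
HALF_LIFE_DAYS = 1456

def releasable_tokens(balance_at_last_event: float, days_elapsed: float) -> float:
    """Tokens eligible for withdrawal: the portion of the balance at the
    last deposit/withdrawal that has decayed away since that event."""
    remaining = balance_at_last_event * 0.5 ** (days_elapsed / HALF_LIFE_DAYS)
    return balance_at_last_event - remaining

# After one full half-life, half the balance has become releasable.
print(releasable_tokens(50_000_000, 1456))  # 25000000.0
```

Note that nothing here depends on wall-clock restrictions: only the balance at the last change and the elapsed time matter.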

Exponential Decay

The token capacitor releases tokens at rates such that its balance decays exponentially. Ideally, this decay would follow the formula for exponential decay:

    balance(t) = balance(0) × (1/2)^(t / halfLife)

where t is the elapsed time and halfLife is 1456 days.
However, the floating point operations needed to implement this formula have determinism issues, making it a poor fit for execution on a blockchain, where thousands of nodes need to agree on every result bit for bit. The Ethereum Virtual Machine omits floating point instructions for this reason. This leaves us with two attractive approaches for implementing exponential decay: store a lookup table of pre-calculated decay factors for selected values of t, or create a schedule of release rates that approximates exponential decay with a piecewise function.
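Off-chain, the ideal curve is easy to evaluate with floating point; the sketch below shows the target that both on-chain approaches approximate (names are illustrative).

```python
import math

HALF_LIFE_DAYS = 1456

def ideal_balance(initial: float, t_days: float) -> float:
    # balance(t) = balance(0) * (1/2)^(t / halfLife),
    # equivalently balance(0) * exp(-t * ln 2 / halfLife)
    return initial * math.exp(-t_days * math.log(2) / HALF_LIFE_DAYS)

print(ideal_balance(50_000_000, 1456))  # ~25,000,000 after one half-life
print(ideal_balance(50_000_000, 2912))  # ~12,500,000 after two half-lives
```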

It is easier to verify that a particular implementation of a piecewise schedule is free of any flaws that could throw off the supply policy of the system. A piecewise schedule releases the same amounts regardless of history, while approximating the curve more closely with a lookup table produces behavior that depends on the prior sequence of balances and the multipliers applied at each step. In addition, since the goal of these smart contracts is to build consensus within a large community, it’s useful to be able to communicate exactly how many tokens should be released using math that the public can do in their heads. Bitcoin’s block reward schedule also approximates exponential decay in this manner.

However, Panvala’s token capacitor releases are based on the current balance, not the current time as Bitcoin’s are. Bitcoin can read the clock to determine how many halvings have occurred, but Panvala would have to store or calculate the balance boundaries for each release rate. Since donations can make the balance fluctuate unpredictably, any piecewise schedule implementation would also have to account for releases that cross boundaries of the schedule. Together, these concerns increase the complexity of the implementation to the degree that accepting the flaws of the lookup table approach is the right tradeoff.

Creating the Lookup Table

To create the lookup table, we must first select the smallest time interval that the table will support. The smaller the interval, the more multiplications are needed to cover a given period, so the error from truncation compounds with more iterations. To use these multipliers with integers, we must choose a precision level to multiply by before applying the multiplier, then divide out the precision factor when we’re done. We’ve chosen one day as the smallest interval and 1 x 10^12 as our precision factor. Together, they produce an error of about 531 tokens out of 50,000,000 over one half-life.
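The smallest-interval entry can be derived as below. This is a sketch under the stated parameters; the deployed table is precomputed, and its exact values may differ in the last digits.

```python
# Deriving the one-day multiplier from the chosen interval and precision.
SCALE = 10 ** 12          # precision factor
HALF_LIFE_DAYS = 1456     # half-life in days

# Scaled one-day decay factor, truncated to an integer.
one_day = int(SCALE * 0.5 ** (1 / HALF_LIFE_DAYS))

# Compounding the truncated factor daily for one half-life lands just
# under 1/2, since truncation error accumulates with every step.
decay = SCALE
for _ in range(HALF_LIFE_DAYS):
    decay = decay * one_day // SCALE

print(decay / SCALE)  # just under 0.5
```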

We fill the rest of the lookup table with multipliers for power-of-two numbers of days, which maintains more accuracy when more time has elapsed between the capacitor’s balance changes. However, we expect to achieve a flow of donations that exceeds one per day, which would cause the one-day multiplier to be used far more often than any other.
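Such a table can be sketched by computing each power-of-two entry directly from the curve, so that long intervals stay accurate rather than inheriting compounded error. The variable name and table size here are assumptions, not the contract's actual values.

```python
# Sketch of a power-of-two lookup table: entry i holds the scaled decay
# factor for 2^i elapsed days, computed directly from the curve.
SCALE = 10 ** 12
HALF_LIFE_DAYS = 1456

decay_multipliers = [
    int(SCALE * 0.5 ** (2 ** i / HALF_LIFE_DAYS))
    for i in range(12)  # supports intervals up to 2^12 - 1 = 4095 days
]

# Entries shrink as the interval they cover grows.
```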

Each time we multiply by a multiplier, any error present compounds. As a result, repeatedly applying multipliers for fewer elapsed days releases slightly more tokens than performing fewer multiplications with multipliers for more elapsed days.
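This compounding can be demonstrated numerically. The sketch below, assuming truncating multipliers as above, composes 64 days from sixty-four one-day multiplications and compares the result with a single 64-day multiplication:

```python
# Comparing many small multiplications against one large one.
SCALE = 10 ** 12
HALF_LIFE_DAYS = 1456

def factor(days: int) -> int:
    # Scaled decay factor for the given interval, truncated to an integer.
    return int(SCALE * 0.5 ** (days / HALF_LIFE_DAYS))

repeated = SCALE
for _ in range(64):
    repeated = repeated * factor(1) // SCALE   # truncates on every step

single = SCALE * factor(64) // SCALE           # truncates once

# The repeated path ends at or below the single-step path: a lower
# remaining-balance factor means slightly more tokens counted as released.
print(repeated <= single, single - repeated)
```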

    function calculateDecay(uint256 _days) public view returns (uint256) {
        // The table can compose any interval up to 2^n - 1 days, where n
        // is the number of stored multipliers.
        require(_days <= (2 ** decayMultipliers.length) - 1, "Time interval too large");

        uint256 decay = scale;
        uint256 d = _days;
        for (uint256 i = 0; i < decayMultipliers.length; i++) {
            uint256 remainder = d % 2;
            uint256 quotient = d >> 1;
            if (remainder == 1) {
                // This bit of _days is set: fold in the factor for 2^i days.
                uint256 multiplier = decayMultipliers[i];
                decay = decay.mul(multiplier).div(scale);
            } else if (quotient == 0) {
                // Exit early if both quotient and remainder are zero
                break;
            }
            d = quotient;
        }
        return decay;
    }
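The loop above walks the binary representation of _days, folding in the table entry for each set bit. A Python mirror of the same logic, with an illustrative table built as in the earlier sketches:

```python
# Python mirror of calculateDecay: binary decomposition of the elapsed
# days over a power-of-two table (table values are illustrative).
SCALE = 10 ** 12
HALF_LIFE_DAYS = 1456

decay_multipliers = [
    int(SCALE * 0.5 ** (2 ** i / HALF_LIFE_DAYS)) for i in range(12)
]

def calculate_decay(days: int) -> int:
    """Combine the factors whose power-of-two day counts sum to `days`."""
    assert days <= 2 ** len(decay_multipliers) - 1, "Time interval too large"
    decay = SCALE
    d = days
    for multiplier in decay_multipliers:
        if d & 1:                        # this power of two is part of `days`
            decay = decay * multiplier // SCALE
        d >>= 1
        if d == 0:                       # exit early once no bits remain
            break
    return decay

# After one half-life (1456 = 1024 + 256 + 128 + 32 + 16 days), roughly
# half the scaled balance remains.
```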