Fixed a few typos

Vitalik Buterin 2018-08-08 13:09:45 -04:00
parent b4f16cb807
commit f0c698dfc1
2 changed files with 14 additions and 14 deletions



@@ -91,7 +91,7 @@ One of the most challenging issues in blockchain protocol design is how to limit
\section{Introduction and Model}
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
A blockchain is a decentralized network consisting of a large number of computers, each of which must process all transactions that transaction senders upload to the chain. Hence, a transaction that is published to a blockchain confers some private benefit to its sender, but also imposes an external social cost on the network's participants. In order to account for this social cost and prevent abuse of the blockchain as a common pool resource, some economic mechanism for restricting what transactions get included is required. However, there are many types of economic mechanisms that can be used to solve resource pricing problems of this type, and understanding which one is optimal requires a deeper understanding of the nature and types of the social costs in question.
The social costs can be broken down in two ways. First, one can categorize by fundamental type of resource expenditure:
@@ -111,7 +111,7 @@ Second, one can categorize by different types of first and second-order effects.
Each user $U_i$ has some direct resource cost function $R_i(W)$ representing the cost to the user of processing a given amount of weight. This cost can include electricity and bandwidth costs, marginal disk wear and tear, inconvenience from a user's other applications running more slowly, reduced battery life, and so on. For sufficiently high $W$, the costs become unacceptable to any given user, at which point the user will drop offline (we assume $R_i(W)$ is flat above this point). Let $NodeCount(W)$ be the number of users still online at weight $W$. Note that different users could drop offline at different points for either of two reasons: (i) some users have a lower resource cost than others, and (ii) some users value being connected to the blockchain more than others.
There is some utility function $D(k)$ reflecting the social value of the level of decentralization achieved by having $k$ online nodes, which can be combined with $NodeCount(W)$ to give a function $D(W)$ of the total transaction load. There may also be some cost function $A(W)$ that reflects the increased ease of attacking the network as more transactions get included. We can summarize all of these costs as a combined cost function $C(W) = \sum_i R_i(W) + (A(W) - A(0)) - (D(W) - D(0))$.
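To make the model concrete, the following Python sketch composes these pieces into $C(W)$; every functional form in it ($R_i$, $NodeCount$, $D$, $A$) is an invented toy assumption, chosen only to illustrate how the terms combine, not an empirical claim.
\begin{verbatim}
import math

NUM_USERS = 1000  # illustrative population size

def resource_cost(i, W):
    # R_i(W): user i's direct cost of processing weight W. Users are
    # heterogeneous, and the cost goes flat once the user drops offline.
    per_unit = 0.001 * (1 + i / NUM_USERS)
    dropout_W = 10000 / (1 + i)   # higher-cost users drop out sooner
    return per_unit * min(W, dropout_W)

def node_count(W):
    # NodeCount(W): users still online at weight W.
    return sum(1 for i in range(NUM_USERS) if W < 10000 / (1 + i))

def D(k):
    # Social value of decentralization, assumed logarithmic in node count.
    return math.log(k)

def A(W):
    # Attack-ease cost, assumed linear in weight for this sketch.
    return 0.0005 * W

def C(W):
    # C(W) = sum_i R_i(W) + (A(W) - A(0)) - (D(W) - D(0))
    direct = sum(resource_cost(i, W) for i in range(NUM_USERS))
    return direct + (A(W) - A(0)) - (D(node_count(W)) - D(node_count(0)))

for W in (0, 100, 1000, 5000):
    print(W, round(C(W), 2))
\end{verbatim}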
The above suffices as a model of a blockchain for the purposes of this paper; we do not need to concern ourselves with the details of proof of work, proof of stake, block structure, etc., except insofar as those consensus algorithms and blockchain design patterns affect $NodeCount$ and $A$, and therefore $C$.
@@ -176,9 +176,9 @@ Taken together, if the consumer's marginal private costs increase faster with qu
\frac{ \fname{private\_benefit}^{\prime \prime}( \opname{quantity} ) }{ \fname{social\_cost}^{\prime \prime}( \opname{quantity}) } > 1 \; ,
\end{equation}
then setting prices is better, and in other cases setting quantities is better. Note that we need to use the second derivative because we are specifically talking about \emph{the rate of change in marginal costs}.
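As a numerical illustration of this choice rule, the following Monte Carlo sketch compares the expected welfare losses of a price policy and a quantity policy under quadratic benefit and cost curves with an additive demand shock; all parameter values are illustrative assumptions, not estimates.
\begin{verbatim}
import random

b0, c0 = 10.0, 1.0   # intercepts of marginal benefit / marginal cost
SIGMA = 1.0          # std. dev. of the demand (benefit) shock
random.seed(0)

def expected_losses(b1, c1, trials=100000):
    # b1 = -private_benefit''(q), c1 = social_cost''(q), both positive.
    q_bar = (b0 - c0) / (b1 + c1)   # ex-ante optimal quantity cap
    p_bar = c0 + c1 * q_bar         # ex-ante optimal fixed price
    loss_q = loss_p = 0.0
    for _ in range(trials):
        theta = random.gauss(0, SIGMA)
        q_star = (b0 + theta - c0) / (b1 + c1)  # ex-post optimum
        # Quantity policy: consumption is pinned at q_bar.
        loss_q += 0.5 * (b1 + c1) * (q_bar - q_star) ** 2
        # Price policy: senders consume until marginal benefit = p_bar.
        q_price = (b0 + theta - p_bar) / b1
        loss_p += 0.5 * (b1 + c1) * (q_price - q_star) ** 2
    return loss_q / trials, loss_p / trials

# Benefit curvature exceeds cost curvature (ratio > 1): prices win.
print(expected_losses(b1=2.0, c1=0.5))
# Cost curvature exceeds benefit curvature (ratio < 1): quantities win.
print(expected_losses(b1=0.5, c1=2.0))
\end{verbatim}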
The argument above applies only if costs and benefits are independently distributed. If changes in the cost and benefit curves are correlated, then an additional term must be added into the choice rule, increasing the relative attractiveness of limiting quantity. To see this intuitively, consider the extreme case where uncertainty in cost and benefit is perfectly correlated; in such a scenario, if original estimates of cost and benefit prove incorrect, both curves will move up or down in lockstep, and so the new equilibrium will be directly above or below the original estimated one; hence, a quantity-targeting policy would be perfectly correct and a price-targeting policy would be pessimal. This analysis covers only two possible policies, but a much greater space of options is available. One can think of policy space as the space of possible supply curves for a given resource, where a pricing policy represents a horizontal supply curve and a cap-and-trade scheme represents a vertical supply curve. Various forms of diagonal supply curves are also possible, and in most cases, some supply curve that is neither strictly horizontal nor strictly vertical is optimal.
Should blockchains have a block size limit, or should they not have a limit but instead charge a fixed fee per resource unit consumed, or would some intermediate policy, one which charges a fee as a function $F(w)$ of the weight included in a block and where $F'(w)$ is increasing and possibly reaches an asymptote, be optimal? To estimate optimal policy under the prices vs. quantities framework, we start off by attempting to estimate the social cost function.
@@ -211,7 +211,7 @@ The nonlinearity in this figure is in part an artefact of the data; during the p
\label{fig:three}
\end{center}
However, there are superlinear costs at play as well. Clients need to process both the main chain and the blocks that do not become part of the canonical chain; hence, if a level of canonical-chain throughput $x$ causes an uncle rate $p$, then the actual level of computational burden is $\frac{x}{1-p}$, with a denominator that keeps decreasing toward zero as the canonical-chain throughput increases. Additionally, with high uncle rates, selfish mining attacks become much easier \cite{optimalsm}, and the reduction in node count itself leads to pooling, which makes selfish mining more likely. There is thus a qualitative sense in which the social cost of increasing $W$ to the point where $t = T$ is more than ten times that of setting $W$ so that $t = T * 0.1$.
Even if the cost function is superlinear at the extremes, however, it appears to be linear at the lower side of the distribution, and the arguments from the Cornell study suggest it may even be sublinear. If the block size increases from 10kb to 1000kb, a significant social cost is incurred because IoT devices, smartphones, Raspberry Pis, etc. have a much harder time staying connected, but an increase from 1000kb to 1990kb does not have such a high cost, because the range of use cases that become unusable within that interval is much smaller. Hence, it seems plausible that the social cost curve is U-shaped:
@@ -277,7 +277,7 @@ However, Bitcoin has recently entered the ``full blocks'' regime, where transact
\footnote{Source: http://etherscan.io/charts; spreadsheet with data and calculations at http://vitalik.ca/files/FeesAndETH.ods}
\end{center}
In the absence of political pressure on miners to make further gas limit increases, we see no reason for this state of affairs not to continue; and if political pressure \emph{can} be used to increase gas limits when needed, then the same processes could be used to adjust a fixed fee.
\section{Transaction Fees and Auction Theory}
@@ -295,9 +295,9 @@ However, kth price auctions have a different kind of serious flaw: they are not
\includegraphics[width=3.5in]{kth_price_revenue.png} \\
\end{center}
A more serious issue is \emph{collusion} between the proposer and some transaction senders. A proposer can potentially collude with low-fee transaction senders (eg. suppose there is a single entity, like an exchange or mining pool, that sends such transactions and can be easily negotiated with) that are sending transactions with some fee $f_{low}$. The proposer can ask them to instead send their transactions with fee $f_{high}$, and refund them $f_{high} - \frac{f_{low}}{2}$. The proposer's revenue is now even higher: the proposer benefits from the increased height of the ``rectangle'' of fee revenue that they would get with the ``dummy transaction'' strategy above, but they would also get a portion of the revenue from transactions that they would otherwise have sacrificed. \footnote{For a more detailed treatment of similar issues, see \cite{li2018} and \cite{rothkopf2007}.}
Hence, both first-price and second-price auctions are unsatisfactory. However, note that these issues are exclusively properties of auctions, and not properties of a fixed-price sale. If being included in the blockchain simply requires paying some $minFee$, then transaction senders have a simple strategy that they can use to set the fee on their transaction. Let $v$ be a user's private valuation for a transaction getting included in the next block. The user would check if $v > minFee$; if it is, they would bid $minFee + \epsilon$ (to provide a slight incentive for the block producer to include the transaction); if $v < minFee$ they would not send the transaction. This is a very simple strategy that does not require knowledge of others' valuations and is optimal for the transaction sender.
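The strategy is simple enough to state in a few lines of Python; the sketch below takes $minFee$ and the private valuation $v$ from the text, while the function name and the epsilon value are purely illustrative.
\begin{verbatim}
EPSILON = 1e-9  # slight premium so the producer strictly prefers inclusion

def choose_bid(v, min_fee):
    # Return the fee to attach, or None to abstain entirely.
    if v > min_fee:
        return min_fee + EPSILON  # pay the posted price
    return None                   # valuation too low: do not send

print(choose_bid(v=3.0, min_fee=1.0))  # 1.000000001
print(choose_bid(v=0.5, min_fee=1.0))  # None
\end{verbatim}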
\section{Improving the Second Best}
@@ -309,7 +309,7 @@ Suppose that we start with an existing policy which sets a weight limit $w_{max}
\includegraphics[width=2.5in]{Triangle1.png} \\
\end{center}
The area of the triangle, representing the total economic losses from an excessive (or if $r$ is negative, insufficient) number of transactions being included, is $\frac{1}{2} * r^2 * (C'' + D'')$; for simplicity we'll call this value $A$. Now, we will incorporate into the model the fact that demand is naturally volatile. We will approximate this with a simple model where $D'(1 + x)$ half the time equals $1 + D'' * x + \delta$ and the other half of the time equals $1 + D'' * x - \delta$. In the $-\delta$ period, the height of the triangle increases from $r * (C'' + D'')$ to $r * (C'' + D'') + \delta$, or a ratio of $1 + \frac{\delta}{r * (C'' + D'')}$.
\begin{center}
\includegraphics[width=2.5in]{Triangle2.png} \\
@@ -336,7 +336,7 @@ We now propose an alternate resource pricing/limit rule that we believe provides
\item In any particular block, let $w_{prev}$ be the amount of weight consumed in the previous block, and $minFee_{prev}$ be the previous block's $minFee$ value. Set $minFee$ for this block to equal $minFee_{prev} * (1 + (\frac{w_{prev}}{w_{newmax}} - \frac{1}{2}) * adjSpeed)$ (see the sketch after this list).
\end{itemize}
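A minimal Python sketch of this update rule follows; the values of $adjSpeed$ and $w_{newmax}$ are illustrative placeholders, not proposed parameters.
\begin{verbatim}
ADJ_SPEED = 0.125     # illustrative adjustment speed
W_NEWMAX = 8000000    # illustrative weight denominator

def next_min_fee(min_fee_prev, w_prev):
    # minFee = minFee_prev * (1 + (w_prev / w_newmax - 1/2) * adjSpeed)
    return min_fee_prev * (1 + (w_prev / W_NEWMAX - 0.5) * ADJ_SPEED)

fee = 1.0
for w in (8000000, 8000000, 2000000):  # two full blocks, then a quarter-full one
    fee = next_min_fee(fee, w)
    print(round(fee, 6))
# Full blocks (w_prev = w_newmax) push minFee up by adjSpeed/2 per block;
# blocks using less than half of w_newmax push it back down.
\end{verbatim}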
This rule is likely to outperform simple limits in terms of allocative efficiency for the reasons cited above, and it also (except during sudden and extreme spikes) eliminates the issues with first- and second-price auctions described above. \footnote{In the specific case of storage pricing, a quirk in Ethereum gas pricing rules that allows storage to be (mostly) paid for before it is actually used allows for second-layer markets like GasToken\cite{gastoken} where gas can be burned to generate ``congealed storage use privileges'', which can then be used later. The possibility of doing this unintentionally creates efficiency gains similar in type to, though smaller in size than, those described here.}
As a philosophical note, complex gas and fee policies are often criticized as being a form of economic ``central planning'', which is frowned upon because planners may not have aligned incentives and have less information than participants closer to the day-to-day economic activity. That said, note that \emph{any} transaction pricing policy, whether fee-based or limit-based, necessarily has centrally planned parameters. We would argue that the correct way to apply the Hayekian anti-central-planning intuition is to treat it as saying that central plans are less bad if those plans have \emph{lower Kolmogorov complexity}, a simple strict weight limit being an ideal example.
@@ -392,7 +392,7 @@ However, both the Bitcoin and Ethereum approaches have four large problems that
The first problem can possibly be solved by simply making storage more expensive. However, making storage more expensive and doing nothing else would make it prohibitively expensive to use storage for very short periods of time. One could offer a time-based refund, refunding more if a storage slot is cleared earlier rather than later; the only arbitrage-free scheme for this is to define some decreasing nonnegative function $F(t)$ (eg. $F(t) = h * e^{-kt}$) of the current time, charge $F(t)$ for filling a storage slot at time $t$, and refund $F(t)$ for clearing a storage slot at time $t$.\footnote{If different storage slots can have different $F(t)$ functions, then at any point where $F_1'(t) > F_2'(t)$, there is an arbitrage opportunity where if the holder of $F_1$ (the slower-falling function) no longer needs their storage slot, they can instead assign permission to use it to the holder of the other storage slot, and the holder of the other storage slot can clear it immediately.} However, this approach is very capital-inefficient, requiring large deposits to use storage, and it additionally rests on the assumption that the social cost of storage will continue forever to decrease quickly enough that the integral is convergent.
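To illustrate, here is a small sketch of the charge/refund accounting using the example curve $F(t) = h * e^{-kt}$; the values of $h$ and $k$ are arbitrary assumptions.
\begin{verbatim}
import math

H, K = 100.0, 0.05  # illustrative parameters of the fee curve

def F(t):
    # Decreasing, nonnegative fee curve from the text.
    return H * math.exp(-K * t)

def net_storage_cost(t_fill, t_clear):
    # Pay F(t_fill) to fill a slot, receive F(t_clear) back on clearing.
    return F(t_fill) - F(t_clear)

print(net_storage_cost(0, 1))    # ~4.88: short-lived storage is cheap
print(net_storage_cost(0, 1e9))  # ~100.0: storage held forever costs F(t_fill)
\end{verbatim}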
A solution that does not have these problems is to implement a time-based storage maintenance fee (sometimes also called ``rent''). The simplest version is this: every account object is charged X coins per block per byte that it consumes in the state. If an account has fewer coins than the amount needed to pay for $pokeThreshold$ blocks (say, $pokeThreshold = 500$), then anyone can ``poke'' the account, delete it from storage, and claim $k * pokeThreshold$ blocks' worth of rent as a bounty, where $k \in (0,1]$. Implementing the above naively is impractical, as going through every account and decrementing its balance in every block would carry immense overhead. However, this can be computed quite practically through lazy evaluation:
\begin{itemize}
\item All accounts store an additional data field, $LastBlockAccessed$
@@ -403,7 +403,7 @@ A solution that does not have these problems is to implement a time-based storag
Suppose that we want the maintenance fee to be able to vary over time. Then, for all block heights $h$ we save in storage $totalFee[h] = \sum_{i=1}^h Fee[i] = totalFee[h-1] + Fee[h]$. We compute the current balance as $$balance - sizeOf(account) * (totalFee[curBlock] - totalFee[LastBlockAccessed])$$, where $totalFee[curBlock] - totalFee[LastBlockAccessed]$ can be understood as $\sum_{i=LastBlockAccessed}^{curBlock} Fee[i]$.
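The following Python sketch puts the lazy-evaluation scheme together, including the prefix-sum trick for a time-varying fee; the data structures and names are illustrative, not a description of any real client.
\begin{verbatim}
total_fee = [0.0]  # total_fee[h] = sum of per-byte fees for blocks 1..h

def add_block(fee_this_block):
    # Maintain totalFee[h] = totalFee[h-1] + Fee[h] as blocks arrive.
    total_fee.append(total_fee[-1] + fee_this_block)

class Account:
    def __init__(self, balance, size, cur_block):
        self.balance = balance
        self.size = size                      # bytes consumed in state
        self.last_block_accessed = cur_block  # LastBlockAccessed field

    def effective_balance(self, cur_block):
        # Rent accrued since the last touch, computed in O(1):
        # sizeOf(account) * (totalFee[curBlock] - totalFee[LastBlockAccessed])
        owed = self.size * (total_fee[cur_block]
                            - total_fee[self.last_block_accessed])
        return self.balance - owed

    def touch(self, cur_block):
        # Settle accrued rent whenever the account is accessed.
        self.balance = self.effective_balance(cur_block)
        self.last_block_accessed = cur_block

for _ in range(100):
    add_block(0.001)  # 0.001 coins per byte per block; varying fees also work
acct = Account(balance=10.0, size=50, cur_block=0)
print(acct.effective_balance(100))  # 10 - 50 * 0.1 = 5.0
\end{verbatim}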
However, we will argue in favor of simply setting the maintenance fee to one specific value (eg. $10^{-7}$ ETH per byte per year) and leaving it this way forever. First of all, the social cost of storage use is clearly almost perfectly linear in the short and medium run, and unlike that of other resources it remains close to linear even in the long run. There is no analog to the natural asymptote of bandwidth and computation costs in blockchains, where at some point the uncle rate reaches 100\%; even if the storage of the Ethereum blockchain starts increasing by 10 GB per day, then blockchain nodes will quickly be relegated to only running in data centers, but the blockchain will still fundamentally be functional. In fact, if you assume that node storage capacity follows the same distribution that the Cornell study\cite{cornell} shows bandwidth does, so $NodeCount(W) = \frac{1}{W}$, and assume a logarithmic utility function for node count, so $D(k) = log(k)$ and hence $D(W) = log(\frac{1}{W}) = -log(W)$, then the social cost component from node centralization is roughly $C(W) = log(W)$, or $C'(W) = \frac{1}{W}$: very steeply \emph{sublinear}.
Second, the developer and user experience improves considerably if developers and users can determine with exactness a minimum ``time to live'' for any given contract far in advance. Variable fees do not have this property; a fixed fee does. Third, as cryptocurrency prices are more stable than transaction fees, a fixed fee improves price predictability, in both cryptocurrency- and fiat-denominated terms. Fourth, a fixed fee is simple, both intuitively and in the semi-formal sense of having low Kolmogorov complexity.
@@ -431,11 +431,11 @@ If the contract contains more funds at the time of the older hibernation that it
\item A proof of non-prior-waking consists of a Merkle branch pointing to the contract's address once every $MinInterval$
\end{itemize}
Adjusting $MinInterval$ is a tradeoff: smaller values enable launching contracts cheaply for shorter periods of time, but larger values shrink the size of the witness required for waking, as well as shrinking the number of ever-growing historical state roots that need to be stored. For a $MinInterval$ of one week, and a state with $2^{30}$ accounts, waking a ten-year-old contract would require $32 * log_2(2^{30}) * \frac{10 * 365.242}{7} \approx$ 500,000 bytes; a $MinInterval$ of one month reduces this to 115,200 bytes.
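The arithmetic can be checked directly (32 bytes per hash in a Merkle branch of depth $log_2(2^{30}) = 30$):
\begin{verbatim}
BRANCH_BYTES = 32 * 30  # one Merkle branch in a 2^30-account state

weekly = BRANCH_BYTES * (10 * 365.242 / 7)  # one branch per week, ten years
monthly = BRANCH_BYTES * (10 * 12)          # one branch per month, ten years
print(round(weekly))   # 500903, i.e. roughly 500,000 bytes
print(monthly)         # 115200 bytes
\end{verbatim}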
\section{Conclusion}
Economic analysis can be used to significantly improve the incentive alignment of resource usage inside public blockchains. Simplistic models of one-dimensional weight limits often lead to prices that are highly mismatched relative to social costs, and slightly more complex techniques involving a combination of ``Pigovian taxes'' and cap-and-trade mechanics such as weight limits can improve substantially on the status quo. Storage in particular is a very different resource market from other types of resource markets, and it should be treated separately.
More economic analysis and econometric research can help identify further mechanisms that better reduce costs while discouraging wasteful use of public blockchain resources.