Move test-generators to specs repo
Co-authored-by: Chih Cheng Liang <chihchengliang@gmail.com>
Co-authored-by: Danny Ryan <dannyjryan@gmail.com>
Co-authored-by: Dmitrii Shmatko <leodex23@gmail.com>
Co-authored-by: Jannik Luhn <jannik@brainbot.com>
Co-authored-by: Paul Hauner <paul@paulhauner.com>
Co-authored-by: protolambda <proto@protolambda.com>
This commit is contained in:
parent 53e528e56c
commit 64f012b276

151 test_generators/README.md Normal file
@@ -0,0 +1,151 @@
# Eth2.0 Test Generators

This directory contains all the generators for YAML tests, consumed by Eth 2.0 client implementations.

Any issues with the generators and/or generated tests should be filed
in the repository that hosts the generator outputs, here: [ethereum/eth2.0-tests](https://github.com/ethereum/eth2.0-tests/).

Whenever a release is made, the new tests are automatically built, and
[eth2TestGenBot](https://github.com/eth2TestGenBot) commits the changes to the test repository.

## How to run generators

Prerequisites:

- Python 3 installed
- PIP 3
- GNU make

### Cleaning

This removes the existing virtual environments (`/test_generators/.venvs/`), and generated tests (`/yaml_tests/`).

```bash
make clean
```

### Running all test generators

This runs all the generators.

```bash
make all
```

### Running a single generator

The Makefile auto-detects generators in the `test_generators/` directory
and provides a tests-gen target for each generator; see the example below.

```bash
make ./tests/shuffling/
```

## Developing a generator

Open the generator of your choice (one at a time) in your favorite IDE/editor, and run:

```bash
# Create a virtual environment (any venv/.venv/.venvs directory is git-ignored)
python3 -m venv .venv
# Activate the venv; this is where dependencies are installed for the generator
. .venv/bin/activate
```

Now that you have a virtual environment, write your generator.
It's recommended to extend the base-generator.

Create a `requirements.txt` in the root of your generator directory:

```
eth-utils==1.4.1
../test_libs/gen_helpers
```

Install all the necessary requirements (re-run this when you add more):

```bash
pip3 install -r requirements.txt
```

Now write your initial test generator, extending the base generator.
Write a `main.py` file; here's an example:

```python
from gen_base import gen_runner, gen_suite, gen_typing

from eth_utils import (
    to_dict, to_tuple
)


@to_dict
def bar_test_case(v: int):
    yield "bar_v", v
    yield "bar_v_plus_1", v + 1
    yield "bar_list", list(range(v))


@to_tuple
def generate_bar_test_cases():
    for i in range(10):
        yield bar_test_case(i)


def bar_test_suite() -> gen_typing.TestSuite:
    return gen_suite.render_suite(
        title="bar_minimal",
        summary="Minimal example suite, testing bar.",
        fork="v0.5.1",
        config="minimal",
        test_cases=generate_bar_test_cases())


if __name__ == "__main__":
    gen_runner.run_generator("foo", [bar_test_suite])
```

Recommendations:
- You can have more than one suite in a generator, e.g. `gen_runner.run_generator("foo", [bar_test_suite, abc_test_suite, example_test_suite])`.
- You can concatenate lists of test cases if you don't want to split them up into suites.
- You can split your suite generators into different Python files/packages; this is good for code organization.
- Use the "minimal" config for performance, but also implement a suite with the default config where necessary.
- The test generator accepts `--output` and `--force` (overwrite output); see the invocation sketch below.
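
For example, after activating the generator's venv, it could be invoked by hand along these lines (the output path is an illustrative example; the Makefile normally handles this, and the flags are the `-o`/`--output` and `--force` options mentioned above):

```bash
# Sketch: run a single generator directly, writing its suites to an example directory.
python3 main.py -o ../../yaml_tests/foo/ --force
```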

## How to add a new test generator

To add a new test generator that builds `New Tests`:

1. Create a new directory `new_tests` within the `test_generators` directory.
   Note that `new_tests` is also the name of the directory in which the tests will appear in the tests repository later.
2. Your generator is assumed to have a `requirements.txt` file
   with any dependencies it may need. Leave it empty if your generator has none.
3. Your generator is assumed to have a `main.py` file in its root.
   By adding the base generator to your requirements, you can make a generator very easily. See the docs above and the layout sketch below.
4. Your generator is called with `-o some/file/path/for_testing/can/be_anything`.
   The base generator helps you handle this; you only have to define suite headers
   and a list of tests for each suite you generate.
5. Finally, add any linting or testing commands to the
   [circleci config file](https://github.com/ethereum/eth2.0-test-generators/blob/master/.circleci/config.yml)
   if desired, to increase code quality.
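
Putting steps 1-3 together, a new generator directory looks roughly like this (`new_tests` is just the example name from step 1):

```
test_generators/
    new_tests/
        requirements.txt    # dependencies; may be empty
        main.py             # entry point, called with -o <output path>
```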

Note: you do not have to change the Makefile.
However, if necessary (e.g. not using Python, or mixing in other languages), submit an issue, and it can be handled as a special case.
Do note that generators should be easy to maintain, lean, and based on the spec.

All of this should be done in a pull request to the master branch.

To deploy new tests to the testing repository:

1. Create a release tag with a new version number on GitHub.
2. Increment either the:
   - major version, to indicate a change in the general testing format
   - minor version, if a new test generator has been added
   - patch version, in other cases.

## How to remove a test generator

If a test generator is not needed anymore, undo the steps described above and make a new release:

1. Remove the generator folder.
2. Remove the generated tests in the `eth2.0-tests` repository by opening a PR there.
3. Make a new release.

20 test_generators/bls/README.md Normal file
@@ -0,0 +1,20 @@
# BLS Test Generator

Explanation of the BLS12-381 type hierarchy:
the base unit is bytes48, of which only 381 bits are used.

- FQ: uint381 modulo field modulus
- FQ2: (FQ, FQ)
- G2: (FQ2, FQ2, FQ2)

## Resources

- [Eth2.0 spec](https://github.com/ethereum/eth2.0-specs/blob/master/specs/bls_signature.md)
- [Finite Field Arithmetic](http://www.springeronline.com/sgw/cda/pageitems/document/cda_downloaddocument/0,11996,0-0-45-110359-0,00.pdf)
- Chapter 2 of [Elliptic Curve Cryptography](http://cacr.uwaterloo.ca/ecc/) by Darrel Hankerson, Alfred Menezes, and Scott Vanstone
- [Zcash BLS parameters](https://github.com/zkcrypto/pairing/tree/master/src/bls12_381)
- [Trinity implementation](https://github.com/ethereum/trinity/blob/master/eth2/_utils/bls.py)

## Comments

Compared to Zcash, the Ethereum spec always requires the compressed form (c_flag / most significant bit always set).
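
A minimal sketch of that compressed-form convention, using the same `py_ecc` helpers as `main.py` in this directory (the message and domain are arbitrary example values, and the bit position assumes the c_flag sits in the most significant bit of the 48-byte `z1` word):

```python
# Sketch: hash an example 32-byte message to G2, compress it, and check that
# the c_flag (assumed to be the most significant bit of the 384-bit z1 word) is set.
from py_ecc import bls

point = bls.utils.hash_to_G2(b'\xab' * 32, 0)
z1, z2 = bls.utils.compress_G2(point)
assert z1 & (1 << 383), "c_flag must be set in the compressed form"
```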

192 test_generators/bls/main.py Normal file
@@ -0,0 +1,192 @@
"""
BLS test vectors generator
Usage:
    "python main.py -o path/to/output_dir/"
"""

# Standard library
import os
import sys
from typing import Tuple

# Third-party
import yaml

# Ethereum
from eth_utils import int_to_big_endian, big_endian_to_int

# Local imports
from py_ecc import bls


def int_to_hex(n: int) -> str:
    return '0x' + int_to_big_endian(n).hex()


def hex_to_int(x: str) -> int:
    return int(x, 16)


# Note: even though a domain is only a uint64, it is serialized as a hex string
# as well, to avoid issues with YAML parsers that are limited to 53-bit integers
# (a JavaScript language limit).
DOMAINS = [
    0,
    1,
    1234,
    2**32-1,
    2**64-1
]

MESSAGES = [
    b'\x00' * 32,
    b'\x56' * 32,
    b'\xab' * 32,
]

PRIVKEYS = [
    # The curve order is a 255-bit number, so private keys are 32 bytes at most.
    # Also, not every integer is a valid private key, so we use pre-generated keys.
    hex_to_int('0x00000000000000000000000000000000263dbd792f5b1be47ed85f8938c0f29586af0d3ac7b977f21c278fe1462040e3'),
    hex_to_int('0x0000000000000000000000000000000047b8192d77bf871b62e87859d653922725724a5c031afeabc60bcef5ff665138'),
    hex_to_int('0x00000000000000000000000000000000328388aff0d4a5b7dc9205abd374e7e98f3cd9f3418edb4eafda5fb16473d216'),
]


def hash_message(msg: bytes,
                 domain: int) -> Tuple[Tuple[str, str], Tuple[str, str], Tuple[str, str]]:
    """
    Hash message
    Input:
    - Message as bytes
    - domain as uint64
    Output:
    - Message hash as a G2 point
    """
    return [
        [
            int_to_hex(fq2.coeffs[0]),
            int_to_hex(fq2.coeffs[1]),
        ]
        for fq2 in bls.utils.hash_to_G2(msg, domain)
    ]


def hash_message_compressed(msg: bytes, domain: int) -> Tuple[str, str]:
    """
    Hash message
    Input:
    - Message as bytes
    - domain as uint64
    Output:
    - Message hash as a compressed G2 point
    """
    z1, z2 = bls.utils.compress_G2(bls.utils.hash_to_G2(msg, domain))
    return [int_to_hex(z1), int_to_hex(z2)]


if __name__ == '__main__':

    # Order not preserved - https://github.com/yaml/pyyaml/issues/110
    metadata = {
        'title': 'BLS signature and aggregation tests',
        'summary': 'Test vectors for BLS signature',
        'test_suite': 'bls',
        'fork': 'phase0-0.5.0',
    }

    case01_message_hash_G2_uncompressed = []
    for msg in MESSAGES:
        for domain in DOMAINS:
            case01_message_hash_G2_uncompressed.append({
                'input': {'message': '0x' + msg.hex(), 'domain': int_to_hex(domain)},
                'output': hash_message(msg, domain)
            })

    case02_message_hash_G2_compressed = []
    for msg in MESSAGES:
        for domain in DOMAINS:
            case02_message_hash_G2_compressed.append({
                'input': {'message': '0x' + msg.hex(), 'domain': int_to_hex(domain)},
                'output': hash_message_compressed(msg, domain)
            })

    # Used in later cases
    pubkeys = [bls.privtopub(privkey) for privkey in PRIVKEYS]
    # Used in public key aggregation
    pubkeys_serial = ['0x' + pubkey.hex() for pubkey in pubkeys]
    case03_private_to_public_key = [
        {
            'input': int_to_hex(privkey),
            'output': pubkey_serial,
        }
        for privkey, pubkey_serial in zip(PRIVKEYS, pubkeys_serial)
    ]

    case04_sign_messages = []
    sigs = []  # used in verify
    for privkey in PRIVKEYS:
        for message in MESSAGES:
            for domain in DOMAINS:
                sig = bls.sign(message, privkey, domain)
                case04_sign_messages.append({
                    'input': {
                        'privkey': int_to_hex(privkey),
                        'message': '0x' + message.hex(),
                        'domain': int_to_hex(domain)
                    },
                    'output': '0x' + sig.hex()
                })
                sigs.append(sig)

    # TODO: case05_verify_messages: Verify messages signed in case04
    # It takes too long, empty for now

    case06_aggregate_sigs = []
    for domain in DOMAINS:
        for message in MESSAGES:
            sigs = []
            for privkey in PRIVKEYS:
                sig = bls.sign(message, privkey, domain)
                sigs.append(sig)
            case06_aggregate_sigs.append({
                'input': ['0x' + sig.hex() for sig in sigs],
                'output': '0x' + bls.aggregate_signatures(sigs).hex(),
            })

    case07_aggregate_pubkeys = [
        {
            'input': pubkeys_serial,
            'output': '0x' + bls.aggregate_pubkeys(pubkeys).hex(),
        }
    ]

    # TODO
    # Aggregate verify

    # TODO
    # Proof-of-possession

    with open(os.path.join(sys.argv[2], "test_bls.yml"), 'w') as outfile:
        # Dump at top level
        yaml.dump(metadata, outfile, default_flow_style=False)
        # default_flow_style will unravel "ValidatorRecord" and "committee" lines,
        # exploding file size
        yaml.dump(
            {'case01_message_hash_G2_uncompressed': case01_message_hash_G2_uncompressed},
            outfile,
        )
        yaml.dump(
            {'case02_message_hash_G2_compressed': case02_message_hash_G2_compressed},
            outfile,
        )
        yaml.dump(
            {'case03_private_to_public_key': case03_private_to_public_key},
            outfile,
        )
        yaml.dump({'case04_sign_messages': case04_sign_messages}, outfile)

        # Too time consuming to generate
        # yaml.dump({'case05_verify_messages': case05_verify_messages}, outfile)
        yaml.dump({'case06_aggregate_sigs': case06_aggregate_sigs}, outfile)
        yaml.dump({'case07_aggregate_pubkeys': case07_aggregate_pubkeys}, outfile)

2 test_generators/bls/requirements.txt Normal file
@@ -0,0 +1,2 @@
py-ecc==1.6.0
PyYAML==4.2b1

16 test_generators/shuffling/README.md Normal file
@@ -0,0 +1,16 @@
# Shuffling Test Generator

```
2018 Status Research & Development GmbH
Copyright and related rights waived via [CC0](https://creativecommons.org/publicdomain/zero/1.0/).

This work uses public domain work under CC0 from the Ethereum Foundation
https://github.com/ethereum/eth2.0-specs
```

This generator produces test vectors for the shuffling algorithm described in the Ethereum
[specs](https://github.com/ethereum/eth2.0-specs/blob/2983e68f0305551083fac7fcf9330c1fc9da3411/specs/core/0_beacon-chain.md#get_new_shuffling).

It utilizes the 'swap or not' shuffle from [An Enciphering Scheme Based on a Card Shuffle](https://link.springer.com/content/pdf/10.1007%2F978-3-642-32009-5_1.pdf);
see the `Generalized domain` algorithm on page 3.
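
For illustration, the core permutation helper defined in `core_helpers.py` below can be exercised on a small index range like this (the seed and list size are arbitrary example values):

```python
# Usage sketch for get_permuted_index from core_helpers.py (example values only).
from core_helpers import get_permuted_index

seed = b'\x12' * 32  # example 32-byte seed
list_size = 10
permutation = [get_permuted_index(i, list_size, seed) for i in range(list_size)]
# `permutation` is a pseudorandom reordering of the indices 0..9 derived from the seed.
```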

6 test_generators/shuffling/constants.py Normal file
@@ -0,0 +1,6 @@
SLOTS_PER_EPOCH = 2**6  # 64 slots, 6.4 minutes
FAR_FUTURE_EPOCH = 2**64 - 1  # uint64 max
SHARD_COUNT = 2**10  # 1024
TARGET_COMMITTEE_SIZE = 2**7  # 128 validators
ACTIVATION_EXIT_DELAY = 2**2  # 4 epochs
SHUFFLE_ROUND_COUNT = 90

95 test_generators/shuffling/core_helpers.py Normal file
@@ -0,0 +1,95 @@
from typing import Any, List, NewType

from constants import SLOTS_PER_EPOCH, SHARD_COUNT, TARGET_COMMITTEE_SIZE, SHUFFLE_ROUND_COUNT
from utils import hash
from yaml_objects import Validator

Epoch = NewType("Epoch", int)
ValidatorIndex = NewType("ValidatorIndex", int)
Bytes32 = NewType("Bytes32", bytes)


def int_to_bytes1(x):
    return x.to_bytes(1, 'little')


def int_to_bytes4(x):
    return x.to_bytes(4, 'little')


def bytes_to_int(data: bytes) -> int:
    return int.from_bytes(data, 'little')


def is_active_validator(validator: Validator, epoch: Epoch) -> bool:
    """
    Check if ``validator`` is active.
    """
    return validator.activation_epoch <= epoch < validator.exit_epoch


def get_active_validator_indices(validators: List[Validator], epoch: Epoch) -> List[ValidatorIndex]:
    """
    Get indices of active validators from ``validators``.
    """
    return [i for i, v in enumerate(validators) if is_active_validator(v, epoch)]


def split(values: List[Any], split_count: int) -> List[List[Any]]:
    """
    Splits ``values`` into ``split_count`` pieces.
    """
    list_length = len(values)
    return [
        values[(list_length * i // split_count): (list_length * (i + 1) // split_count)]
        for i in range(split_count)
    ]


def get_epoch_committee_count(active_validator_count: int) -> int:
    """
    Return the number of committees in one epoch.
    """
    return max(
        1,
        min(
            SHARD_COUNT // SLOTS_PER_EPOCH,
            active_validator_count // SLOTS_PER_EPOCH // TARGET_COMMITTEE_SIZE,
        )
    ) * SLOTS_PER_EPOCH


def get_permuted_index(index: int, list_size: int, seed: Bytes32) -> int:
    """
    Return `p(index)` in a pseudorandom permutation `p` of `0...list_size-1` with ``seed`` as entropy.

    Utilizes 'swap or not' shuffling found in
    https://link.springer.com/content/pdf/10.1007%2F978-3-642-32009-5_1.pdf
    See the 'generalized domain' algorithm on page 3.
    """
    for round in range(SHUFFLE_ROUND_COUNT):
        pivot = bytes_to_int(hash(seed + int_to_bytes1(round))[0:8]) % list_size
        flip = (pivot - index) % list_size
        position = max(index, flip)
        source = hash(seed + int_to_bytes1(round) + int_to_bytes4(position // 256))
        byte = source[(position % 256) // 8]
        bit = (byte >> (position % 8)) % 2
        index = flip if bit else index

    return index


def get_shuffling(seed: Bytes32,
                  validators: List[Validator],
                  epoch: Epoch) -> List[List[ValidatorIndex]]:
    """
    Shuffle active validators and split into crosslink committees.
    Return a list of committees (each a list of validator indices).
    """
    # Shuffle active validator indices
    active_validator_indices = get_active_validator_indices(validators, epoch)
    length = len(active_validator_indices)
    shuffled_indices = [active_validator_indices[get_permuted_index(i, length, seed)] for i in range(length)]

    # Split the shuffled active validator indices
    return split(shuffled_indices, get_epoch_committee_count(length))

160 test_generators/shuffling/main.py Normal file
@@ -0,0 +1,160 @@
import random
import sys
import os

import yaml

from constants import ACTIVATION_EXIT_DELAY, FAR_FUTURE_EPOCH
from core_helpers import get_shuffling
from yaml_objects import Validator


def noop(self, *args, **kw):
    # Prevent !!str or !!binary tags
    pass


yaml.emitter.Emitter.process_tag = noop


EPOCH = 1000  # The epoch, also a mean for the normal distribution

# Standard deviation; around 8% of validators will activate or exit within
# ACTIVATION_EXIT_DELAY (inclusive) of EPOCH, thus creating an edge case for
# validator shuffling
RAND_EPOCH_STD = 35
MAX_EXIT_EPOCH = 5000  # Maximum exit_epoch for easier reading


def active_exited_validators_generator():
    """
    Random cases with a variety of validator activity statuses
    """
    # Order not preserved - https://github.com/yaml/pyyaml/issues/110
    metadata = {
        'title': 'Shuffling Algorithm Tests 1',
        'summary': 'Test vectors for validator shuffling with different validator\'s activity status.'
                   ' Note: only relevant validator fields are defined.',
        'test_suite': 'shuffle',
        'fork': 'phase0-0.5.0',
    }

    # Config
    random.seed(int("0xEF00BEAC", 16))
    num_cases = 10

    test_cases = []

    for case in range(num_cases):
        seedhash = bytes(random.randint(0, 255) for byte in range(32))
        idx_max = random.randint(128, 512)

        validators = []
        for idx in range(idx_max):
            v = Validator(original_index=idx)
            # 4/5 of all validators are active
            if random.random() < 0.8:
                # Choose a normally distributed epoch number
                rand_epoch = round(random.gauss(EPOCH, RAND_EPOCH_STD))

                # for 1/2 of *active* validators rand_epoch is the activation epoch
                if random.random() < 0.5:
                    v.activation_epoch = rand_epoch

                    # 1/4 of active validators will exit in the foreseeable future
                    if random.random() < 0.5:
                        v.exit_epoch = random.randint(
                            rand_epoch + ACTIVATION_EXIT_DELAY + 1, MAX_EXIT_EPOCH)
                    # 1/4 of active validators in theory remain in the set indefinitely
                    else:
                        v.exit_epoch = FAR_FUTURE_EPOCH
                # for the other active 1/2, rand_epoch is the exit epoch
                else:
                    v.activation_epoch = random.randint(
                        0, rand_epoch - ACTIVATION_EXIT_DELAY)
                    v.exit_epoch = rand_epoch

            # The remaining 1/5 of all validators is not activated
            else:
                v.activation_epoch = FAR_FUTURE_EPOCH
                v.exit_epoch = FAR_FUTURE_EPOCH

            validators.append(v)

        input_ = {
            'validators': validators,
            'epoch': EPOCH
        }
        output = get_shuffling(
            seedhash, validators, input_['epoch'])

        test_cases.append({
            'seed': '0x' + seedhash.hex(), 'input': input_, 'output': output
        })

    return {
        'metadata': metadata,
        'filename': 'test_vector_shuffling.yml',
        'test_cases': test_cases
    }


def validators_set_size_variety_generator():
    """
    Different validator set size cases, inspired by removed manual `permutated_index` tests
    https://github.com/ethereum/eth2.0-test-generators/tree/bcd9ab2933d9f696901d1dfda0828061e9d3093f/permutated_index
    """
    # Order not preserved - https://github.com/yaml/pyyaml/issues/110
    metadata = {
        'title': 'Shuffling Algorithm Tests 2',
        'summary': 'Test vectors for validator shuffling with different validator\'s set size.'
                   ' Note: only relevant validator fields are defined.',
        'test_suite': 'shuffle',
        'fork': 'tchaikovsky',
        'version': 1.0
    }

    # Config
    random.seed(int("0xEF00BEAC", 16))

    test_cases = []

    seedhash = bytes(random.randint(0, 255) for byte in range(32))
    idx_max = 4096
    set_sizes = [1, 2, 3, 1024, idx_max]

    for size in set_sizes:
        validators = []
        for idx in range(size):
            v = Validator(original_index=idx)
            v.activation_epoch = EPOCH
            v.exit_epoch = FAR_FUTURE_EPOCH
            validators.append(v)
        input_ = {
            'validators': validators,
            'epoch': EPOCH
        }
        output = get_shuffling(
            seedhash, validators, input_['epoch'])

        test_cases.append({
            'seed': '0x' + seedhash.hex(), 'input': input_, 'output': output
        })

    return {
        'metadata': metadata,
        'filename': 'shuffling_set_size.yml',
        'test_cases': test_cases
    }


if __name__ == '__main__':
    output_dir = sys.argv[2]
    for generator in [active_exited_validators_generator, validators_set_size_variety_generator]:
        result = generator()
        filename = os.path.join(output_dir, result['filename'])
        with open(filename, 'w') as outfile:
            # Dump at top level
            yaml.dump(result['metadata'], outfile, default_flow_style=False)
            # default_flow_style will unravel "ValidatorRecord" and "committee" lines, exploding file size
            yaml.dump({'test_cases': result['test_cases']}, outfile)

4 test_generators/shuffling/requirements.txt Normal file
@@ -0,0 +1,4 @@
eth-hash[pycryptodome]==0.2.0
eth-typing==2.0.0
eth-utils==1.4.1
PyYAML==4.2b1

6 test_generators/shuffling/utils.py Normal file
@@ -0,0 +1,6 @@
from eth_typing import Hash32
from eth_utils import keccak


def hash(x: bytes) -> Hash32:
    return keccak(x)

25 test_generators/shuffling/yaml_objects.py Normal file
@@ -0,0 +1,25 @@
from typing import Any

import yaml


class Validator(yaml.YAMLObject):
    """
    A validator stub containing only the fields relevant for get_shuffling()
    """
    fields = {
        'activation_epoch': 'uint64',
        'exit_epoch': 'uint64',
        # Extra index field to ease testing/debugging
        'original_index': 'uint64',
    }

    def __init__(self, **kwargs):
        for k in self.fields.keys():
            setattr(self, k, kwargs.get(k))

    def __setattr__(self, name: str, value: Any) -> None:
        super().__setattr__(name, value)

    def __getattribute__(self, name: str) -> Any:
        return super().__getattribute__(name)

0 test_generators/ssz/__init__.py Normal file

84 test_generators/ssz/main.py Normal file
@@ -0,0 +1,84 @@
import argparse
import pathlib
import sys

from ruamel.yaml import (
    YAML,
)

from uint_test_generators import (
    generate_uint_bounds_test,
    generate_uint_random_test,
    generate_uint_wrong_length_test,
)

test_generators = [
    generate_uint_random_test,
    generate_uint_wrong_length_test,
    generate_uint_bounds_test,
]


def make_filename_for_test(test):
    title = test["title"]
    filename = title.lower().replace(" ", "_") + ".yaml"
    return pathlib.Path(filename)


def validate_output_dir(path_str):
    path = pathlib.Path(path_str)

    if not path.exists():
        raise argparse.ArgumentTypeError("Output directory must exist")

    if not path.is_dir():
        raise argparse.ArgumentTypeError("Output path must lead to a directory")

    return path


parser = argparse.ArgumentParser(
    prog="gen-ssz-tests",
    description="Generate YAML test files for SSZ and tree hashing",
)
parser.add_argument(
    "-o",
    "--output-dir",
    dest="output_dir",
    required=True,
    type=validate_output_dir,
    help="directory into which the generated YAML files will be dumped"
)
parser.add_argument(
    "-f",
    "--force",
    action="store_true",
    default=False,
    help="if set overwrite test files if they exist",
)


if __name__ == "__main__":
    args = parser.parse_args()
    output_dir = args.output_dir
    if not args.force:
        file_mode = "x"
    else:
        file_mode = "w"

    yaml = YAML(pure=True)

    print(f"generating {len(test_generators)} test files...")
    for test_generator in test_generators:
        test = test_generator()

        filename = make_filename_for_test(test)
        path = output_dir / filename

        try:
            with path.open(file_mode) as f:
                yaml.dump(test, f)
        except IOError as e:
            sys.exit(f'Error when dumping test "{test["title"]}" ({e})')

    print("done.")

102 test_generators/ssz/renderers.py Normal file
@@ -0,0 +1,102 @@
from collections.abc import (
    Mapping,
    Sequence,
)

from eth_utils import (
    encode_hex,
    to_dict,
)

from ssz.sedes import (
    BaseSedes,
    Boolean,
    Bytes,
    BytesN,
    Container,
    List,
    UInt,
)


def render_value(value):
    if isinstance(value, bool):
        return value
    elif isinstance(value, int):
        return str(value)
    elif isinstance(value, bytes):
        return encode_hex(value)
    elif isinstance(value, Sequence):
        return tuple(render_value(element) for element in value)
    elif isinstance(value, Mapping):
        return render_dict_value(value)
    else:
        raise ValueError(f"Cannot render value {value}")


@to_dict
def render_dict_value(value):
    for key, element in value.items():
        yield key, render_value(element)


def render_type_definition(sedes):
    if isinstance(sedes, Boolean):
        return "bool"

    elif isinstance(sedes, UInt):
        return f"uint{sedes.length * 8}"

    elif isinstance(sedes, BytesN):
        return f"bytes{sedes.length}"

    elif isinstance(sedes, Bytes):
        return "bytes"

    elif isinstance(sedes, List):
        return [render_type_definition(sedes.element_sedes)]

    elif isinstance(sedes, Container):
        return {
            field_name: render_type_definition(field_sedes)
            for field_name, field_sedes in sedes.fields
        }

    elif isinstance(sedes, BaseSedes):
        raise Exception("Unreachable: All sedes types have been checked")

    else:
        raise TypeError("Expected BaseSedes")


@to_dict
def render_test_case(*, sedes, valid, value=None, serial=None, description=None, tags=None):
    value_and_serial_given = value is not None and serial is not None
    if valid:
        if not value_and_serial_given:
            raise ValueError("For valid test cases, both value and ssz must be present")
    else:
        if value_and_serial_given:
            raise ValueError("For invalid test cases, either value or ssz must not be present")

    if tags is None:
        tags = []

    yield "type", render_type_definition(sedes)
    yield "valid", valid
    if value is not None:
        yield "value", render_value(value)
    if serial is not None:
        yield "ssz", encode_hex(serial)
    if description is not None:
        yield "description", description
    yield "tags", tags


@to_dict
def render_test(*, title, summary, fork, test_cases):
    yield "title", title
    if summary is not None:
        yield "summary", summary
    yield "fork", fork
    yield "test_cases", test_cases

2 test_generators/ssz/requirements.txt Normal file
@@ -0,0 +1,2 @@
ruamel.yaml==0.15.87
ssz==0.1.0a2

132 test_generators/ssz/uint_test_generators.py Normal file
@@ -0,0 +1,132 @@
import random

from eth_utils import (
    to_tuple,
)

import ssz
from ssz.sedes import (
    UInt,
)
from renderers import (
    render_test,
    render_test_case,
)

random.seed(0)


BIT_SIZES = [i for i in range(8, 512 + 1, 8)]
RANDOM_TEST_CASES_PER_BIT_SIZE = 10
RANDOM_TEST_CASES_PER_LENGTH = 3


def get_random_bytes(length):
    return bytes(random.randint(0, 255) for _ in range(length))


def generate_uint_bounds_test():
    test_cases = generate_uint_bounds_test_cases() + generate_uint_out_of_bounds_test_cases()

    return render_test(
        title="UInt Bounds",
        summary="Integers right at or beyond the bounds of the allowed value range",
        fork="phase0-0.2.0",
        test_cases=test_cases,
    )


def generate_uint_random_test():
    test_cases = generate_random_uint_test_cases()

    return render_test(
        title="UInt Random",
        summary="Random integers chosen uniformly over the allowed value range",
        fork="phase0-0.2.0",
        test_cases=test_cases,
    )


def generate_uint_wrong_length_test():
    test_cases = generate_uint_wrong_length_test_cases()

    return render_test(
        title="UInt Wrong Length",
        summary="Serialized integers that are too short or too long",
        fork="phase0-0.2.0",
        test_cases=test_cases,
    )


@to_tuple
def generate_random_uint_test_cases():
    for bit_size in BIT_SIZES:
        sedes = UInt(bit_size)

        for _ in range(RANDOM_TEST_CASES_PER_BIT_SIZE):
            value = random.randrange(0, 2 ** bit_size)
            serial = ssz.encode(value, sedes)
            # note that we need to create the tags in each loop cycle, otherwise ruamel will use
            # YAML references which makes the resulting file harder to read
            tags = tuple(["atomic", "uint", "random"])
            yield render_test_case(
                sedes=sedes,
                valid=True,
                value=value,
                serial=serial,
                tags=tags,
            )


@to_tuple
def generate_uint_wrong_length_test_cases():
    for bit_size in BIT_SIZES:
        sedes = UInt(bit_size)
        lengths = sorted({
            0,
            sedes.length // 2,
            sedes.length - 1,
            sedes.length + 1,
            sedes.length * 2,
        })
        for length in lengths:
            for _ in range(RANDOM_TEST_CASES_PER_LENGTH):
                tags = tuple(["atomic", "uint", "wrong_length"])
                yield render_test_case(
                    sedes=sedes,
                    valid=False,
                    serial=get_random_bytes(length),
                    tags=tags,
                )


@to_tuple
def generate_uint_bounds_test_cases():
    common_tags = ("atomic", "uint")
    for bit_size in BIT_SIZES:
        sedes = UInt(bit_size)

        for value, tag in ((0, "uint_lower_bound"), (2 ** bit_size - 1, "uint_upper_bound")):
            serial = ssz.encode(value, sedes)
            yield render_test_case(
                sedes=sedes,
                valid=True,
                value=value,
                serial=serial,
                tags=common_tags + (tag,),
            )


@to_tuple
def generate_uint_out_of_bounds_test_cases():
    common_tags = ("atomic", "uint")
    for bit_size in BIT_SIZES:
        sedes = UInt(bit_size)

        for value, tag in ((-1, "uint_underflow"), (2 ** bit_size, "uint_overflow")):
            yield render_test_case(
                sedes=sedes,
                valid=False,
                value=value,
                tags=common_tags + (tag,),
            )