Mirror of https://github.com/status-im/eth2.0-specs.git (synced 2025-03-02 11:10:35 +00:00), commit 2c0fcee1e3.
@@ -21,11 +21,11 @@ Features are researched and developed in parallel, and then consolidated into se
| 1 | **Altair** | `74240` | <ul><li>Core</li><ul><li>[Beacon chain changes](specs/altair/beacon-chain.md)</li><li>[Altair fork](specs/altair/fork.md)</li></ul><li>Additions</li><ul><li>[Light client sync protocol](specs/altair/light-client/sync-protocol.md) ([full node](specs/altair/light-client/full-node.md), [light client](specs/altair/light-client/light-client.md), [networking](specs/altair/light-client/p2p-interface.md))</li><li>[Honest validator guide changes](specs/altair/validator.md)</li><li>[P2P networking](specs/altair/p2p-interface.md)</li></ul></ul> |
| 2 | **Bellatrix** <br/> (["The Merge"](https://ethereum.org/en/upgrades/merge/)) | `144896` | <ul><li>Core</li><ul><li>[Beacon Chain changes](specs/bellatrix/beacon-chain.md)</li><li>[Bellatrix fork](specs/bellatrix/fork.md)</li><li>[Fork choice changes](specs/bellatrix/fork-choice.md)</li></ul><li>Additions</li><ul><li>[Honest validator guide changes](specs/bellatrix/validator.md)</li><li>[P2P networking](specs/bellatrix/p2p-interface.md)</li></ul></ul> |
| 3 | **Capella** | `194048` | <ul><li>Core</li><ul><li>[Beacon chain changes](specs/capella/beacon-chain.md)</li><li>[Capella fork](specs/capella/fork.md)</li></ul><li>Additions</li><ul><li>[Light client sync protocol changes](specs/capella/light-client/sync-protocol.md) ([fork](specs/capella/light-client/fork.md), [full node](specs/capella/light-client/full-node.md), [networking](specs/capella/light-client/p2p-interface.md))</li></ul><ul><li>[Validator additions](specs/capella/validator.md)</li><li>[P2P networking](specs/capella/p2p-interface.md)</li></ul></ul> |
| 4 | **Deneb** | `269568` | <ul><li>Core</li><ul><li>[Beacon Chain changes](specs/deneb/beacon-chain.md)</li><li>[Deneb fork](specs/deneb/fork.md)</li><li>[Polynomial commitments](specs/deneb/polynomial-commitments.md)</li><li>[Fork choice changes](specs/deneb/fork-choice.md)</li></ul><li>Additions</li><ul><li>[Light client sync protocol changes](specs/deneb/light-client/sync-protocol.md) ([fork](specs/deneb/light-client/fork.md), [full node](specs/deneb/light-client/full-node.md), [networking](specs/deneb/light-client/p2p-interface.md))</li></ul><ul><li>[Honest validator guide changes](specs/deneb/validator.md)</li><li>[P2P networking](specs/deneb/p2p-interface.md)</li></ul></ul> |
### In-development Specifications

| Code Name or Topic | Specs | Notes |
| - | - | - |
| Deneb (tentative) | <ul><li>Core</li><ul><li>[Beacon Chain changes](specs/deneb/beacon-chain.md)</li><li>[Deneb fork](specs/deneb/fork.md)</li><li>[Polynomial commitments](specs/deneb/polynomial-commitments.md)</li><li>[Fork choice changes](specs/deneb/fork-choice.md)</li></ul><li>Additions</li><ul><li>[Light client sync protocol changes](specs/deneb/light-client/sync-protocol.md) ([fork](specs/deneb/light-client/fork.md), [full node](specs/deneb/light-client/full-node.md), [networking](specs/deneb/light-client/p2p-interface.md))</li></ul><ul><li>[Honest validator guide changes](specs/deneb/validator.md)</li><li>[P2P networking](specs/deneb/p2p-interface.md)</li></ul></ul> |
| Sharding (outdated) | <ul><li>Core</li><ul><li>[Beacon Chain changes](specs/_features/sharding/beacon-chain.md)</li></ul><li>Additions</li><ul><li>[P2P networking](specs/_features/sharding/p2p-interface.md)</li></ul></ul> |
| Custody Game (outdated) | <ul><li>Core</li><ul><li>[Beacon Chain changes](specs/_features/custody_game/beacon-chain.md)</li></ul><li>Additions</li><ul><li>[Honest validator guide changes](specs/_features/custody_game/validator.md)</li></ul></ul> | Dependent on sharding |
| Data Availability Sampling (outdated) | <ul><li>Core</li><ul><li>[Core types and functions](specs/_features/das/das-core.md)</li><li>[Fork choice changes](specs/_features/das/fork-choice.md)</li></ul><li>Additions</li><ul><li>[P2P Networking](specs/_features/das/p2p-interface.md)</li><li>[Sampling process](specs/_features/das/sampling.md)</li></ul></ul> | <ul><li> Dependent on sharding</li><li>[Technical explainer](https://hackmd.io/@HWeNw8hNRimMm2m2GH56Cw/B1YJPGkpD)</li></ul> |
@@ -49,7 +49,7 @@ CAPELLA_FORK_VERSION: 0x03000000
CAPELLA_FORK_EPOCH: 194048 # April 12, 2023, 10:27:35pm UTC

# Deneb
DENEB_FORK_VERSION: 0x04000000
DENEB_FORK_EPOCH: 18446744073709551615
DENEB_FORK_EPOCH: 269568 # March 13, 2024, 01:55:35pm UTC

# EIP6110
EIP6110_FORK_VERSION: 0x05000000 # temporary stub
EIP6110_FORK_EPOCH: 18446744073709551615
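As a quick cross-check of the epoch-to-date comment above, a sketch assuming the mainnet genesis time of 1606824023 (2020-12-01 12:00:23 UTC), 32 slots per epoch, and 12-second slots; this is not part of the config file itself:

```python
from datetime import datetime, timezone

GENESIS_TIME = 1606824023  # assumed mainnet genesis: 2020-12-01 12:00:23 UTC
DENEB_FORK_EPOCH = 269568

fork_time = GENESIS_TIME + DENEB_FORK_EPOCH * 32 * 12  # 32 slots/epoch, 12 s/slot
print(datetime.fromtimestamp(fork_time, tz=timezone.utc))  # 2024-03-13 13:55:35+00:00
```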
@@ -12,6 +12,8 @@
- [Preset](#preset)
  - [Cells](#cells)
- [Helper functions](#helper-functions)
  - [BLS12-381 helpers](#bls12-381-helpers)
    - [`bytes_to_cell`](#bytes_to_cell)
  - [Linear combinations](#linear-combinations)
    - [`g2_lincomb`](#g2_lincomb)
  - [FFTs](#ffts)
@@ -40,6 +42,9 @@
    - [`verify_cell_proof`](#verify_cell_proof)
    - [`verify_cell_proof_batch`](#verify_cell_proof_batch)
- [Reconstruction](#reconstruction)
  - [`construct_vanishing_polynomial`](#construct_vanishing_polynomial)
  - [`recover_shifted_data`](#recover_shifted_data)
  - [`recover_original_data`](#recover_original_data)
  - [`recover_polynomial`](#recover_polynomial)

<!-- END doctoc generated TOC please keep comment here to allow auto update -->
@@ -74,13 +79,26 @@ Cells are the smallest unit of blob data that can come with their own KZG proofs

| Name | Value | Description |
| - | - | - |
| `FIELD_ELEMENTS_PER_EXT_BLOB` | `2 * FIELD_ELEMENTS_PER_BLOB` | Number of field elements in a Reed-Solomon extended blob |
| `FIELD_ELEMENTS_PER_CELL` | `uint64(64)` | Number of field elements in a cell |
| `BYTES_PER_CELL` | `FIELD_ELEMENTS_PER_CELL * BYTES_PER_FIELD_ELEMENT` | The number of bytes in a cell |
| `CELLS_PER_BLOB` | `((2 * FIELD_ELEMENTS_PER_BLOB) // FIELD_ELEMENTS_PER_CELL)` | The number of cells in a blob |
| `CELLS_PER_BLOB` | `FIELD_ELEMENTS_PER_EXT_BLOB // FIELD_ELEMENTS_PER_CELL` | The number of cells in a blob |
| `RANDOM_CHALLENGE_KZG_CELL_BATCH_DOMAIN` | `b'RCKZGCBATCH__V1_'` | |
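For intuition, the derived values work out as follows under the mainnet preset; a minimal sketch assuming `FIELD_ELEMENTS_PER_BLOB = 4096` and `BYTES_PER_FIELD_ELEMENT = 32` (neither is shown in this diff):

```python
FIELD_ELEMENTS_PER_BLOB = 4096      # assumed mainnet preset value
BYTES_PER_FIELD_ELEMENT = 32        # assumed mainnet preset value

FIELD_ELEMENTS_PER_EXT_BLOB = 2 * FIELD_ELEMENTS_PER_BLOB                # 8192
FIELD_ELEMENTS_PER_CELL = 64
BYTES_PER_CELL = FIELD_ELEMENTS_PER_CELL * BYTES_PER_FIELD_ELEMENT       # 2048 bytes
CELLS_PER_BLOB = FIELD_ELEMENTS_PER_EXT_BLOB // FIELD_ELEMENTS_PER_CELL  # 128

# The cells exactly tile the extended blob:
assert CELLS_PER_BLOB * FIELD_ELEMENTS_PER_CELL == FIELD_ELEMENTS_PER_EXT_BLOB
```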
## Helper functions

### BLS12-381 helpers

#### `bytes_to_cell`

```python
def bytes_to_cell(cell_bytes: Vector[Bytes32, FIELD_ELEMENTS_PER_CELL]) -> Cell:
    """
    Convert untrusted bytes into a Cell.
    """
    return [bytes_to_bls_field(element) for element in cell_bytes]
```
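For context, `bytes_to_bls_field` (from the existing Deneb KZG spec) decodes a 32-byte big-endian scalar and rejects non-canonical values, so `bytes_to_cell` also acts as validation of untrusted input. A self-contained illustration of that check (the constant is the BLS12-381 scalar field order; this is a sketch, not the spec helper):

```python
BLS_MODULUS = 52435875175126190479447740508185965837690552500527637822603658699938581184513

def bytes_to_bls_field_sketch(b: bytes) -> int:
    # Big-endian decode, then require a canonical (fully reduced) field element.
    field_element = int.from_bytes(b, "big")
    assert field_element < BLS_MODULUS
    return field_element

assert bytes_to_bls_field_sketch((BLS_MODULUS - 1).to_bytes(32, "big")) == BLS_MODULUS - 1
```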

### Linear combinations

#### `g2_lincomb`

@@ -156,7 +174,9 @@ def add_polynomialcoeff(a: PolynomialCoeff, b: PolynomialCoeff) -> PolynomialCoe
    Sum the coefficient form polynomials ``a`` and ``b``.
    """
    a, b = (a, b) if len(a) >= len(b) else (b, a)
    return [(a[i] + (b[i] if i < len(b) else 0)) % BLS_MODULUS for i in range(len(a))]
    length_a = len(a)
    length_b = len(b)
    return [(a[i] + (b[i] if i < length_b else 0)) % BLS_MODULUS for i in range(length_a)]
```

#### `neg_polynomialcoeff`

@@ -242,7 +262,7 @@ def interpolate_polynomialcoeff(xs: Sequence[BLSFieldElement], ys: Sequence[BLSF
                    summand, [(- int(weight_adjustment) * int(xs[j])) % BLS_MODULUS, weight_adjustment]
                )
        r = add_polynomialcoeff(r, summand)

    return r
```
@@ -330,13 +350,13 @@ def verify_kzg_proof_multi_impl(commitment: KZGCommitment,

#### `coset_for_cell`

```python
def coset_for_cell(cell_id: int) -> Cell:
def coset_for_cell(cell_id: CellID) -> Cell:
    """
    Get the coset for a given ``cell_id``
    """
    assert cell_id < CELLS_PER_BLOB
    roots_of_unity_brp = bit_reversal_permutation(
        compute_roots_of_unity(2 * FIELD_ELEMENTS_PER_BLOB)
        compute_roots_of_unity(FIELD_ELEMENTS_PER_EXT_BLOB)
    )
    return Cell(roots_of_unity_brp[FIELD_ELEMENTS_PER_CELL * cell_id:FIELD_ELEMENTS_PER_CELL * (cell_id + 1)])
```
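The slicing above works because, after bit-reversal reordering, each run of `FIELD_ELEMENTS_PER_CELL` consecutive roots of unity forms a coset of a smaller subgroup (the same property the vanishing-polynomial comment below relies on). A toy illustration on a domain of size 8, using the same `reverse_bits` convention as the Deneb KZG spec (illustration only, not spec code):

```python
def reverse_bits(n: int, order: int) -> int:
    # Reverse the bit order of n, where order is a power of two giving the domain size.
    bit_length = order.bit_length() - 1
    return int(format(n, f'0{bit_length}b')[::-1], 2)

# Bit-reversed ordering of an 8-point domain: indices group into structured chunks.
print([reverse_bits(i, 8) for i in range(8)])  # [0, 4, 2, 6, 1, 5, 3, 7]
```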
@@ -385,8 +405,8 @@ def compute_cells(blob: Blob) -> Vector[Cell, CELLS_PER_BLOB]:
    polynomial = blob_to_polynomial(blob)
    polynomial_coeff = polynomial_eval_to_coeff(polynomial)

    extended_data = fft_field(polynomial_coeff + [0] * FIELD_ELEMENTS_PER_BLOB,
                              compute_roots_of_unity(2 * FIELD_ELEMENTS_PER_BLOB))
    extended_data = fft_field(polynomial_coeff + [0] * FIELD_ELEMENTS_PER_BLOB,
                              compute_roots_of_unity(FIELD_ELEMENTS_PER_EXT_BLOB))
    extended_data_rbo = bit_reversal_permutation(extended_data)
    return [extended_data_rbo[i * FIELD_ELEMENTS_PER_CELL:(i + 1) * FIELD_ELEMENTS_PER_CELL]
            for i in range(CELLS_PER_BLOB)]
@@ -397,10 +417,10 @@ def compute_cells(blob: Blob) -> Vector[Cell, CELLS_PER_BLOB]:

#### `verify_cell_proof`

```python
def verify_cell_proof(commitment: KZGCommitment,
                      cell_id: int,
                      cell: Cell,
                      proof: KZGProof) -> bool:
def verify_cell_proof(commitment_bytes: Bytes48,
                      cell_id: CellID,
                      cell_bytes: Vector[Bytes32, FIELD_ELEMENTS_PER_CELL],
                      proof_bytes: Bytes48) -> bool:
    """
    Check a cell proof

@@ -408,19 +428,26 @@ def verify_cell_proof(commitment: KZGCommitment,
    """
    coset = coset_for_cell(cell_id)

    return verify_kzg_proof_multi_impl(commitment, coset, cell, proof)
    return verify_kzg_proof_multi_impl(
        bytes_to_kzg_commitment(commitment_bytes),
        coset,
        bytes_to_cell(cell_bytes),
        bytes_to_kzg_proof(proof_bytes))
```

#### `verify_cell_proof_batch`

```python
def verify_cell_proof_batch(row_commitments: Sequence[KZGCommitment],
                            row_ids: Sequence[int],
                            column_ids: Sequence[int],
                            cells: Sequence[Cell],
                            proofs: Sequence[KZGProof]) -> bool:
def verify_cell_proof_batch(row_commitments_bytes: Sequence[Bytes48],
                            row_ids: Sequence[uint64],
                            column_ids: Sequence[uint64],
                            cells_bytes: Sequence[Vector[Bytes32, FIELD_ELEMENTS_PER_CELL]],
                            proofs_bytes: Sequence[Bytes48]) -> bool:
    """
    Check multiple cell proofs. This function implements the naive algorithm of checking every cell
    Verify a set of cells, given their corresponding proofs and their coordinates (row_id, column_id) in the blob
    matrix. The list of all commitments is also provided in row_commitments_bytes.

    This function implements the naive algorithm of checking every cell
    individually; an efficient algorithm can be found here:
    https://ethresear.ch/t/a-universal-verification-equation-for-data-availability-sampling/13240

@@ -430,10 +457,16 @@ def verify_cell_proof_batch(row_commitments: Sequence[KZGCommitment],

    Public method.
    """
    assert len(cells_bytes) == len(proofs_bytes) == len(row_ids) == len(column_ids)

    # Get commitments via row IDs
    commitments = [row_commitments[row_id] for row_id in row_ids]

    commitments_bytes = [row_commitments_bytes[row_id] for row_id in row_ids]

    # Get objects from bytes
    commitments = [bytes_to_kzg_commitment(commitment_bytes) for commitment_bytes in commitments_bytes]
    cells = [bytes_to_cell(cell_bytes) for cell_bytes in cells_bytes]
    proofs = [bytes_to_kzg_proof(proof_bytes) for proof_bytes in proofs_bytes]

    return all(
        verify_kzg_proof_multi_impl(commitment, coset_for_cell(column_id), cell, proof)
        for commitment, column_id, cell, proof in zip(commitments, column_ids, cells, proofs)
@@ -442,69 +475,99 @@ def verify_cell_proof_batch(row_commitments: Sequence[KZGCommitment],
## Reconstruction

### `recover_polynomial`
### `construct_vanishing_polynomial`

```python
def recover_polynomial(cell_ids: Sequence[CellID], cells: Sequence[Cell]) -> Polynomial:
def construct_vanishing_polynomial(missing_cell_ids: Sequence[CellID]) -> Tuple[
        Sequence[BLSFieldElement],
        Sequence[BLSFieldElement]]:
    """
    Recovers a polynomial from 2 * FIELD_ELEMENTS_PER_CELL evaluations, half of which can be missing.

    This algorithm uses FFTs to recover cells faster than using Lagrange implementation. However,
    a faster version thanks to Qi Zhou can be found here:
    https://github.com/ethereum/research/blob/51b530a53bd4147d123ab3e390a9d08605c2cdb8/polynomial_reconstruction/polynomial_reconstruction_danksharding.py

    Public method.
    Given the cells that are missing from the data, compute the polynomial that vanishes at every point that
    corresponds to a missing field element.
    """
    assert len(cell_ids) == len(cells)
    assert len(cells) >= CELLS_PER_BLOB // 2
    missing_cell_ids = [cell_id for cell_id in range(CELLS_PER_BLOB) if cell_id not in cell_ids]
    # Get the small domain
    roots_of_unity_reduced = compute_roots_of_unity(CELLS_PER_BLOB)

    # Compute polynomial that vanishes at all the missing cells (over the small domain)
    short_zero_poly = vanishing_polynomialcoeff([
        roots_of_unity_reduced[reverse_bits(cell_id, CELLS_PER_BLOB)]
        for cell_id in missing_cell_ids
        roots_of_unity_reduced[reverse_bits(missing_cell_id, CELLS_PER_BLOB)]
        for missing_cell_id in missing_cell_ids
    ])

    full_zero_poly = []
    for i in short_zero_poly:
        full_zero_poly.append(i)
        full_zero_poly.extend([0] * (FIELD_ELEMENTS_PER_CELL - 1))
    full_zero_poly = full_zero_poly + [0] * (2 * FIELD_ELEMENTS_PER_BLOB - len(full_zero_poly))
    # Extend vanishing polynomial to full domain using the closed form of the vanishing polynomial over a coset
    zero_poly_coeff = [0] * FIELD_ELEMENTS_PER_EXT_BLOB
    for i, coeff in enumerate(short_zero_poly):
        zero_poly_coeff[i * FIELD_ELEMENTS_PER_CELL] = coeff

    zero_poly_eval = fft_field(full_zero_poly,
                               compute_roots_of_unity(2 * FIELD_ELEMENTS_PER_BLOB))
    # Compute evaluations of the extended vanishing polynomial
    zero_poly_eval = fft_field(zero_poly_coeff,
                               compute_roots_of_unity(FIELD_ELEMENTS_PER_EXT_BLOB))
    zero_poly_eval_brp = bit_reversal_permutation(zero_poly_eval)
    for cell_id in missing_cell_ids:
        start = cell_id * FIELD_ELEMENTS_PER_CELL
        end = (cell_id + 1) * FIELD_ELEMENTS_PER_CELL
        assert zero_poly_eval_brp[start:end] == [0] * FIELD_ELEMENTS_PER_CELL
    for cell_id in cell_ids:
        start = cell_id * FIELD_ELEMENTS_PER_CELL
        end = (cell_id + 1) * FIELD_ELEMENTS_PER_CELL
        assert all(a != 0 for a in zero_poly_eval_brp[start:end])

    extended_evaluation_rbo = [0] * (FIELD_ELEMENTS_PER_BLOB * 2)
    # Sanity check
    for cell_id in range(CELLS_PER_BLOB):
        start = cell_id * FIELD_ELEMENTS_PER_CELL
        end = (cell_id + 1) * FIELD_ELEMENTS_PER_CELL
        if cell_id in missing_cell_ids:
            assert all(a == 0 for a in zero_poly_eval_brp[start:end])
        else:  # cell_id in cell_ids
            assert all(a != 0 for a in zero_poly_eval_brp[start:end])

    return zero_poly_coeff, zero_poly_eval, zero_poly_eval_brp
```
### `recover_shifted_data`

```python
def recover_shifted_data(cell_ids: Sequence[CellID],
                         cells: Sequence[Cell],
                         zero_poly_eval: Sequence[BLSFieldElement],
                         zero_poly_coeff: Sequence[BLSFieldElement],
                         roots_of_unity_extended: Sequence[BLSFieldElement]) -> Tuple[
        Sequence[BLSFieldElement],
        Sequence[BLSFieldElement],
        BLSFieldElement]:
    """
    Given Z(x), return polynomial Q_1(x)=(E*Z)(k*x) and Q_2(x)=Z(k*x) and k^{-1}.
    """
    shift_factor = BLSFieldElement(PRIMITIVE_ROOT_OF_UNITY)
    shift_inv = div(BLSFieldElement(1), shift_factor)

    extended_evaluation_rbo = [0] * FIELD_ELEMENTS_PER_EXT_BLOB
    for cell_id, cell in zip(cell_ids, cells):
        start = cell_id * FIELD_ELEMENTS_PER_CELL
        end = (cell_id + 1) * FIELD_ELEMENTS_PER_CELL
        extended_evaluation_rbo[start:end] = cell
    extended_evaluation = bit_reversal_permutation(extended_evaluation_rbo)

    # Compute (E*Z)(x)
    extended_evaluation_times_zero = [BLSFieldElement(int(a) * int(b) % BLS_MODULUS)
                                      for a, b in zip(zero_poly_eval, extended_evaluation)]

    roots_of_unity_extended = compute_roots_of_unity(2 * FIELD_ELEMENTS_PER_BLOB)

    extended_evaluations_fft = fft_field(extended_evaluation_times_zero, roots_of_unity_extended, inv=True)

    shift_factor = BLSFieldElement(PRIMITIVE_ROOT_OF_UNITY)
    shift_inv = div(BLSFieldElement(1), shift_factor)

    # Compute (E*Z)(k*x)
    shifted_extended_evaluation = shift_polynomialcoeff(extended_evaluations_fft, shift_factor)
    shifted_zero_poly = shift_polynomialcoeff(full_zero_poly, shift_factor)
    # Compute Z(k*x)
    shifted_zero_poly = shift_polynomialcoeff(zero_poly_coeff, shift_factor)

    eval_shifted_extended_evaluation = fft_field(shifted_extended_evaluation, roots_of_unity_extended)
    eval_shifted_zero_poly = fft_field(shifted_zero_poly, roots_of_unity_extended)

    return eval_shifted_extended_evaluation, eval_shifted_zero_poly, shift_inv
```

### `recover_original_data`

```python
def recover_original_data(eval_shifted_extended_evaluation: Sequence[BLSFieldElement],
                          eval_shifted_zero_poly: Sequence[BLSFieldElement],
                          shift_inv: BLSFieldElement,
                          roots_of_unity_extended: Sequence[BLSFieldElement]) -> Sequence[BLSFieldElement]:
    """
    Given Q_1, Q_2 and k^{-1}, compute P(x).
    """
    # Compute Q_3 = Q_1(x)/Q_2(x) = P(k*x)
    eval_shifted_reconstructed_poly = [
        div(a, b)
        for a, b in zip(eval_shifted_extended_evaluation, eval_shifted_zero_poly)
@@ -512,10 +575,59 @@ def recover_polynomial(cell_ids: Sequence[CellID], cells: Sequence[Cell]) -> Pol

    shifted_reconstructed_poly = fft_field(eval_shifted_reconstructed_poly, roots_of_unity_extended, inv=True)

    # Unshift P(k*x) by k^{-1} to get P(x)
    reconstructed_poly = shift_polynomialcoeff(shifted_reconstructed_poly, shift_inv)

    reconstructed_data = bit_reversal_permutation(fft_field(reconstructed_poly, roots_of_unity_extended))

    return reconstructed_data
```

### `recover_polynomial`

```python
def recover_polynomial(cell_ids: Sequence[CellID],
                       cells_bytes: Sequence[Vector[Bytes32, FIELD_ELEMENTS_PER_CELL]]) -> Polynomial:
    """
    Recover original polynomial from FIELD_ELEMENTS_PER_EXT_BLOB evaluations, half of which can be missing. This
    algorithm uses FFTs to recover cells faster than using Lagrange implementation, as can be seen here:
    https://ethresear.ch/t/reed-solomon-erasure-code-recovery-in-n-log-2-n-time-with-ffts/3039

    A faster version thanks to Qi Zhou can be found here:
    https://github.com/ethereum/research/blob/51b530a53bd4147d123ab3e390a9d08605c2cdb8/polynomial_reconstruction/polynomial_reconstruction_danksharding.py

    Public method.
    """
    assert len(cell_ids) == len(cells_bytes)
    # Check we have enough cells to be able to perform the reconstruction
    assert CELLS_PER_BLOB / 2 <= len(cell_ids) <= CELLS_PER_BLOB
    # Check for duplicates
    assert len(cell_ids) == len(set(cell_ids))

    # Get the extended domain
    roots_of_unity_extended = compute_roots_of_unity(FIELD_ELEMENTS_PER_EXT_BLOB)

    # Convert from bytes to cells
    cells = [bytes_to_cell(cell_bytes) for cell_bytes in cells_bytes]

    missing_cell_ids = [cell_id for cell_id in range(CELLS_PER_BLOB) if cell_id not in cell_ids]
    zero_poly_coeff, zero_poly_eval, zero_poly_eval_brp = construct_vanishing_polynomial(missing_cell_ids)

    eval_shifted_extended_evaluation, eval_shifted_zero_poly, shift_inv = recover_shifted_data(
        cell_ids,
        cells,
        zero_poly_eval,
        zero_poly_coeff,
        roots_of_unity_extended,
    )

    reconstructed_data = recover_original_data(
        eval_shifted_extended_evaluation,
        eval_shifted_zero_poly,
        shift_inv,
        roots_of_unity_extended,
    )

    for cell_id, cell in zip(cell_ids, cells):
        start = cell_id * FIELD_ELEMENTS_PER_CELL
        end = (cell_id + 1) * FIELD_ELEMENTS_PER_CELL
@@ -29,7 +29,7 @@ Warning: this configuration is not definitive.

| Name | Value |
| - | - |
| `DENEB_FORK_VERSION` | `Version('0x04000000')` |
| `DENEB_FORK_EPOCH` | `Epoch(18446744073709551615)` **TBD** |
| `DENEB_FORK_EPOCH` | `Epoch(269568)` (March 13, 2024, 01:55:35pm UTC) |

## Helper functions
@@ -578,7 +578,7 @@ def verify_blob_kzg_proof_batch(blobs: Sequence[Blob],
    """

    assert len(blobs) == len(commitments_bytes) == len(proofs_bytes)

    commitments, evaluation_challenges, ys, proofs = [], [], [], []
    for blob, commitment_bytes, proof_bytes in zip(blobs, commitments_bytes, proofs_bytes):
        assert len(blob) == BYTES_PER_BLOB
@@ -178,6 +178,7 @@ The following values are (non-configurable) constants used throughout the specif

| Name | Value |
| - | - |
| `UINT64_MAX` | `uint64(2**64 - 1)` |
| `GENESIS_SLOT` | `Slot(0)` |
| `GENESIS_EPOCH` | `Epoch(0)` |
| `FAR_FUTURE_EPOCH` | `Epoch(2**64 - 1)` |
@@ -599,6 +600,8 @@ def integer_squareroot(n: uint64) -> uint64:
    """
    Return the largest integer ``x`` such that ``x**2 <= n``.
    """
    if n == UINT64_MAX:
        return uint64(4294967295)
    x = n
    y = (x + 1) // 2
    while y < x:
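The new branch guards the edge case where the existing Newton iteration would compute `x + 1`, which overflows a `uint64` when `n = UINT64_MAX`; the hard-coded value is the exact floor square root. A quick plain-Python check (not part of the diff):

```python
from math import isqrt

UINT64_MAX = 2**64 - 1
assert isqrt(UINT64_MAX) == 4294967295 == 2**32 - 1
assert 4294967295 ** 2 <= UINT64_MAX < 4294967296 ** 2
```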
@@ -1 +1 @@
1.4.0-beta.6
1.4.0-beta.7
tests/core/pyspec/eth2spec/py.typed (new empty file)
@@ -0,0 +1,112 @@
from eth2spec.test.context import (
    ForkMeta,
    with_fork_metas,
    with_presets,
)
from eth2spec.test.helpers.constants import (
    AFTER_DENEB_PRE_POST_FORKS,
    MINIMAL,
)
from eth2spec.test.helpers.keys import pubkeys
from eth2spec.test.helpers.fork_transition import (
    do_fork,
    transition_to_next_epoch_and_append_blocks,
    transition_until_fork,
)


def mock_activated_validators(spec, state, mock_activations):
    validator_count = len(state.validators)
    for i in range(mock_activations):
        index = validator_count + i
        validator = spec.Validator(
            pubkey=pubkeys[index],
            withdrawal_credentials=spec.ETH1_ADDRESS_WITHDRAWAL_PREFIX + b'\x00' * 11 + b'\x56' * 20,
            activation_eligibility_epoch=0,
            activation_epoch=spec.FAR_FUTURE_EPOCH,
            exit_epoch=spec.FAR_FUTURE_EPOCH,
            withdrawable_epoch=spec.FAR_FUTURE_EPOCH,
            effective_balance=spec.MAX_EFFECTIVE_BALANCE,
        )
        state.validators.append(validator)
        state.balances.append(spec.MAX_EFFECTIVE_BALANCE)
        state.previous_epoch_participation.append(spec.ParticipationFlags(0b0000_0000))
        state.current_epoch_participation.append(spec.ParticipationFlags(0b0000_0000))
        state.inactivity_scores.append(0)
        state.validators[index].activation_epoch = spec.get_current_epoch(state)


@with_fork_metas([ForkMeta(pre_fork_name=pre, post_fork_name=post, fork_epoch=2)
                  for pre, post in AFTER_DENEB_PRE_POST_FORKS])
@with_presets([MINIMAL], reason="churn limit update needs enough validators")
def test_higher_churn_limit_to_lower(state, fork_epoch, spec, post_spec, pre_tag, post_tag):
    """
    Test if churn limit goes from high to low due to EIP-7514.
    """
    # Create high churn limit
    mock_activations = post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT * spec.config.CHURN_LIMIT_QUOTIENT
    mock_activated_validators(spec, state, mock_activations)

    transition_until_fork(spec, state, fork_epoch)

    churn_limit_0 = spec.get_validator_churn_limit(state)
    assert churn_limit_0 > post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT

    # check pre state
    assert spec.get_current_epoch(state) < fork_epoch

    yield "pre", state

    # irregular state transition to handle fork
    blocks = []
    state, block = do_fork(state, spec, post_spec, fork_epoch)
    blocks.append(post_tag(block))

    # check post state
    assert spec.get_current_epoch(state) == fork_epoch

    # continue regular state transition with new spec into next epoch
    transition_to_next_epoch_and_append_blocks(post_spec, state, post_tag, blocks, only_last_block=True)

    yield "blocks", blocks
    yield "post", state

    churn_limit_1 = post_spec.get_validator_activation_churn_limit(state)
    assert churn_limit_1 == post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT
    assert churn_limit_1 < churn_limit_0


@with_fork_metas([ForkMeta(pre_fork_name=pre, post_fork_name=post, fork_epoch=2)
                  for pre, post in AFTER_DENEB_PRE_POST_FORKS])
@with_presets([MINIMAL], reason="churn limit update needs enough validators")
def test_higher_churn_limit_to_lower__without_block(state, fork_epoch, spec, post_spec, pre_tag, post_tag):
    """
    Test if churn limit goes from high to low due to EIP-7514.
    """
    # Create high churn limit
    mock_activations = post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT * spec.config.CHURN_LIMIT_QUOTIENT
    mock_activated_validators(spec, state, mock_activations)

    transition_until_fork(spec, state, fork_epoch)

    churn_limit_0 = spec.get_validator_churn_limit(state)
    assert churn_limit_0 > post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT

    # check pre state
    assert spec.get_current_epoch(state) < fork_epoch

    yield "pre", state

    # irregular state transition to handle fork
    # set with_block=False here
    state, _ = do_fork(state, spec, post_spec, fork_epoch, with_block=False)

    # check post state
    assert spec.get_current_epoch(state) == fork_epoch

    yield "blocks", []
    yield "post", state

    churn_limit_1 = post_spec.get_validator_activation_churn_limit(state)
    assert churn_limit_1 == post_spec.config.MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT
    assert churn_limit_1 < churn_limit_0
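For intuition on the `mock_activations` sizing in the tests above, a back-of-the-envelope sketch using mainnet config values (`CHURN_LIMIT_QUOTIENT = 65536`, `MIN_PER_EPOCH_CHURN_LIMIT = 4`, `MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT = 8`; the tests themselves run under the minimal preset):

```python
CHURN_LIMIT_QUOTIENT = 65536
MIN_PER_EPOCH_CHURN_LIMIT = 4
MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT = 8  # EIP-7514 cap

def validator_churn_limit(active_validator_count: int) -> int:
    # Pre-EIP-7514 churn limit (phase0 get_validator_churn_limit).
    return max(MIN_PER_EPOCH_CHURN_LIMIT, active_validator_count // CHURN_LIMIT_QUOTIENT)

extra = MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT * CHURN_LIMIT_QUOTIENT
# Activating `extra` additional validators on top of the existing set pushes the
# pre-fork churn limit to at least the new activation cap, so the fork lowers it.
assert validator_churn_limit(extra) >= MAX_PER_EPOCH_ACTIVATION_CHURN_LIMIT
```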
@@ -10,6 +10,10 @@ from eth2spec.test.helpers.sharding import (
from eth2spec.utils.bls import BLS_MODULUS


def field_element_bytes(x):
    return int.to_bytes(x % BLS_MODULUS, 32, "big")
@with_eip7594_and_later
@spec_test
@single_phase
@@ -34,10 +38,13 @@ def test_verify_cell_proof(spec):
    blob = get_sample_blob(spec)
    commitment = spec.blob_to_kzg_commitment(blob)
    cells, proofs = spec.compute_cells_and_proofs(blob)

    cells_bytes = [[field_element_bytes(element) for element in cell] for cell in cells]

    cell_id = 0
    assert spec.verify_cell_proof(commitment, cell_id, cells[cell_id], proofs[cell_id])
    assert spec.verify_cell_proof(commitment, cell_id, cells_bytes[cell_id], proofs[cell_id])
    cell_id = 1
    assert spec.verify_cell_proof(commitment, cell_id, cells[cell_id], proofs[cell_id])
    assert spec.verify_cell_proof(commitment, cell_id, cells_bytes[cell_id], proofs[cell_id])
@with_eip7594_and_later
@@ -47,13 +54,16 @@ def test_verify_cell_proof_batch(spec):
    blob = get_sample_blob(spec)
    commitment = spec.blob_to_kzg_commitment(blob)
    cells, proofs = spec.compute_cells_and_proofs(blob)
    cells_bytes = [[field_element_bytes(element) for element in cell] for cell in cells]

    assert len(cells) == len(proofs)

    assert spec.verify_cell_proof_batch(
        row_commitments=[commitment],
        row_ids=[0],
        column_ids=[0, 1],
        cells=cells[0:1],
        proofs=proofs,
        row_commitments_bytes=[commitment],
        row_ids=[0, 0],
        column_ids=[0, 4],
        cells_bytes=[cells_bytes[0], cells_bytes[4]],
        proofs_bytes=[proofs[0], proofs[4]],
    )
@@ -73,10 +83,10 @@ def test_recover_polynomial(spec):

    # Extend data with Reed-Solomon and split the extended data in cells
    cells = spec.compute_cells(blob)
    cells_bytes = [[field_element_bytes(element) for element in cell] for cell in cells]

    # Compute the cells we will be recovering from
    cell_ids = []
    known_cells = []
    # First figure out just the indices of the cells
    for i in range(N_SAMPLES):
        j = rng.randint(0, spec.CELLS_PER_BLOB)
@@ -84,10 +94,10 @@ def test_recover_polynomial(spec):
            j = rng.randint(0, spec.CELLS_PER_BLOB)
        cell_ids.append(j)
    # Now the cells themselves
    known_cells = [cells[cell_id] for cell_id in cell_ids]
    known_cells_bytes = [cells_bytes[cell_id] for cell_id in cell_ids]

    # Recover the data
    recovered_data = spec.recover_polynomial(cell_ids, known_cells)
    recovered_data = spec.recover_polynomial(cell_ids, known_cells_bytes)

    # Check that the original data match the non-extended portion of the recovered data
    assert original_polynomial == recovered_data[:len(recovered_data) // 2]
@@ -89,4 +89,4 @@ ALL_PRESETS = (MINIMAL, MAINNET)
#
# Number
#
MAX_UINT_64 = 2**64 - 1
UINT64_MAX = 2**64 - 1
@@ -0,0 +1,29 @@
import random
from math import isqrt
from eth2spec.test.context import (
    spec_test,
    single_phase,
    with_all_phases,
)


@with_all_phases
@spec_test
@single_phase
def test_integer_squareroot(spec):
    values = [0, 100, 2**64 - 2, 2**64 - 1]
    for n in values:
        uint64_n = spec.uint64(n)
        assert spec.integer_squareroot(uint64_n) == isqrt(n)

    rng = random.Random(5566)
    for _ in range(10):
        n = rng.randint(0, 2**64 - 1)
        uint64_n = spec.uint64(n)
        assert spec.integer_squareroot(uint64_n) == isqrt(n)

    try:
        spec.integer_squareroot(spec.uint64(2**64))
        assert False
    except ValueError:
        pass
@@ -2,7 +2,7 @@ from eth2spec.test.context import (
    spec_state_test,
    with_all_phases,
)
from eth2spec.test.helpers.constants import MAX_UINT_64
from eth2spec.test.helpers.constants import UINT64_MAX
from eth2spec.test.helpers.forks import (
    is_post_altair, is_post_bellatrix,
)
@@ -16,9 +16,9 @@ def check_bound(value, lower_bound, upper_bound):
@with_all_phases
@spec_state_test
def test_validators(spec, state):
    check_bound(spec.VALIDATOR_REGISTRY_LIMIT, 1, MAX_UINT_64)
    check_bound(spec.MAX_COMMITTEES_PER_SLOT, 1, MAX_UINT_64)
    check_bound(spec.TARGET_COMMITTEE_SIZE, 1, MAX_UINT_64)
    check_bound(spec.VALIDATOR_REGISTRY_LIMIT, 1, UINT64_MAX)
    check_bound(spec.MAX_COMMITTEES_PER_SLOT, 1, UINT64_MAX)
    check_bound(spec.TARGET_COMMITTEE_SIZE, 1, UINT64_MAX)

    # Note: can be less if you assume stricter bounds on validator set based on total ETH supply
    maximum_validators_per_committee = (
@@ -30,24 +30,24 @@ def test_validators(spec, state):
    check_bound(spec.config.MIN_PER_EPOCH_CHURN_LIMIT, 1, spec.VALIDATOR_REGISTRY_LIMIT)
    check_bound(spec.config.CHURN_LIMIT_QUOTIENT, 1, spec.VALIDATOR_REGISTRY_LIMIT)

    check_bound(spec.config.MIN_GENESIS_ACTIVE_VALIDATOR_COUNT, spec.TARGET_COMMITTEE_SIZE, MAX_UINT_64)
    check_bound(spec.config.MIN_GENESIS_ACTIVE_VALIDATOR_COUNT, spec.TARGET_COMMITTEE_SIZE, UINT64_MAX)


@with_all_phases
@spec_state_test
def test_balances(spec, state):
    assert spec.MAX_EFFECTIVE_BALANCE % spec.EFFECTIVE_BALANCE_INCREMENT == 0
    check_bound(spec.MIN_DEPOSIT_AMOUNT, 1, MAX_UINT_64)
    check_bound(spec.MAX_EFFECTIVE_BALANCE, spec.MIN_DEPOSIT_AMOUNT, MAX_UINT_64)
    check_bound(spec.MAX_EFFECTIVE_BALANCE, spec.EFFECTIVE_BALANCE_INCREMENT, MAX_UINT_64)
    check_bound(spec.MIN_DEPOSIT_AMOUNT, 1, UINT64_MAX)
    check_bound(spec.MAX_EFFECTIVE_BALANCE, spec.MIN_DEPOSIT_AMOUNT, UINT64_MAX)
    check_bound(spec.MAX_EFFECTIVE_BALANCE, spec.EFFECTIVE_BALANCE_INCREMENT, UINT64_MAX)


@with_all_phases
@spec_state_test
def test_hysteresis_quotient(spec, state):
    check_bound(spec.HYSTERESIS_QUOTIENT, 1, MAX_UINT_64)
    check_bound(spec.HYSTERESIS_QUOTIENT, 1, UINT64_MAX)
    check_bound(spec.HYSTERESIS_DOWNWARD_MULTIPLIER, 1, spec.HYSTERESIS_QUOTIENT)
    check_bound(spec.HYSTERESIS_UPWARD_MULTIPLIER, spec.HYSTERESIS_QUOTIENT, MAX_UINT_64)
    check_bound(spec.HYSTERESIS_UPWARD_MULTIPLIER, spec.HYSTERESIS_QUOTIENT, UINT64_MAX)


@with_all_phases
@@ -68,7 +68,7 @@ def test_time(spec, state):
    assert spec.SLOTS_PER_EPOCH <= spec.SLOTS_PER_HISTORICAL_ROOT
    assert spec.MIN_SEED_LOOKAHEAD < spec.MAX_SEED_LOOKAHEAD
    assert spec.SLOTS_PER_HISTORICAL_ROOT % spec.SLOTS_PER_EPOCH == 0
    check_bound(spec.SLOTS_PER_HISTORICAL_ROOT, spec.SLOTS_PER_EPOCH, MAX_UINT_64)
    check_bound(spec.SLOTS_PER_HISTORICAL_ROOT, spec.SLOTS_PER_EPOCH, UINT64_MAX)
    check_bound(spec.MIN_ATTESTATION_INCLUSION_DELAY, 1, spec.SLOTS_PER_EPOCH)
@@ -18,6 +18,7 @@ from eth2spec.test.altair.transition import (
)
from eth2spec.test.deneb.transition import (
    test_operations as test_deneb_operations,
    test_transition as test_deneb_transition,
)


@@ -47,6 +48,7 @@ if __name__ == "__main__":
        test_altair_slashing,
        test_altair_operations,
        test_deneb_operations,
        test_deneb_transition,
    )
    for transition_test_module in all_tests:
        for pre_fork, post_fork in ALL_PRE_POST_FORKS: