updating readme and dependencies

parithosh 2024-08-13 12:56:03 +02:00
parent 13ac373a2c
commit c22105d5b3
No known key found for this signature in database
GPG Key ID: 5636BC0E08138A24
3 changed files with 20 additions and 4 deletions


@@ -73,3 +73,19 @@ Documentation on the different components used during spec writing can be found
 ## Consensus spec tests

 Conformance tests built from the executable python spec are available in the [Ethereum Proof-of-Stake Consensus Spec Tests](https://github.com/ethereum/consensus-spec-tests) repo. Compressed tarballs are available in [releases](https://github.com/ethereum/consensus-spec-tests/releases).
+
+## Installation and Usage
+
+The consensus-specs repo can be used by running the tests locally or inside a docker container.
+
+To run the tests locally:
+- Clone the repository with `git clone https://github.com/ethereum/consensus-specs.git`
+- Switch to the directory with `cd consensus-specs`
+- Install the dependencies with `make install_test && make preinstallation && make pyspec`
+- Run the tests with `make citest`
+
+To run the tests inside a docker container:
+- Switch to the directory with `cd scripts`
+- Run the script `./build_run_docker_tests.sh`
+- Find the results in a folder called `./testResults`
+- Find more ways to customize the script with `./build_run_docker_tests.sh --h`
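Read as plain shell, the newly added local-run steps boil down to the sequence below. This is just a convenience restatement of the bullets in the hunk above; the `make` targets are exactly those named in the new README text.

```bash
# Fetch the spec repo and enter it
git clone https://github.com/ethereum/consensus-specs.git
cd consensus-specs

# Install the test dependencies and build the executable pyspec
make install_test && make preinstallation && make pyspec

# Run the test suite locally
make citest
```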


@@ -13,8 +13,8 @@ Ideally manual running of docker containers is for advanced users, we recommend
 The `scripts/build_run_docker_tests.sh` script will cover most use cases. The script allows the user to configure the fork (altair/bellatrix/capella..), `$IMAGE_NAME` (specifies the container to use), the preset type (mainnet/minimal), and a test-all-forks flag. Ideally, this is the main way that users interact with the spec tests instead of running them locally with varying versions of dependencies.

 E.g:
-- `./build_run_test.sh --p mainnet` will run the mainnet preset tests
-- `./build_run_test.sh --a` will run all the tests across all the forks
-- `./build_run_test.sh --f deneb` will only run deneb tests
+- `./build_run_docker_tests.sh --p mainnet` will run the mainnet preset tests
+- `./build_run_docker_tests.sh --a` will run all the tests across all the forks
+- `./build_run_docker_tests.sh --f deneb` will only run deneb tests

 Results are always placed in a folder called `./testResults`. The results are `.xml` files and contain the fork they represent and the date/time they were run at.
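For illustration, a typical invocation after the script-name fix above would look like the sketch below. The exact report filenames are not shown in this diff, only that they are `.xml` files tagged with the fork and the run date/time.

```bash
# Build the image and run only the deneb tests inside the container
cd scripts
./build_run_docker_tests.sh --f deneb

# Reports are written to ./testResults as .xml files named by fork and run date/time
ls ./testResults
```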


@@ -549,7 +549,7 @@ setup(
     install_requires=[
         "eth-utils>=2.0.0,<3",
         "eth-typing>=3.2.0,<4.0.0",
-        "pycryptodome==3.15.0",
+        "pycryptodome>=3.19.1",
         "py_ecc==6.0.0",
         "milagro_bls_binding==1.9.0",
         "remerkleable==0.1.28",