DEBUG  tests.conftest:conftest.py:51 Running fixture setup: test_id
DEBUG  tests.conftest:conftest.py:57 Running test: test_publish_with_valid_meta with id: 2025-12-24_04-18-48__e607ec6b-ce26-4e71-a0f2-f2e00b50e6b3
DEBUG  src.steps.common:common.py:19 Running fixture setup: common_setup
DEBUG  src.steps.relay:relay.py:28 Running fixture setup: relay_setup
DEBUG  src.steps.relay:relay.py:34 Running fixture setup: setup_main_relay_nodes
DEBUG  src.node.docker_mananger:docker_mananger.py:19 Docker client initialized with image wakuorg/nwaku:latest
DEBUG  src.node.waku_node:waku_node.py:86 WakuNode instance initialized with log path ./log/docker/node1_2025-12-24_04-18-48__e607ec6b-ce26-4e71-a0f2-f2e00b50e6b3__wakuorg_nwaku:latest.log
DEBUG  src.node.waku_node:waku_node.py:90 Starting Node...
DEBUG  src.node.docker_mananger:docker_mananger.py:22 Attempting to create or retrieve network waku
DEBUG  src.node.docker_mananger:docker_mananger.py:25 Network waku already exists
DEBUG  src.node.docker_mananger:docker_mananger.py:108 Generated random external IP 172.18.74.69
DEBUG  src.node.docker_mananger:docker_mananger.py:101 Generated ports ['28720', '28721', '28722', '28723', '28724']
DEBUG  src.node.waku_node:waku_node.py:439 RLN credentials were not set
INFO  src.node.waku_node:waku_node.py:176 RLN credentials not set or credential store not available, starting without RLN
DEBUG  src.node.waku_node:waku_node.py:178 Using volumes []
DEBUG  src.node.docker_mananger:docker_mananger.py:49 docker run -i -t -p 28720:28720 -p 28721:28721 -p 28722:28722 -p 28723:28723 -p 28724:28724 wakuorg/nwaku:latest --listen-address=0.0.0.0 --rest=true --rest-admin=true --websocket-support=true --log-level=TRACE --rest-relay-cache-capacity=100 --websocket-port=28722 --rest-port=28720 --tcp-port=28721 --discv5-udp-port=28723 --rest-address=0.0.0.0 --nat=extip:172.18.74.69 --peer-exchange=true --discv5-discovery=true --cluster-id=3 --nodekey=73af02c993758b1cf6eec3b95bc81bafb8fbdaebc5a5018fdd6db01bbbde7ae4 --shard=0 --metrics-server=true --metrics-server-address=0.0.0.0 --metrics-server-port=28724 --metrics-logging=true --relay=true
DEBUG  src.node.docker_mananger:docker_mananger.py:55 docker network connect --ip 172.18.74.69 waku 26a9c6dca5520c0049a12f5a569981ecf0a3d957a1049479787e082bffeb06b6
DEBUG  src.node.docker_mananger:docker_mananger.py:58 Container started with ID 26a9c6dca552. Setting up logs at ./log/docker/node1_2025-12-24_04-18-48__e607ec6b-ce26-4e71-a0f2-f2e00b50e6b3__wakuorg_nwaku:latest.log
DEBUG  src.node.waku_node:waku_node.py:190 Started container from image wakuorg/nwaku:latest. REST: 28720
DEBUG  src.libs.common:common.py:47 Sleeping for 1 seconds
ERROR  src.node.docker_mananger:docker_mananger.py:89 Max retries reached for container eafbdc807e31. Exiting log stream.
ERROR  src.node.docker_mananger:docker_mananger.py:89 Max retries reached for container 158ce85c2d3b. Exiting log stream.
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:28720/health" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'{"nodeHealth":"READY","protocolsHealth":[{"Relay":"NOT_READY","desc":"No connected peers"},{"Rln Relay":"NOT_MOUNTED"},{"Lightpush":"NOT_MOUNTED"},{"Legacy Lightpush":"NOT_MOUNTED"},{"Filter":"NOT_MOUNTED"},{"Store":"NOT_MOUNTED"},{"Legacy Store":"NOT_MOUNTED"},{"Peer Exchange":"READY"},{"Rendezvous":"NOT_READY","desc":"No Rendezvous peers are available yet"},{"Mix":"NOT_MOUNTED"},{"Lightpush Client":"NOT_READY","desc":"No Lightpush service peer available yet"},{"Legacy Lightpush Client":"NOT_READY","desc":"No Lightpush service peer available yet"},{"Store Client":"NOT_READY","desc":"No Store service peer available yet, neither Store service set up for the node"},{"Legacy Store Client":"NOT_READY","desc":"No Legacy Store service peers are available yet, neither Store service set up for the node"},{"Filter Client":"NOT_READY","desc":"No Filter service peer available yet"}]}'
INFO  src.node.waku_node:waku_node.py:287 Node protocols are initialized !!
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:28720/debug/v1/info" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'{"listenAddresses":["/ip4/172.18.74.69/tcp/28721/p2p/16Uiu2HAmF6GPmVvHZg95hH2RCtPmRyBwoPHRqCL6M7ewPhn8shBr","/ip4/172.18.74.69/tcp/28722/ws/p2p/16Uiu2HAmF6GPmVvHZg95hH2RCtPmRyBwoPHRqCL6M7ewPhn8shBr"],"enrUri":"enr:-L24QPeILpcN1aZ2e6_OIe_K_myjPVT8rvIIqwkvSMcMOVMcdVJZIXZv9CWqiEpJYDVSx9fHSsXKVyqViejDVbNtYLkCgmlkgnY0gmlwhKwSSkWKbXVsdGlhZGRyc5YACASsEkpFBnAxAAoErBJKRQZwMt0DgnJzhQADAQAAiXNlY3AyNTZrMaEDJDEInTLtI--MRbd3nUTdgiGNK4c5aFyQxpqZpxzENXWDdGNwgnAxg3VkcIJwM4V3YWt1MgE"}'
INFO  src.node.waku_node:waku_node.py:292 REST service is ready !!
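
For reference, the readiness check logged above (poll /health until nodeHealth is READY, then read /debug/v1/info) can be reproduced with a minimal Python sketch like the one below. The port and field names come from this run's log; the requests-based helper is purely illustrative and not part of the test suite.

    import time
    import requests

    REST_PORT = 28720  # node1's REST port from this run; adjust for your own node

    def wait_until_ready(port, timeout=30):
        """Poll /health until the node reports READY, then return its listen addresses and ENR."""
        deadline = time.time() + timeout
        while time.time() < deadline:
            try:
                health = requests.get(f"http://127.0.0.1:{port}/health", timeout=5).json()
                if health.get("nodeHealth") == "READY":
                    info = requests.get(f"http://127.0.0.1:{port}/debug/v1/info", timeout=5).json()
                    return info["listenAddresses"], info["enrUri"]
            except requests.RequestException:
                pass  # node may still be starting up
            time.sleep(1)
        raise TimeoutError(f"node on port {port} did not become READY")

    addresses, enr = wait_until_ready(REST_PORT)
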
DEBUG  src.node.docker_mananger:docker_mananger.py:19 Docker client initialized with image wakuorg/nwaku:latest
DEBUG  src.node.waku_node:waku_node.py:86 WakuNode instance initialized with log path ./log/docker/node2_2025-12-24_04-18-48__e607ec6b-ce26-4e71-a0f2-f2e00b50e6b3__wakuorg_nwaku:latest.log
DEBUG  src.node.waku_node:waku_node.py:90 Starting Node...
DEBUG  src.node.docker_mananger:docker_mananger.py:22 Attempting to create or retrieve network waku
DEBUG  src.node.docker_mananger:docker_mananger.py:25 Network waku already exists
DEBUG  src.node.docker_mananger:docker_mananger.py:108 Generated random external IP 172.18.176.189
DEBUG  src.node.docker_mananger:docker_mananger.py:101 Generated ports ['29160', '29161', '29162', '29163', '29164']
DEBUG  src.node.waku_node:waku_node.py:439 RLN credentials were not set
INFO  src.node.waku_node:waku_node.py:176 RLN credentials not set or credential store not available, starting without RLN
DEBUG  src.node.waku_node:waku_node.py:178 Using volumes []
DEBUG  src.node.docker_mananger:docker_mananger.py:49 docker run -i -t -p 29160:29160 -p 29161:29161 -p 29162:29162 -p 29163:29163 -p 29164:29164 wakuorg/nwaku:latest --listen-address=0.0.0.0 --rest=true --rest-admin=true --websocket-support=true --log-level=TRACE --rest-relay-cache-capacity=100 --websocket-port=29162 --rest-port=29160 --tcp-port=29161 --discv5-udp-port=29163 --rest-address=0.0.0.0 --nat=extip:172.18.176.189 --peer-exchange=true --discv5-discovery=true --cluster-id=3 --nodekey=da434e811934e8f6a3e2059d9db5c5ac4269f1b9de54b3b50946519f0dade31d --shard=0 --metrics-server=true --metrics-server-address=0.0.0.0 --metrics-server-port=29164 --metrics-logging=true --relay=true --discv5-bootstrap-node=enr:-L24QPeILpcN1aZ2e6_OIe_K_myjPVT8rvIIqwkvSMcMOVMcdVJZIXZv9CWqiEpJYDVSx9fHSsXKVyqViejDVbNtYLkCgmlkgnY0gmlwhKwSSkWKbXVsdGlhZGRyc5YACASsEkpFBnAxAAoErBJKRQZwMt0DgnJzhQADAQAAiXNlY3AyNTZrMaEDJDEInTLtI--MRbd3nUTdgiGNK4c5aFyQxpqZpxzENXWDdGNwgnAxg3VkcIJwM4V3YWt1MgE
DEBUG  src.node.docker_mananger:docker_mananger.py:55 docker network connect --ip 172.18.176.189 waku 34c747fee5c0f799d8557be043f9389bc9c5b0568a6e9671ec2ac8e64110b065
DEBUG  src.node.docker_mananger:docker_mananger.py:58 Container started with ID 34c747fee5c0. Setting up logs at ./log/docker/node2_2025-12-24_04-18-48__e607ec6b-ce26-4e71-a0f2-f2e00b50e6b3__wakuorg_nwaku:latest.log
DEBUG  src.node.waku_node:waku_node.py:190 Started container from image wakuorg/nwaku:latest. REST: 29160
DEBUG  src.libs.common:common.py:47 Sleeping for 1 seconds
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:29160/health" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'{"nodeHealth":"READY","protocolsHealth":[{"Relay":"NOT_READY","desc":"No connected peers"},{"Rln Relay":"NOT_MOUNTED"},{"Lightpush":"NOT_MOUNTED"},{"Legacy Lightpush":"NOT_MOUNTED"},{"Filter":"NOT_MOUNTED"},{"Store":"NOT_MOUNTED"},{"Legacy Store":"NOT_MOUNTED"},{"Peer Exchange":"READY"},{"Rendezvous":"NOT_READY","desc":"No Rendezvous peers are available yet"},{"Mix":"NOT_MOUNTED"},{"Lightpush Client":"NOT_READY","desc":"No Lightpush service peer available yet"},{"Legacy Lightpush Client":"NOT_READY","desc":"No Lightpush service peer available yet"},{"Store Client":"NOT_READY","desc":"No Store service peer available yet, neither Store service set up for the node"},{"Legacy Store Client":"NOT_READY","desc":"No Legacy Store service peers are available yet, neither Store service set up for the node"},{"Filter Client":"NOT_READY","desc":"No Filter service peer available yet"}]}'
INFO  src.node.waku_node:waku_node.py:287 Node protocols are initialized !!
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:29160/debug/v1/info" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'{"listenAddresses":["/ip4/172.18.176.189/tcp/29161/p2p/16Uiu2HAm6fbWfku8RMxaMcRcxxcigDAQkxKtDh1NefVd7eULLy4i","/ip4/172.18.176.189/tcp/29162/ws/p2p/16Uiu2HAm6fbWfku8RMxaMcRcxxcigDAQkxKtDh1NefVd7eULLy4i"],"enrUri":"enr:-L24QP1tc_Cc3eTVPdk2k5twxIADpH4XgVIx86A2UIVNBH8SdeMtfZfyckHlKJ1Uvz0y3-JMTOKGt4MNjkwUupP_YDMCgmlkgnY0gmlwhKwSsL2KbXVsdGlhZGRyc5YACASsErC9BnHpAAoErBKwvQZx6t0DgnJzhQADAQAAiXNlY3AyNTZrMaECpwGiyz74FVC4Yu0dznUS01Z2yHtvqAOtzhorOZuU4_-DdGNwgnHpg3VkcIJx64V3YWt1MgE"}'
INFO  src.node.waku_node:waku_node.py:292 REST service is ready !!
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X POST "http://127.0.0.1:29160/admin/v1/peers" -H "Content-Type: application/json" -d '["/ip4/172.18.74.69/tcp/28721/p2p/16Uiu2HAmF6GPmVvHZg95hH2RCtPmRyBwoPHRqCL6M7ewPhn8shBr"]'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'OK'
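
The peering step above, where node2 is asked to dial node1 via POST /admin/v1/peers, can be sketched as follows. The multiaddr is read from node1's /debug/v1/info response and the REST ports are the ones generated in this run; the snippet is illustrative rather than code from the suite.

    import requests

    NODE1_REST = 28720  # ports from this run; placeholders for other runs
    NODE2_REST = 29160

    # Read node1's plain TCP multiaddr from its debug info, then ask node2 to dial it.
    info = requests.get(f"http://127.0.0.1:{NODE1_REST}/debug/v1/info").json()
    tcp_addr = next(a for a in info["listenAddresses"] if "/ws/" not in a)
    resp = requests.post(f"http://127.0.0.1:{NODE2_REST}/admin/v1/peers", json=[tcp_addr])
    assert resp.status_code == 200, resp.text
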
DEBUG  src.steps.relay:relay.py:59 Running fixture setup: subscribe_main_relay_nodes
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X POST "http://127.0.0.1:28720/relay/v1/subscriptions" -H "Content-Type: application/json" -d '["/waku/2/rs/3/1"]'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'OK'
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X POST "http://127.0.0.1:29160/relay/v1/subscriptions" -H "Content-Type: application/json" -d '["/waku/2/rs/3/1"]'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'OK'
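
Both subscription calls follow the same pattern: POST the pubsub topic as a JSON array to /relay/v1/subscriptions on each node. A minimal illustrative sketch, using the topic and ports from this run:

    import requests

    PUBSUB_TOPIC = "/waku/2/rs/3/1"  # pubsub topic used in this test run

    # Subscribe both nodes to the same pubsub topic so relay can propagate messages between them.
    for port in (28720, 29160):      # REST ports from this run
        resp = requests.post(
            f"http://127.0.0.1:{port}/relay/v1/subscriptions",
            json=[PUBSUB_TOPIC],
        )
        assert resp.status_code == 200, resp.text
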
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X POST "http://127.0.0.1:28720/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d '{"payload": "UmVsYXkgd29ya3MhIQ==", "contentTopic": "/test/1/waku-relay/proto", "timestamp": '$(date +%s%N)'}'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'OK'
DEBUG  src.libs.common:common.py:47 Sleeping for 0.1 seconds
DEBUG  src.steps.relay:relay.py:123 Checking that peer NODE_1:wakuorg/nwaku:latest can find the published message
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:28720/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'[{"payload":"UmVsYXkgd29ya3MhIQ==","contentTopic":"/test/1/waku-relay/proto","version":0,"timestamp":1766549931312376832,"ephemeral":false,"proof":""}]'
DEBUG  src.steps.relay:relay.py:123 Checking that peer NODE_2:wakuorg/nwaku:latest can find the published message
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:29160/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'[{"payload":"UmVsYXkgd29ya3MhIQ==","contentTopic":"/test/1/waku-relay/proto","version":0,"timestamp":1766549931312376832,"ephemeral":false,"proof":""}]'
INFO  src.steps.relay:relay.py:71 WARM UP successful!!
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X POST "http://127.0.0.1:28720/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d '{"payload": "UmVsYXkgd29ya3MhIQ==", "contentTopic": "/test/1/waku-relay/proto", "timestamp": '$(date +%s%N)', "meta": "UmVsYXkgd29ya3MhIQ=="}'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'OK'
DEBUG  src.libs.common:common.py:47 Sleeping for 0.1 seconds
DEBUG  src.steps.relay:relay.py:123 Checking that peer NODE_1:wakuorg/nwaku:latest can find the published message
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:28720/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'[{"payload":"UmVsYXkgd29ya3MhIQ==","contentTopic":"/test/1/waku-relay/proto","version":0,"timestamp":1766549931427531084,"meta":"UmVsYXkgd29ya3MhIQ==","ephemeral":false,"proof":""}]'
DEBUG  src.steps.relay:relay.py:123 Checking that peer NODE_2:wakuorg/nwaku:latest can find the published message
INFO  src.node.api_clients.base_client:base_client.py:37 curl -v -X GET "http://127.0.0.1:29160/relay/v1/messages/%2Fwaku%2F2%2Frs%2F3%2F1" -H "Content-Type: application/json" -d 'None'
INFO  src.node.api_clients.base_client:base_client.py:22 Response status code: 200. Response content: b'[{"payload":"UmVsYXkgd29ya3MhIQ==","contentTopic":"/test/1/waku-relay/proto","version":0,"timestamp":1766549931427531084,"meta":"UmVsYXkgd29ya3MhIQ==","ephemeral":false,"proof":""}]'
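
The final round trip, publishing a message that carries a meta field through node1 and checking that both peers return it from /relay/v1/messages, can be sketched roughly as below. The base64-encoded payload/meta, the nanosecond timestamp, and the URL-encoded topic match the curl calls in the log; the requests usage and variable names are illustrative assumptions.

    import base64
    import time
    import requests
    from urllib.parse import quote

    PUBSUB_TOPIC = "/waku/2/rs/3/1"
    ENCODED_TOPIC = quote(PUBSUB_TOPIC, safe="")  # /relay/v1/messages expects a URL-encoded topic
    NODE_PORTS = (28720, 29160)                   # REST ports from this run

    message = {
        "payload": base64.b64encode(b"Relay works!!").decode(),  # "UmVsYXkgd29ya3MhIQ==" in the log
        "contentTopic": "/test/1/waku-relay/proto",
        "timestamp": time.time_ns(),                             # nanosecond timestamp, as in the curl call
        "meta": base64.b64encode(b"Relay works!!").decode(),
    }

    # Publish through node1, wait briefly for gossip to propagate, then verify on both peers.
    resp = requests.post(f"http://127.0.0.1:{NODE_PORTS[0]}/relay/v1/messages/{ENCODED_TOPIC}", json=message)
    assert resp.status_code == 200, resp.text
    time.sleep(0.1)

    for port in NODE_PORTS:
        stored = requests.get(f"http://127.0.0.1:{port}/relay/v1/messages/{ENCODED_TOPIC}").json()
        assert any(m.get("meta") == message["meta"] for m in stored)
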
DEBUG  tests.conftest:conftest.py:59 Running fixture teardown: test_setup
DEBUG  tests.conftest:conftest.py:83 Running fixture teardown: close_open_nodes
DEBUG  src.node.waku_node:waku_node.py:234 Stopping container with id 26a9c6dca552
DEBUG  src.node.waku_node:waku_node.py:241 Container stopped.
DEBUG  src.node.waku_node:waku_node.py:234 Stopping container with id 34c747fee5c0
DEBUG  src.node.waku_node:waku_node.py:241 Container stopped.
DEBUG  tests.conftest:conftest.py:98 Running fixture teardown: check_waku_log_errors
DEBUG  src.node.docker_mananger:docker_mananger.py:144 No errors found in the waku logs.
DEBUG  src.node.docker_mananger:docker_mananger.py:144 No errors found in the waku logs.