
This repository contains the UI tests for the Status Desktop application.

## How to set up your environment

  1. macOS: https://www.notion.so/Mac-arch-x64-and-Intel-50ea48dae1d4481b882afdbfad38e95a
  2. Linux: https://www.notion.so/Linux-21f7abd2bb684a0fb10057848760a889
  3. Windows: https://www.notion.so/Windows-fbccd2b09b784b32ba4174233d83878d

NOTE: the macOS and Linux guides are proven to work; the Windows guide may be outdated (no one has set up Windows yet).

## Which build to use

  1. You can use your local dev build, but tests sometimes hang with it. To use it, set `AUT_PATH` in your `_local.py` config to the path of the executable, for example `AUT_PATH = "/Users/anastasiya/status-desktop/bin/nim_status_client"` (see the config sketch after this list).

  2. Normally, please use a CI build. Grab a recent one from the Jenkins job https://ci.status.im/job/status-desktop/job/nightly/

    2.1 Linux and Windows builds can be taken from the nightly job (see img.png).

    2.2 Mac builds require Squish entitlements, which we don't add by default, so go to https://ci.status.im/job/status-desktop/job/systems/job/macos/, select the architecture you need (ARM or Intel), click "Build with parameters" and select Squish entitlements. Select a branch if you like (master is the default); see img_1.png.
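
A minimal sketch of the local override described in item 1, assuming `_local.py` lives in the `configs` package (check that package for the authoritative layout); `AUT_PATH` and the example path come from the note above:

```python
# configs/_local.py -- local, untracked overrides (location assumed; see the configs package)

# Path to the application under test: either your local dev build
# or an unpacked CI build downloaded from Jenkins.
AUT_PATH = "/Users/anastasiya/status-desktop/bin/nim_status_client"
```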

## Pytest marks used

You can run tests by mark; use it like this on the command line:

`python3 -m pytest -m critical`

or directly in the PyCharm terminal:

`pytest -m critical`

You can obtain the list of all available marks by running `pytest --markers`.
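
Marks can also be combined using standard pytest mark expressions; for example, `pytest -m "critical and not flaky"` runs the critical set while excluding tests marked flaky (an illustrative combination, not a project convention).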

  - critical: used to select the most important checks we run for PRs in the desktop repository (the same set is used for PRs in this repo)
  - xfail: used to link tests to existing tickets in desktop; if such a test fails, it is marked as expected to fail in the report with a reference to the ticket. If it passes instead, it is shown as XPASS (unexpectedly passing) in the report, which indicates the original bug is gone
  - skip: used to simply skip tests for various reasons, normally with a ticket linked
  - flaky: used to mark tests that normally pass but sometimes fail. If such a test passes, it is shown as passed in the report as usual; if it fails, the overall run does not fail, but the test is marked as xfail in the report. This is done for a few tests that are not fully stable yet but pass most of the time. Use this mark with caution and only when really needed
  - timeout(timeout=180, method="thread"): used to catch excessively long test runs, such as deadlocked or hanging tests. This is handled by the pytest-timeout plugin
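
A minimal sketch, assuming standard pytest usage, of how these marks might look in a test module (the test names, ticket link, and scenarios are illustrative, not taken from this repo):

```python
import pytest


@pytest.mark.critical
@pytest.mark.timeout(timeout=180, method="thread")  # abort if the test hangs for more than 3 minutes
def test_user_can_sign_in():
    ...  # the actual check goes here


# Known product bug: the failure is reported as XFAIL with the linked ticket;
# once the bug is fixed, the test shows up as XPASS.
@pytest.mark.xfail(reason="link to the desktop issue")
def test_broken_by_known_issue():
    ...


# Not fully stable yet: a failure is reported as xfail instead of failing the run.
@pytest.mark.flaky
def test_occasionally_unstable():
    ...
```

Custom marks such as critical and flaky have to be registered (presumably in this repo's pytest.ini) so pytest does not warn about unknown marks.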