Tests seem to pass only when running one at a time #1339

Open
tstokes8040 opened this issue Jul 8, 2021 · 10 comments

@tstokes8040

We are running tests 5 at a time using asyncCaptureLimit: 5. This causes tests to fail almost randomly (it is not the same tests failing every time). The reports show differences of roughly 5-10 pixels on the pages, yet when we look at the pages in the browser nothing has visibly changed. When we run the tests one by one manually, they pass, and more of them pass when I set asyncCaptureLimit to 2. Any idea why this is happening? It is very inconsistent and honestly a real headache for our dev team, to the point where we assume the regression tests will always fail because of how unreliable they have become.
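For reference, a minimal sketch of the capture settings in question (the scenario list, URLs, and file name are placeholders, not our real config):

```js
// backstop.config.js - minimal sketch; scenarios and URLs are placeholders
module.exports = {
  id: 'regression_suite',
  viewports: [{ label: 'desktop', width: 1280, height: 800 }],
  scenarios: [
    { label: 'home', url: 'http://localhost:3000/' }
    // ...the rest of the scenarios
  ],
  engine: 'puppeteer',
  asyncCaptureLimit: 5,  // 5 concurrent captures -> random diffs; 2 fails less often, 1 passes
  asyncCompareLimit: 50,
  report: ['browser']
};
```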

@garris
Owner

garris commented Jul 9, 2021

What environment are you running this in? Sometimes this kind of problem can be resolved by running it on a machine with loads of RAM.

@mirzazeyrek
Contributor

mirzazeyrek commented Jul 9, 2021

What I can suggest is to run the test twice if it fails, and notify the dev team only if it fails both times.
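Something along these lines, using the programmatic API (a rough sketch; the config path is whatever you normally pass):

```js
// retry once before treating the run as a real failure (rough sketch)
const backstop = require('backstopjs');

async function testWithRetry(retries = 1) {
  try {
    await backstop('test', { config: './backstop.json' });
  } catch (err) {
    if (retries > 0) {
      console.warn('backstop test failed, retrying once before notifying anyone...');
      return testWithRetry(retries - 1);
    }
    throw err;
  }
}

testWithRetry().catch(() => process.exit(1));
```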

@tstokes8040
Author

What environment are you running this in? Sometimes this kind of problem can be resolved by running it on a machine with loads of RAM.

Locally with 32GB of RAM on my Mac.

@tstokes8040
Author

notify the dev team only if it fails both times

It is odd because we do run it twice, and sometimes different tests fail while some that failed last time pass, and vice versa. It is very inconsistent.

@mirzazeyrek
Contributor

If the same tests fail two times in a row, there could be an issue with the loading order of scripts.

It's also possible that it's a real bug that happens randomly. Usually it's difficult to catch such issues.

@fgerards

Same here - running 5.3.latest on an Apple M1 Mac with 16GB RAM, with Chromium installed via phpbrew. Could the Chromium version be a problem? All screenshots are offset 10-15px vertically. I even tried to fake some mouse scrolling downwards and back, but it didn't change anything.
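Roughly what I mean by faking the scrolling, as an onReady script (simplified sketch; the onReadyScript wiring in the scenario config is omitted):

```js
// backstop_data/engine_scripts/puppet/onReady.js - sketch of the scroll workaround
module.exports = async (page, scenario, viewport) => {
  // scroll to the bottom and back to the top before capture
  await page.evaluate(() => window.scrollTo(0, document.body.scrollHeight));
  await page.waitForTimeout(500);
  await page.evaluate(() => window.scrollTo(0, 0));
  await page.waitForTimeout(500);
};
```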

@garris
Owner

garris commented Jul 13, 2021

Hmmm 🤔,

  1. Are you able to test whether running on older Mac hardware behaves as expected?

  2. Do older versions of BackstopJS behave as expected?

  3. Are you able to test whether using BackstopJS with Docker rendering behaves as expected?

@fgerards

ad 2): v5.1.latest doesn't behave like that
ad 1): a colleague with a Linux (Ubuntu) laptop also has this problem on the latest version - maybe it's due to a bug in Chromium/Puppeteer for Linux/Mac?
ad 3): what do you mean? Running BackstopJS inside a Docker container? The target instances I am testing are all Docker-based...
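Something like this, perhaps? (Just a guess at what is meant, via the programmatic API; I'm assuming the docker option mirrors the `backstop test --docker` CLI flag.)

```js
// rough sketch: render screenshots inside BackstopJS's Docker container
const backstop = require('backstopjs');

backstop('test', {
  docker: true,              // assumption: mirrors the --docker CLI flag
  config: './backstop.json'  // whatever config we normally pass
}).catch(() => process.exit(1));
```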

@garris
Owner

garris commented Jul 23, 2021

I am wondering if this puppeteer bump may be the issue...
0379221

@r-oldenburg

Could this be related to #1344?
