
tests: setup five pg versions for testing #290

Closed
wants to merge 1 commit into from

Conversation

hunleyd (Collaborator) commented Jun 12, 2022

SUMMARY

Alter the test matrix to specify the PG versions as discussed in #276

ISSUE TYPE
  • Feature Pull Request
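
For illustration, a hypothetical Azure Pipelines fragment with one integration job per PG version might look like the sketch below; this is not the collection's actual azure-pipelines.yml, and the pg_version variable and the step are placeholders.

```yaml
# Hypothetical AZP fragment: a matrix that fans out one job per PostgreSQL version.
jobs:
  - job: integration
    strategy:
      matrix:
        pg10: { pg_version: '10' }
        pg11: { pg_version: '11' }
        pg12: { pg_version: '12' }
        pg13: { pg_version: '13' }
        pg14: { pg_version: '14' }
    steps:
      - script: echo "ansible-test integration would run here against PG $(pg_version)"
```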

hunleyd self-assigned this Jun 12, 2022
hunleyd (Collaborator, Author) commented Jun 12, 2022

Hey @Andersson007... this is nowhere near complete, but I have questions for you:

  1. does this approach make sense to you (installing all 5 versions in the test instance at once)?
  2. can I remove tests/integration/targets/*/vars/* for OSes that are not listed in the Azure pipeline?

Those will get me started. I'm sure I'll be back with more :)

Andersson007 (Collaborator) commented:

> Hey @Andersson007... this is nowhere near complete, but I have questions for you:
>
>   1. does this approach make sense to you (installing all 5 versions in the test instance at once)?
>   2. can I remove tests/integration/targets/*/vars/* for OSes that are not listed in the Azure pipeline?
>
> Those will get me started. I'm sure I'll be back with more :)

@hunleyd, first of all, thank you for working on this!

>   1. does this approach make sense to you (installing all 5 versions in the test instance at once)?

I'm not sure I fully understand what the implementation plan is.
Could you please explain it?
IMO, ideally it should look like what was done in ansible-collections/community.mysql#370.
If you open the test section, you'll see entries like Integration (Python: 3.8, Ansible: stable-2.12, MySQL: mariadb_10.8.3, Connector: mysqlclient==2.0.1).

  • So there we see separate tasks for each version of MySQL and the connector. I think it's very convenient to have them separate.
  • I'm not sure if this is achievable in AZP.
  • GHA is not an option if we don't find a way to run the tests using diff, i.e. only tests for changed modules should run, not all of them. Currently the system is overloaded, as most of the collections under the ansible-collections org run all their tests against every commit. If we switch to GHA, our test run will last up to a couple of hours...

Of course the implementation is open for discussion; I just can't picture right now how it'll look.
I find the GHA implementation in c.mysql very convenient; the question is whether it's possible to do something similar with AZP, OR how to run targets in GHA selectively - only for changed modules (and their unit/integration tests), not all the targets.
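
For readers unfamiliar with that layout, job labels like the one quoted above typically come from a matrix in the workflow file. The sketch below is a hypothetical GitHub Actions fragment adapted to PostgreSQL; it is not copied from c.mysql's actual workflow, and every version and step in it is a placeholder.

```yaml
# Hypothetical GHA fragment producing one labelled job per combination.
name: Integration tests
on:
  pull_request:

jobs:
  integration:
    name: "Integration (Python: ${{ matrix.python }}, Ansible: ${{ matrix.ansible }}, PG: ${{ matrix.postgres }}, Connector: ${{ matrix.connector }})"
    runs-on: ubuntu-20.04
    strategy:
      fail-fast: false
      matrix:
        python: ["3.8"]
        ansible: ["stable-2.12"]
        postgres: ["10", "11", "12", "13", "14"]
        connector: ["psycopg2==2.8.6"]
    steps:
      - uses: actions/checkout@v3
      # Each matrix cell tests exactly one PG/connector pair, so the failing
      # combination is visible from the job name alone.
      - run: echo "ansible-test integration would run here for PG ${{ matrix.postgres }}"
```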

>   2. can I remove tests/integration/targets/*/vars/* for OSes that are not listed in the Azure pipeline?

Makes sense.

IMO, ideally we would use one distro, say Ubuntu or CentOS, and install and run all supported PG and connector versions on it.

hunleyd (Collaborator, Author) commented Jun 13, 2022

>   1. does this approach make sense to you (installing all 5 versions in the test instance at once)?
>
> I'm not sure I fully understand what the implementation plan is. Could you please explain it?

Well, based on what little I know about AZP, it seemed like the solution was to simply run each of the existing tests 5 times (using with_sequence). So what I've done so far is install the PGDG repo onto the test image and then install PG 10-14, assigning each a listen port of 54nn, where nn is the version of PG (so 5410 for PG 10, 5411 for PG 11, etc.). This gives us the five instances we want to test against. Then for each of the tests, in main.yml we'd do the same with_sequence loop and change the port we connect to on each iteration. Basically we'd tweak tests/integration/targets/postgresql_copy/tasks/main.yml (and the others) to run itself 5 times (once per port).
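
Roughly, that plan could be sketched like this (illustrative Ansible tasks only; the package names, config paths, and the postgresql_copy_tests.yml file name are assumptions rather than the actual code in this PR):

```yaml
# Hypothetical setup tasks: install PG 10-14 from PGDG (Debian/Ubuntu-style
# package and path names assumed) and point each cluster at port 54<version>.
- name: Install each PostgreSQL major version
  ansible.builtin.package:
    name: "postgresql-{{ item }}"
    state: present
  with_sequence: start=10 end=14

- name: Set listen port 54<version> for each cluster
  ansible.builtin.lineinfile:
    path: "/etc/postgresql/{{ item }}/main/postgresql.conf"
    regexp: '^#?port\s*='
    line: "port = 54{{ item }}"
  with_sequence: start=10 end=14
  # A restart of the clusters would follow here.

# Hypothetical wrapper in a target's tasks/main.yml: run the existing tests
# once per version, overriding the connection port on each iteration.
- name: Run the target's tests against every installed PG version
  ansible.builtin.include_tasks: postgresql_copy_tests.yml
  vars:
    pg_port: "54{{ item }}"
  with_sequence: start=10 end=14
```

How the port override reaches the existing tasks (a dedicated pg_port variable here, purely as an example) would depend on how each target is written.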

> IMO, ideally it should look like what was done in ansible-collections/community.mysql#370. If you open the test section, you'll see entries like Integration (Python: 3.8, Ansible: stable-2.12, MySQL: mariadb_10.8.3, Connector: mysqlclient==2.0.1).

> * So there we see separate tasks for each version of MySQL and the connector. I think it's very convenient to have them separate.

Yeah, this was what I had originally envisioned, but looking at the AZP bits it didn't seem set up to support that.

> * I'm not sure if this is achievable in AZP.
>
> * GHA is not an option if we don't find a way to run the tests using diff, i.e. only tests for changed modules should run, not all of them. Currently the system is overloaded, as most of the collections under the ansible-collections org run **all** their tests against every commit. If we switch to GHA, our test run will last up to a couple of hours...

Oh, eek. Of course. I was surprised that AZP ran for a draft PR; that seems wrong to me :)

> Of course the implementation is open for discussion; I just can't picture right now how it'll look. I find the GHA implementation in c.mysql very convenient; the question is whether it's possible to do something similar with AZP, OR how to run targets in GHA selectively - only for changed modules (and their unit/integration tests), not all the targets.

Who would be the best AZP resource to go harass? :)

>   2. can I remove tests/integration/targets/*/vars/* for OSes that are not listed in the Azure pipeline?
>
> Makes sense.

OK

> IMO, ideally we would use one distro, say Ubuntu or CentOS, and install and run all supported PG and connector versions on it.

I was actually thinking that myself and would prefer it.

Andersson007 (Collaborator) commented:

@hunleyd thanks for clarifying!

> Well, based on what little I know about AZP, it seemed like the solution was to simply run each of the existing tests 5 times (using with_sequence). So what I've done so far is install the PGDG repo onto the test image and then install PG 10-14, assigning each a listen port of 54nn, where nn is the version of PG (so 5410 for PG 10, 5411 for PG 11, etc.). This gives us the five instances we want to test against. Then for each of the tests, in main.yml we'd do the same with_sequence loop and change the port we connect to on each iteration. Basically we'd tweak tests/integration/targets/postgresql_copy/tasks/main.yml (and the others) to run itself 5 times (once per port).

With this approach, will it be easy (not only for us, folks who understand the process, but also for newcomers) to see against which version of Postgres/connector the tests fail? I think it's very important that we don't have to parse the whole log every time a run fails.

hunleyd (Collaborator, Author) commented Jun 14, 2022

> With this approach, will it be easy (not only for us, folks who understand the process, but also for newcomers) to see against which version of Postgres/connector the tests fail? I think it's very important that we don't have to parse the whole log every time a run fails.

define 'easy' :)

Will it say CI (Docker 2.11 CentOS7 PG10) in the checks? No. That would require having different images available to test with in AZP, no?

We can have the output of the tests state which PGnn is being run. That's less than ideal, I agree.
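
For example, the per-version loop sketched earlier could announce the version at the start of each iteration so it is at least searchable in the log (a hypothetical task using the made-up pg_port variable from above):

```yaml
# Hypothetical first task inside the per-version test file:
- name: Announce which PG version this iteration runs against
  ansible.builtin.debug:
    msg: "Running integration tests against PostgreSQL on port {{ pg_port }}"
```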

Andersson007 (Collaborator) commented:

> Will it say CI (Docker 2.11 CentOS7 PG10) in the checks? No. That would require having different images available to test with in AZP, no?

I think seeing the distro is not important, but seeing the Ansible, PG, and driver versions is; otherwise it can be hard to troubleshoot.

> We can have the output of the tests state which PGnn is being run. That's less than ideal, I agree.

Yep, it's less than ideal... it'll be a huge log covering all the versions. On the other hand, what we have now isn't ideal either, but it's a bit better than one huge log, since it's easier to find a version.

Andersson007 (Collaborator) commented:

But IMO we can try what you're suggesting, using:

  • One distro, say, Ubuntu 20.04 (or any other that can run all the versions we need)
  • The option to add different connector versions later

and then see if it meets our needs.
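
A minimal sketch of how those two knobs might be expressed as defaults for a single-distro test image (hypothetical variable names, not taken from the actual pipeline or targets):

```yaml
# Hypothetical defaults for one test distro (e.g. Ubuntu 20.04).
pg_versions: [10, 11, 12, 13, 14]

# Only one connector for now; a list leaves room to add more versions later.
connector_versions:
  - psycopg2==2.8.6
```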

hunleyd closed this by deleting the head repository, Nov 2, 2022