Missing deps modules #5
After some chat with @dagolden, he suggested posting here some ideas on what the smoker's behavior is, and what it should be, in certain circumstances.
On my humble Win32 smoking machine I am having a big problem with modules that are tested every day (every time the smoker script reindexes CPAN and restarts).
I am not sure in which situations this happens, but it seems to happen when:
- a module has a dependency that is missing (it cannot be found or installed); or
- a module has a dependency that fails its own build or tests.
I think these two situations need to be handled in different ways. I am not sure whether they are feasible, whether they are desirable, or even whether they are possible (I do not know the details behind the smoker script).
Is any of this possible? Does this make sense? If so, can I help somehow?
Cheers

Comments
For the first situation, I don't understand how one would do this in practice. I'd be more inclined to throw the distribution name and an epoch timestamp into a file and skip testing for some arbitrary amount of time (at which point the DB entry is cleared). DB_File would probably be fine for this.

For the second situation, I continue to vehemently oppose using "NA" for this purpose. I do favor creating a new grade to cover it, but that requires @barbie to buy in, since it affects statistics and reporting. I think it's likely to irritate a lot of authors if it's a regular report that shows up in their inbox or on MetaCPAN, because most of them won't have the time or inclination to do anything about upstream problems.
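A minimal sketch of that skip list, assuming a DB_File-tied hash; the file name and the one-week retest window are invented for illustration:

```perl
use strict;
use warnings;
use Fcntl;
use DB_File;

# Hypothetical file name and retest window; both are arbitrary choices.
my $skip_db = 'smoker-skip.db';
my $window  = 7 * 24 * 60 * 60;    # retest after a week

tie my %skip, 'DB_File', $skip_db, O_CREAT | O_RDWR, 0644, $DB_HASH
    or die "Cannot open $skip_db: $!";

# Record a distribution whose prerequisites failed, with the current time.
sub mark_skipped { $skip{ $_[0] } = time }

# True while the distribution is still inside the skip window;
# stale entries are deleted so the dist gets retested.
sub should_skip {
    my ($dist) = @_;
    return 0 unless exists $skip{$dist};
    return 1 if time - $skip{$dist} < $window;
    delete $skip{$dist};
    return 0;
}

mark_skipped('Some-Failing-Dist-1.23');
print "skipping\n" if should_skip('Some-Failing-Dist-1.23');

untie %skip;
```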
For both situations this was handled by CPAN::YACSmoke (and possibly by CPANPLUS::YACSmoke, which was based on the former).

In the first situation CPAN::YACSmoke kept a history of what was tested, and if module A was tested and its dependency module B failed (or anything even further up the tree), module A was prevented from being tested again until the dependency's version changed. In the second situation a FAIL would be sent for the dependency and, as in the first situation, the dependent would be blocked from being tested until the dependency was released with a newer version. In both cases the tester could clear the history and start again if they so wished.

Like David, I am very much against using NA for this, as it has nothing to do with the author of the dependent module. The author might want to know they have a bad dependency, but currently that is sort of covered by the CPAN Deps website. I don't believe a new grade is called for, but rather a mashup website that takes the reports and the basis of CPAN Deps, and gives the author a clearer picture of the dependency tree based on OS/Perl versions. Or possibly a watch-list website that reports dependency failures to dependent authors. It would have to be opt-in, as I know from bitter personal experience that some authors will take this as an eager opportunity to tear strips off you if you don't. I might consider a watch-list website in the future, unless someone else gets to it first, but creating new grades for this is not the way to go, in my opinion.
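The version-keyed history described here might look roughly like this; a sketch only, not CPAN::YACSmoke's actual code, and the in-memory structure stands in for its on-disk database:

```perl
use strict;
use warnings;

# dist => { dep => $name, version => $version_that_failed }
my %blocked;

# Block a distribution because one of its dependencies failed.
sub block_on_dep {
    my ( $dist, $dep, $dep_version ) = @_;
    $blocked{$dist} = { dep => $dep, version => $dep_version };
}

# A dist stays blocked only while the indexed version of its failing
# dependency is unchanged; a newer release lifts the block automatically.
sub still_blocked {
    my ( $dist, $current_dep_version ) = @_;
    my $rec = $blocked{$dist} or return 0;
    return 1 if $rec->{version} eq $current_dep_version;
    delete $blocked{$dist};    # dependency was re-released: test again
    return 0;
}

# The tester can always clear the history and start afresh.
sub clear_history { %blocked = () }
```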
Hello, I wouldn't mind getting NAs for the failing deps, but that is not my problem now. So, good, we agree something needs to be done about missing deps, or failing deps. Best,
This can be solved using CPAN.pm configuration. "halt_on_failure" in CPAN::Config should be set to 0. |
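That key lives in the CPAN.pm configuration file, so the change would look something like this (other keys omitted):

```perl
# In ~/.cpan/CPAN/MyConfig.pm (or CPAN/Config.pm):
$CPAN::Config = {
    # ... other keys as generated by CPAN.pm ...
    'halt_on_failure' => q[0],    # do not stop the session when a build fails
};
```

Interactively, the same thing can be done from the cpan shell with `o conf halt_on_failure 0` followed by `o conf commit`.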
CPAN::YACSmoke is now obsolete. The one to look at is CPANPLUS::YACSmoke; however, it only works with CPANPLUS (hence the name change). If chorny's solution above works for you, that might be a better solution for you now.
Will check @chorny's solution later today. Will give feedback then.
I confirm I had it as |
OK, now there are fewer modules being tested again. Namely, modules whose build process I need to kill (infinite loops, et al.), and some other non-standard modules.
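For the builds that hang, one option is to run each build under a timeout and kill it when the timeout expires. A POSIX-flavored sketch, where the helper name and the 600-second limit are invented, and Win32's emulated fork makes this less reliable there:

```perl
use strict;
use warnings;

# Hypothetical helper: run a build command, killing it after $timeout
# seconds so an infinite loop cannot wedge the smoker.
sub run_with_timeout {
    my ( $timeout, @cmd ) = @_;
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ( $pid == 0 ) {
        exec @cmd or die "exec failed: $!";
    }
    local $SIG{ALRM} = sub { kill 'KILL', $pid };
    alarm $timeout;
    waitpid $pid, 0;
    alarm 0;
    return $?;    # child exit status, or the signal if it was killed
}

my $status = run_with_timeout( 600, $^X, 'Makefile.PL' );
```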