Default `MAX_INT` for integer/float/interval generator #61
Comments
@amureki one thing I miss on model bakery is a way to properly get stuff from the settings. We could use a single dict in the settings file to configure the lib. If no dict is present, we can default to the old values.

### settings.py

```python
MODEL_BAKERY_CONF = {
    "MIN_INT": -10,  # defaults to -10000
    "MAX_INT": 10000,
}
```

What do you think?
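A minimal sketch of how the library side of this suggestion could look, assuming the dict name and keys above and the old defaults mentioned in this thread; none of the names below are existing model_bakery API:

```python
# Sketch only, not the actual model_bakery implementation.
import random

from django.conf import settings

_DEFAULTS = {"MIN_INT": -10000, "MAX_INT": 10000}  # assumed old defaults


def _conf(key):
    # Read a value from MODEL_BAKERY_CONF if the user defined it,
    # otherwise fall back to the library default.
    user_conf = getattr(settings, "MODEL_BAKERY_CONF", {})
    return user_conf.get(key, _DEFAULTS[key])


def gen_integer(min_int=None, max_int=None):
    min_int = _conf("MIN_INT") if min_int is None else min_int
    max_int = _conf("MAX_INT") if max_int is None else max_int
    return random.randint(min_int, max_int)
```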
@berinhard if you have more use cases in the code that could benefit from dedicated user configuration, then I would be totally up for it! I do not yet have a strong opinion on a single dict vs. multiple constants like:

### settings.py

```python
MODEL_BAKERY_MAX_INT = 10000
MODEL_BAKERY_OTHER_IMPORTANT_SETTING = 'happiness'
```

I think we might want to dig into some best practices or top-used packages for inspiration. One argument against a dict with string keys is code search and refactoring: IDEs would not detect those values unless you do a plain string search, while constants are easily discoverable.
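For comparison, a sketch of how the constant-style variant could be read on the library side; the setting names here are only the hypothetical ones from the comment above:

```python
from django.conf import settings

# Each option is a separate, greppable Django setting with a library default.
MAX_INT = getattr(settings, "MODEL_BAKERY_MAX_INT", 10000)
MIN_INT = getattr(settings, "MODEL_BAKERY_MIN_INT", -10000)
```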
I really do not think these reproduction steps reveal an issue in bakery. I requested changing the behaviour precisely because of this. If you are providing a generator for one of your fields that is not the automatic one, your tests should not rely on the fact that internally bakery had a low default for integers. +1 to making it configurable, but I would keep the default limit high. Honestly, it should really be even higher; otherwise bakery does not help with boundary testing in its default settings.
I am having issues with these changes too. I am facing exactly the same problem, which surprisingly was warned about and described in model_mommy's comments. What @AdrianLC says is in part true, but there is a point at which I really do not want to test the max value that Postgres supports for integer data types.
We introduced an increased `MAX_INT` value in recent work: #59. I am not yet sure whether this brings any unnecessary overhead to testing systems that use `model_bakery`. What I definitely noticed: in several systems I had to adapt code that uses the generators, because to keep using `gen_integer()` and `gen_float()` we had to reimplement them with identical code but a different constant (the old `MAX_INT`).
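For illustration, the kind of duplicated helper described above might look like this; the wrapper name and the 10000 bound are assumptions, not taken from the affected code:

```python
import random

OLD_MAX_INT = 10000  # assumed pre-#59 default bound


def gen_bounded_integer(min_int=-OLD_MAX_INT, max_int=OLD_MAX_INT):
    # Copy of the library's integer generator, pinned to the old bound.
    return random.randint(min_int, max_int)
```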
Expected behavior
In my work environment, I would expect everything to work the way it worked before. :)
Actual behavior
I cannot use `model_bakery`'s generators everywhere in my tests as I did before:

- the new `MAX_INT` blows up tests in places where we do not have strict validation and limits via `max_length`;
- `django.db.utils.DataError: smallint out of range` is raised for certain code examples (see "Reproduction Steps" below).

Reproduction Steps
How to get `django.db.utils.DataError: smallint out of range`:
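The original snippet is not included here; a minimal sketch of a setup that can raise this error, with hypothetical model and field names, could look like this (it assumes the generated integer can exceed PostgreSQL's smallint range of -32768 to 32767):

```python
# models.py (hypothetical app)
from django.db import models


class Ticket(models.Model):
    priority = models.SmallIntegerField()  # stored as a Postgres smallint


# in a test, with the database available
from model_bakery import baker
from model_bakery.random_gen import gen_integer

# gen_integer() with its default bounds can return a value far outside
# -32768..32767, so the INSERT fails with
# django.db.utils.DataError: smallint out of range.
baker.make(Ticket, priority=gen_integer())
```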
Versions
Python: 3.5+
Django: 1.11+
Model Bakery: 1.1.0
Possible solutions
Change `MAX_INT` from a constant to a setting configurable by library users, with a reasonable default (10000? I quickly checked https://github.com/joke2k/faker/blob/master/faker/providers/__init__.py#L95 for inspiration). I am not sure if it is a good idea to introduce a setting that might be irrelevant to most users (but then they just won't use it).