Why not use the "Internals" convention? #95

Open
nikita-volkov opened this issue Jul 5, 2018 · 5 comments
nikita-volkov (Owner) commented Jul 5, 2018

People keep asking to reveal some internal modules using the "Internals" convention. E.g.,

This thread is intended for an isolated discussion of that subject particularly.

In summary, my position is that the "Internals" convention is a mistake.

nikita-volkov (Owner, Author):

Citing #86 (comment):

I've never actually seen a good reason to hide the internals of a library in Haskell, as long as it is made clear that any code making use of the internals can break at any time and its use won't be supported. Generally, hiding internals makes understanding a library harder for users. There is a good reason why libraries like bytestring and text have .Internal modules: there are many cases where access to the internals is necessary for users because of factors that aren't necessarily predictable (I've needed to access the internals of ByteString many times to make conversions to other types, such as Storable Vectors, efficient).

I absolutely agree. I've never argued against exposing the lower-level APIs. However, the point you seem to be missing is that it doesn't have to be done with a hack to package versioning, which is what the "Internals" convention is. There are other methods!

Take a look at "postgresql-binary". I developed that package purely in the course of working on "hasql". It could just as well have been hidden in the internal modules of "hasql". Instead you get full access to its functionality, while "hasql" still doesn't have any "Internals" modules.

Take a look at my other package, "potoki": it is basically a bunch of reexports from "potoki-core". "potoki" provides an API with closed abstractions, targeted at the general users of the whole "potoki" ecosystem. "potoki-core" has all the abstractions exposed and provides all the low-level features required for creating other packages of the ecosystem, like "potoki-hasql".

Can you see? The lower-level details can be exposed without the ugliness of "Internals". The community would benefit a heck of a lot if basic packages like "text" or "bytestring" were distributed like that. It is all about the audience of a package: in all the mentioned packages the internals are needed only by the few people doing the hardcore stuff, while the majority need not be bothered or otherwise affected by them. These are two separate audiences! Separate audiences need separate distributions.
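The split-package pattern described above can be sketched in miniature. This is a self-contained illustration with hypothetical names, not the actual potoki/potoki-core API:

```haskell
-- What would live in "mylib-core": the abstraction fully exposed,
-- constructor and all, for authors of companion packages.
newtype Decoder a = Decoder (String -> Either String a)

runDecoder :: Decoder a -> String -> Either String a
runDecoder (Decoder f) = f

-- What would live in "mylib": a closed API, re-exported for general
-- users, who only ever see ready-made decoders and never the
-- constructor, leaving the author free to change the representation
-- without breaking them.
int :: Decoder Int
int = Decoder $ \s -> case reads s of
  [(n, "")] -> Right n
  _         -> Left ("not an int: " ++ s)

main :: IO ()
main = do
  print (runDecoder int "42")    -- Right 42
  print (runDecoder int "oops")  -- Left "not an int: oops"
```

The point of the split is that "mylib" can bump only its own minor versions while "mylib-core" honestly signals representation changes with major bumps.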

This is not all that is wrong with the "Internals" convention. The convention works against all the benefits of semantic versioning. Users expect package versions in the x.x.* space to be fully compatible, yet the "Internals" convention effectively requires them to depend on specific versions of packages, or forces authors to stabilize their internal modules, thus defeating the purpose of those modules being internal. The only reason you don't yet see the drastic consequences of this is that, fortunately, not too many authors apply the convention.
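To make the versioning point concrete, here is the bind a downstream .cabal file ends up in (hypothetical package name, for illustration only):

```cabal
build-depends:
  -- Against the public API, the PVP lets a user accept any 1.2.* release:
  somelib >=1.2 && <1.3
  -- But a user importing Somelib.Internal can only honestly pin an exact
  -- version, because the internals may change in any 1.2.* release:
  -- somelib ==1.2.3
```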

axman6 (Contributor) commented Jul 5, 2018

Unless I am missing something: while postgresql-binary does expose enough to write the parsers I discussed, the Value constructor isn't exposed by hasql, so I can't actually make use of the fact that I can write parsers in postgresql-binary, and thus cannot use those parsers in my own hasql queries at all. You can make use of it because you are the author of hasql and are allowed to construct new Values, but I as a user can't. Really, the only change needed is to turn an export from Value into Value(..) somewhere, and these problems would be solved. Alternatively, provide something similar to custom which accepts a Bool -> PostgreSQL.Binary.Decoding.Value a and returns a Hasql.Decoders.Value a. That would avoid having to settle on the use of ReaderT internally: as long as you're sure you'll always have to maintain the Bool indicating which type is used to represent times, that function should always be possible to support into the future.

As an exercise, try making a new package which has hasql as a dependency and see if you can write what I was trying to do. I would love to be wrong, but I think you'll find that you either have to resort to the hack I have, or have to use custom and then figure out how to run either timestamptz_float or timestamptz_int on a ByteString directly.

Edit: now that I'm looking into this more deeply again, it looks like what I'm after should be possible with:

timestamptzThyme = custom (bool (valueParser A.timestamp_float) (valueParser A.timestamp_int))

which is less painful than I expected. Has this changed since I originally filed the bug? I don't remember seeing custom before today, nor valueParser, though the latter has likely been there for quite a long time.
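For what it's worth, the pattern that one-liner relies on can be sketched in self-contained form. The names below are illustrative only; the real types live in Hasql.Decoders and PostgreSQL.Binary.Decoding:

```haskell
import Data.Bool (bool)

-- Sketch of the "escape hatch" pattern: the library keeps its Value
-- constructor closed but exports one `custom` combinator accepting a
-- raw parser. The Bool stands for the connection setting that says
-- whether timestamps are represented as floats or as integers.
newtype Value a = Value (Bool -> String -> Either String a)

custom :: (Bool -> String -> Either String a) -> Value a
custom = Value

runValue :: Value a -> Bool -> String -> Either String a
runValue (Value f) = f

-- Two low-level parsers, standing in for timestamp_float / timestamp_int:
parseAsFloat :: String -> Either String Double
parseAsFloat s = Right (read s :: Double)

parseAsInt :: String -> Either String Double
parseAsInt s = Right (fromIntegral (round (read s :: Double) :: Integer))

-- The user-level decoder selects a parser from the flag, in the same
-- shape as the `custom (bool ... ...)` snippet above:
myTimestamp :: Value Double
myTimestamp = custom (bool parseAsFloat parseAsInt)

main :: IO ()
main = do
  print (runValue myTimestamp False "1.5")  -- float path
  print (runValue myTimestamp True  "1.5")  -- integer path, rounds
```

The design point: one such combinator lets the author keep the constructor private while still letting users plug in arbitrary low-level parsers.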

nikita-volkov (Owner, Author):

Can you please move your post to the "Export Hasql.Private.Decoders.Value.Value to allow for other time libraries" issue? It's unrelated to the topic of this thread, and I'm trying hard to keep the discussions structured and on topic, which is why I isolated this thread in the first place. I'll answer in the mentioned issue.

andremarianiello:

@nikita-volkov would you be in favor of a hasql-core package that exposed lower-level details of the hasql abstractions?

nikita-volkov (Owner, Author):

Sure, I would, if we discover there is a real need for it.

In the case of #144 I don't see the need, because the issue seems resolvable with the existing API. Possibly I'm missing something, but let's discuss it there.
