
Add JSON functions to the query interface #1423

Closed
wants to merge 7 commits into from

Conversation


@myyra myyra commented Aug 25, 2023

Adds query interface wrappers for a subset of the SQLite JSON functions.

Sometimes it's near impossible to design a nice database schema due to external factors, and I've found myself needing the JSON functions quite a bit. They are a bit awkward to use without wrappers in the query interface (but otherwise work perfectly, which speaks a lot about the API design), so I wanted to start a discussion of how best to include them by making a PR that implements the read-only subset of JSON functions, and the ->> operator. I'm looking to implement the editing functions as well, but they'll be in another PR for multiple reasons.

Some notes/considerations:

Since SQLite has the concept of JSON paths, I thought about making a key path based API for the functions. But I then remembered that they can't be converted to Strings anyway, so I dropped the idea. And building any other DSL for it, while nice, would probably be quite overkill as it would basically mean building the jq query language in Swift… And if I remember correctly, you had some good reasons for not using key paths in the query APIs, so I'm assuming that this isn't something you'd want to pursue in the future either. Also, anything lighter would probably just end up wrapping .joined(separator: ".") in an overcomplicated way, so now all the paths are simply Strings.

This might be out of scope since the PR is only a change to the query interface, but returning JSON Strings seems a bit funny to me. I don't know of a nice native way to handle a lot of arbitrary JSON in Swift, so handling the returned data is currently left to users and the likes of AnyCodable. But I'd like to figure out something for this.

The tests are currently a reflection of SQLite's JSON examples since that was the easiest way to confirm that they were working, but since that's testing SQLite in addition to the query interface, should they be changed to just check the generated output? They are also in a separate file, which I don't think is the correct place, so I'd appreciate help organizing them.

Regarding the ->> operator, I'm not sure whether it should actually be included, but I wanted to try to implement it (with questionable success) and start the discussion of whether custom operators are something that should be in GRDB. I think my biggest issue is that we can't implement (at least to my knowledge) the -> operator, so it feels weird to only have the other one. Especially since it's not technically needed, as the functionality heavily overlaps the JSON functions.

JSON support is quite new, so all the functions will likely need OS version constraints. Can this be validated on simulators?

Lastly, despite reading the docs on SQLSpecificExpressible and SQLExpressible, and understanding the point of them from the compiler's point of view, I don't think I really grasped how they should be used in the APIs. So I'm assuming all of that needs to be refactored.

Pull Request Checklist

  • CONTRIBUTING: You have read https://github.com/groue/GRDB.swift/blob/master/CONTRIBUTING.md
  • BRANCH: This pull request is submitted against the development branch.
  • DOCUMENTATION: Inline documentation has been updated.
  • DOCUMENTATION: README.md or another dedicated guide has been updated.
  • TESTS: Changes are tested.
  • TESTS: The make smokeTest terminal command runs without failure. (couldn't run it fully yet due to a few stupid issues on my side, but the unit tests pass. Will double-check the whole thing before undrafting the PR)

@groue
Owner

groue commented Aug 25, 2023

🤩 Thank you @myyra! I'm looking forward to looking at your pull request. Please hold on a few days 🙏

@myyra
Author

myyra commented Aug 26, 2023

🤩 Thank you @myyra! I'm looking forward to looking at your pull request. Please hold on a few days 🙏

No rush!

I added version requirements (iOS 16, macOS 13, tvOS 16, watchOS 9) based on the comments about SQLite versions, but I still need to confirm the versions on simulators at least.

The failing tests look a bit funny to me, since only the operator is failing, although the functions should fail on the older versions as well. And the error message makes me think even more that I failed something in the operator implementation.

Or maybe this is the case?

Prior to version 3.38.0, the JSON functions were an extension that would only be included in builds if the -DSQLITE_ENABLE_JSON1 compile-time option was included.

If I understood the docs correctly, the functions were available as opt-in, but not the operators, which were introduced in 3.38.0. So maybe they are compiled in for older versions, which would be nice. But again, I need to actually confirm this.

@groue
Owner

groue commented Aug 27, 2023

Sometimes it's near impossible to design a nice database schema due to external factors, and I've found myself needing the JSON functions quite a bit. They are a bit awkward to use without wrappers in the query interface (but otherwise work perfectly, which speaks a lot about the API design), so I wanted to start a discussion of how best to include them by making a PR that implements the read-only subset of JSON functions, and the ->> operator. I'm looking to implement the editing functions as well, but they'll be in another PR for multiple reasons.

You are very welcome :-)

Since SQLite has the concept of JSON paths, I thought about making a key path based API for the functions. But I then remembered that they can't be converted to Strings anyway, so I dropped the idea. And building any other DSL for it, while nice, would probably be quite overkill as it would basically mean building the jq query language in Swift… And if I remember correctly, you had some good reasons for not using key paths in the query APIs, so I'm assuming that this isn't something you'd want to pursue in the future either. Also, anything lighter would probably just end up wrapping .joined(separator: ".") in an overcomplicated way, so now all the paths are simply Strings.

We need to deal with strings at some point, anyway. I'm totally fine with it.

My reluctance about key paths is based on the fact that record types do frequently, but not always, exactly reflect their database representation. The reality is that record types are a Swift interface to their inner database representation. And it's the same for "json records" as well.

That's how people use them. They'll replace a text database value with a Swift string-based enum value. They'll publicly expose a price: Decimal property instead of the priceCents database integer. They'll publicly expose a location: CLLocationCoordinate2D property instead of two latitude and longitude database doubles.

And that's totally natural. It helps users avoid an extra layer of "behind the front line" models when they want to hide intimate database details. It's good to be able to deal with a single layer of record types (some of them acting as a facade when needed).

And this explains why key paths are not our friends: they only have access to the Swift application-facing side of the record types, not to the inner, private, database-facing side made of columns or json keys. Key paths are not columns or JSON keys.

Some people think that a properly-designed app should define two types (one "proper" model that feeds from a "data transfer object" that feeds from the database). Well, the library does not prevent those people from doing as they like. The rest of us can have our record types hide intimate database details when we feel like it. People should be free to choose their level of over-engineering, and the library tries to be architecture-agnostic.

This might be out of scope since the PR is only a change to the query interface, but returning JSON Strings seems a bit funny to me. I don't know of a nice native way to handle a lot of arbitrary JSON in Swift, so handling the returned data is currently left to users and the likes of AnyCodable. But I'd like to figure out something for this.

The new Swift functions return SQLExpression, which happen to contain strings when executed. Would you mind expanding on what you mean by "funny"? Do you mean that the SQL expression does not express which Swift type should decode it?

The tests are currently a reflection of SQLite's JSON examples since that was the easiest way to confirm that they were working, but since that's testing SQLite in addition to the query interface, should they be changed to just check the generated output? They are also in a separate file, which I don't think is the correct place, so I'd appreciate help organizing them.

That's very good like that. Most GRDB tests are integration tests and even end-to-end tests. They definitely are about checking that we output what the user expects.

Regarding the ->> operator, I'm not sure whether it should actually be included, but I wanted to try to implement it (with questionable success) and start the discussion of whether custom operators are something that should be in GRDB. I think my biggest issue is that we can't implement (at least to my knowledge) the -> operator, so it feels weird to only have the other one. Especially since it's not technically needed, as the functionality heavily overlaps the JSON functions.

Last time I tried including an operator, it quickly conflicted with other libraries. It just makes the life of users miserable 😅

Could we explore subscripts, maybe?

// expression ->> '$'
expression[jsonPath: "$"]

Let's take care of ->, the SQLite operator that returns a JSON string. We must design for it right now, and include it in the PR. That's how we'll be able to design clear and unambiguous APIs for both. I agree that ->> looks like it is more frequently useful, but you'd be surprised by what people can come up with 😜 I can foresee users actually wanting to decode JSON strings out of ->.

JSON support is quite new, so all the functions will likely need OS version constraints. Can this be validated on simulators?

😬 (see below)

Lastly, despite reading the docs on SQLSpecificExpressible and SQLExpressible, and understanding the point of them from the compiler's point of view, I don't think I really grasped how they should be used in the APIs. So I'm assuming all of that needs to be refactored.

SQLSpecificExpressible is for "database types" like Column, SQLExpression, etc.

SQLExpressible is for the above "database types" and also "standard Swift values" like Int and String. It forms the full set of SQL expressions.

Swift dictionaries and arrays do not conform to those protocols.

The choice between one or the other depends on whether or not you want the user to use standard Swift values in your apis. Your experience using the JSON functions should guide your choice, here.

For example, the GRDB function max (as in Player.select(max(Column("score")))) only accepts SQLSpecificExpressible. This is how it does not conflict with the built-in Swift max function. And nobody writes MAX(42) in SQL.

Also, filter (as in Player.filter(Column("score") > 1000)) now only accepts SQLSpecificExpressible, after it used to accept SQLExpressible. This is because I witnessed some users writing Player.filter(42) instead of Player.filter(id: 42). This was library misuse allowed by the wrong choice of protocol. The fix was to change the definition of filter: now the misguided Player.filter(42) does not compile.

On the other hand, += (as in try Player.updateAll(db, Column("score") += 1)) accepts SQLExpressible, so that users can feed a plain integer.
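The distinction above can be sketched with two hypothetical top-level functions (strictFunc and lenientFunc are illustrative names, not GRDB API):

```swift
import GRDB

// Illustrative only: strictFunc and lenientFunc are not GRDB API.

// Accepts only database expressions, like the GRDB `max` function:
// strictFunc(42) does not compile, strictFunc(Column("score")) does.
func strictFunc(_ value: some SQLSpecificExpressible) -> SQLExpression {
    value.sqlExpression
}

// Also accepts plain Swift values, like the GRDB `+=` operator:
// lenientFunc(42) and lenientFunc(Column("score")) both compile.
func lenientFunc(_ value: some SQLExpressible) -> SQLExpression {
    value.sqlExpression
}
```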

I added version requirements (iOS 16, macOS 13, tvOS 16, watchOS 9) based on the comments about SQLite versions, but I still need to confirm the versions on simulators at least.

OK. This is not easy because this is not well documented. Sometimes I have to use a conservative system version (maybe too restrictive), especially on macOS. Usually Apple synchronizes SQLite versions across iOS/macOS/etc. releases. Above all, do no harm: make sure the API is available on the declared systems. Users won't test this for us, and we don't want their apps to fail at runtime on their users' old devices.

Oh, and we'll have to talk about custom SQLite builds and SQLCipher :-)

The failing tests look a bit funny to me, since only the operator is failing, although the functions should fail on the older versions as well. And the error message makes me think even more that I failed something in the operator implementation.

What error do you get?

@myyra
Author

myyra commented Aug 29, 2023

We need to deal with strings at some point, anyway. I'm totally fine with it.

My reluctance about key paths is based on the fact that record types do frequently, but not always, exactly reflect their database representation. The reality is that record types are a Swift interface to their inner database representation. And it's the same for "json records" as well.

That's how people use them. They'll replace a text database value with a Swift string-based enum value. They'll publicly expose a price: Decimal property instead of the priceCents database integer. They'll publicly expose a location: CLLocationCoordinate2D property instead of two latitude and longitude database doubles.

And that's totally natural. It helps users avoid an extra layer of "behind the front line" models when they want to hide intimate database details. It's good to be able to deal with a single layer of record types (some of them acting as a facade when needed).

And this explains why key paths are not our friends: they only have access to the Swift application-facing side of the record types, not to the inner, private, database-facing side made of columns or json keys. Key paths are not columns or JSON keys.

Some people think that a properly-designed app should define two types (one "proper" model that feeds from a "data transfer object" that feeds from the database). Well, the library does not prevent those people from doing as they like. The rest of us can have our record types hide intimate database details when we feel like it. People should be free to choose their level of over-engineering, and the library tries to be architecture-agnostic.

This makes total sense, and this is how I use records as well. And somehow, "We need to deal with strings at some point, anyway" really resonated with me. A bit like I initially thought, anything more wouldn't really be GRDB's job, as it's a database library. If I were to build a separate library/DSL for constructing SQLite JSON paths, it would totally make sense and be usable with GRDB. I think this really confirms who should be responsible for it.

The new Swift functions return SQLExpression, which happen to contain strings when executed. Would you mind expanding on what you mean by "funny"? Do you mean that the SQL expression does not express which Swift type should decode it?

Yeah, I was thinking about the decoding. Since part of the JSON functionality in SQLite is checking that it's valid, I thought it would be nice to indicate that with something like a return type. But

which happen to contain strings when executed

puts it well I think. It wouldn't be nice to hide what the database actually returns; JSON functions are nothing special in that way.

As for the actual decoding, I clearly didn't think it through, as of course what I was going after is already supported by the API 🤯 (to be honest, I still have to look at how and why exactly this works).

As a test, I put some simple JSON {"a":2,"c":[4,5]} to a text column, and requested it with .select(jsonExtract(Column("json"), "$")). That returns a String as discussed, but by using .asRequest(of: JSONObject.self), where JSONObject conforms to Codable and DatabaseValueConvertible, I get a nicely decoded Swift struct back. I'm sure this would work with AnyCodable as well if needed, so basically what I was going after is pretty much already supported, and maybe even in a better way.
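A sketch of that test, with the type name from this comment (the `Record` request type and the column name are illustrative; jsonExtract is the wrapper added by this PR):

```swift
import GRDB

// GRDB's Codable support gives this type a ready-made, JSON-based
// DatabaseValueConvertible implementation, so no extra code is needed
// to decode the text column {"a":2,"c":[4,5]}.
struct JSONObject: Codable, DatabaseValueConvertible {
    var a: Int
    var c: [Int]
}

// Usage sketch, assuming a record type with a `json` text column:
// let object = try Record
//     .select(jsonExtract(Column("json"), "$"))
//     .asRequest(of: JSONObject.self)
//     .fetchOne(db)
```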

Last time I tried including an operator, it quickly conflicted with other libraries. It just makes the life of users miserable 😅

That is more or less what I assumed :D

Could we explore subscripts, maybe?

// expression ->> '$'
expression[jsonPath: "$"]

I like this a lot. It is definitely worth exploring, and maybe could also be a nice bridge towards a prettier path API like I was talking about (in my head, I see something like expression[jsonPath: "$"]["c"][2]), though that could be over-engineering it. But I'm definitely in favor of a simple subscript, at least.

Let's take care of ->, the SQLite operator that returns a JSON string. We must design for it right now, and include it in the PR. That's how we'll be able to design clear and unambiguous APIs for both. I agree that ->> looks like it is more frequently useful, but you'd be surprised by what people can come up with 😜 I can foresee users actually wanting to decode JSON strings out of ->.

I might have worded that poorly, but to clarify, I was talking about including both or none of the SQL operators as Swift operators. Both should be accessible from Swift, as you said. Subscripts sound like an excellent way to have both.

On the subject, I couldn't immediately think of a good way to differentiate between -> and ->> in API naming. SQLite docs don't seem to have any terms for them, and just use the operators. Any ideas for this are appreciated.

SQLSpecificExpressible is for "database types" like Column, SQLExpression, etc.

SQLExpressible is for the above "database types" and also "standard Swift values" like Int and String. It forms the full set of SQL expressions.

Swift dictionaries and arrays do not conform to those protocols.

The choice between one or the other depends on whether or not you want the user to use standard Swift values in your apis. Your experience using the JSON functions should guide your choice, here.

For example, the GRDB function max (as in Player.select(max(Column("score")))) only accepts SQLSpecificExpressible. This is how it does not conflict with the built-in Swift max function. And nobody writes MAX(42) in SQL.

Also, filter (as in Player.filter(Column("score") > 1000)) now only accepts SQLSpecificExpressible, after it used to accept SQLExpressible. This is because I witnessed some users writing Player.filter(42) instead of Player.filter(id: 42). This was library misuse allowed by the wrong choice of protocol. The fix was to change the definition of filter: now the misguided Player.filter(42) does not compile.

On the other hand, += (as in try Player.updateAll(db, Column("score") += 1)) accepts SQLExpressible, so that users can feed a plain integer.

Thanks, I think I got it now.

While I don't see any immediate risk of conflicts with Swift itself, json("{\"a\": 2}") is something I can see as a helper function in some library or codebase. And while I could see people using the JSON functions as a sort of utility with Strings as inputs, I imagine it's still a minor use case, and having to use .databaseValue in that case is more than fine. And maybe most importantly, given that the API uses Strings for the JSON paths, allowing them in the input could cause some confusion. With SQLSpecificExpressible, putting a JSON path in the place of the input causes a compile-time error. I think that's far better in terms of API design. I'll change everything to SQLSpecificExpressible.

OK. This is not easy because this is not well documented. Sometimes I have to use a conservative system version (maybe too restrictive), especially on macOS. Usually Apple synchronizes SQLite versions across iOS/macOS/etc. releases. Above all, do no harm: make sure the API is available on the declared systems. Users won't test this for us, and we don't want their apps to fail at runtime on their users' old devices.

Makes sense. I've run the JSON functions on iOS 16 and macOS 13 at least, but I'll test the rest of the platforms as well. And as I mentioned, it looks like they might be compiled into the older versions. But that will be harder to test as I currently don't have access to physical devices with lower OS versions.

Oh, and we'll have to talk about custom SQLite builds and SQLCipher :-)

I'm not sure about everything that this includes :D Conditional compilation for older versions that don't include the functions by default?

What error do you get?

The original got fixed after limiting the versions, but I think this is the same one: Link to GitHub Actions. So maybe it was a version/availability issue after all? Here's the actual error from the original:

Fatal error: 'try!' expression unexpectedly raised an error: SQLite error 1: near ">>": syntax error - while executing `SELECT "name" ->> ? FROM "readers"`

@myyra
Author

myyra commented Aug 29, 2023

I removed the custom Swift operator and added subscripts for both SQLite operators. Currently expression[objectAt: "$"] for ->, and expression[valueAt: "$"] for ->>, but I'm not sure whether they should have json somewhere in the signature. It could make them clearer, but also quite long for subscripts.
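One possible shape for those subscripts, assuming GRDB's SQL literal interpolation (a sketch only; the actual implementation in the PR may differ):

```swift
import GRDB

extension SQLSpecificExpressible {
    // expression -> path: extracts a JSON representation
    subscript(objectAt path: String) -> SQLExpression {
        SQL("\(self) -> \(path)").sqlExpression
    }

    // expression ->> path: extracts an SQL value
    subscript(valueAt path: String) -> SQLExpression {
        SQL("\(self) ->> \(path)").sqlExpression
    }
}
```

With this interpolation, the path string is sent as a bound parameter, which is consistent with the `SELECT "name" ->> ?` statement seen in the earlier test failure.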

@groue
Owner

groue commented Sep 3, 2023

Thank you @myyra. Please give me some more time.

@groue
Owner

groue commented Sep 5, 2023

@myyra, this pull request is straight to-the-point, and this is very good. At the same time, I do have some criticisms, API-wise:

  • I'm not satisfied with the ergonomics of the public top-level function json that calls the JSON SQL function.

    Generally speaking, it is good that plain SQL functions are represented in Swift with a top-level function of the same name. We already do that a lot, for abs, min, max, dateTime, julianDay, etc. This helps users find them. You were certainly right pursuing this way.

    There are exceptions, though, and I think we're in such a case.

    For example, in this PR json accepts some SQLSpecificExpressible instead of some SQLExpressible. This is indeed required because we would not want something as innocuous-looking as json("[1, 2, 3]") to return some kind of SQL value.

    // Who could guess the database is involved in this line?
    let array = json("[1, 2, 3]")

    A SQLSpecificExpressible input forces the user to make it clear that the database is involved:

    // Better
    let array = json("[1, 2, 3]".databaseValue)

    But databaseValue is really a mouthful. And I wonder if we could help users use JSON functions with plain literals and String values.

    I'm also not thrilled because I'm afraid json is very frequently used as the name of properties and local variables. Those properties and local variables prevent the user from using the json top-level function (the compiler will emit errors). If we ship a json top-level function, users who want to use it will have to rename local variables and properties (not cool). And renaming properties is not possible when they are required by a protocol.

  • I'm also not quite satisfied that the objectAt: and valueAt: subscripts don't require the Swift code to contain "json" in one form or another, somewhere. I feel like we're breaking the principle of least surprise here.

Your opinions on the above criticisms are very welcome.

Finally, I have questions about some of my intuitions, and whether or not they match your experience.

I spent quite some time in order to understand the difference between json_extract, -> and ->>.

My intuition is that the use case of extracting values is more frequent than extracting JSON. The difference becomes clear as soon as one considers strings, which -> extracts as valid JSON values wrapped in quotes:

$ sqlite3
sqlite> .headers on
sqlite> .mode markdown
sqlite> create table player(info text);
sqlite> insert into player values ('{ "name": "Arthur", "score": 1000 }');
sqlite> select info -> "name", info ->> "name", json_extract(info, "$.name") from player;
| info -> "name" | info ->> "name" | json_extract(info, "$.name") |
|----------------|-----------------|------------------------------|
| "Arthur"       | Arthur          | Arthur                       |

It's not common that one wants to extract "Arthur" (with quotes), so I assume that only ->> and json_extract can profit from some sort of shorthand notation. Since only ->> supports short paths such as name (json_extract requires $.name), we would probably favor ->> the most:

// SELECT info ->> 'name' FROM player
let info = JSONColumn("info")
let names = Player // [String]
    .select(info["name"], as: String.self)
    .fetchAll(db)

Does this match your experience? Do you happen to use ->> more frequently than json_extract, itself used more frequently than ->?


This feedback comes from an experimental branch, experimental/json, where I experience, just like you, the kinds of problems we are trying to solve and the difficulties to overcome.

There's something I like about the JSONColumn type defined in this branch. It makes it easy to extract values, as in the above example. It also makes it possible to create indexes on json values:

// CREATE UNIQUE INDEX unique_player_name ON player(info ->> 'name');
try db.create(
    index: "unique_player_name",
    on: "player",
    expressions: [JSONColumn("info")["name"]],
    options: .unique)

(Thinking out loud - I'm deeply missing a good knowledge of actual use cases)

@myyra
Author

myyra commented Sep 5, 2023

  • I'm not satisfied with the ergonomics of the public top-level function json that calls the JSON SQL function.
    Generally speaking, it is good that plain SQL functions are represented in Swift with a top-level function of the same name. We already do that a lot, for abs, min, max, dateTime, julianDay, etc. You were certainly right pursuing this way.
    There are exceptions, though, and I think we're in such a case.
    For example, in this PR json accepts some SQLSpecificExpressible instead of some SQLExpressible. This is indeed required because we would not want something as innocuous-looking as json("[1, 2, 3]") to return some kind of SQL value.

    // Who could guess the database is involved in this line?
    let array = json("[1, 2, 3]")

    A SQLSpecificExpressible input forces the user to make it clear that the database is involved:

    // Better
    let array = json("[1, 2, 3]".databaseValue)

    But databaseValue is really a mouthful. And I wonder if we could help users use JSON functions with plain literals and String values.
    I'm also not thrilled because I'm afraid json is very frequently used as the name of properties and local variables. Those properties and local variables prevent the user from using the json top-level function (the compiler will emit errors). If we ship a json top-level function, users who want to use it will have to rename local variables and properties (not cool). And renaming properties is not possible when they are required by a protocol.

Yeah, I'm not entirely happy with it either. Personally, I don't remember ever using the function in a real app, so it's hard for me to design the API for it. But I assumed it would be quite rare for people to want to input JSON text as-is to the function (because of Codable in Swift), so having to use databaseValue wouldn't be a dealbreaker. Though in a way, I also wouldn't be surprised if that was the most common use-case…

I also agree about the name. You are correct in that I was just following the established naming scheme, but plain json was always a bit weird. And not only because of possible clashes, it also isn't a very idiomatic name for a Swift function.

It might take some time for me to come up with a good alternative for the name, but since the docs speak about validation and minification, I'm thinking one of those terms could be included in the signature. Something like validated(json: some SQLExpressible) or minified(json: some SQLExpressible) (and variations of those) come to mind, but I'm not too enthusiastic about those as top-level names either. I'll continue thinking about the name, and I'm also very open to ideas (you seem to have a good vision for APIs).

  • I'm also not quite satisfied that the objectAt: and valueAt: subscripts don't require the Swift code to contain "json" in one form or another, somewhere. I feel like we're breaking the principle of least surprise here.

Agree, they felt really off after using them a bit. I was actually going to update them to jsonObjectAt and jsonValueAt before I read your comment. A bit lengthy, but still better.

Finally, I have questions about some of my intuitions, and whether or not they match your experience.

I spent quite some time in order to understand the difference between json_extract, -> and ->>.

My intuition is that the use case of extracting values is more frequent than extracting JSON. The difference becomes clear as soon as one considers strings, which -> extracts as valid JSON values wrapped in quotes:

$ sqlite3
sqlite> .headers on
sqlite> .mode markdown
sqlite> create table player(info text);
sqlite> insert into player values ('{ "name": "Arthur", "score": 1000 }');
sqlite> select info -> "name", info ->> "name", json_extract(info, "$.name") from player;
| info -> "name" | info ->> "name" | json_extract(info, "$.name") |
|----------------|-----------------|------------------------------|
| "Arthur"       | Arthur          | Arthur                       |

It's not common that one wants to extract "Arthur" (with quotes), so I assume that only ->> and json_extract can profit from some sort of shorthand notation. Since only ->> supports short paths such as name (json_extract requires $.name), we would probably favor ->> the most:

// SELECT info ->> 'name' FROM player
let info = JSONColumn("info")
let names = Player // [String]
    .select(info["name"], as: String.self)
    .fetchAll(db)

Does this match your experience? Do you happen to use ->> more frequently than json_extract, itself used more frequently than ->?

This is quite in line with my experience. I mostly use json_extract, since it produces the "correct" (or one that matches my mental model of JSON) output in most cases. But, I can't say how much of that is from habit, as I felt it was the easiest one to use with the query API before adding the native functions. I definitely query values most of the time, with a few objects mixed in between. And the shorthand path syntax feels much nicer, possibly because it's closer to how dictionary subscripts work. So in that sense, I think you're right about the usage amounts, and it would probably be the same for me had I started with the API now.

This feedback comes from an experimental branch, experimental/json, where I experience, just like you, the kinds of problems we are trying to solve and the difficulties to overcome.

There's something I like about the JSONColumn type defined in this branch. It makes it easy to extract values, as in the above example. It also makes it possible to create indexes on json values:

// CREATE UNIQUE INDEX unique_player_name ON player(info ->> 'name');
try db.create(
    index: "unique_player_name",
    on: "player",
    expressions: [JSONColumn("info")["name"]],
    options: .unique)

(Thinking out loud - I'm deeply missing a good knowledge of actual use cases)

This is excellent. I have to experiment a bit in a real app, but I've always thought of columns containing JSON as a sort of separate thing (since they basically require their own mini sub-query), and a separate type for them really matches that mental model. My only question right now is how they'd work with the standard (recommended?) Columns enum, which I tend to use exclusively when referencing columns, and also as a source of truth for available columns. I think a simple converting initializer would be enough, though. Update: missed the sqlJSON available for columns, that does it pretty nicely.

@groue
Owner

groue commented Sep 5, 2023

Thanks for your previous insightful answer 🙏

While writing a first reply, it came to me that we need to take a break and identify JSON use cases that people (you, me, other users) might expect from GRDB. This will help clarify our needs and goals!

And first, this requires a little analysis of the json() SQL function :-)

What is json() useful for?

The json() SQL function does three things:

  1. Parse JSON and raise an error on invalid input.

    Great, but there are better alternatives for parsing and checking JSON validity: json_valid and json_error_position.

  2. Minify JSON.

    A valid use case (though maybe a niche).

  3. Produce a JSON object.

    This is essential when one creates a JSON object that embeds another one. Compare those two arrays:

    $ sqlite3
    sqlite> create table player(info text);
    sqlite> insert into player values ('{ "name": "Arthur" }');
    sqlite> select json_array(info), json_array(json(info)) from player;
    |       json_array(info)       | json_array(json(info)) |
    |------------------------------|------------------------|
    | ["{ \"name\": \"Arthur\" }"] | [{"name":"Arthur"}]    |
    
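For point 2, the minification behavior is easy to observe outside of GRDB. A minimal sketch using Python's bundled sqlite3 module (assumes SQLite was compiled with the JSON functions, which is always the case since 3.38):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# json() re-serializes its input: whitespace is stripped and the
# canonical, minified JSON text is returned.
minified = con.execute("SELECT json(' { \"name\" : \"Arthur\" } ')").fetchone()[0]
print(minified)  # {"name":"Arthur"}
```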

So here is a proposed list of JSON use cases. Each use case groups related features that go well together, in a self-contained and complete unit. I care about completeness because there are few things more frustrating than developing against an API that lets you down after it has lured you into an illusory successful path.

  • Read and write JSON columns from Swift.

    This is already implemented with Codable Records, as well as the ready-made implementation of DatabaseValueConvertible for the Codable types that require a keyed container.

  • BUILD_JSON: Build new JSON objects at the SQL level

    json(), json_array(), json_object(), json_quote(), json_group_array(), json_group_object()

  • MODIFY_JSON: Modify JSON objects at the SQL level

    json_insert(), json_replace(), json_set(), json_patch(), json_remove()

  • MINIFY_JSON: Minify JSON at the SQL level

    json()

  • QUERY_SUBCOMPONENT: Access subcomponents by paths

    json_extract(), ->, ->>

  • COUNT_ARRAY_ELEMENTS: Count JSON array elements

    json_array_length()

  • QUERY_JSON_TYPE: Query the type of a JSON object

    json_type()

  • VALIDATE_JSON: Validate JSON format at the database level

    json_valid(), json_error_position()

  • JSON_TABLES: Use JSON table-valued functions in the query builder.

    json_each(), json_tree()

@myyra, does this look sensible to you? Also, I see that your pull request comes with json() and json_array(), but not the other JSON-building functions. Do you happen to care about json_array()? Do you think BUILD_JSON should be split?


In its current state, the PR addresses completely, or partially, some of these use cases. And there are some use cases that I wish we had 😅

Feature Not Implemented Partially Implemented Implemented @groue's Wish
BUILD_JSON X
MODIFY_JSON X
MINIFY_JSON X
QUERY_SUBCOMPONENT X X
COUNT_ARRAY_ELEMENTS X X
QUERY_JSON_TYPE X
VALIDATE_JSON X
JSON_TABLES¹ X

¹ Mentioned here but out of reach until GRDB is able to deal with table-valued functions. This will require a lot of work.

I care about VALIDATE_JSON because I see some value in adding check constraints on JSON columns (can be useful when an app stores raw JSON loaded from an untrusted remote server):

// CREATE TABLE test (info TEXT CHECK (JSON_VALID(info)))
try db.create(table: "player") { t in
    t.column("info", .jsonText)   // New type that self-documents the content type
        .check { $0.isValidJSON } // There's some value here
}
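At the SQL level, such a constraint behaves like this (a minimal sketch using Python's bundled sqlite3 module; the .jsonText column type and isValidJSON in the snippet above are GRDB-side ideas, not SQLite features):

```python
import sqlite3

con = sqlite3.connect(":memory:")

# The CHECK constraint makes SQLite reject malformed JSON at insert time.
con.execute("CREATE TABLE player(info TEXT CHECK (json_valid(info)))")

con.execute("""INSERT INTO player VALUES ('{"name": "Arthur"}')""")  # accepted

try:
    con.execute("INSERT INTO player VALUES ('not json')")
except sqlite3.IntegrityError as error:
    print("rejected:", error)  # CHECK constraint failed
```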

Of course, VALIDATE_JSON can come in another PR.

@groue
Owner

groue commented Sep 6, 2023

  • I'm not satisfied with the ergonomics of the public top-level function json that calls the JSON SQL function.

Yeah, I'm not entirely happy with it either. Personally, I don't remember ever using the function in a real app, so it's hard for me to design the API for it.

So it looks like you don't quite have any need for the BUILD_JSON use case. This is good to know, because we can probably refocus the work and not design any API around this use case (and probably remove support for json_array()).

But I assumed it would be quite rare for people to want to input JSON text as-is to the function (because of Codable in Swift), so having to use databaseValue wouldn't be a dealbreaker. Though in a way, I also wouldn't be surprised if that was the most common use-case…

You're probably correct. I was very focused on literals and raw strings due to my own tests, but this might not be a frequent use case, especially given BUILD_JSON is out of scope.

I also agree about the name. You are correct in that I was just following the established naming scheme, but plain json was always a bit weird. And not only because of possible clashes, it also isn't a very idiomatic name for a Swift function.

It might take some time for me to come up with a good alternative for the name, but since the docs speak about validation and minification, I'm thinking one of those terms could be included in the signature. Something like validated(json: some SQLExpressible) or minified(json: some SQLExpressible) (and variations of those) come to mind, but I'm not too enthusiastic about those as top-level names either. I'll continue thinking about the name, and I'm also very open to ideas (you seem to have a good vision for APIs).

Right. In my alternate branch, I explore SQLExpressible members instead of a top-level function. But I haven't found a satisfying name either. I'm currently digging into the SQLite subtleties around "strings as JSON string literals" and "strings as JSON objects", and this does not help my explorations 😅

  • I'm also not quite satisfied that the objectAt: and valueAt: subscripts don't require the Swift code to contain "json" in a form or another, somewhere. I feel like we're breaking the principle of least surprise here.

Agree, they felt really off after using them a bit. I was actually going to update them to jsonObjectAt and jsonValueAt before I read your comment. A bit lengthy, but still better.

We can do better for jsonValueAt:

[...] It's not common that one wants to extract "Arthur" (with quotes), so I assume that only ->> and json_extract can profit from some sort of shorthand notation. Since only ->> supports short paths such as name (json_extract requires $.name), we would probably favor ->> the most:

// SELECT info ->> 'name' FROM player
let info = JSONColumn("info")
let names = Player // [String]
    .select(info["name"], as: String.self)
    .fetchAll(db)

Does this match your experience? Do you happen to use ->> more frequently than json_extract, itself used more frequently than ->?

This is quite in line with my experience.

🎉

I mostly use json_extract, since it produces the "correct" (or one that matches my mental model of JSON) output in most cases.

[...] I definitely query values most of the time, with a few objects mixed in between. And the shorthand path syntax feels much nicer, possibly because it's closer to how dictionary subscripts work. So in that sense, I think you're right about the usage amounts, and it would probably be the same for me had I started with the API now.

Then, I'm considering favoring ->> instead of json_extract(), because as you say, it's closer to how dictionary subscripts work: it accepts regular keys without the $ prefix: info ->> 'name'.

On the other hand, json_extract() requires $-prefixed paths: json_extract(info, '$.name').

To sum up:

  • ->> works well in the case of terminal values: extracted values (numbers, strings, arrays and dictionaries) are not fed into other json-consuming functions.

  • ->> does NOT work well in the case of arrays and dictionaries that are fed into other json-consuming functions, because SQLite will need to re-parse the SQL values returned from ->>.

  • json_extract() works well in the case of terminal values: extracted values (numbers, strings, arrays and dictionaries) are not fed into other json-consuming functions.

  • json_extract() works well in the case of arrays and dictionaries that are fed into other json-consuming functions, because SQLite will not need to re-parse the SQL values returned from json_extract().

So yeah json_extract() would be ideal, if it would not require a $-prefixed path.

🙄
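The quoting subtlety is easy to reproduce. A minimal sketch using Python's bundled sqlite3 module (json_quote is used here to show the ->-style result, since the -> and ->> operators require SQLite 3.38+):

```python
import sqlite3

con = sqlite3.connect(":memory:")
info = '{"name": "Arthur", "address": {"city": "Paris"}}'

# json_extract (like ->>) unwraps terminal values into plain SQL values:
print(con.execute("SELECT json_extract(?, '$.name')", (info,)).fetchone()[0])
# Arthur

# json_quote shows the ->-style result: a JSON string literal, quotes included:
print(con.execute("SELECT json_quote(json_extract(?, '$.name'))", (info,)).fetchone()[0])
# "Arthur"

# For objects, json_extract returns minified JSON text, ready to be fed
# into other json-consuming functions without re-parsing:
print(con.execute("SELECT json_extract(?, '$.address')", (info,)).fetchone()[0])
# {"city":"Paris"}
```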

As you can see I'm still in the learning phase... I wonder if GRDB should express those SQLite subtleties, or ignore them altogether.

My experimental branch was leaning towards the first choice, but I'm now reconsidering my position. Those SQLite subtleties are hard. Experience tells me that in such cases, it's often good to let SQLite experts find in GRDB a faithful companion, so that their SQLite experience can be translated into Swift code without hassle.

It's probably important to expose a set of straightforward GRDB JSON APIs that match SQLite one-to-one.

At the same time, I'd very much like to see a JSONColumn type that "does the right thing"™️ for non-experts. Maybe in a future PR.

[...] the JSONColumn type [...]

This is excellent. I have to experiment a bit in a real app, but I've always thought of columns containing JSON as a sort of separate thing (since they basically require their own mini sub-query), and a separate type for them really matches that mental model. My only question right now is how they'd work with the standard (recommended?) Columns enum, which I tend to use exclusively when referencing columns, and also as a source of truth for available columns. I think a simple converting initializer would be enough, though. Update: missed the sqlJSON available for columns, that does it pretty nicely.

There are currently two ways to define columns in GRDB:

// An enum with cases
extension Player {
    enum Columns: String, ColumnExpression {
        case id, name, info
    }
}

// An empty enum with static members (frequently used for Codable records)
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = Column(CodingKeys.info)
    }
}

JSON columns could be defined this way:

// An enum with cases
extension Player {
    enum Columns: String, ColumnExpression {
        case id, name
        static let info = JSONColumn("info")
    }
}

// An empty enum with static members (frequently used for Codable records)
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = JSONColumn(CodingKeys.info)
    }
}

But as you say, we mainly need some way to derive a json column from ColumnExpression. It makes it possible to keep the enum regular. And this is important because some people want their columns to conform to CaseIterable:

// NOT GOOD
extension Player {
    // 😖 `info` not present in Columns.allCases
    enum Columns: String, ColumnExpression, CaseIterable {
        case id, name
        static let info = JSONColumn("info")
    }
}

// BETTER
extension Player {
    // 🙂 `info` present in Columns.allCases
    enum Columns: String, ColumnExpression, CaseIterable {
        case id, name, info
        
        // 🙂 json column for handy `infoJSON["name"]` access to subcomponents
        static let infoAsJSON = Self.info.asJSON // Something like that
    }
}

// 🙂 Works well with static members as well
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = Column(CodingKeys.info).asJSON
    }
}

@myyra
Author

myyra commented Sep 6, 2023

I had written this as a reply to your previous comment before noticing your newest one, so I'll post this here and reply to it separately.


While writing a first reply, it came to me that we need to take a break and identify JSON use cases that people (you, me, other users) might expect from GRDB. This will help clarify our needs and goals!

Excellent idea. Had I known the extent to which we'd discuss this, I would've started a discussion around the subject first.

Also, I didn't initially know what kind of interface for JSON would fit GRDB, so I thought I'd make a few small PRs with the JSON functions since those should sit nicely next to the other function wrappers. And my mindset has still been on just adding the simple wrapper functions.

But looking at all of this now, and especially considering the scope of your work on experimental/json (which is pretty much exactly what I would've expected GRDB's JSON interface to look like, but I'll write more of my thoughts on it after a bit more use), I agree it's a good time to step back.

And first, this requires a little analysis of the json() SQL function :-)

What is json() useful for? The `json()` SQL function does three things:
  1. Parse JSON and raise an error on invalid input.
    Great, but there are better alternatives for parsing and checking JSON validity: json_valid and json_error_position.
  2. Minify JSON.
    A valid use case (though maybe a niche).
  3. Produce a JSON object.
    This is essential when one creates a JSON object that embeds another one. Compare those two arrays:
    $ sqlite3
    sqlite> create table player(info text);
    sqlite> insert into player values ('{ "name": "Arthur" }');
    sqlite> select json_array(info), json_array(json(info)) from player;
    |       json_array(info)       | json_array(json(info)) |
    |------------------------------|------------------------|
    | ["{ \"name\": \"Arthur\" }"] | [{"name":"Arthur"}]    |
    

Thanks for the overview (and reminder)! I reread the docs, and remembered that I even had this documentation comment for it locally 😅:

/// - Attention: This function is not appropriate for checking the validity of JSON.
/// Use ``isJSONValid(_:)`` instead.

I have yet to use such nested JSON handling in SQLite, so I haven't really grasped how it should fit in the API. I agree that minification will probably be a niche use case (I've been going with the assumption that most Swift developers will handle JSON outside of SQLite through Codable).

Considering that and the recommendation against using it for validation, I almost feel something like "standardizing" would be a better term to describe it (though I'm not sure if that's confusing when comparing GRDB APIs to SQLite). I think the majority of use cases will be to ensure whatever JSON, be it from the DB or as input, would be in a "standard" form understood by SQLite. How do you feel about this?

So here is a proposed list of JSON use cases. Each use case groups related features that go well together, in a self-contained and complete unit. I care about completeness because there are few things more frustrating than developing against an API that lets you down after it has lured you into an illusory successful path.

  • Read and write JSON columns from Swift.
    This is already implemented with Codable Records, as well as the ready-made implementation of DatabaseValueConvertible for the Codable types that require a keyed container.
  • BUILD_JSON: Build new JSON objects at the SQL level
    json(), json_array(), json_object(), json_quote(), json_group_array(), json_group_object()
  • MODIFY_JSON: Modify JSON objects at the SQL level
    json_insert(), json_replace(), json_set(), json_patch(), json_remove()
  • MINIFY_JSON: Minify JSON at the SQL level
    json()
  • QUERY_SUBCOMPONENT: Access subcomponents by paths
    json_extract(), ->, ->>
  • COUNT_ARRAY_ELEMENTS: Count JSON array elements
    json_array_length()
  • QUERY_JSON_TYPE: Query the type of a JSON object
    json_type()
  • VALIDATE_JSON: Validate JSON format at the database level
    json_valid(), json_error_position()
  • JSON_TABLES: Use JSON table-valued functions in the query builder.
    json_each(), json_tree()

@myyra, does this look sensible to you? Also, I see that your pull request comes with json() and json_array(), but not the other JSON-building functions. Do you happen to care about json_array()? Do you think BUILD_JSON should be split?

It does. One idea: COUNT_ARRAY_ELEMENTS and QUERY_JSON_TYPE seem a bit specific; maybe those could go under INSPECT_JSON? But otherwise, it's a good grouping, including BUILD_JSON.

I actually realized that many of the remaining JSON functions probably should have been included in the PR for it to make sense, including the rest in BUILD_JSON, and already had them implemented locally but lacking tests.

I haven't needed json_array() yet (nor many in the BUILD_JSON), so I'm unsure about the real-world use cases.

In its current state, the PR addresses completely, or partially, some of these use cases. And there are some use cases that I wish we had 😅

Feature Not Implemented Partially Implemented Implemented @groue's Wish
BUILD_JSON X
MODIFY_JSON X
MINIFY_JSON X
QUERY_SUBCOMPONENT X X
COUNT_ARRAY_ELEMENTS X X
QUERY_JSON_TYPE X
VALIDATE_JSON X
JSON_TABLES¹ X

I was planning to split my PRs into a few separate ones out of habit, but with this list, the boundaries make less sense, in addition to forgetting a few, as mentioned above. So maybe I actually had too many functions implemented?

Also, considering the direction of your experimental/json branch, do you think it's still worth continuing this PR? The API in there is pretty much what I'd expect from GRDB for full and native JSON support, while I was merely considering adding a few helper functions, as I mentioned 😄.

¹ Mentioned here but out of reach until GRDB is able to deal with table-valued functions. This will require a lot of work.

I'm looking forward to when I have to use these. It seems like a perfect learning opportunity for API design and a deep understanding of GRDB's internals.

I care about VALIDATE_JSON because I see some value in adding check constraints on JSON columns (can be useful when an app stores raw JSON loaded from an untrusted remote server):

// CREATE TABLE test (info TEXT CHECK (JSON_VALID(info)))
try db.create(table: "player") { t in
    t.column("info", .jsonText)   // New type that self-documents the content type
        .check { $0.isValidJSON } // There's some value here
}

Of course, VALIDATE_JSON can come in another PR.

It certainly should be included. I noticed I had missed it when updating the documentation comments earlier, and realized that "This function is not appropriate for checking the validity of JSON" is not a sensible comment for json() if I don't include the one that should be used :D I have it implemented locally, but I had a few other changes, including improvements to the documentation comments, that I wanted to finish first. I'll wrap those up and push them today (if it's still relevant, as mentioned above).

json_error_position() was not implemented because I figured it would be useless since it requires a version of SQLite that is not available even on iOS 17. But, custom SQLite builds are a thing :) I'll make sure it's in there as well.

P.S. Thanks for giving me a good laugh with the remarks about SwiftUI and SwiftData when I was glancing your comment yesterday evening 😄.

@myyra
Author

myyra commented Sep 6, 2023

So it looks like you don't quite have any need for the BUILD_JSON use case. This is good to know, because we can probably refocus the work and not design any API around this use case (and probably remove support for json_array()).

Not yet, at least. Removing support for json_array() (and the rest in BUILD_JSON) sounds good, and is in line with what I was wondering in my previous comment.

But I assumed it would be quite rare for people to want to input JSON text as-is to the function (because of Codable in Swift), so having to use databaseValue wouldn't be a dealbreaker. Though in a way, I also wouldn't be surprised if that was the most common use-case…

You're probably correct. I was very focused on literals and raw strings due to my own tests, but this might not be a frequent use case, especially given BUILD_JSON is out of scope.

I might not be the heaviest user of the JSON features, but it certainly felt weird writing all the tests and experimenting with strings, since in all the actual apps I've built, almost all of the interactions are with values in the database.

I also agree about the name. You are correct in that I was just following the established naming scheme, but plain json was always a bit weird. And not only because of possible clashes, it also isn't a very idiomatic name for a Swift function.
It might take some time for me to come up with a good alternative for the name, but since the docs speak about validation and minification, I'm thinking one of those terms could be included in the signature. Something like validated(json: some SQLExpressible) or minified(json: some SQLExpressible) (and variations of those) come to mind, but I'm not too enthusiastic about those as top-level names either. I'll continue thinking about the name, and I'm also very open to ideas (you seem to have a good vision for APIs).

Right. In my alternate branch, I explore SQLExpressible members instead of a top-level function. But I haven't found a satisfying name either. I'm currently digging into the SQLite subtleties around "strings as JSON string literals" and "strings as JSON objects", and this does not help my explorations 😅

I liked your approach there, as it type-safely ensures it's used in the correct places, which is really nice. But at the same time, I had trouble understanding when exactly it was getting called. But I think it's too early from both sides to consider such things more deeply yet.

We can do better for jsonValueAt:

[...] It's not common that one wants to extract "Arthur" (with quotes), so I assume that only ->> and json_extract can profit from some sort of shorthand notation. Since only ->> supports short paths such as name (json_extract requires $.name), we would probably favor ->> the most:

// SELECT info ->> 'name' FROM player
let info = JSONColumn("info")
let names = Player // [String]
    .select(info["name"], as: String.self)
    .fetchAll(db)

Does this match your experience? Do you happen to use ->> more frequently than json_extract, itself used more frequently than ->?

This is quite in line with my experience.

🎉

Indeed!

I mostly use json_extract, since it produces the "correct" (or one that matches my mental model of JSON) output in most cases.
[...] I definitely query values most of the time, with a few objects mixed in between. And the shorthand path syntax feels much nicer, possibly because it's closer to how dictionary subscripts work. So in that sense, I think you're right about the usage amounts, and it would probably be the same for me had I started with the API now.

Then, I'm considering favoring ->> instead of json_extract(), because as you say, it's closer to how dictionary subscripts work: it accepts regular keys without the $ prefix: info ->> 'name'.

On the other hand, json_extract() requires $-prefixed paths: json_extract(info, '$.name').

To sum up:

  • ->> works well in the case of terminal values: extracted values (numbers, strings, arrays and dictionaries) are not fed into other json-consuming functions.
  • ->> does NOT work well in the case of arrays and dictionaries that are fed into other json-consuming functions, because SQLite will need to re-parse the SQL values returned from ->>.
  • json_extract() works well in the case of terminal values: extracted values (numbers, strings, arrays and dictionaries) are not fed into other json-consuming functions.
  • json_extract() works well in the case of arrays and dictionaries that are fed into other json-consuming functions, because SQLite will not need to re-parse the SQL values returned from json_extract().

So yeah json_extract() would be ideal, if it would not require a $-prefixed path.

🙄

As you can see I'm still in the learning phase... I wonder if GRDB should express those SQLite subtleties, or ignore them altogether.

My experimental branch was leaning towards the first choice, but I'm now reconsidering my position. Those SQLite subtleties are hard. Experience tells me that in such cases, it's often good to let SQLite experts find in GRDB a faithful companion, so that their SQLite experience can be translated into Swift code without hassle.

It's probably important to expose a set of straightforward GRDB JSON APIs that match SQLite one-to-one.

At the same time, I'd very much like to see a JSONColumn type that "does the right thing"™️ for non-experts. Maybe in a future PR.

That is certainly a challenging thing to decide. One thing I remembered from the docs, though:

For compatibility with PostgreSQL, the -> and ->> operators also accept a text label or integer as their right-hand operand.

To me, this hints that the $-prefixed paths, or "a well-formed JSON path expression", are the proper thing to use in SQLite, and the non-prefixed paths are just for compatibility.

So, should we just standardize the APIs to use $, since that works everywhere, and leave the option to use non-prefixed paths if given explicitly? And/or would it make sense to try to find an API where we can specify what type of path is allowed in which place?

And with both cases, could we have a type (JSONPath/JSONPathExpression?), that lightly wraps the paths, and ensures proper prefixing? A bit like how Column/ColumnExpression wraps String column names.
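The difference in path strictness is observable directly (a minimal sketch using Python's bundled sqlite3 module; the bare-key form works with -> and ->> on SQLite 3.38+, which is why only json_extract is shown here):

```python
import sqlite3

con = sqlite3.connect(":memory:")
info = '{"name": "Arthur"}'

# json_extract requires a well-formed, $-prefixed path:
print(con.execute("SELECT json_extract(?, '$.name')", (info,)).fetchone()[0])
# Arthur

# A bare key, accepted by -> and ->>, is a malformed path for json_extract:
try:
    con.execute("SELECT json_extract(?, 'name')", (info,)).fetchone()
except sqlite3.OperationalError as error:
    print("path error:", error)
```

A JSONPath-style wrapper that prepends the `$.` prefix when missing would hide exactly this asymmetry.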

[...] the JSONColumn type [...]

This is excellent. I have to experiment a bit in a real app, but I've always thought of columns containing JSON as a sort of separate thing (since they basically require their own mini sub-query), and a separate type for them really matches that mental model. My only question right now is how they'd work with the standard (recommended?) Columns enum, which I tend to use exclusively when referencing columns, and also as a source of truth for available columns. I think a simple converting initializer would be enough, though. Update: missed the sqlJSON available for columns, that does it pretty nicely.

There are currently two ways to define columns in GRDB:

// An enum with cases
extension Player {
    enum Columns: String, ColumnExpression {
        case id, name, info
    }
}

// An empty enum with static members (frequently used for Codable records)
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = Column(CodingKeys.info)
    }
}

JSON columns could be defined this way:

// An enum with cases
extension Player {
    enum Columns: String, ColumnExpression {
        case id, name
        static let info = JSONColumn("info")
    }
}

// An empty enum with static members (frequently used for Codable records)
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = JSONColumn(CodingKeys.info)
    }
}

I thought of just adding a variable slightly after posting the comment, good thing it was a proper solution :D

But as you say, we mainly need some way to derive a json column from ColumnExpression. It makes it possible to keep the enum regular. And this is important because some people want their columns to conform to CaseIterable:

// NOT GOOD
extension Player {
    // 😖 `info` not present in Columns.allCases
    enum Columns: String, ColumnExpression, CaseIterable {
        case id, name
        static let info = JSONColumn("info")
    }
}

// BETTER
extension Player {
    // 🙂 `info` present in Columns.allCases
    enum Columns: String, ColumnExpression, CaseIterable {
        case id, name, info
        
        // 🙂 json column for handy `infoJSON["name"]` access to subcomponents
        // 🙂 json column for handy `infoAsJSON["name"]` access to subcomponents
        static let infoAsJSON = Self.info.asJSON // Something like that
    }
}

// 🙂 Works well with static members as well
extension Player {
    enum Columns {
        static let id = Column(CodingKeys.id)
        static let name = Column(CodingKeys.name)
        static let info = Column(CodingKeys.info).asJSON
    }
}

I find my own argument less compelling after the above, but I also don't tend to use CodingKeys columns and can't argue with the rest of the logic either. After all, a JSON column is still just a regular column, with the only special thing being that its contents have (or might not have) special formatting.

@groue
Owner

groue commented Sep 30, 2023

@myyra, I did not forget about this pull request :-) I'm actually close to something satisfying!

@groue groue mentioned this pull request Oct 2, 2023
@myyra
Author

myyra commented Oct 3, 2023

Closing this in favor of #1436, which solves everything proposed here and much more.

@groue thanks for all the discussions; I learned so much about API design!

@myyra myyra closed this Oct 3, 2023
@myyra myyra deleted the json-functions branch October 5, 2023 10:10