[10.x] Add support for custom $PATH's (like DBngin) for the Schema Dump/import command #48911
Conversation
Well, you are supposed to add this. Mind that, depending on the shell you are using, this file might be different. And if you want it available globally you can add this command to your shell profile.

On Windows you can add custom binary paths through Control Panel (at least it used to be there; it has been some years since I used Windows for any development task).

In my opinion, having custom binary paths is not something that should be handled by the framework. People could have custom paths for many things, like Redis, Memcached, and others. How should multiple paths be handled? This is an environment detail that should be handled by the environment, not by the framework.

In this particular case, if you are running a Linux/Unix/macOS box, you can also use a Unix socket, which Laravel already supports, to have custom paths. There are some performance gains from using a socket, as there is no TCP negotiation involved.
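For concreteness, appending a database binary directory to `$PATH` looks like this (a sketch; the directory below is an example DBngin-style location, not a path from this thread — adjust it to your install, and put the line in your shell startup file, e.g. `~/.zshrc`, to make it permanent):

```shell
# Append an example DBngin binary directory to PATH for the current shell.
# The directory is illustrative; use the path your tooling actually reports.
export PATH="$PATH:/Users/Shared/DBngin/mysql/8.0.33/bin"

# The shell will now also search that directory when resolving commands.
echo "$PATH"
```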
See above.
@rodrigopedra Thanks! I know about setting up the `$PATH`.
Well, from their GitHub issue tracker, this would be the case if you are using it to run multiple database versions. Anyway, it seems they are considering having the export added to the user's shell profile: TablePlus/DBngin#117 (comment)

I didn't know DBngin from before, but from the example in that issue, it seems the path only changes when you use a different database version:

```shell
export PATH="$PATH:/Users/Shared/DBngin/postgresql/15.1/bin"
```

And in that case, you would need to update your project's settings anyway, by your proposal.

This is similar to running a database server from Docker: unless one maps the binary to a known location, one will need to add a custom directory to their `$PATH`.

Also, if you are running, for example, multiple MySQL versions, you could add multiple directories to the path, so your OS will try to find the binary in each of them. Something like this (untested, just an example; not even sure there is a PostgreSQL 16.0):

```shell
export PATH="$PATH:/Users/Shared/DBngin/postgresql/15.1/bin:/Users/Shared/DBngin/postgresql/16.0/bin"
```

From what I could understand of how DBngin works, unless you have multiple versions of a DB running at the same time, only one of these paths will have the binary when a particular version is running, right? Is it a must to have multiple DB versions running at the same time, in a manner that having multiple directories added to your path would cause a conflict?

As a last resort, you can add an event listener to the `ArtisanStarting` event. Your listener can read those binaries from a config file if you wish. It would look like this, from a service provider's `boot()` method (note: the original snippet called `exec('export ...')`, which only affects a throwaway subshell; `putenv()` changes the current process's environment instead):

```php
use Illuminate\Console\Events\ArtisanStarting;
use Illuminate\Support\Facades\Event;

Event::listen(ArtisanStarting::class, function () {
    if ($path = config('app.my_custom_db_path')) {
        // putenv() mutates this PHP process's environment, which child
        // processes (mysqldump, pg_dump, ...) inherit. An exec('export ...')
        // call would only change a short-lived subshell and have no effect.
        putenv('PATH=' . getenv('PATH') . ':' . $path);
    }
});
```

I hope this helps =)
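As an aside on why shelling out to `export` from PHP cannot work: an environment change made in a child process never propagates back to its parent. A quick shell illustration:

```shell
# A child shell exports a variable and then exits; the parent is unaffected.
sh -c 'export DB_BIN_DEMO=/tmp/bin'

# The parent shell never sees the child's export.
echo "DB_BIN_DEMO is: ${DB_BIN_DEMO:-unset}"   # prints "DB_BIN_DEMO is: unset"
```

This is exactly why the listener above mutates its own process environment with `putenv()` rather than running an `export` in a subshell.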
Ah, ok, I didn't know it worked like that... and does the event listener solution work? If it does, you could make it into a package.
@ralphjsmit Quick off-topic question: is there any reason you have multiple server instances for the same version? What distinguishes these instances?
I now added the latest DBngin MySQL version to my `$PATH`.
@dennisprudlo When creating a new database, DBngin takes the latest MySQL version at the moment the service is created. There is no way to upgrade a version, unless you manually migrate by creating a new service to replace the old one and export/import all databases, etc. So that's why you see these different versions; there is no functional reason for it.
@ralphjsmit then why not simply create the databases on an existing server instance? I use a single MySQL 8 DBngin server which has all databases for the different projects I am developing. This could reduce the number of running services on your device – not that this is the issue at hand here 😅
@dennisprudlo Thanks, yeah, that is indeed an option. I don't have all services activated at the same time though :) But I am a freelancer and therefore have multiple projects that I work on. For my main project I automatically start the service on login, and for the other projects I activate the service when I want to work on that project. I like to keep things organized, so I have a different MySQL service for each client, plus some general-purpose DBs for myself (and some client DBs I still need to clean up ;)).
@ralphjsmit alright, thanks for the clarification. I thought that maybe I was missing something and there is a functional difference when using one service per project. But I see the organizational point 👍🏼
In short: this PR allows developers to set a custom path for the database binaries using a `bin` option in the database connection config, for situations where developers do not have the `mysql`, `mysqldump`, `pg_dump`, `pg_restore` or `sqlite3` commands automatically in their `$PATH`, like when using DBngin.

Background
I'm using DBngin to create and manage databases like MySQL and Postgres. DBngin manages its installation completely by itself, separate from the "normal" `mysql` binary on the system (if you have it).
I don't have `mysql` installed in a general bin directory on my system, since I've always been using DBngin and never had a reason to. However, this poses a problem when running `artisan schema:dump` or when importing the dump.

To solve this problem, DBngin offers a special button to export the environment variables to the terminal. This opens a terminal and pastes in the following code:
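The pasted snippet itself was omitted above; based on the example quoted from the DBngin issue earlier in this thread, it looks roughly like this (version and path are examples and will differ per setup):

```shell
# Example of the PATH export DBngin generates (illustrative path/version).
export PATH="$PATH:/Users/Shared/DBngin/postgresql/15.1/bin"
```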
If you then run the `artisan schema:dump` command directly after the export, it works, because we have now given an explicit pointer to the location of the `mysql` command.

However, if you have tests that also run on MySQL, this won't work, because when running tests (e.g. via PhpStorm) the `export ...` command is not executed before the test command.

Summary
This PR aims to solve this problem by giving developers a way to register a custom location for the `mysql`, `mysqldump`, `pg_dump`, `pg_restore` or `sqlite3` commands using a `'bin'` option in `config/database.php`.

I have provided a PR laravel/laravel#6268 to keep the skeleton up-to-date as well.
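For illustration, the proposed `'bin'` option would sit on a connection entry roughly like this (a sketch based on this PR's description; the `'bin'` key name comes from the PR, while the surrounding keys and the path are example values):

```php
// config/database.php (sketch; only the relevant keys shown)
'connections' => [
    'mysql' => [
        'driver' => 'mysql',
        // ... the usual host/port/database/username keys ...

        // Proposed by this PR: directory containing the mysql/mysqldump
        // binaries. The path below is an example DBngin location.
        'bin' => '/Users/Shared/DBngin/mysql/8.0.33/bin',
    ],
],
```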
The PR also adds a full new test for the Postgres schema state, which wasn't tested before.
Thanks!