
Improve builds/cache dir documentation #39

Closed
wants to merge 1 commit into from

Conversation

olejnjak

👋 I have been implementing primarily local GitLab CI caching, and while I was at it, I also mounted the builds dir from the host.

Everything went pretty well until I ran into this error:

fatal: unable to access 'https://<gitlabURL>/<repo>/': error setting certificate verify locations:  CAfile: /Volumes/My Shared Files/buildsdir/<repo>.tmp/CI_SERVER_TLS_CA_FILE CApath: none

I then played with the config stage and found out that I actually need to mount buildsdir/cachedir using the --dir argument in the prepare stage. I think it would be useful for others to have this in the docs.

@edigaryev
Contributor

Hello Jakub! 👋

Thanks for the PR! However, I'm pretty sure the GitLab Tart Executor already does automatically what you're suggesting to add to the README.md:

if config.HostDir {
	runArgs = append(runArgs, "--dir", fmt.Sprintf("hostdir:%s", gitLabEnv.HostDirPath()))
} else if buildsDir, ok := os.LookupEnv(EnvTartExecutorInternalBuildsDir); ok {
	runArgs = append(runArgs, "--dir", fmt.Sprintf("buildsdir:%s", buildsDir))
}
if cacheDir, ok := os.LookupEnv(EnvTartExecutorInternalCacheDir); ok {
	runArgs = append(runArgs, "--dir", fmt.Sprintf("cachedir:%s", cacheDir))
}
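Under these defaults, the eventual invocation would look roughly like the following sketch (the VM name and host paths are illustrative, not taken from this thread):

```shell
# Illustrative sketch only: the executor appends each directory as a
# named --dir mount, so the resulting "tart run" is along these lines.
args="--dir buildsdir:/Users/ci/Library/Caches/GitlabCI/builds"
args="$args --dir cachedir:/Users/ci/Library/Caches/GitlabCI/cache"
echo "tart run ci-vm-clone $args"
```

Inside the guest, a named mount shows up under /Volumes/My Shared Files/<name>, which is why the buildsdir path appears there in the error message above.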

Which version are you running, and does your GitLab Runner configuration file look like the one in the README.md?

@olejnjak
Author

olejnjak commented Oct 16, 2023

You are right, it seems I am running into a different issue: using a custom buildsdir causes that error for me.

Both the guest and the host are running the latest Sonoma, 14.0 (23A344).

I am using this version:

~ % gitlab-tart-executor --version
executor version 1.4.3-5df3df4

When I use the following config, I get the above-mentioned error:

~ % cat .gitlab-runner/config.toml                           
concurrent = 2
check_interval = 0
shutdown_timeout = 0

[session_server]
  session_timeout = 1800

[[runners]]
  name = "<name>"
  url = "<gitlab_url>"
  id = 834
  token = "<token>"
  token_obtained_at = <date>
  token_expires_at = <date>
  executor = "custom"
  [runners.cache]
    MaxUploadedArchiveSize = 0
  [runners.feature_flags]
    FF_RESOLVE_FULL_TLS_CHAIN = false
  [runners.custom]
    config_exec = "gitlab-tart-executor"
    config_args = ["config", "--cache-dir", "/Users/<host_user>/Library/Caches/GitlabCI/cache", "--builds-dir", "/Users/<host_user>/Library/Caches/GitlabCI/builds"]
    prepare_exec = "gitlab-tart-executor"
    prepare_args = ["prepare", "--concurrency", "2", "--cpu", "auto", "--memory", "auto", "--dir", "mint:/Users/<host_user>/.mint", "--dir", "TuistCache:/Users/<host_user>/.tuist/Cache"]
    run_exec = "gitlab-tart-executor"
    run_args = ["run"]
    cleanup_exec = "gitlab-tart-executor"
    cleanup_args = ["cleanup"]

The GitLab job output is the following:

Running with gitlab-runner 16.4.1 (d89a789a)
  on <device>, system ID: <id>

Preparing the "custom" executor
Using Custom executor...
2023/10/16 22:37:06 Pulling the latest version of <image>...
2023/10/16 22:37:06 Cloning and configuring a new VM...
2023/10/16 22:37:06 Waiting for the VM to boot and be SSH-able...
2023/10/16 22:37:21 Was able to SSH!
2023/10/16 22:37:21 VM is ready.

Preparing environment
Running on admins-Virtual-Machine.local...

Getting source from Git repository
Fetching changes with git depth set to 20...
Reinitialized existing Git repository in /Volumes/My Shared Files/buildsdir/<gitlab_group>/<gitlab_repo>/.git/
fatal: unable to access 'https://<gitlab_url>/<gitlab_group>/<gitlab_repo>.git/': error setting certificate verify locations:  CAfile: /Volumes/My Shared Files/buildsdir/<gitlab_group>/<gitlab_repo>.tmp/CI_SERVER_TLS_CA_FILE CApath: none
2023/10/16 22:37:28 Process exited with status 128

Cleaning up project directory and file based variables
ERROR: Job failed: exit status 1

If I omit the --builds-dir argument in the config stage, everything sails through correctly. Do you have an idea what the cause, and especially the solution, might be? Even mounting the directory explicitly in the prepare stage doesn't fix it.

We can probably move this conversation elsewhere as this PR is indeed unnecessary.

Thanks.

@olejnjak
Author

Wondering if this might be the cause 🤔
https://stackoverflow.com/a/70294114/2294228

But I'm just throwing ideas 🤷‍♂️

@fkorotkov
Contributor

In your tests, do you run a single VM on the host? I see that your config is set to use two VMs. I'm not sure if there is some sort of collision/locking/something else happening, since /Volumes/My Shared Files/buildsdir/<gitlab_group>/<gitlab_repo>.tmp looks like it will be shared between the two jobs.

@olejnjak
Author

Yes, there are two jobs running. I will try with just one to see if that fixes the error.

@olejnjak
Author

Well, yeah, disabling concurrency fixed this particular issue, but another arose: various tools we use do not deal with spaces in the buildsdir path. I already fixed that in Mint, but I've run into another one — cloning a private package through SPM over SSH suddenly fails with a host key verification failure.

I think I have the cause: I use a customized SSH command via the GIT_SSH_COMMAND environment variable to inject a private key and a known_hosts file into the guest system. I have those two files saved in the GitLab environment as well, and I feel this might be the cause, as the environment is saved in buildsdir too.
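A minimal sketch of that kind of setup (the key and known_hosts paths here are made up for illustration):

```shell
# Illustrative paths only: point git's SSH transport at an injected key
# and known_hosts file instead of the defaults under ~/.ssh.
export GIT_SSH_COMMAND="ssh -i /Users/ci/.ssh/ci_key -o UserKnownHostsFile=/Users/ci/.ssh/ci_known_hosts"
echo "$GIT_SSH_COMMAND"
```

If the injected known_hosts file ends up under (or referencing) the shared buildsdir, a concurrent job or a changed mount path could plausibly break host key verification.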

I think we can close this thread if you feel it might be the cause as well 🙂

@fkorotkov
Contributor

Yeah, let's close this PR and open issues and feature requests for things like a custom location of the mounted directories. 👌

@fkorotkov fkorotkov closed this Oct 17, 2023