
Bug: Azure/k8s-deploy@v4 cannot find path of .yaml file #307

Open
khiemng99 opened this issue Dec 20, 2023 · 5 comments
Labels: bug (Something isn't working), idle (Inactive for 14 days)

Comments

@khiemng99

What happened?

I get an error "Error: undefined" when running the following pipeline:

name: build_deploy_aks_chat
on:
  push:
    branches:
      - "dev"
jobs:
  build-and-push:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source code
        uses: actions/checkout@v4
        with:
          token: ${{ secrets.GH_LIB_TOKEN }}
          submodules: recursive
      - name: Login to DockerHub
        uses: azure/docker-login@v1
        with:
          login-server: ${{ secrets.REGISTRY_LOGIN_SERVER }}
          username: ${{ secrets.REGISTRY_USERNAME }}
          password: ${{ secrets.REGISTRY_PASSWORD }}
      - name: Docker meta
        id: meta
        uses: docker/metadata-action@v5
        with:
          images: |
            ${{ secrets.registry_login_server }}/myservice-chat
          tags: |
            type=ref,event=branch
            type=ref,event=pr
            type=semver,pattern={{version}}
            type=semver,pattern={{major}}.{{minor}}
            type=sha
      - name: Build and push Docker image
        uses: docker/build-push-action@v5
        with:
          context: ./services/chat
          push: true
          tags: ${{ steps.meta.outputs.tags }}
          labels: ${{ steps.meta.outputs.labels }}
          file: ./services/chat/Dockerfile
      - name: Azure login
        uses: azure/login@v1
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Set AKS context
        uses: azure/aks-set-context@v3
        with:
          resource-group: ${{ secrets.resource_group }}
          cluster-name: ${{ secrets.cluster_name }}
      - name: Setup kubectl
        uses: azure/setup-kubectl@v3
      - name: Deploy to AKS
        uses: Azure/k8s-deploy@v4
        with:
          resource-group: ${{ secrets.resource_group }}
          name: ${{ secrets.cluster_name }}
          action: deploy
          strategy: basic
          private-cluster: true
          namespace: "myservice"
          manifests: |
            ./services/chat/manifest/deployment.yaml
            ./services/chat/manifest/service.yaml
            ./services/chat/manifest/horizontal-pod-autoscaler.yaml
          images: |
            ${{ secrets.registry_login_server }}/myservice-chat:latest
          pull-images: false

Then, I get the error:

  the path "/tmp/chat-deployment.yaml" does not exist
  the path "/tmp/chat-service.yaml" does not exist
  the path "/tmp/chat-horizontal-pod-autoscaler.yaml" does not exist
  
  Error: Error: undefined

I am not sure what else could be the issue.
Many thanks in advance!

Version

  • I am using the latest version

Runner

Standard GitHub-hosted runners

Relevant log output

Deploying manifests
  ##[debug]/tmp//tmp/deployment.yaml does not exist, and therefore cannot be moved to the manifest directory
  ##[debug]private cluster Kubectl run with invoke command: kubectl apply -f /tmp/deployment.yaml,/tmp/service.yaml,/tmp/horizontal-pod-autoscaler.yaml --namespace ***-ai-assitant-service
  ##[debug]full form of az command: az aks command invoke --resource-group *** --name *** --command kubectl apply -f /tmp/deployment.yaml,/tmp/service.yaml,/tmp/horizontal-pod-autoscaler.yaml --namespace ***-ai-assitant-service --file . -o json
  ##[debug]Could not rename /tmp//tmp/deployment.yaml to  /tmp/manifests//tmp/deployment.yaml ERROR: Error: ENOENT: no such file or directory, copyfile '/tmp//tmp/deployment.yaml' -> '/tmp/manifests//tmp/deployment.yaml'
  ##[debug]from kubectl private cluster command got run output ***"exitCode":0,"stdout":"***\n  \"exitCode\": 1,\n  \"finishedAt\": \"xxx",\n  \"id\": \"xxx\",\n  \"logs\": \"the path \\\"/tmp/deployment.yaml\\\" does not exist\\nthe path \\\"/tmp/service.yaml\\\" does not exist\\nthe path \\\"/tmp/horizontal-pod-autoscaler.yaml\\\" does not exist\\n\",\n  \"provisioningState\": \"Succeeded\",\n  \"reason\": null,\n  \"startedAt\": \"xxx\"\n***\n","stderr":""***
  the path "/tmp/deployment.yaml" does not exist
  the path "/tmp/service.yaml" does not exist
  the path "/tmp/horizontal-pod-autoscaler.yaml" does not exist
  
  ##[debug]Kubectl apply failed:Error: failed private cluster Kubectl command: kubectl apply -f /tmp/deployment.yaml,/tmp/service.yaml,/tmp/horizontal-pod-autoscaler.yaml --namespace ***-ai-assitant-service
  Error: Error: undefined
  ##[debug]Node Action run completed with exit code 1
  ##[debug]Finishing: Deploy to AKS
khiemng99 added the bug (Something isn't working) label on Dec 20, 2023
@StefanLobbenmeierObjego

/tmp//tmp/deployment.yaml

educated guess:

To me this looks like there is supposed to be something between the two //. That tends to happen when a variable that is expected to be part of the path is empty; I am not sure which one it could be, though.
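
One hypothetical shell illustration of that guess (made-up variable names, not the action's actual code), written as a workflow step like the ones above: when whatever is expected between the two slashes resolves to an empty string, plain concatenation reproduces the doubled path from the debug log.

      # Hypothetical illustration only: SUBPATH stands in for whatever value is
      # expected between the two slashes; when it is empty, the echoed path
      # matches the /tmp//tmp/deployment.yaml seen in the log.
      - name: Illustrate the doubled /tmp prefix
        run: |
          TEMP_DIR="/tmp"
          SUBPATH=""                                          # expected non-empty, but empty
          echo "${TEMP_DIR}/${SUBPATH}/tmp/deployment.yaml"   # prints /tmp//tmp/deployment.yaml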


This issue is idle because it has been open for 14 days with no activity.

github-actions bot added the idle (Inactive for 14 days) label on Jan 11, 2024
@Verhaeg

Verhaeg commented Feb 5, 2024

Just to note: this seems to be an issue with the private-cluster setting; with a non-private cluster it works.
Perhaps it is not necessarily a bug, but there is at least missing documentation on how to deploy to a private cluster.
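
For reference, a minimal sketch of driving the private-cluster deploy by hand rather than through the action (my own assumption, not something confirmed in this thread): call az aks command invoke directly, the same command the action builds in the debug log, and keep the manifest paths relative so they resolve inside the directory uploaded with --file. The manifest directory and namespace are taken from the pipeline above.

      # Sketch of a manual private-cluster deploy; assumes azure/login ran in an
      # earlier step. "--file ." uploads the current directory, so the relative
      # manifest paths exist inside the command pod.
      - name: Deploy to private AKS cluster via az aks command invoke
        working-directory: ./services/chat/manifest
        run: |
          az aks command invoke \
            --resource-group ${{ secrets.resource_group }} \
            --name ${{ secrets.cluster_name }} \
            --command "kubectl apply -f deployment.yaml -f service.yaml -f horizontal-pod-autoscaler.yaml --namespace myservice" \
            --file .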

@khiemng99
Author

Just to note: this seems to be an issue with the private-cluster setting; with a non-private cluster it works. Perhaps it is not necessarily a bug, but there is at least missing documentation on how to deploy to a private cluster.

Thank you for your response. Yes, with a non-private cluster I can deploy multiple YAML files in the same step, but not with a private cluster. Currently I work around this by deploying each YAML file in its own step (maybe it's not optimal, but it's the only way I found); a sketch of that follows below.
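
A minimal sketch of that workaround (not the author's exact workflow), reusing the inputs from the pipeline above: one Azure/k8s-deploy step per manifest, repeated for service.yaml and horizontal-pod-autoscaler.yaml.

      # One manifest per deploy step as a workaround for the private-cluster
      # path issue; duplicate this step for the other manifests.
      - name: Deploy deployment.yaml to AKS
        uses: Azure/k8s-deploy@v4
        with:
          resource-group: ${{ secrets.resource_group }}
          name: ${{ secrets.cluster_name }}
          action: deploy
          strategy: basic
          private-cluster: true
          namespace: "myservice"
          manifests: |
            ./services/chat/manifest/deployment.yaml
          images: |
            ${{ secrets.registry_login_server }}/myservice-chat:latest
          pull-images: false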

@ifGarcia

v5 and v5.0.0 are also showing this problem, whether passing the listed files or just the folder containing the files.
