How to deploy multiple newly pushed Cloud Functions using Google Cloud Build and Source Repositories?

Asked: 2020-03-05 11:06:03

Tags: google-cloud-platform google-cloud-functions google-cloud-build build-triggers google-cloud-repository

I have a project folder that contains a separate folder for each Cloud Function, for example:

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
         -cloudbuild.yaml
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
         -cloudbuild.yaml
            --------- and so on!

What I have right now: I push the code from each Cloud Functions folder to Source Repository one by one (each function folder into its own repository). A trigger is enabled that starts a Cloud Build, which then deploys the function. The cloudbuild.yaml file I have looks like this:

steps:
- name: 'python:3.7'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
        pip3 install -r requirements.txt
        pytest

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function
    - --runtime=python37
    - --source=.
    - --entry-point=function_main
    - --trigger-topic=Function
    - --region=europe-west3

Now, what I want is a single source repository: whenever I change the code of one Cloud Function and push it, only that function is deployed and the rest stay untouched, just as before.


Update

I have now also tried the following, but it deploys all the functions at the same time even when I have only changed a single one.

Project_Folder
    -Cloud-Function-Folder1
         -main.py
         -requirements.txt
    -Cloud-Function-Folder2
         -main.py
         -requirements.txt
    -Cloud-Function-Folder3
         -main.py
         -requirements.txt
    -cloudbuild.yaml
    -requirements.txt

The cloudbuild.yaml file looks like this:

steps:
- name: 'python:3.7'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
        pip3 install -r requirements.txt
        pytest

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function1
    - --runtime=python37
    - --source=./Cloud-Function-Folder1
    - --entry-point=function1_main
    - --trigger-topic=Function1
    - --region=europe-west3

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function2
    - --runtime=python37
    - --source=./Cloud-Function-Folder2
    - --entry-point=function2_main
    - --trigger-topic=Function2
    - --region=europe-west3

3 answers:

Answer 0 (score: 1)

This is more complex, and you have to work around the limitations and constraints of Cloud Build.

I do it like this:

  • Get the directories updated since the last commit
  • Loop over these directories and run what I want

Case 1: all subfolders are deployed with the same command

For this, I put the cloudbuild.yaml at the root of the repository, not in the subfolders.

steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: /bin/bash
  args:
    - -c
    - |
        # Cloud Build doesn't recover the .git file. Thus checkout the repo for this
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo ;
        # Copy only the .git file
        mv /tmp/repo/.git .
        # Make a diff between this version and the previous one and store the result into a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff

# Do what you want, by performing a loop in to the directory
- name: 'python:3.7'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           # No strong isolation between each function, take care of conflicts!!
           pip3 install -r requirements.txt
           pytest
       cd ..
       done

- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           gcloud functions deploy .........           
       cd ..
       done
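
For illustration only, the last step of that pipeline could be written out concretely as below; the function name (taken from the folder name), the "main" entry point, the trigger topic, and the region are placeholder assumptions based on the question's layout, not values given in the answer.

- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           # Placeholder assumptions: the function is named after its folder,
           # exposes an entry point called "main", and listens on a Pub/Sub
           # topic with the same name as the folder.
           gcloud functions deploy $$i \
             --runtime=python37 \
             --source=. \
             --entry-point=main \
             --trigger-topic=$$i \
             --region=europe-west3
       cd ..
       done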

Case 2: each subfolder has its own specific deployment

For this, I put one cloudbuild.yaml at the root of the repository and another one in each subfolder.

steps:
- name: 'gcr.io/cloud-builders/git'
  entrypoint: /bin/bash
  args:
    - -c
    - |
        # Cloud Build doesn't recover the .git file. Thus checkout the repo for this
        git clone --branch $BRANCH_NAME https://github.com/guillaumeblaquiere/cloudbuildtest.git /tmp/repo ;
        # Copy only the .git file
        mv /tmp/repo/.git .
        # Make a diff between this version and the previous one and store the result into a file
        git diff --name-only --diff-filter=AMDR @~..@ | grep "/" | cut -d"/" -f1 | uniq > /workspace/diff

# Do what you want, by performing a loop in to the directory. Here launch a cloud build
- name: 'gcr.io/cloud-builders/gcloud'
  entrypoint: /bin/bash
  args:
    - -c
    - |
       for i in $$(cat /workspace/diff); do
       cd $$i
           gcloud builds submit
       cd ..
       done
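
The per-folder cloudbuild.yaml that gcloud builds submit then picks up inside each changed directory is not shown in the answer. A minimal sketch, reusing the names from the question's first snippet (the function name, entry point, and topic are placeholders), could be:

steps:
- name: 'python:3.7'
  entrypoint: 'bash'
  args:
    - '-c'
    - |
        pip3 install -r requirements.txt
        pytest

- name: 'gcr.io/cloud-builders/gcloud'
  args:
    - functions
    - deploy
    - Function1
    - --runtime=python37
    - --source=.
    - --entry-point=function1_main
    - --trigger-topic=Function1
    - --region=europe-west3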

Pay attention to the timeout here, because you may trigger a lot of Cloud Builds and they take time.
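
For example, the overall timeout can be raised at the top level of the root cloudbuild.yaml; the value below is an arbitrary illustration, not taken from the answer:

# Overall build timeout (the Cloud Build default is 10 minutes).
timeout: 1800s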


To run your build manually, don't forget to add $BRANCH_NAME as a substitution variable:

gcloud builds submit --substitutions=BRANCH_NAME=master

Answer 1 (score: 0)

This one is simple, but you control the behaviour at the build-trigger level rather than in cloudbuild.yaml. Conceptually, you restrict the Cloud Build trigger so that it only fires for certain changes in the repository.

So, on the Build Trigger page, use the "included files" glob filter (screenshot: included files glob filter on the GCP trigger page).

You create one trigger per Cloud Function (or Cloud Run service) and set its "Included files filter (glob)" as follows (see the command-line sketch after the list):

  • Cloud-Function1-Trigger

    Project_Folder/Cloud-Function-Folder1/**

  • Cloud-Function2-Trigger

    Project_Folder/Cloud-Function-Folder2/**

...
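
Such a trigger can also be created from the command line. Below is a sketch; the repository name, branch pattern, and paths are assumptions to adapt to your own setup:

gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-repo \
  --branch-pattern="^master$" \
  --build-config=Project_Folder/Cloud-Function-Folder1/cloudbuild.yaml \
  --included-files="Project_Folder/Cloud-Function-Folder1/**"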

Assumptions:

  1. For each trigger, the repo and branch are set so that the root of the repo contains Project_Folder/
  2. The repo and branch are configured appropriately so that the trigger can locate and access the path Project_Folder/Cloud-Function-Folder1/*

When I have more than 2-3 Cloud Functions, I tend to use Terraform to create all the required triggers in an automated way.

Answer 2 (score: -2)

If you create a single source repository and want code changes to one Cloud Function to be deployed, you must create a cloudbuild.yaml configuration file. You need to connect this single repository to Cloud Build, then create a build trigger and select this repository as the source. You also need to configure the deployment so that whenever new code is pushed to the repository, a build is triggered automatically and the function is deployed to Cloud Functions.
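
A minimal sketch of that workflow with the gcloud CLI follows; the project ID, repository name, and branch are placeholders, not values from the answer:

# 1. Create the single Cloud Source Repository and push the project to it.
gcloud source repos create my-functions-repo
git remote add google https://source.developers.google.com/p/MY_PROJECT/r/my-functions-repo
git push google master

# 2. Create a build trigger that runs the repository's cloudbuild.yaml on every push.
gcloud beta builds triggers create cloud-source-repositories \
  --repo=my-functions-repo \
  --branch-pattern="^master$" \
  --build-config=cloudbuild.yaml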