Creating Pipelines


This section shows how the application developer can create pipelines and configure the pipeline stages and jobs for applications.

Prerequisites

Before creating a pipeline, you need to complete the application creation and the basic information configuration. For the detailed steps, see Creating an Application.

Creating a Pipeline

  1. In the project list, click a project name to enter the project space.

  2. In the left navigation bar, select Development > Pipeline.

  3. Click New Pipeline and enter the basic configuration information in the Basic Information section.

    • App/Product: Select an application from the created applications list.

    • Pipeline Name: Enter the name of the pipeline.

    • Language/Version: Select the language and version for developing the application.

    • Tool: Select the dependency for the language. For example, if Java is chosen for Language/Version, select Maven for Tool.

    • Git Repository: This field is automatically populated with the repository of the selected application.

    • Triggering Mode: Select the way to trigger the pipeline to run.

      • Automatic: Run the pipeline when the specified Git repository branch changes.
      • Manual: Manually select the Git repository branch and run the pipeline.
      • Scheduled: Run the pipeline at the set time.
    • Branch: Select the Git repository branch on which the pipeline runs when the Triggering Mode is set to Automatic or Scheduled.

    • Triggering Time: Select the specific time to run the pipeline when the Triggering Mode is set to Scheduled.

      (Figure: create_pipeline.png)


  4. After completing the basic information configuration, go on to the Stage section to configure the stages and jobs of the pipeline.

Configuring the Stages and Jobs

  1. Enter the name for the stage at Stage Name.

  2. Click the Add Job button, select the job type, and enter the required parameters.

    • Job Type: Build
      • Job Name: Enter the name of the build job.
      • Time Limit: Enter the timeout limit for the job.
      • Docker File Path: Enter the Dockerfile storage path relative to the project root, such as docker/Dockerfile. By default, the project root path is used.
      • Docker Registry: Select the docker image repository.
      • Build Parameters: Enter the parameters for Maven.

    Note

    The build job checks whether build.sh exists in the project. If it does, build.sh is executed automatically. If not, the job checks whether pom.xml exists and, if so, runs the mvn clean package -U -DskipTests command. When building with a custom build.sh, you must place the build results, such as war/tar/jar/zip packages, in the target directory under the project root directory.
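
The fallback behavior described in this note can be sketched in shell. This is an illustrative reconstruction, not the platform's actual implementation; the function name decide_build is hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch of the build job's fallback logic described above.
# decide_build prints the command the build job would run for a given
# checkout directory; it does not execute the build itself.
decide_build() {
    dir="$1"
    if [ -f "$dir/build.sh" ]; then
        # A custom build.sh takes precedence; it must place its
        # artifacts (war/tar/jar/zip) under <project root>/target.
        echo "sh build.sh"
    elif [ -f "$dir/pom.xml" ]; then
        # Maven fallback used when no build.sh is present.
        echo "mvn clean package -U -DskipTests"
    else
        echo "no build action"
    fi
}
```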


    • Job Type: Deploy
      • Job Name: Enter the name of the deployment task.
      • Time Limit: Enter the timeout limit for the job.
      • Cluster: Select the cluster where the application will be deployed.
      • Resource Type: Select the resource type for the deploy job, where you can select Deployment or StatefulSet.
      • Resource Name: Select the deployment configuration name of the application. See Configuring Deployments for more information.
    • Job Type: Scan
      • Job Name: Enter the name of the code scanning job.
      • Time Limit: Enter the timeout limit for the job.
      • Source Code: Enter the path to the file to be scanned, such as src/main/java, src/main/java/utils/Test.java, etc.
      • Exclusion Files: Enter file paths that do not need to be scanned, such as src/main/java/, src/main/java/*/, src/main/java/utils/Test.java, etc.
      • Unit Test: Select whether to perform the unit test. If yes, enter the path of the test file, such as src/test.

    Note

    The code scanning job checks whether build.sh exists. If it does, build.sh is executed automatically. If not, the job checks whether pom.xml exists and, if so, runs the mvn clean compile -DskipTests command. When Java is the programming language, the job checks for a sonar-project.properties file: if present, sonar-scanner performs the code scan; otherwise mvn sonar:sonar is used. For other programming languages such as Node.js, sonar-scanner is used for code scanning by default.
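
The scanner selection described in this note can be sketched in shell. This is an illustrative reconstruction, not the platform's actual implementation; the function name decide_scanner and its arguments are hypothetical:

```shell
#!/bin/sh
# Hypothetical sketch of the scanner selection described above.
# decide_scanner prints which scanner the scan job would use; it does
# not perform any scanning itself.
decide_scanner() {
    dir="$1"    # project checkout directory
    lang="$2"   # pipeline Language/Version setting, e.g. "java" or "nodejs"
    if [ "$lang" != "java" ]; then
        # Non-Java projects (e.g. Node.js) use sonar-scanner by default.
        echo "sonar-scanner"
    elif [ -f "$dir/sonar-project.properties" ]; then
        # Java with an explicit scanner config also uses sonar-scanner.
        echo "sonar-scanner"
    else
        # Plain Maven projects fall back to the Sonar Maven goal.
        echo "mvn sonar:sonar"
    fi
}
```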


    • Job Type: Publish Dependency

      • Job Name: Enter the job name.
      • Build Path: Enter the path to the directory containing the pom file, such as share. By default, the project root path is used.
      • Parameters: For Java, the default command is mvn deploy, and you can add parameters such as -DskipTests. For npm, the default command is npm publish, and you can add parameters such as --access=public.
    • Job Type: Jenkins Job

      • Job Name: Enter the name of the Jenkins job.
      • Time Limit: Enter the timeout limit for the job.
      • Jenkins URL: Enter the Jenkins service address, e.g.: http://jenkins-ci.envisioncn.com:8080/jenkins.
      • Username: Enter the Jenkins user name.
      • API Token: Enter Jenkins API token.
      • Job Name: Enter the Jenkins job name.
      • Parameters: Enter the required Jenkins parameters.
    • Job Type: Custom Job

      • Job Name: Enter the name of the custom job.
      • Shell Script Path: Enter the path of the script file that you want to run.
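
The Jenkins URL, Username, and API Token fields above correspond to Jenkins' standard remote-build API. As a rough shell sketch (the buildWithParameters endpoint is standard Jenkins, but the host, job name, and credentials below are placeholders, and whether this platform invokes Jenkins in exactly this way is an assumption):

```shell
#!/bin/sh
# Illustrative placeholders only (not real endpoints or credentials).
JENKINS_URL="http://jenkins.example.com:8080/jenkins"
JOB_NAME="demo-job"

# Jenkins' standard endpoint for triggering a parameterized build:
BUILD_URL="$JENKINS_URL/job/$JOB_NAME/buildWithParameters"
echo "$BUILD_URL"

# Triggering it with the username and API token would look like this
# (not executed here):
#   curl -X POST -u "user:apitoken" "$BUILD_URL" --data "PARAM=value"
```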


  3. Click and drag the jobs to adjust their running order.

    (Figure: add_stage.png)


  4. After the job configuration for the stage is completed, click Add Stage and repeat the steps above to add the second stage of the pipeline.

  5. After the stage configuration is completed, click the New Pipeline button to save the pipeline configuration.

Creating a Product Pipeline

A product-level pipeline packages multiple applications under a product into a Chart for later deployment via Helm. The supported job type is Chart generation.

Prerequisites

Before creating a new pipeline, you need to complete product and application creation and basic information configuration. For detailed steps, see Creating Product and Creating Application.

Note

Only Common type products are supported, and at least one application must be associated with the product.

Configuring the Basic Information

  1. In the project list, click a project name to enter the project space.

  2. In the left navigation bar, select Development > Pipeline.

  3. Click New Pipeline and enter the basic configuration information in the Basic Information section.

    • App/Product: Select the target product from the list of created products.

    • Pipeline Name: Enter the name of the pipeline.

      (Figure: create_pipeline_product.png)


  4. After completing the basic information configuration, go on to the Stage section to configure the stages and jobs of the pipeline.

Configuring the Stages and Jobs

Configure the stages and jobs of the pipeline by following the steps below:

  1. Enter the name for the stage at Stage Name.

  2. Click the Add Job button, select the job type, and enter the required parameters.

    • Create Chart
      • Cluster: Select the environment and cluster used by the pipeline.
      • Chart Name: Enter the Chart name. The default is the application/product name.
      • Chart Version: Enter the Chart version number.
      • App Version: Enter the application version number.
  3. Click +SubChart to add a sub-Chart.

    • SubChart
      • App Name: Select the application associated with the product.
      • SubChart Name: Enter the name of the SubChart. The default is the application name.
      • SubChart Version: Enter the SubChart version number. The default is the Chart version number.
      • SubChart App Version Number: Enter the SubChart application version number. The default is the application version number.

    Note

    At least one SubChart must be included.

    (Figure: add_stage_chart.png)
  4. You can click the +SubChart button and repeat the above steps to add multiple SubCharts.

  5. Once the configuration is complete, click the New Pipeline button to save the pipeline configuration.
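
The Create Chart job assembles a parent Chart whose SubCharts appear as dependencies. A rough sketch of what such a parent chart might look like, assuming a Helm 3 layout (all names and versions are placeholders, and the exact files the job generates are an assumption):

```shell
#!/bin/sh
# Hypothetical layout of a parent chart with one SubChart dependency,
# approximating what a Create Chart job might produce (Helm 3 assumed).
chart_dir=$(mktemp -d)/demo-product
mkdir -p "$chart_dir/charts"
cat > "$chart_dir/Chart.yaml" <<'EOF'
apiVersion: v2
name: demo-product      # Chart Name (defaults to the product name)
version: 1.0.0          # Chart Version
appVersion: "1.0.0"     # App Version
dependencies:
  - name: demo-app      # SubChart Name (defaults to the application name)
    version: 1.0.0      # SubChart Version (defaults to the Chart version)
EOF
cat "$chart_dir/Chart.yaml"
```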

Next Step

Once the pipeline is created, you can run the pipeline and view the status and results of the running pipeline.