Commit 62155c4d authored by GitLab Bot

Automatic merge of gitlab-org/gitlab master

parents cff478eb dbf8005a
......@@ -66,7 +66,8 @@ module Git
    def strip_extension(filename)
      return unless filename

      File.basename(filename, File.extname(filename))
      encoded_filename = Gitlab::EncodingHelper.encode_utf8(filename.dup)
      File.basename(encoded_filename, File.extname(encoded_filename))
    end
  end
end
......
---
data_category: Optional
data_category: Operational
key_path: counts_monthly.deployments
description: Total deployments count for recent 28 days
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.analytics.analytics_total_unique_counts_monthly
description: The number of unique users who visited any analytics feature by month
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.configure.project_clusters_enabled
description: Total GitLab Managed enabled clusters attached to projects
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_dast_jobs
description: Users who run a DAST job
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.dast_pipeline
description: Count of pipelines that have at least 1 DAST job
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_api_fuzzing_jobs
description: Count of API Fuzzing jobs by job name
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_api_fuzzing_dnd_jobs
description: Count of API Fuzzing `docker-in-docker` jobs by job names
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.incident_management.incident_management_total_unique_counts_monthly
description: Count of unique users performing events related to incidents per month
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.plan.service_desk_issues
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.plan.projects_jira_active
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.plan.projects_jira_dvcs_cloud_active
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.plan.projects_jira_dvcs_server_active
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.issues_edit.g_project_management_issue_created_monthly
description: Count of MAU creating new issues
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.issues_edit.g_project_management_issue_closed_monthly
description: Count of MAU closing an issue
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.issues_edit.issues_edit_total_unique_counts_monthly
description: Aggregate count of MAU taking an action related to an issue
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_unique_users_all_secure_scanners
description:
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_sast_jobs
description: Users who run a SAST job
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: 28d
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
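The hunks above and below show fragments of GitLab metric definition files having their `data_category` changed from `Optional` to `Operational`. For orientation, a consolidated sketch of such a file is shown below; the key path, description, and other values are illustrative assumptions, and fields not visible in these hunks are omitted:

```yaml
---
# Illustrative metric definition; values are placeholders, not taken from this commit.
data_category: Operational
key_path: counts.example_metric
description: Count of example events recorded in the last 28 days
product_section: ops
value_type: number
status: data_available
time_frame: 28d
data_source: database
distribution:
- ce
- ee
```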
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_secret_detection_jobs
description: Users who run a Secret Detection job
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: 28d
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.sast_pipeline
description: Counts of Pipelines that have at least 1 SAST job
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: 28d
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.secret_detection_pipeline
description: Counts of Pipelines that have at least 1 Secret Detection job
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: 28d
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.user_coverage_fuzzing_jobs
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.sast_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.container_scanning_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.dast_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.secret_detection_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.coverage_fuzzing_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage_monthly.secure.api_fuzzing_scans
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.terraform.p_terraform_state_api_unique_users_monthly
description: Monthly active users of GitLab Managed Terraform states
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: redis_hll_counters.user_packages.user_packages_total_unique_counts_monthly
description: A monthly count of users that have published a package to the registry
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: gitaly.servers
description: Total Gitaly Servers
product_section: growth
......
---
data_category: Optional
data_category: Operational
key_path: gitaly.clusters
description: Total GitLab Managed clusters both enabled and disabled
product_section: growth
......
---
data_category: Optional
data_category: Operational
key_path: counts.service_desk_issues
description: Count of service desk issues
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.merge_requests
description: Count of the number of merge requests
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.clusters_applications_cilium
description: Total GitLab Managed clusters with GitLab Managed App:Cilium installed
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_with_terraform_reports
description: Count of projects with Terraform MR reports
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_with_terraform_states
description: Count of projects with GitLab Managed Terraform State
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_packets_processed
description: Cumulative count of packets processed by ModSecurity since Usage Ping
was last reported
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_packets_anomalous
description: Cumulative count of packets identified as anomalous by ModSecurity since
Usage Ping was last reported
......
---
data_category: Optional
data_category: Operational
key_path: counts.network_policy_forwards
description: Cumulative count of packets forwarded by Cilium (Container Network Security)
since Usage Ping was last reported
......
---
data_category: Optional
data_category: Operational
key_path: counts.network_policy_drops
description: Cumulative count of packets dropped by Cilium (Container Network Security)
since Usage Ping was last reported
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_logging
description: Whether or not ModSecurity is set to logging mode
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_blocking
description: Whether or not ModSecurity is set to blocking mode
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_disabled
description: Whether or not ModSecurity is disabled within Ingress
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.ingress_modsecurity_not_installed
description: Whether or not ModSecurity has not been installed into the cluster
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.ci_builds
description: Unique builds in project
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.ci_internal_pipelines
description: Total pipelines in GitLab repositories
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.ci_external_pipelines
description: Total pipelines in external repositories
product_section: ops
......
---
data_category: Optional
data_category: Operational
key_path: counts.dast_jobs
description: Count of DAST jobs run
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_dast_jobs
description: Count of DAST jobs
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_bamboo_active
description: Count of projects with active integrations for Bamboo CI
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_drone_ci_active
description: Count of projects with active integrations for Drone CI
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jenkins_active
description: Count of projects with active integrations for Jenkins
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jira_active
description: Count of projects with active integrations for Jira
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jira_server_active
description: Count of active integrations with Jira Software (server)
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jira_cloud_active
description: Count of active integrations with Jira Cloud (SaaS)
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jira_dvcs_cloud_active
description: Count of active integrations with Jira Cloud (DVCS Connector)
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_jira_dvcs_server_active
description: Count of active integrations with Jira Software (DVCS connector)
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_api_fuzzing_jobs
description: Count of API Fuzzing jobs by job name
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_api_fuzzing_dnd_jobs
description: Count of API Fuzzing `docker-in-docker` jobs by job name
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects_imported_from_github
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.manage.issue_imports.jira
description: Count of projects imported from Jira
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.issues
description: Count of Issues created
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.plan.issues
description: Count of users creating Issues
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.plan.epics
description:
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.projects
description: Count of Projects created
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.todos
description: Count of todos created
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_unique_users_all_secure_scanners
description:
product_section: sec
......
---
data_category: Optional
data_category: Operational
key_path: counts.remote_mirrors
description: Count of total remote mirrors. Includes both push and pull mirrors
product_section: dev
......
---
data_category: Optional
data_category: Operational
key_path: counts.sast_jobs
description: Count of SAST CI jobs for the month. Job names ending in '-sast'
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: all
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: counts.secret_detection_jobs
description: Count of all 'secret-detection' CI jobs.
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: all
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_sast_jobs
description: Count of SAST jobs per user
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: all
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_secret_detection_jobs
description: Count of Secret Detection Jobs per user
product_section: sec
......@@ -10,7 +10,6 @@ value_type: number
status: data_available
time_frame: all
data_source: database
data_category: Optional
distribution:
- ce
- ee
......
---
data_category: Optional
data_category: Operational
key_path: usage_activity_by_stage.secure.user_coverage_fuzzing_jobs
description: ''
product_section: ''
......
---
data_category: Optional
data_category: Operational
key_path: gitaly.version
description: Version of Gitaly
product_section: growth
......
---
data_category: Optional
data_category: Operational
key_path: git.version
description: Information about Git version
product_section: enablement
......
---
data_category: Optional
data_category: Operational
key_path: ingress_modsecurity_enabled
description: Whether or not ModSecurity is enabled within Ingress
product_section: sec
......
......@@ -70,7 +70,7 @@ is the same as [getting the job's artifacts](#get-job-artifacts), but by
defining the job's name instead of its ID.
NOTE:
If a pipeline is [parent of other child pipelines](../ci/parent_child_pipelines.md), artifacts
If a pipeline is [parent of other child pipelines](../ci/pipelines/parent_child_pipelines.md), artifacts
are searched in hierarchical order from parent to child. For example, if both parent and
child pipelines have a job with the same name, the artifact from the parent pipeline is returned.
......@@ -172,7 +172,7 @@ pipeline for the given reference name from inside the job's artifacts archive.
The file is extracted from the archive and streamed to the client.
In [GitLab 13.5](https://gitlab.com/gitlab-org/gitlab/-/issues/201784) and later, artifacts
for [parent and child pipelines](../ci/parent_child_pipelines.md) are searched in hierarchical
for [parent and child pipelines](../ci/pipelines/parent_child_pipelines.md) are searched in hierarchical
order from parent to child. For example, if both parent and child pipelines have a
job with the same name, the artifact from the parent pipeline is returned.
......
......@@ -295,7 +295,7 @@ Example of response
```
In GitLab 13.3 and later, this endpoint [returns data for any pipeline](pipelines.md#single-pipeline-requests)
including [child pipelines](../ci/parent_child_pipelines.md).
including [child pipelines](../ci/pipelines/parent_child_pipelines.md).
In GitLab 13.5 and later, this endpoint does not return retried jobs in the response
by default. Additionally, jobs are sorted by ID in descending order (newest first).
......
......@@ -11,7 +11,7 @@ info: To determine the technical writer assigned to the Stage/Group associated w
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/merge_requests/36494) in GitLab 13.3.
Endpoints that request information about a single pipeline return data for any pipeline.
Before 13.3, requests for [child pipelines](../ci/parent_child_pipelines.md) returned
Before 13.3, requests for [child pipelines](../ci/pipelines/parent_child_pipelines.md) returned
a 404 error.
## Pipelines pagination
......
......@@ -129,7 +129,7 @@ All project maintainers have access to production secrets. If you need to limit
that can deploy to a production environment, you can create a separate project and configure a new
permission model that isolates the CD permissions from the original project and prevents the
original project's maintainers from accessing the production secret and CD configuration. You can
connect the CD project to your development projects by using [multi-project pipelines](../multi_project_pipelines.md).
connect the CD project to your development projects by using [multi-project pipelines](../pipelines/multi_project_pipelines.md).
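As an illustration of that hand-off, the development project's pipeline could trigger the separate CD project with a job along these lines (a minimal sketch; the project path and job name are hypothetical):

```yaml
# .gitlab-ci.yml in the development project
deploy-via-cd-project:
  stage: deploy
  trigger:
    project: my-group/cd-project   # hypothetical CD project holding the production secrets
    branch: main
```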
## Protect `gitlab-ci.yml` from change
......
......@@ -221,8 +221,8 @@ check the value of the `$CI_PIPELINE_SOURCE` variable:
| `external` | When you use CI services other than GitLab. |
| `external_pull_request_event` | When an external pull request on GitHub is created or updated. See [Pipelines for external pull requests](../ci_cd_for_external_repos/index.md#pipelines-for-external-pull-requests). |
| `merge_request_event` | For pipelines created when a merge request is created or updated. Required to enable [merge request pipelines](../pipelines/merge_request_pipelines.md), [merged results pipelines](../pipelines/pipelines_for_merged_results.md), and [merge trains](../pipelines/merge_trains.md). |
| `parent_pipeline` | For pipelines triggered by a [parent/child pipeline](../parent_child_pipelines.md) with `rules`. Use this pipeline source in the child pipeline configuration so that it can be triggered by the parent pipeline. |
| `pipeline` | For [multi-project pipelines](../multi_project_pipelines.md) created by [using the API with `CI_JOB_TOKEN`](../multi_project_pipelines.md#create-multi-project-pipelines-by-using-the-api), or the [`trigger`](../yaml/index.md#trigger) keyword. |
| `parent_pipeline` | For pipelines triggered by a [parent/child pipeline](../pipelines/parent_child_pipelines.md) with `rules`. Use this pipeline source in the child pipeline configuration so that it can be triggered by the parent pipeline. |
| `pipeline` | For [multi-project pipelines](../pipelines/multi_project_pipelines.md) created by [using the API with `CI_JOB_TOKEN`](../pipelines/multi_project_pipelines.md#create-multi-project-pipelines-by-using-the-api), or the [`trigger`](../yaml/index.md#trigger) keyword. |
| `push` | For pipelines triggered by a `git push` event, including for branches and tags. |
| `schedule` | For [scheduled pipelines](../pipelines/schedules.md). |
| `trigger` | For pipelines created by using a [trigger token](../triggers/index.md#trigger-token). |
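For example, a `rules` clause keyed on this variable can restrict a job to child pipelines started by a parent pipeline (a minimal sketch; the job name is hypothetical):

```yaml
child-only-job:
  script: echo "Runs only when the pipeline was triggered by a parent pipeline"
  rules:
    - if: $CI_PIPELINE_SOURCE == "parent_pipeline"
```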
......
---
stage: Verify
group: Pipeline Execution
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
type: reference
redirect_to: 'pipelines/parent_child_pipelines.md'
---
# Parent-child pipelines **(FREE)**
This document was moved to [another location](pipelines/parent_child_pipelines.md).
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/16094) in GitLab 12.7.
As pipelines grow more complex, a few related problems start to emerge:
- The staged structure, where all steps in a stage must be completed before the first
job in the next stage begins, causes arbitrary waits, slowing things down.
- Configuration for the single global pipeline becomes very long and complicated,
making it hard to manage.
- Imports with [`include`](yaml/index.md#include) increase the complexity of the configuration, and create the potential
for namespace collisions where jobs are unintentionally duplicated.
- Pipeline UX can become unwieldy with so many jobs and stages to work with.
Additionally, sometimes the behavior of a pipeline needs to be more dynamic. The ability
to choose to start sub-pipelines (or not) is powerful, especially if the
YAML is dynamically generated.
![Parent pipeline graph expanded](img/parent_pipeline_graph_expanded_v12_6.png)
Similarly to [multi-project pipelines](multi_project_pipelines.md), a pipeline can trigger a
set of concurrently running child pipelines, but within the same project:
- Child pipelines still execute each of their jobs according to a stage sequence, but
would be free to continue forward through their stages without waiting for unrelated
jobs in the parent pipeline to finish.
- The configuration is split up into smaller child pipeline configurations, which are
easier to understand. This reduces the cognitive load to understand the overall configuration.
- Imports are done at the child pipeline level, reducing the likelihood of collisions.
- Each pipeline has only relevant steps, making it easier to understand what's going on.
Child pipelines work well with other GitLab CI/CD features:
- Use [`rules: changes`](yaml/index.md#ruleschanges) to trigger pipelines only when
certain files change. This is useful for monorepos, for example.
- Since the parent pipeline in `.gitlab-ci.yml` and the child pipeline run as normal
pipelines, they can have their own behaviors and sequencing in relation to triggers.
See the [`trigger:`](yaml/index.md#trigger) keyword documentation for full details on how to
include the child pipeline configuration.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Parent-Child Pipelines feature demo](https://youtu.be/n8KpBSqZNbk).
## Examples
The simplest case is [triggering a child pipeline](yaml/index.md#trigger) using a
local YAML file to define the pipeline configuration. In this case, the parent pipeline
triggers the child pipeline, and continues without waiting:
```yaml
microservice_a:
trigger:
include: path/to/microservice_a.yml
```
You can include multiple files when composing a child pipeline:
```yaml
microservice_a:
trigger:
include:
- local: path/to/microservice_a.yml
- template: Security/SAST.gitlab-ci.yml
```
In [GitLab 13.5](https://gitlab.com/gitlab-org/gitlab/-/issues/205157) and later,
you can use [`include:file`](yaml/index.md#includefile) to trigger child pipelines
with a configuration file in a different project:
```yaml
microservice_a:
trigger:
include:
- project: 'my-group/my-pipeline-library'
file: 'path/to/ci-config.yml'
```
The maximum number of entries that are accepted for `trigger:include:` is three.
Similar to [multi-project pipelines](multi_project_pipelines.md#mirror-status-of-a-triggered-pipeline-in-the-trigger-job),
we can set the parent pipeline to depend on the status of the child pipeline upon completion:
```yaml
microservice_a:
trigger:
include:
- local: path/to/microservice_a.yml
- template: Security/SAST.gitlab-ci.yml
strategy: depend
```
## Merge Request child pipelines
To trigger a child pipeline as a [Merge Request Pipeline](pipelines/merge_request_pipelines.md), we need to:
- Set the trigger job to run on merge requests:
```yaml
# parent .gitlab-ci.yml
microservice_a:
trigger:
include: path/to/microservice_a.yml
rules:
- if: $CI_MERGE_REQUEST_ID
```
- Configure the child pipeline by either:
- Setting all jobs in the child pipeline to evaluate in the context of a merge request:
```yaml
# child path/to/microservice_a.yml
workflow:
rules:
- if: $CI_MERGE_REQUEST_ID
job1:
script: ...
job2:
script: ...
```
- Alternatively, setting the rule per job. For example, to create only `job1` in
the context of merge request pipelines:
```yaml
# child path/to/microservice_a.yml
job1:
script: ...
rules:
- if: $CI_MERGE_REQUEST_ID
job2:
script: ...
```
## Dynamic child pipelines
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/35632) in GitLab 12.9.
Instead of running a child pipeline from a static YAML file, you can define a job that runs
your own script to generate a YAML file, which is then [used to trigger a child pipeline](yaml/index.md#trigger-child-pipeline-with-generated-configuration-file).
This technique can be very powerful for generating pipelines that target only the content
that changed, or for building a matrix of targets and architectures.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Create child pipelines using dynamically generated configurations](https://youtu.be/nMdfus2JWHM).
<!-- vale gitlab.Spelling = NO -->
We also have an example project using
[Dynamic Child Pipelines with Jsonnet](https://gitlab.com/gitlab-org/project-templates/jsonnet)
which shows how to use a data templating language to generate your `.gitlab-ci.yml` at runtime. You could use a similar process for other templating languages like [Dhall](https://dhall-lang.org/) or [`ytt`](https://get-ytt.io/).
<!-- vale gitlab.Spelling = YES -->
The artifact path is parsed by GitLab, not the runner, so the path must match the
syntax for the OS running GitLab. If GitLab is running on Linux but using a Windows
runner for testing, the path separator for the trigger job would be `/`. Other CI/CD
configuration for jobs, like scripts, that use the Windows runner would use `\`.
In GitLab 12.9, the child pipeline could fail to be created in certain cases, causing the parent pipeline to fail.
This is [resolved in GitLab 12.10](https://gitlab.com/gitlab-org/gitlab/-/issues/209070).
## Nested child pipelines
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/29651) in GitLab 13.4.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/243747) in GitLab 13.5.
Parent and child pipelines were introduced with a maximum depth of one level of child
pipelines, which was later increased to two. A parent pipeline can trigger many child
pipelines, and these child pipelines can trigger their own child pipelines. It's not
possible to trigger another level of child pipelines.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Nested Dynamic Pipelines](https://youtu.be/C5j3ju9je2M).
## Pass CI/CD variables to a child pipeline
You can pass CI/CD variables to a downstream pipeline using the same methods as
multi-project pipelines:
- [By using the `variable` keyword](multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-the-variables-keyword).
- [By using variable inheritance](multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance).
<!-- This redirect file can be deleted after 2021-09-29. -->
<!-- Before deletion, see: https://docs.gitlab.com/ee/development/documentation/#move-or-rename-a-page -->
......@@ -50,8 +50,8 @@ Pipelines can be configured in many different ways:
followed by the next stage.
- [Directed Acyclic Graph Pipeline (DAG) pipelines](../directed_acyclic_graph/index.md) are based on relationships
between jobs and can run more quickly than basic pipelines.
- [Multi-project pipelines](../multi_project_pipelines.md) combine pipelines for different projects together.
- [Parent-Child pipelines](../parent_child_pipelines.md) break down complex pipelines
- [Multi-project pipelines](multi_project_pipelines.md) combine pipelines for different projects together.
- [Parent-Child pipelines](parent_child_pipelines.md) break down complex pipelines
into one parent pipeline that can trigger multiple child sub-pipelines, which all
run in the same project and with the same SHA.
- [Pipelines for Merge Requests](../pipelines/merge_request_pipelines.md) run for merge
......@@ -349,7 +349,7 @@ You can group the jobs by:
- [Job dependencies](#view-job-dependencies-in-the-pipeline-graph), which arranges
jobs based on their [`needs`](../yaml/index.md#needs) dependencies.
[Multi-project pipeline graphs](../multi_project_pipelines.md#multi-project-pipeline-visualization) help
[Multi-project pipeline graphs](multi_project_pipelines.md#multi-project-pipeline-visualization) help
you visualize the entire pipeline, including all cross-project inter-dependencies. **(PREMIUM)**
### View job dependencies in the pipeline graph
......
......@@ -112,7 +112,7 @@ the artifact.
## How searching for job artifacts works
In [GitLab 13.5](https://gitlab.com/gitlab-org/gitlab/-/issues/201784) and later, artifacts
for [parent and child pipelines](../parent_child_pipelines.md) are searched in hierarchical
for [parent and child pipelines](parent_child_pipelines.md) are searched in hierarchical
order from parent to child. For example, if both parent and child pipelines have a
job with the same name, the job artifact from the parent pipeline is returned.
......
---
stage: Verify
group: Pipeline Authoring
info: To determine the technical writer assigned to the Stage/Group associated with this page, see https://about.gitlab.com/handbook/engineering/ux/technical-writing/#assignments
type: reference
---
# Parent-child pipelines **(FREE)**
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/16094) in GitLab 12.7.
As pipelines grow more complex, a few related problems start to emerge:
- The staged structure, where all steps in a stage must be completed before the first
job in the next stage begins, causes arbitrary waits, slowing things down.
- Configuration for the single global pipeline becomes very long and complicated,
making it hard to manage.
- Imports with [`include`](../yaml/index.md#include) increase the complexity of the configuration, and create the potential
for namespace collisions where jobs are unintentionally duplicated.
- Pipeline UX can become unwieldy with so many jobs and stages to work with.
Additionally, sometimes the behavior of a pipeline needs to be more dynamic. The ability
to choose to start sub-pipelines (or not) is powerful, especially if the
YAML is dynamically generated.
![Parent pipeline graph expanded](img/parent_pipeline_graph_expanded_v12_6.png)
Similarly to [multi-project pipelines](multi_project_pipelines.md), a pipeline can trigger a
set of concurrently running child pipelines, but within the same project:
- Child pipelines still execute each of their jobs according to a stage sequence, but
would be free to continue forward through their stages without waiting for unrelated
jobs in the parent pipeline to finish.
- The configuration is split up into smaller child pipeline configurations, which are
easier to understand. This reduces the cognitive load to understand the overall configuration.
- Imports are done at the child pipeline level, reducing the likelihood of collisions.
- Each pipeline has only relevant steps, making it easier to understand what's going on.
Child pipelines work well with other GitLab CI/CD features:
- Use [`rules: changes`](../yaml/index.md#ruleschanges) to trigger pipelines only when
certain files change. This is useful for monorepos, for example.
- Since the parent pipeline in `.gitlab-ci.yml` and the child pipeline run as normal
pipelines, they can have their own behaviors and sequencing in relation to triggers.
See the [`trigger:`](../yaml/index.md#trigger) keyword documentation for full details on how to
include the child pipeline configuration.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Parent-Child Pipelines feature demo](https://youtu.be/n8KpBSqZNbk).
## Examples
The simplest case is [triggering a child pipeline](../yaml/index.md#trigger) using a
local YAML file to define the pipeline configuration. In this case, the parent pipeline
triggers the child pipeline, and continues without waiting:
```yaml
microservice_a:
trigger:
include: path/to/microservice_a.yml
```
You can include multiple files when composing a child pipeline:
```yaml
microservice_a:
trigger:
include:
- local: path/to/microservice_a.yml
- template: Security/SAST.gitlab-ci.yml
```
In [GitLab 13.5](https://gitlab.com/gitlab-org/gitlab/-/issues/205157) and later,
you can use [`include:file`](../yaml/index.md#includefile) to trigger child pipelines
with a configuration file in a different project:
```yaml
microservice_a:
trigger:
include:
- project: 'my-group/my-pipeline-library'
file: 'path/to/ci-config.yml'
```
The maximum number of entries that are accepted for `trigger:include:` is three.
Similar to [multi-project pipelines](multi_project_pipelines.md#mirror-status-of-a-triggered-pipeline-in-the-trigger-job),
we can set the parent pipeline to depend on the status of the child pipeline upon completion:
```yaml
microservice_a:
trigger:
include:
- local: path/to/microservice_a.yml
- template: Security/SAST.gitlab-ci.yml
strategy: depend
```
## Merge Request child pipelines
To trigger a child pipeline as a [Merge Request Pipeline](merge_request_pipelines.md), we need to:
- Set the trigger job to run on merge requests:
```yaml
# parent .gitlab-ci.yml
microservice_a:
trigger:
include: path/to/microservice_a.yml
rules:
- if: $CI_MERGE_REQUEST_ID
```
- Configure the child pipeline by either:
- Setting all jobs in the child pipeline to evaluate in the context of a merge request:
```yaml
# child path/to/microservice_a.yml
workflow:
rules:
- if: $CI_MERGE_REQUEST_ID
job1:
script: ...
job2:
script: ...
```
- Alternatively, setting the rule per job. For example, to create only `job1` in
the context of merge request pipelines:
```yaml
# child path/to/microservice_a.yml
job1:
script: ...
rules:
- if: $CI_MERGE_REQUEST_ID
job2:
script: ...
```
## Dynamic child pipelines
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/35632) in GitLab 12.9.
Instead of running a child pipeline from a static YAML file, you can define a job that runs
your own script to generate a YAML file, which is then [used to trigger a child pipeline](../yaml/index.md#trigger-child-pipeline-with-generated-configuration-file).
This technique can be very powerful for generating pipelines that target only the content
that changed, or for building a matrix of targets and architectures.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Create child pipelines using dynamically generated configurations](https://youtu.be/nMdfus2JWHM).
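A minimal sketch of this pattern, assuming a hypothetical `generate-ci-config` script that writes the child configuration to an artifact:

```yaml
generate-config:
  stage: build
  script: generate-ci-config > generated-config.yml   # hypothetical generator script
  artifacts:
    paths:
      - generated-config.yml

child-pipeline:
  stage: test
  trigger:
    include:
      - artifact: generated-config.yml
        job: generate-config
```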
<!-- vale gitlab.Spelling = NO -->
We also have an example project using
[Dynamic Child Pipelines with Jsonnet](https://gitlab.com/gitlab-org/project-templates/jsonnet)
which shows how to use a data templating language to generate your `.gitlab-ci.yml` at runtime. You could use a similar process for other templating languages like [Dhall](https://dhall-lang.org/) or [`ytt`](https://get-ytt.io/).
<!-- vale gitlab.Spelling = YES -->
The artifact path is parsed by GitLab, not the runner, so the path must match the
syntax for the OS running GitLab. If GitLab is running on Linux but using a Windows
runner for testing, the path separator for the trigger job would be `/`. Other CI/CD
configuration for jobs, like scripts, that use the Windows runner would use `\`.
In GitLab 12.9, the child pipeline could fail to be created in certain cases, causing the parent pipeline to fail.
This is [resolved in GitLab 12.10](https://gitlab.com/gitlab-org/gitlab/-/issues/209070).
## Nested child pipelines
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/29651) in GitLab 13.4.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/243747) in GitLab 13.5.
Parent and child pipelines were introduced with a maximum depth of one level of child
pipelines, which was later increased to two. A parent pipeline can trigger many child
pipelines, and these child pipelines can trigger their own child pipelines. It's not
possible to trigger another level of child pipelines.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Nested Dynamic Pipelines](https://youtu.be/C5j3ju9je2M).
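A sketch of the two permitted levels, with hypothetical file names: the parent triggers a child, and the child triggers a grandchild. A further level of nesting is not possible.

```yaml
# parent .gitlab-ci.yml
child:
  trigger:
    include: child.yml
```

```yaml
# child.yml
grandchild:
  trigger:
    include: grandchild.yml
```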
## Pass CI/CD variables to a child pipeline
You can pass CI/CD variables to a downstream pipeline using the same methods as
multi-project pipelines:
- [By using the `variable` keyword](multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-the-variables-keyword).
- [By using variable inheritance](multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance).
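For instance, a trigger job can pass values to its child pipeline with the `variables` keyword (a minimal sketch; the variable name and paths are illustrative):

```yaml
# parent .gitlab-ci.yml
staging-child:
  variables:
    ENVIRONMENT: staging   # passed down to the child pipeline
  trigger:
    include: path/to/child.yml
```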
......@@ -162,7 +162,7 @@ deploy_b:
## Child / Parent Pipelines
In the examples above, it's clear we've got two types of things that could be built independently.
This is an ideal case for using [Child / Parent Pipelines](../parent_child_pipelines.md) via
This is an ideal case for using [Child / Parent Pipelines](parent_child_pipelines.md) via
the [`trigger` keyword](../yaml/index.md#trigger). It separates out the configuration
into multiple files, keeping things very simple. You can also combine this with:
......
......@@ -186,7 +186,7 @@ shouldn't run, saving pipeline resources.
In a basic configuration, jobs always wait for all other jobs in earlier stages to complete
before running. This is the simplest configuration, but it's also the slowest in most
cases. [Directed Acyclic Graphs](../directed_acyclic_graph/index.md) and
[parent/child pipelines](../parent_child_pipelines.md) are more flexible and can
[parent/child pipelines](parent_child_pipelines.md) are more flexible and can
be more efficient, but can also make pipelines harder to understand and analyze.
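For contrast, a small `needs` relationship lets a test job start as soon as its own build finishes instead of waiting for the whole stage (a minimal sketch; job names are hypothetical):

```yaml
build-a:
  stage: build
  script: echo "building a"

build-b:
  stage: build
  script: echo "building b"

test-a:
  stage: test
  needs: [build-a]           # starts as soon as build-a finishes, even if build-b is still running
  script: echo "testing a"
```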
### Caching
......
......@@ -40,7 +40,7 @@ Below are the shared runners settings.
| Setting | GitLab.com | Default |
| ----------- | ----------------- | ---------- |
| [GitLab Runner](https://gitlab.com/gitlab-org/gitlab-runner) | [Runner versions dashboard](https://dashboards.gitlab.com/d/000000159/ci?from=now-1h&to=now&refresh=5m&orgId=1&panelId=12&fullscreen&theme=light) | - |
| [GitLab Runner](https://gitlab.com/gitlab-org/gitlab-runner) | [Runner versions dashboard](https://dashboards.gitlab.net/d/ci-runners-deployment/ci-runners-deployment-overview?orgId=1&refresh=1m) | - |
| Executor | `docker+machine` | - |
| Default Docker image | `ruby:2.5` | - |
| `privileged` (run [Docker in Docker](https://hub.docker.com/_/docker/)) | `true` | `false` |
......
......@@ -50,7 +50,7 @@ with the [GitLab Container Registry](../../user/packages/container_registry/inde
This way of triggering can only be used when invoked inside `.gitlab-ci.yml`,
and it creates a dependent pipeline relation visible on the
[pipeline graph](../multi_project_pipelines.md). For example:
[pipeline graph](../pipelines/multi_project_pipelines.md). For example:
```yaml
trigger_pipeline:
......
......@@ -60,11 +60,11 @@ and [templates](examples/index.md#cicd-templates).
Some pipeline types have their own detailed usage guides that you should read
if you are using that type:
- [Multi-project pipelines](multi_project_pipelines.md): Have your pipeline trigger
- [Multi-project pipelines](pipelines/multi_project_pipelines.md): Have your pipeline trigger
a pipeline in a different project.
- [Parent/child pipelines](parent_child_pipelines.md): Have your main pipeline trigger
- [Parent/child pipelines](pipelines/parent_child_pipelines.md): Have your main pipeline trigger
and run separate pipelines in the same project. You can also
[dynamically generate the child pipeline's configuration](parent_child_pipelines.md#dynamic-child-pipelines)
[dynamically generate the child pipeline's configuration](pipelines/parent_child_pipelines.md#dynamic-child-pipelines)
at runtime.
- [Pipelines for Merge Requests](pipelines/merge_request_pipelines.md): Run a pipeline
in the context of a merge request.
......
......@@ -581,8 +581,8 @@ You can override the value of a variable when you:
1. Run a job manually in the UI.
1. Use [push options](../../user/project/push_options.md#push-options-for-gitlab-cicd).
1. Trigger a pipeline by using [the API](../triggers/index.md#making-use-of-trigger-variables).
1. Pass variables to a downstream pipeline [by using the `variable` keyword](../multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-the-variables-keyword)
or [by using variable inheritance](../multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance).
1. Pass variables to a downstream pipeline [by using the `variable` keyword](../pipelines/multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-the-variables-keyword)
or [by using variable inheritance](../pipelines/multi_project_pipelines.md#pass-cicd-variables-to-a-downstream-pipeline-by-using-variable-inheritance).
The pipeline variables declared in these events take [priority over other variables](#cicd-variable-precedence).
......
......@@ -1336,7 +1336,7 @@ pipeline based on branch names or pipeline types.
| `external` | When you use CI services other than GitLab. |
| `external_pull_requests` | When an external pull request on GitHub is created or updated (See [Pipelines for external pull requests](../ci_cd_for_external_repos/index.md#pipelines-for-external-pull-requests)). |
| `merge_requests` | For pipelines created when a merge request is created or updated. Enables [merge request pipelines](../pipelines/merge_request_pipelines.md), [merged results pipelines](../pipelines/pipelines_for_merged_results.md), and [merge trains](../pipelines/merge_trains.md). |
| `pipelines` | For [multi-project pipelines](../multi_project_pipelines.md) created by [using the API with `CI_JOB_TOKEN`](../multi_project_pipelines.md#create-multi-project-pipelines-by-using-the-api), or the [`trigger`](#trigger) keyword. |
| `pipelines` | For [multi-project pipelines](../pipelines/multi_project_pipelines.md) created by [using the API with `CI_JOB_TOKEN`](../pipelines/multi_project_pipelines.md#create-multi-project-pipelines-by-using-the-api), or the [`trigger`](#trigger) keyword. |
| `pushes` | For pipelines triggered by a `git push` event, including for branches and tags. |
| `schedules` | For [scheduled pipelines](../pipelines/schedules.md). |
| `tags` | When the Git reference for a pipeline is a tag. |
......@@ -1710,7 +1710,7 @@ build_job:
You can't download artifacts from jobs that run in [`parallel:`](#parallel).
To download artifacts between [parent-child pipelines](../parent_child_pipelines.md),
To download artifacts between [parent-child pipelines](../pipelines/parent_child_pipelines.md),
use [`needs:pipeline`](#artifact-downloads-to-child-pipelines).
You should not download artifacts from the same ref as a running pipeline. Concurrent
......@@ -1720,7 +1720,7 @@ pipelines running on the same ref could override the artifacts.
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/255983) in GitLab v13.7.
A [child pipeline](../parent_child_pipelines.md) can download artifacts from a job in
A [child pipeline](../pipelines/parent_child_pipelines.md) can download artifacts from a job in
its parent pipeline or another child pipeline in the same parent-child pipeline hierarchy.
For example, with the following parent pipeline that has a job that creates some artifacts:
......@@ -3305,7 +3305,7 @@ If there is more than one matched line in the job output, the last line is used.
For the matched line, the first occurrence of `\d+(\.\d+)?` is the code coverage.
Leading zeros are removed.
Coverage output from [child pipelines](../parent_child_pipelines.md) is not recorded
Coverage output from [child pipelines](../pipelines/parent_child_pipelines.md) is not recorded
or displayed. Check [the related issue](https://gitlab.com/gitlab-org/gitlab/-/issues/280818)
for more details.
......@@ -3561,14 +3561,14 @@ deploystacks: [vultr, data]
Use `trigger` to define a downstream pipeline trigger. When GitLab starts a `trigger` job,
a downstream pipeline is created.
Jobs with `trigger` can only use a [limited set of keywords](../multi_project_pipelines.md#define-multi-project-pipelines-in-your-gitlab-ciyml-file).
Jobs with `trigger` can only use a [limited set of keywords](../pipelines/multi_project_pipelines.md#define-multi-project-pipelines-in-your-gitlab-ciyml-file).
For example, you can't run commands with [`script`](#script), [`before_script`](#before_script),
or [`after_script`](#after_script).
You can use this keyword to create two different types of downstream pipelines:
- [Multi-project pipelines](../multi_project_pipelines.md#define-multi-project-pipelines-in-your-gitlab-ciyml-file)
- [Child pipelines](../parent_child_pipelines.md)
- [Multi-project pipelines](../pipelines/multi_project_pipelines.md#define-multi-project-pipelines-in-your-gitlab-ciyml-file)
- [Child pipelines](../pipelines/parent_child_pipelines.md)
[In GitLab 13.2](https://gitlab.com/gitlab-org/gitlab/-/issues/197140/) and later, you can
view which job triggered a downstream pipeline. In the [pipeline graph](../pipelines/index.md#visualize-pipelines),
......@@ -3633,7 +3633,7 @@ upstream_bridge:
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/16094) in GitLab 12.7.
To create a [child pipeline](../parent_child_pipelines.md), specify the path to the
To create a [child pipeline](../pipelines/parent_child_pipelines.md), specify the path to the
YAML file that contains the configuration of the child pipeline:
```yaml
......@@ -3642,7 +3642,7 @@ trigger_job:
include: path/to/child-pipeline.yml
```
Similar to [multi-project pipelines](../multi_project_pipelines.md#mirror-status-of-a-triggered-pipeline-in-the-trigger-job),
Similar to [multi-project pipelines](../pipelines/multi_project_pipelines.md#mirror-status-of-a-triggered-pipeline-in-the-trigger-job),
it's possible to mirror the status from a triggered pipeline:
```yaml
......@@ -3657,7 +3657,7 @@ trigger_job:
> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/35632) in GitLab 12.9.
You can also trigger a child pipeline from a [dynamically generated configuration file](../parent_child_pipelines.md#dynamic-child-pipelines):
You can also trigger a child pipeline from a [dynamically generated configuration file](../pipelines/parent_child_pipelines.md#dynamic-child-pipelines):
```yaml
generate-config:
......
......@@ -30,7 +30,7 @@ On the left side we have the events that can trigger a pipeline based on various
- When a [merge request is created or updated](../../ci/pipelines/merge_request_pipelines.md#pipelines-for-merge-requests).
- When an MR is added to a [Merge Train](../../ci/pipelines/merge_trains.md#merge-trains).
- A [scheduled pipeline](../../ci/pipelines/schedules.md#pipeline-schedules).
- When a project is [subscribed to an upstream project](../../ci/multi_project_pipelines.md#trigger-a-pipeline-when-an-upstream-project-is-rebuilt).
- When a project is [subscribed to an upstream project](../../ci/pipelines/multi_project_pipelines.md#trigger-a-pipeline-when-an-upstream-project-is-rebuilt).
- When [Auto DevOps](../../topics/autodevops/index.md) is enabled.
- When GitHub integration is used with [external pull requests](../../ci/ci_cd_for_external_repos/index.md#pipelines-for-external-pull-requests).
- When an upstream pipeline contains a [bridge job](../../ci/yaml/index.md#trigger) which triggers a downstream pipeline.
......
......@@ -476,7 +476,7 @@ If you want to know the in-depth details, here's what's really happening:
The following GitLab features are used, among others:
- [Manual actions](../../ci/yaml/index.md#whenmanual)
- [Multi project pipelines](../../ci/multi_project_pipelines.md)
- [Multi project pipelines](../../ci/pipelines/multi_project_pipelines.md)
- [Review Apps](../../ci/review_apps/index.md)
- [Artifacts](../../ci/yaml/index.md#artifacts)
- [Specific runner](../../ci/runners/runners_scope.md#prevent-a-specific-runner-from-being-enabled-for-other-projects)
......