Commit 5b4aca67 authored by Marcel Amirault, committed by Evan Read

Update artifacts reports keyword reference and create new page

parent 0e794a62
......@@ -256,7 +256,7 @@ GitLab supports the [dotenv (`.env`)](https://github.com/bkeepers/dotenv) file f
and expands the `environment:url` value with variables defined in the `.env` file.
To use this feature, specify the
[`artifacts:reports:dotenv`](../yaml/index.md#artifactsreportsdotenv) keyword in `.gitlab-ci.yml`.
[`artifacts:reports:dotenv`](../yaml/artifacts_reports.md#artifactsreportsdotenv) keyword in `.gitlab-ci.yml`.
<i class="fa fa-youtube-play youtube" aria-hidden="true"></i>
For an overview, see [Set dynamic URLs after a job finished](https://youtu.be/70jDXtOf4Ig).
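For illustration, a minimal sketch of such a job, where the job name, dotenv file name, and URL scheme are placeholders rather than required values:

```yaml
review:
  stage: deploy
  script:
    - echo "DYNAMIC_ENVIRONMENT_URL=https://$CI_COMMIT_REF_SLUG.example.com" >> deploy.env  # assumed URL scheme
  environment:
    name: review/$CI_COMMIT_REF_SLUG
    url: $DYNAMIC_ENVIRONMENT_URL
  artifacts:
    reports:
      dotenv: deploy.env
```

When the job finishes, GitLab expands `environment:url` with the value of `DYNAMIC_ENVIRONMENT_URL` from `deploy.env`.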
......
......@@ -37,7 +37,7 @@ For an MR, the values of these metrics from the feature branch are compared to t
## How to set it up
Add a job that creates a [metrics report](yaml/index.md#artifactsreportsmetrics) (default filename: `metrics.txt`). The file should conform to the [OpenMetrics](https://openmetrics.io/) format.
Add a job that creates a [metrics report](yaml/artifacts_reports.md#artifactsreportsmetrics) (default filename: `metrics.txt`). The file should conform to the [OpenMetrics](https://openmetrics.io/) format.
For example:
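A minimal sketch of such a job (the job name and the sample metric are placeholders):

```yaml
metrics:
  script:
    - echo 'http_requests_total 1250' > metrics.txt  # one OpenMetrics-style sample
  artifacts:
    reports:
      metrics: metrics.txt
```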
......
......@@ -41,7 +41,7 @@ Consider the following workflow:
## How it works
First, GitLab Runner uploads all [JUnit report format XML files](https://www.ibm.com/docs/en/adfz/developer-for-zos/14.1.0?topic=formats-junit-xml-format)
as [artifacts](yaml/index.md#artifactsreportsjunit) to GitLab. Then, when you visit a merge request, GitLab starts
as [artifacts](yaml/artifacts_reports.md#artifactsreportsjunit) to GitLab. Then, when you visit a merge request, GitLab starts
comparing the head and base branch's JUnit report format XML files, where:
- The base branch is the target branch (usually the default branch).
......@@ -77,7 +77,7 @@ If a test failed in the project's default branch in the last 14 days, a message
## How to set it up
To enable the Unit test reports in merge requests, you need to add
[`artifacts:reports:junit`](yaml/index.md#artifactsreportsjunit)
[`artifacts:reports:junit`](yaml/artifacts_reports.md#artifactsreportsjunit)
in `.gitlab-ci.yml`, and specify the path(s) of the generated test reports.
The reports must be `.xml` files, otherwise [GitLab returns an Error 500](https://gitlab.com/gitlab-org/gitlab/-/issues/216575).
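A sketch of such a configuration for a Ruby project (the job name and test command are assumptions; use whatever command produces JUnit-format XML in your project):

```yaml
test:
  stage: test
  script:
    - bundle exec rspec --format RspecJunitFormatter --out rspec.xml  # assumed test runner setup
  artifacts:
    when: always
    reports:
      junit: rspec.xml
```

Setting `artifacts:when: always` keeps the report available even when the test job fails, which is when the report is most useful.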
......@@ -377,7 +377,7 @@ GitLab does not parse very [large nodes](https://nokogiri.org/tutorials/parsing_
> - [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/202114) in GitLab 13.0 behind the `:junit_pipeline_screenshots_view` feature flag, disabled by default.
> - [Feature flag removed](https://gitlab.com/gitlab-org/gitlab/-/issues/216979) in GitLab 13.12.
Upload your screenshots as [artifacts](yaml/index.md#artifactsreportsjunit) to GitLab. If JUnit
Upload your screenshots as [artifacts](yaml/artifacts_reports.md#artifactsreportsjunit) to GitLab. If JUnit
report format XML files contain an `attachment` tag, GitLab parses the attachment. Note that:
- The `attachment` tag **must** contain the relative path to `$CI_PROJECT_DIR` of the screenshots you uploaded. For
......
......@@ -554,7 +554,7 @@ These variables cannot be used as CI/CD variables to configure a pipeline, but
they can be used in job scripts.
1. In the job script, save the variable as a `.env` file.
1. Save the `.env` file as an [`artifacts:reports:dotenv`](../yaml/index.md#artifactsreportsdotenv)
1. Save the `.env` file as an [`artifacts:reports:dotenv`](../yaml/artifacts_reports.md#artifactsreportsdotenv)
artifact.
1. Set a job in a later stage to receive the artifact by using the [`dependencies`](../yaml/index.md#dependencies)
or the [`needs`](../yaml/index.md#needs) keywords, as shown in the sketch below.
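The steps above in a minimal sketch (the job names and the `BUILD_VERSION` variable are illustrative):

```yaml
build:
  stage: build
  script:
    - echo "BUILD_VERSION=1.0.0" >> build.env  # save the variable in a .env file
  artifacts:
    reports:
      dotenv: build.env

deploy:
  stage: deploy
  script:
    - echo "Deploying version $BUILD_VERSION"  # the variable is available through the dotenv report
  needs: ["build"]
```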
......
......@@ -153,7 +153,7 @@ The included template:
fetches vulnerabilities found by [Starboard Operator](https://aquasecurity.github.io/starboard/v0.10.3/operator/).
GitLab saves the results as a
[Cluster Image Scanning report artifact](../../../ci/yaml/index.md#artifactsreportscluster_image_scanning)
[Cluster Image Scanning report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportscluster_image_scanning)
that you can download and analyze later. When downloading, you always receive the most recent
artifact.
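Schematically, a job exposes this report through the `cluster_image_scanning` report type; the job name, script, and file name in this sketch are assumptions:

```yaml
cluster_image_scanning:
  script:
    - ./export-starboard-findings.sh > gl-cluster-image-scanning-report.json  # hypothetical export step
  artifacts:
    reports:
      cluster_image_scanning: gl-cluster-image-scanning-report.json  # assumed report file name
```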
......
......@@ -75,7 +75,7 @@ The included template:
(see [requirements](#requirements)) and scans it for possible vulnerabilities.
GitLab saves the results as a
[Container Scanning report artifact](../../../ci/yaml/index.md#artifactsreportscontainer_scanning)
[Container Scanning report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportscontainer_scanning)
that you can download and analyze later. When downloading, you always receive the most-recent
artifact.
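As a sketch, a job publishes this report through the `container_scanning` report type (the job name, command, and file name below are illustrative):

```yaml
container_scanning:
  script:
    - ./scan-image.sh > gl-container-scanning-report.json  # hypothetical scanner invocation
  artifacts:
    reports:
      container_scanning: gl-container-scanning-report.json  # assumed report file name
```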
......
......@@ -254,7 +254,7 @@ The included template creates a `dast` job in your CI/CD pipeline and scans
your project's running application for possible vulnerabilities.
The results are saved as a
[DAST report artifact](../../../ci/yaml/index.md#artifactsreportsdast)
[DAST report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsdast)
that you can later download and analyze. Due to implementation limitations, we
always take the latest DAST artifact available. Behind the scenes, the
[GitLab DAST Docker image](https://gitlab.com/security-products/dast)
......
......@@ -425,7 +425,7 @@ include:
The included template creates dependency scanning jobs in your CI/CD
pipeline and scans your project's source code for possible vulnerabilities.
The results are saved as a
[dependency scanning report artifact](../../../ci/yaml/index.md#artifactsreportsdependency_scanning)
[dependency scanning report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsdependency_scanning)
that you can later download and analyze. Due to implementation limitations, we
always take the latest dependency scanning artifact available.
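As a sketch, a job publishes this report through the `dependency_scanning` report type (the job name, command, and file name are assumptions):

```yaml
dependency_scanning:
  script:
    - ./run-dependency-scan.sh > gl-dependency-scanning-report.json  # hypothetical analyzer invocation
  artifacts:
    reports:
      dependency_scanning: gl-dependency-scanning-report.json  # assumed report file name
```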
......
......@@ -74,7 +74,7 @@ The included template creates IaC scanning jobs in your CI/CD pipeline and scans
your project's configuration files for possible vulnerabilities.
The results are saved as a
[SAST report artifact](../../../ci/yaml/index.md#artifactsreportssast)
[SAST report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportssast)
that you can download and analyze.
### Enable IaC Scanning via an automatic merge request
......
......@@ -184,7 +184,7 @@ The included template creates SAST jobs in your CI/CD pipeline and scans
your project's source code for possible vulnerabilities.
The results are saved as a
[SAST report artifact](../../../ci/yaml/index.md#artifactsreportssast)
[SAST report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportssast)
that you can later download and analyze. Due to implementation limitations, we
always take the latest SAST artifact available.
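Schematically, a job exposes this report through the `sast` report type; the job name, command, and file name in this sketch are assumptions:

```yaml
sast:
  script:
    - ./run-sast-analyzer.sh > gl-sast-report.json  # hypothetical analyzer invocation
  artifacts:
    reports:
      sast: gl-sast-report.json  # assumed report file name
```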
......
......@@ -134,7 +134,7 @@ The included template creates Secret Detection jobs in your CI/CD pipeline and s
your project's source code for secrets.
The results are saved as a
[Secret Detection report artifact](../../../ci/yaml/index.md#artifactsreportssecret_detection)
[Secret Detection report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportssecret_detection)
that you can later download and analyze. Due to implementation limitations, we
always take the latest Secret Detection artifact available.
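As a sketch, a job exposes this report through the `secret_detection` report type (the job name, command, and file name are placeholders):

```yaml
secret_detection:
  script:
    - ./run-secret-scan.sh > gl-secret-detection-report.json  # hypothetical analyzer invocation
  artifacts:
    reports:
      secret_detection: gl-secret-detection-report.json  # assumed report file name
```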
......
......@@ -126,7 +126,7 @@ the `license_management` job, so you must migrate to the `license_scanning` job
`License-Scanning.gitlab-ci.yml` template.
The results are saved as a
[License Compliance report artifact](../../../ci/yaml/index.md#artifactsreportslicense_scanning)
[License Compliance report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportslicense_scanning)
that you can later download and analyze. Due to implementation limitations, we
always take the latest License Compliance artifact available. Behind the scenes, the
[GitLab License Compliance Docker image](https://gitlab.com/gitlab-org/security-products/analyzers/license-finder)
......
......@@ -10,7 +10,7 @@ Collaborating around Infrastructure as Code (IaC) changes requires both code cha
## Output Terraform Plan information into a merge request
Using the [GitLab Terraform Report artifact](../../../ci/yaml/index.md#artifactsreportsterraform),
Using the [GitLab Terraform Report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsterraform),
you can expose details from `terraform plan` runs directly into a merge request widget,
enabling you to see statistics about the resources that Terraform creates,
modifies, or destroys.
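A sketch of a job that produces this artifact (the job name, plan file names, and conversion step are assumptions; the plan JSON is typically reduced to the summary format the widget expects before upload):

```yaml
plan:
  stage: build
  script:
    - terraform plan -out=plan.cache
    # Convert the plan to JSON and reduce it to the expected summary format.
    - terraform show -json plan.cache | convert_plan > tfplan.json  # `convert_plan` is a hypothetical helper
  artifacts:
    reports:
      terraform: tfplan.json
```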
......@@ -62,7 +62,7 @@ To manually configure a GitLab Terraform Report artifact:
1. Define a `script` that runs `terraform plan` and `terraform show`. These commands
pipe the output and store the relevant parts in a `PLAN_JSON` variable.
This JSON is used to create a
[GitLab Terraform Report artifact](../../../ci/yaml/index.md#artifactsreportsterraform).
[GitLab Terraform Report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsterraform).
The Terraform report is built from a Terraform `tfplan.json` file. The collected
Terraform plan report is uploaded to GitLab as an artifact and is shown in merge requests.
......
......@@ -40,7 +40,7 @@ Consider the following workflow:
## How browser performance testing works
First, define a job in your `.gitlab-ci.yml` file that generates the
[Browser Performance report artifact](../../../ci/yaml/index.md#artifactsreportsbrowser_performance).
[Browser Performance report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsbrowser_performance).
GitLab then checks this report, compares key performance metrics for each page
between the source and target branches, and shows the information in the merge request.
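A sketch of such a job (the job name, wrapper script, and report file name are assumptions):

```yaml
browser_performance:
  script:
    - ./scripts/run-sitespeed.sh  # hypothetical wrapper that writes browser-performance.json
  artifacts:
    reports:
      browser_performance: browser-performance.json
```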
......@@ -89,7 +89,7 @@ The above example:
GitLab 12.3 or earlier, you must [add the configuration manually](#gitlab-versions-132-and-earlier).
The template uses the [GitLab plugin for sitespeed.io](https://gitlab.com/gitlab-org/gl-performance),
and it saves the full HTML sitespeed.io report as a [Browser Performance report artifact](../../../ci/yaml/index.md#artifactsreportsbrowser_performance)
and it saves the full HTML sitespeed.io report as a [Browser Performance report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsbrowser_performance)
that you can later download and analyze. This implementation always takes the latest
Browser Performance artifact available. If [GitLab Pages](../pages/index.md) is enabled,
you can view the report directly in your browser.
......
......@@ -87,7 +87,7 @@ include:
The above example creates a `code_quality` job in your CI/CD pipeline which
scans your source code for code quality issues. The report is saved as a
[Code Quality report artifact](../../../ci/yaml/index.md#artifactsreportscodequality)
[Code Quality report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportscodequality)
that you can later download and analyze.
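For a custom setup, a job can publish the report through the `codequality` report type; the job name, linter command, and file name in this sketch are assumptions:

```yaml
code_quality:
  script:
    - ./run-linter --format codeclimate > gl-code-quality-report.json  # hypothetical tool invocation
  artifacts:
    reports:
      codequality: gl-code-quality-report.json  # assumed report file name
```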
It's also possible to override the URL to the Code Quality image by
......@@ -343,7 +343,7 @@ It's possible to have a custom tool provide Code Quality reports in GitLab. To
do this:
1. Define a job in your `.gitlab-ci.yml` file that generates the
[Code Quality report artifact](../../../ci/yaml/index.md#artifactsreportscodequality).
[Code Quality report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportscodequality).
1. Configure your tool to generate the Code Quality report artifact as a JSON
file that implements a subset of the [Code Climate
spec](https://github.com/codeclimate/platform/blob/master/spec/analyzers/SPEC.md#data-types).
......
......@@ -28,7 +28,7 @@ GET calls to a popular API endpoint in your application to see how it performs.
## How Load Performance Testing works
First, define a job in your `.gitlab-ci.yml` file that generates the
[Load Performance report artifact](../../../ci/yaml/index.md#artifactsreportsload_performance).
[Load Performance report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsload_performance).
GitLab checks this report, compares key load performance metrics
between the source and target branches, and then shows the information in a merge request widget:
......@@ -140,7 +140,7 @@ For example, you can override the duration of the test with a CLI option:
GitLab only displays the key performance metrics in the MR widget if k6's results are saved
via [summary export](https://k6.io/docs/results-visualization/json#summary-export)
as a [Load Performance report artifact](../../../ci/yaml/index.md#artifactsreportsload_performance).
as a [Load Performance report artifact](../../../ci/yaml/artifacts_reports.md#artifactsreportsload_performance).
The latest available Load Performance artifact is always used, and the
summary values are taken from that test run.
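A sketch of such a k6 job (the test script and report file names are assumptions):

```yaml
load_performance:
  script:
    - k6 run load-test.js --summary-export=load-performance.json  # assumed test script name
  artifacts:
    reports:
      load_performance: load-performance.json
```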
......
......@@ -29,7 +29,7 @@ between pipeline completion and the visualization loading on the page.
For the coverage analysis to work, you have to provide a properly formatted
[Cobertura XML](https://cobertura.github.io/cobertura/) report to
[`artifacts:reports:cobertura`](../../../ci/yaml/index.md#artifactsreportscobertura).
[`artifacts:reports:cobertura`](../../../ci/yaml/artifacts_reports.md#artifactsreportscobertura).
This format was originally developed for Java, but most coverage analysis frameworks
for other languages have plugins to add support for it, like:
......
......@@ -134,7 +134,7 @@ You can also sort the requirements list by:
> - [Introduced](https://gitlab.com/groups/gitlab-org/-/epics/2859) in [GitLab Ultimate](https://about.gitlab.com/pricing/) 13.1.
> - [Added](https://gitlab.com/gitlab-org/gitlab/-/issues/215514) ability to specify individual requirements and their statuses in [GitLab Ultimate](https://about.gitlab.com/pricing/) 13.2.
GitLab supports [requirements test reports](../../../ci/yaml/index.md#artifactsreportsrequirements) now.
GitLab supports [requirements test reports](../../../ci/yaml/artifacts_reports.md#artifactsreportsrequirements) now.
You can add a job to your CI pipeline that, when triggered, marks all existing
requirements as Satisfied (you can also satisfy a requirement manually when you [edit a requirement](#edit-a-requirement)).
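A sketch of such a job (the job name, file path, and exact report format shown are assumptions; the report maps requirement IDs, or `*` for all of them, to a status):

```yaml
requirements_confirmation:
  when: manual
  allow_failure: false
  script:
    - mkdir -p tmp
    - echo '{"*":"passed"}' > tmp/requirements.json  # mark every existing requirement as satisfied
  artifacts:
    reports:
      requirements: tmp/requirements.json
```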
......