Add latest changes from gitlab-org/gitlab@master
GitLab Bot committed Oct 8, 2020
1 parent f779698 commit 028d8ac
Showing 28 changed files with 204 additions and 157 deletions.
3 changes: 0 additions & 3 deletions app/models/project.rb
@@ -999,9 +999,6 @@ def add_import_job
 job_id =
   if forked?
     RepositoryForkWorker.perform_async(id)
-  elsif gitlab_project_import?
-    # Do not retry on Import/Export until https://gitlab.com/gitlab-org/gitlab-foss/issues/26189 is solved.
-    RepositoryImportWorker.set(retry: false).perform_async(self.id)
   else
     RepositoryImportWorker.perform_async(self.id)
   end
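The removed branch relied on Sidekiq's ability to override a worker's default options for a single enqueue via `set(retry: false)`. A minimal sketch of that merge semantics (this is not Sidekiq itself; the class and option names here are illustrative):

```ruby
# Sketch: per-call option overrides merged over a worker's defaults,
# mimicking `Worker.set(retry: false).perform_async(id)`.
class WorkerSketch
  DEFAULTS = { 'retry' => true, 'queue' => 'default' }.freeze

  # Returns a one-off wrapper whose options shadow the class defaults.
  def self.set(overrides)
    Configured.new(DEFAULTS.merge(overrides.transform_keys(&:to_s)))
  end

  Configured = Struct.new(:options) do
    def perform_async(*args)
      # A real client would serialize this payload and push it to Redis.
      { options: options, args: args }
    end
  end
end

job = WorkerSketch.set(retry: false).perform_async(42)
# job[:options] => { 'retry' => false, 'queue' => 'default' }
```

The override applies only to the wrapped enqueue; the class-level defaults stay untouched, which is why the commit could move `retry: false` into the worker itself instead.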
2 changes: 1 addition & 1 deletion app/services/ci/build_report_result_service.rb
@@ -47,7 +47,7 @@ def track_test_cases(build, test_suite)
def test_case_hashes(build, test_suite)
[].tap do |hashes|
test_suite.each_test_case do |test_case|
-        key = "#{build.project_id}-#{test_suite.name}-#{test_case.key}"
+        key = "#{build.project_id}-#{test_case.key}"
hashes << Digest::SHA256.hexdigest(key)
end
end
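The one-line change above narrows the digest input. A quick, self-contained sketch (the key format is taken from the diff; the sample project ID and test case key are made up) showing that dropping the suite name changes the resulting hash:

```ruby
require 'digest'

# New-style key: project ID and test case key only, as in the diff above.
def test_case_hash(project_id, test_case_key)
  Digest::SHA256.hexdigest("#{project_id}-#{test_case_key}")
end

old_key = Digest::SHA256.hexdigest('42-rspec-spec/models/user_spec.rb[1:1]')
new_key = test_case_hash(42, 'spec/models/user_spec.rb[1:1]')

old_key == new_key # => false: hashes stored under the old format won't match
```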
7 changes: 7 additions & 0 deletions app/services/ci/destroy_expired_job_artifacts_service.rb
@@ -39,6 +39,13 @@ def destroy_batch(klass)
return false if artifacts.empty?

artifacts.each(&:destroy!)
run_after_destroy(artifacts)

true # This is required because of the design of `loop_until` method.
end

def run_after_destroy(artifacts); end
end
end

Ci::DestroyExpiredJobArtifactsService.prepend_if_ee('EE::Ci::DestroyExpiredJobArtifactsService')
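The empty `run_after_destroy` hook plus `prepend_if_ee` is a common CE/EE extension pattern: CE defines a no-op, and an EE module prepended onto the class overrides it without touching the batch loop. A standalone sketch of the mechanism (class and module names are illustrative, and `destroy!` is replaced by a log):

```ruby
class ExpiredArtifactsService
  def destroy_batch(artifacts)
    return false if artifacts.empty?

    artifacts.each { |a| destroyed_log << a } # stands in for artifacts.each(&:destroy!)
    run_after_destroy(artifacts)

    true # `loop_until`-style callers keep iterating while this returns true
  end

  def destroyed_log
    @destroyed_log ||= []
  end

  private

  # No-op hook in CE; EE prepends an override.
  def run_after_destroy(artifacts); end
end

module EEArtifactsExtension
  def run_after_destroy(artifacts)
    super
    batch_sizes << artifacts.size
  end

  def batch_sizes
    @batch_sizes ||= []
  end
end

ExpiredArtifactsService.prepend(EEArtifactsExtension)
```

With the module prepended, `destroy_batch([:a, :b])` returns `true` and the hook records the batch size; an empty batch returns `false` before the hook runs.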
1 change: 1 addition & 0 deletions app/workers/repository_import_worker.rb
@@ -7,6 +7,7 @@ class RepositoryImportWorker # rubocop:disable Scalability/IdempotentWorker

feature_category :importers
worker_has_external_dependencies!
+  # Do not retry on Import/Export until https://gitlab.com/gitlab-org/gitlab/-/issues/16812 is solved.
sidekiq_options retry: false
sidekiq_options status_expiration: Gitlab::Import::StuckImportJob::IMPORT_JOBS_EXPIRATION

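Successive `sidekiq_options` calls, as in the worker above, merge into a single options hash on the class. A tiny sketch of that class-macro behavior (not Sidekiq's actual implementation; names and the expiration value are illustrative stand-ins):

```ruby
module SidekiqOptionsSketch
  # Each call merges its options over whatever was set before.
  def sidekiq_options(opts = {})
    @sidekiq_options = get_sidekiq_options.merge(opts.transform_keys(&:to_s))
  end

  def get_sidekiq_options
    @sidekiq_options ||= {}
  end
end

class ImportWorkerSketch
  extend SidekiqOptionsSketch

  sidekiq_options retry: false
  sidekiq_options status_expiration: 15 * 60 * 60 # stand-in for IMPORT_JOBS_EXPIRATION
end

ImportWorkerSketch.get_sidekiq_options
# => { 'retry' => false, 'status_expiration' => 54000 }
```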
@@ -0,0 +1,5 @@
---
title: Fix Auto Deploy scale subcommand unintentionally recreates legacy PostgreSQL
merge_request: 44535
author:
type: fixed
6 changes: 3 additions & 3 deletions doc/ci/docker/using_kaniko.md
@@ -17,13 +17,13 @@ kaniko solves two problems with using the
build](using_docker_build.md#use-docker-in-docker-workflow-with-docker-executor) method:

- Docker-in-Docker requires [privileged mode](https://docs.docker.com/engine/reference/run/#runtime-privilege-and-linux-capabilities)
-  in order to function, which is a significant security concern.
+  to function, which is a significant security concern.
- Docker-in-Docker generally incurs a performance penalty and can be quite slow.

## Requirements

-In order to utilize kaniko with GitLab, [a runner](https://docs.gitlab.com/runner/)
-with one of the following executors is required:
+To use kaniko with GitLab, [a runner](https://docs.gitlab.com/runner/) with one
+of the following executors is required:

- [Kubernetes](https://docs.gitlab.com/runner/executors/kubernetes.html).
- [Docker](https://docs.gitlab.com/runner/executors/docker.html).
15 changes: 8 additions & 7 deletions doc/ci/examples/artifactory_and_gitlab/index.md
@@ -23,10 +23,10 @@ We also assume that an Artifactory instance is available and reachable from the

## Create the simple Maven dependency

-First of all, you need an application to work with: in this specific case we will
-use a simple one, but it could be any Maven application. This will be the
-dependency you want to package and deploy to Artifactory, in order to be
-available to other projects.
+First, you need an application to work with: in this specific case we'll use a
+simple one, but it could be any Maven application. This will be the dependency
+you want to package and deploy to Artifactory, to be available to other
+projects.

### Prepare the dependency application

@@ -58,7 +58,7 @@ The application is ready to use, but you need some additional steps to deploy it
1. Log in to Artifactory with your user's credentials.
1. From the main screen, click on the `libs-release-local` item in the **Set Me Up** panel.
1. Copy to clipboard the configuration snippet under the **Deploy** paragraph.
-1. Change the `url` value in order to have it configurable via variables.
+1. Change the `url` value to have it configurable by using variables.
1. Copy the snippet in the `pom.xml` file for your project, just after the
`dependencies` section. The snippet should look like this:

@@ -146,8 +146,9 @@ deploy:
- master
```
-The runner will use the latest [Maven Docker image](https://hub.docker.com/_/maven/), which already contains all the tools and the dependencies you need to manage the project,
-in order to run the jobs.
+The runner uses the latest [Maven Docker image](https://hub.docker.com/_/maven/),
+which contains all of the tools and dependencies needed to manage the project
+and to run the jobs.
Environment variables are set to instruct Maven to use the `homedir` of the repository instead of the user's home when searching for configuration and dependencies.

7 changes: 4 additions & 3 deletions doc/ci/examples/end_to_end_testing_webdriverio/index.md
@@ -95,9 +95,10 @@ dependency upgrade did not break anything without even having to look at your we

## Running locally

-We'll get to running the above test in CI/CD in a moment. When writing tests, however, it helps if
-you do not have to wait for your pipelines to succeed in order to check whether they do what you
-expect them to do. In other words, let's get it to run locally.
+We'll get to running the above test in CI/CD in a moment. When writing tests,
+however, it helps if you don't have to wait for your pipelines to succeed to
+determine whether they do what you expect them to do. In other words, let's get
+it to run locally.

Make sure that your app is running locally. If you use Webpack,
you can use [the Webpack Dev Server WebdriverIO plugin](https://www.npmjs.com/package/wdio-webpack-dev-server-service)
23 changes: 11 additions & 12 deletions doc/ci/examples/php.md
@@ -141,12 +141,11 @@ Of course, `my_php.ini` must be present in the root directory of your repository

## Test PHP projects using the Shell executor

-The shell executor runs your job in a terminal session on your server.
-Thus, in order to test your projects you first need to make sure that all
-dependencies are installed.
+The shell executor runs your job in a terminal session on your server. To test
+your projects, you must first ensure that all dependencies are installed.

-For example, in a VM running Debian 8 we first update the cache, then we
-install `phpunit` and `php5-mysql`:
+For example, in a VM running Debian 8, first update the cache, and then install
+`phpunit` and `php5-mysql`:

```shell
sudo apt-get update -y
@@ -219,8 +218,8 @@ test:atoum:
### Using Composer

The majority of the PHP projects use Composer for managing their PHP packages.
-In order to execute Composer before running your tests, simply add the
-following in your `.gitlab-ci.yml`:
+To execute Composer before running your tests, add the following to your
+`.gitlab-ci.yml`:

```yaml
# Composer stores all downloaded packages in the vendor/ directory.
@@ -243,14 +242,14 @@ before_script:
## Access private packages or dependencies

If your test suite needs to access a private repository, you need to configure
-[the SSH keys](../ssh_keys/README.md) in order to be able to clone it.
+the [SSH keys](../ssh_keys/README.md) to be able to clone it.

## Use databases or other services

-Most of the time you will need a running database in order for your tests to
-run. If you are using the Docker executor you can leverage Docker's ability to
-link to other containers. With GitLab Runner, this can be achieved by
-defining a `service`.
+Most of the time, you need a running database for your tests to be able to
+run. If you're using the Docker executor, you can leverage Docker's ability to
+link to other containers. With GitLab Runner, this can be achieved by defining
+a `service`.

This functionality is covered in [the CI services](../services/README.md)
documentation.
@@ -68,7 +68,7 @@ First install [Docker Engine](https://docs.docker.com/installation/).

To build this project you also need to have [GitLab Runner](https://docs.gitlab.com/runner/).
You can use public runners available on `gitlab.com` or register your own. Start by
-creating a template configuration file in order to pass complex configuration:
+creating a template configuration file to pass complex configuration:

```shell
cat > /tmp/test-config.template.toml << EOF
5 changes: 3 additions & 2 deletions doc/ci/merge_request_pipelines/index.md
@@ -125,8 +125,9 @@ Therefore:
- Since `C` specifies that it should only run for merge requests, it will not run for any pipeline
except a merge request pipeline.

-This helps you avoid having to add the `only:` rule to all of your jobs
-in order to make them always run. You can use this format to set up a Review App, helping to save resources.
+This helps you avoid having to add the `only:` rule to all of your jobs to make
+them always run. You can use this format to set up a Review App, helping to
+save resources.

#### Excluding certain branches

@@ -80,8 +80,8 @@ For more information, read the [documentation on Merge Trains](merge_trains/inde

> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/12996) in [GitLab Premium](https://about.gitlab.com/pricing/) 12.3.
-GitLab CI/CD can detect the presence of redundant pipelines,
-and will cancel them automatically in order to conserve CI resources.
+GitLab CI/CD can detect the presence of redundant pipelines, and cancels them
+to conserve CI resources.

When a user merges a merge request immediately within an ongoing merge
train, the train will be reconstructed, as it will recreate the expected
8 changes: 5 additions & 3 deletions doc/ci/pipelines/job_artifacts.md
@@ -37,8 +37,8 @@ pdf:
expire_in: 1 week
```
-A job named `pdf` calls the `xelatex` command in order to build a PDF file from
-the latex source file `mycv.tex`. We then define the `artifacts` paths which in
+A job named `pdf` calls the `xelatex` command to build a PDF file from the
+latex source file `mycv.tex`. We then define the `artifacts` paths which in
turn are defined with the `paths` keyword. All paths to files and directories
are relative to the repository that was cloned during the build.

@@ -429,7 +429,9 @@ To erase a job:

## Retrieve artifacts of private projects when using GitLab CI

-In order to retrieve a job artifact of a different project, you might need to use a private token in order to [authenticate and download](../../api/job_artifacts.md#get-job-artifacts) the artifacts.
+To retrieve a job artifact from a different project, you might need to use a
+private token to [authenticate and download](../../api/job_artifacts.md#get-job-artifacts)
+the artifact.

## Troubleshooting

3 changes: 2 additions & 1 deletion doc/ci/ssh_keys/README.md
@@ -134,7 +134,8 @@ on, and use that key for all projects that are run on this machine.
If you are accessing a private GitLab repository you need to add it as a
[deploy key](../../ssh/README.md#deploy-keys).

-Once done, try to log in to the remote server in order to accept the fingerprint:
+After generating the key, try to sign in to the remote server to accept the
+fingerprint:

```shell
ssh example.com
6 changes: 3 additions & 3 deletions doc/ci/triggers/README.md
@@ -183,9 +183,9 @@ webhook URL for Push and Tag events (change the project ID, ref and token):
https://gitlab.example.com/api/v4/projects/9/ref/master/trigger/pipeline?token=TOKEN
```

-`ref` should be passed as part of the URL in order to take precedence over
-`ref` from the webhook body that designates the branch ref that fired the
-trigger in the source repository. `ref` should be URL-encoded if it contains slashes.
+You should pass `ref` as part of the URL, to take precedence over `ref` from
+the webhook body that designates the branch ref that fired the trigger in the
+source repository. Be sure to URL-encode `ref` if it contains slashes.
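Because `ref` travels in the URL path here, a branch name containing slashes must be percent-encoded before the request is built. A small Ruby sketch (the host, project ID, and token are placeholders):

```ruby
require 'cgi'

def trigger_url(project_id, ref, token)
  encoded_ref = CGI.escape(ref) # 'feature/new-ui' => 'feature%2Fnew-ui'
  "https://gitlab.example.com/api/v4/projects/#{project_id}" \
    "/ref/#{encoded_ref}/trigger/pipeline?token=#{token}"
end

trigger_url(9, 'feature/new-ui', 'TOKEN')
```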

## Making use of trigger variables

67 changes: 34 additions & 33 deletions doc/ci/yaml/README.md
@@ -1979,15 +1979,17 @@ docker build service one:
- service-one/**/*
```

-In the example above, a pipeline could fail due to changes to a file in `service-one/**/*`.
-A later commit could then be pushed that does not include any changes to this file,
-but includes changes to the `Dockerfile`, and this pipeline could pass because it's only
-testing the changes to the `Dockerfile`. GitLab checks the **most recent pipeline**,
-that **passed**, and shows the merge request as mergeable, despite the earlier
-failed pipeline caused by a change that was not yet corrected.
-
-With this configuration, care must be taken to check that the most recent pipeline
-properly corrected any failures from previous pipelines.
+In the example above, the pipeline might fail because of changes to a file in `service-one/**/*`.
+
+A later commit that doesn't have changes in `service-one/**/*`
+but does have changes to the `Dockerfile` can pass. The job
+only tests the changes to the `Dockerfile`.
+
+GitLab checks the **most recent pipeline** that **passed**. If the merge request is mergeable,
+it doesn't matter that an earlier pipeline failed because of a change that has not been corrected.
+
+When you use this configuration, ensure that the most recent pipeline
+properly corrects any failures from previous pipelines.

##### Using `only:changes` without pipelines for merge requests

@@ -2093,8 +2095,7 @@ can choose a custom limit. For example, to set the limit to 100:
Plan.default.actual_limits.update!(ci_needs_size_limit: 100)
```

-NOTE: **Note:**
-To disable the ability to use DAG, set the limit to `0`.
+To disable directed acyclic graphs (DAG), set the limit to `0`.
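The `0`-disables-the-feature convention can be captured in a small sketch (illustrative only; this is not GitLab's actual `Plan`/limits implementation, and the default limit value is an assumption):

```ruby
# A limit of 0 turns the feature off entirely; otherwise it caps
# the number of `needs` entries a single job may declare.
CiLimits = Struct.new(:ci_needs_size_limit) do
  def dag_enabled?
    !ci_needs_size_limit.zero?
  end

  def needs_allowed?(count)
    dag_enabled? && count <= ci_needs_size_limit
  end
end

default_plan = CiLimits.new(50)   # assumed default limit
custom_plan  = CiLimits.new(100)  # raised via Plan.default.actual_limits.update!
disabled     = CiLimits.new(0)    # 0 disables DAG
```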

#### Artifact downloads with `needs`

@@ -2130,7 +2131,7 @@ rubocop:
```

Additionally, in the three syntax examples below, the `rspec` job downloads the artifacts
-from all three `build_jobs`, as `artifacts` is true for `build_job_1`, and
+from all three `build_jobs`. `artifacts` is true for `build_job_1` and
**defaults** to true for both `build_job_2` and `build_job_3`.

```yaml
@@ -2146,9 +2147,10 @@ rspec:

> [Introduced](https://gitlab.com/gitlab-org/gitlab/-/issues/14311) in GitLab v12.7.

-`needs` can be used to download artifacts from up to five jobs in pipelines on
-[other refs in the same project](#artifact-downloads-between-pipelines-in-the-same-project),
-or pipelines in different projects, groups and namespaces:
+Use `needs` to download artifacts from up to five jobs in pipelines:
+
+- [On other refs in the same project](#artifact-downloads-between-pipelines-in-the-same-project).
+- In different projects, groups and namespaces.

```yaml
build_job:
@@ -2171,9 +2173,10 @@ The user running the pipeline must have at least `reporter` access to the group

##### Artifact downloads between pipelines in the same project

-`needs` can be used to download artifacts from different pipelines in the current project
-by setting the `project` keyword as the current project's name, and specifying a ref.
-In the example below, `build_job` downloads the artifacts for the latest successful
+Use `needs` to download artifacts from different pipelines in the current project.
+Set the `project` keyword as the current project's name, and specify a ref.
+
+In this example, `build_job` downloads the artifacts for the latest successful
`build-1` job with the `other-ref` ref:

```yaml
@@ -2205,7 +2208,6 @@ build_job:
artifacts: true
```

-NOTE: **Note:**
Downloading artifacts from jobs that are run in [`parallel:`](#parallel) is not supported.

### `tags`
@@ -2217,7 +2219,7 @@ When you register a runner, you can specify the runner's tags, for
example `ruby`, `postgres`, `development`.

In this example, the job is run by a runner that
-has both `ruby` AND `postgres` tags defined.
+has both `ruby` and `postgres` tags defined.

```yaml
job:
@@ -2562,9 +2564,9 @@ deploy to production:
> defined, GitLab automatically triggers a stop action when the associated
> branch is deleted.

-Closing (stopping) environments can be achieved with the `on_stop` keyword defined under
-`environment`. It declares a different job that runs in order to close
-the environment.
+Closing (stopping) environments can be achieved with the `on_stop` keyword
+defined under `environment`. It declares a different job that runs to close the
+environment.

Read the `environment:action` section for an example.

@@ -2602,21 +2604,20 @@ stop_review_app:
action: stop
```

-In the above example we set up the `review_app` job to deploy to the `review`
-environment, and we also defined a new `stop_review_app` job under `on_stop`.
+In the above example, the `review_app` job deploys to the `review`
+environment. A new `stop_review_app` job is listed under `on_stop`.
 After the `review_app` job is finished, it triggers the
-`stop_review_app` job based on what is defined under `when`. In this case we
-set it up to `manual` so it needs a [manual action](#whenmanual) from
+`stop_review_app` job based on what is defined under `when`. In this case,
+it is set to `manual`, so it needs a [manual action](#whenmanual) from
 GitLab's user interface to run.

-Also in the example, `GIT_STRATEGY` is set to `none` so that GitLab Runner won’t
-try to check out the code after the branch is deleted when the `stop_review_app`
-job is [automatically triggered](../environments/index.md#automatically-stopping-an-environment).
+Also in the example, `GIT_STRATEGY` is set to `none`. If the
+`stop_review_app` job is [automatically triggered](../environments/index.md#automatically-stopping-an-environment),
+the runner won’t try to check out the code after the branch is deleted.

-NOTE: **Note:**
-The above example overwrites global variables. If your stop environment job depends
-on global variables, you can use [anchor variables](#yaml-anchors-for-variables) when setting the `GIT_STRATEGY`
-to change it without overriding the global variables.
+The example also overwrites global variables. If your `stop` `environment` job depends
+on global variables, you can use [anchor variables](#yaml-anchors-for-variables) when you set the `GIT_STRATEGY`.
+This changes the job without overriding the global variables.

The `stop_review_app` job is **required** to have the following keywords defined:
