Commit afb1ee87 authored by Kamil Trzciński

Merge branch 'patch-29' into 'master'

Update lfs_administration.md with language edits

See merge request gitlab-org/gitlab-ce!19950
parents 69e54faa 10ff1632
@@ -17,7 +17,7 @@ There are various configuration options to help GitLab server administrators:
* Enabling/disabling Git LFS support
* Changing the location of LFS object storage
* Setting up object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
### Configuration for Omnibus installations
@@ -44,31 +44,31 @@ In `config/gitlab.yml`:
storage_path: /mnt/storage/lfs-objects
```
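For Omnibus installations, the equivalent settings live in `/etc/gitlab/gitlab.rb`. The following is a minimal sketch, assuming the standard `gitlab_rails` LFS keys (`lfs_enabled`, `lfs_storage_path`); verify the key names against your GitLab version:

```ruby
# /etc/gitlab/gitlab.rb: sketch only; key names assumed, values are examples.
gitlab_rails['lfs_enabled'] = true                             # enable/disable Git LFS support
gitlab_rails['lfs_storage_path'] = "/mnt/storage/lfs-objects"  # custom local storage location
```

Run `sudo gitlab-ctl reconfigure` after editing `gitlab.rb` so the change takes effect.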
## Storing LFS objects in remote object storage
> [Introduced][ee-2760] in [GitLab Premium][eep] 10.0. Brought to GitLab Core
in 10.7.
It is possible to store LFS objects in remote object storage, which allows you
to offload local hard disk R/W operations and free up disk space significantly.
GitLab is tightly integrated with `Fog`, so you can refer to its [documentation](http://fog.io/about/provider_documentation.html)
to check which storage services can be integrated with GitLab.
You can also use external object storage in a private local network. For example,
[Minio](https://www.minio.io/) is a standalone object storage service that is easy to set up and works well with GitLab instances.
GitLab provides two different options for the uploading mechanism: "Direct upload" and "Background upload". A configuration sketch showing how to select between them follows the two flows below.
**Option 1. Direct upload**
1. User pushes an LFS file to the GitLab instance
1. GitLab-workhorse uploads the file directly to the external object storage
1. GitLab-workhorse notifies GitLab-rails that the upload process is complete
**Option 2. Background upload**
1. User pushes an LFS file to the GitLab instance
1. GitLab-rails stores the file in the local file storage
1. GitLab-rails then uploads the file to the external object storage asynchronously
The following general settings are supported.
@@ -83,7 +83,7 @@ The following general settings are supported.
The `connection` settings match those provided by [Fog](https://github.com/fog).
Here is a configuration example with S3.
| Setting | Description | Example |
|---------|-------------|---------|
@@ -101,14 +101,14 @@ Here is a configuration example with GCS.
|---------|-------------|---------|
| `provider` | The provider name | `Google` |
| `google_project` | GCP project name | `gcp-project-12345` |
| `google_client_email` | The email address of the service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
| `google_json_key_location` | The path to the JSON key file | `/path/to/gcp-project-12345-abcde.json` |
_NOTE: The service account must have permission to access the bucket. [See more](https://cloud.google.com/storage/docs/authentication)_
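Putting the GCS settings from the table together, an Omnibus configuration could look like the following sketch. The `gitlab_rails['lfs_object_store_*']` key names are assumed and the bucket name is a placeholder; adapt both to your environment:

```ruby
# /etc/gitlab/gitlab.rb: sketch only; connection values taken from the example table above.
gitlab_rails['lfs_object_store_enabled'] = true
gitlab_rails['lfs_object_store_remote_directory'] = "my-lfs-bucket"   # placeholder bucket name
gitlab_rails['lfs_object_store_connection'] = {
  'provider' => 'Google',
  'google_project' => 'gcp-project-12345',
  'google_client_email' => 'foo@gcp-project-12345.iam.gserviceaccount.com',
  'google_json_key_location' => '/path/to/gcp-project-12345-abcde.json'
}
```

Run `sudo gitlab-ctl reconfigure` afterwards, then confirm that new LFS pushes land in the bucket.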
### Manual uploading to an object storage
There are two ways to manually do the same thing as automatic uploading (described above).
**Option 1: rake task**
@@ -204,15 +204,15 @@ and [projects APIs](../../api/projects.md).
## Troubleshooting: `Google::Apis::TransmissionError: execution expired`
If LFS integration is configured with Google Cloud Storage and background uploads (`background_upload: true` and `direct_upload: false`),
Sidekiq workers may encounter this error. This happens when the upload of a very large file times out.
LFS files up to 6GB can be uploaded without any extra steps; for larger files you need to use the following workaround.
```shell
$ sudo gitlab-rails console # Login to rails console
> # Set up timeouts. 20 minutes is enough to upload 30GB LFS files.
> # These settings only apply to the current console session, i.e. they do not affect Sidekiq workers.
> ::Google::Apis::ClientOptions.default.open_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.read_timeout_sec = 1200
> ::Google::Apis::ClientOptions.default.send_timeout_sec = 1200
@@ -223,7 +223,7 @@ $ sudo gitlab-rails console # Login to rails console
> end
```
See more information in [!19581](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581).
## Known limitations