Commit 10ff1632, authored Jun 18, 2018 by Marcel Amirault
Update lfs_administration.md with language edits
parent ad5e4469
Showing 1 changed file with 26 additions and 26 deletions (+26 −26): doc/workflow/lfs/lfs_administration.md
```diff
@@ -17,7 +17,7 @@ There are various configuration options to help GitLab server administrators:
 * Enabling/disabling Git LFS support
 * Changing the location of LFS object storage
-* Setting up an object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
+* Setting up object storage supported by [Fog](http://fog.io/about/provider_documentation.html)
```
### Configuration for Omnibus installations
````diff
@@ -44,31 +44,31 @@ In `config/gitlab.yml`:
     storage_path: /mnt/storage/lfs-objects
 ```

-## Storing LFS objects to an object storage
+## Storing LFS objects in remote object storage

 > [Introduced][ee-2760] in [GitLab Premium][eep] 10.0. Brought to GitLab Core
 in 10.7.

-It is possible to store LFS objects to a remote object storage which allows you
-to offload R/W operation on local hard disk and freed up disk space significantly.
-You can check which object storage can be integrated with GitLab
-[here](http://fog.io/about/provider_documentation.html)
-(Since GitLab is tightly integrated with `Fog`, you can refer the documentation)
+It is possible to store LFS objects in remote object storage, which allows you
+to offload local hard disk R/W operations and free up disk space significantly.
+GitLab is tightly integrated with `Fog`, so you can refer to its
+[documentation](http://fog.io/about/provider_documentation.html)
+to check which storage services can be integrated with GitLab.

-You can also use an object storage in a private local network. For example,
-[Minio](https://www.minio.io/) is standalone object storage, easy to setup, and works well with GitLab instance.
+You can also use external object storage in a private local network. For example,
+[Minio](https://www.minio.io/) is a standalone object storage service, easy to set up, and works well with GitLab instances.

-GitLab provides two different options as the uploading mechanizm. One is "Direct upload", and another one is "Background upload".
+GitLab provides two different options for the uploading mechanism: "Direct upload" and "Background upload".

 **Option 1. Direct upload**

-1. User pushes a lfs file to the GitLab instance
-1. GitLab-workhorse uploads the file to the object storage
-1. GitLab-workhorse notifies to GitLab-rails that the uploading process is done
+1. User pushes an lfs file to the GitLab instance
+1. GitLab-workhorse uploads the file directly to the external object storage
+1. GitLab-workhorse notifies GitLab-rails that the upload process is complete

 **Option 2. Background upload**

-1. User pushes a lfs file to the GitLab instance
-1. GitLab-rails stores the file to the local files storage
-1. GitLab-rails uploads the file to object storage asynchronously
+1. User pushes an lfs file to the GitLab instance
+1. GitLab-rails stores the file in the local file storage
+1. GitLab-rails then uploads the file to the external object storage asynchronously

 The following general settings are supported.
````
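For orientation (not part of this commit's changes), the choice between the two upload mechanisms described above is made with the `object_store` keys in the same `config/gitlab.yml` file. A minimal sketch, assuming a source installation:

```yaml
# config/gitlab.yml (sketch) -- choose exactly one upload mechanism
lfs:
  enabled: true
  object_store:
    enabled: true
    direct_upload: true       # Option 1: GitLab-workhorse uploads directly to object storage
    background_upload: false  # Option 2: store locally first, then upload asynchronously
```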
```diff
@@ -83,7 +83,7 @@ The following general settings are supported.
 The `connection` settings match those provided by [Fog](https://github.com/fog).

-Here is the configuration example with S3.
+Here is a configuration example with S3.

 | Setting | Description | example |
 |---------|-------------|---------|
```
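As a sketch of how such `connection` settings fit into `config/gitlab.yml` (the key names follow the Fog AWS provider; the region value is a placeholder):

```yaml
# Sketch of an S3-backed connection block for the lfs object_store
object_store:
  connection:
    provider: AWS
    aws_access_key_id: AWS_ACCESS_KEY_ID
    aws_secret_access_key: AWS_SECRET_ACCESS_KEY
    region: eu-central-1
```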
```diff
@@ -101,14 +101,14 @@ Here is a configuration example with GCS.
 |---------|-------------|---------|
 | `provider` | The provider name | `Google` |
 | `google_project` | GCP project name | `gcp-project-12345` |
-| `google_client_email` | The email address of a service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
-| `google_json_key_location` | The json key path to the | `/path/to/gcp-project-12345-abcde.json` |
+| `google_client_email` | The email address of the service account | `foo@gcp-project-12345.iam.gserviceaccount.com` |
+| `google_json_key_location` | The json key path | `/path/to/gcp-project-12345-abcde.json` |

-_NOTE: Service account must have a permission to access the bucket. See more https://cloud.google.com/storage/docs/authentication_
+_NOTE: The service account must have permission to access the bucket. [See more](https://cloud.google.com/storage/docs/authentication)_
```
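Assembled from the table values above, a GCS `connection` block might look like this (a sketch; all project and path values are the example placeholders from the table):

```yaml
# Sketch of a GCS-backed connection block for the lfs object_store
object_store:
  connection:
    provider: Google
    google_project: gcp-project-12345
    google_client_email: foo@gcp-project-12345.iam.gserviceaccount.com
    google_json_key_location: /path/to/gcp-project-12345-abcde.json
```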
### Manual uploading to an object storage
```diff
-There are two ways to do the same thing with automatic uploading which described above.
+There are two ways to manually do the same thing as automatic uploading (described above).
```
**Option 1: rake task**
```diff
@@ -204,15 +204,15 @@ and [projects APIs](../../api/projects.md).
 ## Troubleshooting: `Google::Apis::TransmissionError: execution expired`

-If LFS integration is configred with Google Cloud Storage and background upload
-(`background_upload: true` and `direct_upload: false`)
-sidekiq workers may encouter this error. This is because uploading timed out by huge files.
-For the record, upto 6GB lfs files can be uploaded without any extra steps, otherwise you need
-the following workaround.
+If LFS integration is configured with Google Cloud Storage and background uploads
+(`background_upload: true` and `direct_upload: false`),
+sidekiq workers may encounter this error. This is because the upload of very large files timed out.
+LFS files up to 6GB can be uploaded without any extra steps, otherwise you need to use
+the following workaround.
```
````diff
 ```shell
 $ sudo gitlab-rails console # Login to rails console

-> # Setup timeouts. 20 minutes is enough to upload 30GB LFS files.
-> # Those settings are only effective in the same session, i.e. Those are not effective in sidekiq workers.
+> # Set up timeouts. 20 minutes is enough to upload 30GB LFS files.
+> # These settings are only in effect for the same session, i.e. they are not effective for sidekiq workers.
 > ::Google::Apis::ClientOptions.default.open_timeout_sec = 1200
 > ::Google::Apis::ClientOptions.default.read_timeout_sec = 1200
 > ::Google::Apis::ClientOptions.default.send_timeout_sec = 1200
````
````diff
@@ -223,7 +223,7 @@ $ sudo gitlab-rails console # Login to rails console
 > end
 ```

-See more information in https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581
+See more information in [!19581](https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/19581)
````
## Known limitations
...