Commit fd28ff5c authored by Nur Rony's avatar Nur Rony

Merge branch 'master' into 23557-remove-extra-line-for-empty-issue-description

* master: (22 commits)
  Fix status code expectation
  Stop clearing the database cache on rake cache:clear
  Fix error in generating labels
  Fix bug where e-mails were not being sent out via Sidekiq
  Fix documents and comments on Build API `scope`. #23146 #19131
  Re-organize queues to use for Sidekiq
  Fix wrong endpoint in api/users documentation, fix same typo in spec describe blocks
  Update CHANGELOG
  Fix object data to be sent to fetch analytics data
  Fixed compare ellipsis messing with layout
  Change "Group#web_url" to return "/groups/twitter" rather than "/twitter".
  fix font weight of project feature settings
  Add hover to trash icon in notes
  Ensure custom provider tab labels don't break layout.
  Fixed issue when images are loading it would push off the tabs
  Fixed issues with sticky mr tabs & sidebar
  Refactor and add new functionality to CI yaml reference
  Ignore external issues when bulk assigning issues to author of merge request.
  Changed gitlab-shell version to avoid warning when precompiling the assets.
  Grammar fixes in docs
  ...
parents 0ca0697a a98ad03b
......@@ -4,14 +4,20 @@ Please view this file on the master branch, on stable branches it's out of date.
- Adds user project membership expired event to clarify why user was removed (Callum Dryden)
- Trim leading and trailing whitespace on project_path (Linus Thiel)
- Fix HipChat notifications rendering (airatshigapov, eisnerd)
- Add hover to trash icon in notes !7008 (blackst0ne)
- Simpler arguments passed to named_route on toggle_award_url helper method
- Fix: Backup restore doesn't clear cache
- Use MergeRequestsClosingIssues cache data on Issue#closed_by_merge_requests method
- Fix documents and comments on Build API `scope`
## 8.13.1 (unreleased)
- Fix error in generating labels
## 8.13.0 (2016-10-22)
- Removes extra line for empty issue description. (!7045)
- Fix save button on project pipeline settings page. (!6955)
- All Sidekiq workers now use their own queue
- Avoid race condition when asynchronously removing expired artifacts. (!6881)
- Improve Merge When Build Succeeds triggers and execute on pipeline success. (!6675)
- Respond with 404 Not Found for non-existent tags (Linus Thiel)
......@@ -30,6 +36,7 @@ Please view this file on the master branch, on stable branches it's out of date.
- Update duration at the end of pipeline
- ExpireBuildArtifactsWorker query builds table without ordering enqueuing one job per build to cleanup
- Add group level labels. (!6425)
- Fix Cycle analytics not showing correct data when filtering by date. !6906
- Add an example for testing a phoenix application with Gitlab CI in the docs (Manthan Mallikarjun)
- Cancelled pipelines could be retried. !6927
- Updating verbiage on git basics to be more intuitive
......
......@@ -36,7 +36,11 @@
method: 'GET',
dataType: 'json',
contentType: 'application/json',
data: { start_date: options.startDate }
data: {
cycle_analytics: {
start_date: options.startDate
}
}
}).done((data) => {
this.decorateData(data);
this.initDropdown();
......
......@@ -388,28 +388,25 @@
// So we don't affix the tabs on these
if (Breakpoints.get().getBreakpointSize() === 'xs' || !$tabs.length) return;
var tabsWidth = $tabs.outerWidth(),
$diffTabs = $('#diff-notes-app'),
offsetTop = $tabs.offset().top - ($('.navbar-fixed-top').height() + $('.layout-nav').height());
var $diffTabs = $('#diff-notes-app'),
$fixedNav = $('.navbar-fixed-top'),
$layoutNav = $('.layout-nav');
$tabs.off('affix.bs.affix affix-top.bs.affix')
.affix({
offset: {
top: offsetTop
top: function () {
var tabsTop = $diffTabs.offset().top - $tabs.height();
tabsTop = tabsTop - ($fixedNav.height() + $layoutNav.height());
return tabsTop;
}
}
}).on('affix.bs.affix', function () {
$tabs.css({
left: $tabs.offset().left,
width: tabsWidth
});
$diffTabs.css({
marginTop: $tabs.height()
});
}).on('affix-top.bs.affix', function () {
$tabs.css({
left: '',
width: ''
});
$diffTabs.css({
marginTop: ''
});
......
......@@ -185,6 +185,10 @@ header.header-sidebar-pinned {
@media (min-width: $screen-sm-min) {
padding-right: $sidebar_collapsed_width;
.merge-request-tabs-holder.affix {
right: $sidebar_collapsed_width;
}
}
.sidebar-collapsed-icon {
......@@ -207,6 +211,10 @@ header.header-sidebar-pinned {
@media (min-width: $screen-md-min) {
padding-right: $gutter_width;
.merge-request-tabs-holder.affix {
right: $gutter_width;
}
}
&.with-overlay {
......
......@@ -143,6 +143,7 @@
&:not(.active) {
background-color: $gray-light;
border-left: 1px solid $border-color;
}
a {
......@@ -170,6 +171,31 @@
}
}
// Ldap configurations may need more tabs & the tab labels are user generated (arbitrarily long).
// These styles prevent this from breaking the layout, and only applied when providers are configured.
.new-session-tabs.custom-provider-tabs {
flex-wrap: wrap;
li {
min-width: 85px;
flex-basis: auto;
// This styles tab elements that have wrapped to a second line. We cannot easily predict when this will happen.
// We are making somewhat of an assumption about the configuration here: that users do not have more than
// 3 LDAP servers configured (in addition to standard login) and they are not using especially long names for any
// of them. If either condition is false, this will work as expected. If both are true, there may be a missing border
// above one of the bottom row elements. If you know a better way, please implement it!
&:nth-child(n+5) {
border-top: 1px solid $border-color;
}
}
a {
font-size: 16px;
}
}
.form-control {
&:active, &:focus {
......@@ -203,6 +229,7 @@
.login-page {
.col-sm-5.pull-right {
float: none !important;
margin-bottom: 45px;
}
}
}
......@@ -244,7 +271,11 @@
}
.navless-container {
padding: 65px; // height of footer + bottom padding of email confirmation link
padding: 65px 15px; // height of footer + bottom padding of email confirmation link
@media (max-width: $screen-xs-max) {
padding: 0 15px 65px;
}
}
}
......@@ -263,3 +294,4 @@
bottom: 0;
}
}
......@@ -183,11 +183,11 @@
.ci-coverage {
float: right;
}
.stop-env-container {
color: $gl-text-color;
float: right;
a {
color: $gl-text-color;
}
......@@ -438,11 +438,18 @@
}
}
.merge-request-tabs {
.merge-request-tabs-holder {
background-color: #fff;
&.affix {
top: 100px;
left: 0;
z-index: 9;
transition: right .15s;
}
&:not(.affix) .container-fluid {
padding-left: 0;
padding-right: 0;
}
}
......@@ -13,9 +13,18 @@
.new_project,
.edit-project {
fieldset {
&.features .control-label {
font-weight: normal;
&.features {
.label-light {
margin-bottom: 0;
}
.help-block {
margin-top: 0;
}
}
.form-group {
......@@ -40,6 +49,7 @@
}
.input-group > div {
&:last-child {
padding-right: 0;
}
......@@ -47,6 +57,7 @@
@media (max-width: $screen-xs-max) {
.input-group > div {
margin-bottom: 14px;
&:last-child {
......@@ -60,6 +71,7 @@
}
.input-group-addon {
&.static-namespace {
height: 35px;
border-radius: 3px;
......
......@@ -68,7 +68,7 @@ class Group < Namespace
end
def web_url
Gitlab::Routing.url_helpers.group_url(self)
Gitlab::Routing.url_helpers.group_canonical_url(self)
end
def human_name
......
......@@ -4,7 +4,7 @@ module MergeRequests
@assignable_issues ||= begin
if current_user == merge_request.author
closes_issues.select do |issue|
!issue.assignee_id? && can?(current_user, :admin_issue, issue)
!issue.is_a?(ExternalIssue) && !issue.assignee_id? && can?(current_user, :admin_issue, issue)
end
else
[]
......
......@@ -75,4 +75,4 @@
- @runners.each do |runner|
= render "admin/runners/runner", runner: runner
= paginate @runners
= paginate @runners, theme: "gitlab"
......@@ -67,7 +67,7 @@
= form_for [:admin, project.namespace.becomes(Namespace), project, project.runner_projects.new] do |f|
= f.hidden_field :runner_id, value: @runner.id
= f.submit 'Enable', class: 'btn btn-xs'
= paginate @projects
= paginate @projects, theme: "gitlab"
.col-md-6
%h4 Recent builds served by this Runner
......
......@@ -10,7 +10,7 @@
= form_for(resource, as: resource_name, url: session_path(resource_name), method: :post, html: { class: 'edit_user show-gl-field-errors' }) do |f|
- resource_params = params[resource_name].presence || params
= f.hidden_field :remember_me, value: resource_params.fetch(:remember_me, 0)
.form-group
%div
= f.label 'Two-Factor Authentication code', name: :otp_attempt
= f.text_field :otp_attempt, class: 'form-control', required: true, autofocus: true, autocomplete: 'off', title: 'This field is required.'
%p.help-block.hint Enter the code from the two-factor app on your mobile device. If you've lost your device, you may enter one of your recovery codes.
......
%ul.new-session-tabs.nav-links.nav-tabs
%ul.new-session-tabs.nav-links.nav-tabs{ class: ('custom-provider-tabs' if form_based_providers.any?) }
- if crowd_enabled?
%li.active
= link_to "Crowd", "#crowd", 'data-toggle' => 'tab'
......
......@@ -10,7 +10,7 @@
= button_tag type: 'button', class: "form-control compare-dropdown-toggle js-compare-dropdown", required: true, data: { refs_url: refs_namespace_project_path(@project.namespace, @project), toggle: "dropdown", target: ".js-compare-from-dropdown", selected: params[:from], field_name: :from } do
.dropdown-toggle-text= params[:from] || 'Select branch/tag'
= render "ref_dropdown"
.compare-ellipsis ...
.compare-ellipsis.inline ...
.form-group.dropdown.compare-form-group.to.js-compare-to-dropdown
.input-group.inline-input-group
%span.input-group-addon to
......
......@@ -46,70 +46,70 @@
%h5.prepend-top-0
Feature Visibility
= f.fields_for :project_feature do |feature_fields|
.form_group.prepend-top-20
.row
.col-md-9
= feature_fields.label :repository_access_level, "Repository", class: 'label-light'
%span.help-block Push files to be stored in this project
.col-md-3.js-repo-access-level
= project_feature_access_select(:repository_access_level)
= f.fields_for :project_feature do |feature_fields|
.form_group.prepend-top-20
.row
.col-md-9
= feature_fields.label :repository_access_level, "Repository", class: 'label-light'
%span.help-block Push files to be stored in this project
.col-md-3.js-repo-access-level
= project_feature_access_select(:repository_access_level)
.col-sm-12
.row
.col-md-9.project-feature-nested
= feature_fields.label :merge_requests_access_level, "Merge requests", class: 'label-light'
%span.help-block Submit changes to be merged upstream
.col-md-3
= project_feature_access_select(:merge_requests_access_level)
.col-sm-12
.row
.col-md-9.project-feature-nested
= feature_fields.label :merge_requests_access_level, "Merge requests", class: 'label-light'
%span.help-block Submit changes to be merged upstream
.col-md-3
= project_feature_access_select(:merge_requests_access_level)
.row
.col-md-9.project-feature-nested
= feature_fields.label :builds_access_level, "Builds", class: 'label-light'
%span.help-block Submit, test and deploy your changes before merge
.col-md-3
= project_feature_access_select(:builds_access_level)
.row
.col-md-9.project-feature-nested
= feature_fields.label :builds_access_level, "Builds", class: 'label-light'
%span.help-block Submit, test and deploy your changes before merge
.col-md-3
= project_feature_access_select(:builds_access_level)
.row
.col-md-9
= feature_fields.label :snippets_access_level, "Snippets", class: 'label-light'
%span.help-block Share code pastes with others out of Git repository
.col-md-3
= project_feature_access_select(:snippets_access_level)
.row
.col-md-9
= feature_fields.label :snippets_access_level, "Snippets", class: 'label-light'
%span.help-block Share code pastes with others out of Git repository
.col-md-3
= project_feature_access_select(:snippets_access_level)
.row
.col-md-9
= feature_fields.label :issues_access_level, "Issues", class: 'label-light'
%span.help-block Lightweight issue tracking system for this project
.col-md-3
= project_feature_access_select(:issues_access_level)
.row
.col-md-9
= feature_fields.label :issues_access_level, "Issues", class: 'label-light'
%span.help-block Lightweight issue tracking system for this project
.col-md-3
= project_feature_access_select(:issues_access_level)
.row
.col-md-9
= feature_fields.label :wiki_access_level, "Wiki", class: 'label-light'
%span.help-block Pages for project documentation
.col-md-3
= project_feature_access_select(:wiki_access_level)
- if Gitlab.config.lfs.enabled && current_user.admin?
.checkbox
= f.label :lfs_enabled do
= f.check_box :lfs_enabled
%strong LFS
%br
%span.descr
Git Large File Storage
= link_to icon('question-circle'), help_page_path('workflow/lfs/manage_large_binaries_with_git_lfs')
.row
.col-md-9
= feature_fields.label :wiki_access_level, "Wiki", class: 'label-light'
%span.help-block Pages for project documentation
.col-md-3
= project_feature_access_select(:wiki_access_level)
- if Gitlab.config.lfs.enabled && current_user.admin?
.form-group
.checkbox
= f.label :container_registry_enabled do
= f.check_box :container_registry_enabled
%strong Container Registry
%br
%span.descr Enable Container Registry for this project
= link_to icon('question-circle'), help_page_path('user/project/container_registry'), target: '_blank'
.checkbox
= f.label :lfs_enabled do
= f.check_box :lfs_enabled
%strong LFS
%br
%span.descr
Git Large File Storage
= link_to icon('question-circle'), help_page_path('workflow/lfs/manage_large_binaries_with_git_lfs')
- if Gitlab.config.lfs.enabled && current_user.admin?
.form-group
.checkbox
= f.label :container_registry_enabled do
= f.check_box :container_registry_enabled
%strong Container Registry
%br
%span.descr Enable Container Registry for this project
= link_to icon('question-circle'), help_page_path('user/project/container_registry'), target: '_blank'
= render 'merge_request_settings', f: f
%hr
......@@ -288,4 +288,4 @@
Saving project.
%p Please wait a moment, this page will automatically refresh when ready.
= render 'shared/confirm_modal', phrase: @project.path
= render 'shared/confirm_modal', phrase: @project.path
\ No newline at end of file
......@@ -47,39 +47,41 @@
= link_to "command line", "#modal_merge_info", class: "how_to_merge_link vlink", title: "How To Merge", "data-toggle" => "modal"
- if @commits_count.nonzero?
%ul.merge-request-tabs.nav-links.no-top.no-bottom{ class: ("js-tabs-affix" unless ENV['RAILS_ENV'] == 'test') }
%li.notes-tab
= link_to namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#notes', action: 'notes', toggle: 'tab' } do
Discussion
%span.badge= @merge_request.mr_and_commit_notes.user.count
- if @merge_request.source_project
%li.commits-tab
= link_to commits_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#commits', action: 'commits', toggle: 'tab' } do
Commits
%span.badge= @commits_count
- if @pipeline
%li.pipelines-tab
= link_to pipelines_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: '#pipelines', action: 'pipelines', toggle: 'tab' } do
Pipelines
%span.badge= @pipelines.size
%li.builds-tab
= link_to builds_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: '#builds', action: 'builds', toggle: 'tab' } do
Builds
%span.badge= @statuses.size
%li.diffs-tab
= link_to diffs_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#diffs', action: 'diffs', toggle: 'tab' } do
Changes
%span.badge= @merge_request.diff_size
%li#resolve-count-app.line-resolve-all-container.pull-right.prepend-top-10.hidden-xs{ "v-cloak" => true }
%resolve-count{ "inline-template" => true, ":logged-out" => "#{current_user.nil?}" }
.line-resolve-all{ "v-show" => "discussionCount > 0",
":class" => "{ 'has-next-btn': !loggedOut && resolvedDiscussionCount !== discussionCount }" }
%span.line-resolve-btn.is-disabled{ type: "button",
":class" => "{ 'is-active': resolvedDiscussionCount === discussionCount }" }
= render "shared/icons/icon_status_success.svg"
%span.line-resolve-text
{{ resolvedDiscussionCount }}/{{ discussionCount }} {{ discussionCount | pluralize 'discussion' }} resolved
= render "discussions/jump_to_next"
.merge-request-tabs-holder{ class: ("js-tabs-affix" unless ENV['RAILS_ENV'] == 'test') }
%div{ class: container_class }
%ul.merge-request-tabs.nav-links.no-top.no-bottom
%li.notes-tab
= link_to namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#notes', action: 'notes', toggle: 'tab' } do
Discussion
%span.badge= @merge_request.mr_and_commit_notes.user.count
- if @merge_request.source_project
%li.commits-tab
= link_to commits_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#commits', action: 'commits', toggle: 'tab' } do
Commits
%span.badge= @commits_count
- if @pipeline
%li.pipelines-tab
= link_to pipelines_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: '#pipelines', action: 'pipelines', toggle: 'tab' } do
Pipelines
%span.badge= @pipelines.size
%li.builds-tab
= link_to builds_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: '#builds', action: 'builds', toggle: 'tab' } do
Builds
%span.badge= @statuses.size
%li.diffs-tab
= link_to diffs_namespace_project_merge_request_path(@project.namespace, @project, @merge_request), data: { target: 'div#diffs', action: 'diffs', toggle: 'tab' } do
Changes
%span.badge= @merge_request.diff_size
%li#resolve-count-app.line-resolve-all-container.pull-right.prepend-top-10.hidden-xs{ "v-cloak" => true }
%resolve-count{ "inline-template" => true, ":logged-out" => "#{current_user.nil?}" }
.line-resolve-all{ "v-show" => "discussionCount > 0",
":class" => "{ 'has-next-btn': !loggedOut && resolvedDiscussionCount !== discussionCount }" }
%span.line-resolve-btn.is-disabled{ type: "button",
":class" => "{ 'is-active': resolvedDiscussionCount === discussionCount }" }
= render "shared/icons/icon_status_success.svg"
%span.line-resolve-text
{{ resolvedDiscussionCount }}/{{ discussionCount }} {{ discussionCount | pluralize 'discussion' }} resolved
= render "discussions/jump_to_next"
.tab-content#diff-notes-app
#notes.notes.tab-pane.voting_notes
......
......@@ -57,7 +57,7 @@
= link_to '#', title: 'Edit comment', class: 'note-action-button js-note-edit' do
= icon('pencil', class: 'link-highlight')
= link_to namespace_project_note_path(note.project.namespace, note.project, note), title: 'Remove comment', method: :delete, data: { confirm: 'Are you sure you want to remove this comment?' }, remote: true, class: 'note-action-button hidden-xs js-note-delete danger' do
= icon('trash-o')
= icon('trash-o', class: 'danger-highlight')
.note-body{class: note_editable ? 'js-task-list-container' : ''}
.note-text.md
= preserve do
......
......@@ -26,4 +26,4 @@
%h4.underlined-title Available specific runners
%ul.bordered-list.available-specific-runners
= render partial: 'runner', collection: @assignable_runners, as: :runner
= paginate @assignable_runners
= paginate @assignable_runners, theme: "gitlab"
class AdminEmailWorker
include Sidekiq::Worker
sidekiq_options retry: false # this job auto-repeats via sidekiq-cron
include CronjobQueue
def perform
repository_check_failed_count = Project.where(last_repository_check_failed: true).count
......
class BuildCoverageWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id)
......
class BuildEmailWorker
include Sidekiq::Worker
include BuildQueue
def perform(build_id, recipients, push_data)
recipients.each do |recipient|
......
class BuildFinishedWorker
include Sidekiq::Worker
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id).try do |build|
......
class BuildHooksWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id)
......
class BuildSuccessWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include BuildQueue
def perform(build_id)
Ci::Build.find_by(id: build_id).try do |build|
......
# This worker clears all cache fields in the database, working in batches.
class ClearDatabaseCacheWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
BATCH_SIZE = 1000
......
# Concern for setting Sidekiq settings for the various CI build workers.
module BuildQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :build
end
end
# Concern that sets various Sidekiq settings for workers executed using a
# cronjob.
module CronjobQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :cronjob, retry: false
end
end
# Concern that sets the queue of a Sidekiq worker based on the worker's class
# name/namespace.
module DedicatedSidekiqQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: name.sub(/Worker\z/, '').underscore.tr('/', '_')
end
end
# Concern for setting Sidekiq settings for the various CI pipeline workers.
module PipelineQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :pipeline
end
end
# Concern for setting Sidekiq settings for the various repository check workers.
module RepositoryCheckQueue
extend ActiveSupport::Concern
included do
sidekiq_options queue: :repository_check, retry: false
end
end
class DeleteUserWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(current_user_id, delete_user_id, options = {})
delete_user = User.find(delete_user_id)
......
class EmailReceiverWorker
include Sidekiq::Worker
sidekiq_options queue: :incoming_email
include DedicatedSidekiqQueue
def perform(raw)
return unless Gitlab::IncomingEmail.enabled?
......
class EmailsOnPushWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :mailers
attr_reader :email, :skip_premailer
def perform(project_id, recipients, push_data, options = {})
......
class ExpireBuildArtifactsWorker
include Sidekiq::Worker
include CronjobQueue
def perform
Rails.logger.info 'Scheduling removal of build artifacts'
......
class ExpireBuildInstanceArtifactsWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(build_id)
build = Ci::Build
......
class GitGarbageCollectWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell, retry: false
sidekiq_options retry: false
def perform(project_id)
project = Project.find(project_id)
......
class GitlabShellWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
sidekiq_options queue: :gitlab_shell
include DedicatedSidekiqQueue
def perform(action, *arg)
gitlab_shell.send(action, *arg)
......
class GroupDestroyWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include DedicatedSidekiqQueue
def perform(group_id, user_id)
begin
......
class ImportExportProjectCleanupWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include CronjobQueue
def perform
ImportExportCleanUpService.new.execute
......
......@@ -3,6 +3,7 @@ require 'socket'
class IrkerWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(project_id, chans, colors, push_data, settings)
project = Project.find(project_id)
......
class MergeWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include DedicatedSidekiqQueue
def perform(merge_request_id, current_user_id, params)
params = params.with_indifferent_access
......
class NewNoteWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include DedicatedSidekiqQueue
def perform(note_id, note_params)
note = Note.find(note_id)
......
class PipelineHooksWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PipelineMetricsWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id).try do |pipeline|
......
class PipelineProcessWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PipelineSuccessWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id).try do |pipeline|
......
class PipelineUpdateWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include PipelineQueue
def perform(pipeline_id)
Ci::Pipeline.find_by(id: pipeline_id)
......
class PostReceive
include Sidekiq::Worker
sidekiq_options queue: :post_receive
include DedicatedSidekiqQueue
def perform(repo_path, identifier, changes)
if path = Gitlab.config.repositories.storages.find { |p| repo_path.start_with?(p[1].to_s) }
......
......@@ -5,8 +5,7 @@
# storage engine as much.
class ProjectCacheWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include DedicatedSidekiqQueue
LEASE_TIMEOUT = 15.minutes.to_i
......
class ProjectDestroyWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include DedicatedSidekiqQueue
def perform(project_id, user_id, params)
begin
......
class ProjectExportWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
sidekiq_options queue: :gitlab_shell, retry: 3
sidekiq_options retry: 3
def perform(current_user_id, project_id)
current_user = User.find(current_user_id)
......
class ProjectServiceWorker
include Sidekiq::Worker
sidekiq_options queue: :project_web_hook
include DedicatedSidekiqQueue
def perform(hook_id, data)
data = data.with_indifferent_access
......
class ProjectWebHookWorker
include Sidekiq::Worker
sidekiq_options queue: :project_web_hook
include DedicatedSidekiqQueue
def perform(hook_id, data, hook_name)
data = data.with_indifferent_access
......
class PruneOldEventsWorker
include Sidekiq::Worker
include CronjobQueue
def perform
# Contribution calendar shows maximum 12 months of events.
......
class RemoveExpiredGroupLinksWorker
include Sidekiq::Worker
include CronjobQueue
def perform
ProjectGroupLink.expired.destroy_all
......
class RemoveExpiredMembersWorker
include Sidekiq::Worker
include CronjobQueue
def perform
Member.expired.find_each do |member|
......
class RepositoryArchiveCacheWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include CronjobQueue
def perform
RepositoryArchiveCleanUpService.new.execute
......
module RepositoryCheck
class BatchWorker
include Sidekiq::Worker
include CronjobQueue
RUN_TIME = 3600
sidekiq_options retry: false
def perform
start = Time.now
# This loop will break after a little more than one hour ('a little
# more' because `git fsck` may take a few minutes), or if it runs out of
# projects to check. By default sidekiq-cron will start a new
......@@ -17,15 +16,15 @@ module RepositoryCheck
project_ids.each do |project_id|
break if Time.now - start >= RUN_TIME
break unless current_settings.repository_checks_enabled
next unless try_obtain_lease(project_id)
SingleRepositoryWorker.new.perform(project_id)
end
end
private
# Project.find_each does not support WHERE clauses and
# Project.find_in_batches does not support ordering. So we just build an
# array of ID's. This is OK because we do it only once an hour, because
......@@ -39,7 +38,7 @@ module RepositoryCheck
reorder('last_repository_check_at ASC').limit(limit).pluck(:id)
never_checked_projects + old_check_projects
end
def try_obtain_lease(id)
# Use a 24-hour timeout because on servers/projects where 'git fsck' is
# super slow we definitely do not want to run it twice in parallel.
......@@ -48,7 +47,7 @@ module RepositoryCheck
timeout: 24.hours
).try_obtain
end
def current_settings
# No caching of the settings! If we cache them and an admin disables
# this feature, an active RepositoryCheckWorker would keep going for up
......
module RepositoryCheck
class ClearWorker
include Sidekiq::Worker
sidekiq_options retry: false
include RepositoryCheckQueue
def perform
# Do small batched updates because these updates will be slow and locking
......
module RepositoryCheck
class SingleRepositoryWorker
include Sidekiq::Worker
sidekiq_options retry: false
include RepositoryCheckQueue
def perform(project_id)
project = Project.find(project_id)
......
class RepositoryForkWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
sidekiq_options queue: :gitlab_shell
include DedicatedSidekiqQueue
def perform(project_id, forked_from_repository_storage_path, source_path, target_path)
Gitlab::Metrics.add_event(:fork_repository,
......
class RepositoryImportWorker
include Sidekiq::Worker
include Gitlab::ShellAdapter
sidekiq_options queue: :gitlab_shell
include DedicatedSidekiqQueue
attr_accessor :project, :current_user
......
class RequestsProfilesWorker
include Sidekiq::Worker
sidekiq_options queue: :default
include CronjobQueue
def perform
Gitlab::RequestProfiler.remove_all_profiles
......
class StuckCiBuildsWorker
include Sidekiq::Worker
include CronjobQueue
BUILD_STUCK_TIMEOUT = 1.day
......
class SystemHookWorker
include Sidekiq::Worker
sidekiq_options queue: :system_hook
include DedicatedSidekiqQueue
def perform(hook_id, data, hook_name)
SystemHook.find(hook_id).execute(data, hook_name)
......
class TrendingProjectsWorker
include Sidekiq::Worker
sidekiq_options queue: :trending_projects
include CronjobQueue
def perform
Rails.logger.info('Refreshing trending projects')
......
class UpdateMergeRequestsWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
def perform(project_id, user_id, oldrev, newrev, ref)
project = Project.find_by(id: project_id)
......
......@@ -4,6 +4,7 @@ cd $(dirname $0)/..
app_root=$(pwd)
sidekiq_pidfile="$app_root/tmp/pids/sidekiq.pid"
sidekiq_logfile="$app_root/log/sidekiq.log"
sidekiq_config="$app_root/config/sidekiq_queues.yml"
gitlab_user=$(ls -l config.ru | awk '{print $3}')
warn()
......@@ -37,7 +38,7 @@ start_no_deamonize()
start_sidekiq()
{
exec bundle exec sidekiq -q post_receive -q mailers -q archive_repo -q system_hook -q project_web_hook -q gitlab_shell -q incoming_email -q runner -q common -q default -e $RAILS_ENV -P $sidekiq_pidfile "$@"
exec bundle exec sidekiq -C "${sidekiq_config}" -e $RAILS_ENV -P $sidekiq_pidfile "$@"
}
load_ok()
......
......@@ -24,7 +24,8 @@ module Gitlab
#{config.root}/app/models/ci
#{config.root}/app/models/hooks
#{config.root}/app/models/members
#{config.root}/app/models/project_services))
#{config.root}/app/models/project_services
#{config.root}/app/workers/concerns))
config.generators.templates.push("#{config.root}/generator_templates")
......
......@@ -12,23 +12,26 @@ constraints(GroupUrlConstrainer.new) do
end
end
resources :groups, constraints: { id: /[a-zA-Z.0-9_\-]+(?<!\.atom)/ } do
member do
get :issues
get :merge_requests
get :projects
get :activity
end
scope module: :groups do
resources :group_members, only: [:index, :create, :update, :destroy], concerns: :access_requestable do
post :resend_invite, on: :member
delete :leave, on: :collection
scope constraints: { id: /[a-zA-Z.0-9_\-]+(?<!\.atom)/ } do
resources :groups, except: [:show] do
member do
get :issues
get :merge_requests
get :projects
get :activity
end
resource :avatar, only: [:destroy]
resources :milestones, constraints: { id: /[^\/]+/ }, only: [:index, :show, :update, :new, :create]
scope module: :groups do
resources :group_members, only: [:index, :create, :update, :destroy], concerns: :access_requestable do
post :resend_invite, on: :member
delete :leave, on: :collection
end
resources :labels, except: [:show], constraints: { id: /\d+/ }
resource :avatar, only: [:destroy]
resources :milestones, constraints: { id: /[^\/]+/ }, only: [:index, :show, :update, :new, :create]
resources :labels, except: [:show], constraints: { id: /\d+/ }
end
end
get 'groups/:id' => 'groups#show', as: :group_canonical
end
# This configuration file should be exclusively used to set queue settings for
# Sidekiq. Any other setting should be specified using the Sidekiq CLI or the
# Sidekiq Ruby API (see config/initializers/sidekiq.rb).
---
# All the queues to process and their weights. Every queue _must_ have a weight
# defined.
#
# The available weights are as follows
#
# 1: low priority
# 2: medium priority
# 3: high priority
# 5: _super_ high priority, this should only be used for _very_ important queues
#
# As per http://stackoverflow.com/a/21241357/290102 the formula for calculating
# the likelihood of a job being popped off a queue (given all queues have work
# to perform) is:
#
# chance = (queue weight / total weight of all queues) * 100
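#
# For illustration only (numbers here are hypothetical, not taken from the
# list below): with three queues weighted 5, 3 and 2 the total weight is 10,
# so the weight-5 queue is picked next with chance = (5 / 10) * 100 = 50%,
# the weight-3 queue with 30%, and the weight-2 queue with 20%.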
:queues:
- [post_receive, 5]
- [merge, 5]
- [update_merge_requests, 3]
- [new_note, 2]
- [build, 2]
- [pipeline, 2]
- [gitlab_shell, 2]
- [email_receiver, 2]
- [emails_on_push, 2]
- [mailers, 2]
- [repository_fork, 1]
- [repository_import, 1]
- [project_service, 1]
- [clear_database_cache, 1]
- [delete_user, 1]
- [expire_build_instance_artifacts, 1]
- [group_destroy, 1]
- [irker, 1]
- [project_cache, 1]
- [project_destroy, 1]
- [project_export, 1]
- [project_web_hook, 1]
- [repository_check, 1]
- [system_hook, 1]
- [git_garbage_collect, 1]
- [cronjob, 1]
- [default, 1]
require 'json'
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class MigrateSidekiqQueuesFromDefault < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = true
DOWNTIME_REASON = <<-EOF
Moving Sidekiq jobs from queues requires Sidekiq to be stopped. Not stopping
Sidekiq will result in the loss of jobs that are scheduled after this
migration completes.
EOF
disable_ddl_transaction!
# Jobs for which the queue names have been changed (e.g. multiple workers
# using the same non-default queue).
#
# The keys are the old queue names, the values the jobs to move and their new
# queue names.
RENAMED_QUEUES = {
gitlab_shell: {
'GitGarbageCollectorWorker' => :git_garbage_collector,
'ProjectExportWorker' => :project_export,
'RepositoryForkWorker' => :repository_fork,
'RepositoryImportWorker' => :repository_import
},
project_web_hook: {
'ProjectServiceWorker' => :project_service
},
incoming_email: {
'EmailReceiverWorker' => :email_receiver
},
mailers: {
'EmailsOnPushWorker' => :emails_on_push
},
default: {
'AdminEmailWorker' => :cronjob,
'BuildCoverageWorker' => :build,
'BuildEmailWorker' => :build,
'BuildFinishedWorker' => :build,
'BuildHooksWorker' => :build,
'BuildSuccessWorker' => :build,
'ClearDatabaseCacheWorker' => :clear_database_cache,
'DeleteUserWorker' => :delete_user,
'ExpireBuildArtifactsWorker' => :cronjob,
'ExpireBuildInstanceArtifactsWorker' => :expire_build_instance_artifacts,
'GroupDestroyWorker' => :group_destroy,
'ImportExportProjectCleanupWorker' => :cronjob,
'IrkerWorker' => :irker,
'MergeWorker' => :merge,
'NewNoteWorker' => :new_note,
'PipelineHooksWorker' => :pipeline,
'PipelineMetricsWorker' => :pipeline,
'PipelineProcessWorker' => :pipeline,
'PipelineSuccessWorker' => :pipeline,
'PipelineUpdateWorker' => :pipeline,
'ProjectCacheWorker' => :project_cache,
'ProjectDestroyWorker' => :project_destroy,
'PruneOldEventsWorker' => :cronjob,
'RemoveExpiredGroupLinksWorker' => :cronjob,
'RemoveExpiredMembersWorker' => :cronjob,
'RepositoryArchiveCacheWorker' => :cronjob,
'RepositoryCheck::BatchWorker' => :cronjob,
'RepositoryCheck::ClearWorker' => :repository_check,
'RepositoryCheck::SingleRepositoryWorker' => :repository_check,
'RequestsProfilesWorker' => :cronjob,
'StuckCiBuildsWorker' => :cronjob,
'UpdateMergeRequestsWorker' => :update_merge_requests
}
}
def up
Sidekiq.redis do |redis|
RENAMED_QUEUES.each do |queue, jobs|
migrate_from_queue(redis, queue, jobs)
end
end
end
def down
Sidekiq.redis do |redis|
RENAMED_QUEUES.each do |dest_queue, jobs|
jobs.each do |worker, from_queue|
migrate_from_queue(redis, from_queue, worker => dest_queue)
end
end
end
end
def migrate_from_queue(redis, queue, job_mapping)
while job = redis.lpop("queue:#{queue}")
payload = JSON.load(job)
new_queue = job_mapping[payload['class']]
# If we have no target queue to migrate to we're probably dealing with
# some ancient job for which the worker no longer exists. In that case
# there's no sane option we can take, other than just dropping the job.
next unless new_queue
payload['queue'] = new_queue
redis.lpush("queue:#{new_queue}", JSON.dump(payload))
end
end
end
......@@ -11,10 +11,10 @@ GET /projects/:id/builds
| Attribute | Type | Required | Description |
|-----------|---------|----------|---------------------|
| `id` | integer | yes | The ID of a project |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `pending`, `running`, `failed`, `success`, `canceled`; showing all builds if none provided |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `created`, `pending`, `running`, `failed`, `success`, `canceled`, `skipped`; showing all builds if none provided |
```
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" "https://gitlab.example.com/api/v3/projects/1/builds"
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" 'https://gitlab.example.com/api/v3/projects/1/builds?scope%5B0%5D=pending&scope%5B1%5D=running'
```
Example of response
......@@ -132,10 +132,10 @@ GET /projects/:id/repository/commits/:sha/builds
|-----------|---------|----------|---------------------|
| `id` | integer | yes | The ID of a project |
| `sha` | string | yes | The SHA id of a commit |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `pending`, `running`, `failed`, `success`, `canceled`; showing all builds if none provided |
| `scope` | string **or** array of strings | no | The scope of builds to show, one or array of: `created`, `pending`, `running`, `failed`, `success`, `canceled`, `skipped`; showing all builds if none provided |
```
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" "https://gitlab.example.com/api/v3/projects/1/repository/commits/0ff3ae198f8601a285adcf5c0fff204ee6fba5fd/builds"
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" 'https://gitlab.example.com/api/v3/projects/1/repository/commits/0ff3ae198f8601a285adcf5c0fff204ee6fba5fd/builds?scope%5B0%5D=pending&scope%5B1%5D=running'
```
Example of response
......
......@@ -643,7 +643,7 @@ Parameters:
| `id` | integer | yes | The ID of the user |
```bash
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" https://gitlab.example.com/api/v3/user/:id/events
curl --header "PRIVATE-TOKEN: 9koXpg98eAheJpvBs5tK" https://gitlab.example.com/api/v3/users/:id/events
```
Example response:
......
......@@ -188,7 +188,7 @@ In order to do that, follow the steps:
image = "docker:latest"
privileged = false
disable_cache = false
volumes = ["/var/run/docker.sock", "/cache"]
volumes = ["/var/run/docker.sock:/var/run/docker.sock", "/cache"]
[runners.cache]
Insecure = false
```
......
......@@ -37,7 +37,7 @@ The registered runner will use the `ruby:2.1` docker image and will run two
services, `postgres:latest` and `mysql:latest`, both of which will be
accessible during the build process.
## What is image
## What is an image
The `image` keyword is the name of the docker image that is present in the
local Docker Engine (list all images with `docker images`) or any image that
......@@ -47,7 +47,7 @@ Hub please read the [Docker Fundamentals][] documentation.
In short, with `image` we refer to the docker image, which will be used to
create a container on which your build will run.
## What is service
## What is a service
The `services` keyword defines just another docker image that is run during
your build and is linked to the docker image that the `image` keyword defines.
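As a rough sketch of how the two keywords sit together in `.gitlab-ci.yml`
(reusing the `ruby:2.1` image and `postgres:latest` service mentioned above;
the job name and script are placeholders):

```yaml
image: ruby:2.1            # the container your build script runs in

services:
  - postgres:latest        # an extra, linked container available during the build

test:
  script:
    - bundle exec rake test
```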
......@@ -61,7 +61,7 @@ time the project is built.
You can see some widely used services examples in the relevant documentation of
[CI services examples](../services/README.md).
### How is service linked to the build
### How services are linked to the build
To better understand how the container linking works, read
[Linking containers together][linking-containers].
......
......@@ -146,13 +146,17 @@ variables:
```
These variables can be later used in all executed commands and scripts.
The YAML-defined variables are also set to all created service containers,
thus allowing to fine tune them.
thus allowing to fine tune them. Variables can be also defined on a
[job level](#job-variables).
Variables can be also defined on [job level](#job-variables).
Except for the user defined variables, there are also the ones set up by the
Runner itself. One example would be `CI_BUILD_REF_NAME` which has the value of
the branch or tag name for which project is built. Apart from the variables
you can set in `.gitlab-ci.yml`, there are also the so called secret variables
which can be set in GitLab's UI.
[Learn more about variables.](../variables/README.md)
[Learn more about variables.][variables]
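As a small, hedged sketch of that job-level override (the variable name and
values are illustrative only):

```yaml
variables:
  DATABASE_URL: "postgres://postgres@postgres/global_db"   # applies to every job

test_with_own_db:
  variables:
    DATABASE_URL: "postgres://postgres@postgres/job_db"    # overrides the global value for this job
  script:
    - bundle exec rake db:setup test
```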
### cache
......@@ -541,20 +545,29 @@ An example usage of manual actions is deployment to production.
> Introduced in GitLab 8.9.
`environment` is used to define that a job deploys to a specific [environment].
This allows easy tracking of all deployments to your environments straight from
GitLab.
> You can read more about environments and find more examples in the
[documentation about environments][environment].
`environment` is used to define that a job deploys to a specific environment.
If `environment` is specified and no environment under that name exists, a new
one will be created automatically.
The `environment` name must contain only letters, digits, '-', '_', '/', '$', '{', '}' and spaces. Common
names are `qa`, `staging`, and `production`, but you can use whatever name works
with your workflow.
The `environment` name can contain:
---
- letters
- digits
- spaces
- `-`
- `_`
- `/`
- `$`
- `{`
- `}`
**Example configurations**
Common names are `qa`, `staging`, and `production`, but you can use whatever
name works with your workflow.
In its simplest form, the `environment` keyword can be defined like:
```
deploy to production:
......@@ -563,39 +576,134 @@ deploy to production:
environment: production
```
The `deploy to production` job will be marked as doing deployment to
`production` environment.
In the above example, the `deploy to production` job will be marked as doing a
deployment to the `production` environment.
#### environment:name
> Introduced in GitLab 8.11.
>**Note:**
Before GitLab 8.11, the name of an environment could be defined as a string like
`environment: production`. The recommended way now is to define it under the
`name` keyword.
Instead of defining the name of the environment right after the `environment`
keyword, it is also possible to define it as a separate value. For that, use
the `name` keyword under `environment`:
```
deploy to production:
stage: deploy
script: git push production HEAD:master
environment:
name: production
```
#### environment:url
> Introduced in GitLab 8.11.
>**Note:**
Before GitLab 8.11, the URL could be added only in GitLab's UI. The
recommended way now is to define it in `.gitlab-ci.yml`.
This is an optional value that, when set, exposes buttons in various places
in GitLab which, when clicked, take you to the defined URL.
In the example below, if the job finishes successfully, it will create buttons
in the merge requests and in the environments/deployments pages which will point
to `https://prod.example.com`.
```
deploy to production:
stage: deploy
script: git push production HEAD:master
environment:
name: production
url: https://prod.example.com
```
#### environment:on_stop
> [Introduced][ce-6669] in GitLab 8.13.
Closing (stopping) environments can be achieved with the `on_stop` keyword defined under
`environment`. It declares a different job that runs in order to close
the environment.
Read the `environment:action` section for an example.
#### environment:action
> [Introduced][ce-6669] in GitLab 8.13.
The `action` keyword is to be used in conjunction with `on_stop` and is defined
in the job that is called to close the environment.
Take for instance:
```yaml
review_app:
stage: deploy
script: make deploy-app
environment:
name: review
on_stop: stop_review_app
stop_review_app:
stage: deploy
script: make delete-app
when: manual
environment:
name: review
action: stop
```
In the above example we set up the `review_app` job to deploy to the `review`
environment, and we also defined a new `stop_review_app` job under `on_stop`.
Once the `review_app` job is successfully finished, it will trigger the
`stop_review_app` job based on what is defined under `when`. In this case we
set it up to `manual` so it will need a [manual action](#manual-actions) via
GitLab's web interface in order to run.
The `stop_review_app` job is **required** to have the following keywords defined:
- `when` - [reference](#when)
- `environment:name`
- `environment:action`
#### dynamic environments
> [Introduced][ce-6323] in GitLab 8.12 and GitLab Runner 1.6.
`environment` can also represent a configuration hash with `name` and `url`.
These parameters can use any of the defined CI [variables](#variables)
These parameters can use any of the defined [CI variables](#variables)
(including predefined, secure variables and `.gitlab-ci.yml` variables).
The common use case is to create dynamic environments for branches and use them
as review apps.
---
**Example configurations**
For example:
```
deploy as review app:
stage: deploy
script: ...
script: make deploy
environment:
name: review-apps/$CI_BUILD_REF_NAME
url: https://$CI_BUILD_REF_NAME.review.example.com/
```
The `deploy as review app` job will be marked as deployment to dynamically
create the `review-apps/branch-name` environment.
create the `review-apps/$CI_BUILD_REF_NAME` environment, where `$CI_BUILD_REF_NAME`
is an [environment variable][variables] set by the Runner. If for example the
`deploy as review app` job was run in a branch named `pow`, this environment
should be accessible under `https://pow.review.example.com/`.
This environment should be accessible under `https://branch-name.review.example.com/`.
This of course implies that the underlying server which hosts the application
is properly configured.
You can see a simple example at https://gitlab.com/gitlab-examples/review-apps-nginx/.
The common use case is to create dynamic environments for branches and use them
as Review Apps. You can see a simple example using Review Apps at
https://gitlab.com/gitlab-examples/review-apps-nginx/.
### artifacts
......@@ -1105,3 +1213,5 @@ CI with various languages.
[examples]: ../examples/README.md
[ce-6323]: https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/6323
[environment]: ../environments.md
[ce-6669]: https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/6669
[variables]: ../variables/README.md
......@@ -14,7 +14,8 @@
- [Testing standards and style guidelines](testing.md)
- [UI guide](ui_guide.md) for building GitLab with existing CSS styles and elements
- [Frontend guidelines](frontend.md)
- [SQL guidelines](sql.md) for SQL guidelines
- [SQL guidelines](sql.md) for working with SQL queries
- [Sidekiq guidelines](sidekiq_style_guide.md) for working with Sidekiq workers
## Process
......
# Sidekiq Style Guide
This document outlines various guidelines that should be followed when adding or
modifying Sidekiq workers.
## Default Queue
Use of the "default" queue is not allowed. Every worker should use a queue that
matches the worker's purpose the closest. For example, workers that are to be
executed periodically should use the "cronjob" queue.
A list of all available queues can be found in `config/sidekiq_queues.yml`.
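For example, a minimal sketch of a periodically executed worker (the class
name is hypothetical) picks the `cronjob` queue through the concern added in
this merge instead of setting `sidekiq_options` by hand:

```ruby
class NightlyCleanupWorker # hypothetical, for illustration only
  include Sidekiq::Worker
  include CronjobQueue # sets queue: :cronjob and retry: false

  def perform
    # periodic housekeeping goes here
  end
end
```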
## Dedicated Queues
Most workers should use their own queue. To ease this process a worker can
include the `DedicatedSidekiqQueue` concern as follows:
```ruby
class ProcessSomethingWorker
include Sidekiq::Worker
include DedicatedSidekiqQueue
end
```
This will set the queue name based on the class' name, minus the `Worker`
suffix. In the above example this would lead to the queue being
`process_something`.
In some cases multiple workers do use the same queue. For example, the various
workers for updating CI pipelines all use the `pipeline` queue. Adding workers
to existing queues should be done with care, as adding more workers can lead to
slow jobs blocking work (even for different jobs) on the shared queue.
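The pipeline workers touched in this merge show what such sharing looks like:
several of them include the same concern and therefore end up on one
`pipeline` queue (abridged from the diff above):

```ruby
class PipelineHooksWorker
  include Sidekiq::Worker
  include PipelineQueue # shared `pipeline` queue

  # ...
end

class PipelineSuccessWorker
  include Sidekiq::Worker
  include PipelineQueue # same shared `pipeline` queue

  # ...
end
```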
## Tests
Each Sidekiq worker must be tested using RSpec, just like any other class. These
tests should be placed in `spec/workers`.
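A minimal sketch of such a spec, assuming the `ProcessSomethingWorker` from
the example above takes a record ID, might live in
`spec/workers/process_something_worker_spec.rb`:

```ruby
require 'spec_helper'

describe ProcessSomethingWorker do
  describe '#perform' do
    it 'performs without raising for a valid ID' do
      # Exercise the worker directly; queueing itself is Sidekiq's responsibility.
      expect { described_class.new.perform(1) }.not_to raise_error
    end
  end
end
```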
......@@ -72,7 +72,7 @@ sudo -u git -H git checkout 8-12-stable-ee
```bash
cd /home/git/gitlab-shell
sudo -u git -H git fetch --all --tags
sudo -u git -H git checkout v3.6.0
sudo -u git -H git checkout v3.6.1
```
### 6. Update gitlab-workhorse
......
......@@ -8,7 +8,7 @@ module API
#
# Parameters:
# id (required) - The ID of a project
# scope (optional) - The scope of builds to show (one or array of: pending, running, failed, success, canceled;
# scope (optional) - The scope of builds to show (one or array of: created, pending, running, failed, success, canceled, skipped;
# if none provided showing all builds)
# Example Request:
# GET /projects/:id/builds
......@@ -25,7 +25,7 @@ module API
# Parameters:
# id (required) - The ID of a project
# sha (required) - The SHA id of a commit
# scope (optional) - The scope of builds to show (one or array of: pending, running, failed, success, canceled;
# scope (optional) - The scope of builds to show (one or array of: created, pending, running, failed, success, canceled, skipped;
# if none provided showing all builds)
# Example Request:
# GET /projects/:id/repository/commits/:sha/builds
......
......@@ -19,7 +19,7 @@ module Gitlab
]
labels.each do |params|
::Labels::FindOrCreateService.new(project.owner, project).execute(params)
::Labels::FindOrCreateService.new(project.owner, project, params).execute
end
end
end
......
......@@ -29,5 +29,5 @@ namespace :cache do
task all: [:db, :redis]
end
task clear: 'cache:clear:all'
task clear: 'cache:clear:redis'
end
......@@ -70,4 +70,19 @@ describe Projects::LabelsController do
get :index, namespace_id: project.namespace.to_param, project_id: project.to_param
end
end
describe 'POST #generate' do
let(:admin) { create(:admin) }
let(:project) { create(:empty_project) }
before do
sign_in(admin)
end
it 'creates labels' do
post :generate, namespace_id: project.namespace.to_param, project_id: project.to_param
expect(response).to have_http_status(302)
end
end
end
......@@ -265,4 +265,10 @@ describe Group, models: true do
members
end
describe '#web_url' do
it 'returns the canonical URL' do
expect(group.web_url).to include("groups/#{group.name}")
end
end
end
......@@ -846,7 +846,7 @@ describe API::API, api: true do
end
end
describe 'PUT /user/:id/block' do
describe 'PUT /users/:id/block' do
before { admin }
it 'blocks existing user' do
put api("/users/#{user.id}/block", admin)
......@@ -873,7 +873,7 @@ describe API::API, api: true do
end
end
describe 'PUT /user/:id/unblock' do
describe 'PUT /users/:id/unblock' do
let(:blocked_user) { create(:user, state: 'blocked') }
before { admin }
......@@ -914,7 +914,7 @@ describe API::API, api: true do
end
end
describe 'GET /user/:id/events' do
describe 'GET /users/:id/events' do
let(:user) { create(:user) }
let(:project) { create(:empty_project) }
let(:note) { create(:note_on_issue, note: 'What an awesome day!', project: project) }
......
......@@ -46,4 +46,16 @@ describe MergeRequests::AssignIssuesService, services: true do
it 'assigns these to the merge request owner' do
expect { service.execute }.to change { issue.reload.assignee }.to(user)
end
it 'ignores external issues' do
external_issue = ExternalIssue.new('JIRA-123', project)
service = described_class.new(
project,
user,
merge_request: merge_request,
closes_issues: [external_issue]
)
expect(service.assignable_issues.count).to eq 0
end
end
require 'spec_helper'
describe BuildQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include BuildQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('build')
end
end
require 'spec_helper'
describe CronjobQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include CronjobQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('cronjob')
end
it 'disables retrying of failed jobs' do
expect(worker.sidekiq_options['retry']).to eq(false)
end
end
require 'spec_helper'
describe DedicatedSidekiqQueue do
let(:worker) do
Class.new do
def self.name
'Foo::Bar::DummyWorker'
end
include Sidekiq::Worker
include DedicatedSidekiqQueue
end
end
describe 'queue names' do
it 'sets the queue name based on the class name' do
expect(worker.sidekiq_options['queue']).to eq('foo_bar_dummy')
end
end
end
require 'spec_helper'
describe PipelineQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include PipelineQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('pipeline')
end
end
require 'spec_helper'
describe RepositoryCheckQueue do
let(:worker) do
Class.new do
include Sidekiq::Worker
include RepositoryCheckQueue
end
end
it 'sets the queue name of a worker' do
expect(worker.sidekiq_options['queue'].to_s).to eq('repository_check')
end
it 'disables retrying of failed jobs' do
expect(worker.sidekiq_options['retry']).to eq(false)
end
end
require 'spec_helper'
describe 'Every Sidekiq worker' do
let(:workers) do
root = Rails.root.join('app', 'workers')
concerns = root.join('concerns').to_s
workers = Dir[root.join('**', '*.rb')].
reject { |path| path.start_with?(concerns) }
workers.map do |path|
ns = Pathname.new(path).relative_path_from(root).to_s.gsub('.rb', '')
ns.camelize.constantize
end
end
it 'does not use the default queue' do
workers.each do |worker|
expect(worker.sidekiq_options['queue'].to_s).not_to eq('default')
end
end
it 'uses the cronjob queue when the worker runs as a cronjob' do
cron_workers = Settings.cron_jobs.
map { |job_name, options| options['job_class'].constantize }.
to_set
workers.each do |worker|
next unless cron_workers.include?(worker)
expect(worker.sidekiq_options['queue'].to_s).to eq('cronjob')
end
end
it 'defines the queue in the Sidekiq configuration file' do
config = YAML.load_file(Rails.root.join('config', 'sidekiq_queues.yml').to_s)
queue_names = config[:queues].map { |(queue, _)| queue }.to_set
workers.each do |worker|
expect(queue_names).to include(worker.sidekiq_options['queue'].to_s)
end
end
end