Commit 567ef6c1 authored by Lin Jen-Shin

Merge remote-tracking branch 'upstream/master' into artifacts-from-ref-and-build-name

* upstream/master: (123 commits)
  Limit the size of SVGs when viewing them as blobs
  Add a spec for ProjectsFinder project_ids_relation option
  Fix ProjectsFinder spec
  Pass project IDs relation to ProjectsFinder instead of using a block
  Speed up todos queries by limiting the projects set we join with
  Used phantomjs variable
  Ability to specify branches for pivotal tracker integration
  Fix a memory leak caused by Banzai::Filter::SanitizationFilter
  Update phantomjs link
  Remove sleeping and replace escaped text.
  Filters test fix
  Fixed filtering tests
  Removed screenshot command 💩
  Updated failing tests
  Used mirrored version on GitLab
  Updated tests
  Fix `U2fSpec` for PhantomJS versions > 2.
  Fix file downloading
  Install latest stable phantomjs
  Use new PhantomJS version
  ...
parents 44b29b5b 06e3bb9f
image: "ruby:2.1"
image: "ruby:2.3.1"
cache:
key: "ruby21"
key: "ruby-231"
paths:
- vendor/apt
- vendor/ruby
......@@ -15,6 +15,7 @@ variables:
USE_DB: "true"
USE_BUNDLE_INSTALL: "true"
GIT_DEPTH: "20"
PHANTOMJS_VERSION: "2.1.1"
before_script:
- source ./scripts/prepare_build.sh
......@@ -138,57 +139,57 @@ spinach 7 10: *spinach-knapsack
spinach 8 10: *spinach-knapsack
spinach 9 10: *spinach-knapsack
# Execute all testing suites against Ruby 2.3
.ruby-23: &ruby-23
image: "ruby:2.3"
# Execute all testing suites against Ruby 2.1
.ruby-21: &ruby-21
image: "ruby:2.1"
<<: *use-db
only:
- master
cache:
key: "ruby-23"
key: "ruby21"
paths:
- vendor/apt
- vendor/ruby
.rspec-knapsack-ruby23: &rspec-knapsack-ruby23
.rspec-knapsack-ruby21: &rspec-knapsack-ruby21
<<: *rspec-knapsack
<<: *ruby-23
<<: *ruby-21
.spinach-knapsack-ruby23: &spinach-knapsack-ruby23
.spinach-knapsack-ruby21: &spinach-knapsack-ruby21
<<: *spinach-knapsack
<<: *ruby-23
<<: *ruby-21
rspec 0 20 ruby23: *rspec-knapsack-ruby23
rspec 1 20 ruby23: *rspec-knapsack-ruby23
rspec 2 20 ruby23: *rspec-knapsack-ruby23
rspec 3 20 ruby23: *rspec-knapsack-ruby23
rspec 4 20 ruby23: *rspec-knapsack-ruby23
rspec 5 20 ruby23: *rspec-knapsack-ruby23
rspec 6 20 ruby23: *rspec-knapsack-ruby23
rspec 7 20 ruby23: *rspec-knapsack-ruby23
rspec 8 20 ruby23: *rspec-knapsack-ruby23
rspec 9 20 ruby23: *rspec-knapsack-ruby23
rspec 10 20 ruby23: *rspec-knapsack-ruby23
rspec 11 20 ruby23: *rspec-knapsack-ruby23
rspec 12 20 ruby23: *rspec-knapsack-ruby23
rspec 13 20 ruby23: *rspec-knapsack-ruby23
rspec 14 20 ruby23: *rspec-knapsack-ruby23
rspec 15 20 ruby23: *rspec-knapsack-ruby23
rspec 16 20 ruby23: *rspec-knapsack-ruby23
rspec 17 20 ruby23: *rspec-knapsack-ruby23
rspec 18 20 ruby23: *rspec-knapsack-ruby23
rspec 19 20 ruby23: *rspec-knapsack-ruby23
rspec 0 20 ruby21: *rspec-knapsack-ruby21
rspec 1 20 ruby21: *rspec-knapsack-ruby21
rspec 2 20 ruby21: *rspec-knapsack-ruby21
rspec 3 20 ruby21: *rspec-knapsack-ruby21
rspec 4 20 ruby21: *rspec-knapsack-ruby21
rspec 5 20 ruby21: *rspec-knapsack-ruby21
rspec 6 20 ruby21: *rspec-knapsack-ruby21
rspec 7 20 ruby21: *rspec-knapsack-ruby21
rspec 8 20 ruby21: *rspec-knapsack-ruby21
rspec 9 20 ruby21: *rspec-knapsack-ruby21
rspec 10 20 ruby21: *rspec-knapsack-ruby21
rspec 11 20 ruby21: *rspec-knapsack-ruby21
rspec 12 20 ruby21: *rspec-knapsack-ruby21
rspec 13 20 ruby21: *rspec-knapsack-ruby21
rspec 14 20 ruby21: *rspec-knapsack-ruby21
rspec 15 20 ruby21: *rspec-knapsack-ruby21
rspec 16 20 ruby21: *rspec-knapsack-ruby21
rspec 17 20 ruby21: *rspec-knapsack-ruby21
rspec 18 20 ruby21: *rspec-knapsack-ruby21
rspec 19 20 ruby21: *rspec-knapsack-ruby21
spinach 0 10 ruby23: *spinach-knapsack-ruby23
spinach 1 10 ruby23: *spinach-knapsack-ruby23
spinach 2 10 ruby23: *spinach-knapsack-ruby23
spinach 3 10 ruby23: *spinach-knapsack-ruby23
spinach 4 10 ruby23: *spinach-knapsack-ruby23
spinach 5 10 ruby23: *spinach-knapsack-ruby23
spinach 6 10 ruby23: *spinach-knapsack-ruby23
spinach 7 10 ruby23: *spinach-knapsack-ruby23
spinach 8 10 ruby23: *spinach-knapsack-ruby23
spinach 9 10 ruby23: *spinach-knapsack-ruby23
spinach 0 10 ruby21: *spinach-knapsack-ruby21
spinach 1 10 ruby21: *spinach-knapsack-ruby21
spinach 2 10 ruby21: *spinach-knapsack-ruby21
spinach 3 10 ruby21: *spinach-knapsack-ruby21
spinach 4 10 ruby21: *spinach-knapsack-ruby21
spinach 5 10 ruby21: *spinach-knapsack-ruby21
spinach 6 10 ruby21: *spinach-knapsack-ruby21
spinach 7 10 ruby21: *spinach-knapsack-ruby21
spinach 8 10 ruby21: *spinach-knapsack-ruby21
spinach 9 10 ruby21: *spinach-knapsack-ruby21
# Other generic tests
......
#
# This list is used by git-shortlog to make contributions from the
# same person appear as coming from a single identity.
#
Achilleas Pipinellis <axilleas@axilleas.me> <axilleas@archlinux.gr>
Achilleas Pipinellis <axilleas@axilleas.me> <axilleas@users.noreply.github.com>
Dmitriy Zaporozhets <dzaporozhets@gitlab.com> <dmitriy.zaporozhets@gmail.com>
Dmitriy Zaporozhets <dzaporozhets@gitlab.com> <dzaporozhets@sphereconsultinginc.com>
Douwe Maan <douwe@gitlab.com> <douwe@selenight.nl>
Douwe Maan <douwe@gitlab.com> <me@douwe.me>
Grzegorz Bizon <grzegorz@gitlab.com> <grzegorz.bizon@ntsn.pl>
Grzegorz Bizon <grzegorz@gitlab.com> <grzesiek.bizon@gmail.com>
Jacob Vosmaer <jacob@gitlab.com> <contact@jacobvosmaer.nl>
Jacob Vosmaer <jacob@gitlab.com> Jacob Vosmaer (GitLab) <jacob@gitlab.com>
Jacob Schatz <jschatz@gitlab.com> <jacobschatz@Jacobs-MacBook-Pro.local>
Jacob Schatz <jschatz@gitlab.com> <jacobschatz@Jacobs-MBP.fios-router.home>
Jacob Schatz <jschatz@gitlab.com> <jschatz1@gmail.com>
James Lopez <james@jameslopez.es> <james@gitlab.com>
James Lopez <james@jameslopez.es> <james.lopez@vodafone.com>
Kamil Trzciński <kamil@gitlab.com> <ayufan@ayufan.eu>
Marin Jankovski <maxlazio@gmail.com> <marin@gitlab.com>
Phil Hughes <me@iamphill.com> <theephil@gmail.com>
Rémy Coutable <remy@rymai.me> <remy@gitlab.com>
Robert Schilling <rschilling@student.tugraz.at> <Razer6@users.noreply.github.com>
Robert Schilling <rschilling@student.tugraz.at> <schilling.ro@gmail.com>
Robert Speicher <robert@gitlab.com> <rspeicher@gmail.com>
Stan Hu <stanhu@gmail.com> <stanhu@alum.mit.edu>
Stan Hu <stanhu@gmail.com> <stanhu@packetzoom.com>
Stan Hu <stanhu@gmail.com> <stanhu@users.noreply.github.com>
Stan Hu <stanhu@gmail.com> stanhu <stanhu@gmail.com>
Sytse Sijbrandij <sytse@gitlab.com> <sytse+admin@gitlab.com>
Sytse Sijbrandij <sytse@gitlab.com> <sytse@dosire.com>
Sytse Sijbrandij <sytse@gitlab.com> <sytses@gmail.com>
Sytse Sijbrandij <sytse@gitlab.com> dosire <sytse@gitlab.com>
Please view this file on the master branch; on stable branches it's out of date.
v 8.11.0 (unreleased)
- Remove the http_parser.rb dependency by removing the tinder gem. !5758 (tbalthazar)
- Ability to specify branches for Pivotal Tracker integration (Egor Lynko)
- Fix: don't pass a local variable called `i` to a partial. !20510 (herminiotorres)
- Fix: rename `add_users_into_project` and `projects_ids`. !20512 (herminiotorres)
- Fix the title of the toggle dropdown button. !5515 (herminiotorres)
- Rename `markdown_preview` routes to `preview_markdown`. (Christopher Bartz)
- Update to Ruby 2.3.1. !4948
- Improve diff performance by eliminating redundant checks for text blobs
- Ensure that branch names containing escapable characters (e.g. %20) aren't unescaped indiscriminately. !5770 (ewiltshi)
- Convert switch icon into icon font (ClemMakesApps)
- API: Endpoints for enabling and disabling deploy keys
- API: List access requests, request access, approve, and deny access requests to a project or a group. !4833
- Use long options for curl examples in documentation !5703 (winniehell)
- Remove magic comments (`# encoding: UTF-8`) from Ruby files. !5456 (winniehell)
- Add support for relative links starting with ./ or / to RelativeLinkFilter (winniehell)
- Ignore URLs starting with // in Markdown links !5677 (winniehell)
- Fix CI status icon link underline (ClemMakesApps)
- The Repository class is now instrumented
- Fix filter label tooltip HTML rendering (ClemMakesApps)
- Cache the commit author in RequestStore to avoid extra lookups in PostReceive
- Add a button to download latest successful artifacts for branches and tags
- Expand commit message width in repo view (ClemMakesApps)
- Cache highlighted diff lines for merge requests
- Pre-create all builds for a Pipeline when the new Pipeline is created !5295
- Fix 'Commits being passed to custom hooks are already reachable when using the UI'
- Show member roles to all users on members page
- Project.visible_to_user is instrumented again
- Fix awardable button mutuality loading spinners (ClemMakesApps)
- Add support for using RequestStore within Sidekiq tasks via SIDEKIQ_REQUEST_STORE env variable
- Optimize maximum user access level lookup in loading of notes
- Add "No one can push" as an option for protected branches. !5081
- Improve performance of AutolinkFilter#text_parse by using XPath
- Add experimental Redis Sentinel support !1877
- Rendering of SVGs as blobs is now limited to SVGs with a size smaller than or equal to 2 MB
- Fix branches page dropdown sort initial state (ClemMakesApps)
- Environments have a URL to link to
- Various redundant database indexes have been removed
- Update `timeago` plugin to use multiple string/locale settings
- Remove unused images (ClemMakesApps)
- Limit git rev-list output count to one in forced push check
- Show deployment status on merge requests with external URLs
- Clean up unused routes (Josef Strzibny)
- Fix issue on empty project to allow developers to only push to protected branches if given permission
- Add green outline to New Branch button. !5447 (winniehell)
......@@ -37,10 +52,13 @@ v 8.11.0 (unreleased)
- Retrieve rendered HTML from cache in one request
- Fix renaming repository when name contains invalid characters under project settings
- Upgrade Grape from 0.13.0 to 0.15.0. !4601
- Trigram indexes for the "ci_runners" table have been removed to speed up UPDATE queries
- Fix devise deprecation warnings.
- Update version_sorter and use new interface for faster tag sorting
- Optimize checking if a user has read access to a list of issues !5370
- Store all DB secrets in secrets.yml, under descriptive names !5274
- Nokogiri's various parsing methods are now instrumented
- Add archived badge to project list !5798
- Add simple identifier to public SSH keys (muteor)
- Admin page now references docs instead of a specific file !5600 (AnAverageHuman)
- Add a way to send an email and create an issue based on a private personal token. Find the email address on the issues page. !3363
......@@ -67,6 +85,7 @@ v 8.11.0 (unreleased)
- Add GitLab Workhorse version to admin dashboard (Katarzyna Kobierska Ula Budziszewska)
- Allow branch names ending with .json for graph and network page !5579 (winniehell)
- Add the `sprockets-es6` gem
- Improve OAuth2 client documentation (muteor)
- Multiple trigger variables show in separate lines (Katarzyna Kobierska Ula Budziszewska)
- Profile requests when a header is passed
- Avoid calculating line_code and position for the _line partial when showing diff notes on the discussion tab.
......@@ -79,8 +98,10 @@ v 8.11.0 (unreleased)
- Bump gitlab_git to lazy load compare commits
- Reduce number of queries made for merge_requests/:id/diffs
- Sensible state-specific default sort order for issues and merge requests !5453 (tomb0y)
- Fix bug where destroying a namespace would not always destroy projects
- Fix RequestProfiler::Middleware error when code is reloaded in development
- Catch what warden might throw when profiling requests to re-throw it
- Avoid commit lookup in diff_helper by passing an existing local variable to the helper method
- Add description to new_issue email and new_merge_request_email in text/plain content type. !5663 (dixpac)
- Speed up and reduce memory usage of Commit#repo_changes, Repository#expire_avatar_cache and IrkerWorker
- Add unfold links for Side-by-Side view. !5415 (Tim Masliuchenko)
......@@ -89,8 +110,15 @@ v 8.11.0 (unreleased)
- Avoid showing the original password field when the password is automatically set. !5712 (duduribeiro)
- Fix importing GitLab projects with an invalid MR source project
- Sort folders with submodules in Files view !5521
- Replace each `File::exists?` with `File::exist?`, since the former is deprecated as of Ruby 2.2.0
- Add auto-completion in pipeline (Katarzyna Kobierska Ula Budziszewska)
- Fix a memory leak caused by Banzai::Filter::SanitizationFilter
- Speed up todos queries by limiting the projects set we join with
v 8.10.5 (unreleased)
v 8.10.5
- Add a data migration to fix some missing timestamps in the members table. !5670
- Revert the "Defend against 'Host' header injection" change in the source NGINX templates. !5706
- Cache project count for 5 minutes to reduce DB load. !5746 & !5754
v 8.10.4
- Don't close referenced upstream issues from a forked project.
......@@ -250,6 +278,7 @@ v 8.10.0
- Fix new snippet style bug (elliotec)
- Instrument Rinku usage
- Be explicit to define merge request discussion variables
- Use cache for todos counter calling TodoService
- Metrics for Rouge::Plugins::Redcarpet and Rouge::Formatters::HTMLGitlab
- RailsCache metrics now include fetch_hit/fetch_miss and read_hit/read_miss info.
- Allow [ci skip] to be in any case and allow [skip ci]. !4785 (simon_w)
......
source 'https://rubygems.org'
gem 'rails', '4.2.7'
gem 'rails', '4.2.7.1'
gem 'rails-deprecated_sanitizer', '~> 1.0.3'
# Responders respond_to and respond_with
......@@ -163,9 +163,6 @@ gem 'redis-rails', '~> 4.0.0'
gem 'redis', '~> 3.2'
gem 'connection_pool', '~> 2.0'
# Campfire integration
gem 'tinder', '~> 1.10.0'
# HipChat integration
gem 'hipchat', '~> 1.5.0'
......
......@@ -3,34 +3,34 @@ GEM
specs:
RedCloth (4.3.2)
ace-rails-ap (4.0.2)
actionmailer (4.2.7)
actionpack (= 4.2.7)
actionview (= 4.2.7)
activejob (= 4.2.7)
actionmailer (4.2.7.1)
actionpack (= 4.2.7.1)
actionview (= 4.2.7.1)
activejob (= 4.2.7.1)
mail (~> 2.5, >= 2.5.4)
rails-dom-testing (~> 1.0, >= 1.0.5)
actionpack (4.2.7)
actionview (= 4.2.7)
activesupport (= 4.2.7)
actionpack (4.2.7.1)
actionview (= 4.2.7.1)
activesupport (= 4.2.7.1)
rack (~> 1.6)
rack-test (~> 0.6.2)
rails-dom-testing (~> 1.0, >= 1.0.5)
rails-html-sanitizer (~> 1.0, >= 1.0.2)
actionview (4.2.7)
activesupport (= 4.2.7)
actionview (4.2.7.1)
activesupport (= 4.2.7.1)
builder (~> 3.1)
erubis (~> 2.7.0)
rails-dom-testing (~> 1.0, >= 1.0.5)
rails-html-sanitizer (~> 1.0, >= 1.0.2)
activejob (4.2.7)
activesupport (= 4.2.7)
activejob (4.2.7.1)
activesupport (= 4.2.7.1)
globalid (>= 0.3.0)
activemodel (4.2.7)
activesupport (= 4.2.7)
activemodel (4.2.7.1)
activesupport (= 4.2.7.1)
builder (~> 3.1)
activerecord (4.2.7)
activemodel (= 4.2.7)
activesupport (= 4.2.7)
activerecord (4.2.7.1)
activemodel (= 4.2.7.1)
activesupport (= 4.2.7.1)
arel (~> 6.0)
activerecord-session_store (1.0.0)
actionpack (>= 4.0, < 5.1)
......@@ -38,7 +38,7 @@ GEM
multi_json (~> 1.11, >= 1.11.2)
rack (>= 1.5.2, < 3)
railties (>= 4.0, < 5.1)
activesupport (4.2.7)
activesupport (4.2.7.1)
i18n (~> 0.7)
json (~> 1.7, >= 1.7.7)
minitest (~> 5.1)
......@@ -289,7 +289,7 @@ GEM
omniauth (~> 1.0)
pyu-ruby-sasl (~> 0.0.3.1)
rubyntlm (~> 0.3)
globalid (0.3.6)
globalid (0.3.7)
activesupport (>= 4.1.0)
gollum-grit_adapter (1.0.1)
gitlab-grit (~> 2.7, >= 2.7.1)
......@@ -335,7 +335,6 @@ GEM
activesupport (>= 2)
nokogiri (~> 1.4)
htmlentities (4.3.4)
http_parser.rb (0.5.3)
httparty (0.13.7)
json (~> 1.8)
multi_xml (>= 0.5.2)
......@@ -519,16 +518,16 @@ GEM
rack
rack-test (0.6.3)
rack (>= 1.0)
rails (4.2.7)
actionmailer (= 4.2.7)
actionpack (= 4.2.7)
actionview (= 4.2.7)
activejob (= 4.2.7)
activemodel (= 4.2.7)
activerecord (= 4.2.7)
activesupport (= 4.2.7)
rails (4.2.7.1)
actionmailer (= 4.2.7.1)
actionpack (= 4.2.7.1)
actionview (= 4.2.7.1)
activejob (= 4.2.7.1)
activemodel (= 4.2.7.1)
activerecord (= 4.2.7.1)
activesupport (= 4.2.7.1)
bundler (>= 1.3.0, < 2.0)
railties (= 4.2.7)
railties (= 4.2.7.1)
sprockets-rails
rails-deprecated_sanitizer (1.0.3)
activesupport (>= 4.2.0.alpha)
......@@ -538,9 +537,9 @@ GEM
rails-deprecated_sanitizer (>= 1.0.1)
rails-html-sanitizer (1.0.3)
loofah (~> 2.0)
railties (4.2.7)
actionpack (= 4.2.7)
activesupport (= 4.2.7)
railties (4.2.7.1)
actionpack (= 4.2.7.1)
activesupport (= 4.2.7.1)
rake (>= 0.8.7)
thor (>= 0.18.1, < 2.0)
rainbow (2.1.0)
......@@ -672,7 +671,6 @@ GEM
redis-namespace (>= 1.5.2)
rufus-scheduler (>= 2.0.24)
sidekiq (>= 4.0.0)
simple_oauth (0.1.9)
simplecov (0.12.0)
docile (~> 1.1.0)
json (>= 1.8, < 3)
......@@ -742,21 +740,8 @@ GEM
tilt (2.0.5)
timecop (0.8.1)
timfel-krb5-auth (0.8.3)
tinder (1.10.1)
eventmachine (~> 1.0)
faraday (~> 0.9.0)
faraday_middleware (~> 0.9)
hashie (>= 1.0)
json (~> 1.8.0)
mime-types
multi_json (~> 1.7)
twitter-stream (~> 0.1)
turbolinks (2.5.3)
coffee-rails
twitter-stream (0.1.16)
eventmachine (>= 0.12.8)
http_parser.rb (~> 0.5.1)
simple_oauth (~> 0.1.4)
tzinfo (1.2.2)
thread_safe (~> 0.1)
u2f (0.2.1)
......@@ -929,7 +914,7 @@ DEPENDENCIES
rack-attack (~> 4.3.1)
rack-cors (~> 0.4.0)
rack-oauth2 (~> 1.2.1)
rails (= 4.2.7)
rails (= 4.2.7.1)
rails-deprecated_sanitizer (~> 1.0.3)
rainbow (~> 2.1.0)
rblineprof (~> 0.3.6)
......@@ -981,7 +966,6 @@ DEPENDENCIES
teaspoon-jasmine (~> 2.2.0)
test_after_commit (~> 0.4.2)
thin (~> 1.7.0)
tinder (~> 1.10.0)
turbolinks (~> 2.5.0)
u2f (~> 0.2.1)
uglifier (~> 2.7.2)
......
......@@ -161,21 +161,9 @@
$emojiButton = votesBlock.find("[data-emoji=" + mutualVote + "]").parent();
isAlreadyVoted = $emojiButton.hasClass('active');
if (isAlreadyVoted) {
this.showEmojiLoader($emojiButton);
return this.addAward(votesBlock, awardUrl, mutualVote, false, function() {
return $emojiButton.removeClass('is-loading');
});
}
this.addAward(votesBlock, awardUrl, mutualVote, false);
}
};
AwardsHandler.prototype.showEmojiLoader = function($emojiButton) {
var $loader;
$loader = $emojiButton.find('.fa-spinner');
if (!$loader.length) {
$emojiButton.append('<i class="fa fa-spinner fa-spin award-control-icon award-control-icon-loading"></i>');
}
return $emojiButton.addClass('is-loading');
};
AwardsHandler.prototype.isActive = function($emojiButton) {
......
......@@ -186,6 +186,12 @@
break;
case 'projects':
new NamespaceSelects();
break;
case 'labels':
switch (path[2]) {
case 'edit':
new Labels();
}
}
break;
case 'dashboard':
......@@ -211,6 +217,7 @@
new ProjectNew();
break;
case 'show':
new Star();
new ProjectNew();
new ProjectShow();
new NotificationsDropdown();
......
/*= require markdown_preview */
/*= require preview_markdown */
(function() {
this.DropzoneInput = (function() {
......
......@@ -5,13 +5,10 @@
this.Issuable = {
init: function() {
if (!issuable_created) {
issuable_created = true;
Issuable.initTemplates();
Issuable.initSearch();
Issuable.initChecks();
return Issuable.initLabelFilterRemove();
}
},
initTemplates: function() {
return Issuable.labelRow = _.template('<% _.each(labels, function(label){ %> <span class="label-row btn-group" role="group" aria-label="<%- label.title %>" style="color: <%- label.text_color %>;"> <a href="#" class="btn btn-transparent has-tooltip" style="background-color: <%- label.color %>;" title="<%- label.description %>" data-container="body"> <%- label.title %> </a> <button type="button" class="btn btn-transparent label-remove js-label-filter-remove" style="background-color: <%- label.color %>;" data-label="<%- label.title %>"> <i class="fa fa-times"></i> </button> </span> <% }); %>');
......
......@@ -28,7 +28,7 @@
};
MarkdownPreview.prototype.renderMarkdown = function(text, success) {
if (!window.markdown_preview_path) {
if (!window.preview_markdown_path) {
return;
}
if (text === this.ajaxCache.text) {
......@@ -36,7 +36,7 @@
}
return $.ajax({
type: 'POST',
url: window.markdown_preview_path,
url: window.preview_markdown_path,
data: {
text: text
},
......
.environments {
.commit-title {
margin: 0;
}
.fa-play {
font-size: 14px;
}
.dropdown-new {
color: $table-text-gray;
}
.dropdown-menu {
.fa {
margin-right: 6px;
color: $table-text-gray;
}
}
.branch-name {
color: $gl-dark-link-color;
}
}
.table.builds.environments {
min-width: 500px;
.icon-container {
width: 20px;
text-align: center;
}
}
......@@ -182,6 +182,17 @@
.btn {
color: inherit;
}
a.btn {
padding: 0;
.has-tooltip {
top: 0;
border-top-right-radius: 0;
border-bottom-right-radius: 0;
line-height: 1.1;
}
}
}
.label-options-toggle {
......
......@@ -69,6 +69,10 @@
&.ci-success {
color: $gl-success;
a.environment {
color: inherit;
}
}
&.ci-success_with_warnings {
......@@ -126,7 +130,6 @@
&.has-conflicts .fa-exclamation-triangle {
color: $gl-warning;
}
}
p:last-child {
......
......@@ -48,9 +48,9 @@ class Admin::GroupsController < Admin::ApplicationController
end
def destroy
DestroyGroupService.new(@group, current_user).execute
DestroyGroupService.new(@group, current_user).async_execute
redirect_to admin_groups_path, notice: 'Group was successfully deleted.'
redirect_to admin_groups_path, alert: "Group '#{@group.name}' was scheduled for deletion."
end
private
......
......@@ -37,8 +37,8 @@ class Dashboard::TodosController < Dashboard::ApplicationController
def todos_counts
{
count: TodosFinder.new(current_user, state: :pending).execute.count,
done_count: TodosFinder.new(current_user, state: :done).execute.count
count: current_user.todos_pending_count,
done_count: current_user.todos_done_count
}
end
end
......@@ -87,9 +87,9 @@ class GroupsController < Groups::ApplicationController
end
def destroy
DestroyGroupService.new(@group, current_user).execute
DestroyGroupService.new(@group, current_user).async_execute
redirect_to root_path, alert: "Group '#{@group.name}' was successfully deleted."
redirect_to root_path, alert: "Group '#{@group.name}' was scheduled for deletion."
end
protected
......
......@@ -8,8 +8,9 @@ class Projects::BadgesController < Projects::ApplicationController
respond_to do |format|
format.html { render_404 }
format.svg do
send_data(badge.data, type: badge.type, disposition: 'inline')
render 'badge', locals: { badge: badge.template }
end
end
end
......
class Projects::BranchesController < Projects::ApplicationController
include ActionView::Helpers::SanitizeHelper
include SortingHelper
# Authorize
before_action :require_non_empty_project
before_action :authorize_download_code!
before_action :authorize_push_code!, only: [:new, :create, :destroy]
def index
@sort = params[:sort].presence || 'name'
@sort = params[:sort].presence || sort_value_name
@branches = BranchesFinder.new(@repository, params).execute
@branches = Kaminari.paginate_array(@branches).page(params[:page])
......
......@@ -6,7 +6,7 @@ class Projects::BuildsController < Projects::ApplicationController
def index
@scope = params[:scope]
@all_builds = project.builds
@all_builds = project.builds.relevant
@builds = @all_builds.order('created_at DESC')
@builds =
case @scope
......
......@@ -134,8 +134,8 @@ class Projects::CommitController < Projects::ApplicationController
end
def define_status_vars
@statuses = CommitStatus.where(pipeline: pipelines)
@builds = Ci::Build.where(pipeline: pipelines)
@statuses = CommitStatus.where(pipeline: pipelines).relevant
@builds = Ci::Build.where(pipeline: pipelines).relevant
end
def assign_change_commit_vars(mr_source_branch)
......
# This file should be identical in GitLab Community Edition and Enterprise Edition
class Projects::GitHttpClientController < Projects::ApplicationController
include ActionController::HttpAuthentication::Basic
include KerberosSpnegoHelper
attr_reader :user
# Git clients will not know what authenticity token to send along
skip_before_action :verify_authenticity_token
skip_before_action :repository
before_action :authenticate_user
before_action :ensure_project_found!
private
def authenticate_user
if project && project.public? && download_request?
return # Allow access
end
if allow_basic_auth? && basic_auth_provided?
login, password = user_name_and_password(request)
auth_result = Gitlab::Auth.find_for_git_client(login, password, project: project, ip: request.ip)
if auth_result.type == :ci && download_request?
@ci = true
elsif auth_result.type == :oauth && !download_request?
# Not allowed
else
@user = auth_result.user
end
if ci? || user
return # Allow access
end
elsif allow_kerberos_spnego_auth? && spnego_provided?
@user = find_kerberos_user
if user
send_final_spnego_response
return # Allow access
end
end
send_challenges
render plain: "HTTP Basic: Access denied\n", status: 401
end
def basic_auth_provided?
has_basic_credentials?(request)
end
def send_challenges
challenges = []
challenges << 'Basic realm="GitLab"' if allow_basic_auth?
challenges << spnego_challenge if allow_kerberos_spnego_auth?
headers['Www-Authenticate'] = challenges.join("\n") if challenges.any?
end
def ensure_project_found!
render_not_found if project.blank?
end
def project
return @project if defined?(@project)
project_id, _ = project_id_with_suffix
if project_id.blank?
@project = nil
else
@project = Project.find_with_namespace("#{params[:namespace_id]}/#{project_id}")
end
end
# This method returns two values so that we can parse
# params[:project_id] (untrusted input!) in exactly one place.
def project_id_with_suffix
id = params[:project_id] || ''
%w[.wiki.git .git].each do |suffix|
if id.end_with?(suffix)
# Be careful to only remove the suffix from the end of 'id'.
# Accidentally removing it from the middle is how security
# vulnerabilities happen!
return [id.slice(0, id.length - suffix.length), suffix]
end
end
# Something is wrong with params[:project_id]; do not pass it on.
[nil, nil]
end
def repository
_, suffix = project_id_with_suffix
if suffix == '.wiki.git'
project.wiki.repository
else
project.repository
end
end
def render_not_found
render plain: 'Not Found', status: :not_found
end
def ci?
@ci.present?
end
end
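Because the suffix handling above is security-sensitive, here is a minimal standalone sketch of the same rule. The method below is a simplified copy of project_id_with_suffix that takes the id as an argument instead of reading params[:project_id]; the example project names are made up.

def project_id_with_suffix(id)
  %w[.wiki.git .git].each do |suffix|
    # Strip the suffix only from the end of 'id', never from the middle.
    return [id.slice(0, id.length - suffix.length), suffix] if id.end_with?(suffix)
  end
  [nil, nil] # anything without a recognised suffix is not passed on
end

project_id_with_suffix('gitlab-ce.wiki.git') # => ["gitlab-ce", ".wiki.git"]
project_id_with_suffix('gitlab-ce.git')      # => ["gitlab-ce", ".git"]
project_id_with_suffix('gitlab-ce')          # => [nil, nil]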
# This file should be identical in GitLab Community Edition and Enterprise Edition
class Projects::GitHttpController < Projects::ApplicationController
include ActionController::HttpAuthentication::Basic
include KerberosSpnegoHelper
attr_reader :user
# Git clients will not know what authenticity token to send along
skip_before_action :verify_authenticity_token
skip_before_action :repository
before_action :authenticate_user
before_action :ensure_project_found!
class Projects::GitHttpController < Projects::GitHttpClientController
# GET /foo/bar.git/info/refs?service=git-upload-pack (git pull)
# GET /foo/bar.git/info/refs?service=git-receive-pack (git push)
def info_refs
......@@ -46,81 +35,8 @@ class Projects::GitHttpController < Projects::ApplicationController
private
def authenticate_user
if project && project.public? && upload_pack?
return # Allow access
end
if allow_basic_auth? && basic_auth_provided?
login, password = user_name_and_password(request)
auth_result = Gitlab::Auth.find_for_git_client(login, password, project: project, ip: request.ip)
if auth_result.type == :ci && upload_pack?
@ci = true
elsif auth_result.type == :oauth && !upload_pack?
# Not allowed
else
@user = auth_result.user
end
if ci? || user
return # Allow access
end
elsif allow_kerberos_spnego_auth? && spnego_provided?
@user = find_kerberos_user
if user
send_final_spnego_response
return # Allow access
end
end
send_challenges
render plain: "HTTP Basic: Access denied\n", status: 401
end
def basic_auth_provided?
has_basic_credentials?(request)
end
def send_challenges
challenges = []
challenges << 'Basic realm="GitLab"' if allow_basic_auth?
challenges << spnego_challenge if allow_kerberos_spnego_auth?
headers['Www-Authenticate'] = challenges.join("\n") if challenges.any?
end
def ensure_project_found!
render_not_found if project.blank?
end
def project
return @project if defined?(@project)
project_id, _ = project_id_with_suffix
if project_id.blank?
@project = nil
else
@project = Project.find_with_namespace("#{params[:namespace_id]}/#{project_id}")
end
end
# This method returns two values so that we can parse
# params[:project_id] (untrusted input!) in exactly one place.
def project_id_with_suffix
id = params[:project_id] || ''
%w[.wiki.git .git].each do |suffix|
if id.end_with?(suffix)
# Be careful to only remove the suffix from the end of 'id'.
# Accidentally removing it from the middle is how security
# vulnerabilities happen!
return [id.slice(0, id.length - suffix.length), suffix]
end
end
# Something is wrong with params[:project_id]; do not pass it on.
[nil, nil]
def download_request?
upload_pack?
end
def upload_pack?
......@@ -143,19 +59,6 @@ class Projects::GitHttpController < Projects::ApplicationController
render json: Gitlab::Workhorse.git_http_ok(repository, user)
end
def repository
_, suffix = project_id_with_suffix
if suffix == '.wiki.git'
project.wiki.repository
else
project.repository
end
end
def render_not_found
render plain: 'Not Found', status: :not_found
end
def render_http_not_allowed
render plain: access_check.message, status: :forbidden
end
......@@ -169,10 +72,6 @@ class Projects::GitHttpController < Projects::ApplicationController
end
end
def ci?
@ci.present?
end
def upload_pack_allowed?
return false unless Gitlab.config.gitlab_shell.upload_pack
......
class Projects::LfsApiController < Projects::GitHttpClientController
include LfsHelper
before_action :require_lfs_enabled!
before_action :lfs_check_access!, except: [:deprecated]
def batch
unless objects.present?
render_lfs_not_found
return
end
if download_request?
render json: { objects: download_objects! }
elsif upload_request?
render json: { objects: upload_objects! }
else
raise "Never reached"
end
end
def deprecated
render(
json: {
message: 'Server supports batch API only, please update your Git LFS client to version 1.0.1 and up.',
documentation_url: "#{Gitlab.config.gitlab.url}/help",
},
status: 501
)
end
private
def objects
@objects ||= (params[:objects] || []).to_a
end
def existing_oids
@existing_oids ||= begin
storage_project.lfs_objects.where(oid: objects.map { |o| o['oid'].to_s }).pluck(:oid)
end
end
def download_objects!
objects.each do |object|
if existing_oids.include?(object[:oid])
object[:actions] = download_actions(object)
else
object[:error] = {
code: 404,
message: "Object does not exist on the server or you don't have permissions to access it",
}
end
end
objects
end
def upload_objects!
objects.each do |object|
object[:actions] = upload_actions(object) unless existing_oids.include?(object[:oid])
end
objects
end
def download_actions(object)
{
download: {
href: "#{project.http_url_to_repo}/gitlab-lfs/objects/#{object[:oid]}",
header: {
Authorization: request.headers['Authorization']
}.compact
}
}
end
def upload_actions(object)
{
upload: {
href: "#{project.http_url_to_repo}/gitlab-lfs/objects/#{object[:oid]}/#{object[:size]}",
header: {
Authorization: request.headers['Authorization']
}.compact
}
}
end
def download_request?
params[:operation] == 'download'
end
def upload_request?
params[:operation] == 'upload'
end
end
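For orientation, this is roughly the response body the batch action renders for a download request, following download_objects!/download_actions above; the OID, size, URL and header value are invented for the example.

response_body = {
  objects: [
    {
      oid: 'abc123',   # echoed back from the client's request
      size: 1024,
      actions: {
        download: {
          href: 'https://gitlab.example.com/group/project.git/gitlab-lfs/objects/abc123',
          header: { Authorization: 'Basic ...' } # copied from the incoming request
        }
      }
    }
  ]
}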
class Projects::LfsStorageController < Projects::GitHttpClientController
include LfsHelper
before_action :require_lfs_enabled!
before_action :lfs_check_access!
def download
lfs_object = LfsObject.find_by_oid(oid)
unless lfs_object && lfs_object.file.exists?
render_lfs_not_found
return
end
send_file lfs_object.file.path, content_type: "application/octet-stream"
end
def upload_authorize
render(
json: {
StoreLFSPath: "#{Gitlab.config.lfs.storage_path}/tmp/upload",
LfsOid: oid,
LfsSize: size,
},
content_type: 'application/json; charset=utf-8'
)
end
def upload_finalize
unless tmp_filename
render_lfs_forbidden
return
end
if store_file(oid, size, tmp_filename)
head 200
else
render plain: 'Unprocessable entity', status: 422
end
end
private
def download_request?
action_name == 'download'
end
def upload_request?
%w[upload_authorize upload_finalize].include? action_name
end
def oid
params[:oid].to_s
end
def size
params[:size].to_i
end
def tmp_filename
name = request.headers['X-Gitlab-Lfs-Tmp']
return if name.include?('/')
return unless oid.present? && name.start_with?(oid)
name
end
def store_file(oid, size, tmp_file)
# Define tmp_file_path early because we use it in "ensure"
tmp_file_path = File.join("#{Gitlab.config.lfs.storage_path}/tmp/upload", tmp_file)
object = LfsObject.find_or_create_by(oid: oid, size: size)
file_exists = object.file.exists? || move_tmp_file_to_storage(object, tmp_file_path)
file_exists && link_to_project(object)
ensure
FileUtils.rm_f(tmp_file_path)
end
def move_tmp_file_to_storage(object, path)
File.open(path) do |f|
object.file = f
end
object.file.store!
object.save
end
def link_to_project(object)
if object && !object.projects.exists?(storage_project.id)
object.projects << storage_project
object.save
end
end
end
......@@ -160,7 +160,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@diff_notes_disabled = true
@pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline
@statuses = @pipeline.statuses.relevant if @pipeline
@note_counts = Note.where(commit_id: @commits.map(&:id)).
group(:commit_id).count
......@@ -362,7 +362,7 @@ class Projects::MergeRequestsController < Projects::ApplicationController
@commits_count = @merge_request.commits.count
@pipeline = @merge_request.pipeline
@statuses = @pipeline.statuses if @pipeline
@statuses = @pipeline.statuses.relevant if @pipeline
if @merge_request.locked_long_ago?
@merge_request.unlock_mr
......
......@@ -19,7 +19,7 @@ class Projects::PipelinesController < Projects::ApplicationController
end
def create
@pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute
@pipeline = Ci::CreatePipelineService.new(project, current_user, create_params).execute(ignore_skip_ci: true, save_on_errors: false)
unless @pipeline.persisted?
render 'new'
return
......
......@@ -3,7 +3,7 @@ class Projects::PipelinesSettingsController < Projects::ApplicationController
def show
@ref = params[:ref] || @project.default_branch || 'master'
@build_badge = Gitlab::Badge::Build.new(@project, @ref)
@build_badge = Gitlab::Badge::Build.new(@project, @ref).metadata
end
def update
......
......@@ -91,7 +91,7 @@ class Projects::WikisController < Projects::ApplicationController
)
end
def markdown_preview
def preview_markdown
text = params[:text]
ext = Gitlab::ReferenceExtractor.new(@project, current_user)
......
......@@ -125,7 +125,7 @@ class ProjectsController < Projects::ApplicationController
def destroy
return access_denied! unless can?(current_user, :remove_project, @project)
::Projects::DestroyService.new(@project, current_user, {}).pending_delete!
::Projects::DestroyService.new(@project, current_user, {}).async_execute
flash[:alert] = "Project '#{@project.name}' will be deleted."
redirect_to dashboard_projects_path
......@@ -238,7 +238,7 @@ class ProjectsController < Projects::ApplicationController
}
end
def markdown_preview
def preview_markdown
text = params[:text]
ext = Gitlab::ReferenceExtractor.new(@project, current_user)
......
class ProjectsFinder < UnionFinder
def execute(current_user = nil, options = {})
def execute(current_user = nil, project_ids_relation = nil)
segments = all_projects(current_user)
segments.map! { |s| s.where(id: project_ids_relation) } if project_ids_relation
find_union(segments, Project)
end
......
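A brief usage sketch of the new project_ids_relation argument, assuming current_user is a signed-in User with todos; TodosFinder below passes an unordered select(:project_id) relation in exactly this way so the database can run it as a subquery.

item_project_ids = current_user.todos.reorder(nil).select(:project_id)
projects = ProjectsFinder.new.execute(current_user, item_project_ids)
# Each UNION segment is narrowed with: WHERE projects.id IN (SELECT project_id FROM todos ...)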
......@@ -27,9 +27,11 @@ class TodosFinder
items = by_action_id(items)
items = by_action(items)
items = by_author(items)
items = by_project(items)
items = by_state(items)
items = by_type(items)
# Filtering by project HAS TO be the last step because we use
# the project IDs yielded by the todos query thus far
items = by_project(items)
items.reorder(id: :desc)
end
......@@ -91,14 +93,9 @@ class TodosFinder
@project
end
def projects
return @projects if defined?(@projects)
if project?
@projects = project
else
@projects = ProjectsFinder.new.execute(current_user)
end
def projects(items)
item_project_ids = items.reorder(nil).select(:project_id)
ProjectsFinder.new.execute(current_user, item_project_ids)
end
def type?
......@@ -136,8 +133,9 @@ class TodosFinder
def by_project(items)
if project?
items = items.where(project: project)
elsif projects
items = items.merge(projects).joins(:project)
else
item_projects = projects(items)
items = items.merge(item_projects).joins(:project)
end
items
......
......@@ -7,8 +7,6 @@ module AvatarsHelper
}))
end
private
def user_avatar(options = {})
avatar_size = options[:size] || 16
user_name = options[:user].try(:name) || options[:user_name]
......
......@@ -109,11 +109,10 @@ module DiffHelper
end
end
def diff_file_html_data(project, diff_file)
commit = commit_for_diff(diff_file)
def diff_file_html_data(project, diff_file_path, diff_commit_id)
{
blob_diff_path: namespace_project_blob_diff_path(project.namespace, project,
tree_join(commit.id, diff_file.file_path)),
tree_join(diff_commit_id, diff_file_path)),
view: diff_view
}
end
......
module LfsHelper
def require_lfs_enabled!
return if Gitlab.config.lfs.enabled
render(
json: {
message: 'Git LFS is not enabled on this GitLab server, contact your admin.',
documentation_url: "#{Gitlab.config.gitlab.url}/help",
},
status: 501
)
end
def lfs_check_access!
return if download_request? && lfs_download_access?
return if upload_request? && lfs_upload_access?
if project.public? || (user && user.can?(:read_project, project))
render_lfs_forbidden
else
render_lfs_not_found
end
end
def lfs_download_access?
project.public? || ci? || (user && user.can?(:download_code, project))
end
def lfs_upload_access?
user && user.can?(:push_code, project)
end
def render_lfs_forbidden
render(
json: {
message: 'Access forbidden. Check your access level.',
documentation_url: "#{Gitlab.config.gitlab.url}/help",
},
content_type: "application/vnd.git-lfs+json",
status: 403
)
end
def render_lfs_not_found
render(
json: {
message: 'Not found.',
documentation_url: "#{Gitlab.config.gitlab.url}/help",
},
content_type: "application/vnd.git-lfs+json",
status: 404
)
end
def storage_project
@storage_project ||= begin
result = project
loop do
break unless result.forked?
result = result.forked_from_project
end
result
end
end
end
......@@ -6,12 +6,6 @@ module MembersHelper
"#{action}_#{member.type.underscore}".to_sym
end
def default_show_roles(member)
can?(current_user, action_member_permission(:update, member), member) ||
can?(current_user, action_member_permission(:destroy, member), member) ||
can?(current_user, action_member_permission(:admin, member), member.source)
end
def remove_member_message(member, user: nil)
user = current_user if defined?(current_user)
......
module TodosHelper
def todos_pending_count
@todos_pending_count ||= TodosFinder.new(current_user, state: :pending).execute.count
@todos_pending_count ||= current_user.todos_pending_count
end
def todos_done_count
@todos_done_count ||= TodosFinder.new(current_user, state: :done).execute.count
@todos_done_count ||= current_user.todos_done_count
end
def todo_action_name(todo)
......
......@@ -3,6 +3,9 @@ class Blob < SimpleDelegator
CACHE_TIME = 60 # Cache raw blobs referred to by a (mutable) ref for 1 minute
CACHE_TIME_IMMUTABLE = 3600 # Cache blobs referred to by an immutable reference for 1 hour
# The maximum size of an SVG that can be displayed.
MAXIMUM_SVG_SIZE = 2.megabytes
# Wrap a Gitlab::Git::Blob object, or return nil when given nil
#
# This method prevents the decorated object from evaluating to "truthy" when
......@@ -31,6 +34,10 @@ class Blob < SimpleDelegator
text? && language && language.name == 'SVG'
end
def size_within_svg_limits?
size <= MAXIMUM_SVG_SIZE
end
def video?
UploaderHelper::VIDEO_EXT.include?(extname.downcase.delete('.'))
end
......
......@@ -16,7 +16,7 @@ module Ci
scope :with_artifacts_not_expired, ->() { with_artifacts.where('artifacts_expire_at IS NULL OR artifacts_expire_at > ?', Time.now) }
scope :with_expired_artifacts, ->() { with_artifacts.where('artifacts_expire_at < ?', Time.now) }
scope :last_month, ->() { where('created_at > ?', Date.today - 1.month) }
scope :manual_actions, ->() { where(when: :manual) }
scope :manual_actions, ->() { where(when: :manual).relevant }
mount_uploader :artifacts_file, ArtifactUploader
mount_uploader :artifacts_metadata, ArtifactUploader
......@@ -42,40 +42,35 @@ module Ci
end
def retry(build, user = nil)
new_build = Ci::Build.new(status: 'pending')
new_build.ref = build.ref
new_build.tag = build.tag
new_build.options = build.options
new_build.commands = build.commands
new_build.tag_list = build.tag_list
new_build.project = build.project
new_build.pipeline = build.pipeline
new_build.name = build.name
new_build.allow_failure = build.allow_failure
new_build.stage = build.stage
new_build.stage_idx = build.stage_idx
new_build.trigger_request = build.trigger_request
new_build.yaml_variables = build.yaml_variables
new_build.when = build.when
new_build.user = user
new_build.environment = build.environment
new_build.save
new_build = Ci::Build.create(
ref: build.ref,
tag: build.tag,
options: build.options,
commands: build.commands,
tag_list: build.tag_list,
project: build.project,
pipeline: build.pipeline,
name: build.name,
allow_failure: build.allow_failure,
stage: build.stage,
stage_idx: build.stage_idx,
trigger_request: build.trigger_request,
yaml_variables: build.yaml_variables,
when: build.when,
user: user,
environment: build.environment,
status_event: 'enqueue'
)
MergeRequests::AddTodoWhenBuildFailsService.new(build.project, nil).close(new_build)
new_build
end
end
state_machine :status, initial: :pending do
state_machine :status do
after_transition pending: :running do |build|
build.execute_hooks
end
# We use around_transition to create builds for the next stage as soon as possible, before the `after_*` callbacks are executed
around_transition any => [:success, :failed, :canceled] do |build, block|
block.call
build.pipeline.create_next_builds(build) if build.pipeline
end
after_transition any => [:success, :failed, :canceled] do |build|
build.update_coverage
build.execute_hooks
......@@ -107,7 +102,7 @@ module Ci
def play(current_user = nil)
# Try to queue a current build
if self.queue
if self.enqueue
self.update(user: current_user)
self
else
......
......@@ -13,13 +13,51 @@ module Ci
has_many :trigger_requests, dependent: :destroy, class_name: 'Ci::TriggerRequest', foreign_key: :commit_id
validates_presence_of :sha
validates_presence_of :ref
validates_presence_of :status
validate :valid_commit_sha
# Invalidate object and save when touched
after_touch :update_state
after_save :keep_around_commits
state_machine :status, initial: :created do
event :enqueue do
transition created: :pending
transition [:success, :failed, :canceled, :skipped] => :running
end
event :run do
transition any => :running
end
event :skip do
transition any => :skipped
end
event :drop do
transition any => :failed
end
event :succeed do
transition any => :success
end
event :cancel do
transition any => :canceled
end
before_transition [:created, :pending] => :running do |pipeline|
pipeline.started_at = Time.now
end
before_transition any => [:success, :failed, :canceled] do |pipeline|
pipeline.finished_at = Time.now
end
before_transition do |pipeline|
pipeline.update_duration
end
end
# ref can't be HEAD or SHA, can only be branch/tag name
def self.latest_successful_for(ref)
where(ref: ref).order(id: :desc).success.first
......@@ -109,37 +147,6 @@ module Ci
trigger_requests.any?
end
def create_builds(user, trigger_request = nil)
##
# We persist pipeline only if there are builds available
#
return unless config_processor
build_builds_for_stages(config_processor.stages, user,
'success', trigger_request) && save
end
def create_next_builds(build)
return unless config_processor
# don't create other builds if this one is retried
latest_builds = builds.latest
return unless latest_builds.exists?(build.id)
# get list of stages after this build
next_stages = config_processor.stages.drop_while { |stage| stage != build.stage }
next_stages.delete(build.stage)
# get status for all prior builds
prior_builds = latest_builds.where.not(stage: next_stages)
prior_status = prior_builds.status
# build builds for next stage that has builds available
# and save pipeline if we have builds
build_builds_for_stages(next_stages, build.user, prior_status,
build.trigger_request) && save
end
def retried
@retried ||= (statuses.order(id: :desc) - statuses.latest)
end
......@@ -151,6 +158,14 @@ module Ci
end
end
def config_builds_attributes
return [] unless config_processor
config_processor.
builds_for_ref(ref, tag?, trigger_requests.first).
sort_by { |build| build[:stage_idx] }
end
def has_warnings?
builds.latest.ignored.any?
end
......@@ -182,10 +197,6 @@ module Ci
end
end
def skip_ci?
git_commit_message =~ /\[(ci skip|skip ci)\]/i if git_commit_message
end
def environments
builds.where.not(environment: nil).success.pluck(:environment).uniq
end
......@@ -207,37 +218,37 @@ module Ci
Note.for_commit_id(sha)
end
def process!
Ci::ProcessPipelineService.new(project, user).execute(self)
end
def build_updated
case latest_builds_status
when 'pending' then enqueue
when 'running' then run
when 'success' then succeed
when 'failed' then drop
when 'canceled' then cancel
when 'skipped' then skip
end
end
def predefined_variables
[
{ key: 'CI_PIPELINE_ID', value: id.to_s, public: true }
]
end
def update_duration
self.duration = statuses.latest.duration
end
private
def build_builds_for_stages(stages, user, status, trigger_request)
##
# Note that `Array#any?` implements a short circuit evaluation, so we
# build builds only for the first stage that has builds available.
#
stages.any? do |stage|
CreateBuildsService.new(self).
execute(stage, user, status, trigger_request).
any?(&:active?)
end
end
def latest_builds_status
return 'failed' unless yaml_errors.blank?
def update_state
statuses.reload
self.status = if yaml_errors.blank?
statuses.latest.status || 'skipped'
else
'failed'
end
self.started_at = statuses.started_at
self.finished_at = statuses.finished_at
self.duration = statuses.latest.duration
save
end
def keep_around_commits
......
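Putting the new pieces together, a rough sketch of the pipeline lifecycle these changes introduce; only names from this diff are used, while the project, user and params are assumed.

pipeline = Ci::CreatePipelineService
             .new(project, user, ref: 'master')
             .execute(ignore_skip_ci: true, save_on_errors: false)
pipeline.process!   # hands off to Ci::ProcessPipelineService, which enqueues builds stage by stage
# As each build finishes, CommitStatus's around_transition calls pipeline.process!
# again for the following stages, while build_updated rolls the aggregated build
# status up into the pipeline's own enqueue/run/succeed/drop/cancel/skip events.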
......@@ -5,7 +5,7 @@ class CommitStatus < ActiveRecord::Base
self.table_name = 'ci_builds'
belongs_to :project, class_name: '::Project', foreign_key: :gl_project_id
belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id, touch: true
belongs_to :pipeline, class_name: 'Ci::Pipeline', foreign_key: :commit_id
belongs_to :user
delegate :commit, to: :pipeline
......@@ -25,28 +25,36 @@ class CommitStatus < ActiveRecord::Base
scope :ordered, -> { order(:name) }
scope :ignored, -> { where(allow_failure: true, status: [:failed, :canceled]) }
state_machine :status, initial: :pending do
event :queue do
transition skipped: :pending
state_machine :status do
event :enqueue do
transition [:created, :skipped] => :pending
end
event :run do
transition pending: :running
end
event :skip do
transition [:created, :pending] => :skipped
end
event :drop do
transition [:pending, :running] => :failed
transition [:created, :pending, :running] => :failed
end
event :success do
transition [:pending, :running] => :success
transition [:created, :pending, :running] => :success
end
event :cancel do
transition [:pending, :running] => :canceled
transition [:created, :pending, :running] => :canceled
end
after_transition pending: :running do |commit_status|
after_transition created: [:pending, :running] do |commit_status|
commit_status.update_attributes queued_at: Time.now
end
after_transition [:created, :pending] => :running do |commit_status|
commit_status.update_attributes started_at: Time.now
end
......@@ -54,7 +62,18 @@ class CommitStatus < ActiveRecord::Base
commit_status.update_attributes finished_at: Time.now
end
after_transition [:pending, :running] => :success do |commit_status|
# We use around_transition to process the pipeline's next stages as soon as possible, before the `after_*` callbacks are executed
around_transition any => [:success, :failed, :canceled] do |commit_status, block|
block.call
commit_status.pipeline.try(:process!)
end
after_transition do |commit_status, transition|
commit_status.pipeline.try(:build_updated) unless transition.loopback?
end
after_transition [:created, :pending, :running] => :success do |commit_status|
MergeRequests::MergeWhenBuildSucceedsService.new(commit_status.pipeline.project, nil).trigger(commit_status)
end
......
module Statuseable
extend ActiveSupport::Concern
AVAILABLE_STATUSES = %w(pending running success failed canceled skipped)
AVAILABLE_STATUSES = %w[created pending running success failed canceled skipped]
STARTED_STATUSES = %w[running success failed skipped]
ACTIVE_STATUSES = %w[pending running]
COMPLETED_STATUSES = %w[success failed canceled]
class_methods do
def status_sql
builds = all.select('count(*)').to_sql
success = all.success.select('count(*)').to_sql
ignored = all.ignored.select('count(*)').to_sql if all.respond_to?(:ignored)
scope = all.relevant
builds = scope.select('count(*)').to_sql
success = scope.success.select('count(*)').to_sql
ignored = scope.ignored.select('count(*)').to_sql if scope.respond_to?(:ignored)
ignored ||= '0'
pending = all.pending.select('count(*)').to_sql
running = all.running.select('count(*)').to_sql
canceled = all.canceled.select('count(*)').to_sql
skipped = all.skipped.select('count(*)').to_sql
pending = scope.pending.select('count(*)').to_sql
running = scope.running.select('count(*)').to_sql
canceled = scope.canceled.select('count(*)').to_sql
skipped = scope.skipped.select('count(*)').to_sql
deduce_status = "(CASE
WHEN (#{builds})=0 THEN NULL
......@@ -48,7 +52,8 @@ module Statuseable
included do
validates :status, inclusion: { in: AVAILABLE_STATUSES }
state_machine :status, initial: :pending do
state_machine :status, initial: :created do
state :created, value: 'created'
state :pending, value: 'pending'
state :running, value: 'running'
state :failed, value: 'failed'
......@@ -57,6 +62,8 @@ module Statuseable
state :skipped, value: 'skipped'
end
scope :created, -> { where(status: 'created') }
scope :relevant, -> { where.not(status: 'created') }
scope :running, -> { where(status: 'running') }
scope :pending, -> { where(status: 'pending') }
scope :success, -> { where(status: 'success') }
......@@ -68,14 +75,14 @@ module Statuseable
end
def started?
!pending? && !canceled? && started_at
STARTED_STATUSES.include?(status) && started_at
end
def active?
running? || pending?
ACTIVE_STATUSES.include?(status)
end
def complete?
canceled? || success? || failed?
COMPLETED_STATUSES.include?(status)
end
end
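A short illustration of how the new 'created' status and the relevant scope fit together, assuming pipeline is a Ci::Pipeline whose builds were pre-created.

pipeline.statuses.pluck(:status)   # => ["success", "created", "created", ...]
pipeline.statuses.relevant         # drops the not-yet-processed 'created' rows
pipeline.statuses.status           # class-level status; status_sql above already scopes to `relevant`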
......@@ -36,4 +36,10 @@ class Deployment < ActiveRecord::Base
def manual_actions
deployable.try(:other_actions)
end
def includes_commit?(commit)
return false unless commit
project.repository.is_ancestor?(commit.id, sha)
end
end
......@@ -25,4 +25,10 @@ class Environment < ActiveRecord::Base
def nullify_external_url
self.external_url = nil if self.external_url.blank?
end
def includes_commit?(commit)
return false unless last_deployment
last_deployment.includes_commit?(commit)
end
end
......@@ -8,6 +8,7 @@ class ProjectMember < Member
# Make sure project member points only to project as it source
default_value_for :source_type, SOURCE_TYPE
validates_format_of :source_type, with: /\AProject\z/
validates :access_level, inclusion: { in: Gitlab::Access.values }
default_scope { where(source_type: SOURCE_TYPE) }
scope :in_project, ->(project) { where(source_id: project.id) }
......
......@@ -104,6 +104,7 @@ class MergeRequest < ActiveRecord::Base
scope :from_project, ->(project) { where(source_project_id: project.id) }
scope :merged, -> { with_state(:merged) }
scope :closed_and_merged, -> { with_states(:closed, :merged) }
scope :from_source_branches, ->(branches) { where(source_branch: branches) }
scope :join_project, -> { joins(:target_project) }
scope :references_project, -> { references(:target_project) }
......@@ -590,6 +591,14 @@ class MergeRequest < ActiveRecord::Base
!pipeline || pipeline.success?
end
def environments
return unless diff_head_commit
target_project.environments.select do |environment|
environment.includes_commit?(diff_head_commit)
end
end
def state_human_name
if merged?
"Merged"
......
class Namespace < ActiveRecord::Base
acts_as_paranoid
include Sortable
include Gitlab::ShellAdapter
......
......@@ -999,6 +999,10 @@ class Project < ActiveRecord::Base
project_members.find_by(user_id: user)
end
def add_user(user, access_level, current_user = nil)
team.add_user(user, access_level, current_user)
end
def default_branch
@default_branch ||= repository.root_ref if repository.exists?
end
......@@ -1163,16 +1167,6 @@ class Project < ActiveRecord::Base
@wiki ||= ProjectWiki.new(self, self.owner)
end
def schedule_delete!(user_id, params)
# Queue this task for after the commit, so once we mark pending_delete it will run
run_after_commit do
job_id = ProjectDestroyWorker.perform_async(id, user_id, params)
Rails.logger.info("User #{user_id} scheduled destruction of project #{path_with_namespace} with job ID #{job_id}")
end
update_attribute(:pending_delete, true)
end
def running_or_pending_build_count(force: false)
Rails.cache.fetch(['projects', id, 'running_or_pending_build_count'], force: force) do
builds.running_or_pending.count(:all)
......
class CampfireService < Service
include HTTParty
prop_accessor :token, :subdomain, :room
validates :token, presence: true, if: :activated?
......@@ -29,18 +31,53 @@ class CampfireService < Service
def execute(data)
return unless supported_events.include?(data[:object_kind])
room = gate.find_room_by_name(self.room)
return true unless room
self.class.base_uri base_uri
message = build_message(data)
room.speak(message)
speak(self.room, message, auth)
end
private
def gate
@gate ||= Tinder::Campfire.new(subdomain, token: token)
def base_uri
@base_uri ||= "https://#{subdomain}.campfirenow.com"
end
def auth
# use a dummy password, as explained in the Campfire API doc:
# https://github.com/basecamp/campfire-api#authentication
@auth ||= {
basic_auth: {
username: token,
password: 'X'
}
}
end
# Post a message into a room; returns the message Hash on success,
# and nil otherwise.
# https://github.com/basecamp/campfire-api/blob/master/sections/messages.md#create-message
def speak(room_name, message, auth)
room = rooms(auth).find { |r| r["name"] == room_name }
return nil unless room
path = "/room/#{room["id"]}/speak.json"
body = {
body: {
message: {
type: 'TextMessage',
body: message
}
}
}
res = self.class.post(path, auth.merge(body))
res.code == 201 ? res : nil
end
# Returns a list of rooms, or [].
# https://github.com/basecamp/campfire-api/blob/master/sections/rooms.md#get-rooms
def rooms(auth)
res = self.class.get("/rooms.json", auth)
res.code == 200 ? res["rooms"] : []
end
def build_message(push)
......
class PivotaltrackerService < Service
include HTTParty
prop_accessor :token
API_ENDPOINT = 'https://www.pivotaltracker.com/services/v5/source_commits'
prop_accessor :token, :restrict_to_branch
validates :token, presence: true, if: :activated?
def title
......@@ -18,7 +20,17 @@ class PivotaltrackerService < Service
def fields
[
{ type: 'text', name: 'token', placeholder: '' }
{
type: 'text',
name: 'token',
placeholder: 'Pivotal Tracker API token.'
},
{
type: 'text',
name: 'restrict_to_branch',
placeholder: 'Comma-separated list of branches which will be ' \
'automatically inspected. Leave blank to include all branches.'
}
]
end
......@@ -28,8 +40,8 @@ class PivotaltrackerService < Service
def execute(data)
return unless supported_events.include?(data[:object_kind])
return unless allowed_branch?(data[:ref])
url = 'https://www.pivotaltracker.com/services/v5/source_commits'
data[:commits].each do |commit|
message = {
'source_commit' => {
......@@ -40,7 +52,7 @@ class PivotaltrackerService < Service
}
}
PivotaltrackerService.post(
url,
API_ENDPOINT,
body: message.to_json,
headers: {
'Content-Type' => 'application/json',
......@@ -49,4 +61,15 @@ class PivotaltrackerService < Service
)
end
end
private
def allowed_branch?(ref)
return true unless ref.present? && restrict_to_branch.present?
branch = Gitlab::Git.ref_name(ref)
allowed_branches = restrict_to_branch.split(',').map(&:strip)
branch.present? && allowed_branches.include?(branch)
end
end
class SpamReport < ActiveRecord::Base
belongs_to :user
validates :user, presence: true
end
......@@ -23,13 +23,13 @@ class User < ActiveRecord::Base
default_value_for :theme_id, gitlab_config.default_theme
attr_encrypted :otp_secret,
key: Gitlab::Application.config.secret_key_base,
key: Gitlab::Application.secrets.otp_key_base,
mode: :per_attribute_iv_and_salt,
insecure_mode: true,
algorithm: 'aes-256-cbc'
devise :two_factor_authenticatable,
otp_secret_encryption_key: Gitlab::Application.config.secret_key_base
otp_secret_encryption_key: Gitlab::Application.secrets.otp_key_base
devise :two_factor_backupable, otp_number_of_backup_codes: 10
serialize :otp_backup_codes, JSON
......@@ -809,13 +809,13 @@ class User < ActiveRecord::Base
def todos_done_count(force: false)
Rails.cache.fetch(['users', id, 'todos_done_count'], force: force) do
todos.done.count
TodosFinder.new(self, state: :done).execute.count
end
end
def todos_pending_count(force: false)
Rails.cache.fetch(['users', id, 'todos_pending_count'], force: force) do
todos.pending.count
TodosFinder.new(self, state: :pending).execute.count
end
end
......
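The counter caching above relies on Rails.cache.fetch with the force: flag: a cached value is returned unless force: true recomputes it. A minimal sketch of the same pattern against an in-memory store, with a plain block standing in for the TodosFinder call (which needs the full application):

require 'active_support'
require 'active_support/cache'

cache = ActiveSupport::Cache::MemoryStore.new

def todos_pending_count(cache, user_id, force: false)
  cache.fetch(['users', user_id, 'todos_pending_count'], force: force) do
    # Stand-in for TodosFinder.new(user, state: :pending).execute.count
    42
  end
end

puts todos_pending_count(cache, 1)               # computes and caches 42
puts todos_pending_count(cache, 1)               # served from the cache
puts todos_pending_count(cache, 1, force: true)  # recomputes and refreshes the cache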
module Ci
class CreateBuildsService
def initialize(pipeline)
@pipeline = pipeline
@config = pipeline.config_processor
end
def execute(stage, user, status, trigger_request = nil)
builds_attrs = @config.builds_for_stage_and_ref(stage, @pipeline.ref, @pipeline.tag, trigger_request)
# Select only the builds whose `when:` condition matches the current status
builds_attrs = builds_attrs.select do |build_attrs|
case build_attrs[:when]
when 'on_success'
status == 'success'
when 'on_failure'
status == 'failed'
when 'always', 'manual'
%w(success failed).include?(status)
end
end
# don't create the same build twice
builds_attrs.reject! do |build_attrs|
@pipeline.builds.find_by(ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
name: build_attrs[:name])
end
builds_attrs.map do |build_attrs|
build_attrs.slice!(:name,
:commands,
:tag_list,
:options,
:allow_failure,
:stage,
:stage_idx,
:environment,
:when,
:yaml_variables)
build_attrs.merge!(pipeline: @pipeline,
ref: @pipeline.ref,
tag: @pipeline.tag,
trigger_request: trigger_request,
user: user,
project: @pipeline.project)
# TODO: The proper implementation for this is in
# https://gitlab.com/gitlab-org/gitlab-ce/merge_requests/5295
build_attrs[:status] = 'skipped' if build_attrs[:when] == 'manual'
##
# We do not persist new builds here.
# Those will be persisted when @pipeline is saved.
#
@pipeline.builds.new(build_attrs)
end
end
end
end
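The `when:` filtering in execute above decides which of a stage's builds run for a given prior status. A self-contained sketch of just that selection step, using made-up build attributes:

def builds_for_status(builds_attrs, status)
  builds_attrs.select do |build_attrs|
    case build_attrs[:when]
    when 'on_success'       then status == 'success'
    when 'on_failure'       then status == 'failed'
    when 'always', 'manual' then %w(success failed).include?(status)
    end
  end
end

builds = [
  { name: 'rspec',   when: 'on_success' },
  { name: 'cleanup', when: 'on_failure' },
  { name: 'notify',  when: 'always' },
  { name: 'deploy',  when: 'manual' }
]

puts builds_for_status(builds, 'success').map { |b| b[:name] }.inspect # ["rspec", "notify", "deploy"]
puts builds_for_status(builds, 'failed').map { |b| b[:name] }.inspect  # ["cleanup", "notify", "deploy"]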
module Ci
class CreatePipelineBuildsService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
new_builds.map do |build_attributes|
create_build(build_attributes)
end
end
private
def create_build(build_attributes)
build_attributes = build_attributes.merge(
pipeline: pipeline,
project: pipeline.project,
ref: pipeline.ref,
tag: pipeline.tag,
user: current_user,
trigger_request: trigger_request
)
pipeline.builds.create(build_attributes)
end
def new_builds
@new_builds ||= pipeline.config_builds_attributes.
reject { |build| existing_build_names.include?(build[:name]) }
end
def existing_build_names
@existing_build_names ||= pipeline.builds.pluck(:name)
end
def trigger_request
return @trigger_request if defined?(@trigger_request)
@trigger_request ||= pipeline.trigger_requests.first
end
end
end
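new_builds above simply drops any configured build whose name already exists on the pipeline. A plain-Ruby sketch of that de-duplication, with sample data standing in for config_builds_attributes and pipeline.builds.pluck(:name):

config_builds_attributes = [
  { name: 'rspec 0 20',   stage: 'test' },
  { name: 'rspec 1 20',   stage: 'test' },
  { name: 'spinach 0 10', stage: 'test' }
]
existing_build_names = ['rspec 0 20'] # stand-in for pipeline.builds.pluck(:name)

new_builds = config_builds_attributes.reject do |build|
  existing_build_names.include?(build[:name])
end

puts new_builds.map { |b| b[:name] }.inspect # ["rspec 1 20", "spinach 0 10"]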
module Ci
class CreatePipelineService < BaseService
def execute
pipeline = project.pipelines.new(params)
pipeline.user = current_user
attr_reader :pipeline
unless ref_names.include?(params[:ref])
pipeline.errors.add(:base, 'Reference not found')
return pipeline
def execute(ignore_skip_ci: false, save_on_errors: true, trigger_request: nil)
@pipeline = Ci::Pipeline.new(
project: project,
ref: ref,
sha: sha,
before_sha: before_sha,
tag: tag?,
trigger_requests: Array(trigger_request),
user: current_user
)
unless project.builds_enabled?
return error('Pipeline is disabled')
end
if commit
pipeline.sha = commit.id
else
pipeline.errors.add(:base, 'Commit not found')
return pipeline
unless trigger_request || can?(current_user, :create_pipeline, project)
return error('Insufficient permissions to create a new pipeline')
end
unless can?(current_user, :create_pipeline, project)
pipeline.errors.add(:base, 'Insufficient permissions to create a new pipeline')
return pipeline
unless branch? || tag?
return error('Reference not found')
end
unless commit
return error('Commit not found')
end
unless pipeline.config_processor
pipeline.errors.add(:base, pipeline.yaml_errors || 'Missing .gitlab-ci.yml file')
return pipeline
unless pipeline.ci_yaml_file
return error('Missing .gitlab-ci.yml file')
end
return error(pipeline.yaml_errors, save: save_on_errors)
end
pipeline.save!
if !ignore_skip_ci && skip_ci?
pipeline.skip if save_on_errors
return pipeline
end
unless pipeline.create_builds(current_user)
pipeline.errors.add(:base, 'No builds for this pipeline.')
unless pipeline.config_builds_attributes.present?
return error('No builds for this pipeline.')
end
pipeline.save
pipeline.process!
pipeline
end
private
def ref_names
@ref_names ||= project.repository.ref_names
def skip_ci?
pipeline.git_commit_message =~ /\[(ci skip|skip ci)\]/i if pipeline.git_commit_message
end
def commit
@commit ||= project.commit(params[:ref])
@commit ||= project.commit(origin_sha || origin_ref)
end
def sha
commit.try(:id)
end
def before_sha
params[:checkout_sha] || params[:before] || Gitlab::Git::BLANK_SHA
end
def origin_sha
params[:checkout_sha] || params[:after]
end
def origin_ref
params[:ref]
end
def branch?
project.repository.ref_exists?(Gitlab::Git::BRANCH_REF_PREFIX + ref)
end
def tag?
project.repository.ref_exists?(Gitlab::Git::TAG_REF_PREFIX + ref)
end
def ref
Gitlab::Git.ref_name(origin_ref)
end
def valid_sha?
origin_sha && origin_sha != Gitlab::Git::BLANK_SHA
end
def error(message, save: false)
pipeline.errors.add(:base, message)
pipeline.drop if save
pipeline
end
end
end
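Of the guard clauses above, skip_ci? is the easiest to isolate: it just matches the commit message against a case-insensitive [ci skip] / [skip ci] pattern. A quick sketch with sample messages:

def skip_ci?(commit_message)
  return false unless commit_message
  !!(commit_message =~ /\[(ci skip|skip ci)\]/i)
end

puts skip_ci?('Fix flaky spec [ci skip]') # => true
puts skip_ci?('Fix flaky spec [SKIP CI]') # => true
puts skip_ci?('Fix flaky spec')           # => false
puts skip_ci?(nil)                        # => false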
module Ci
class CreateTriggerRequestService
def execute(project, trigger, ref, variables = nil)
commit = project.commit(ref)
return unless commit
trigger_request = trigger.trigger_requests.create(variables: variables)
# check if ref is tag
tag = project.repository.find_tag(ref).present?
pipeline = project.pipelines.create(sha: commit.sha, ref: ref, tag: tag)
trigger_request = trigger.trigger_requests.create!(
variables: variables,
pipeline: pipeline,
)
if pipeline.create_builds(nil, trigger_request)
pipeline = Ci::CreatePipelineService.new(project, nil, ref: ref).
execute(ignore_skip_ci: true, trigger_request: trigger_request)
if pipeline.persisted?
trigger_request
end
end
......
module Ci
class ProcessPipelineService < BaseService
attr_reader :pipeline
def execute(pipeline)
@pipeline = pipeline
# Ensure the pipeline has all of its builds created, for every stage
if created_builds.empty?
create_builds!
end
new_builds =
stage_indexes_of_created_builds.map do |index|
process_stage(index)
end
# Return true if any new builds were enqueued
new_builds.flatten.any?
end
private
def create_builds!
Ci::CreatePipelineBuildsService.new(project, current_user).execute(pipeline)
end
def process_stage(index)
current_status = status_for_prior_stages(index)
created_builds_in_stage(index).select do |build|
process_build(build, current_status)
end
end
def process_build(build, current_status)
return false unless Statuseable::COMPLETED_STATUSES.include?(current_status)
if valid_statuses_for_when(build.when).include?(current_status)
build.enqueue
true
else
build.skip
false
end
end
def valid_statuses_for_when(value)
case value
when 'on_success'
%w[success]
when 'on_failure'
%w[failed]
when 'always'
%w[success failed]
else
[]
end
end
def status_for_prior_stages(index)
pipeline.builds.where('stage_idx < ?', index).latest.status || 'success'
end
def stage_indexes_of_created_builds
created_builds.order(:stage_idx).pluck('distinct stage_idx')
end
def created_builds_in_stage(index)
created_builds.where(stage_idx: index)
end
def created_builds
pipeline.builds.created
end
end
end
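The stage processing above reduces to a small decision: given the combined status of the prior stages, a created build is either enqueued, skipped, or left waiting. A standalone sketch of that decision; COMPLETED_STATUSES is hard-coded here as an assumption mirroring Statuseable, which is not available outside the app.

COMPLETED_STATUSES = %w[success failed canceled skipped] # assumption; stand-in for Statuseable::COMPLETED_STATUSES

def valid_statuses_for_when(value)
  case value
  when 'on_success' then %w[success]
  when 'on_failure' then %w[failed]
  when 'always'     then %w[success failed]
  else []
  end
end

# Returns :enqueue, :skip, or :wait (prior stage not finished yet).
def decide(build_when, prior_status)
  return :wait unless COMPLETED_STATUSES.include?(prior_status)
  valid_statuses_for_when(build_when).include?(prior_status) ? :enqueue : :skip
end

puts decide('on_success', 'success') # => enqueue
puts decide('on_failure', 'success') # => skip
puts decide('always', 'failed')      # => enqueue
puts decide('on_success', 'running') # => wait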
class CreateCommitBuildsService
def execute(project, user, params)
return unless project.builds_enabled?
before_sha = params[:checkout_sha] || params[:before]
sha = params[:checkout_sha] || params[:after]
origin_ref = params[:ref]
ref = Gitlab::Git.ref_name(origin_ref)
tag = Gitlab::Git.tag_ref?(origin_ref)
# Skip pipeline creation when the branch was removed (blank SHA)
if sha == Gitlab::Git::BLANK_SHA
return false
end
@pipeline = Ci::Pipeline.new(
project: project,
sha: sha,
ref: ref,
before_sha: before_sha,
tag: tag,
user: user)
##
# Skip creating pipeline if no gitlab-ci.yml is found
#
unless @pipeline.ci_yaml_file
return false
end
##
# Skip creating builds for commits that have [ci skip]
# but save pipeline object
#
if @pipeline.skip_ci?
return save_pipeline!
end
##
# Skip creating builds when CI config is invalid
# but save pipeline object
#
unless @pipeline.config_processor
return save_pipeline!
end
##
# Skip creating pipeline object if there are no builds for it.
#
unless @pipeline.create_builds(user)
@pipeline.errors.add(:base, 'No builds created')
return false
end
save_pipeline!
end
private
##
# Create a new pipeline and touch object to calculate status
#
def save_pipeline!
@pipeline.save!
@pipeline.touch
@pipeline
end
end
......@@ -18,9 +18,14 @@ class DeleteUserService
user.personal_projects.each do |project|
# Skip repository removal because we remove the namespace directory,
# which contains all of these repositories
::Projects::DestroyService.new(project, current_user, skip_repo: true).pending_delete!
::Projects::DestroyService.new(project, current_user, skip_repo: true).async_execute
end
user.destroy
# Destroy the namespace after destroying the user since certain methods may depend on the namespace existing
namespace = user.namespace
user_data = user.destroy
namespace.really_destroy!
user_data
end
end
......@@ -5,13 +5,23 @@ class DestroyGroupService
@group, @current_user = group, user
end
def async_execute
group.transaction do
# Soft delete via paranoia gem
group.destroy
job_id = GroupDestroyWorker.perform_async(group.id, current_user.id)
Rails.logger.info("User #{current_user.id} scheduled a deletion of group ID #{group.id} with job ID #{job_id}")
end
end
def execute
group.projects.each do |project|
# Execute the destruction of the models immediately to ensure atomic cleanup.
# Skip repository removal because we remove directory with namespace
# that contain all this repositories
::Projects::DestroyService.new(project, current_user, skip_repo: true).pending_delete!
# that contain all these repositories
::Projects::DestroyService.new(project, current_user, skip_repo: true).execute
end
group.destroy
group.really_destroy!
end
end
......@@ -69,7 +69,7 @@ class GitPushService < BaseService
SystemHooksService.new.execute_hooks(build_push_data_system_hook.dup, :push_hooks)
@project.execute_hooks(build_push_data.dup, :push_hooks)
@project.execute_services(build_push_data.dup, :push_hooks)
CreateCommitBuildsService.new.execute(@project, current_user, build_push_data)
Ci::CreatePipelineService.new(project, current_user, build_push_data).execute
ProjectCacheWorker.perform_async(@project.id)
end
......
......@@ -11,7 +11,7 @@ class GitTagPushService < BaseService
SystemHooksService.new.execute_hooks(build_system_push_data.dup, :tag_push_hooks)
project.execute_hooks(@push_data.dup, :tag_push_hooks)
project.execute_services(@push_data.dup, :tag_push_hooks)
CreateCommitBuildsService.new.execute(project, current_user, @push_data)
Ci::CreatePipelineService.new(project, current_user, @push_data).execute
ProjectCacheWorker.perform_async(project.id)
true
......
......@@ -2,8 +2,9 @@ module Members
class DestroyService < BaseService
attr_accessor :member, :current_user
def initialize(member, user)
@member, @current_user = member, user
def initialize(member, current_user)
@member = member
@current_user = current_user
end
def execute
......
module MergeRequests
class GetUrlsService < BaseService
attr_reader :project
def initialize(project)
@project = project
end
def execute(changes)
branches = get_branches(changes)
merge_requests_map = opened_merge_requests_from_source_branches(branches)
branches.map do |branch|
existing_merge_request = merge_requests_map[branch]
if existing_merge_request
url_for_existing_merge_request(existing_merge_request)
else
url_for_new_merge_request(branch)
end
end
end
private
def opened_merge_requests_from_source_branches(branches)
merge_requests = MergeRequest.from_project(project).opened.from_source_branches(branches)
merge_requests.inject({}) do |hash, mr|
hash[mr.source_branch] = mr
hash
end
end
def get_branches(changes)
return [] if project.empty_repo?
return [] unless project.merge_requests_enabled
changes_list = Gitlab::ChangesList.new(changes)
changes_list.map do |change|
next unless Gitlab::Git.branch_ref?(change[:ref])
# Deleted branch
next if Gitlab::Git.blank_ref?(change[:newrev])
# Default branch
branch_name = Gitlab::Git.branch_name(change[:ref])
next if branch_name == project.default_branch
branch_name
end.compact
end
def url_for_new_merge_request(branch_name)
merge_request_params = { source_branch: branch_name }
url = Gitlab::Routing.url_helpers.new_namespace_project_merge_request_url(project.namespace, project, merge_request: merge_request_params)
{ branch_name: branch_name, url: url, new_merge_request: true }
end
def url_for_existing_merge_request(merge_request)
target_project = merge_request.target_project
url = Gitlab::Routing.url_helpers.namespace_project_merge_request_url(target_project.namespace, target_project, merge_request)
{ branch_name: merge_request.source_branch, url: url, new_merge_request: false }
end
end
end
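The per-branch lookup above is driven by a simple hash built with inject and keyed on source branch. A small sketch of that map-building step and how it is consulted, with Struct stand-ins for merge requests (the branch names and iids are made up):

MergeRequestStub = Struct.new(:source_branch, :iid)

merge_requests = [
  MergeRequestStub.new('feature-a', 12),
  MergeRequestStub.new('feature-b', 15)
]

# Same shape as opened_merge_requests_from_source_branches above.
merge_requests_map = merge_requests.inject({}) do |hash, mr|
  hash[mr.source_branch] = mr
  hash
end

%w[feature-a feature-c].each do |branch|
  existing = merge_requests_map[branch]
  if existing
    puts "#{branch}: existing MR !#{existing.iid}"
  else
    puts "#{branch}: would build a new-merge-request URL"
  end
end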
......@@ -6,8 +6,12 @@ module Projects
DELETED_FLAG = '+deleted'
def pending_delete!
project.schedule_delete!(current_user.id, params)
def async_execute
project.transaction do
project.update_attribute(:pending_delete, true)
job_id = ProjectDestroyWorker.perform_async(project.id, current_user.id, params)
Rails.logger.info("User #{current_user.id} scheduled destruction of project #{project.path_with_namespace} with job ID #{job_id}")
end
end
def execute
......
......@@ -144,8 +144,9 @@ class TodoService
def mark_todos_as_done(todos, current_user)
todos = current_user.todos.where(id: todos.map(&:id)) unless todos.respond_to?(:update_all)
todos.update_all(state: :done)
marked_todos = todos.update_all(state: :done)
current_user.update_todos_count_cache
marked_todos
end
# When user marks an issue as todo
......
......@@ -112,7 +112,7 @@
%h4 Projects
.data
= link_to admin_namespaces_projects_path do
%h1= number_with_delimiter(Project.count)
%h1= number_with_delimiter(Project.cached_count)
%hr
= link_to('New Project', new_project_path, class: "btn btn-new")
.col-sm-4
......
......@@ -28,6 +28,3 @@
.form-actions
= f.submit 'Save', class: 'btn btn-save js-save-button'
= link_to "Cancel", admin_labels_path, class: 'btn btn-cancel'
:javascript
new Labels();
......@@ -6,13 +6,13 @@
- content_for :scripts_body_top do
- project = @target_project || @project
- if @project_wiki && @page
- markdown_preview_path = namespace_project_wiki_markdown_preview_path(project.namespace, project, @page.slug)
- preview_markdown_path = namespace_project_wiki_preview_markdown_path(project.namespace, project, @page.slug)
- else
- markdown_preview_path = markdown_preview_namespace_project_path(project.namespace, project)
- preview_markdown_path = preview_markdown_namespace_project_path(project.namespace, project)
- if current_user
:javascript
window.project_uploads_path = "#{namespace_project_uploads_path project.namespace,project}";
window.markdown_preview_path = "#{markdown_preview_path}";
window.preview_markdown_path = "#{preview_markdown_path}";
- content_for :scripts_body do
= render "layouts/init_auto_complete" if current_user
......
......@@ -24,6 +24,3 @@
.project-clone-holder
= render "shared/clone_panel"
:javascript
new Star();
<svg xmlns="http://www.w3.org/2000/svg" width="<%= badge.width %>" height="20">
<linearGradient id="b" x2="0" y2="100%">
<stop offset="0" stop-color="#bbb" stop-opacity=".1"/>
<stop offset="1" stop-opacity=".1"/>
</linearGradient>
<mask id="a">
<rect width="<%= badge.width %>" height="20" rx="3" fill="#fff"/>
</mask>
<g mask="url(#a)">
<path fill="<%= badge.key_color %>"
d="M0 0 h<%= badge.key_width %> v20 H0 z"/>
<path fill="<%= badge.value_color %>"
d="M<%= badge.key_width %> 0 h<%= badge.value_width %> v20 H<%= badge.key_width %> z"/>
<path fill="url(#b)"
d="M0 0 h<%= badge.width %> v20 H0 z"/>
</g>
<g fill="#fff" text-anchor="middle">
<g font-family="DejaVu Sans,Verdana,Geneva,sans-serif" font-size="11">
<text x="<%= badge.key_text_anchor %>" y="15" fill="#010101" fill-opacity=".3">
<%= badge.key_text %>
</text>
<text x="<%= badge.key_text_anchor %>" y="14">
<%= badge.key_text %>
</text>
<text x="<%= badge.value_text_anchor %>" y="15" fill="#010101" fill-opacity=".3">
<%= badge.value_text %>
</text>
<text x="<%= badge.value_text_anchor %>" y="14">
<%= badge.value_text %>
</text>
</g>
</g>
</svg>
.file-content.image_file
- if blob.svg?
- if blob.size_within_svg_limits?
- # We need to scrub SVG but we cannot do so in the RawController: it would
- # be wrong/strange if RawController modified the data.
- blob.load_all_data!(@repository)
- blob = sanitize_svg(blob)
%img{src: "data:#{blob.mime_type};base64,#{Base64.encode64(blob.data)}"}
- else
.nothing-here-block
The SVG could not be displayed because it is too large; you can
#{link_to('view the raw file', namespace_project_raw_path(@project.namespace, @project, @id), target: '_blank')}
instead.
- else
%img{src: namespace_project_raw_path(@project.namespace, @project, tree_join(@commit.id, blob.path))}
......@@ -33,7 +33,7 @@
Can't find HEAD commit for this branch
- stages_status = pipeline.statuses.latest.stages_status
- stages_status = pipeline.statuses.relevant.latest.stages_status
- stages.each do |stage|
%td.stage-cell
- status = stages_status[stage]
......
......@@ -46,5 +46,5 @@
- if pipeline.project.build_coverage_enabled?
%th Coverage
%th
- pipeline.statuses.stages.each do |stage|
= render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.where(stage: stage)
- pipeline.statuses.relevant.stages.each do |stage|
= render 'projects/commit/ci_stage', stage: stage, statuses: pipeline.statuses.relevant.where(stage: stage)
......@@ -2,9 +2,9 @@
.pull-right
- actions = deployment.manual_actions
- if actions.present?
.btn-group.inline
.btn-group
%a.dropdown-toggle.btn.btn-default{type: 'button', 'data-toggle' => 'dropdown'}
.inline
.dropdown
%a.dropdown-new.btn.btn-default{type: 'button', 'data-toggle' => 'dropdown'}
= icon("play")
%b.caret
%ul.dropdown-menu.dropdown-menu-align-right
......
%div.branch-commit
- if deployment.ref
= link_to deployment.ref, namespace_project_commits_path(@project.namespace, @project, deployment.ref), class: "monospace"
&middot;
.icon-container
= deployment.tag? ? icon('tag') : icon('code-fork')
= link_to deployment.ref, namespace_project_commits_path(@project.namespace, @project, deployment.ref), class: "monospace branch-name"
.icon-container
= custom_icon("icon_commit")
= link_to deployment.short_sha, namespace_project_commit_path(@project.namespace, @project, deployment.sha), class: "commit-id monospace"
%p.commit-title
%span
- if commit_title = deployment.commit_title
= author_avatar(deployment.commit, size: 20)
= link_to_gfm commit_title, namespace_project_commit_path(@project.namespace, @project, deployment.sha), class: "commit-row-message"
- else
Can't find HEAD commit for this branch
......@@ -8,6 +8,7 @@
%td
- if deployment.deployable
= link_to [@project.namespace.becomes(Namespace), @project, deployment.deployable] do
= user_avatar(user: deployment.user, size: 20)
= "#{deployment.deployable.name} (##{deployment.deployable.id})"
%td
......
.diff-file.file-holder{id: "diff-#{index}", data: diff_file_html_data(project, diff_file)}
.diff-file.file-holder{id: "diff-#{index}", data: diff_file_html_data(project, diff_file.file_path, diff_commit.id)}
.file-title{id: "file-path-#{hexdigest(diff_file.file_path)}"}
= render "projects/diffs/file_header", diff_file: diff_file, blob: blob, diff_commit: diff_commit, project: project, url: "#diff-#{index}"
......
......@@ -2,9 +2,13 @@
%tr.environment
%td
%strong
= link_to environment.name, namespace_project_environment_path(@project.namespace, @project, environment)
%td
- if last_deployment
= user_avatar(user: last_deployment.user, size: 20)
%strong ##{last_deployment.id}
%td
- if last_deployment
= render 'projects/deployments/commit', deployment: last_deployment
......
......@@ -23,10 +23,11 @@
New environment
- else
.table-holder
%table.table.environments
%table.table.builds.environments
%tbody
%th Environment
%th Last deployment
%th Date
%th Last Deployment
%th Commit
%th
%th
= render @environments
......@@ -23,13 +23,13 @@
= link_to "Read more", help_page_path("ci/environments"), class: "btn btn-success"
- else
.table-holder
%table.table.environments
%table.table.builds.environments
%thead
%tr
%th ID
%th Commit
%th Build
%th Date
%th
%th
= render @deployments
......
......@@ -42,3 +42,16 @@
.ci_widget.ci-error{style: "display:none"}
= icon("times-circle")
Could not connect to the CI server. Please check your settings and try again.
- @merge_request.environments.each do |environment|
.mr-widget-heading
.ci_widget.ci-success
= ci_icon_for_status("success")
%span.hidden-sm
Deployed to
= succeed '.' do
= link_to environment.name, namespace_project_environment_path(@project.namespace, @project, environment), class: 'environment'
- external_url = environment.external_url
- if external_url
= link_to external_url, target: '_blank' do
= icon('external-link', text: "View on #{external_url.gsub(/\A.*?:\/\//, '')}", right: true)
......@@ -46,28 +46,18 @@
%div
- if github_import_enabled?
= link_to new_import_github_path, class: 'btn import_github' do
= icon 'github', text: 'GitHub'
= icon('github', text: 'GitHub')
%div
- if bitbucket_import_enabled?
- if bitbucket_import_configured?
= link_to status_import_bitbucket_path, class: 'btn import_bitbucket', "data-no-turbolink" => "true" do
%i.fa.fa-bitbucket
Bitbucket
- else
= link_to status_import_bitbucket_path, class: 'how_to_import_link btn import_bitbucket', "data-no-turbolink" => "true" do
%i.fa.fa-bitbucket
Bitbucket
= link_to status_import_bitbucket_path, class: "btn import_bitbucket #{'how_to_import_link' unless bitbucket_import_configured?}", "data-no-turbolink" => "true" do
= icon('bitbucket', text: 'Bitbucket')
- unless bitbucket_import_configured?
= render 'bitbucket_import_modal'
%div
- if gitlab_import_enabled?
- if gitlab_import_configured?
= link_to status_import_gitlab_path, class: 'btn import_gitlab' do
%i.fa.fa-heart
GitLab.com
- else
= link_to status_import_gitlab_path, class: 'how_to_import_link btn import_gitlab' do
%i.fa.fa-heart
GitLab.com
= link_to status_import_gitlab_path, class: "btn import_gitlab #{'how_to_import_link' unless bitbucket_import_configured?}" do
= icon('gitlab', text: 'GitLab.com')
- unless gitlab_import_configured?
= render 'gitlab_import_modal'
%div
- if gitorious_import_enabled?
......@@ -77,23 +67,19 @@
%div
- if google_code_import_enabled?
= link_to new_import_google_code_path, class: 'btn import_google_code' do
%i.fa.fa-google
Google Code
= icon('google', text: 'Google Code')
%div
- if fogbugz_import_enabled?
= link_to new_import_fogbugz_path, class: 'btn import_fogbugz' do
%i.fa.fa-bug
Fogbugz
= icon('bug', text: 'Fogbugz')
%div
- if git_import_enabled?
= link_to "#", class: 'btn js-toggle-button import_git' do
%i.fa.fa-git
%span Repo by URL
= icon('git', text: 'Repo by URL')
%div{ class: 'import_gitlab_project' }
- if gitlab_project_import_enabled?
= link_to new_import_gitlab_project_path, class: 'btn btn_import_gitlab_project project-submit' do
%i.fa.fa-gitlab
%span GitLab export
= icon('gitlab', text: 'GitLab export')
.js-toggle-content.hide
= render "shared/import_form", f: f
......
......@@ -9,7 +9,7 @@
.form-group
= f.label :ref, 'Create for', class: 'control-label'
.col-sm-10
= f.text_field :ref, required: true, tabindex: 2, class: 'form-control'
= f.text_field :ref, required: true, tabindex: 2, class: 'form-control js-branch-name ui-autocomplete-input', autocomplete: :false, id: :ref
.help-block Existing branch name, tag
.form-actions
= f.submit 'Create pipeline', class: 'btn btn-create', tabindex: 3
......
- labels.each do |label|
%span.label-row.btn-group{ role: "group", aria: { label: label.name }, style: "color: #{text_color_for_bg(label.color)}" }
= link_to label.name, label_filter_path(@project, label, type: controller.controller_name),
class: "btn btn-transparent has-tooltip",
style: "background-color: #{label.color};",
title: escape_once(label.description),
data: { container: "body" }
= link_to_label(label, css_class: 'btn btn-transparent')
%button.btn.btn-transparent.label-remove.js-label-filter-remove{ type: "button", style: "background-color: #{label.color};", data: { label: label.title } }
= icon("times")
- show_roles = local_assigns.fetch(:show_roles, default_show_roles(member))
- show_roles = local_assigns.fetch(:show_roles, true)
- show_controls = local_assigns.fetch(:show_controls, true)
- user = member.user
......
......@@ -12,6 +12,8 @@
%li.project-row{ class: css_class }
= cache(cache_key) do
.controls
- if project.archived
%span.label.label-warning archived
- if project.commit.try(:status)
%span
= render_commit_status(project.commit)
......
class GroupDestroyWorker
include Sidekiq::Worker
sidekiq_options queue: :default
def perform(group_id, user_id)
begin
group = Group.with_deleted.find(group_id)
rescue ActiveRecord::RecordNotFound
return
end
user = User.find(user_id)
DestroyGroupService.new(group, user).execute
end
end
# GIT over HTTP
require_dependency Rails.root.join('lib/gitlab/backend/grack_auth')
# GIT over SSH
require_dependency Rails.root.join('lib/gitlab/backend/shell')
......
......@@ -148,6 +148,9 @@ if Gitlab::Metrics.enabled?
config.instrument_methods(Gitlab::Highlight)
config.instrument_instance_methods(Gitlab::Highlight)
# This is a Rails scope so we have to instrument it manually.
config.instrument_method(Project, :visible_to_user)
end
GC::Profiler.enable
......
......@@ -12,3 +12,10 @@ Mime::Type.register_alias "text/html", :md
Mime::Type.register "video/mp4", :mp4, [], [:m4v, :mov]
Mime::Type.register "video/webm", :webm
Mime::Type.register "video/ogg", :ogv
middlewares = Gitlab::Application.config.middleware
middlewares.swap(ActionDispatch::ParamsParser, ActionDispatch::ParamsParser, {
Mime::Type.lookup('application/vnd.git-lfs+json') => lambda do |body|
ActiveSupport::JSON.decode(body)
end
})
......@@ -2,49 +2,86 @@
require 'securerandom'
# Your secret key for verifying the integrity of signed cookies.
# If you change this key, all old signed cookies will become invalid!
# Make sure the secret is at least 30 characters and all random,
# no regular words or you'll be exposed to dictionary attacks.
def find_secure_token
token_file = Rails.root.join('.secret')
if ENV.key?('SECRET_KEY_BASE')
ENV['SECRET_KEY_BASE']
elsif File.exist? token_file
# Use the existing token.
File.read(token_file).chomp
else
# Generate a new token of 64 random hexadecimal characters and store it in token_file.
token = SecureRandom.hex(64)
File.write(token_file, token)
token
# Transition material in .secret to the secret_key_base key in config/secrets.yml.
# Historically, ENV['SECRET_KEY_BASE'] takes precedence over .secret, so we maintain that
# behavior.
#
# It also used to be the case that the key material in ENV['SECRET_KEY_BASE'] or .secret
# was used to encrypt OTP (two-factor authentication) data so if present, we copy that key
# material into config/secrets.yml under otp_key_base.
#
# Finally, if we have successfully migrated all secrets to config/secrets.yml, delete the
# .secret file to avoid confusion.
#
def create_tokens
secret_file = Rails.root.join('.secret')
file_secret_key = File.read(secret_file).chomp if File.exist?(secret_file)
env_secret_key = ENV['SECRET_KEY_BASE']
# Ensure environment variable always overrides secrets.yml.
Rails.application.secrets.secret_key_base = env_secret_key if env_secret_key.present?
defaults = {
secret_key_base: file_secret_key || generate_new_secure_token,
otp_key_base: env_secret_key || file_secret_key || generate_new_secure_token,
db_key_base: generate_new_secure_token
}
missing_secrets = set_missing_keys(defaults)
write_secrets_yml(missing_secrets) unless missing_secrets.empty?
begin
File.delete(secret_file) if file_secret_key
rescue => e
warn "Error deleting useless .secret file: #{e}"
end
end
Rails.application.config.secret_token = find_secure_token
Rails.application.config.secret_key_base = find_secure_token
# CI
def generate_new_secure_token
SecureRandom.hex(64)
end
if Rails.application.secrets.db_key_base.blank?
warn "Missing `db_key_base` for '#{Rails.env}' environment. The secrets will be generated and stored in `config/secrets.yml`"
def warn_missing_secret(secret)
warn "Missing Rails.application.secrets.#{secret} for #{Rails.env} environment. The secret will be generated and stored in config/secrets.yml."
end
all_secrets = YAML.load_file('config/secrets.yml') if File.exist?('config/secrets.yml')
all_secrets ||= {}
def set_missing_keys(defaults)
defaults.stringify_keys.each_with_object({}) do |(key, default), missing|
if Rails.application.secrets[key].blank?
warn_missing_secret(key)
# generate secrets
env_secrets = all_secrets[Rails.env.to_s] || {}
env_secrets['db_key_base'] ||= generate_new_secure_token
all_secrets[Rails.env.to_s] = env_secrets
missing[key] = Rails.application.secrets[key] = default
end
end
end
def write_secrets_yml(missing_secrets)
secrets_yml = Rails.root.join('config/secrets.yml')
rails_env = Rails.env.to_s
secrets = YAML.load_file(secrets_yml) if File.exist?(secrets_yml)
secrets ||= {}
secrets[rails_env] ||= {}
secrets[rails_env].merge!(missing_secrets) do |key, old, new|
# Previously, it was possible this was set to the literal contents of an Erb
# expression that evaluated to an empty value. We don't want to support that
# specifically, just ensure we don't break things further.
#
if old.present?
warn <<EOM
Rails.application.secrets.#{key} was blank, but the literal value in config/secrets.yml was:
#{old}
This probably isn't the expected value for this secret. To keep using a literal Erb string in config/secrets.yml, replace `<%` with `<%%`.
EOM
# save secrets
File.open('config/secrets.yml', 'w', 0600) do |file|
file.write(YAML.dump(all_secrets))
exit 1 # rubocop:disable Rails/Exit
end
Rails.application.secrets.db_key_base = env_secrets['db_key_base']
new
end
File.write(secrets_yml, YAML.dump(secrets), mode: 'w', perm: 0600)
end
create_tokens
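Stripped of the Rails plumbing, create_tokens above is a "fill in whatever is missing, then persist it" routine. A self-contained sketch of the defaulting-and-merge step against a plain hash standing in for Rails.application.secrets; file writing is left out, and the sample values are made up.

require 'securerandom'

def generate_new_secure_token
  SecureRandom.hex(64)
end

# `secrets` is a stand-in for Rails.application.secrets.
def set_missing_keys(secrets, defaults)
  defaults.each_with_object({}) do |(key, default), missing|
    next unless secrets[key].nil? || secrets[key].empty?
    warn "Missing #{key}; it will be generated and stored in config/secrets.yml."
    missing[key] = secrets[key] = default
  end
end

secrets  = { 'secret_key_base' => 'already-set', 'otp_key_base' => '', 'db_key_base' => nil }
defaults = {
  'secret_key_base' => generate_new_secure_token,
  'otp_key_base'    => generate_new_secure_token,
  'db_key_base'     => generate_new_secure_token
}

missing = set_missing_keys(secrets, defaults)
puts missing.keys.inspect # ["otp_key_base", "db_key_base"] -- only these would be written back to secrets.yml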
......@@ -84,9 +84,6 @@ Rails.application.routes.draw do
# Health check
get 'health_check(/:checks)' => 'health_check#index', as: :health_check
# Enable Grack support (for LFS only)
mount Grack::AuthSpawner, at: '/', constraints: lambda { |request| /[-\/\w\.]+\.git\/(info\/lfs|gitlab-lfs)/.match(request.path_info) }, via: [:get, :post, :put]
# Help
get 'help' => 'help#index'
get 'help/shortcuts' => 'help#shortcuts'
......@@ -471,7 +468,7 @@ Rails.application.routes.draw do
post :unarchive
post :housekeeping
post :toggle_star
post :markdown_preview
post :preview_markdown
post :export
post :remove_export
post :generate_new_export
......@@ -482,11 +479,26 @@ Rails.application.routes.draw do
end
scope module: :projects do
# Git HTTP clients ('git clone' etc.)
scope constraints: { id: /.+\.git/, format: nil } do
# Git HTTP clients ('git clone' etc.)
get '/info/refs', to: 'git_http#info_refs'
post '/git-upload-pack', to: 'git_http#git_upload_pack'
post '/git-receive-pack', to: 'git_http#git_receive_pack'
# Git LFS API (metadata)
post '/info/lfs/objects/batch', to: 'lfs_api#batch'
post '/info/lfs/objects', to: 'lfs_api#deprecated'
get '/info/lfs/objects/*oid', to: 'lfs_api#deprecated'
# GitLab LFS object storage
scope constraints: { oid: /[a-f0-9]{64}/ } do
get '/gitlab-lfs/objects/*oid', to: 'lfs_storage#download'
scope constraints: { size: /[0-9]+/ } do
put '/gitlab-lfs/objects/*oid/*size/authorize', to: 'lfs_storage#upload_authorize'
put '/gitlab-lfs/objects/*oid/*size', to: 'lfs_storage#upload_finalize'
end
end
end
# Allow /info/refs, /info/refs?service=git-upload-pack, and
......@@ -660,7 +672,7 @@ Rails.application.routes.draw do
get '/wikis/*id', to: 'wikis#show', as: 'wiki', constraints: WIKI_SLUG_ID
delete '/wikis/*id', to: 'wikis#destroy', constraints: WIKI_SLUG_ID
put '/wikis/*id', to: 'wikis#update', constraints: WIKI_SLUG_ID
post '/wikis/*id/markdown_preview', to: 'wikis#markdown_preview', constraints: WIKI_SLUG_ID, as: 'wiki_markdown_preview'
post '/wikis/*id/preview_markdown', to: 'wikis#preview_markdown', constraints: WIKI_SLUG_ID, as: 'wiki_preview_markdown'
end
resource :repository, only: [:create] do
......
# rubocop:disable all
class FixNamespaces < ActiveRecord::Migration
DOWNTIME = false
def up
Namespace.where('name <> path and type is null').each do |namespace|
namespace.update_attribute(:name, namespace.path)
namespaces = exec_query('SELECT id, path FROM namespaces WHERE name <> path and type is null')
namespaces.each do |row|
id = row['id']
path = row['path']
exec_query("UPDATE namespaces SET name = '#{path}' WHERE id = #{id}")
end
end
......
class AddQueuedAtToCiBuilds < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
def change
add_column :ci_builds, :queued_at, :timestamp
end
end
class AddDeletedAtToNamespaces < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
def change
add_column :namespaces, :deleted_at, :datetime
add_concurrent_index :namespaces, :deleted_at
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class RemoveCiRunnerTrigramIndexes < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
# Disabled for the "down" method so the indexes can be re-created concurrently.
disable_ddl_transaction!
def up
return unless Gitlab::Database.postgresql?
transaction do
execute 'DROP INDEX IF EXISTS index_ci_runners_on_token_trigram;'
execute 'DROP INDEX IF EXISTS index_ci_runners_on_description_trigram;'
end
end
def down
return unless Gitlab::Database.postgresql?
execute 'CREATE INDEX CONCURRENTLY index_ci_runners_on_token_trigram ON ci_runners USING gin(token gin_trgm_ops);'
execute 'CREATE INDEX CONCURRENTLY index_ci_runners_on_description_trigram ON ci_runners USING gin(description gin_trgm_ops);'
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class RemoveRedundantIndexes < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
def up
indexes = [
[:ci_taggings, 'ci_taggings_idx'],
[:audit_events, 'index_audit_events_on_author_id'],
[:audit_events, 'index_audit_events_on_type'],
[:ci_builds, 'index_ci_builds_on_erased_by_id'],
[:ci_builds, 'index_ci_builds_on_project_id_and_commit_id'],
[:ci_builds, 'index_ci_builds_on_type'],
[:ci_commits, 'index_ci_commits_on_project_id'],
[:ci_commits, 'index_ci_commits_on_project_id_and_committed_at'],
[:ci_commits, 'index_ci_commits_on_project_id_and_committed_at_and_id'],
[:ci_commits, 'index_ci_commits_on_project_id_and_sha'],
[:ci_commits, 'index_ci_commits_on_sha'],
[:ci_events, 'index_ci_events_on_created_at'],
[:ci_events, 'index_ci_events_on_is_admin'],
[:ci_events, 'index_ci_events_on_project_id'],
[:ci_jobs, 'index_ci_jobs_on_deleted_at'],
[:ci_jobs, 'index_ci_jobs_on_project_id'],
[:ci_projects, 'index_ci_projects_on_gitlab_id'],
[:ci_projects, 'index_ci_projects_on_shared_runners_enabled'],
[:ci_services, 'index_ci_services_on_project_id'],
[:ci_sessions, 'index_ci_sessions_on_session_id'],
[:ci_sessions, 'index_ci_sessions_on_updated_at'],
[:ci_tags, 'index_ci_tags_on_name'],
[:ci_triggers, 'index_ci_triggers_on_deleted_at'],
[:identities, 'index_identities_on_created_at_and_id'],
[:issues, 'index_issues_on_title'],
[:keys, 'index_keys_on_created_at_and_id'],
[:members, 'index_members_on_created_at_and_id'],
[:members, 'index_members_on_type'],
[:milestones, 'index_milestones_on_created_at_and_id'],
[:namespaces, 'index_namespaces_on_visibility_level'],
[:projects, 'index_projects_on_builds_enabled_and_shared_runners_enabled'],
[:services, 'index_services_on_category'],
[:services, 'index_services_on_created_at_and_id'],
[:services, 'index_services_on_default'],
[:snippets, 'index_snippets_on_created_at'],
[:snippets, 'index_snippets_on_created_at_and_id'],
[:todos, 'index_todos_on_state'],
[:web_hooks, 'index_web_hooks_on_created_at_and_id'],
# These indexes _may_ be used but they can be replaced by other existing
# indexes.
# There's already a composite index on (project_id, iid) which means that
# a separate index for _just_ project_id is not needed.
[:issues, 'index_issues_on_project_id'],
# These are all composite indexes for the columns (created_at, id). In all
# these cases there's already a standalone index for "created_at" which
# can be used instead.
#
# Because the "id" column of these composite indexes is never needed (due
# to "id" already being indexed as its a primary key) these composite
# indexes are useless.
[:issues, 'index_issues_on_created_at_and_id'],
[:merge_requests, 'index_merge_requests_on_created_at_and_id'],
[:namespaces, 'index_namespaces_on_created_at_and_id'],
[:notes, 'index_notes_on_created_at_and_id'],
[:projects, 'index_projects_on_created_at_and_id'],
[:users, 'index_users_on_created_at_and_id'],
]
transaction do
indexes.each do |(table, index)|
remove_index(table, name: index) if index_exists_by_name?(table, index)
end
end
add_concurrent_index(:users, :created_at)
add_concurrent_index(:projects, :created_at)
add_concurrent_index(:namespaces, :created_at)
end
def down
# We're only restoring the composite indexes that could be replaced with
# individual ones, just in case somebody would ever want to revert.
transaction do
remove_index(:users, :created_at)
remove_index(:projects, :created_at)
remove_index(:namespaces, :created_at)
end
[:issues, :merge_requests, :namespaces, :notes, :projects, :users].each do |table|
add_concurrent_index(table, [:created_at, :id],
name: "index_#{table}_on_created_at_and_id")
end
end
# Rails' index_exists? doesn't work when you only give it a table and index
# name. As such we have to use some extra code to check if an index exists for
# a given name.
def index_exists_by_name?(table, index)
indexes_for_table[table].include?(index)
end
def indexes_for_table
@indexes_for_table ||= Hash.new do |hash, table_name|
hash[table_name] = indexes(table_name).map(&:name)
end
end
end
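The index_exists_by_name? helper above leans on a Hash with a default block, so each table's index names are looked up at most once. The same memoization pattern in isolation, with a fake (and counted) lookup in place of ActiveRecord's indexes(table):

lookups = 0

fake_indexes = {
  users:  %w[index_users_on_email index_users_on_created_at],
  issues: %w[index_issues_on_project_id]
}

indexes_for_table = Hash.new do |hash, table_name|
  lookups += 1 # a real implementation would call indexes(table_name) here
  hash[table_name] = fake_indexes.fetch(table_name, [])
end

def index_exists_by_name?(indexes_for_table, table, index)
  indexes_for_table[table].include?(index)
end

puts index_exists_by_name?(indexes_for_table, :users, 'index_users_on_email')      # => true
puts index_exists_by_name?(indexes_for_table, :users, 'index_users_on_created_at') # => true
puts index_exists_by_name?(indexes_for_table, :issues, 'index_issues_on_state')    # => false
puts lookups # => 2 -- :users was only fetched once despite two checks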