Commit cfae6ec2 authored by Yorick Peterse

Backport the EE schema and migrations to CE

This backports all EE schema changes to CE, including EE migrations,
ensuring both use the same schema.

== Updated tests

A spec related to ghost and support bot users had to be modified to make
it pass. The spec in question assumes that the "support_bot" column
exists when defining the spec. In the single codebase setup this is not
the case, as the column is backported in a later migration. Using a
different schema version or "around" blocks to conditionally disable
specs won't help either, as reverting the backport migration would also
drop the "support_bot" column. Removing the "support_bot" tests entirely
appears to be the only solution.

We also need to update some foreign key tests now that we have
backported the EE columns. Fortunately, these changes are very minor.

== Backporting migrations

This commit moves EE-specific migrations (except those for the Geo
tracking database) and related files to CE, and also removes any traces
of the ee/db directory.

Some migrations had to be modified or removed, as they no longer work
with the schema being backported. These migrations were all quite old,
so we opted for removing them where modifying them would take too much
time and effort.

Some old migrations existed in CE but were also modified in EE. In these
cases we took the EE code; in one case we removed the migration
entirely. It's not worth spending time trying to merge these changes, as
we plan to remove old migrations around the release of 12.0. See
https://gitlab.com/gitlab-org/gitlab-ce/issues/59177 for more details.
parent 0c1b8c88
---
title: Backport the EE schema and migrations to CE
merge_request: 26940
author: Yorick Peterse
type: other
@@ -13,7 +13,6 @@ DB_FILES = [
   'lib/gitlab/github_import/',
   'lib/gitlab/sql/',
   'rubocop/cop/migration',
-  'ee/db/',
   'ee/lib/gitlab/database/'
 ].freeze
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class RemoveUndeletedGroups < ActiveRecord::Migration[4.2]
DOWNTIME = false
def up
is_ee = defined?(Gitlab::License)
if is_ee
execute <<-EOF.strip_heredoc
DELETE FROM path_locks
WHERE project_id IN (
SELECT project_id
FROM projects
WHERE namespace_id IN (#{namespaces_pending_removal})
);
EOF
execute <<-EOF.strip_heredoc
DELETE FROM remote_mirrors
WHERE project_id IN (
SELECT project_id
FROM projects
WHERE namespace_id IN (#{namespaces_pending_removal})
);
EOF
end
execute <<-EOF.strip_heredoc
DELETE FROM lists
WHERE label_id IN (
SELECT id
FROM labels
WHERE group_id IN (#{namespaces_pending_removal})
);
EOF
execute <<-EOF.strip_heredoc
DELETE FROM lists
WHERE board_id IN (
SELECT id
FROM boards
WHERE project_id IN (
SELECT project_id
FROM projects
WHERE namespace_id IN (#{namespaces_pending_removal})
)
);
EOF
execute <<-EOF.strip_heredoc
DELETE FROM labels
WHERE group_id IN (#{namespaces_pending_removal});
EOF
execute <<-EOF.strip_heredoc
DELETE FROM boards
WHERE project_id IN (
SELECT project_id
FROM projects
WHERE namespace_id IN (#{namespaces_pending_removal})
)
EOF
execute <<-EOF.strip_heredoc
DELETE FROM projects
WHERE namespace_id IN (#{namespaces_pending_removal});
EOF
if is_ee
# EE adds these columns, so we have to make sure their data is cleaned up
# here before we run the DELETE below. An alternative would be to patch
# this migration in EE, but that would only result in a mess of confusing
# migrations.
execute <<-EOF.strip_heredoc
DELETE FROM protected_branch_push_access_levels
WHERE group_id IN (#{namespaces_pending_removal});
EOF
execute <<-EOF.strip_heredoc
DELETE FROM protected_branch_merge_access_levels
WHERE group_id IN (#{namespaces_pending_removal});
EOF
end
# This removes namespaces that were supposed to be deleted but still reside
# in the database.
execute "DELETE FROM namespaces WHERE deleted_at IS NOT NULL;"
end
def down
# This is an irreversible migration; if someone is trying to roll back
# for other reasons, we should not raise an exception.
# raise ActiveRecord::IrreversibleMigration
end
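# SQL snippet selecting the IDs of namespaces that were scheduled for
# deletion but still exist (deleted_at is set); it is interpolated into
# the DELETE statements above.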
def namespaces_pending_removal
"SELECT id FROM (
SELECT id
FROM namespaces
WHERE deleted_at IS NOT NULL
) namespace_ids"
end
end
class CleanUpFromMergeRequestDiffsAndCommits < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
class MergeRequestDiff < ActiveRecord::Base
self.table_name = 'merge_request_diffs'
include ::EachBatch
end
disable_ddl_transaction!
def up
Gitlab::BackgroundMigration.steal('DeserializeMergeRequestDiffsAndCommits')
# The literal '--- []\n' value is created by the import process and treated
# as null by the application, so we can ignore those - even if we were
# migrating, it wouldn't create any rows.
literal_prefix = Gitlab::Database.postgresql? ? 'E' : ''
non_empty = "
(st_commits IS NOT NULL AND st_commits != #{literal_prefix}'--- []\n')
OR
(st_diffs IS NOT NULL AND st_diffs != #{literal_prefix}'--- []\n')
".squish
MergeRequestDiff.where(non_empty).each_batch(of: 500) do |relation, index|
range = relation.pluck('MIN(id)', 'MAX(id)').first
Gitlab::BackgroundMigration::DeserializeMergeRequestDiffsAndCommits.new.perform(*range)
end
end
def down
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class CleanUpForMembers < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
# Set this constant to true if this migration requires downtime.
DOWNTIME = false
disable_ddl_transaction!
class Member < ActiveRecord::Base
include EachBatch
self.table_name = 'members'
end
def up
condition = <<~EOF.squish
invite_token IS NULL AND
NOT EXISTS (SELECT 1 FROM users WHERE users.id = members.user_id)
EOF
Member.each_batch(of: 10_000) do |batch|
batch.where(condition).delete_all
end
end
def down
end
end
@@ -10,7 +10,6 @@ class CreateResourceLabelEvents < ActiveRecord::Migration[4.2]
       t.integer :action, null: false
       t.references :issue, null: true, index: true, foreign_key: { on_delete: :cascade }
       t.references :merge_request, null: true, index: true, foreign_key: { on_delete: :cascade }
-      t.references :epic, null: true, index: true, foreign_key: { on_delete: :cascade }
       t.references :label, index: true, foreign_key: { on_delete: :nullify }
       t.references :user, index: true, foreign_key: { on_delete: :nullify }
       t.datetime_with_timezone :created_at, null: false
# This is the counterpart of RequeuePendingDeleteProjects and cleans all
# projects with `pending_delete = true` and that do not have a namespace.
class CleanupNamespacelessPendingDeleteProjects < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
def up
@offset = 0
loop do
ids = pending_delete_batch
break if ids.empty?
args = ids.map { |id| Array(id) }
NamespacelessProjectDestroyWorker.bulk_perform_async(args)
@offset += 1
end
end
def down
# noop
end
private
def pending_delete_batch
connection.exec_query(find_batch).map { |row| row['id'].to_i }
end
BATCH_SIZE = 5000
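# Builds the SQL for the next batch of project IDs that are flagged
# pending_delete but have no namespace, paging with OFFSET/LIMIT.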
def find_batch
projects = Arel::Table.new(:projects)
projects.project(projects[:id])
.where(projects[:pending_delete].eq(true))
.where(projects[:namespace_id].eq(nil))
.skip(@offset * BATCH_SIZE)
.take(BATCH_SIZE)
.to_sql
end
end
class ScheduleMergeRequestDiffMigrations < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
BATCH_SIZE = 2500
MIGRATION = 'DeserializeMergeRequestDiffsAndCommits'
disable_ddl_transaction!
class MergeRequestDiff < ActiveRecord::Base
self.table_name = 'merge_request_diffs'
include ::EachBatch
end
# Assuming that there are 5 million rows affected (which is more than on
# GitLab.com), and that each batch of 2,500 rows takes up to 5 minutes, then
# we can migrate all the rows in 7 days.
#
# On staging, plucking the IDs themselves takes 5 seconds.
def up
non_empty = 'st_commits IS NOT NULL OR st_diffs IS NOT NULL'
MergeRequestDiff.where(non_empty).each_batch(of: BATCH_SIZE) do |relation, index|
range = relation.pluck('MIN(id)', 'MAX(id)').first
BackgroundMigrationWorker.perform_in(index * 5.minutes, MIGRATION, range)
end
end
def down
end
end
class ScheduleMergeRequestDiffMigrationsTakeTwo < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
BATCH_SIZE = 500
MIGRATION = 'DeserializeMergeRequestDiffsAndCommits'
DELAY_INTERVAL = 10.minutes
disable_ddl_transaction!
class MergeRequestDiff < ActiveRecord::Base
self.table_name = 'merge_request_diffs'
include ::EachBatch
default_scope { where('st_commits IS NOT NULL OR st_diffs IS NOT NULL') }
end
# By this point, we assume ScheduleMergeRequestDiffMigrations - the first
# version of this - has already run. On GitLab.com, we have ~220k un-migrated
# rows, but these rows will, in general, take a long time to migrate.
#
# With a gap of 10 minutes per batch, and 500 rows per batch, these migrations
# are scheduled over 220_000 / 500 / 6 ~= 74 hours, which is a little over
# three days.
def up
queue_background_migration_jobs_by_range_at_intervals(MergeRequestDiff, MIGRATION, DELAY_INTERVAL, batch_size: BATCH_SIZE)
end
def down
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class ScheduleCreateGpgKeySubkeysFromGpgKeys < ActiveRecord::Migration[4.2]
disable_ddl_transaction!
DOWNTIME = false
MIGRATION = 'CreateGpgKeySubkeysFromGpgKeys'
class GpgKey < ActiveRecord::Base
self.table_name = 'gpg_keys'
include EachBatch
end
def up
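# Schedule one CreateGpgKeySubkeysFromGpgKeys background job per GPG key,
# enqueuing each batch of IDs in bulk.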
GpgKey.select(:id).each_batch do |gpg_keys|
jobs = gpg_keys.pluck(:id).map do |id|
[MIGRATION, [id]]
end
BackgroundMigrationWorker.bulk_perform_async(jobs)
end
end
def down
end
end
# frozen_string_literal: true
class SchedulePopulateMergeRequestMetricsWithEventsData < ActiveRecord::Migration[4.2]
DOWNTIME = false
BATCH_SIZE = 10_000
MIGRATION = 'PopulateMergeRequestMetricsWithEventsData'
disable_ddl_transaction!
class MergeRequest < ActiveRecord::Base
self.table_name = 'merge_requests'
include ::EachBatch
end
def up
say 'Scheduling `PopulateMergeRequestMetricsWithEventsData` jobs'
# This will update around 4_000_000 records in batches of 10_000 merge
# requests, scheduled 10 minutes apart, and should take around 66 hours to complete.
# Production PostgreSQL appears to be able to vacuum 10k-20k dead tuples per
# minute, and at maximum, each of these jobs should UPDATE 20k records.
#
# More information about the updates in `PopulateMergeRequestMetricsWithEventsData` class.
#
MergeRequest.all.each_batch(of: BATCH_SIZE) do |relation, index|
range = relation.pluck('MIN(id)', 'MAX(id)').first
BackgroundMigrationWorker.perform_in(index * 10.minutes, MIGRATION, range)
end
end
def down
execute "update merge_request_metrics set latest_closed_at = null"
execute "update merge_request_metrics set latest_closed_by_id = null"
execute "update merge_request_metrics set merged_by_id = null"
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class RemoveSoftRemovedObjects < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
# Set this constant to true if this migration requires downtime.
DOWNTIME = false
disable_ddl_transaction!
module SoftRemoved
extend ActiveSupport::Concern
included do
scope :soft_removed, -> { where('deleted_at IS NOT NULL') }
end
end
class User < ActiveRecord::Base
self.table_name = 'users'
include EachBatch
end
class Issue < ActiveRecord::Base
self.table_name = 'issues'
include EachBatch
include SoftRemoved
end
class MergeRequest < ActiveRecord::Base
self.table_name = 'merge_requests'
include EachBatch
include SoftRemoved
end
class Namespace < ActiveRecord::Base
self.table_name = 'namespaces'
include EachBatch
include SoftRemoved
scope :soft_removed_personal, -> { soft_removed.where(type: nil) }
scope :soft_removed_group, -> { soft_removed.where(type: 'Group') }
end
class Route < ActiveRecord::Base
self.table_name = 'routes'
include EachBatch
include SoftRemoved
end
class Project < ActiveRecord::Base
self.table_name = 'projects'
include EachBatch
include SoftRemoved
end
class CiPipelineSchedule < ActiveRecord::Base
self.table_name = 'ci_pipeline_schedules'
include EachBatch
include SoftRemoved
end
class CiTrigger < ActiveRecord::Base
self.table_name = 'ci_triggers'
include EachBatch
include SoftRemoved
end
MODELS = [Issue, MergeRequest, CiPipelineSchedule, CiTrigger].freeze
def up
disable_statement_timeout do
remove_personal_routes
remove_personal_namespaces
remove_group_namespaces
remove_simple_soft_removed_rows
end
end
def down
# The data removed by this migration can't be restored in an automated way.
end
def remove_simple_soft_removed_rows
create_temporary_indexes
MODELS.each do |model|
say_with_time("Removing soft removed rows from #{model.table_name}") do
model.soft_removed.each_batch do |batch, index|
batch.delete_all
end
end
end
ensure
remove_temporary_indexes
end
def create_temporary_indexes
MODELS.each do |model|
index_name = temporary_index_name_for(model)
# Without this index the removal process can take a very long time. For
# example, getting the next ID of a batch for the `issues` table in
# staging would take between 15 and 20 seconds.
next if temporary_index_exists?(model)
say_with_time("Creating temporary index #{index_name}") do
add_concurrent_index(
model.table_name,
[:deleted_at, :id],
name: index_name,
where: 'deleted_at IS NOT NULL'
)
end
end
end
def remove_temporary_indexes
MODELS.each do |model|
index_name = temporary_index_name_for(model)
next unless temporary_index_exists?(model)
say_with_time("Removing temporary index #{index_name}") do
remove_concurrent_index_by_name(model.table_name, index_name)
end
end
end
def temporary_index_name_for(model)
"index_on_#{model.table_name}_tmp"
end
def temporary_index_exists?(model)
index_name = temporary_index_name_for(model)
index_exists?(model.table_name, [:deleted_at, :id], name: index_name)
end
def remove_personal_namespaces
# Some personal namespaces are left behind on GitLab.com. In these cases
# the associated data, such as the projects and users, has already been
# removed.
Namespace.soft_removed_personal.each_batch do |batch|
batch.delete_all
end
end
def remove_group_namespaces
admin_id = id_for_admin_user
unless admin_id
say 'Not scheduling soft removed groups for removal as no admin user ' \
'could be found. You will need to remove any such groups manually.'
return
end
# Left over groups can't be easily removed because we may also need to
# remove memberships, repositories, and other associated data. As a result
# we'll just schedule a Sidekiq job to remove these.
#
# As of January 5th, 2018 there are 36 groups that will be removed using
# this code.
Namespace.select(:id).soft_removed_group.each_batch(of: 10) do |batch, index|
batch.each do |ns|
schedule_group_removal(index * 5.minutes, ns.id, admin_id)
end
end
end
def schedule_group_removal(delay, group_id, user_id)
if migrate_inline?
GroupDestroyWorker.new.perform(group_id, user_id)
else
GroupDestroyWorker.perform_in(delay, group_id, user_id)
end
end
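# Remove routes whose source is a soft removed personal namespace.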
def remove_personal_routes
namespaces = Namespace.select(1)
.soft_removed
.where('namespaces.type IS NULL')
.where('routes.source_type = ?', 'Namespace')
.where('routes.source_id = namespaces.id')
Route.where('EXISTS (?)', namespaces).each_batch do |batch|
batch.delete_all
end
end
def id_for_admin_user
User.where(admin: true).limit(1).pluck(:id).first
end
def migrate_inline?
Rails.env.test? || Rails.env.development?
end
end
class MigrateImportAttributesDataFromProjectsToProjectMirrorData < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
UP_MIGRATION = 'PopulateImportState'.freeze
DOWN_MIGRATION = 'RollbackImportStateData'.freeze
BATCH_SIZE = 1000
DELAY_INTERVAL = 5.minutes
disable_ddl_transaction!
class Project < ActiveRecord::Base
include EachBatch
self.table_name = 'projects'
end
class ProjectImportState < ActiveRecord::Base
include EachBatch
self.table_name = 'project_mirror_data'
end
def up
projects = Project.where.not(import_status: :none)
queue_background_migration_jobs_by_range_at_intervals(projects, UP_MIGRATION, DELAY_INTERVAL, batch_size: BATCH_SIZE)
end
def down
import_state = ProjectImportState.where.not(status: :none)
queue_background_migration_jobs_by_range_at_intervals(import_state, DOWN_MIGRATION, DELAY_INTERVAL, batch_size: BATCH_SIZE)
end
end
class MigrateRemainingMrMetricsPopulatingBackgroundMigration < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
BATCH_SIZE = 5_000
MIGRATION = 'PopulateMergeRequestMetricsWithEventsData'
DELAY_INTERVAL = 10.minutes
disable_ddl_transaction!
class MergeRequest < ActiveRecord::Base
self.table_name = 'merge_requests'
include ::EachBatch
end
def up
# Perform any ongoing background migration that might still be running. This
# avoids scheduling way too many of the same jobs on self-hosted instances
# if they're updating GitLab across multiple versions. The first round of
# these jobs was scheduled in 10.4 by
# SchedulePopulateMergeRequestMetricsWithEventsData.
Gitlab::BackgroundMigration.steal(MIGRATION)
metrics_not_exists_clause = <<~SQL
NOT EXISTS (SELECT 1 FROM merge_request_metrics
WHERE merge_request_metrics.merge_request_id = merge_requests.id)
SQL
relation = MergeRequest.where(metrics_not_exists_clause)
# We currently have ~400_000 MR records without metrics on GitLab.com.
# This means it'll schedule ~80 jobs (5000 MRs each) with a 10 minutes gap,
# so this should take ~14 hours for all background migrations to complete.
#
queue_background_migration_jobs_by_range_at_intervals(relation,
MIGRATION,
DELAY_INTERVAL,
batch_size: BATCH_SIZE)
end
def down
end
end
class EnqueueDeleteDiffFilesWorkers < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
SCHEDULER = 'ScheduleDiffFilesDeletion'.freeze
TMP_INDEX = 'tmp_partial_diff_id_with_files_index'.freeze
disable_ddl_transaction!
def up
unless index_exists_by_name?(:merge_request_diffs, TMP_INDEX)
add_concurrent_index(:merge_request_diffs, :id, where: "(state NOT IN ('without_files', 'empty'))", name: TMP_INDEX)
end
BackgroundMigrationWorker.perform_async(SCHEDULER)
# We don't remove the index since it's going to be used by the DeleteDiffFiles
# worker. We should remove it in an upcoming release.
end
def down
if index_exists_by_name?(:merge_request_diffs, TMP_INDEX)
remove_concurrent_index_by_name(:merge_request_diffs, TMP_INDEX)
end
end
end
# frozen_string_literal: true
class DeleteInconsistentInternalIdRecords < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
# This migration cleans up any inconsistent records in internal_ids.
#
# That is, it deletes records that track a `last_value` that is
# smaller than the maximum internal id (usually `iid`) found in
# the corresponding model records.
def up
disable_statement_timeout do
delete_internal_id_records('issues', 'project_id')
delete_internal_id_records('merge_requests', 'project_id', 'target_project_id')
delete_internal_id_records('deployments', 'project_id')
delete_internal_id_records('milestones', 'project_id')
delete_internal_id_records('milestones', 'namespace_id', 'group_id')
delete_internal_id_records('ci_pipelines', 'project_id')
delete_internal_id_records('epics', 'namespace_id', 'group_id')
end
end
class InternalId < ActiveRecord::Base
self.table_name = 'internal_ids'
enum usage: { issues: 0, merge_requests: 1, deployments: 2, milestones: 3, epics: 4, ci_pipelines: 5 }
end
private
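# base_scope_column_name is the scoping column on the base table (e.g.
# "target_project_id" on merge_requests), while scope_column_name is the
# matching column on internal_ids (e.g. "project_id").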
def delete_internal_id_records(base_table, scope_column_name, base_scope_column_name = scope_column_name)
sql = <<~SQL
SELECT id FROM ( -- workaround for MySQL
SELECT internal_ids.id FROM (
SELECT #{base_scope_column_name} AS #{scope_column_name}, max(iid) as maximum_iid from #{base_table} GROUP BY #{scope_column_name}
) maxima JOIN internal_ids USING (#{scope_column_name})
WHERE internal_ids.usage=#{InternalId.usages.fetch(base_table)} AND maxima.maximum_iid > internal_ids.last_value
) internal_ids
SQL
InternalId.where("id IN (#{sql})").tap do |ids| # rubocop:disable GitlabSecurity/SqlInjection
say "Deleting internal_id records for #{base_table}: #{ids.pluck(:project_id, :last_value)}" unless ids.empty?
end.delete_all
end
end
# frozen_string_literal: true
class RemoveOrphanedLabelLinks < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
class LabelLinks < ActiveRecord::Base
self.table_name = 'label_links'
include EachBatch
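# Label links whose label row no longer exists.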
def self.orphaned
where('NOT EXISTS ( SELECT 1 FROM labels WHERE labels.id = label_links.label_id )')
end
end
def up
# Some of these queries can take up to 10 seconds to run on GitLab.com,
# which is pretty close to our 15 second statement timeout. To ensure a
# smooth deployment procedure we disable the statement timeouts for this
# migration, just in case.
disable_statement_timeout do
# On GitLab.com there are over 2,000,000 orphaned label links. On
# staging, removing 100,000 rows generated a max replication lag of 6.7
# MB. In total, removing all these rows will only generate about 136 MB
# of data, so it should be safe to do this.
LabelLinks.orphaned.each_batch(of: 100_000) do |batch|
batch.delete_all
end
end
add_concurrent_foreign_key(:label_links, :labels, column: :label_id, on_delete: :cascade)
end
def down
# There is no way to restore orphaned label links.
if foreign_key_exists?(:label_links, column: :label_id)
remove_foreign_key(:label_links, column: :label_id)
end
end
end
# frozen_string_literal: true
class ConsumeRemainingDiffFilesDeletionJobs < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
disable_ddl_transaction!
MIGRATION = 'ScheduleDiffFilesDeletion'.freeze
TMP_INDEX = 'tmp_partial_diff_id_with_files_index'.freeze
def up
# Perform any ongoing background migration that might still be scheduled.
Gitlab::BackgroundMigration.steal(MIGRATION)
remove_concurrent_index_by_name(:merge_request_diffs, TMP_INDEX)
end
def down
add_concurrent_index(:merge_request_diffs, :id, where: "(state NOT IN ('without_files', 'empty'))", name: TMP_INDEX)
end
end
# frozen_string_literal: true
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class PopulateExternalPipelineSource < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
# Set this constant to true if this migration requires downtime.
DOWNTIME = false
MIGRATION = 'PopulateExternalPipelineSource'.freeze
BATCH_SIZE = 500
disable_ddl_transaction!
class Pipeline < ActiveRecord::Base
include EachBatch
self.table_name = 'ci_pipelines'
end
def up
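# Only pipelines without a source need backfilling; schedule background
# migration jobs over their ID ranges, five minutes apart.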
Pipeline.where(source: nil).tap do |relation|
queue_background_migration_jobs_by_range_at_intervals(relation,
MIGRATION,
5.minutes,
batch_size: BATCH_SIZE)
end
end
def down
# noop
end
end
# frozen_string_literal: true
class EnqueueRedactLinks < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
BATCH_SIZE = 1000
DELAY_INTERVAL = 5.minutes.to_i
MIGRATION = 'RedactLinks'
disable_ddl_transaction!
class Note < ActiveRecord::Base
include EachBatch
self.table_name = 'notes'
self.inheritance_column = :_type_disabled
end
class Issue < ActiveRecord::Base
include EachBatch
self.table_name = 'issues'
self.inheritance_column = :_type_disabled
end
class MergeRequest < ActiveRecord::Base
include EachBatch
self.table_name = 'merge_requests'
self.inheritance_column = :_type_disabled
end
class Snippet < ActiveRecord::Base
include EachBatch
self.table_name = 'snippets'
self.inheritance_column = :_type_disabled
end
def up
disable_statement_timeout do
schedule_migration(Note, 'note')
schedule_migration(Issue, 'description')
schedule_migration(MergeRequest, 'description')
schedule_migration(Snippet, 'description')
end
end
def down
# nothing to do
end
private
def schedule_migration(model, field)
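# The 32 "_" wildcards in the LIKE pattern match the reply key embedded
# in "/sent_notifications/<key>/unsubscribe" links.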
link_pattern = "%/sent_notifications/" + ("_" * 32) + "/unsubscribe%"
model.where("#{field} like ?", link_pattern).each_batch(of: BATCH_SIZE) do |batch, index|
start_id, stop_id = batch.pluck('MIN(id)', 'MAX(id)').first
BackgroundMigrationWorker.perform_in(index * DELAY_INTERVAL, MIGRATION, [model.name.demodulize, field, start_id, stop_id])
end
end
end
# frozen_string_literal: true
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class PopulateMrMetricsWithEventsData < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
BATCH_SIZE = 10_000
MIGRATION = 'PopulateMergeRequestMetricsWithEventsDataImproved'
PREVIOUS_MIGRATION = 'PopulateMergeRequestMetricsWithEventsData'
disable_ddl_transaction!
def up
# Perform any ongoing background migration that might still be running from
# previous try (see https://gitlab.com/gitlab-org/gitlab-ce/issues/47676).
Gitlab::BackgroundMigration.steal(PREVIOUS_MIGRATION)
say 'Scheduling `PopulateMergeRequestMetricsWithEventsData` jobs'
# This will update around 4_000_000 records in batches of 10_000 merge
# requests, scheduled 8 minutes apart, and should take around 53 hours to complete.
# Production PostgreSQL appears to be able to vacuum 10k-20k dead tuples
# per minute, so this should give us enough headroom.
#
# More information about the updates in `PopulateMergeRequestMetricsWithEventsDataImproved` class.
#
MergeRequest.all.each_batch(of: BATCH_SIZE) do |relation, index|
range = relation.pluck('MIN(id)', 'MAX(id)').first
BackgroundMigrationWorker.perform_in(index * 8.minutes, MIGRATION, range)
end
end
def down
end
end
@@ -7,8 +7,18 @@ class PopulateProjectStatisticsPackagesSize < ActiveRecord::Migration[5.0]
   disable_ddl_transaction!

+  class ProjectStatistics < ActiveRecord::Base
+    self.table_name = 'project_statistics'
+  end
+
   def up
-    stats_ids = ProjectStatistics.joins(project: { packages: :package_files }).distinct.select(:id)
+    stats_ids = ProjectStatistics.joins(
+      <<~SQL.strip_heredoc
+        INNER JOIN projects ON projects.id = project_statistics.project_id
+        INNER JOIN packages_packages ON packages_packages.project_id = projects.id
+        INNER JOIN packages_package_files ON packages_package_files.package_id = packages_packages.id
+      SQL
+    ).distinct.select(:id)

     packages_size = Arel.sql(
       '(SELECT SUM(size) FROM packages_package_files ' \
@@ -147,7 +147,6 @@ The `ModelConfigurationSpec` checks and confirms the addition of new models:
       If you think this model should be included in the export, please add it to `#{Gitlab::ImportExport.config_file}`.

       Definitely add it to `#{File.expand_path(ce_models_yml)}`
-      #{"or `#{File.expand_path(ee_models_yml)}` if the model/associations are EE-specific\n" if ee_models_hash.any?}
       to signal that you've handled this error and to prevent it from showing up in the future.
     MSG
 ```
@@ -253,7 +252,7 @@ Model relationships to be included in the project import/export:
 ```yaml
 project_tree:
   - labels:
-    :priorities
+    - :priorities
   - milestones:
     - events:
       - :push_event_payload
# rubocop:disable Migration/Timestamps
class InitEESchema < ActiveRecord::Migration[4.2]
DOWNTIME = false
def up
add_column :namespaces, :ldap_cn, :string, null: true
add_column :namespaces, :ldap_access, :integer, null: true
create_table :git_hooks do |t|
t.string :force_push_regex
t.string :delete_branch_regex
t.string :commit_message_regex
t.boolean :deny_delete_tag
t.integer :project_id
t.timestamps null: true
end
end
def down
raise ActiveRecord::IrreversibleMigration, "The initial migration is not revertible"
end
end
# rubocop:disable Migration/Timestamps
class CreateAppearances < ActiveRecord::Migration[4.2]
DOWNTIME = false
def change
# GitLab CE may already have created this table, so to preserve
# the upgrade path from CE -> EE we only create this if necessary.
unless table_exists?(:appearances)
create_table :appearances do |t|
t.string :title
t.text :description
t.string :logo
t.integer :updated_by
t.timestamps null: true
end
end
end
end
class AddMrTemplateToProject < ActiveRecord::Migration[4.2]
def change
add_column :projects, :merge_requests_template, :text
end
end
class AddUsernamePasswordApiVersionToServices < ActiveRecord::Migration[4.2]
def change
add_column :services, :username, :string
add_column :services, :password, :string
add_column :services, :api_version, :string
end
end
# rubocop:disable Migration/Datetime
class AddUnsubscribedAtFieldToUsers < ActiveRecord::Migration[4.2]
def change
add_column :users, :admin_email_unsubscribed_at, :datetime
end
end
# rubocop:disable all
class AddJiraIssueTransitionIdToServices < ActiveRecord::Migration[4.2]
def up
add_column :services, :jira_issue_transition_id, :string, default: '2'
Service.reset_column_information
Service.where(jira_issue_transition_id: nil).update_all jira_issue_transition_id: '2'
end
def down
remove_column :services, :jira_issue_transition_id
end
end
# rubocop:disable Migration/Timestamps
class AddLdapGroupsTable < ActiveRecord::Migration[4.2]
DOWNTIME = false
def up
create_table :ldap_groups do |t|
t.string :cn, null: false
t.integer :group_access, null: false
t.references :group, null: false
t.timestamps null: true
end
end
def down
drop_table :ldap_groups
end
end
class RenameLdapGroupToLdapGroupLink < ActiveRecord::Migration[4.2]
def up
rename_table :ldap_groups, :ldap_group_links
# NOTE: we use the old_ methods because the new methods are overloaded
# for backwards compatibility
time = Time.now.strftime('%Y-%m-%d %H:%M:%S')
execute "INSERT INTO ldap_group_links ( group_access, cn, group_id, created_at, updated_at )
SELECT ldap_access, ldap_cn, id, DATE('#{time}'), DATE('#{time}') FROM namespaces
WHERE ldap_cn IS NOT NULL;"
end
def down
rename_table :ldap_group_links, :ldap_groups
end
end
# rubocop:disable Migration/RemoveColumn
class RemoveColumnsForServices < ActiveRecord::Migration[4.2]
def change
remove_column :services, :username, :string
remove_column :services, :password, :string
remove_column :services, :jira_issue_transition_id, :string
remove_column :services, :api_version, :string
end
end
class AddProviderToLdapGroupLinks < ActiveRecord::Migration[4.2]
def change
add_column :ldap_group_links, :provider, :string
end
end
class AddAuthorEmailRegexToGitHook < ActiveRecord::Migration[4.2]
def change
add_column :git_hooks, :author_email_regex, :string
end
end
# rubocop:disable all
class AddMemberCheckToGitHooks < ActiveRecord::Migration[4.2]
def change
add_column :git_hooks, :member_check, :boolean, default: false, null: false
end
end
class AddFileNameRegexToGitHooks < ActiveRecord::Migration[4.2]
def change
add_column :git_hooks, :file_name_regex, :string
end
end
# rubocop:disable all
class AddGroupMembershipLock < ActiveRecord::Migration[4.2]
def change
add_column :namespaces, :membership_lock, :boolean, default: false
end
end
class AddHeaderLogosToAppearances < ActiveRecord::Migration[4.2]
def change
add_column :appearances, :dark_logo, :string
add_column :appearances, :light_logo, :string
end
end
# rubocop:disable Migration/RemoveColumn
class RemoveOldFieldsFromNamespace < ActiveRecord::Migration[4.2]
def up
remove_column :namespaces, :ldap_cn
remove_column :namespaces, :ldap_access
end
def down
add_column :namespaces, :ldap_cn, :string, null: true
add_column :namespaces, :ldap_access, :integer, null: true
end
end
# rubocop:disable all
class AddRebaseSettingToProjects < ActiveRecord::Migration[4.2]
def change
add_column :projects, :merge_requests_rebase_default, :boolean, default: true
end
end
class HelpTextToApplicationSettings < ActiveRecord::Migration[4.2]
def change
add_column :application_settings, :help_text, :text
end
end
class AddGroupIdToWebHooks < ActiveRecord::Migration[4.2]
def change
add_column :web_hooks, :group_id, :integer, after: :project_id
end
end
# rubocop:disable all
class AddIsSampleToGitHooks < ActiveRecord::Migration[4.2]
def change
add_column :git_hooks, :is_sample, :boolean, default: false
end
end
# rubocop:disable Migration/Timestamps
class CreateLicenses < ActiveRecord::Migration[4.2]
DOWNTIME = false
def change
create_table :licenses do |t|
t.text :data, null: false
t.timestamps null: true
end
end
end
# rubocop:disable Migration/Timestamps
class CreateHistoricalData < ActiveRecord::Migration[4.2]
DOWNTIME = false
def change
create_table :historical_data do |t|
t.date :date, null: false
t.integer :active_user_count
t.timestamps null: true
end
end
end
# rubocop:disable all
class AddMaxFileSizeToGitHooks < ActiveRecord::Migration[4.2]
def change
add_column :git_hooks, :max_file_size, :integer, default: 0
end
end
# rubocop:disable Migration/Timestamps
class CreateApproves < ActiveRecord::Migration[4.2]
DOWNTIME = false
def change
create_table :approvals do |t|
t.integer :merge_request_id, null: false
t.integer :user_id, null: false
t.timestamps null: true
end
end
end
# rubocop:disable all
class AddProjectMergeApproves < ActiveRecord::Migration[4.2]
def change
add_column :projects, :approvals_before_merge, :integer, null: false, default: 0
end
end
# rubocop:disable Migration/Timestamps
class AddApproversTable < ActiveRecord::Migration[4.2]
DOWNTIME = false
def change
create_table :approvers do |t|
t.integer :target_id, null: false
t.string :target_type
t.integer :user_id, null: false
t.timestamps null: true
t.index [:target_id, :target_type]
t.index :user_id
end
end
end
# rubocop:disable all
class AddResetApproversToProject < ActiveRecord::Migration[4.2]
def change
add_column :projects, :reset_approvers_on_push, :boolean, default: true
end
end
class RenameResetApprovers < ActiveRecord::Migration[4.2]
def change
rename_column :projects, :reset_approvers_on_push, :reset_approvals_on_push
end
end
class RemoveInvalidApprovers < ActiveRecord::Migration[4.2]
def up
execute("DELETE FROM approvers WHERE user_id = 0")
end
def down
end
end
# rubocop:disable all
class MigrateRebaseFeature < ActiveRecord::Migration[4.2]
def up
execute %q{UPDATE projects SET merge_requests_ff_only_enabled = TRUE WHERE merge_requests_rebase_enabled IS TRUE}
remove_column :projects, :merge_requests_rebase_default
end
def down
add_column :projects, :merge_requests_rebase_default, :boolean, default: true
end
end
class AddIssuesTemplateToProject < ActiveRecord::Migration[4.2]
def change
add_column :projects, :issues_template, :text
end
end
class UpdateGroupLinks < ActiveRecord::Migration[4.2]
def change
provider = quote_string(Gitlab::Auth::LDAP::Config.providers.first)
execute("UPDATE ldap_group_links SET provider = '#{provider}' WHERE provider IS NULL")
end
end
# rubocop:disable all
class AddMirrorToProject < ActiveRecord::Migration[4.2]
def change
add_column :projects, :mirror, :boolean, default: false, null: false
add_column :projects, :mirror_last_update_at, :datetime
add_column :projects, :mirror_last_successful_update_at, :datetime
add_column :projects, :mirror_user_id, :integer
end
end
class CanonicalizeKerberosIdentities < ActiveRecord::Migration[4.2]
# This migration can be performed online without errors.
# It makes sure that all Kerberos identities are in canonical form
# with a realm name (`username` => `username@DEFAULT.REALM`).
# Before this migration, Kerberos identities using the default realm are typically stored
# without the realm part.
def kerberos_default_realm
@kerberos_default_realm ||= begin
require "krb5_auth"
krb5 = ::Krb5Auth::Krb5.new
default_realm = krb5.get_default_realm
krb5.close # release memory allocated by the krb5 library
default_realm || ''
rescue StandardError, LoadError
'' # could not find the system's default realm, maybe there's no Kerberos at all
end
end
def change
reversible do |dir|
# rubocop:disable Cop/AvoidReturnFromBlocks
return unless kerberos_default_realm.present?
dir.up do
# add the default realm to any kerberos identity not having a realm already
execute("UPDATE identities SET extern_uid = CONCAT(extern_uid, '@#{quote_string(kerberos_default_realm)}')
WHERE provider = 'kerberos' AND extern_uid NOT LIKE '%@%'")
end
dir.down do
# remove the realm from kerberos identities using the default realm
execute("UPDATE identities SET extern_uid = REPLACE(extern_uid, '@#{quote_string(kerberos_default_realm)}', '')
WHERE provider = 'kerberos' AND extern_uid LIKE '%@#{quote_string(kerberos_default_realm)}'")
end
end
end
end
class AddNoteToUsers < ActiveRecord::Migration[4.2]
def up
# Column "note" has been added to schema mistakenly (without actual migration),
# and this is why it can exist in some instances.
unless column_exists?(:users, :note)
add_column :users, :note, :text
end
end
def down
end
end
class AddWeightToIssue < ActiveRecord::Migration[4.2]
def change
add_column :issues, :weight, :integer
end
end
class RenameJenkinsService < ActiveRecord::Migration[4.2]
def up
execute "UPDATE services SET type = 'JenkinsDeprecatedService' WHERE type = 'JenkinsService';"
end
def down
execute "UPDATE services SET type = 'JenkinsService' WHERE type = 'JenkinsDeprecatedService';"
end
end
class ChangeMaxFileSizeToNotNullOnGitHooks < ActiveRecord::Migration[4.2]
def change
change_column_null :git_hooks, :max_file_size, false, 0
end
end
class CreateGeoNodes < ActiveRecord::Migration[4.2]
def change
create_table :geo_nodes do |t|
t.string :schema
t.string :host, index: true
t.integer :port
t.string :relative_url_root
t.boolean :primary, index: true
end
end
end
# rubocop:disable all
class AddMirrorTriggerBuildsToProjects < ActiveRecord::Migration[4.2]
def change
add_column :projects, :mirror_trigger_builds, :boolean, default: false, null: false
end
end
# rubocop:disable Migration/Timestamps
class CreateIndexStatuses < ActiveRecord::Migration[4.2]
def change
create_table :index_statuses do |t|
t.integer :project_id, null: false
t.datetime :indexed_at
t.text :note
t.string :last_commit
t.timestamps null: false
end
add_index :index_statuses, :project_id, unique: true
end
end
class UpdateJenkinsServiceCategory < ActiveRecord::Migration[4.2]
def up
category = quote_column_name('category')
type = quote_column_name('type')
execute <<-EOS
UPDATE services
SET #{category} = 'ci'
WHERE #{type} IN (
'JenkinsService',
'JenkinsDeprecatedService'
)
EOS
end
def down
# don't do anything
end
end
class AddGeoNodeKeyToGeoNode < ActiveRecord::Migration[4.2]
def change
change_table :geo_nodes do |t|
t.belongs_to :geo_node_key, index: true
end
end
end
class AddDoorkeeperApplicationToGeoNode < ActiveRecord::Migration[4.2]
def change
change_table :geo_nodes do |t|
t.belongs_to :oauth_application
end
end
end
# rubocop:disable Migration/RemoveColumn
class RenameHeaderFieldOnAppearrance < ActiveRecord::Migration[4.2]
def up
unless column_exists?(:appearances, :header_logo)
rename_column :appearances, :light_logo, :header_logo
end
if column_exists?(:appearances, :dark_logo)
remove_column :appearances, :dark_logo
end
end
def down
rename_column(:appearances, :header_logo, :light_logo)
add_column(:appearances, :dark_logo, :string)
end
end
class AddSecondaryExternUidToIdentities < ActiveRecord::Migration[4.2]
def change
add_column :identities, :secondary_extern_uid, :string
end
end
# rubocop:disable all
class GitHooksProjectIdIndex < ActiveRecord::Migration[4.2]
disable_ddl_transaction!
def change
args = [:git_hooks, :project_id]
if Gitlab::Database.postgresql?
args << { algorithm: :concurrently }
end
add_index(*args)
end
end
# rubocop:disable all
class AddLastSyncTimeToGroups < ActiveRecord::Migration[4.2]
def change
add_column :namespaces, :last_ldap_sync_at, :datetime
add_index :namespaces, :last_ldap_sync_at
end
end
# rubocop:disable Migration/Timestamps
class CreateRemoteMirrorsEE < ActiveRecord::Migration[4.2]
def up
# When moving from CE to EE, remote_mirrors may already exist
return if table_exists?(:remote_mirrors)
create_table :remote_mirrors do |t|
t.references :project, index: true, foreign_key: true
t.string :url
t.boolean :enabled, default: true
t.string :update_status
t.datetime :last_update_at
t.datetime :last_successful_update_at
t.string :last_error
t.text :encrypted_credentials
t.string :encrypted_credentials_iv
t.string :encrypted_credentials_salt
t.timestamps null: false
end
end
def down
drop_table :remote_mirrors if table_exists?(:remote_mirrors)
end
end
class AddSystemHookToGeoNode < ActiveRecord::Migration[4.2]
def change
change_table :geo_nodes do |t|
t.references :system_hook
end
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class MakeRemoteMirrorsDisabledByDefaultEE < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
# When using the methods "add_concurrent_index" or "add_column_with_default"
# you must disable the use of transactions as these methods can not run in an
# existing transaction. When using "add_concurrent_index" make sure that this
# method is the _only_ method called in the migration, any other changes
# should go in a separate migration. This ensures that upon failure _only_ the
# index creation fails and can be retried or reverted easily.
#
# To disable transactions uncomment the following line and remove these
# comments:
# disable_ddl_transaction!
def up
change_column :remote_mirrors, :enabled, :boolean, default: false
end
def down
change_column :remote_mirrors, :enabled, :boolean, default: true
end
end
# rubocop:disable Migration/Timestamps
class CreatePathLocksTable < ActiveRecord::Migration[4.2]
def change
create_table :path_locks do |t|
t.string :path, null: false, index: true
t.references :project, index: true, foreign_key: true
t.references :user, index: true, foreign_key: true
t.timestamps null: false
end
end
end
class AddEsToApplicationSettings < ActiveRecord::Migration[4.2]
def up
add_column :application_settings, :elasticsearch_indexing, :boolean, default: false, null: false
add_column :application_settings, :elasticsearch_search, :boolean, default: false, null: false
add_column :application_settings, :elasticsearch_host, :string, default: 'localhost'
add_column :application_settings, :elasticsearch_port, :string, default: '9200'
es_enabled = Settings.elasticsearch['enabled']
es_host = Settings.elasticsearch['host']
es_host = es_host.join(',') if es_host.is_a?(Array)
es_port = Settings.elasticsearch['port']
execute <<-SQL.strip_heredoc
UPDATE application_settings
SET
elasticsearch_indexing = #{es_enabled},
elasticsearch_search = #{es_enabled},
elasticsearch_host = '#{es_host}',
elasticsearch_port = '#{es_port}'
SQL
end
def down
remove_column :application_settings, :elasticsearch_indexing
remove_column :application_settings, :elasticsearch_search
remove_column :application_settings, :elasticsearch_host
remove_column :application_settings, :elasticsearch_port
end
end
# The RemoveWrongImportUrlFromProjects migration missed setting the mirror flag to
# false when making import_url nil for invalid URIs, which is why we need this migration.
class DisableMirrorWithoutImportUrl < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
def up
execute("UPDATE projects SET mirror = false WHERE projects.mirror = true AND (projects.import_url IS NULL OR projects.import_url = '')")
end
end
class AddApprovalsBeforeMergeToMergeRequests < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
def change
add_column :merge_requests, :approvals_before_merge, :integer
end
end
# See http://doc.gitlab.com/ce/development/migration_style_guide.html
# for more information on how to write migrations for GitLab.
class RenameGitHooksToPushRules < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
# When using the methods "add_concurrent_index" or "add_column_with_default"
# you must disable the use of transactions as these methods can not run in an
# existing transaction. When using "add_concurrent_index" make sure that this
# method is the _only_ method called in the migration, any other changes
# should go in a separate migration. This ensures that upon failure _only_ the
# index creation fails and can be retried or reverted easily.
#
# To disable transactions uncomment the following line and remove these
# comments:
# disable_ddl_transaction!
def change
rename_table :git_hooks, :push_rules
end
end
# Migration type: online without errors (works on previous version and new one)
# rubocop:disable Migration/Datetime
# rubocop:disable Migration/UpdateLargeTable
class AddLdapSyncStateToGroups < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
disable_ddl_transaction!
DOWNTIME = false
def up
add_column_with_default :namespaces, :ldap_sync_status, :string, default: 'ready'
add_column :namespaces, :ldap_sync_error, :string
add_column :namespaces, :ldap_sync_last_update_at, :datetime
add_column :namespaces, :ldap_sync_last_successful_update_at, :datetime
add_column :namespaces, :ldap_sync_last_sync_at, :datetime
end
def down
remove_column :namespaces, :ldap_sync_status
remove_column :namespaces, :ldap_sync_error
remove_column :namespaces, :ldap_sync_last_update_at
remove_column :namespaces, :ldap_sync_last_successful_update_at
remove_column :namespaces, :ldap_sync_last_sync_at
end
end
# Migration type: online without errors (works on previous version and new one)
# rubocop:disable RemoveIndex
class AddLdapSyncStateIndicesToGroups < ActiveRecord::Migration[4.2]
include Gitlab::Database::MigrationHelpers
disable_ddl_transaction!
DOWNTIME = false
def up
add_concurrent_index :namespaces, :ldap_sync_last_update_at
add_concurrent_index :namespaces, :ldap_sync_last_successful_update_at
end
def down
remove_index :namespaces, column: :ldap_sync_last_update_at if index_exists?(:namespaces, :ldap_sync_last_update_at)
remove_index :namespaces, column: :ldap_sync_last_successful_update_at if index_exists?(:namespaces, :ldap_sync_last_successful_update_at)
end
end