Commit a4deb7c4 authored by Clement Ho

Merge branch 'ce-to-ee-2018-06-29' into 'master'

CE upstream - 2018-06-29 18:21 UTC

See merge request gitlab-org/gitlab-ee!6332
parents 1426dfed 9e25def2
@@ -15,6 +15,8 @@
 - [Between the 1st and the 7th](#between-the-1st-and-the-7th)
 - [On the 7th](#on-the-7th)
 - [After the 7th](#after-the-7th)
+- [Regressions](#regressions)
+  - [How to manage a regression](#how-to-manage-a-regression)
 - [Release retrospective and kickoff](#release-retrospective-and-kickoff)
   - [Retrospective](#retrospective)
   - [Kickoff](#kickoff)
@@ -207,7 +209,7 @@ you can ask for an exception to be made.
 Check [this guide](https://gitlab.com/gitlab-org/release/docs/blob/master/general/exception-request/process.md) about how to open an exception request before opening one.

-### Regressions
+## Regressions

 A regression for a particular monthly release is a bug that exists in that
 release, but wasn't present in the release before. This includes bugs in
@@ -225,10 +227,30 @@ month. When we say 'the most recent monthly release', this can refer to either
 the version currently running on GitLab.com, or the most recent version
 available in the package repositories.

-A regression issue should be labeled with the appropriate [subject label](../CONTRIBUTING.md#subject-labels-wiki-container-registry-ldap-api-etc)
-and [team label](../CONTRIBUTING.md#team-labels-ci-discussion-edge-platform-etc),
-just like any other issue, to help GitLab team members focus on issues that are
-relevant to [their area of responsibility](https://about.gitlab.com/handbook/engineering/workflow/#choosing-something-to-work-on).
+### How to manage a regression
+
+Regressions are very important, and they should be considered high-priority
+issues that should be solved as soon as possible, especially if they affect
+users. Despite that, the ~regression label itself does not imply when the issue
+will be scheduled.
+
+When a regression is found:
+
+1. Create an issue describing the problem in the most detailed way possible.
+1. If possible, provide links to real examples and how to reproduce the problem.
+1. Label the issue properly, using the [team label](../CONTRIBUTING.md#team-labels),
+   the [subject label](../CONTRIBUTING.md#subject-labels),
+   and any other label that may apply in the specific case.
+1. Add the ~bug and ~regression labels.
+1. Notify the respective Engineering Manager so they can evaluate the severity of the regression and add a [severity label](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CONTRIBUTING.md#bug-severity-labels). The counterpart Product Manager is included to weigh in on prioritization as needed to set the [priority label](https://gitlab.com/gitlab-org/gitlab-ce/blob/master/CONTRIBUTING.md#bug-priority-labels).
+1. If the regression is of ~S1, ~S2, or ~S3 severity, label it with the current milestone, as it should be fixed in the current milestone.
+1. If the regression was introduced in an RC of the current release, label it with ~Deliverable.
+1. If the regression was introduced in the previous release, label it with ~"Next Patch Release".
+1. If the regression is of ~S4 severity, it may be scheduled for a later milestone at the discretion of the Engineering Manager and Product Manager.
+
+When a new regression is found, work on a fix should start as soon as possible. You can
+ping the Engineering Manager or the Product Manager for the relevant area to
+make them aware of the issue earlier. They will analyze the priority and change
+it if needed.
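The labelling rules in the list above form a small decision procedure. As an illustration only (GitLab does not automate this; triage is done by hand on the issue, and the function below is entirely hypothetical):

```javascript
// Hypothetical sketch of the regression-labelling rules described above.
// Severity S1-S3 implies the current milestone; where the regression was
// introduced decides between ~Deliverable and ~"Next Patch Release".
function regressionLabels({ severity, introducedIn, currentRelease }) {
  const labels = ['bug', 'regression'];

  if (['S1', 'S2', 'S3'].includes(severity)) {
    // High-severity regressions should be fixed in the current milestone.
    labels.push(`milestone:${currentRelease}`);
    if (introducedIn === 'rc') labels.push('Deliverable');
    if (introducedIn === 'previous-release') labels.push('Next Patch Release');
  }

  // S4 regressions may be scheduled later at the managers' discretion,
  // so no milestone label is forced here.
  return labels;
}
```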
## Release retrospective and kickoff
......
@@ -3,6 +3,7 @@ import { mapState, mapGetters, mapActions } from 'vuex';
 import Icon from '~/vue_shared/components/icon.vue';
 import { __ } from '~/locale';
 import createFlash from '~/flash';
+import eventHub from '../../notes/event_hub';
 import LoadingIcon from '../../vue_shared/components/loading_icon.vue';
 import CompareVersions from './compare_versions.vue';
 import ChangedFiles from './changed_files.vue';
@@ -62,7 +63,7 @@ export default {
       plainDiffPath: state => state.diffs.plainDiffPath,
       emailPatchPath: state => state.diffs.emailPatchPath,
     }),
-    ...mapGetters(['isParallelView']),
+    ...mapGetters(['isParallelView', 'isNotesFetched']),
     targetBranch() {
       return {
         branchName: this.targetBranchName,
@@ -94,20 +95,36 @@ export default {
       this.adjustView();
    },
    shouldShow() {
+      // When the shouldShow property changed to true, the route is rendered for the first time
+      // and if we have the isLoading as true this means we didn't fetch the data
+      if (this.isLoading) {
+        this.fetchData();
+      }
+
      this.adjustView();
    },
  },
  mounted() {
    this.setBaseConfig({ endpoint: this.endpoint, projectPath: this.projectPath });
-    this.fetchDiffFiles().catch(() => {
-      createFlash(__('Fetching diff files failed. Please reload the page to try again!'));
-    });
+
+    if (this.shouldShow) {
+      this.fetchData();
+    }
  },
  created() {
    this.adjustView();
  },
  methods: {
    ...mapActions(['setBaseConfig', 'fetchDiffFiles']),
+    fetchData() {
+      this.fetchDiffFiles().catch(() => {
+        createFlash(__('Something went wrong on our end. Please try again!'));
+      });
+
+      if (!this.isNotesFetched) {
+        eventHub.$emit('fetchNotesData');
+      }
+    },
    setActive(filePath) {
      this.activeFile = filePath;
    },
......
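The diffs-app hunks above defer work until the tab is actually shown: `mounted` fetches only when `shouldShow` is already true, a watcher fetches on the first reveal, and an event hub lets the diffs app ask the notes app to fetch once. A framework-free sketch of that lazy-fetch guard, with illustrative names rather than GitLab's real API:

```javascript
// Minimal event hub: $once registers a handler that removes itself
// after the first matching $emit, mirroring Vue's eventHub.$once.
const eventHub = {
  handlers: {},
  $once(event, fn) {
    this.handlers[event] = (...args) => {
      delete this.handlers[event];
      fn(...args);
    };
  },
  $emit(event, ...args) {
    if (this.handlers[event]) this.handlers[event](...args);
  },
};

// Sketch of the component: isLoading doubles as "not fetched yet",
// so a repeated reveal of the tab does not refetch.
function createDiffsApp({ fetchDiffFiles, isNotesFetched }) {
  return {
    isLoading: true,
    fetchData() {
      fetchDiffFiles();
      this.isLoading = false;
      // Only ask the notes app to fetch if it has not done so already.
      if (!isNotesFetched()) eventHub.$emit('fetchNotesData');
    },
    // Body of the `shouldShow` watcher: first reveal triggers the fetch.
    onShouldShowChange() {
      if (this.isLoading) this.fetchData();
    },
  };
}
```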
@@ -66,59 +66,61 @@ export default {
       @click="clearSearch"
     ></i>
   </div>
+  <div class="dropdown-content">
     <ul>
       <li
         v-for="diffFile in filteredDiffFiles"
         :key="diffFile.name"
       >
         <a
           :href="`#${diffFile.fileHash}`"
           :title="diffFile.newPath"
           class="diff-changed-file"
         >
           <icon
             :name="fileChangedIcon(diffFile)"
             :size="16"
             :class="fileChangedClass(diffFile)"
             class="diff-file-changed-icon append-right-8"
           />
           <span class="diff-changed-file-content append-right-8">
             <strong
               v-if="diffFile.blob && diffFile.blob.name"
               class="diff-changed-file-name"
             >
               {{ diffFile.blob.name }}
             </strong>
             <strong
               v-else
               class="diff-changed-blank-file-name"
             >
               {{ s__('Diffs|No file name available') }}
             </strong>
             <span class="diff-changed-file-path prepend-top-5">
               {{ truncatedDiffPath(diffFile.blob.path) }}
             </span>
           </span>
           <span class="diff-changed-stats">
             <span class="cgreen">
               +{{ diffFile.addedLines }}
             </span>
             <span class="cred">
               -{{ diffFile.removedLines }}
             </span>
           </span>
         </a>
       </li>
       <li
         v-show="filteredDiffFiles.length === 0"
         class="dropdown-menu-empty-item"
       >
         <a>
           {{ __('No files found') }}
         </a>
       </li>
     </ul>
+  </div>
 </div>
 </span>
 </template>
@@ -15,10 +15,6 @@ export const setBaseConfig = ({ commit }, options) => {
   commit(types.SET_BASE_CONFIG, { endpoint, projectPath });
 };

-export const setLoadingState = ({ commit }, state) => {
-  commit(types.SET_LOADING, state);
-};
-
 export const fetchDiffFiles = ({ state, commit }) => {
   commit(types.SET_LOADING, true);

@@ -88,7 +84,6 @@ export const expandAllFiles = ({ commit }) => {
 export default {
   setBaseConfig,
-  setLoadingState,
   fetchDiffFiles,
   setInlineDiffViewType,
   setParallelDiffViewType,
......
@@ -24,8 +24,8 @@ export default {
       this.isLoading = true;

-      this.$store
-        .dispatch(this.message.action, this.message.actionPayload)
+      this.message
+        .action(this.message.actionPayload)
         .then(() => {
           this.isLoading = false;
         })
......
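The hunk above changes the error-message component from dispatching a Vuex action by name to invoking a retry callback carried on the message object itself. A sketch of that component logic, with illustrative names (the real component is a Vue SFC; only the click handler is shown):

```javascript
// Sketch: the message object now carries `action` as a callback, so the
// banner can retry without knowing anything about the store.
function createErrorBanner(message) {
  return {
    isLoading: false,
    clickAction() {
      this.isLoading = true;
      return message
        .action(message.actionPayload)
        .then(() => {
          this.isLoading = false;
        });
    },
  };
}
```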
-import Vue from 'vue';
-import VueResource from 'vue-resource';
 import axios from '~/lib/utils/axios_utils';
 import Api from '~/api';

-Vue.use(VueResource);
-
 export default {
-  getTreeData(endpoint) {
-    return Vue.http.get(endpoint, { params: { format: 'json' } });
-  },
   getFileData(endpoint) {
-    return Vue.http.get(endpoint, { params: { format: 'json', viewer: 'none' } });
+    return axios.get(endpoint, {
+      params: { format: 'json', viewer: 'none' },
+    });
   },
   getRawFileData(file) {
     if (file.tempFile) {
@@ -21,7 +16,11 @@ export default {
       return Promise.resolve(file.raw);
     }

-    return Vue.http.get(file.rawPath, { params: { format: 'json' } }).then(res => res.text());
+    return axios
+      .get(file.rawPath, {
+        params: { format: 'json' },
+      })
+      .then(({ data }) => data);
   },
   getBaseRawFileData(file, sha) {
     if (file.tempFile) {
@@ -32,11 +31,11 @@ export default {
       return Promise.resolve(file.baseRaw);
     }

-    return Vue.http
+    return axios
       .get(file.rawPath.replace(`/raw/${file.branchId}/${file.path}`, `/raw/${sha}/${file.path}`), {
         params: { format: 'json' },
       })
-      .then(res => res.text());
+      .then(({ data }) => data);
   },
   getProjectData(namespace, project) {
     return Api.project(`${namespace}/${project}`);
@@ -53,21 +52,9 @@ export default {
   getBranchData(projectId, currentBranchId) {
     return Api.branchSingle(projectId, currentBranchId);
   },
-  createBranch(projectId, payload) {
-    const url = Api.buildUrl(Api.createBranchPath).replace(':id', projectId);
-
-    return Vue.http.post(url, payload);
-  },
   commit(projectId, payload) {
     return Api.commitMultiple(projectId, payload);
   },
-  getTreeLastCommit(endpoint) {
-    return Vue.http.get(endpoint, {
-      params: {
-        format: 'json',
-      },
-    });
-  },
   getFiles(projectUrl, branchId) {
     const url = `${projectUrl}/files/${branchId}`;
     return axios.get(url, { params: { format: 'json' } });
......
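The service hunks above migrate from vue-resource to axios: vue-resource resolved with a response whose body you extracted via `res.text()` (itself a promise), while axios resolves with a response object whose parsed body is already on `data`, so the chain becomes `.then(({ data }) => data)`. A sketch of that shape, where `fakeAxios` is a stand-in for the real client:

```javascript
// Stand-in for axios: a real call would hit the network; this one just
// echoes the request so the `{ data }` response shape is visible.
const fakeAxios = {
  get(url, config) {
    return Promise.resolve({ data: { url, params: config.params } });
  },
};

// Same structure as the migrated getRawFileData above: unsaved files
// short-circuit, everything else goes through the client and unwraps `data`.
function getRawFileData(file) {
  if (file.tempFile) {
    return Promise.resolve(file.raw);
  }

  return fakeAxios
    .get(file.rawPath, { params: { format: 'json' } })
    .then(({ data }) => data);
}
```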
-import { normalizeHeaders } from '~/lib/utils/common_utils';
-import flash from '~/flash';
+import { __ } from '../../../locale';
+import { normalizeHeaders } from '../../../lib/utils/common_utils';
 import eventHub from '../../eventhub';
 import service from '../../services';
 import * as types from '../mutation_types';
@@ -66,13 +66,10 @@ export const getFileData = ({ state, commit, dispatch }, { path, makeFileActive
     .getFileData(
       `${gon.relative_url_root ? gon.relative_url_root : ''}${file.url.replace('/-/', '/')}`,
     )
-    .then(res => {
-      const pageTitle = decodeURI(normalizeHeaders(res.headers)['PAGE-TITLE']);
+    .then(({ data, headers }) => {
+      const normalizedHeaders = normalizeHeaders(headers);

-      setPageTitle(pageTitle);
+      setPageTitle(decodeURI(normalizedHeaders['PAGE-TITLE']));

-      return res.json();
-    })
-    .then(data => {
       commit(types.SET_FILE_DATA, { data, file });
       commit(types.TOGGLE_FILE_OPEN, path);
       if (makeFileActive) dispatch('setFileActive', path);
@@ -80,7 +77,13 @@ export const getFileData = ({ state, commit, dispatch }, { path, makeFileActive
     })
     .catch(() => {
       commit(types.TOGGLE_LOADING, { entry: file });
-      flash('Error loading file data. Please try again.', 'alert', document, null, false, true);
+      dispatch('setErrorMessage', {
+        text: __('An error occured whilst loading the file.'),
+        action: payload =>
+          dispatch('getFileData', payload).then(() => dispatch('setErrorMessage', null)),
+        actionText: __('Please try again'),
+        actionPayload: { path, makeFileActive },
+      });
     });
 };

@@ -88,7 +91,7 @@ export const setFileMrChange = ({ commit }, { file, mrChange }) => {
   commit(types.SET_FILE_MERGE_REQUEST_CHANGE, { file, mrChange });
 };

-export const getRawFileData = ({ state, commit }, { path, baseSha }) => {
+export const getRawFileData = ({ state, commit, dispatch }, { path, baseSha }) => {
   const file = state.entries[path];
   return new Promise((resolve, reject) => {
     service
@@ -113,7 +116,13 @@ export const getRawFileData = ({ state, commit, dispatch }, { path, baseSha }) =
       }
     })
     .catch(() => {
-      flash('Error loading file content. Please try again.');
+      dispatch('setErrorMessage', {
+        text: __('An error occured whilst loading the file content.'),
+        action: payload =>
+          dispatch('getRawFileData', payload).then(() => dispatch('setErrorMessage', null)),
+        actionText: __('Please try again'),
+        actionPayload: { path, baseSha },
+      });
       reject();
     });
 });
......
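The catch handlers above replace one-off `flash` messages with a `setErrorMessage` object whose `action` re-runs the failed operation and clears the error on success. A self-contained sketch of that retry pattern, where `createStore`, its action names, and `shouldFail` are all illustrative stand-ins for the real Vuex store:

```javascript
// Sketch of the retry pattern: on failure, store an error object whose
// `action` callback retries the original dispatch and clears the error
// once it succeeds.
function createStore() {
  const state = { errorMessage: null };

  const actions = {
    setErrorMessage(message) {
      state.errorMessage = message;
    },
    getFileData({ path, shouldFail }) {
      if (shouldFail) {
        // Simulated failure path: record a retryable error, as the
        // real catch handlers above do.
        return Promise.reject(new Error('boom')).catch(() => {
          dispatch('setErrorMessage', {
            text: 'An error occurred whilst loading the file.',
            action: payload =>
              dispatch('getFileData', payload).then(() => dispatch('setErrorMessage', null)),
            actionText: 'Please try again',
            actionPayload: { path, shouldFail: false },
          });
        });
      }
      return Promise.resolve(`contents of ${path}`);
    },
  };

  // Minimal dispatch glue standing in for Vuex.
  function dispatch(name, payload) {
    return Promise.resolve(actions[name](payload));
  }

  return { state, dispatch };
}
```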
-import flash from '~/flash';
+import { __ } from '../../../locale';
 import service from '../../services';
 import * as types from '../mutation_types';

 export const getMergeRequestData = (
-  { commit, state },
+  { commit, dispatch, state },
   { projectId, mergeRequestId, force = false } = {},
 ) =>
   new Promise((resolve, reject) => {
     if (!state.projects[projectId].mergeRequests[mergeRequestId] || force) {
       service
         .getProjectMergeRequestData(projectId, mergeRequestId)
-        .then(res => res.data)
-        .then(data => {
+        .then(({ data }) => {
           commit(types.SET_MERGE_REQUEST, {
             projectPath: projectId,
             mergeRequestId,
@@ -21,7 +20,15 @@ export const getMergeRequestData = (
           resolve(data);
         })
         .catch(() => {
-          flash('Error loading merge request data. Please try again.');
+          dispatch('setErrorMessage', {
+            text: __('An error occured whilst loading the merge request.'),
+            action: payload =>
+              dispatch('getMergeRequestData', payload).then(() =>
+                dispatch('setErrorMessage', null),
+              ),
+            actionText: __('Please try again'),
+            actionPayload: { projectId, mergeRequestId, force },
+          });
           reject(new Error(`Merge Request not loaded ${projectId}`));
         });
     } else {
@@ -30,15 +37,14 @@ export const getMergeRequestData = (
   });

 export const getMergeRequestChanges = (
-  { commit, state },
+  { commit, dispatch, state },
   { projectId, mergeRequestId, force = false } = {},
 ) =>
   new Promise((resolve, reject) => {
     if (!state.projects[projectId].mergeRequests[mergeRequestId].changes.length || force) {
       service
         .getProjectMergeRequestChanges(projectId, mergeRequestId)
-        .then(res => res.data)
-        .then(data => {
+        .then(({ data }) => {
           commit(types.SET_MERGE_REQUEST_CHANGES, {
             projectPath: projectId,
             mergeRequestId,
@@ -47,7 +53,15 @@ export const getMergeRequestChanges = (
           resolve(data);
         })
         .catch(() => {
-          flash('Error loading merge request changes. Please try again.');
+          dispatch('setErrorMessage', {
+            text: __('An error occured whilst loading the merge request changes.'),
+            action: payload =>
+              dispatch('getMergeRequestChanges', payload).then(() =>
+                dispatch('setErrorMessage', null),
+              ),
+            actionText: __('Please try again'),
+            actionPayload: { projectId, mergeRequestId, force },
+          });
           reject(new Error(`Merge Request Changes not loaded ${projectId}`));
         });
     } else {
@@ -56,7 +70,7 @@ export const getMergeRequestChanges = (
   });

 export const getMergeRequestVersions = (
-  { commit, state },
+  { commit, dispatch, state },
   { projectId, mergeRequestId, force = false } = {},
 ) =>
   new Promise((resolve, reject) => {
@@ -73,7 +87,15 @@ export const getMergeRequestVersions = (
           resolve(data);
         })
         .catch(() => {
-          flash('Error loading merge request versions. Please try again.');
+          dispatch('setErrorMessage', {
+            text: __('An error occured whilst loading the merge request version data.'),
+            action: payload =>
+              dispatch('getMergeRequestVersions', payload).then(() =>
+                dispatch('setErrorMessage', null),
+              ),
+            actionText: __('Please try again'),
+            actionPayload: { projectId, mergeRequestId, force },
+          });
           reject(new Error(`Merge Request Versions not loaded ${projectId}`));
         });
     } else {
......
@@ -104,7 +104,7 @@ export const createNewBranchFromDefault = ({ state, dispatch, getters }, branch)
     .catch(() => {
       dispatch('setErrorMessage', {
         text: __('An error occured creating the new branch.'),
-        action: 'createNewBranchFromDefault',
+        action: payload => dispatch('createNewBranchFromDefault', payload),
         actionText: __('Please try again'),
         actionPayload: branch,
       });
@@ -119,7 +119,7 @@ export const showBranchNotFoundError = ({ dispatch }, branchId) => {
       },
       false,
     ),
-    action: payload => dispatch('createNewBranchFromDefault', payload),
     actionText: __('Create branch'),
     actionPayload: branchId,
   });
......
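The two hunks above change `action` from an action-name string to a closure over `dispatch`, so the error object is self-contained and the consumer no longer has to resolve a name against the store. A sketch of the difference, with illustrative names:

```javascript
// Sketch: the error object closes over `dispatch`, so whoever renders it
// can retry by calling `action(actionPayload)` with no store knowledge.
function makeBranchError(dispatch, branchId) {
  return {
    text: `Branch ${branchId} was not found`,
    action: payload => dispatch('createNewBranchFromDefault', payload),
    actionText: 'Create branch',
    actionPayload: branchId,
  };
}
```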
-import { normalizeHeaders } from '~/lib/utils/common_utils';
-import flash from '~/flash';
 import { __ } from '../../../locale';
 import service from '../../services';
 import * as types from '../mutation_types';
-import { findEntry } from '../utils';
 import FilesDecoratorWorker from '../workers/files_decorator_worker';

 export const toggleTreeOpen = ({ commit }, path) => {
@@ -37,32 +34,6 @@ export const handleTreeEntryAction = ({ commit, dispatch }, row) => {
   dispatch('showTreeEntry', row.path);
 };

-export const getLastCommitData = ({ state, commit, dispatch }, tree = state) => {
-  if (!tree || tree.lastCommitPath === null || !tree.lastCommitPath) return;
-
-  service
-    .getTreeLastCommit(tree.lastCommitPath)
-    .then(res => {
-      const lastCommitPath = normalizeHeaders(res.headers)['MORE-LOGS-URL'] || null;
-
-      commit(types.SET_LAST_COMMIT_URL, { tree, url: lastCommitPath });
-
-      return res.json();
-    })
-    .then(data => {
-      data.forEach(lastCommit => {
-        const entry = findEntry(tree.tree, lastCommit.type, lastCommit.file_name);
-
-        if (entry) {
-          commit(types.SET_LAST_COMMIT_DATA, { entry, lastCommit });
-        }
-      });
-
-      dispatch('getLastCommitData', tree);
-    })
-    .catch(() => flash('Error fetching log data.', 'alert', document, null, false, true));
-};
-
 export const getFiles = ({ state, commit, dispatch }, { projectId, branchId } = {}) =>
   new Promise((resolve, reject) => {
     if (
@@ -106,14 +77,13 @@ export const getFiles = ({ state, commit, dispatch }, { projectId, branchId } =
         if (e.response.status === 404) {
           dispatch('showBranchNotFoundError', branchId);
         } else {
-          flash(
-            __('Error loading tree data. Please try again.'),
-            'alert',
-            document,
-            null,
-            false,
-            true,
-          );
+          dispatch('setErrorMessage', {
+            text: __('An error occured whilst loading all the files.'),
+            action: payload =>
+              dispatch('getFiles', payload).then(() => dispatch('setErrorMessage', null)),
+            actionText: __('Please try again'),
+            actionPayload: { projectId, branchId },
+          });
         }

         reject(e);
       });
......
@@ -164,7 +164,7 @@ export default class Job extends LogOutputBehaviours {
   // eslint-disable-next-line class-methods-use-this
   shouldHideSidebarForViewport() {
     const bootstrapBreakpoint = bp.getBreakpointSize();
-    return bootstrapBreakpoint === 'xs' || bootstrapBreakpoint === 'sm';
+    return bootstrapBreakpoint === 'xs';
   }

   toggleSidebar(shouldHide) {
......
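The hunk above (and the matching wiki hunk at the end of this merge) narrows the auto-collapse behaviour: the sidebar now hides itself only on the smallest breakpoint instead of on both `xs` and `sm`. Extracted as a plain predicate for illustration:

```javascript
// After this merge, only the 'xs' breakpoint hides the sidebar;
// 'sm' viewports keep it visible.
function shouldHideSidebarForViewport(breakpointSize) {
  return breakpointSize === 'xs';
}
```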
@@ -45,17 +45,17 @@ export default function initMrNotes() {
       this.updateDiscussionTabCounter();
     },
   },
-  created() {
-    this.setActiveTab(window.mrTabs.getCurrentAction());
-  },
   mounted() {
     this.notesCountBadge = $('.issuable-details').find('.notes-tab .badge');
+    this.setActiveTab(window.mrTabs.getCurrentAction());

-    window.mrTabs.eventHub.$on('MergeRequestTabChange', tab => {
-      this.setActiveTab(tab);
-    });
     $(document).on('visibilitychange', this.updateDiscussionTabCounter);
+    window.mrTabs.eventHub.$on('MergeRequestTabChange', this.setActiveTab);
   },
   beforeDestroy() {
     $(document).off('visibilitychange', this.updateDiscussionTabCounter);
+    window.mrTabs.eventHub.$off('MergeRequestTabChange', this.setActiveTab);
   },
   methods: {
     ...mapActions(['setActiveTab']),
......
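The hunk above swaps an inline arrow function for the method reference `this.setActiveTab` when subscribing, which is what makes the new `$off` in `beforeDestroy` work: an emitter can only remove a listener when handed the exact function that was registered. A minimal emitter sketch (illustrative, not Vue's implementation):

```javascript
// Minimal emitter: $off removes by identity, so only the same function
// reference that was passed to $on can be unsubscribed.
function createEmitter() {
  const listeners = [];
  return {
    $on(fn) { listeners.push(fn); },
    $off(fn) {
      const i = listeners.indexOf(fn);
      if (i !== -1) listeners.splice(i, 1);
    },
    $emit(...args) { listeners.forEach(fn => fn(...args)); },
  };
}
```

An inline arrow like `hub.$on(() => this.setActiveTab(tab))` creates a fresh function each time, so a later `$off` with another arrow never matches and the listener leaks.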
@@ -3,6 +3,7 @@ import { mapGetters, mapActions } from 'vuex';
 import { getLocationHash } from '../../lib/utils/url_utility';
 import Flash from '../../flash';
 import * as constants from '../constants';
+import eventHub from '../event_hub';
 import noteableNote from './noteable_note.vue';
 import noteableDiscussion from './noteable_discussion.vue';
 import systemNote from '../../vue_shared/components/notes/system_note.vue';
@@ -49,7 +50,7 @@ export default {
     };
   },
   computed: {
-    ...mapGetters(['discussions', 'getNotesDataByProp', 'discussionCount']),
+    ...mapGetters(['isNotesFetched', 'discussions', 'getNotesDataByProp', 'discussionCount']),
     noteableType() {
       return this.noteableData.noteableType;
     },
@@ -61,19 +62,30 @@ export default {
           isSkeletonNote: true,
         });
       }

       return this.discussions;
     },
   },
+  watch: {
+    shouldShow() {
+      if (!this.isNotesFetched) {
+        this.fetchNotes();
+      }
+    },
+  },
   created() {
     this.setNotesData(this.notesData);
     this.setNoteableData(this.noteableData);
     this.setUserData(this.userData);
     this.setTargetNoteHash(getLocationHash());
+    eventHub.$once('fetchNotesData', this.fetchNotes);
   },
   mounted() {
-    this.fetchNotes();
+    if (this.shouldShow) {
+      this.fetchNotes();
+    }

     const { parentElement } = this.$el;
     if (parentElement && parentElement.classList.contains('js-vue-notes-event')) {
       parentElement.addEventListener('toggleAward', event => {
         const { awardName, noteId } = event.detail;
@@ -93,6 +105,7 @@ export default {
       setLastFetchedAt: 'setLastFetchedAt',
       setTargetNoteHash: 'setTargetNoteHash',
       toggleDiscussion: 'toggleDiscussion',
+      setNotesFetchedState: 'setNotesFetchedState',
     }),
     getComponentName(discussion) {
       if (discussion.isSkeletonNote) {
@@ -119,11 +132,13 @@ export default {
       })
       .then(() => {
         this.isLoading = false;
+        this.setNotesFetchedState(true);
       })
       .then(() => this.$nextTick())
       .then(() => this.checkLocationHash())
       .catch(() => {
         this.isLoading = false;
+        this.setNotesFetchedState(true);
         Flash('Something went wrong while fetching comments. Please try again.');
       });
     },
@@ -161,11 +176,12 @@ export default {
 <template>
   <div
     v-if="shouldShow"
-    id="notes">
+    id="notes"
+  >
     <ul
       id="notes-list"
-      class="notes main-notes-list timeline">
+      class="notes main-notes-list timeline"
+    >
       <component
         v-for="discussion in allDiscussions"
         :is="getComponentName(discussion)"
......
@@ -28,6 +28,9 @@ export const setInitialNotes = ({ commit }, discussions) =>
 export const setTargetNoteHash = ({ commit }, data) => commit(types.SET_TARGET_NOTE_HASH, data);

+export const setNotesFetchedState = ({ commit }, state) =>
+  commit(types.SET_NOTES_FETCHED_STATE, state);
+
 export const toggleDiscussion = ({ commit }, data) => commit(types.TOGGLE_DISCUSSION, data);

 export const fetchDiscussions = ({ commit }, path) =>
......
@@ -8,6 +8,8 @@ export const targetNoteHash = state => state.targetNoteHash;
 export const getNotesData = state => state.notesData;

+export const isNotesFetched = state => state.isNotesFetched;
+
 export const getNotesDataByProp = state => prop => state.notesData[prop];

 export const getNoteableData = state => state.noteableData;
......
@@ -10,6 +10,7 @@ export default {
   // View layer
   isToggleStateButtonLoading: false,
+  isNotesFetched: false,

   // holds endpoints and permissions provided through haml
   notesData: {
......
...@@ -15,6 +15,7 @@ export const TOGGLE_DISCUSSION = 'TOGGLE_DISCUSSION';
export const UPDATE_NOTE = 'UPDATE_NOTE';
export const UPDATE_DISCUSSION = 'UPDATE_DISCUSSION';
export const SET_DISCUSSION_DIFF_LINES = 'SET_DISCUSSION_DIFF_LINES';
export const SET_NOTES_FETCHED_STATE = 'SET_NOTES_FETCHED_STATE';
// Issue
export const CLOSE_ISSUE = 'CLOSE_ISSUE';
......
...@@ -205,6 +205,10 @@ export default {
    Object.assign(state, { isToggleStateButtonLoading: value });
  },

  [types.SET_NOTES_FETCHED_STATE](state, value) {
    Object.assign(state, { isNotesFetched: value });
  },

  [types.SET_DISCUSSION_DIFF_LINES](state, { discussionId, diffLines }) {
    const discussion = utils.findNoteObjectById(state.discussions, discussionId);
    const index = state.discussions.indexOf(discussion);
......
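The new `isNotesFetched` flag above follows the usual Vuex action → mutation → getter chain. A minimal, framework-free sketch of the same pattern (no Vuex dependency; the tiny `commit` helper is only for illustration):

```javascript
// Minimal sketch of the fetched-state flag pattern; names mirror the
// Vuex types in the hunks above, but this runs standalone without Vuex.
const SET_NOTES_FETCHED_STATE = 'SET_NOTES_FETCHED_STATE';

const state = { isNotesFetched: false };

const mutations = {
  [SET_NOTES_FETCHED_STATE](state, value) {
    Object.assign(state, { isNotesFetched: value });
  },
};

// Action: commits the mutation, as notes/stores/actions.js does.
const setNotesFetchedState = (commit, value) => commit(SET_NOTES_FETCHED_STATE, value);

// Getter mirrors notes/stores/getters.js.
const isNotesFetched = state => state.isNotesFetched;

// Tiny commit helper standing in for the Vuex store plumbing.
const commit = (type, payload) => mutations[type](state, payload);
setNotesFetchedState(commit, true);
```

The flag lets the template defer rendering work until the initial notes request resolves, without each component re-deriving that from the discussions array.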
...@@ -48,7 +48,7 @@ export default class Wikis {
  static sidebarCanCollapse() {
    const bootstrapBreakpoint = bp.getBreakpointSize();
    return bootstrapBreakpoint === 'xs';
  }

  renderSidebar() {
......
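`bp.getBreakpointSize()` is GitLab's internal breakpoint helper; a hypothetical sketch of such a lookup using the standard Bootstrap 4 thresholds (the exact widths GitLab's util uses may differ):

```javascript
// Hypothetical breakpoint lookup; the widths are standard Bootstrap 4
// thresholds, not necessarily the ones GitLab's bp util uses.
function getBreakpointSize(width) {
  if (width < 576) return 'xs';
  if (width < 768) return 'sm';
  if (width < 992) return 'md';
  if (width < 1200) return 'lg';
  return 'xl';
}

// Mirrors the collapse check above: the sidebar auto-collapses only on
// the narrowest screens.
const sidebarCanCollapse = width => getBreakpointSize(width) === 'xs';
```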
...@@ -113,7 +113,7 @@ export default {
  >
    <div
      v-if="currentRequest"
      class="d-flex container-fluid container-limited"
    >
      <div
        id="peek-view-host"
...@@ -179,6 +179,7 @@ export default {
        v-if="currentRequest"
        :current-request="currentRequest"
        :requests="requests"
        class="ml-auto"
        @change-current-request="changeCurrentRequest"
      />
    </div>
......
...@@ -35,10 +35,7 @@ export default {
};
</script>
<template>
  <div id="peek-request-selector">
    <select v-model="currentRequestId">
      <option
        v-for="request in requests"
......
<script>
export default {
  name: 'PipelinesSvgState',
  props: {
    svgPath: {
      type: String,
      required: true,
    },
    message: {
      type: String,
      required: true,
    },
  },
};
</script>
<template>
......
<script>
export default {
  name: 'PipelinesEmptyState',
  props: {
    helpPagePath: {
      type: String,
      required: true,
    },
    emptyStateSvgPath: {
      type: String,
      required: true,
    },
    canSetCi: {
      type: Boolean,
      required: true,
    },
  },
};
</script>
<template>
  <div class="row empty-state js-empty-state">
......
...@@ -41,7 +41,6 @@ export default {
      type: String,
      required: true,
    },
  },
  data() {
    return {
...@@ -67,7 +66,8 @@ export default {
      this.isDisabled = true;

      axios
        .post(`${this.link}.json`)
        .then(() => {
          this.isDisabled = false;
          this.$emit('pipelineActionRequestComplete');
......
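The reformatted `axios.post` chain above disables the button for the duration of the request. The same disable-while-posting pattern, sketched without axios or Vue (a stubbed `post` and a plain object stand in for the HTTP call and the component instance; re-enabling on failure is an assumption here, since the hunk does not show the catch branch):

```javascript
// Disable-while-posting sketch: `ctx` stands in for the component instance
// and `post` for axios.post returning a promise.
function postWithDisabledState(ctx, post) {
  ctx.isDisabled = true; // block double submits while the request is in flight

  return post()
    .then(() => {
      ctx.isDisabled = false;
      ctx.completed = true; // stands in for $emit('pipelineActionRequestComplete')
    })
    .catch(() => {
      ctx.isDisabled = false; // assumption: re-enable on failure too
    });
}
```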
<script>
import ciIcon from '../../../vue_shared/components/ci_icon.vue';
/**
 * Component that renders both the CI icon status and the job name.
 * Used in
 * - Badge component
 * - Dropdown badge components
 */
export default {
  components: {
    ciIcon,
  },
  props: {
    name: {
      type: String,
      required: true,
    },
    status: {
      type: Object,
      required: true,
    },
  },
};
</script>
<template>
  <span class="ci-job-name-component">
......
<script>
import ciHeader from '../../vue_shared/components/header_ci_component.vue';
import eventHub from '../event_hub';
import loadingIcon from '../../vue_shared/components/loading_icon.vue';

export default {
  name: 'PipelineHeaderSection',
  components: {
    ciHeader,
    loadingIcon,
  },
  props: {
    pipeline: {
      type: Object,
      required: true,
    },
    isLoading: {
      type: Boolean,
      required: true,
    },
  },
  data() {
    return {
      actions: this.getActions(),
    };
  },
  computed: {
    status() {
      return this.pipeline.details && this.pipeline.details.status;
    },
    shouldRenderContent() {
      return !this.isLoading && Object.keys(this.pipeline).length;
    },
  },
  watch: {
    pipeline() {
      this.actions = this.getActions();
    },
  },
  methods: {
    postAction(action) {
      const index = this.actions.indexOf(action);
      this.$set(this.actions[index], 'isLoading', true);
      eventHub.$emit('headerPostAction', action);
    },
    getActions() {
      const actions = [];
      if (this.pipeline.retry_path) {
        actions.push({
          label: 'Retry',
          path: this.pipeline.retry_path,
          cssClass: 'js-retry-button btn btn-inverted-secondary',
          type: 'button',
          isLoading: false,
        });
      }
      if (this.pipeline.cancel_path) {
        actions.push({
          label: 'Cancel running',
          path: this.pipeline.cancel_path,
          cssClass: 'js-btn-cancel-pipeline btn btn-danger',
          type: 'button',
          isLoading: false,
        });
      }
      return actions;
    },
  },
};
</script>
<template>
  <div class="pipeline-header-container">
......
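`getActions` above derives the header buttons from whichever action paths the pipeline payload exposes. Stripped of component state and CSS concerns, the same derivation:

```javascript
// Build the action list from the pipeline payload; only the fields needed
// to show which buttons appear are kept from the component above.
function getActions(pipeline) {
  const actions = [];

  if (pipeline.retry_path) {
    actions.push({ label: 'Retry', path: pipeline.retry_path, isLoading: false });
  }

  if (pipeline.cancel_path) {
    actions.push({ label: 'Cancel running', path: pipeline.cancel_path, isLoading: false });
  }

  return actions;
}
```

Because the list is recomputed in the `pipeline()` watcher, the buttons track whatever the latest poll response says the pipeline can do.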
<script>
import LoadingButton from '../../vue_shared/components/loading_button.vue';

export default {
  name: 'PipelineNavControls',
  components: {
    LoadingButton,
  },
  props: {
    newPipelinePath: {
      type: String,
      required: false,
      default: null,
    },
    resetCachePath: {
      type: String,
      required: false,
      default: null,
    },
    ciLintPath: {
      type: String,
      required: false,
      default: null,
    },
    isResetCacheButtonLoading: {
      type: Boolean,
      required: false,
      default: false,
    },
  },
  methods: {
    onClickResetCache() {
      this.$emit('resetRunnersCache', this.resetCachePath);
    },
  },
};
</script>
<template>
  <div class="nav-controls">
......
<script>
import userAvatarLink from '../../vue_shared/components/user_avatar/user_avatar_link.vue';
import tooltip from '../../vue_shared/directives/tooltip';
import popover from '../../vue_shared/directives/popover';

export default {
  components: {
    userAvatarLink,
  },
  directives: {
    tooltip,
    popover,
  },
  props: {
    pipeline: {
      type: Object,
      required: true,
    },
    autoDevopsHelpPath: {
      type: String,
      required: true,
    },
  },
  computed: {
    user() {
      return this.pipeline.user;
    },
    popoverOptions() {
      return {
        html: true,
        trigger: 'focus',
        placement: 'top',
        title: `<div class="autodevops-title">
          This pipeline makes use of a predefined CI/CD configuration enabled by <b>Auto DevOps.</b>
        </div>`,
        content: `<a
          class="autodevops-link"
          href="${this.autoDevopsHelpPath}"
          target="_blank"
          rel="noopener noreferrer nofollow">
          Learn more about Auto DevOps
        </a>`,
      };
    },
  },
};
</script>
<template>
  <div class="table-section section-15 d-none d-sm-none d-md-block pipeline-tags">
......
<script>
import eventHub from '../event_hub';
import loadingIcon from '../../vue_shared/components/loading_icon.vue';
import icon from '../../vue_shared/components/icon.vue';
import tooltip from '../../vue_shared/directives/tooltip';

export default {
  directives: {
    tooltip,
  },
  components: {
    loadingIcon,
    icon,
  },
  props: {
    actions: {
      type: Array,
      required: true,
    },
  },
  data() {
    return {
      isLoading: false,
    };
  },
  methods: {
    onClickAction(endpoint) {
      this.isLoading = true;
      eventHub.$emit('postAction', endpoint);
    },
    isActionDisabled(action) {
      if (action.playable === undefined) {
        return false;
      }
      return !action.playable;
    },
  },
};
</script>
<template>
  <div class="btn-group">
......
<script>
import tooltip from '../../vue_shared/directives/tooltip';
import icon from '../../vue_shared/components/icon.vue';

export default {
  directives: {
    tooltip,
  },
  components: {
    icon,
  },
  props: {
    artifacts: {
      type: Array,
      required: true,
    },
  },
};
</script>
<template>
  <div
......
<script>
import Modal from '~/vue_shared/components/gl_modal.vue';
import { s__, sprintf } from '~/locale';
import PipelinesTableRowComponent from './pipelines_table_row.vue';
import eventHub from '../event_hub';

/**
 * Pipelines Table Component.
 *
 * Given an array of objects, renders a table.
 */
export default {
  components: {
    PipelinesTableRowComponent,
    Modal,
  },
  props: {
    pipelines: {
      type: Array,
      required: true,
    },
    updateGraphDropdown: {
      type: Boolean,
      required: false,
      default: false,
    },
    autoDevopsHelpPath: {
      type: String,
      required: true,
    },
    viewType: {
      type: String,
      required: true,
    },
  },
  data() {
    return {
      pipelineId: '',
      endpoint: '',
      cancelingPipeline: null,
    };
  },
  computed: {
    modalTitle() {
      return sprintf(
        s__('Pipeline|Stop pipeline #%{pipelineId}?'),
        {
          pipelineId: `${this.pipelineId}`,
        },
        false,
      );
    },
    modalText() {
      return sprintf(
        s__('Pipeline|You’re about to stop pipeline %{pipelineId}.'),
        {
          pipelineId: `<strong>#${this.pipelineId}</strong>`,
        },
        false,
      );
    },
  },
  created() {
    eventHub.$on('openConfirmationModal', this.setModalData);
  },
  beforeDestroy() {
    eventHub.$off('openConfirmationModal', this.setModalData);
  },
  methods: {
    setModalData(data) {
      this.pipelineId = data.pipelineId;
      this.endpoint = data.endpoint;
    },
    onSubmit() {
      eventHub.$emit('postAction', this.endpoint);
      this.cancelingPipeline = this.pipelineId;
    },
  },
};
</script>
<template>
  <div class="ci-table">
......
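`sprintf` from `~/locale` substitutes `%{name}` placeholders, and the `false` third argument skips HTML-escaping so the `<strong>` markup in `modalText` survives. A simplified stand-in for the placeholder substitution alone (the real helper also handles escaping and works with the gettext pipeline):

```javascript
// Simplified %{name} substitution; split/join avoids regex-escaping issues.
// The real ~/locale sprintf also HTML-escapes values unless told not to.
function sprintf(template, params) {
  return Object.keys(params).reduce(
    (out, key) => out.split(`%{${key}}`).join(params[key]),
    template,
  );
}
```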
<script>
import iconTimerSvg from 'icons/_icon_timer.svg';
import '../../lib/utils/datetime_utility';
import tooltip from '../../vue_shared/directives/tooltip';
import timeagoMixin from '../../vue_shared/mixins/timeago';

export default {
  directives: {
    tooltip,
  },
  mixins: [timeagoMixin],
  props: {
    finishedTime: {
      type: String,
      required: true,
    },
    duration: {
      type: Number,
      required: true,
    },
  },
  data() {
    return {
      iconTimerSvg,
    };
  },
  computed: {
    hasDuration() {
      return this.duration > 0;
    },
    hasFinishedTime() {
      return this.finishedTime !== '';
    },
    durationFormated() {
      const date = new Date(this.duration * 1000);

      let hh = date.getUTCHours();
      let mm = date.getUTCMinutes();
      let ss = date.getSeconds();

      // left pad
      if (hh < 10) {
        hh = `0${hh}`;
      }
      if (mm < 10) {
        mm = `0${mm}`;
      }
      if (ss < 10) {
        ss = `0${ss}`;
      }

      return `${hh}:${mm}:${ss}`;
    },
  },
};
</script>
<template>
  <div class="table-section section-15 pipelines-time-ago">
......
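`durationFormated` converts a duration in seconds to zero-padded `HH:MM:SS` by leaning on a UTC `Date`. The same logic as a standalone function (using `getUTCSeconds` throughout, where the component mixes in `getSeconds`):

```javascript
// Seconds -> zero-padded HH:MM:SS; valid for durations under 24 hours,
// since getUTCHours wraps at midnight.
function formatDuration(seconds) {
  const date = new Date(seconds * 1000);
  const pad = n => (n < 10 ? `0${n}` : `${n}`);

  return `${pad(date.getUTCHours())}:${pad(date.getUTCMinutes())}:${pad(date.getUTCSeconds())}`;
}
```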
...@@ -75,8 +75,7 @@ export default {
      // Stop polling
      this.poll.stop();

      // Update the table
      return this.getPipelines().then(() => this.poll.restart());
    },
    fetchPipelines() {
      if (!this.isMakingRequest) {
...@@ -86,9 +85,10 @@ export default {
      }
    },
    getPipelines() {
      return this.service
        .getPipelines(this.requestData)
        .then(response => this.successCallback(response))
        .catch(error => this.errorCallback(error));
    },
    setCommonData(pipelines) {
      this.store.storePipelines(pipelines);
...@@ -118,7 +118,8 @@ export default {
      }
    },
    postAction(endpoint) {
      this.service
        .postAction(endpoint)
        .then(() => this.fetchPipelines())
        .catch(() => Flash(__('An error occurred while making the request.')));
    },
......
...@@ -52,7 +52,8 @@ export default class pipelinesMediator {
  refreshPipeline() {
    this.poll.stop();

    return this.service
      .getPipeline()
      .then(response => this.successCallback(response))
      .catch(() => this.errorCallback())
      .finally(() => this.poll.restart());
......
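Both call sites above follow the same stop → fetch → restart shape around the shared poll object. Sketched with a stubbed poll (`Promise.prototype.finally` needs Node 10+ or a modern browser; the mediator relies on it so polling resumes even when the manual fetch fails):

```javascript
// Pause polling during a manual refresh and always resume it afterwards.
function refresh(poll, getPipeline) {
  poll.stop();

  return getPipeline()
    .catch(() => {}) // errorCallback stand-in; failures must not keep polling stopped
    .finally(() => poll.restart());
}
```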
...@@ -243,3 +243,15 @@ label {
    }
  }
}
.input-icon-wrapper {
  position: relative;

  .input-icon-right {
    position: absolute;
    right: 0.8em;
    top: 50%;
    transform: translateY(-50%);
    color: $theme-gray-600;
  }
}
...@@ -234,7 +234,7 @@ $md-area-border: #ddd;
/*
 * Code
 */
$code_font_size: 90%;
$code_line_height: 1.6;
/*
......
...@@ -737,6 +737,10 @@
  max-width: 560px;
  width: 100%;
  z-index: 150;
  min-height: $dropdown-min-height;
  max-height: $dropdown-max-height;
  overflow-y: auto;
  margin-bottom: 0;

  @include media-breakpoint-up(sm) {
    left: $gl-padding;
......
...@@ -296,25 +296,12 @@
  }
}

.modal-doorkeepr-auth,
.doorkeeper-app-form {
  .scope-description {
    color: $theme-gray-700;
  }
}

.modal-doorkeepr-auth {
  .modal-body {
    padding: $gl-padding;
  }
}

.doorkeeper-app-form {
  .scope-description {
    margin: 0 0 5px 17px;
  }
}

.deprecated-service {
  cursor: default;
}
......
...@@ -7,7 +7,6 @@
  top: 0;
  width: 100%;
  z-index: 2000;
  overflow-x: hidden;
  height: $performance-bar-height;
  background: $black;
...@@ -82,7 +81,7 @@
  .view {
    margin-right: 15px;
    flex-shrink: 0;

    &:last-child {
      margin-right: 0;
......
...@@ -55,7 +55,7 @@ class Issue < ActiveRecord::Base
  scope :unassigned, -> { where('NOT EXISTS (SELECT TRUE FROM issue_assignees WHERE issue_id = issues.id)') }
  scope :assigned_to, ->(u) { where('EXISTS (SELECT TRUE FROM issue_assignees WHERE user_id = ? AND issue_id = issues.id)', u.id) }

  scope :with_due_date, -> { where.not(due_date: nil) }
  scope :without_due_date, -> { where(due_date: nil) }
  scope :due_before, ->(date) { where('issues.due_date < ?', date) }
  scope :due_between, ->(from_date, to_date) { where('issues.due_date >= ?', from_date).where('issues.due_date <= ?', to_date) }
...@@ -63,7 +63,7 @@ class Issue < ActiveRecord::Base
  scope :order_due_date_asc, -> { reorder('issues.due_date IS NULL, issues.due_date ASC') }
  scope :order_due_date_desc, -> { reorder('issues.due_date IS NULL, issues.due_date DESC') }
  scope :order_closest_future_date, -> { reorder('CASE WHEN issues.due_date >= CURRENT_DATE THEN 0 ELSE 1 END ASC, ABS(CURRENT_DATE - issues.due_date) ASC') }

  scope :preload_associations, -> { preload(:labels, project: :namespace) }
......
...@@ -7,10 +7,10 @@
  - values = Gitlab::Auth::OAuth::Provider.providers.map { |name| ["#{Gitlab::Auth::OAuth::Provider.label_for(name)} (#{name})", name] }
  = f.select :provider, values, { allow_blank: false }, class: 'form-control'
.form-group.row
  = f.label :extern_uid, _("Identifier"), class: 'col-form-label col-sm-2'
  .col-sm-10
    = f.text_field :extern_uid, class: 'form-control', required: true
.form-actions
  = f.submit _('Save changes'), class: "btn btn-save"
...@@ -5,8 +5,8 @@
    = identity.extern_uid
  %td
    = link_to edit_admin_user_identity_path(@user, identity), class: 'btn btn-sm btn-grouped' do
      = _("Edit")
    = link_to [:admin, @user, identity], method: :delete,
      class: 'btn btn-sm btn-danger',
      data: { confirm: _("Are you sure you want to remove this identity?") } do
      = _('Delete')

- page_title _("Edit"), @identity.provider, _("Identities"), @user.name, _("Users")
%h3.page-title
  = _('Edit identity for %{user_name}') % { user_name: @user.name }
%hr
= render 'form'

- page_title _("Identities"), @user.name, _("Users")
= render 'admin/users/head'

= link_to _('New identity'), new_admin_user_identity_path, class: 'float-right btn btn-new'

- if @identities.present?
  .table-holder
    %table.table
      %thead
        %tr
          %th= _('Provider')
          %th= _('Identifier')
          %th
      = render @identities
- else
  %h4= _('This user has no identities')

- page_title _("New Identity")
%h3.page-title= _('New identity')
%hr
= render 'form'
...@@ -9,13 +9,17 @@
  = form_errors(token)

  .row
    .form-group.col-md-6
      = f.label :name, class: 'label-light'
      = f.text_field :name, class: "form-control", required: true

  .row
    .form-group.col-md-6
      = f.label :expires_at, class: 'label-light'
      .input-icon-wrapper
        = f.text_field :expires_at, class: "datepicker form-control", placeholder: 'YYYY-MM-DD'
        = icon('calendar', { class: 'input-icon-right' })

  .form-group
    = f.label :scopes, class: 'label-light'
......
...@@ -3,7 +3,7 @@
- token = local_assigns.fetch(:token)

- scopes.each do |scope|
  %fieldset.form-group.form-check
    = check_box_tag "#{prefix}[scopes][]", scope, token.scopes.include?(scope), id: "#{prefix}_scopes_#{scope}", class: 'form-check-input'
    = label_tag ("#{prefix}_scopes_#{scope}"), scope, class: 'label-light form-check-label'
    .text-secondary= t scope, scope: [:doorkeeper, :scope_desc]
---
title: Removes the environment scope field for users that cannot edit it
merge_request: 19643
author:
type: changed
---
title: Fix ambiguous due_date column for Issue scopes
merge_request:
author:
type: fixed
---
title: Fix sidebar collapse breakpoints for job and wiki pages
merge_request:
author:
type: fixed
---
title: Fix performance bar modal visibility in Safari
merge_request:
author:
type: fixed
---
title: Fix size of code blocks in headings
merge_request:
author:
type: fixed
---
title: Fully migrate pipeline stages position
merge_request: 19369
author:
type: performance
---
title: Always serve favicon from main GitLab domain so that CI badge can be drawn
over it
merge_request:
author:
type: fixed
---
title: Fix OAuth Application Authorization screen to appear with each access
merge_request: 20216
author:
type: fixed
---
title: Enable Doorkeeper option to avoid generating new tokens when users login via
oauth
merge_request: 20200
author:
type: fixed
---
title: Schedule workers to delete non-latest diffs in post-migration
merge_request:
author:
type: other
---
title: Rails5 fix MySQL milliseconds problem in specs
merge_request: 20221
author: Jasper Maes
type: fixed
---
title: Rails5 fix Mysql comparison failure caused by milliseconds problem
merge_request: 20222
author: Jasper Maes
type: fixed
---
title: Add transfer project API endpoint
merge_request: 20122
author: Aram Visser
type: added
...@@ -37,7 +37,7 @@ Doorkeeper.configure do
  # Reuse access token for the same resource owner within an application (disabled by default)
  # Rationale: https://github.com/doorkeeper-gem/doorkeeper/issues/383
  reuse_access_token

  # Issue access tokens with refresh token (disabled by default)
  use_refresh_token
...@@ -106,3 +106,53 @@ Doorkeeper.configure do
  base_controller '::Gitlab::BaseDoorkeeperController'
end
# Monkey patch to avoid creating new applications if the scope of the
# app created does not match the complete list of scopes of the configured app.
# It also prevents the OAuth authorize application window to appear every time.
# Remove after we upgrade the doorkeeper gem from version 4.3.2
if Doorkeeper.gem_version > Gem::Version.new('4.3.2')
  raise "Doorkeeper was upgraded, please remove the monkey patch in #{__FILE__}"
end

module Doorkeeper
  module AccessTokenMixin
    module ClassMethods
      def matching_token_for(application, resource_owner_or_id, scopes)
        resource_owner_id =
          if resource_owner_or_id.respond_to?(:to_key)
            resource_owner_or_id.id
          else
            resource_owner_or_id
          end

        tokens = authorized_tokens_for(application.try(:id), resource_owner_id)
        tokens.detect do |token|
          scopes_match?(token.scopes, scopes, application.try(:scopes))
        end
      end

      def scopes_match?(token_scopes, param_scopes, app_scopes)
        return true if token_scopes.empty? && param_scopes.empty?

        (token_scopes.sort == param_scopes.sort) &&
          Doorkeeper::OAuth::Helpers::ScopeChecker.valid?(
            param_scopes.to_s,
            Doorkeeper.configuration.scopes,
            app_scopes)
      end

      def authorized_tokens_for(application_id, resource_owner_id)
        ordered_by(:created_at, :desc)
          .where(application_id: application_id,
                 resource_owner_id: resource_owner_id,
                 revoked_at: nil)
      end

      def last_authorized_token_for(application_id, resource_owner_id)
        authorized_tokens_for(application_id, resource_owner_id).first
      end
    end
  end
end
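The heart of the patch is `scopes_match?`: an existing token is reused only when its scopes are exactly the requested ones, order aside, so the authorize screen stops reappearing on every login. That comparison, sketched in JavaScript (the Ruby version additionally validates the requested scopes against the application's configured scopes via `ScopeChecker`):

```javascript
// Order-insensitive scope comparison, mirroring scopes_match? above.
function scopesMatch(tokenScopes, paramScopes) {
  if (tokenScopes.length === 0 && paramScopes.length === 0) return true;

  const normalize = scopes => [...scopes].sort().join(' ');
  return normalize(tokenScopes) === normalize(paramScopes);
}
```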
class CleanupStagesPositionMigration < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
DOWNTIME = false
TMP_INDEX_NAME = 'tmp_id_stage_position_partial_null_index'.freeze
disable_ddl_transaction!
class Stages < ActiveRecord::Base
include EachBatch
self.table_name = 'ci_stages'
end
def up
disable_statement_timeout
Gitlab::BackgroundMigration.steal('MigrateStageIndex')
unless index_exists_by_name?(:ci_stages, TMP_INDEX_NAME)
add_concurrent_index(:ci_stages, :id, where: 'position IS NULL', name: TMP_INDEX_NAME)
end
migratable = <<~SQL
position IS NULL AND EXISTS (
SELECT 1 FROM ci_builds WHERE stage_id = ci_stages.id AND stage_idx IS NOT NULL
)
SQL
Stages.where(migratable).each_batch(of: 1000) do |batch|
batch.pluck(:id).each do |stage|
Gitlab::BackgroundMigration::MigrateStageIndex.new.perform(stage, stage)
end
end
remove_concurrent_index_by_name(:ci_stages, TMP_INDEX_NAME)
end
def down
if index_exists_by_name?(:ci_stages, TMP_INDEX_NAME)
remove_concurrent_index_by_name(:ci_stages, TMP_INDEX_NAME)
end
end
end
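The migration above walks `ci_stages` with `each_batch` in id-ordered slices of 1,000 rows, so each background-migration call touches a bounded set of rows. A plain-Ruby sketch of the slicing idea, with an array standing in for the ActiveRecord relation:

```ruby
# Plain-Ruby sketch of batched iteration: process ids in fixed-size slices so
# no single step holds too many rows. ActiveRecord's EachBatch does this with
# keyset pagination on the primary key instead of loading all ids up front.
def in_batches(ids, of:)
  ids.each_slice(of) { |batch| yield batch }
end

sizes = []
in_batches((1..2500).to_a, of: 1000) { |batch| sizes << batch.size }
sizes # => [1000, 1000, 500]
```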
class EnqueueDeleteDiffFilesWorkers < ActiveRecord::Migration
include Gitlab::Database::MigrationHelpers
class MergeRequestDiff < ActiveRecord::Base
self.table_name = 'merge_request_diffs'
belongs_to :merge_request
include EachBatch
end
DOWNTIME = false
BATCH_SIZE = 1000
MIGRATION = 'DeleteDiffFiles'
DELAY_INTERVAL = 8.minutes
TMP_INDEX = 'tmp_partial_diff_id_with_files_index'.freeze
disable_ddl_transaction!
def up
# We add a temporary index to make iteration over the batches more
# performant. The conditional avoids having to create the index in a
# separate migration file and keeps this operation idempotent.
#
unless index_exists_by_name?(:merge_request_diffs, TMP_INDEX)
add_concurrent_index(:merge_request_diffs, :id, where: "(state NOT IN ('without_files', 'empty'))", name: TMP_INDEX)
end
diffs_with_files = MergeRequestDiff.where.not(state: ['without_files', 'empty'])
# explain (analyze, buffers) example for the iteration:
#
# Index Only Scan using tmp_index_20013 on merge_request_diffs (cost=0.43..1630.19 rows=60567 width=4) (actual time=0.047..9.572 rows=56976 loops=1)
# Index Cond: ((id >= 764586) AND (id < 835298))
# Heap Fetches: 8
# Buffers: shared hit=18188
# Planning time: 0.752 ms
# Execution time: 12.430 ms
#
diffs_with_files.each_batch(of: BATCH_SIZE) do |relation, outer_index|
ids = relation.pluck(:id)
ids.each_with_index do |diff_id, inner_index|
# This will give some space between batches of workers.
interval = DELAY_INTERVAL * outer_index + inner_index.minutes
# A single `merge_request_diff` can be associated with way too many
# `merge_request_diff_files`. It's better to avoid batching these and
# schedule one at a time.
#
# Considering roughly 6M jobs, this should take ~30 days to process all
# of them.
#
BackgroundMigrationWorker.perform_in(interval, MIGRATION, [diff_id])
end
end
# We remove the temporary index because it is not required during normal
# runtime operation.
#
remove_concurrent_index_by_name(:merge_request_diffs, TMP_INDEX)
end
def down
if index_exists_by_name?(:merge_request_diffs, TMP_INDEX)
remove_concurrent_index_by_name(:merge_request_diffs, TMP_INDEX)
end
end
end
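The `interval` computed above staggers the workers: each batch starts `DELAY_INTERVAL` (8 minutes) after the previous one, and jobs within a batch are spread one minute apart. In plain minutes, with integers standing in for ActiveSupport durations:

```ruby
# Sketch of the delay calculation: outer_index counts batches (each_batch
# yields 1-based indexes), inner_index counts jobs within a batch.
DELAY_INTERVAL_MINUTES = 8

def delay_in_minutes(outer_index, inner_index)
  DELAY_INTERVAL_MINUTES * outer_index + inner_index
end

delay_in_minutes(1, 0) # => 8   (first job of the first batch)
delay_in_minutes(2, 3) # => 19  (fourth job of the second batch)
```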
...@@ -1502,6 +1502,16 @@ DELETE /projects/:id/push_rule
| --------- | ---- | -------- | ----------- |
| `id` | integer/string | yes | The ID or [URL-encoded path of the project](README.md#namespaced-path-encoding) |
### Transfer a project to a new namespace
```
PUT /projects/:id/transfer
```
| Attribute | Type | Required | Description |
| --------- | ---- | -------- | ----------- |
| `namespace` | integer/string | yes | The ID or path of the namespace to transfer the project to |
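Calling the new endpoint from Ruby might look like the following sketch; the host, project ID, access token, and target namespace are placeholders to substitute with your own values:

```ruby
require 'net/http'
require 'uri'

# Hypothetical values: replace the GitLab host, project ID, token, and
# namespace before running.
uri = URI('https://gitlab.example.com/api/v4/projects/42/transfer')

request = Net::HTTP::Put.new(uri)
request['PRIVATE-TOKEN'] = '<your_access_token>'
request.set_form_data('namespace' => 'new-group')

# Uncomment to actually send the request:
# Net::HTTP.start(uri.host, uri.port, use_ssl: true) { |http| http.request(request) }
```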
## Branches ## Branches
Read more in the [Branches](branches.md) documentation. Read more in the [Branches](branches.md) documentation.
......
...@@ -32,7 +32,8 @@ with all their related data and be moved into a new GitLab instance.
| GitLab version | Import/Export version |
| ---------------- | --------------------- |
| 11.1 to current | 0.2.4 |
| 10.8 | 0.2.3 |
| 10.4 | 0.2.2 |
| 10.3 | 0.2.1 |
| 10.0 | 0.2.0 |
......
...@@ -466,6 +466,23 @@ module API
        conflict!(error.message)
      end
    end
desc 'Transfer a project to a new namespace'
params do
requires :namespace, type: String, desc: 'The ID or path of the new namespace'
end
put ":id/transfer" do
authorize! :change_namespace, user_project
namespace = find_namespace!(params[:namespace])
result = ::Projects::TransferService.new(user_project, current_user).execute(namespace)
if result
present user_project, with: Entities::Project
else
render_api_error!("Failed to transfer project #{user_project.errors.messages}", 400)
end
end
    end
  end
end
...@@ -45,11 +45,7 @@ module Banzai
      def self.transform_context(context)
        context[:only_path] = true unless context.key?(:only_path)

        context
      end
    end
  end
end
......
# frozen_string_literal: true
# rubocop:disable Metrics/AbcSize
# rubocop:disable Style/Documentation
module Gitlab
module BackgroundMigration
class DeleteDiffFiles
def perform(merge_request_diff_id)
merge_request_diff = MergeRequestDiff.find_by(id: merge_request_diff_id)
return unless merge_request_diff
return unless should_delete_diff_files?(merge_request_diff)
MergeRequestDiff.transaction do
merge_request_diff.update_column(:state, 'without_files')
# explain (analyze, buffers) when deleting 453 diff files:
#
# Delete on merge_request_diff_files (cost=0.57..8487.35 rows=4846 width=6) (actual time=43.265..43.265 rows=0 loops=1)
# Buffers: shared hit=2043 read=259 dirtied=254
# -> Index Scan using index_merge_request_diff_files_on_mr_diff_id_and_order on merge_request_diff_files (cost=0.57..8487.35 rows=4846 width=6) (actu
# al time=0.466..26.317 rows=453 loops=1)
# Index Cond: (merge_request_diff_id = 463448)
# Buffers: shared hit=17 read=84
# Planning time: 0.107 ms
# Execution time: 43.287 ms
#
MergeRequestDiffFile.where(merge_request_diff_id: merge_request_diff.id).delete_all
end
end
private
def should_delete_diff_files?(merge_request_diff)
return false if merge_request_diff.state == 'without_files'
merge_request = merge_request_diff.merge_request
return false unless merge_request.state == 'merged'
return false if merge_request_diff.id == merge_request.latest_merge_request_diff_id
true
end
end
end
end
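The `should_delete_diff_files?` guard above only allows deletion for merged merge requests whose diff has not already been emptied and is not the MR's latest diff (which the UI still needs). A self-contained sketch, with Structs standing in for the ActiveRecord models:

```ruby
# Structs stand in for MergeRequestDiff and MergeRequest; the predicate
# mirrors the guard in DeleteDiffFiles above.
MergeRequestStub = Struct.new(:state, :latest_merge_request_diff_id, keyword_init: true)
DiffStub = Struct.new(:id, :state, :merge_request, keyword_init: true)

def deletable_diff_files?(diff)
  return false if diff.state == 'without_files'
  return false unless diff.merge_request.state == 'merged'
  return false if diff.id == diff.merge_request.latest_merge_request_diff_id

  true
end

mr = MergeRequestStub.new(state: 'merged', latest_merge_request_diff_id: 2)
deletable_diff_files?(DiffStub.new(id: 1, state: 'collected', merge_request: mr)) # => true
deletable_diff_files?(DiffStub.new(id: 2, state: 'collected', merge_request: mr)) # => false
```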
...@@ -38,7 +38,8 @@ module Gitlab
      # we only want to create full urls when there's a different asset_host
      # configured.
      def host
        asset_host = ActionController::Base.asset_host
        if asset_host.nil? || asset_host == Gitlab.config.gitlab.base_url
          nil
        else
          Gitlab.config.gitlab.base_url
......
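Extracted as a pure function, the rule in `host` above is: emit a full base URL only when a distinct asset host is configured, otherwise return `nil` so generated URLs stay relative. A sketch with the configuration values passed in explicitly instead of read from Rails:

```ruby
# Pure-function sketch of the host selection; asset_host and base_url are
# parameters rather than Rails configuration reads.
def select_host(asset_host, base_url)
  if asset_host.nil? || asset_host == base_url
    nil
  else
    base_url
  end
end

select_host(nil, 'https://gitlab.example.com')                       # => nil
select_host('https://cdn.example.com', 'https://gitlab.example.com') # => "https://gitlab.example.com"
```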
...@@ -3,7 +3,7 @@ module Gitlab
    extend self
    # For every version update, the version history in import_export.md has to be kept up to date.
    VERSION = '0.2.4'.freeze
    FILENAME_LIMIT = 50
    def export_path(relative_path:)
......
...@@ -10,15 +10,22 @@ namespace :gitlab do
      puts YAML.load_file(Gitlab::ImportExport.config_file)['project_tree'].to_yaml(SortKeys: true)
    end

    desc 'GitLab | Bumps the Import/Export version in fixtures and project templates'
    task bump_version: :environment do
      archives = Dir['vendor/project_templates/*.tar.gz']
      archives.push('spec/features/projects/import_export/test_project_export.tar.gz')

      archives.each do |archive|
        raise ArgumentError unless File.exist?(archive)

        Dir.mktmpdir do |tmp_dir|
          system("tar -zxf #{archive} -C #{tmp_dir} > /dev/null")
          File.write(File.join(tmp_dir, 'VERSION'), Gitlab::ImportExport.version, mode: 'w')
          system("tar -zcvf #{archive} -C #{tmp_dir} . > /dev/null")
        end
      end

      puts "Updated #{archives} to #{Gitlab::ImportExport.version}."
    end
  end
end
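The core of the bump task is rewriting each archive's `VERSION` file inside a temp directory before re-packing. A sketch of just that step, skipping the tar round-trip (`'0.2.4'` stands in for `Gitlab::ImportExport.version`):

```ruby
require 'tmpdir'

# Write the new Import/Export version into a VERSION file and read it back;
# the real task does this inside an unpacked copy of each .tar.gz archive.
version = '0.2.4'

written = Dir.mktmpdir do |tmp_dir|
  path = File.join(tmp_dir, 'VERSION')
  File.write(path, version, mode: 'w')
  File.read(path)
end

written # => "0.2.4"
```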
...@@ -8,8 +8,8 @@ msgid "" ...@@ -8,8 +8,8 @@ msgid ""
msgstr "" msgstr ""
"Project-Id-Version: gitlab 1.0.0\n" "Project-Id-Version: gitlab 1.0.0\n"
"Report-Msgid-Bugs-To: \n" "Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2018-06-20 16:52+0300\n" "POT-Creation-Date: 2018-06-29 16:17+1000\n"
"PO-Revision-Date: 2018-06-20 16:52+0300\n" "PO-Revision-Date: 2018-06-29 16:17+1000\n"
"Last-Translator: FULL NAME <EMAIL@ADDRESS>\n" "Last-Translator: FULL NAME <EMAIL@ADDRESS>\n"
"Language-Team: LANGUAGE <LL@li.org>\n" "Language-Team: LANGUAGE <LL@li.org>\n"
"Language: \n" "Language: \n"
...@@ -87,6 +87,9 @@ msgstr[1] "" ...@@ -87,6 +87,9 @@ msgstr[1] ""
msgid "%{filePath} deleted" msgid "%{filePath} deleted"
msgstr "" msgstr ""
msgid "%{group_docs_link_start}Groups%{group_docs_link_end} allow you to manage and collaborate across multiple projects. Members of a group have access to all of its projects."
msgstr ""
msgid "%{loadingIcon} Started" msgid "%{loadingIcon} Started"
msgstr "" msgstr ""
...@@ -357,6 +360,9 @@ msgstr "" ...@@ -357,6 +360,9 @@ msgstr ""
msgid "Alternatively, you can use a %{personal_access_token_link}. When you create your Personal Access Token, you will need to select the <code>repo</code> scope, so we can display a list of your public and private repositories which are available to import." msgid "Alternatively, you can use a %{personal_access_token_link}. When you create your Personal Access Token, you will need to select the <code>repo</code> scope, so we can display a list of your public and private repositories which are available to import."
msgstr "" msgstr ""
msgid "An error occured creating the new branch."
msgstr ""
msgid "An error occurred previewing the blob" msgid "An error occurred previewing the blob"
msgstr "" msgstr ""
...@@ -438,6 +444,9 @@ msgstr "" ...@@ -438,6 +444,9 @@ msgstr ""
msgid "Are you sure you want to delete this pipeline schedule?" msgid "Are you sure you want to delete this pipeline schedule?"
msgstr "" msgstr ""
msgid "Are you sure you want to remove this identity?"
msgstr ""
msgid "Are you sure you want to reset registration token?" msgid "Are you sure you want to reset registration token?"
msgstr "" msgstr ""
...@@ -555,6 +564,9 @@ msgstr "" ...@@ -555,6 +564,9 @@ msgstr ""
msgid "Average per day: %{average}" msgid "Average per day: %{average}"
msgstr "" msgstr ""
msgid "Background color"
msgstr ""
msgid "Background jobs" msgid "Background jobs"
msgstr "" msgstr ""
...@@ -630,6 +642,12 @@ msgstr "" ...@@ -630,6 +642,12 @@ msgstr ""
msgid "Begin with the selected commit" msgid "Begin with the selected commit"
msgstr "" msgstr ""
msgid "Boards"
msgstr ""
msgid "Branch %{branchName} was not found in this project's repository."
msgstr ""
msgid "Branch (%{branch_count})" msgid "Branch (%{branch_count})"
msgid_plural "Branches (%{branch_count})" msgid_plural "Branches (%{branch_count})"
msgstr[0] "" msgstr[0] ""
...@@ -836,6 +854,9 @@ msgstr "" ...@@ -836,6 +854,9 @@ msgstr ""
msgid "CICD|You need to specify a domain if you want to use Auto Review Apps and Auto Deploy stages." msgid "CICD|You need to specify a domain if you want to use Auto Review Apps and Auto Deploy stages."
msgstr "" msgstr ""
msgid "Can't find HEAD commit for this branch"
msgstr ""
msgid "Cancel" msgid "Cancel"
msgstr "" msgstr ""
...@@ -899,6 +920,9 @@ msgstr "" ...@@ -899,6 +920,9 @@ msgstr ""
msgid "Choose a branch/tag (e.g. %{master}) or enter a commit (e.g. %{sha}) to see what's changed or to create a merge request." msgid "Choose a branch/tag (e.g. %{master}) or enter a commit (e.g. %{sha}) to see what's changed or to create a merge request."
msgstr "" msgstr ""
msgid "Choose any color."
msgstr ""
msgid "Choose file..." msgid "Choose file..."
msgstr "" msgstr ""
...@@ -1094,7 +1118,7 @@ msgstr "" ...@@ -1094,7 +1118,7 @@ msgstr ""
msgid "ClusterIntegration|Environment scope" msgid "ClusterIntegration|Environment scope"
msgstr "" msgstr ""
msgid "ClusterIntegration|Every new Google Cloud Platform (GCP) account receives $300 in credit upon %{sign_up_link}. In partnership with Google, GitLab is able to offer an additional $200 for new GCP accounts to get started with GitLab's Google Kubernetes Engine Integration." msgid "ClusterIntegration|Every new Google Cloud Platform (GCP) account receives $300 in credit upon %{sign_up_link}. In partnership with Google, GitLab is able to offer an additional $200 for both new and existing GCP accounts to get started with GitLab's Google Kubernetes Engine Integration."
msgstr "" msgstr ""
msgid "ClusterIntegration|Fetching machine types" msgid "ClusterIntegration|Fetching machine types"
...@@ -1443,6 +1467,9 @@ msgstr "" ...@@ -1443,6 +1467,9 @@ msgstr ""
msgid "Committed by" msgid "Committed by"
msgstr "" msgstr ""
msgid "Commit…"
msgstr ""
msgid "Compare" msgid "Compare"
msgstr "" msgstr ""
...@@ -1560,6 +1587,9 @@ msgstr "" ...@@ -1560,6 +1587,9 @@ msgstr ""
msgid "Continuous Integration and Deployment" msgid "Continuous Integration and Deployment"
msgstr "" msgstr ""
msgid "Contribute to GitLab"
msgstr ""
msgid "Contribution" msgid "Contribution"
msgstr "" msgstr ""
...@@ -1743,6 +1773,9 @@ msgstr "" ...@@ -1743,6 +1773,9 @@ msgstr ""
msgid "Delete" msgid "Delete"
msgstr "" msgstr ""
msgid "Delete list"
msgstr ""
msgid "Deploy" msgid "Deploy"
msgid_plural "Deploys" msgid_plural "Deploys"
msgstr[0] "" msgstr[0] ""
...@@ -1955,12 +1988,18 @@ msgstr "" ...@@ -1955,12 +1988,18 @@ msgstr ""
msgid "Edit" msgid "Edit"
msgstr "" msgstr ""
msgid "Edit Label"
msgstr ""
msgid "Edit Pipeline Schedule %{id}" msgid "Edit Pipeline Schedule %{id}"
msgstr "" msgstr ""
msgid "Edit files in the editor and commit changes here" msgid "Edit files in the editor and commit changes here"
msgstr "" msgstr ""
msgid "Edit identity for %{user_name}"
msgstr ""
msgid "Email" msgid "Email"
msgstr "" msgstr ""
...@@ -2096,6 +2135,9 @@ msgstr "" ...@@ -2096,6 +2135,9 @@ msgstr ""
msgid "Error loading project data. Please try again." msgid "Error loading project data. Please try again."
msgstr "" msgstr ""
msgid "Error loading tree data. Please try again."
msgstr ""
msgid "Error occurred when toggling the notification subscription" msgid "Error occurred when toggling the notification subscription"
msgstr "" msgstr ""
...@@ -2183,6 +2225,9 @@ msgstr "" ...@@ -2183,6 +2225,9 @@ msgstr ""
msgid "February" msgid "February"
msgstr "" msgstr ""
msgid "Fetching diff files failed. Please reload the page to try again!"
msgstr ""
msgid "Fields on this page are now uneditable, you can configure" msgid "Fields on this page are now uneditable, you can configure"
msgstr "" msgstr ""
...@@ -2341,6 +2386,9 @@ msgstr "" ...@@ -2341,6 +2386,9 @@ msgstr ""
msgid "GroupSettings|remove the share with group lock from %{ancestor_group_name}" msgid "GroupSettings|remove the share with group lock from %{ancestor_group_name}"
msgstr "" msgstr ""
msgid "Groups can also be nested by creating %{subgroup_docs_link_start}subgroups%{subgroup_docs_link_end}."
msgstr ""
msgid "GroupsEmptyState|A group is a collection of several projects." msgid "GroupsEmptyState|A group is a collection of several projects."
msgstr "" msgstr ""
...@@ -2427,6 +2475,9 @@ msgstr "" ...@@ -2427,6 +2475,9 @@ msgstr ""
msgid "I accept the|Terms of Service and Privacy Policy" msgid "I accept the|Terms of Service and Privacy Policy"
msgstr "" msgstr ""
msgid "ID"
msgstr ""
msgid "IDE|Commit" msgid "IDE|Commit"
msgstr "" msgstr ""
...@@ -2442,6 +2493,12 @@ msgstr "" ...@@ -2442,6 +2493,12 @@ msgstr ""
msgid "IDE|Review" msgid "IDE|Review"
msgstr "" msgstr ""
msgid "Identifier"
msgstr ""
msgid "Identities"
msgstr ""
msgid "If you already have files you can push them using the %{link_to_cli} below." msgid "If you already have files you can push them using the %{link_to_cli} below."
msgstr "" msgstr ""
...@@ -2505,6 +2562,9 @@ msgstr "" ...@@ -2505,6 +2562,9 @@ msgstr ""
msgid "Introducing Cycle Analytics" msgid "Introducing Cycle Analytics"
msgstr "" msgstr ""
msgid "Issue Board"
msgstr ""
msgid "Issue events" msgid "Issue events"
msgstr "" msgstr ""
...@@ -2523,6 +2583,9 @@ msgstr "" ...@@ -2523,6 +2583,9 @@ msgstr ""
msgid "January" msgid "January"
msgstr "" msgstr ""
msgid "Job"
msgstr ""
msgid "Job has been erased" msgid "Job has been erased"
msgstr "" msgstr ""
...@@ -2837,6 +2900,9 @@ msgstr "" ...@@ -2837,6 +2900,9 @@ msgstr ""
msgid "Name new label" msgid "Name new label"
msgstr "" msgstr ""
msgid "Name your individual key via a title"
msgstr ""
msgid "Nav|Help" msgid "Nav|Help"
msgstr "" msgstr ""
...@@ -2849,6 +2915,9 @@ msgstr "" ...@@ -2849,6 +2915,9 @@ msgstr ""
msgid "Nav|Sign out and sign in with a different account" msgid "Nav|Sign out and sign in with a different account"
msgstr "" msgstr ""
msgid "New Identity"
msgstr ""
msgid "New Issue" msgid "New Issue"
msgid_plural "New Issues" msgid_plural "New Issues"
msgstr[0] "" msgstr[0] ""
...@@ -2860,6 +2929,9 @@ msgstr "" ...@@ -2860,6 +2929,9 @@ msgstr ""
msgid "New Kubernetes cluster" msgid "New Kubernetes cluster"
msgstr "" msgstr ""
msgid "New Label"
msgstr ""
msgid "New Pipeline Schedule" msgid "New Pipeline Schedule"
msgstr "" msgstr ""
...@@ -2878,6 +2950,9 @@ msgstr "" ...@@ -2878,6 +2950,9 @@ msgstr ""
msgid "New group" msgid "New group"
msgstr "" msgstr ""
msgid "New identity"
msgstr ""
msgid "New issue" msgid "New issue"
msgstr "" msgstr ""
...@@ -3076,6 +3151,9 @@ msgstr "" ...@@ -3076,6 +3151,9 @@ msgstr ""
msgid "Options" msgid "Options"
msgstr "" msgstr ""
msgid "Or you can choose one of the suggested colors below"
msgstr ""
msgid "Other Labels" msgid "Other Labels"
msgstr "" msgstr ""
...@@ -3112,6 +3190,9 @@ msgstr "" ...@@ -3112,6 +3190,9 @@ msgstr ""
msgid "Password" msgid "Password"
msgstr "" msgstr ""
msgid "Paste your public SSH key, which is usually contained in the file '~/.ssh/id_rsa.pub' and begins with 'ssh-rsa'. Don't use your private SSH key."
msgstr ""
msgid "Pause" msgid "Pause"
msgstr "" msgstr ""
...@@ -3298,6 +3379,9 @@ msgstr "" ...@@ -3298,6 +3379,9 @@ msgstr ""
msgid "Please solve the reCAPTCHA" msgid "Please solve the reCAPTCHA"
msgstr "" msgstr ""
msgid "Please try again"
msgstr ""
msgid "Please wait while we import the repository for you. Refresh at will." msgid "Please wait while we import the repository for you. Refresh at will."
msgstr "" msgstr ""
...@@ -3553,6 +3637,9 @@ msgstr "" ...@@ -3553,6 +3637,9 @@ msgstr ""
msgid "Protip:" msgid "Protip:"
msgstr "" msgstr ""
msgid "Provider"
msgstr ""
msgid "Public - The group and any public projects can be viewed without any authentication." msgid "Public - The group and any public projects can be viewed without any authentication."
msgstr "" msgstr ""
...@@ -3571,6 +3658,9 @@ msgstr "" ...@@ -3571,6 +3658,9 @@ msgstr ""
msgid "Quick actions can be used in the issues description and comment boxes." msgid "Quick actions can be used in the issues description and comment boxes."
msgstr "" msgstr ""
msgid "Re-deploy"
msgstr ""
msgid "Read more" msgid "Read more"
msgstr "" msgstr ""
...@@ -3699,6 +3789,9 @@ msgstr "" ...@@ -3699,6 +3789,9 @@ msgstr ""
msgid "Reviewing (merge request !%{mergeRequestId})" msgid "Reviewing (merge request !%{mergeRequestId})"
msgstr "" msgstr ""
msgid "Rollback"
msgstr ""
msgid "Runners" msgid "Runners"
msgstr "" msgstr ""
...@@ -3714,6 +3807,9 @@ msgstr "" ...@@ -3714,6 +3807,9 @@ msgstr ""
msgid "SSH Keys" msgid "SSH Keys"
msgstr "" msgstr ""
msgid "Save"
msgstr ""
msgid "Save changes" msgid "Save changes"
msgstr "" msgstr ""
...@@ -4058,6 +4154,9 @@ msgstr "" ...@@ -4058,6 +4154,9 @@ msgstr ""
msgid "Stage" msgid "Stage"
msgstr "" msgstr ""
msgid "Stage & Commit"
msgstr ""
msgid "Stage all changes" msgid "Stage all changes"
msgstr "" msgstr ""
...@@ -4303,6 +4402,9 @@ msgstr "" ...@@ -4303,6 +4402,9 @@ msgstr ""
msgid "There are no issues to show" msgid "There are no issues to show"
msgstr "" msgstr ""
msgid "There are no labels yet"
msgstr ""
msgid "There are no merge requests to show" msgid "There are no merge requests to show"
msgstr "" msgstr ""
...@@ -4417,6 +4519,9 @@ msgstr "" ...@@ -4417,6 +4519,9 @@ msgstr ""
msgid "This source diff could not be displayed because it is too large." msgid "This source diff could not be displayed because it is too large."
msgstr "" msgstr ""
msgid "This user has no identities"
msgstr ""
msgid "Time before an issue gets scheduled" msgid "Time before an issue gets scheduled"
msgstr "" msgstr ""
...@@ -4592,6 +4697,9 @@ msgstr "" ...@@ -4592,6 +4697,9 @@ msgstr ""
msgid "To GitLab" msgid "To GitLab"
msgstr "" msgstr ""
msgid "To add an SSH key you need to %{generate_link_start}generate one%{link_end} or use an %{existing_link_start}existing key%{link_end}."
msgstr ""
msgid "To import GitHub repositories, you can use a %{personal_access_token_link}. When you create your Personal Access Token, you will need to select the <code>repo</code> scope, so we can display a list of your public and private repositories which are available to import." msgid "To import GitHub repositories, you can use a %{personal_access_token_link}. When you create your Personal Access Token, you will need to select the <code>repo</code> scope, so we can display a list of your public and private repositories which are available to import."
msgstr "" msgstr ""
...@@ -4691,6 +4799,11 @@ msgstr "" ...@@ -4691,6 +4799,11 @@ msgstr ""
msgid "Up to date" msgid "Up to date"
msgstr "" msgstr ""
msgid "Update %{files}"
msgid_plural "Update %{files} files"
msgstr[0] ""
msgstr[1] ""
msgid "Update your group name, description, avatar, and other general settings." msgid "Update your group name, description, avatar, and other general settings."
msgstr "" msgstr ""
...@@ -4724,6 +4837,9 @@ msgstr "" ...@@ -4724,6 +4837,9 @@ msgstr ""
msgid "User and IP Rate Limits" msgid "User and IP Rate Limits"
msgstr "" msgstr ""
msgid "Users"
msgstr ""
msgid "Variables" msgid "Variables"
msgstr "" msgstr ""
...@@ -4940,9 +5056,6 @@ msgstr "" ...@@ -4940,9 +5056,6 @@ msgstr ""
msgid "Withdraw Access Request" msgid "Withdraw Access Request"
msgstr "" msgstr ""
msgid "Write a commit message..."
msgstr ""
msgid "Yes" msgid "Yes"
msgstr "" msgstr ""
......
...@@ -2,19 +2,12 @@ require 'spec_helper'
describe Oauth::AuthorizationsController do
  let(:user) { create(:user) }
  let!(:application) { create(:oauth_application, scopes: 'api read_user', redirect_uri: 'http://example.com') }

  let(:params) do
    {
      response_type: "code",
      client_id: application.uid,
      redirect_uri: application.redirect_uri,
      state: 'state'
    }
  end
...@@ -44,7 +37,7 @@ describe Oauth::AuthorizationsController do
    end

    it 'deletes session.user_return_to and redirects when skip authorization' do
      application.update(trusted: true)
      request.session['user_return_to'] = 'http://example.com'

      get :new, params
...@@ -52,6 +45,25 @@ describe Oauth::AuthorizationsController do
      expect(request.session['user_return_to']).to be_nil
      expect(response).to have_gitlab_http_status(302)
end end
context 'when there is already an access token for the application' do
context 'when the request scope matches any of the created token scopes' do
before do
scopes = Doorkeeper::OAuth::Scopes.from_string('api')
allow(Doorkeeper.configuration).to receive(:scopes).and_return(scopes)
create :oauth_access_token, application: application, resource_owner_id: user.id, scopes: scopes
end
it 'authorizes the request and redirects' do
get :new, params
expect(request.session['user_return_to']).to be_nil
expect(response).to have_gitlab_http_status(302)
end
end
end
end end
end end
end end
...@@ -5,6 +5,7 @@ describe 'Dashboard Issues Calendar Feed' do
  let!(:user) { create(:user, email: 'private1@example.com', public_email: 'public1@example.com') }
  let!(:assignee) { create(:user, email: 'private2@example.com', public_email: 'public2@example.com') }
  let!(:project) { create(:project) }
  let(:milestone) { create(:milestone, project_id: project.id, title: 'v1.0') }

  before do
    project.add_master(user)
...@@ -14,7 +15,9 @@ describe 'Dashboard Issues Calendar Feed' do
  context 'with no referer' do
    it 'renders calendar feed' do
      sign_in user
      visit issues_dashboard_path(:ics,
                                  due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
                                  sort: 'closest_future_date')
expect(response_headers['Content-Type']).to have_content('text/calendar') expect(response_headers['Content-Type']).to have_content('text/calendar')
expect(body).to have_text('BEGIN:VCALENDAR') expect(body).to have_text('BEGIN:VCALENDAR')
...@@ -25,19 +28,37 @@ describe 'Dashboard Issues Calendar Feed' do ...@@ -25,19 +28,37 @@ describe 'Dashboard Issues Calendar Feed' do
    it 'renders calendar feed as text/plain' do
      sign_in user
      page.driver.header('Referer', issues_dashboard_url(host: Settings.gitlab.base_url))
      visit issues_dashboard_path(:ics,
                                  due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
                                  sort: 'closest_future_date')

      expect(response_headers['Content-Type']).to have_content('text/plain')
      expect(body).to have_text('BEGIN:VCALENDAR')
    end
  end
context 'when filtered by milestone' do
it 'renders calendar feed' do
sign_in user
visit issues_dashboard_path(:ics,
due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
sort: 'closest_future_date',
milestone_title: milestone.title)
expect(response_headers['Content-Type']).to have_content('text/calendar')
expect(body).to have_text('BEGIN:VCALENDAR')
end
end
end end
context 'when authenticated via personal access token' do context 'when authenticated via personal access token' do
it 'renders calendar feed' do it 'renders calendar feed' do
personal_access_token = create(:personal_access_token, user: user) personal_access_token = create(:personal_access_token, user: user)
      visit issues_dashboard_path(:ics,
                                  due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
                                  sort: 'closest_future_date',
                                  private_token: personal_access_token.token)
expect(response_headers['Content-Type']).to have_content('text/calendar') expect(response_headers['Content-Type']).to have_content('text/calendar')
expect(body).to have_text('BEGIN:VCALENDAR') expect(body).to have_text('BEGIN:VCALENDAR')
...@@ -46,7 +67,10 @@ describe 'Dashboard Issues Calendar Feed' do ...@@ -46,7 +67,10 @@ describe 'Dashboard Issues Calendar Feed' do
context 'when authenticated via feed token' do context 'when authenticated via feed token' do
it 'renders calendar feed' do it 'renders calendar feed' do
      visit issues_dashboard_path(:ics,
                                  due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
                                  sort: 'closest_future_date',
                                  feed_token: user.feed_token)
expect(response_headers['Content-Type']).to have_content('text/calendar') expect(response_headers['Content-Type']).to have_content('text/calendar')
expect(body).to have_text('BEGIN:VCALENDAR') expect(body).to have_text('BEGIN:VCALENDAR')
...@@ -60,7 +84,10 @@ describe 'Dashboard Issues Calendar Feed' do ...@@ -60,7 +84,10 @@ describe 'Dashboard Issues Calendar Feed' do
end end
it 'renders issue fields' do it 'renders issue fields' do
      visit issues_dashboard_path(:ics,
                                  due_date: Issue::DueNextMonthAndPreviousTwoWeeks.name,
                                  sort: 'closest_future_date',
                                  feed_token: user.feed_token)
expect(body).to have_text("SUMMARY:test title (in #{project.full_path})") expect(body).to have_text("SUMMARY:test title (in #{project.full_path})")
# line length for ics is 75 chars # line length for ics is 75 chars
......
...@@ -5,7 +5,6 @@ import { ...@@ -5,7 +5,6 @@ import {
INLINE_DIFF_VIEW_TYPE, INLINE_DIFF_VIEW_TYPE,
PARALLEL_DIFF_VIEW_TYPE, PARALLEL_DIFF_VIEW_TYPE,
} from '~/diffs/constants'; } from '~/diffs/constants';
import store from '~/diffs/store';
import * as actions from '~/diffs/store/actions'; import * as actions from '~/diffs/store/actions';
import * as types from '~/diffs/store/mutation_types'; import * as types from '~/diffs/store/mutation_types';
import axios from '~/lib/utils/axios_utils'; import axios from '~/lib/utils/axios_utils';
...@@ -28,22 +27,6 @@ describe('DiffsStoreActions', () => { ...@@ -28,22 +27,6 @@ describe('DiffsStoreActions', () => {
}); });
}); });
describe('setLoadingState', () => {
it('should set loading state', done => {
expect(store.state.diffs.isLoading).toEqual(true);
const loadingState = false;
testAction(
actions.setLoadingState,
loadingState,
{},
[{ type: types.SET_LOADING, payload: loadingState }],
[],
done,
);
});
});
describe('fetchDiffFiles', () => { describe('fetchDiffFiles', () => {
it('should fetch diff files', done => { it('should fetch diff files', done => {
const endpoint = '/fetch/diff/files'; const endpoint = '/fetch/diff/files';
......
...
@@ -36,13 +36,15 @@ describe('IDE error message component', () => {
   });

   describe('with action', () => {
+    let actionSpy;
+
     beforeEach(done => {
-      vm.message.action = 'testAction';
+      actionSpy = jasmine.createSpy('action').and.returnValue(Promise.resolve());
+
+      vm.message.action = actionSpy;
       vm.message.actionText = 'test action';
       vm.message.actionPayload = 'testActionPayload';
-      spyOn(vm.$store, 'dispatch').and.returnValue(Promise.resolve());
       vm.$nextTick(done);
     });
@@ -63,7 +65,7 @@ describe('IDE error message component', () => {
       vm.$el.querySelector('.flash-action').click();

       vm.$nextTick(() => {
-        expect(vm.$store.dispatch).toHaveBeenCalledWith('testAction', 'testActionPayload');
+        expect(actionSpy).toHaveBeenCalledWith('testActionPayload');
         done();
       });
@@ -74,7 +76,7 @@ describe('IDE error message component', () => {
       vm.$el.querySelector('.flash-action').click();

-      expect(vm.$store.dispatch).not.toHaveBeenCalledWith();
+      expect(actionSpy).not.toHaveBeenCalledWith();
     });

     it('resets isLoading after click', done => {
...
 import Vue from 'vue';
 import store from '~/ide/stores';
-import service from '~/ide/services';
 import router from '~/ide/ide_router';
 import repoCommitSection from '~/ide/components/repo_commit_section.vue';
 import { createComponentWithStore } from 'spec/helpers/vue_mount_component_helper';
@@ -68,23 +67,6 @@ describe('RepoCommitSection', () => {
     vm.$mount();

-    spyOn(service, 'getTreeData').and.returnValue(
-      Promise.resolve({
-        headers: {
-          'page-title': 'test',
-        },
-        json: () =>
-          Promise.resolve({
-            last_commit_path: 'last_commit_path',
-            parent_tree_url: 'parent_tree_url',
-            path: '/',
-            trees: [{ name: 'tree' }],
-            blobs: [{ name: 'blob' }],
-            submodules: [{ name: 'submodule' }],
-          }),
-      }),
-    );
-
     Vue.nextTick(done);
   });
...
 import Vue from 'vue';
+import MockAdapter from 'axios-mock-adapter';
+import axios from '~/lib/utils/axios_utils';
 import store from '~/ide/stores';
 import * as actions from '~/ide/stores/actions/file';
 import * as types from '~/ide/stores/mutation_types';
@@ -9,11 +11,16 @@ import { file, resetStore } from '../../helpers';
 import testAction from '../../../helpers/vuex_action_helper';

 describe('IDE store file actions', () => {
+  let mock;
+
   beforeEach(() => {
+    mock = new MockAdapter(axios);
+
     spyOn(router, 'push');
   });

   afterEach(() => {
+    mock.restore();
+
     resetStore(store);
   });
...
@@ -183,94 +190,125 @@ describe('IDE store file actions', () => {
     let localFile;

     beforeEach(() => {
-      spyOn(service, 'getFileData').and.returnValue(
-        Promise.resolve({
-          headers: {
-            'page-title': 'testing getFileData',
-          },
-          json: () =>
-            Promise.resolve({
-              blame_path: 'blame_path',
-              commits_path: 'commits_path',
-              permalink: 'permalink',
-              raw_path: 'raw_path',
-              binary: false,
-              html: '123',
-              render_error: '',
-            }),
-        }),
-      );
+      spyOn(service, 'getFileData').and.callThrough();

       localFile = file(`newCreate-${Math.random()}`);
-      localFile.url = 'getFileDataURL';
+      localFile.url = `${gl.TEST_HOST}/getFileDataURL`;
       store.state.entries[localFile.path] = localFile;
     });

-    it('calls the service', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path })
-        .then(() => {
-          expect(service.getFileData).toHaveBeenCalledWith('getFileDataURL');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the file data', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path })
-        .then(() => {
-          expect(localFile.blamePath).toBe('blame_path');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets document title', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path })
-        .then(() => {
-          expect(document.title).toBe('testing getFileData');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the file as active', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path })
-        .then(() => {
-          expect(localFile.active).toBeTruthy();
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the file not as active if we pass makeFileActive false', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path, makeFileActive: false })
-        .then(() => {
-          expect(localFile.active).toBeFalsy();
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('adds the file to open files', done => {
-      store
-        .dispatch('getFileData', { path: localFile.path })
-        .then(() => {
-          expect(store.state.openFiles.length).toBe(1);
-          expect(store.state.openFiles[0].name).toBe(localFile.name);
-
-          done();
-        })
-        .catch(done.fail);
-    });
+    describe('success', () => {
+      beforeEach(() => {
+        mock.onGet(`${gl.TEST_HOST}/getFileDataURL`).replyOnce(
+          200,
+          {
+            blame_path: 'blame_path',
+            commits_path: 'commits_path',
+            permalink: 'permalink',
+            raw_path: 'raw_path',
+            binary: false,
+            html: '123',
+            render_error: '',
+          },
+          {
+            'page-title': 'testing getFileData',
+          },
+        );
+      });
+
+      it('calls the service', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path })
+          .then(() => {
+            expect(service.getFileData).toHaveBeenCalledWith(`${gl.TEST_HOST}/getFileDataURL`);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the file data', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path })
+          .then(() => {
+            expect(localFile.blamePath).toBe('blame_path');
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets document title', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path })
+          .then(() => {
+            expect(document.title).toBe('testing getFileData');
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the file as active', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path })
+          .then(() => {
+            expect(localFile.active).toBeTruthy();
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the file not as active if we pass makeFileActive false', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path, makeFileActive: false })
+          .then(() => {
+            expect(localFile.active).toBeFalsy();
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('adds the file to open files', done => {
+        store
+          .dispatch('getFileData', { path: localFile.path })
+          .then(() => {
+            expect(store.state.openFiles.length).toBe(1);
+            expect(store.state.openFiles[0].name).toBe(localFile.name);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
+
+    describe('error', () => {
+      beforeEach(() => {
+        mock.onGet(`${gl.TEST_HOST}/getFileDataURL`).networkError();
+      });
+
+      it('dispatches error action', done => {
+        const dispatch = jasmine.createSpy('dispatch');
+
+        actions
+          .getFileData({ state: store.state, commit() {}, dispatch }, { path: localFile.path })
+          .then(() => {
+            expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+              text: 'An error occured whilst loading the file.',
+              action: jasmine.any(Function),
+              actionText: 'Please try again',
+              actionPayload: {
+                path: localFile.path,
+                makeFileActive: true,
+              },
+            });
+
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
   });
...
@@ -278,48 +316,84 @@ describe('IDE store file actions', () => {
     let tmpFile;

     beforeEach(() => {
-      spyOn(service, 'getRawFileData').and.returnValue(Promise.resolve('raw'));
+      spyOn(service, 'getRawFileData').and.callThrough();

       tmpFile = file('tmpFile');
       store.state.entries[tmpFile.path] = tmpFile;
     });

-    it('calls getRawFileData service method', done => {
-      store
-        .dispatch('getRawFileData', { path: tmpFile.path })
-        .then(() => {
-          expect(service.getRawFileData).toHaveBeenCalledWith(tmpFile);
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('updates file raw data', done => {
-      store
-        .dispatch('getRawFileData', { path: tmpFile.path })
-        .then(() => {
-          expect(tmpFile.raw).toBe('raw');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('calls also getBaseRawFileData service method', done => {
-      spyOn(service, 'getBaseRawFileData').and.returnValue(Promise.resolve('baseraw'));
-
-      tmpFile.mrChange = { new_file: false };
-
-      store
-        .dispatch('getRawFileData', { path: tmpFile.path, baseSha: 'SHA' })
-        .then(() => {
-          expect(service.getBaseRawFileData).toHaveBeenCalledWith(tmpFile, 'SHA');
-          expect(tmpFile.baseRaw).toBe('baseraw');
-
-          done();
-        })
-        .catch(done.fail);
-    });
+    describe('success', () => {
+      beforeEach(() => {
+        mock.onGet(/(.*)/).replyOnce(200, 'raw');
+      });
+
+      it('calls getRawFileData service method', done => {
+        store
+          .dispatch('getRawFileData', { path: tmpFile.path })
+          .then(() => {
+            expect(service.getRawFileData).toHaveBeenCalledWith(tmpFile);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('updates file raw data', done => {
+        store
+          .dispatch('getRawFileData', { path: tmpFile.path })
+          .then(() => {
+            expect(tmpFile.raw).toBe('raw');
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('calls also getBaseRawFileData service method', done => {
+        spyOn(service, 'getBaseRawFileData').and.returnValue(Promise.resolve('baseraw'));
+
+        tmpFile.mrChange = { new_file: false };
+
+        store
+          .dispatch('getRawFileData', { path: tmpFile.path, baseSha: 'SHA' })
+          .then(() => {
+            expect(service.getBaseRawFileData).toHaveBeenCalledWith(tmpFile, 'SHA');
+            expect(tmpFile.baseRaw).toBe('baseraw');
+
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
+
+    describe('error', () => {
+      beforeEach(() => {
+        mock.onGet(/(.*)/).networkError();
+      });
+
+      it('dispatches error action', done => {
+        const dispatch = jasmine.createSpy('dispatch');
+
+        actions
+          .getRawFileData(
+            { state: store.state, commit() {}, dispatch },
+            { path: tmpFile.path, baseSha: tmpFile.baseSha },
+          )
+          .then(done.fail)
+          .catch(() => {
+            expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+              text: 'An error occured whilst loading the file content.',
+              action: jasmine.any(Function),
+              actionText: 'Please try again',
+              actionPayload: {
+                path: tmpFile.path,
+                baseSha: tmpFile.baseSha,
+              },
+            });
+
+            done();
+          });
+      });
+    });
   });
 });
...
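The new error paths in these specs all assert the same shape: a failed fetch dispatches `setErrorMessage` with a retry callback (hence `action: jasmine.any(Function)`) instead of failing silently. A minimal, hedged sketch of that action shape in plain JavaScript; `fetchFn` is an illustrative stand-in for the axios-backed service call, not the real API of `~/ide/services`:

```javascript
// Sketch of the "dispatch an error message with a retry hook" pattern.
// On failure, the action resolves after dispatching `setErrorMessage`,
// whose `action` field is a callback the flash component can invoke.
const getFileData = ({ dispatch }, { path, fetchFn }) =>
  fetchFn(path).catch(() =>
    dispatch('setErrorMessage', {
      text: 'An error occured whilst loading the file.',
      action: payload => dispatch('getFileData', payload), // retry hook
      actionText: 'Please try again',
      actionPayload: { path, makeFileActive: true },
    }),
  );
```

This is why the error specs can assert on the dispatched payload with `jasmine.any(Function)`: the payload carries behavior, not just data.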
+import MockAdapter from 'axios-mock-adapter';
+import axios from '~/lib/utils/axios_utils';
 import store from '~/ide/stores';
+import {
+  getMergeRequestData,
+  getMergeRequestChanges,
+  getMergeRequestVersions,
+} from '~/ide/stores/actions/merge_request';
 import service from '~/ide/services';
 import { resetStore } from '../../helpers';

 describe('IDE store merge request actions', () => {
+  let mock;
+
   beforeEach(() => {
+    mock = new MockAdapter(axios);
+
     store.state.projects.abcproject = {
       mergeRequests: {},
     };
   });

   afterEach(() => {
+    mock.restore();
+
     resetStore(store);
   });

   describe('getMergeRequestData', () => {
-    beforeEach(() => {
-      spyOn(service, 'getProjectMergeRequestData').and.returnValue(
-        Promise.resolve({ data: { title: 'mergerequest' } }),
-      );
-    });
-
-    it('calls getProjectMergeRequestData service method', done => {
-      store
-        .dispatch('getMergeRequestData', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(service.getProjectMergeRequestData).toHaveBeenCalledWith('abcproject', 1);
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the Merge Request Object', done => {
-      store
-        .dispatch('getMergeRequestData', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(store.state.projects.abcproject.mergeRequests['1'].title).toBe('mergerequest');
-          expect(store.state.currentMergeRequestId).toBe(1);
-
-          done();
-        })
-        .catch(done.fail);
+    describe('success', () => {
+      beforeEach(() => {
+        spyOn(service, 'getProjectMergeRequestData').and.callThrough();
+
+        mock
+          .onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1/)
+          .reply(200, { title: 'mergerequest' });
+      });
+
+      it('calls getProjectMergeRequestData service method', done => {
+        store
+          .dispatch('getMergeRequestData', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(service.getProjectMergeRequestData).toHaveBeenCalledWith('abcproject', 1);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the Merge Request Object', done => {
+        store
+          .dispatch('getMergeRequestData', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(store.state.projects.abcproject.mergeRequests['1'].title).toBe('mergerequest');
+            expect(store.state.currentMergeRequestId).toBe(1);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
+
+    describe('error', () => {
+      beforeEach(() => {
+        mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1/).networkError();
+      });
+
+      it('dispatches error action', done => {
+        const dispatch = jasmine.createSpy('dispatch');
+
+        getMergeRequestData(
+          {
+            commit() {},
+            dispatch,
+            state: store.state,
+          },
+          { projectId: 'abcproject', mergeRequestId: 1 },
+        )
+          .then(done.fail)
+          .catch(() => {
+            expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+              text: 'An error occured whilst loading the merge request.',
+              action: jasmine.any(Function),
+              actionText: 'Please try again',
+              actionPayload: {
+                projectId: 'abcproject',
+                mergeRequestId: 1,
+                force: false,
+              },
+            });
+
+            done();
+          });
+      });
     });
   });

   describe('getMergeRequestChanges', () => {
     beforeEach(() => {
-      spyOn(service, 'getProjectMergeRequestChanges').and.returnValue(
-        Promise.resolve({ data: { title: 'mergerequest' } }),
-      );
-
       store.state.projects.abcproject.mergeRequests['1'] = { changes: [] };
     });

-    it('calls getProjectMergeRequestChanges service method', done => {
-      store
-        .dispatch('getMergeRequestChanges', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(service.getProjectMergeRequestChanges).toHaveBeenCalledWith('abcproject', 1);
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the Merge Request Changes Object', done => {
-      store
-        .dispatch('getMergeRequestChanges', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(store.state.projects.abcproject.mergeRequests['1'].changes.title).toBe(
-            'mergerequest',
-          );
-          done();
-        })
-        .catch(done.fail);
+    describe('success', () => {
+      beforeEach(() => {
+        spyOn(service, 'getProjectMergeRequestChanges').and.callThrough();
+
+        mock
+          .onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/changes/)
+          .reply(200, { title: 'mergerequest' });
+      });
+
+      it('calls getProjectMergeRequestChanges service method', done => {
+        store
+          .dispatch('getMergeRequestChanges', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(service.getProjectMergeRequestChanges).toHaveBeenCalledWith('abcproject', 1);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the Merge Request Changes Object', done => {
+        store
+          .dispatch('getMergeRequestChanges', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(store.state.projects.abcproject.mergeRequests['1'].changes.title).toBe(
+              'mergerequest',
+            );
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
+
+    describe('error', () => {
+      beforeEach(() => {
+        mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/changes/).networkError();
+      });
+
+      it('dispatches error action', done => {
+        const dispatch = jasmine.createSpy('dispatch');
+
+        getMergeRequestChanges(
+          {
+            commit() {},
+            dispatch,
+            state: store.state,
+          },
+          { projectId: 'abcproject', mergeRequestId: 1 },
+        )
+          .then(done.fail)
+          .catch(() => {
+            expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+              text: 'An error occured whilst loading the merge request changes.',
+              action: jasmine.any(Function),
+              actionText: 'Please try again',
+              actionPayload: {
+                projectId: 'abcproject',
+                mergeRequestId: 1,
+                force: false,
+              },
+            });
+
+            done();
+          });
+      });
     });
   });

   describe('getMergeRequestVersions', () => {
     beforeEach(() => {
-      spyOn(service, 'getProjectMergeRequestVersions').and.returnValue(
-        Promise.resolve({ data: [{ id: 789 }] }),
-      );
-
       store.state.projects.abcproject.mergeRequests['1'] = { versions: [] };
     });

-    it('calls getProjectMergeRequestVersions service method', done => {
-      store
-        .dispatch('getMergeRequestVersions', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(service.getProjectMergeRequestVersions).toHaveBeenCalledWith('abcproject', 1);
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('sets the Merge Request Versions Object', done => {
-      store
-        .dispatch('getMergeRequestVersions', { projectId: 'abcproject', mergeRequestId: 1 })
-        .then(() => {
-          expect(store.state.projects.abcproject.mergeRequests['1'].versions.length).toBe(1);
-
-          done();
-        })
-        .catch(done.fail);
+    describe('success', () => {
+      beforeEach(() => {
+        mock
+          .onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/versions/)
+          .reply(200, [{ id: 789 }]);
+        spyOn(service, 'getProjectMergeRequestVersions').and.callThrough();
+      });
+
+      it('calls getProjectMergeRequestVersions service method', done => {
+        store
+          .dispatch('getMergeRequestVersions', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(service.getProjectMergeRequestVersions).toHaveBeenCalledWith('abcproject', 1);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+
+      it('sets the Merge Request Versions Object', done => {
+        store
+          .dispatch('getMergeRequestVersions', { projectId: 'abcproject', mergeRequestId: 1 })
+          .then(() => {
+            expect(store.state.projects.abcproject.mergeRequests['1'].versions.length).toBe(1);
+
+            done();
+          })
+          .catch(done.fail);
+      });
+    });
+
+    describe('error', () => {
+      beforeEach(() => {
+        mock.onGet(/api\/(.*)\/projects\/abcproject\/merge_requests\/1\/versions/).networkError();
+      });
+
+      it('dispatches error action', done => {
+        const dispatch = jasmine.createSpy('dispatch');
+
+        getMergeRequestVersions(
+          {
+            commit() {},
+            dispatch,
+            state: store.state,
+          },
+          { projectId: 'abcproject', mergeRequestId: 1 },
+        )
+          .then(done.fail)
+          .catch(() => {
+            expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+              text: 'An error occured whilst loading the merge request version data.',
+              action: jasmine.any(Function),
+              actionText: 'Please try again',
+              actionPayload: {
+                projectId: 'abcproject',
+                mergeRequestId: 1,
+                force: false,
+              },
+            });
+
+            done();
+          });
+      });
     });
   });
 });
...
@@ -110,7 +110,7 @@ describe('IDE store project actions', () => {
           type: 'setErrorMessage',
           payload: {
             text: "Branch <strong>master</strong> was not found in this project's repository.",
-            action: 'createNewBranchFromDefault',
+            action: jasmine.any(Function),
             actionText: 'Create branch',
             actionPayload: 'master',
           },
...
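The hunk above changes the expectation from a string action name (`'createNewBranchFromDefault'`) to `jasmine.any(Function)`, matching the error-message component change earlier in this diff: the message now carries the callback itself, so the consumer invokes it directly instead of dispatching a store action by name. A hypothetical sketch of that consuming side, assuming a `message` shaped like the `setErrorMessage` payloads in these specs (`clickFlashAction` is an illustrative name, not a real GitLab helper):

```javascript
// Sketch: the flash component invokes the message's own callback with its
// payload, instead of the old `store.dispatch(message.action, payload)`.
function clickFlashAction(message) {
  message.isLoading = true; // mirrors the "resets isLoading after click" spec
  return Promise.resolve(message.action(message.actionPayload)).then(result => {
    message.isLoading = false;
    return result;
  });
}
```

Carrying the function in the payload decouples the flash component from the store's action names, which is what the specs assert with `jasmine.any(Function)`.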
 import MockAdapter from 'axios-mock-adapter';
-import Vue from 'vue';
 import testAction from 'spec/helpers/vuex_action_helper';
 import { showTreeEntry, getFiles } from '~/ide/stores/actions/tree';
 import * as types from '~/ide/stores/mutation_types';
@@ -117,6 +116,40 @@ describe('Multi-file store tree actions', () => {
       done();
     });
   });
+
+    it('dispatches error action', done => {
+      const dispatch = jasmine.createSpy('dispatchSpy');
+
+      store.state.projects = {
+        'abc/def': {
+          web_url: `${gl.TEST_HOST}/files`,
+        },
+      };
+
+      mock.onGet(/(.*)/).replyOnce(500);
+
+      getFiles(
+        {
+          commit() {},
+          dispatch,
+          state: store.state,
+        },
+        {
+          projectId: 'abc/def',
+          branchId: 'master-testing',
+        },
+      )
+        .then(done.fail)
+        .catch(() => {
+          expect(dispatch).toHaveBeenCalledWith('setErrorMessage', {
+            text: 'An error occured whilst loading all the files.',
+            action: jasmine.any(Function),
+            actionText: 'Please try again',
+            actionPayload: { projectId: 'abc/def', branchId: 'master-testing' },
+          });
+          done();
+        });
+    });
   });
 });
@@ -168,72 +201,4 @@ describe('Multi-file store tree actions', () => {
     );
   });
 });
-
-  describe('getLastCommitData', () => {
-    beforeEach(() => {
-      spyOn(service, 'getTreeLastCommit').and.returnValue(
-        Promise.resolve({
-          headers: {
-            'more-logs-url': null,
-          },
-          json: () =>
-            Promise.resolve([
-              {
-                type: 'tree',
-                file_name: 'testing',
-                commit: {
-                  message: 'commit message',
-                  authored_date: '123',
-                },
-              },
-            ]),
-        }),
-      );
-
-      store.state.trees['abcproject/mybranch'] = {
-        tree: [],
-      };
-
-      projectTree = store.state.trees['abcproject/mybranch'];
-      projectTree.tree.push(file('testing', '1', 'tree'));
-      projectTree.lastCommitPath = 'lastcommitpath';
-    });
-
-    it('calls service with lastCommitPath', done => {
-      store
-        .dispatch('getLastCommitData', projectTree)
-        .then(() => {
-          expect(service.getTreeLastCommit).toHaveBeenCalledWith('lastcommitpath');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('updates trees last commit data', done => {
-      store
-        .dispatch('getLastCommitData', projectTree)
-        .then(Vue.nextTick)
-        .then(() => {
-          expect(projectTree.tree[0].lastCommit.message).toBe('commit message');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-
-    it('does not update entry if not found', done => {
-      projectTree.tree[0].name = 'a';
-
-      store
-        .dispatch('getLastCommitData', projectTree)
-        .then(Vue.nextTick)
-        .then(() => {
-          expect(projectTree.tree[0].lastCommit.message).not.toBe('commit message');
-
-          done();
-        })
-        .catch(done.fail);
-    });
-  });
 });
...
@@ -291,4 +291,17 @@ describe('Actions Notes Store', () => {
         .catch(done.fail);
     });
   });
+
+  describe('setNotesFetchedState', () => {
+    it('should set notes fetched state', done => {
+      testAction(
+        actions.setNotesFetchedState,
+        true,
+        {},
+        [{ type: 'SET_NOTES_FETCHED_STATE', payload: true }],
+        [],
+        done,
+      );
+    });
+  });
 });
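The `testAction(action, payload, state, expectedMutations, expectedActions, done)` calls above rely on the shared Vuex helper in `spec/helpers/vuex_action_helper`. As a hedged sketch (an illustrative reimplementation of its contract, not the real helper's code), the helper runs the action with stubbed `commit`/`dispatch` and compares what was recorded against the expectations:

```javascript
// Minimal sketch of a Vuex action test helper: record every commit/dispatch,
// await the action, then compare the records to the expected lists.
function testAction(action, payload, state, expectedMutations, expectedActions, done) {
  const committed = [];
  const dispatched = [];
  const commit = (type, p) => committed.push({ type, payload: p });
  const dispatch = (type, p) => dispatched.push({ type, payload: p });

  return Promise.resolve(action({ commit, dispatch, state }, payload)).then(() => {
    if (JSON.stringify(committed) !== JSON.stringify(expectedMutations)) {
      throw new Error(`mutations mismatch: got ${JSON.stringify(committed)}`);
    }
    if (JSON.stringify(dispatched) !== JSON.stringify(expectedActions)) {
      throw new Error(`actions mismatch: got ${JSON.stringify(dispatched)}`);
    }
    if (done) done();
    return committed;
  });
}
```

This explains the argument order used throughout the diff: payload, initial state, expected mutations, expected sub-actions, and the async `done` callback.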
...
@@ -15,6 +15,7 @@ describe('Getters Notes Store', () => {
       discussions: [individualNote],
       targetNoteHash: 'hash',
       lastFetchedAt: 'timestamp',
+      isNotesFetched: false,

       notesData: notesDataMock,
       userData: userDataMock,
@@ -84,4 +85,10 @@ describe('Getters Notes Store', () => {
       expect(getters.openState(state)).toEqual(noteableDataMock.state);
     });
   });
+
+  describe('isNotesFetched', () => {
+    it('should return the state for the fetching notes', () => {
+      expect(getters.isNotesFetched(state)).toBeFalsy();
+    });
+  });
 });
...
@@ -318,4 +318,15 @@ describe('Notes Store mutations', () => {
       expect(state.isToggleStateButtonLoading).toEqual(false);
     });
   });
+
+  describe('SET_NOTES_FETCHING_STATE', () => {
+    it('should set the given state', () => {
+      const state = {
+        isNotesFetched: false,
+      };
+
+      mutations.SET_NOTES_FETCHED_STATE(state, true);
+
+      expect(state.isNotesFetched).toEqual(true);
+    });
+  });
 });
...
@@ -3,15 +3,6 @@ require 'spec_helper'
 describe Banzai::Filter::EmojiFilter do
   include FilterSpecHelper

-  before do
-    @original_asset_host = ActionController::Base.asset_host
-    ActionController::Base.asset_host = 'https://foo.com'
-  end
-
-  after do
-    ActionController::Base.asset_host = @original_asset_host
-  end
-
   it 'replaces supported name emoji' do
     doc = filter('<p>:heart:</p>')
     expect(doc.css('gl-emoji').first.text).to eq '❤'
...
require 'spec_helper'
describe Gitlab::BackgroundMigration::DeleteDiffFiles, :migration, schema: 20180619121030 do
describe '#perform' do
context 'when diff files can be deleted' do
let(:merge_request) { create(:merge_request, :merged) }
let(:merge_request_diff) do
merge_request.create_merge_request_diff
merge_request.merge_request_diffs.first
end
it 'deletes all merge request diff files' do
expect { described_class.new.perform(merge_request_diff.id) }
.to change { merge_request_diff.merge_request_diff_files.count }
.from(20).to(0)
end
it 'updates state to without_files' do
expect { described_class.new.perform(merge_request_diff.id) }
.to change { merge_request_diff.reload.state }
.from('collected').to('without_files')
end
it 'rollsback if something goes wrong' do
expect(MergeRequestDiffFile).to receive_message_chain(:where, :delete_all)
.and_raise
expect { described_class.new.perform(merge_request_diff.id) }
.to raise_error
merge_request_diff.reload
expect(merge_request_diff.state).to eq('collected')
expect(merge_request_diff.merge_request_diff_files.count).to eq(20)
end
end
it 'deletes no merge request diff files when MR is not merged' do
merge_request = create(:merge_request, :opened)
merge_request.create_merge_request_diff
merge_request_diff = merge_request.merge_request_diffs.first
expect { described_class.new.perform(merge_request_diff.id) }
.not_to change { merge_request_diff.merge_request_diff_files.count }
.from(20)
end
it 'deletes no merge request diff files when diff is marked as "without_files"' do
merge_request = create(:merge_request, :merged)
merge_request.create_merge_request_diff
merge_request_diff = merge_request.merge_request_diffs.first
merge_request_diff.clean!
expect { described_class.new.perform(merge_request_diff.id) }
.not_to change { merge_request_diff.merge_request_diff_files.count }
.from(20)
end
it 'deletes no merge request diff files when diff is the latest' do
merge_request = create(:merge_request, :merged)
merge_request_diff = merge_request.merge_request_diff
expect { described_class.new.perform(merge_request_diff.id) }
.not_to change { merge_request_diff.merge_request_diff_files.count }
.from(20)
end
end
end
...
@@ -52,7 +52,7 @@ describe Gitlab::DataBuilder::Note do
       expect(data[:issue].except('updated_at'))
         .to eq(issue.reload.hook_attrs.except('updated_at'))
       expect(data[:issue]['updated_at'])
-        .to be > issue.hook_attrs['updated_at']
+        .to be >= issue.hook_attrs['updated_at']
     end

     context 'with confidential issue' do
@@ -84,7 +84,7 @@ describe Gitlab::DataBuilder::Note do
       expect(data[:merge_request].except('updated_at'))
         .to eq(merge_request.reload.hook_attrs.except('updated_at'))
       expect(data[:merge_request]['updated_at'])
-        .to be > merge_request.hook_attrs['updated_at']
+        .to be >= merge_request.hook_attrs['updated_at']
     end

     include_examples 'project hook data'
@@ -107,7 +107,7 @@ describe Gitlab::DataBuilder::Note do
       expect(data[:merge_request].except('updated_at'))
         .to eq(merge_request.reload.hook_attrs.except('updated_at'))
       expect(data[:merge_request]['updated_at'])
-        .to be > merge_request.hook_attrs['updated_at']
+        .to be >= merge_request.hook_attrs['updated_at']
     end

     include_examples 'project hook data'
@@ -130,7 +130,7 @@ describe Gitlab::DataBuilder::Note do
       expect(data[:snippet].except('updated_at'))
         .to eq(snippet.reload.hook_attrs.except('updated_at'))
       expect(data[:snippet]['updated_at'])
-        .to be > snippet.hook_attrs['updated_at']
+        .to be >= snippet.hook_attrs['updated_at']
     end

     include_examples 'project hook data'
...
...
@@ -32,7 +32,7 @@ RSpec.describe Gitlab::Favicon, :request_store do
     end

     it 'returns a full url when the asset host is configured' do
-      allow(Gitlab::Application.config).to receive(:asset_host).and_return('http://assets.local')
+      allow(ActionController::Base).to receive(:asset_host).and_return('http://assets.local')
       expect(described_class.main).to match %r{^http://localhost/assets/favicon-(?:\h+).png$}
     end
   end
...
...
@@ -164,7 +164,7 @@ describe Gitlab::GithubImport::Importer::PullRequestsImporter do
       Timecop.freeze do
         importer.update_repository

-        expect(project.last_repository_updated_at).to eq(Time.zone.now)
+        expect(project.last_repository_updated_at).to be_like_time(Time.zone.now)
       end
     end
   end
...
...
@@ -4,14 +4,14 @@ describe Gitlab::ImportExport::Importer do
   let(:user) { create(:user) }
   let(:test_path) { "#{Dir.tmpdir}/importer_spec" }
   let(:shared) { project.import_export_shared }
-  let(:project) { create(:project, import_source: File.join(test_path, 'exported-project.gz')) }
+  let(:project) { create(:project, import_source: File.join(test_path, 'test_project_export.tar.gz')) }

   subject(:importer) { described_class.new(project) }

   before do
     allow_any_instance_of(Gitlab::ImportExport).to receive(:storage_path).and_return(test_path)
     FileUtils.mkdir_p(shared.export_path)
-    FileUtils.cp(Rails.root.join('spec', 'fixtures', 'exported-project.gz'), test_path)
+    FileUtils.cp(Rails.root.join('spec/features/projects/import_export/test_project_export.tar.gz'), test_path)
     allow(subject).to receive(:remove_import_file)
   end
...
require 'spec_helper'
require Rails.root.join('db', 'post_migrate', '20180604123514_cleanup_stages_position_migration.rb')

describe CleanupStagesPositionMigration, :migration, :sidekiq, :redis do
  let(:migration) { spy('migration') }

  before do
    allow(Gitlab::BackgroundMigration::MigrateStageIndex)
      .to receive(:new).and_return(migration)
  end

  context 'when there are pending background migrations' do
    it 'processes pending jobs synchronously' do
      Sidekiq::Testing.disable! do
        BackgroundMigrationWorker
          .perform_in(2.minutes, 'MigrateStageIndex', [1, 1])
        BackgroundMigrationWorker
          .perform_async('MigrateStageIndex', [1, 1])

        migrate!

        expect(migration).to have_received(:perform).with(1, 1).twice
      end
    end
  end

  context 'when there are no background migrations pending' do
    it 'does nothing' do
      Sidekiq::Testing.disable! do
        migrate!

        expect(migration).not_to have_received(:perform)
      end
    end
  end

  context 'when there are still unmigrated stages present' do
    let(:stages) { table('ci_stages') }
    let(:builds) { table('ci_builds') }

    let!(:entities) do
      %w[build test broken].map do |name|
        stages.create(name: name)
      end
    end

    before do
      stages.update_all(position: nil)

      builds.create(name: 'unit', stage_id: entities.first.id, stage_idx: 1, ref: 'master')
      builds.create(name: 'unit', stage_id: entities.second.id, stage_idx: 1, ref: 'master')
    end

    it 'migrates stages sequentially for every stage' do
      expect(stages.all).to all(have_attributes(position: nil))

      migrate!

      expect(migration).to have_received(:perform)
        .with(entities.first.id, entities.first.id)
      expect(migration).to have_received(:perform)
        .with(entities.second.id, entities.second.id)
      expect(migration).not_to have_received(:perform)
        .with(entities.third.id, entities.third.id)
    end
  end
end
require 'spec_helper'
require Rails.root.join('db', 'post_migrate', '20180619121030_enqueue_delete_diff_files_workers.rb')

describe EnqueueDeleteDiffFilesWorkers, :migration, :sidekiq do
  let(:merge_request_diffs) { table(:merge_request_diffs) }
  let(:merge_requests) { table(:merge_requests) }
  let(:namespaces) { table(:namespaces) }
  let(:projects) { table(:projects) }

  before do
    stub_const("#{described_class.name}::BATCH_SIZE", 2)

    namespaces.create!(id: 1, name: 'gitlab', path: 'gitlab')
    projects.create!(id: 1, namespace_id: 1, name: 'gitlab', path: 'gitlab')
    merge_requests.create!(id: 1, target_project_id: 1, source_project_id: 1, target_branch: 'feature', source_branch: 'master', state: 'merged')
    merge_request_diffs.create!(id: 1, merge_request_id: 1, state: 'collected')
    merge_request_diffs.create!(id: 2, merge_request_id: 1, state: 'without_files')
    merge_request_diffs.create!(id: 3, merge_request_id: 1, state: 'collected')
    merge_request_diffs.create!(id: 4, merge_request_id: 1, state: 'collected')
    merge_request_diffs.create!(id: 5, merge_request_id: 1, state: 'empty')
    merge_request_diffs.create!(id: 6, merge_request_id: 1, state: 'collected')
    merge_requests.update(1, latest_merge_request_diff_id: 6)
  end

  it 'correctly schedules diff file deletion workers' do
    Sidekiq::Testing.fake! do
      Timecop.freeze do
        migrate!

        # 1st batch
        expect(described_class::MIGRATION).to be_scheduled_delayed_migration(8.minutes, 1)
        expect(described_class::MIGRATION).to be_scheduled_delayed_migration(9.minutes, 3)
        # 2nd batch
        expect(described_class::MIGRATION).to be_scheduled_delayed_migration(16.minutes, 4)
        expect(described_class::MIGRATION).to be_scheduled_delayed_migration(17.minutes, 6)

        expect(BackgroundMigrationWorker.jobs.size).to eq(4)
      end
    end
  end

  it 'migrates the data' do
    expect { migrate! }.to change { merge_request_diffs.where(state: 'without_files').count }
      .from(1).to(4)
  end
end
@@ -534,11 +534,18 @@ describe Discussion, ResolvableDiscussion do
   describe "#last_resolved_note" do
     let(:current_user) { create(:user) }
+    let(:time) { Time.now.utc }

     before do
-      first_note.resolve!(current_user)
-      third_note.resolve!(current_user)
-      second_note.resolve!(current_user)
+      Timecop.freeze(time - 1.second) do
+        first_note.resolve!(current_user)
+      end
+      Timecop.freeze(time) do
+        third_note.resolve!(current_user)
+      end
+      Timecop.freeze(time + 1.second) do
+        second_note.resolve!(current_user)
+      end
     end

     it "returns the last note that was resolved" do
...
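The Timecop changes above give each resolution a distinct timestamp so the ordering under test is deterministic. A minimal sketch of the assumed semantics, that the last resolved note is the one with the latest `resolved_at`, which here is `second_note` even though it was not resolved last in source order of the old setup:

```ruby
# Hypothetical sketch: picking the "last resolved" note by resolved_at,
# mirroring the time offsets the spec freezes (-1s, 0, +1s).
t = Time.utc(2018, 6, 29, 12, 0, 0)

notes = [
  { name: 'first_note',  resolved_at: t - 1 },
  { name: 'third_note',  resolved_at: t },
  { name: 'second_note', resolved_at: t + 1 }
]

last_resolved = notes.max_by { |note| note[:resolved_at] }
last_resolved[:name] # => "second_note"
```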
@@ -614,13 +614,13 @@ describe Project do
                                  last_activity_at: timestamp,
                                  last_repository_updated_at: timestamp - 1.hour)

-        expect(project.last_activity_date).to eq(timestamp)
+        expect(project.last_activity_date).to be_like_time(timestamp)

         project.update_attributes(updated_at: timestamp,
                                   last_activity_at: timestamp - 1.hour,
                                   last_repository_updated_at: nil)

-        expect(project.last_activity_date).to eq(timestamp)
+        expect(project.last_activity_date).to be_like_time(timestamp)
       end
     end
   end
...
@@ -2056,6 +2056,38 @@ describe API::Projects do
     end
   end

+  describe 'PUT /projects/:id/transfer' do
+    context 'when authenticated as owner' do
+      let(:group) { create :group }
+
+      it 'transfers the project to the new namespace' do
+        group.add_owner(user)
+
+        put api("/projects/#{project.id}/transfer", user), namespace: group.id
+
+        expect(response).to have_gitlab_http_status(200)
+      end
+
+      it 'fails when transferring to a non owned namespace' do
+        put api("/projects/#{project.id}/transfer", user), namespace: group.id
+
+        expect(response).to have_gitlab_http_status(404)
+      end
+
+      it 'fails when transferring to an unknown namespace' do
+        put api("/projects/#{project.id}/transfer", user), namespace: 'unknown'
+
+        expect(response).to have_gitlab_http_status(404)
+      end
+
+      it 'fails on missing namespace' do
+        put api("/projects/#{project.id}/transfer", user)
+
+        expect(response).to have_gitlab_http_status(400)
+      end
+    end
+  end
+
   it_behaves_like 'custom attributes endpoints', 'projects' do
     let(:attributable) { project }
     let(:other_attributable) { project2 }
...
require 'spec_helper'

describe 'OAuth Tokens requests' do
  let(:user) { create :user }
  let(:application) { create :oauth_application, scopes: 'api' }

  def request_access_token(user)
    post '/oauth/token',
         grant_type: 'authorization_code',
         code: generate_access_grant(user).token,
         redirect_uri: application.redirect_uri,
         client_id: application.uid,
         client_secret: application.secret
  end

  def generate_access_grant(user)
    create :oauth_access_grant, application: application, resource_owner_id: user.id
  end

  context 'when there is already a token for the application' do
    let!(:existing_token) { create :oauth_access_token, application: application, resource_owner_id: user.id }

    context 'and the request is done by the resource owner' do
      it 'reuses and returns the stored token' do
        expect do
          request_access_token(user)
        end.not_to change { Doorkeeper::AccessToken.count }

        expect(json_response['access_token']).to eq existing_token.token
      end
    end

    context 'and the request is done by a different user' do
      let(:other_user) { create :user }

      it 'generates and returns a different token for a different owner' do
        expect do
          request_access_token(other_user)
        end.to change { Doorkeeper::AccessToken.count }.by(1)

        expect(json_response['access_token']).not_to be_nil
      end
    end
  end

  context 'when there is no token stored for the application' do
    it 'generates and returns a new token' do
      expect do
        request_access_token(user)
      end.to change { Doorkeeper::AccessToken.count }.by(1)

      expect(json_response['access_token']).not_to be_nil
    end
  end
end
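The behaviour these OAuth specs pin down is one reusable access token per application/resource-owner pair, and a fresh token for any other owner. A framework-free sketch of that reuse rule, where `TokenStore` is illustrative and not Doorkeeper's API:

```ruby
# Hypothetical sketch: reuse the stored token for a given (application,
# resource owner) pair; mint a new one only when none exists yet.
require 'securerandom'

class TokenStore
  def initialize
    @tokens = {} # [app_id, owner_id] => token string
  end

  def count
    @tokens.size
  end

  # Return the existing token for this pair, creating one on first request.
  def find_or_create(app_id, owner_id)
    @tokens[[app_id, owner_id]] ||= SecureRandom.hex(16)
  end
end

store  = TokenStore.new
first  = store.find_or_create(1, 100)
second = store.find_or_create(1, 100) # same owner: token is reused
other  = store.find_or_create(1, 200) # different owner: new token

first == second # => true
store.count     # => 2
```

This matches the spec's expectations: no change in `Doorkeeper::AccessToken.count` for the resource owner's repeat request, and a count increase of one for a different user.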