Ivan Tyagov / slapos · Commit c4c2e029
authored Nov 27, 2019 by Thomas Gambier 🚴🏼

Update Release Candidate

Parents: 87bea58c d54b88b0

Showing 16 changed files with 224 additions and 88 deletions (+224 -88)
  component/qemu-kvm/buildout.cfg                      +2   -2
  software/kvm/buildout.hash.cfg                       +1   -1
  software/kvm/template/template-kvm-run.in            +3   -1
  software/kvm/test/test.py                            +28  -0
  software/slaprunner/buildout.hash.cfg                +4   -4
  software/slaprunner/instance-resilient.cfg.jinja2    +3   -0
  software/slaprunner/instance-runner-import.cfg.in    +16  -4
  software/slaprunner/instance-runner.cfg              +43  -40
  software/slaprunner/instance.cfg                     +1   -0
  software/slaprunner/test/test.py                     +28  -0
  stack/resilient/buildout.hash.cfg                    +5   -5
  stack/resilient/instance-pull-backup.cfg.in          +16  -4
  stack/resilient/pbsready-export.cfg.in               +8   -2
  stack/resilient/pbsready-import.cfg.in               +13  -7
  stack/resilient/pbsready.cfg.in                      +28  -11
  stack/resilient/template-replicated.cfg.in           +25  -7
component/qemu-kvm/buildout.cfg

@@ -19,8 +19,8 @@ extends =
 [kvm]
 recipe = slapos.recipe.cmmi
 # qemu-kvm and qemu are now the same since 1.3.
-url = https://download.qemu.org/qemu-4.1.0.tar.xz
-md5sum = cdf2b5ca52b9abac9bacb5842fa420f8
+url = https://download.qemu.org/qemu-4.1.1.tar.xz
+md5sum = 53879f792ef2675c6c5e6cbf5cc1ac6c
 configure-options =
     --target-list="$(uname -m 2>/dev/null|sed 's,^i[456]86$,i386,')-softmmu"
     --enable-system
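This first change simply bumps the QEMU source from 4.1.0 to 4.1.1 and records the new checksum. To double-check the tarball against the md5sum above, a minimal sketch using standard tools (the download location is up to you):

  curl -LO https://download.qemu.org/qemu-4.1.1.tar.xz
  md5sum qemu-4.1.1.tar.xz   # expected: 53879f792ef2675c6c5e6cbf5cc1ac6c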
software/kvm/buildout.hash.cfg

@@ -55,7 +55,7 @@ md5sum = 2036bf145f472f62ef8dee5e729328fd
 [template-kvm-run]
 filename = template/template-kvm-run.in
-md5sum = 2a49e6065a3f46f871318ba88f0cd235
+md5sum = 71e0c6c8a0b9a46498f25b67a3ab6894

 [template-kvm-controller]
 filename = template/kvm-controller-run.in
software/kvm/template/template-kvm-run.in

@@ -242,7 +242,8 @@ additional_disk_options = ''
 if disk_aio == 'native':
   additional_disk_options += ',cache.direct=on'
+if disk_format == "raw":
+  additional_disk_options += ',discard=on'
 # Generate network parameters
 # XXX: use_tap should be a boolean

@@ -305,6 +306,7 @@ kvm_argument_list = [qemu_path,
     '-qmp', 'unix:%s,server,nowait' % socket_path,
     '-pidfile', pid_file_path, '-msg', 'timestamp=on',
     '-D', logfile,
+    '-nodefaults',
     ]
 rgx = re.compile('^[\w*\,][\=\d+\-\,\w]*$')
software/kvm/test/test.py

@@ -280,3 +280,31 @@ class TestAccessKvmClusterAdditional(MonitorAccessMixin, InstanceTestCase):
       result.status_code
     )
     self.assertTrue('<title>noVNC</title>' in result.text)
+
+
+@unittest.skipIf(not sanityCheck(), 'missing kvm_intel module')
+class TestInstanceResilient(InstanceTestCase):
+  __partition_reference__ = 'ir'
+  instance_max_retry = 20
+
+  @classmethod
+  def getInstanceSoftwareType(cls):
+    return 'kvm-resilient'
+
+  def test(self):
+    # just check that keys returned on requested partition are for resilient
+    self.assertSetEqual(
+      set(self.computer_partition.getConnectionParameterDict().keys()),
+      set([
+        'backend-url',
+        'feed-url-kvm-1-pull',
+        'feed-url-kvm-1-push',
+        'ipv6',
+        'ipv6-network-info',
+        'monitor-base-url',
+        'monitor-password',
+        'monitor-setup-url',
+        'monitor-user',
+        'takeover-kvm-1-password',
+        'takeover-kvm-1-url',
+        'url']))
software/slaprunner/buildout.hash.cfg

@@ -14,11 +14,11 @@
 # not need these here).
 [template]
 filename = instance.cfg
-md5sum = 317c49bf451e80bf0f9d44baa603861e
+md5sum = 8b78e32b877d591400746ec7fd68ed4c

 [template-runner]
 filename = instance-runner.cfg
-md5sum = bacb2d1a38d3a512025e861debdc75b2
+md5sum = 85ea0b78fd18428c242438ebe95f980b

 [template-runner-import-script]
 filename = template/runner-import.sh.jinja2

@@ -26,7 +26,7 @@ md5sum = fc22e2d2f03ce58631f157a5b4943e15
 [instance-runner-import]
 filename = instance-runner-import.cfg.in
-md5sum = 1f1c62f2bc09a6ab3a2f96eacdf99492
+md5sum = b450c474464a326f3d0b98728460ac97

 [instance-runner-export]
 filename = instance-runner-export.cfg.in

@@ -34,7 +34,7 @@ md5sum = b992bb3391de9d6d422bfa8011d8ffc4
 [template-resilient]
 filename = instance-resilient.cfg.jinja2
-md5sum = 0f3d75ca834839c5ae04e9c26cca289a
+md5sum = 2271c829b94542b7b2d9c589376ae538

 [template_nginx_conf]
 filename = nginx_conf.in
software/slaprunner/instance-resilient.cfg.jinja2

@@ -26,6 +26,9 @@ eggs-directory = {{ eggs_directory }}
 develop-eggs-directory = {{ develop_eggs_directory }}
 offline = true
+
+extends =
+  {{ monitor_template }}
 # += because we need to take up parts (like instance-custom, slapmonitor etc) from the profile we extended
 parts +=
   publish-early
software/slaprunner/instance-runner-import.cfg.in

@@ -112,7 +112,7 @@ context =
   raw restore_exit_code_file ${:restore-exit-code-file}
   raw restore_error_message_file ${:restore-error-message-file}

-[importer-consistency-promise]
+[importer-consistency-promise-bin]
 # Test that the importer script and "after-import" subscripts
 # are not older than 2 days (1 day + some slack), and have succeeded
 recipe = collective.recipe.template

@@ -135,10 +135,16 @@ input = inline: #!/bin/sh
       fi
     fi
     exit 1; # Something else went wrong
-output = ${directory:promises}/importer-consistency-promise
+output = ${directory:bin}/importer-consistency-promise
 mode = 755

-[software-release-deployment-promise]
+[importer-consistency-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = importer-consistency-promise.py
+config-command = ${importer-consistency-promise-bin:output}
+
+[software-release-deployment-bin]
 recipe = collective.recipe.template
 input = inline: #!/bin/sh
   PROJECT_FILE=$(find "${directory:etc}" -maxdepth 1 -name .project)

@@ -153,9 +159,15 @@ input = inline: #!/bin/sh
     fi
   fi
   exit 1
-output = ${directory:promises}/software-release-deployment-promise
+output = ${directory:bin}/software-release-deployment-promise
 mode = 755

+[software-release-deployment-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = software-release-deployment-promise.py
+config-command =${software-release-deployment-bin:output}
+
 [resilient-software-release-information]
 recipe = slapos.recipe.template
 url = {{ software_release_information_template }}
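Most of the remaining files follow the same conversion shown above: a promise that used to be a script dropped into etc/promise is split into a "-bin" part that renders the script into bin/, plus a part inheriting from monitor-promise-base that runs it as a monitor plugin through check_command_execute. Schematically (a generic sketch with placeholder section names, not copied from any single file in this commit):

  [my-check-bin]
  recipe = collective.recipe.template
  input = inline: #!/bin/sh
    # exit 0 on success, non-zero on failure
  output = ${directory:bin}/my-check
  mode = 755

  [my-check]
  <= monitor-promise-base
  module = check_command_execute
  name = my-check.py
  config-command = ${my-check-bin:output}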
software/slaprunner/instance-runner.cfg

@@ -85,20 +85,18 @@ config-url = {{ slapparameter_dict.get('custom-frontend-backend-url') }}
 return = site_url domain

 [custom-frontend-promise]
-recipe = slapos.cookbook:check_url_available
-path = $${directory:promises}/custom_frontend_promise
-url = https://$${request-custom-frontend:connection-domain}
+<= monitor-promise-base
+module = check_url_available
+name = custom_frontend_promise.py
+config-url = https://$${request-custom-frontend:connection-domain}
 {% if slapparameter_dict.get('custom-frontend-basic-auth') -%}
-check-secure = 1
+config-check-secure = 1
 {% endif -%}
-dash_path = {{ dash_executable_location }}
-curl_path = {{ curl_executable_location }}

-[custom-frontend-url-ready-promise]
+[custom-frontend-url-ready-promise-bin]
 recipe = slapos.recipe.template:jinja2
-path = $${directory:promises}/custom_frontend_ready_promise
 url = https://$${request-custom-frontend:connection-domain}
-rendered = $${directory:promises}/custom_frontend_ready_promise
+rendered = $${directory:bin}/custom_frontend_ready_promise
 template = inline:
   #!{{ dash_executable_location }}

@@ -110,6 +108,12 @@ template = inline:
     exit 1
   fi

+[custom-frontend-url-ready-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = custom_frontend_ready_promise.py
+config-command = $${custom-frontend-url-ready-promise-bin:rendered}
+
 [publish-connection-information]
 custom-frontend-url = $${custom-frontend-url-ready-promise:url}
 {% endif %}

@@ -135,7 +139,6 @@ ssh = $${:etc}/ssh/
 log = $${:var}/log/
 run = $${:var}/run/
 backup = $${:srv}/backup/
-promises = $${:etc}/promise/
 test = $${:etc}/test/
 nginx-data = $${:srv}/nginx
 ca-dir = $${:srv}/ssl

@@ -476,13 +479,12 @@ output = $${directory:scripts}/slaprunner-httpd-graceful
 mode = 700

 [apache-httpd-promise]
-recipe = slapos.cookbook:check_url_available
-path = $${directory:promises}/$${:filename}
+<= monitor-promise-base
+module = check_url_available
+name = $${:filename}.py
 filename = apache-httpd-listening-on-tcp
-url = $${apache-httpd:access-url}
-check-secure = 1
-dash_path = {{ dash_executable_location }}
-curl_path = {{ curl_executable_location }}
+config-url = $${apache-httpd:access-url}
+config-check-secure = 1

 [slaprunner-httpd-cors]
 recipe = plone.recipe.command

@@ -579,12 +581,11 @@ config-domain = $${slap-parameter:frontend-domain}
 return = site_url domain

 [slaprunner-frontend-promise]
-recipe = slapos.cookbook:check_url_available
-path = $${directory:promises}/slaprunner_frontend
-url = https://$${request-frontend:connection-domain}/login
-dash_path = ${dash:location}/bin/dash
-curl_path = ${curl:location}/bin/curl
-check-secure = 1
+<= monitor-promise-base
+module = check_url_available
+name = slaprunner_frontend.py
+config-url = https://$${request-frontend:connection-domain}/login
+config-check-secure = 1

 [request-httpd-frontend]
 <= slap-connection

@@ -600,12 +601,11 @@ config-domain =
 return = secure_access domain

 [httpd-frontend-promise]
-recipe = slapos.cookbook:check_url_available
-path = $${directory:promises}/slaprunner-apache-http-frontend
-url = $${request-httpd-frontend:connection-secure_access}
-dash_path = {{ dash_executable_location }}
-curl_path = {{ curl_executable_location }}
-check-secure = 1
+<= monitor-promise-base
+module = check_url_available
+name = slaprunner-apache-http-frontend.py
+config-url = $${request-httpd-frontend:connection-secure_access}
+config-check-secure = 1
 {% endif %}

@@ -667,16 +667,18 @@ monitor-password = $${monitor-publish-parameters:monitor-password}
 #-- Deploy promises scripts

 [slaprunner-promise]
-recipe = slapos.cookbook:check_port_listening
-path = $${directory:promises}/slaprunner
-hostname = $${slaprunner:ipv6}
-port = $${slaprunner:runner_port}
+<= monitor-promise-base
+module = check_port_listening
+name = slaprunner.py
+config-hostname = $${slaprunner:ipv6}
+config-port = $${slaprunner:runner_port}

 [runner-sshd-promise]
-recipe = slapos.cookbook:check_port_listening
-path = $${directory:promises}/runner-sshd
-hostname = $${slap-network-information:global-ipv6}
-port = $${runner-sshd-port:port}
+<= monitor-promise-base
+module = check_port_listening
+name = runner-sshd.py
+config-hostname = $${slap-network-information:global-ipv6}
+config-port = $${runner-sshd-port:port}

 [symlinks]
 recipe = cns.recipe.symlink

@@ -891,10 +893,11 @@ name = slapgrid
 log = $${runnerdirectory:home}/instance/*/.slapgrid/log/instance.log $${runnerdirectory:home}/instance/*/.slapgrid/promise/log/*.log

 [supervisord-promise]
-recipe = slapos.cookbook:check_port_listening
-path = $${directory:promises}/supervisord
-hostname = $${slaprunner:ipv4}
-port = $${supervisord:port}
+<= monitor-promise-base
+module = check_port_listening
+name = supervisord.py
+config-hostname = $${slaprunner:ipv4}
+config-port = $${supervisord:port}

 # XXX Monitor
 [monitor-instance-parameter]
software/slaprunner/instance.cfg

@@ -42,6 +42,7 @@ context = key buildout buildout:bin-directory
   key develop_eggs_directory buildout:develop-eggs-directory
   key eggs_directory buildout:eggs-directory
   key slapparameter_dict slap-configuration:configuration
+  raw monitor_template ${monitor-template:rendered}
 template-parts-destination = ${template-parts:target}
 template-replicated-destination = ${template-replicated:target}
 import-list = file parts :template-parts-destination
software/slaprunner/test/test.py

@@ -269,3 +269,31 @@ class ServicesTestCase(SlaprunnerTestCase):
       expected_process_name = name.format(hash=h)

       self.assertIn(expected_process_name, process_names)
+
+
+class TestInstanceResilient(SlaprunnerTestCase):
+  instance_max_retry = 20
+
+  @classmethod
+  def getInstanceSoftwareType(cls):
+    return 'resilient'
+
+  def test(self):
+    # just check that keys returned on requested partition are for resilient
+    self.assertSetEqual(
+      set(self.computer_partition.getConnectionParameterDict().keys()),
+      set([
+        'backend-url',
+        'feed-url-runner-1-pull',
+        'feed-url-runner-1-push',
+        'git-private-url',
+        'git-public-url',
+        'init-password',
+        'init-user',
+        'monitor-base-url',
+        'monitor-setup-url',
+        'public-url',
+        'ssh-command',
+        'takeover-runner-1-password',
+        'takeover-runner-1-url',
+        'url',
+        'webdav-url']))
stack/resilient/buildout.hash.cfg

@@ -14,23 +14,23 @@
 # not need these here).
 [pbsready]
 filename = pbsready.cfg.in
-md5sum = f3bf5e1d8bbfbb428c5bbe3a57d8cbe5
+md5sum = 5e0dcd4c290f0b46cb2d316dc1c9c011

 [pbsready-import]
 filename = pbsready-import.cfg.in
-md5sum = 9d36d08ac6ae351b598a67db41657cc6
+md5sum = d813c43ed00eff868fb13bc75b045336

 [pbsready-export]
 filename = pbsready-export.cfg.in
-md5sum = c6c11db5372150019debb1ce519b907d
+md5sum = 2e804e06b5203c3f127c31a1704c48bd

 [template-pull-backup]
 filename = instance-pull-backup.cfg.in
-md5sum = 57b9b421d233402e6d5177c69cf9567e
+md5sum = 0bbe16f3d805afd880a251a4f40ecaf1

 [template-replicated]
 filename = template-replicated.cfg.in
-md5sum = 7392935be29d89f8224bccac78e3ecd0
+md5sum = 290b380fe3da8736642bc10a8b1163d1

 [template-parts]
 filename = template-parts.cfg.in
stack/resilient/instance-pull-backup.cfg.in

@@ -222,22 +222,34 @@ wrapper-path = $${directory:bin}/resilient-genstatrss.py
 recipe = cns.recipe.symlink
 symlink = $${pbs:rdiff-backup-data-folder}/restore.log = $${basedirectory:log}/pbs-push-history-log

-[pull-push-stalled-promise]
+[pull-push-stalled-promise-bin]
 recipe = slapos.cookbook:wrapper
 # # time-buffer is 24h (+1h of latitude)
 command-line = ${buildout:bin-directory}/check-feed-as-promise --feed-path $${pbs-resilient-status-feed:feed-path} --title --ok-pattern 'OK' --time-buffer 90000
-wrapper-path = $${basedirectory:promises}/stalled-pull-push
+wrapper-path = $${rootdirectory:bin}/stalled-pull-push

-[notifier-feed-status-promise]
+[pull-push-stalled-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = stalled-pull-push.py
+config-command = $${pull-push-stalled-promise-bin:wrapper-path}
+
+[notifier-feed-status-promise-bin]
 recipe = slapos.recipe.template:jinja2
 template = ${notifier-feed-promise-template:target}
-rendered = $${basedirectory:promises}/notifier-feed-check-malformed-or-failure.py
+rendered = $${rootdirectory:bin}/notifier-feed-check-malformed-or-failure.py
 mode = 700
 context =
   key notifier_feed_directory directory:notifier-feeds
   raw base_url http://[$${notifier:host}]:$${notifier:port}/get/
   raw python_executable ${buildout:executable}

+[notifier-feed-status-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = notifier-feed-check-malformed-or-failure.py
+config-command = $${notifier-feed-status-promise-bin:rendered}
+
 #----------------
 #--
 #-- Publish instance parameters.
stack/resilient/pbsready-export.cfg.in

@@ -46,7 +46,7 @@ max-run = 3
 [logrotate-entry-notifier]
 rendered = ${rootdirectory:etc}/logrotate_notifier.conf

-[notifier-exporter-promise]
+[notifier-exporter-promise-bin]
 recipe = slapos.recipe.template:jinja2
 mode = 700
 template = inline:

@@ -56,7 +56,13 @@ template = inline:
   if [ -s "$EXPORTER_FEED" ]; then
     tail -n 1 $EXPORTER_FEED | grep -vq FAILURE_PATTERN
   fi
-rendered = ${basedirectory:promises}/exporter-status
+rendered = ${rootdirectory:bin}/exporter-status
+
+[notifier-exporter-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = exporter-status.py
+config-command = ${notifier-exporter-promise-bin:rendered}

 [cron-entry-backup]
 # Schedule the periodic database dump.
stack/resilient/pbsready-import.cfg.in

@@ -65,7 +65,7 @@ recipe = slapos.cookbook:notifier.callback
 on-notification-id = $${slap-parameter:on-notification}
 callback = $${post-notification-run:output}

-[backup-checksum-integrity-promise]
+[backup-checksum-integrity-promise-bin]
 recipe = slapos.recipe.template:jinja2
 template = inline:
   #!/${bash:location}/bin/bash

@@ -80,9 +80,16 @@ template = inline:
     # If file doesn't exist, promise shouldnt raise false positive
     exit 0;
   fi
-rendered = $${basedirectory:promises}/backup-checksum-integrity
+rendered = $${rootdirectory:bin}/backup-checksum-integrity
 mode = 700

+[backup-checksum-integrity-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = backup-checksum-integrity.py
+config-command = $${backup-checksum-integrity-promise-bin:rendered}
+
 ###########
 # Generate the takeover script
 ###########

@@ -155,11 +162,10 @@ command-line = $${:apache-executable} -f $${resilient-web-takeover-httpd-configu
 wrapper-path = $${basedirectory:services}/resilient-web-takeover-httpd

 [resilient-web-takeover-httpd-promise]
-recipe = slapos.cookbook:check_url_available
-path = $${basedirectory:promises}/resilient-web-takeover-httpd
-url = http://[$${resilient-web-takeover-httpd-configuration-file:listening-ip}]:$${resilient-web-takeover-httpd-configuration-file:listening-port}/
-dash_path = ${dash:location}/bin/dash
-curl_path = ${curl:location}/bin/curl
+<= monitor-promise-base
+module = check_url_available
+name = resilient-web-takeover-httpd.py
+config-url = http://[$${resilient-web-takeover-httpd-configuration-file:listening-ip}]:$${resilient-web-takeover-httpd-configuration-file:listening-port}/

 ###########
 # Symlinks
stack/resilient/pbsready.cfg.in

@@ -40,7 +40,6 @@ services = $${rootdirectory:etc}/service
 run = $${rootdirectory:var}/run
 scripts = $${rootdirectory:etc}/run
 backup = $${rootdirectory:srv}/backup
-promises = $${rootdirectory:etc}/promise
 services = $${rootdirectory:etc}/service
 cache = $${rootdirectory:var}/cache
 notifier = $${rootdirectory:etc}/notifier

@@ -152,11 +151,17 @@ name = resilient-notifier-status-feed
 frequency = */5 * * * *
 command = $${notifier-resilient-status-feed:wrapper-path}

-[notifier-stalled-promise]
+[notifier-stalled-promise-bin]
 recipe = slapos.cookbook:wrapper
 # time-buffer is 24h (+1h of latitude)
 command-line = ${buildout:bin-directory}/check-feed-as-promise --feed-path $${notifier-resilient-status-feed:feed-path} --title --ok-pattern 'OK' --time-buffer 90000
-wrapper-path = $${basedirectory:promises}/stalled-notifier-callbacks
+wrapper-path = $${rootdirectory:bin}/stalled-notifier-callbacks
+
+[notifier-stalled-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = stalled-notifier-callbacks.py
+config-command = $${notifier-stalled-promise-bin:wrapper-path}

 #----------------
 #--

@@ -206,10 +211,11 @@ command-line = $${directory:bin}/killpidfromfile $${resilient-sshd-config:path_p
 wrapper-path = $${basedirectory:scripts}/sshd-graceful

 [sshd-promise]
-recipe = slapos.cookbook:check_port_listening
-path = $${basedirectory:promises}/sshd
-hostname = $${slap-network-information:global-ipv6}
-port = $${sshd-port:port}
+<= monitor-promise-base
+module = check_port_listening
+name = sshd.py
+config-hostname = $${slap-network-information:global-ipv6}
+config-port = $${sshd-port:port}

 #----------------
 #--

@@ -237,7 +243,7 @@ public-key = $${sshd-raw-server:rsa-keyfile}.pub
 private-key = $${sshd-raw-server:rsa-keyfile}
 wrapper = $${basedirectory:services}/sshd

-[resilient-sshkeys-sshd-promise]
+[resilient-sshkeys-sshd-promise-bin]
 # Check that public key file exists and is not empty
 recipe = collective.recipe.template
 input = inline:#!${bash:location}/bin/bash

@@ -245,23 +251,34 @@ input = inline:#!${bash:location}/bin/bash
   if [[ ! -n "$PUBLIC_KEY_CONTENT" || "$PUBLIC_KEY_CONTENT" == *None* ]]; then
     exit 1
   fi
-output = $${basedirectory:promises}/public-key-existence
+output = $${rootdirectory:bin}/public-key-existence
 mode = 700

+[resilient-sshkeys-sshd-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = public-key-existence.py
+config-command = $${resilient-sshkeys-sshd-promise-bin:output}
+
 #----------------
 #--
 #-- Promises

-[notifier-feed-status-promise]
+[notifier-feed-status-promise-bin]
 recipe = slapos.recipe.template:jinja2
 template = ${notifier-feed-promise-template:target}
-rendered = $${basedirectory:promises}/notifier-feed-check-malformed-or-failure.py
+rendered = $${rootdirectory:bin}/notifier-feed-check-malformed-or-failure.py
 mode = 700
 context =
   key notifier_feed_directory directory:notifier-feeds
   raw base_url http://[$${notifier:host}]:$${notifier:port}/get/
   raw python_executable ${buildout:executable}

+[notifier-feed-status-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = notifier-feed-check-malformed-or-failure.py
+config-command = $${notifier-feed-status-promise-bin:rendered}
+
 #----------------
 #--
 #-- Connection informations to re-use.
stack/resilient/template-replicated.cfg.in

@@ -18,7 +18,7 @@
 recipe = slapos.cookbook:mkdirectory
 home = ${buildout:directory}
 etc = ${:home}/etc
-promise = ${:etc}/promise
+bin = ${:home}/bin

 ## Tells the Backupable recipe that we want a backup

@@ -137,7 +137,7 @@ takeover-{{namebase}}-{{id}}-password = ${request-{{namebase}}-pseudo-replicatin
-[resilient-request-{{namebase}}-public-key-promise]
+[resilient-request-{{namebase}}-public-key-promise-bin]
 # Check that public-key-value parameter exists and is not empty
 # XXX: maybe we should consider empty values to be non-nexistent.
 recipe = collective.recipe.template

@@ -147,12 +147,19 @@ input = inline:#!/bin/bash
   if [[ ! -n "$PUBLIC_KEY_CONTENT" || "$PUBLIC_KEY_CONTENT" == *None* ]]; then
     exit 1
   fi
-output = ${resilient-directory:promise}/resilient-request-{{namebase}}-public-key
+output = ${resilient-directory:bin}/resilient-request-{{namebase}}-public-key
 mode = 700

+[resilient-request-{{namebase}}-public-key-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = resilient-request-{{namebase}}-public-key.py
+config-command = ${resilient-request-{{namebase}}-public-key-promise-bin:output}
+
 {% for id in range(1,nbbackup|int) %}
-[resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key-promise]
+[resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key-promise-bin]
 # Check that public-key-value parameter exists and is not empty
 # XXX: maybe we should consider empty values to be non-nexistent.
 recipe = collective.recipe.template

@@ -162,9 +169,15 @@ input = inline:#!/bin/bash
   if [[ ! -n "$PUBLIC_KEY_CONTENT" || "$PUBLIC_KEY_CONTENT" == *None* ]]; then
     exit 1
   fi
-output = ${resilient-directory:promise}/resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key
+output = ${resilient-directory:bin}/resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key
 mode = 700

+[resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key.py
+config-command = ${resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key-promise-bin:output}
+
 {% endfor %}

@@ -214,7 +227,7 @@ sla-{{ key }} = {{ value }}
 {% endfor %}
 {% endif %}

-[resilient-request-pbs-{{namebase}}-{{id}}-public-key-promise]
+[resilient-request-pbs-{{namebase}}-{{id}}-public-key-promise-bin]
 # Check that public-key-value parameter exists and is not empty
 # XXX: maybe we should consider empty values to be non-nexistent.
 recipe = collective.recipe.template

@@ -224,9 +237,14 @@ input = inline:#!/bin/bash
   if [[ ! -n "$PUBLIC_KEY_CONTENT" || "$PUBLIC_KEY_CONTENT" == *None* ]]; then
     exit 1
   fi
-output = ${resilient-directory:promise}/resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key
+output = ${resilient-directory:bin}/resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key
 mode = 700

+[resilient-request-pbs-{{namebase}}-{{id}}-public-key-promise]
+<= monitor-promise-base
+module = check_command_execute
+name = resilient-request-{{namebase}}-pseudo-replicating-{{id}}-public-key
+config-command = ${resilient-request-pbs-{{namebase}}-{{id}}-public-key-promise-bin:output}

 [request-pull-backup-server-{{namebase}}-{{id}}]
 <= request-pbs-common