Commit db5ee6df
Authored Aug 30, 2019 by Łukasz Nowak

Update release candidate

Parents: 066e6749 7768504e

Showing 42 changed files with 320 additions and 533 deletions (+320 / -533)
component/apache/apache-backend.conf.in  +3 -5
component/apache/buildout.hash.cfg  +1 -1
component/caddy/buildout.cfg  +1 -1
component/git/buildout.cfg  +2 -2
component/golang/buildout.cfg  +2 -2
component/lz4/buildout.cfg  +2 -2
component/zstd/buildout.cfg  +2 -2
setup.py  +0 -2
slapos/recipe/generic_mysql/__init__.py  +0 -216
slapos/recipe/generic_mysql/innobackupex.py  +0 -24
slapos/recipe/generic_mysql/mysql.py  +0 -49
slapos/recipe/generic_mysql/template/initmysql.sql.in  +0 -3
slapos/recipe/generic_mysql/template/my.cnf.in  +0 -60
slapos/recipe/generic_mysql/template/mysql-init-function.sql.in  +0 -6
slapos/test/utils.py  +11 -10
software/apache-frontend/buildout.hash.cfg  +2 -2
software/apache-frontend/templates/apache.conf.in  +0 -3
software/caddy-frontend/buildout.hash.cfg  +7 -7
software/caddy-frontend/instance-apache-frontend.cfg.in  +2 -0
software/caddy-frontend/instance-caddy-input-schema.json  +6 -0
software/caddy-frontend/instance.cfg.in  +1 -0
software/caddy-frontend/templates/Caddyfile.in  +0 -6
software/caddy-frontend/templates/apache-custom-slave-list.cfg.in  +1 -0
software/caddy-frontend/templates/cached-virtualhost.conf.in  +2 -10
software/caddy-frontend/templates/default-virtualhost.conf.in  +2 -8
software/caddy-frontend/templates/trafficserver/records.config.jinja2  +8 -6
software/caddy-frontend/test/test.py  +182 -12
software/neoppod/cluster.cfg.in  +9 -5
software/neoppod/instance-common.cfg.in  +1 -0
software/neoppod/instance-neo-admin.cfg.in  +1 -1
software/neoppod/instance-neo-master.cfg.in  +1 -1
software/neoppod/instance-neo.cfg.in  +16 -32
software/neoppod/root-common.cfg.in  +9 -24
software/neoppod/software-common.cfg  +32 -7
software/neoppod/software-zodb5.cfg  +3 -2
software/neoppod/software.cfg  +1 -1
software/slapos-master/apache-backend.conf.in  +4 -6
software/slapos-master/buildout.hash.cfg  +1 -1
stack/erp5/buildout.cfg  +0 -6
stack/erp5/buildout.hash.cfg  +1 -1
stack/erp5/instance-erp5.cfg.in  +3 -6
stack/slapos.cfg  +1 -1
component/apache/apache-backend.conf.in
@@ -131,13 +131,11 @@ SSLCipherSuite ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:EC
 SSLSessionCache shmcb:{{ parameter_dict['ssl-session-cache'] }}(512000)
 SSLProxyEngine On
-# As backend is trusting REMOTE_USER header unset it always
-RequestHeader unset REMOTE_USER
-RequestHeader unset SSL_CLIENT_SERIAL
+# As backend is trusting Remote-User header unset it always
+RequestHeader unset Remote-User
 {% if parameter_dict['ca-cert'] -%}
   SSLVerifyClient optional
-  RequestHeader set REMOTE_USER %{SSL_CLIENT_S_DN_CN}s
-  RequestHeader set SSL_CLIENT_SERIAL "%{SSL_CLIENT_M_SERIAL}s"
+  RequestHeader set Remote-User %{SSL_CLIENT_S_DN_CN}s
   SSLCACertificateFile {{ parameter_dict['ca-cert'] }}
 {% if parameter_dict['crl'] -%}
   SSLCARevocationCheck chain
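Aside (not part of the commit): the comment above is the whole rationale for this hunk — the backend trusts whatever arrives in Remote-User, so the frontend must strip any client-supplied value and only re-set it from the CN of a verified client certificate. A minimal sketch of how one might probe that behaviour, with a placeholder URL and assuming the Python requests library is available:

import requests  # assumed to be installed

# A spoofed Remote-User sent by the client should never reach the backend;
# the frontend unsets it unconditionally before proxying.
response = requests.get(
    'https://frontend.example.com/',           # placeholder URL, not from the diff
    headers={'Remote-User': 'forged-admin'},   # attacker-supplied value
)
print(response.status_code)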
component/apache/buildout.hash.cfg
@@ -14,5 +14,5 @@
 # not need these here).
 [template-apache-backend-conf]
 filename = apache-backend.conf.in
-md5sum = 84d43d3535ffc67f677710b1d97e19aa
+md5sum = bb8c175a93336f0e1838fd47225426f9
component/caddy/buildout.cfg
@@ -17,7 +17,7 @@ depends =
 [caddy]
 # revision and repository can be used to control which caddy version is used
-revision = v1.0.1
+revision = v1.0.3
 repository = github.com/caddyserver/caddy/caddy
 recipe = plone.recipe.command
component/git/buildout.cfg
@@ -18,8 +18,8 @@ parts =
 [git]
 recipe = slapos.recipe.cmmi
 shared = true
-url = https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.17.1.tar.xz
-md5sum = 5179245515c637357b4a134e8d4e9a6f
+url = https://mirrors.edge.kernel.org/pub/software/scm/git/git-2.23.0.tar.xz
+md5sum = 93ee0f867f81a39e0ef29eabfb1d2c5b
 configure-options =
   --with-curl=${curl:location}
   --with-openssl=${openssl:location}
component/golang/buildout.cfg
@@ -64,8 +64,8 @@ environment-extra =
 [golang1.12]
 <= golang-common
-url = https://dl.google.com/go/go1.12.7.src.tar.gz
-md5sum = 49d7a658cbd825f1cfe903d050bad29f
+url = https://dl.google.com/go/go1.12.9.src.tar.gz
+md5sum = 6132109d4050da349eadc9f7b0304ef4
 # go1.11 needs go1.4 to bootstrap
 environment-extra =
component/lz4/buildout.cfg
@@ -5,6 +5,6 @@ parts =
 [lz4]
 recipe = slapos.recipe.cmmi
-url = https://github.com/lz4/lz4/archive/v1.8.3.tar.gz
-md5sum = d5ce78f7b1b76002bbfffa6f78a5fc4e
+url = https://github.com/lz4/lz4/archive/v1.9.2.tar.gz
+md5sum = 3898c56c82fb3d9455aefd48db48eaad
 configure-command = true
component/zstd/buildout.cfg
@@ -5,8 +5,8 @@ parts =
 [zstd]
 recipe = slapos.recipe.cmmi
 url = https://github.com/facebook/zstd/releases/download/v${:version}/zstd-${:version}.tar.gz
-version = 1.4.2
-md5sum = 1d6aea1cd67a8eab7aa6905f4bf148f8
+version = 1.4.3
+md5sum = 8581c03b2f56c14ff097a737e60847b3
 shared = true
 location = @@LOCATION@@
 configure-command = :
setup.py
@@ -110,9 +110,7 @@ setup(name=name,
           'generic.cloudooo = slapos.recipe.generic_cloudooo:Recipe',
           'generic.kumofs = slapos.recipe.generic_kumofs:Recipe',
           'generic.memcached = slapos.recipe.generic_memcached:Recipe',
-          'generic.mysql = slapos.recipe.generic_mysql:Recipe',
           'generic.mysql.wrap_update_mysql = slapos.recipe.generic_mysql:WrapUpdateMySQL',
-          'generic.mysql.wrap_mysqld = slapos.recipe.generic_mysql:WrapMySQLd',
           'generic.varnish = slapos.recipe.generic_varnish:Recipe',
           'gitinit = slapos.recipe.gitinit:Recipe',
           'haproxy = slapos.recipe.haproxy:Recipe',
slapos/recipe/generic_mysql/__init__.py
@@ -25,206 +25,6 @@
 #
 ##############################################################################
 from slapos.recipe.librecipe import GenericBaseRecipe
-import os
-
-
-class Recipe(GenericBaseRecipe):
-  def _options(self, options):
-    options['password'] = self.generatePassword()
-    if 'test-database' in options:
-      options['test-password'] = self.generatePassword()
-      options.setdefault('parallel-test-database-amount', '0')
-      for x in xrange(int(options['parallel-test-database-amount'])):
-        options['test-password-%s' % x] = self.generatePassword()
-
-  def install(self):
-    path_list = []
-
-    template_filename = self.getTemplateFilename('my.cnf.in')
-
-    mysql_binary = self.options['mysql-binary']
-    socket = self.options['socket']
-
-    if 'ip' in self.options:
-      networking = 'port = %s\nbind-address = %s' % (
-        self.options['port'],
-        self.options['ip'],
-      )
-    else:
-      networking = 'skip-networking'
-
-    log_bin = self.options.get('binlog-path', '')
-    if log_bin:
-      log_bin = 'log_bin = %s' % log_bin
-
-    expire_logs_days = self.options.get('binlog-expire-days')
-    if expire_logs_days > 0:
-      expire_logs_days = 'expire_logs_days = %s' % expire_logs_days
-    else:
-      expire_logs_days = ''
-
-    mysql_conf_file = self.createFile(
-      self.options['conf-file'],
-      self.substituteTemplate(template_filename, {
-        'networking': networking,
-        'data_directory': self.options['data-directory'],
-        'pid_file': self.options['pid-file'],
-        'socket': self.options['socket'],
-        'error_log': self.options['error-log'],
-        'slow_query_log': self.options['slow-query-log'],
-        'log_bin': log_bin,
-        'expire_logs_days': expire_logs_days,
-      })
-    )
-    path_list.append(mysql_conf_file)
-
-    mysql_script_list = []
-
-    # user defined functions
-    udf_registration = "DROP FUNCTION IF EXISTS last_insert_grn_id;\n" \
-      "DROP FUNCTION IF EXISTS mroonga_snippet;\n" \
-      "DROP FUNCTION IF EXISTS mroonga_command;\n"
-    mroonga = self.options.get('mroonga', 'ha_mroonga.so')
-    if mroonga:
-      udf_registration += "CREATE FUNCTION last_insert_grn_id RETURNS " \
-        "INTEGER SONAME '" + mroonga + "';\n"
-      udf_registration += "CREATE FUNCTION mroonga_snippet RETURNS " \
-        "STRING SONAME '" + mroonga + "';\n"
-      udf_registration += "CREATE FUNCTION mroonga_command RETURNS " \
-        "STRING SONAME '" + mroonga + "';\n"
-    mysql_script_list.append(self.substituteTemplate(
-      self.getTemplateFilename('mysql-init-function.sql.in'),
-      {
-        'udf_registration': udf_registration,
-      }
-    ))
-    # real database
-    mysql_script_list.append(self.substituteTemplate(
-      self.getTemplateFilename('initmysql.sql.in'),
-      {
-        'mysql_database': self.options['database'],
-        'mysql_user': self.options['user'],
-        'mysql_password': self.options['password']
-      }
-    ))
-    # default test database
-    if 'test-database' in self.options:
-      mysql_script_list.append(self.substituteTemplate(
-        self.getTemplateFilename('initmysql.sql.in'),
-        {
-          'mysql_database': self.options['test-database'],
-          'mysql_user': self.options['test-user'],
-          'mysql_password': self.options['test-password']
-        }
-      ))
-      # parallel test databases
-      for x in xrange(int(self.options['parallel-test-database-amount'])):
-        mysql_script_list.append(self.substituteTemplate(
-          self.getTemplateFilename('initmysql.sql.in'),
-          {
-            'mysql_database': self.options['mysql-test-database-base'] + '_%s' % x,
-            'mysql_user': self.options['mysql-test-user-base'] + '_%s' % x,
-            'mysql_password': self.options['test-password-%s' % x]
-          }
-        ))
-    mysql_script_list.append('EXIT')
-    mysql_script = '\n'.join(mysql_script_list)
-
-    mysql_upgrade_binary = self.options['mysql-upgrade-binary']
-    mysql_update = self.createPythonScript(
-      self.options['update-wrapper'],
-      '%s.mysql.updateMysql' % __name__,
-      [dict(
-        mysql_script=mysql_script,
-        mysql_binary=mysql_binary,
-        mysql_upgrade_binary=mysql_upgrade_binary,
-        socket=socket,
-      )]
-    )
-    path_list.append(mysql_update)
-
-    mysqld = self.createPythonScript(
-      self.options['wrapper'],
-      '%s.mysql.runMysql' % __name__,
-      [dict(
-        mysql_base_directory=self.options['mysql-base-directory'],
-        mysql_install_binary=self.options['mysql-install-binary'],
-        mysqld_binary=self.options['mysqld-binary'],
-        data_directory=self.options['data-directory'],
-        mysql_binary=mysql_binary,
-        socket=socket,
-        configuration_file=mysql_conf_file,
-      )]
-    )
-    path_list.append(mysqld)
-
-    environment = {'PATH': self.options['bin-directory']}
-
-    # TODO: move to a separate recipe (ack'ed by Cedric)
-    if 'backup-script' in self.options:
-      # backup configuration
-      full_backup = self.options['full-backup-directory']
-      incremental_backup = self.options['incremental-backup-directory']
-      innobackupex_argument_list = [
-        self.options['perl-binary'],
-        self.options['innobackupex-binary'],
-        '--defaults-file=%s' % mysql_conf_file,
-        '--socket=%s' % socket.strip(),
-        '--user=root',
-        '--ibbackup=%s' % self.options['xtrabackup-binary']]
-      innobackupex_incremental = self.createWrapper(
-        self.options['innobackupex-incremental'],
-        innobackupex_argument_list + ['--incremental'],
-        environment)
-      path_list.append(innobackupex_incremental)
-      innobackupex_full = self.createWrapper(
-        self.options['innobackupex-full'],
-        innobackupex_argument_list,
-        environment)
-      path_list.append(innobackupex_full)
-      backup_controller = self.createPythonScript(
-        self.options['backup-script'],
-        __name__ + '.innobackupex.controller',
-        [innobackupex_incremental, innobackupex_full,
-         full_backup, incremental_backup])
-      path_list.append(backup_controller)
-
-    # TODO: move to a separate recipe (ack'ed by Cedric)
-    # percona toolkit (formerly known as maatkit) installation
-    for pt_script_name in (
-        'pt-align',
-        'pt-archiver',
-        'pt-config-diff',
-        'pt-deadlock-logger',
-        'pt-diskstats',
-        'pt-duplicate-key-checker',
-        'pt-fifo-split',
-        'pt-find',
-        'pt-fingerprint',
-        'pt-fk-error-logger',
-        'pt-heartbeat',
-        'pt-index-usage',
-        'pt-ioprofile',
-        'pt-kill',
-        'pt-mext',
-        'pt-mysql-summary',
-        'pt-online-schema-change',
-        'pt-pmp',
-        'pt-query-digest',
-        'pt-show-grants',
-        'pt-sift',
-        'pt-slave-delay',
-        'pt-slave-find',
-        'pt-slave-restart',
-        'pt-stalk',
-        'pt-summary',
-        'pt-table-checksum',
-        'pt-table-sync',
-        'pt-table-usage',
-        'pt-upgrade',
-        'pt-variable-advisor',
-        'pt-visual-explain',
-        ):
-      option_name = pt_script_name + '-binary'
-      if option_name not in self.options:
-        continue
-      pt_argument_list = [
-        self.options['perl-binary'],
-        self.options[option_name],
-        '--defaults-file=%s' % mysql_conf_file,
-        '--socket=%s' % socket.strip(),
-        '--user=root',
-      ]
-      pt_exe = self.createWrapper(
-        os.path.join(self.options['bin-directory'], pt_script_name),
-        pt_argument_list,
-        environment)
-      path_list.append(pt_exe)
-
-    return path_list
-
-
 class WrapUpdateMySQL(GenericBaseRecipe):
   def install(self):
@@ -239,19 +39,3 @@ class WrapUpdateMySQL(GenericBaseRecipe):
         }]
       ),
     ]
-
-
-class WrapMySQLd(GenericBaseRecipe):
-  def install(self):
-    return [
-      self.createPythonScript(
-        self.options['output'],
-        __name__ + '.mysql.runMysql',
-        [{
-          'mysqld_binary': self.options['binary'],
-          'configuration_file': self.options['configuration-file'],
-          'data_directory': self.options['data-directory'],
-          'mysql_install_binary': self.options['mysql-install-binary'],
-          'mysql_base_directory': self.options['mysql-base-directory'],
-        }]
-      ),
-    ]
slapos/recipe/generic_mysql/innobackupex.py (deleted, 100644 → 0)

import os
import glob


def controller(innobackupex_incremental, innobackupex_full,
    full_backup, incremental_backup):
  """Creates full or incremental backup

  If no full backup is done, it is created
  If full backup exists incremental backup is done starting with base
  base is the newest (according to date) full or incremental backup
  """
  if len(os.listdir(full_backup)) == 0:
    print 'Doing full backup in %r' % full_backup
    os.execv(innobackupex_full, [innobackupex_full, full_backup])
  else:
    backup_list = filter(os.path.isdir,
      glob.glob(full_backup + "/*") + glob.glob(incremental_backup + "/*"))
    backup_list.sort(key=lambda x: os.path.getmtime(x), reverse=True)
    base = backup_list[0]
    print 'Doing incremental backup in %r using %r as a base' % (
      incremental_backup, base)
    os.execv(innobackupex_incremental, [innobackupex_incremental,
      '--incremental-basedir=%s' % base, incremental_backup])
slapos/recipe/generic_mysql/mysql.py
@@ -4,55 +4,6 @@ import time
 import sys
 import pytz
 
-def runMysql(conf):
-  sleep = 60
-  mysqld_wrapper_list = [conf['mysqld_binary'],
-    '--defaults-file=%s' % conf['configuration_file']]
-  # we trust mysql_install that if mysql directory is available mysql was
-  # correctly initalised
-  if not os.path.isdir(os.path.join(conf['data_directory'], 'mysql')):
-    while True:
-      # XXX: Protect with proper root password
-      # XXX: Follow http://dev.mysql.com/doc/refman/5.0/en/default-privileges.html
-      popen = subprocess.Popen([conf['mysql_install_binary'],
-        '--defaults-file=%s' % conf['configuration_file'],
-        '--skip-name-resolve', '--datadir=%s' % conf['data_directory'],
-        '--basedir=%s' % conf['mysql_base_directory']],
-        stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
-      result = popen.communicate()[0]
-      if popen.returncode is None or popen.returncode != 0:
-        print "Failed to initialise server.\nThe error was: %s" % result
-        print "Waiting for %ss and retrying" % sleep
-        time.sleep(sleep)
-      else:
-        print "Mysql properly initialised"
-        break
-  else:
-    print "MySQL already initialised"
-  print "Starting %r" % mysqld_wrapper_list[0]
-  sys.stdout.flush()
-  sys.stderr.flush()
-  # try to increase the maximum number of open file descriptors.
-  # it seems that mysqld requires (max_connections + 810) file descriptors.
-  # to make it possible, you need to set the hard limit of nofile in
-  # /etc/security/limits.conf like the following :
-  #   @slapsoft hard nofile 2048
-  try:
-    import resource
-    required_nofile = 2048  # XXX hardcoded value more than 1000 + 810
-    nofile_limit_list = [max(x, required_nofile)
-      for x in resource.getrlimit(resource.RLIMIT_NOFILE)]
-    resource.setrlimit(resource.RLIMIT_NOFILE, nofile_limit_list)
-  except ImportError:
-    # resource library is only available on Unix platform.
-    pass
-  except ValueError:
-    # 'ValueError: not allowed to raise maximum limit'
-    pass
-  os.execl(mysqld_wrapper_list[0], *mysqld_wrapper_list)
-
 def updateMysql(conf):
   sleep = 30
   is_succeed = False
slapos/recipe/generic_mysql/template/initmysql.sql.in (deleted, 100644 → 0)

CREATE DATABASE IF NOT EXISTS %(mysql_database)s;
GRANT ALL PRIVILEGES ON %(mysql_database)s.* TO %(mysql_user)s@'%%' IDENTIFIED BY '%(mysql_password)s';
GRANT ALL PRIVILEGES ON %(mysql_database)s.* TO %(mysql_user)s@'localhost' IDENTIFIED BY '%(mysql_password)s';
slapos/recipe/generic_mysql/template/my.cnf.in (deleted, 100644 → 0)

# ERP5 buildout my.cnf template based on my-huge.cnf shipped with mysql
# The MySQL server
[mysqld]
# ERP5 by default requires InnoDB storage. MySQL by default fallbacks to using
# different engine, like MyISAM. Such behaviour generates problems only, when
# tables requested as InnoDB are silently created with MyISAM engine.
#
# Loud fail is really required in such case.
sql-mode="NO_ENGINE_SUBSTITUTION"
skip-show-database
%(networking)s
socket = %(socket)s
datadir = %(data_directory)s
pid-file = %(pid_file)s
log-error = %(error_log)s
slow_query_log
slow_query_log_file = %(slow_query_log)s
long_query_time = 1
max_allowed_packet = 128M
query_cache_size = 0
query_cache_type = 0
innodb_file_per_table = 0
plugin-load = ha_mroonga.so;handlersocket.so
# By default only 100 connections are allowed, when using zeo
# we may have much more connections
max_connections = 1000
# The following are important to configure and depend a lot on to the size of
# your database and the available resources.
#innodb_buffer_pool_size = 4G
#innodb_log_file_size = 256M
#innodb_log_buffer_size = 8M
# very important to allow parallel indexing
innodb_locks_unsafe_for_binlog = 1
# Some dangerous settings you may want to uncomment if you only want
# performance or less disk access. Useful for unit tests.
#innodb_flush_log_at_trx_commit = 0
#innodb_flush_method = nosync
#innodb_doublewrite = 0
#sync_frm = 0
%(log_bin)s
%(expire_logs_days)s
# Force utf8 usage
collation_server = utf8_unicode_ci
character_set_server = utf8
skip-character-set-client-handshake
[mysql]
no-auto-rehash
socket = %(socket)s
[mysqlhotcopy]
interactive-timeout
slapos/recipe/generic_mysql/template/mysql-init-function.sql.in (deleted, 100644 → 0)

USE mysql;
DROP FUNCTION IF EXISTS last_insert_grn_id;
DROP FUNCTION IF EXISTS mroonga_snippet;
DROP FUNCTION IF EXISTS mroonga_command;
DROP FUNCTION IF EXISTS sphinx_snippets;
%(udf_registration)s
slapos/test/utils.py
@@ -54,16 +54,17 @@ def makeRecipe(recipe_class, options, name='test', slap_connection=None):
   if os.path.exists(buildout_cfg):
     parser = ConfigParser()
     parser.readfp(open(buildout_cfg))
-    eggs_directory = parser.get(
-        'buildout',
-        'eggs-directory',
-        # default, for the case when buildout_cfg is a software buildout
-        # like with SLAPOS-SR-TEST.
-        vars={'eggs-directory': os.path.join(base_directory, 'eggs')})
-    develop_eggs_directory = parser.get(
-        'buildout',
-        'develop-eggs-directory',
-        vars={'develop-eggs-directory': os.path.join(base_directory, 'develop-eggs')})
+    if parser.has_option('buildout', 'eggs-directory'):
+      # when buildout_cfg is an instance buildout (like in SLAPOS-EGG-TEST),
+      # there's a ${buildout:eggs-directory} we can use.
+      eggs_directory = parser.get('buildout', 'eggs-directory')
+      develop_eggs_directory = parser.get(
+          'buildout', 'develop-eggs-directory')
+    else:
+      # when when buildout_cfg is a software buildout, we can only guess the
+      # standard eggs directories.
+      eggs_directory = os.path.join(base_directory, 'eggs')
+      develop_eggs_directory = os.path.join(base_directory, 'develop-eggs')
     logging.getLogger(__name__).info(
         'Using eggs-directory (%s) and develop-eggs-directory (%s) from buildout at %s',
         eggs_directory,
software/apache-frontend/buildout.hash.cfg
@@ -18,7 +18,7 @@ md5sum = f686f765e55d1dce2e55a400f0714b3e
 [template-apache-frontend]
 filename = instance-apache-frontend.cfg
-md5sum = a6b566a29f1b5021d0f1f3c4fa20d749
+md5sum = d6398d727eb1e1bc3df1768a9b9a7e0c
 [template-apache-replicate]
 filename = instance-apache-replicate.cfg.in
@@ -38,7 +38,7 @@ md5sum = 665e83d660c9b779249b2179d7ce4b4e
 [template-apache-frontend-configuration]
 filename = templates/apache.conf.in
-md5sum = 05239181f4d5d0e3fe6bccda587fa9a5
+md5sum = b666d7c4a5c1fd8020713aa53b44a386
 [template-custom-slave-list]
 filename = templates/apache-custom-slave-list.cfg.in
software/apache-frontend/templates/apache.conf.in
@@ -20,9 +20,6 @@ TypesConfig {{ httpd_home }}/conf/mime.types
 AddType application/x-compress .Z
 AddType application/x-gzip .gz .tgz
 
-# As backend is trusting REMOTE_USER header unset it always
-RequestHeader unset REMOTE_USER
-
 ServerTokens Prod
 # Disable TRACE Method
software/caddy-frontend/buildout.hash.cfg
@@ -14,7 +14,7 @@
 # not need these here).
 [template]
 filename = instance.cfg.in
-md5sum = 4832bb055d31be6e99e2ef890b2206b0
+md5sum = d8ce8da7ea7d82c33958bdbabbaad956
 [template-common]
 filename = instance-common.cfg.in
@@ -22,7 +22,7 @@ md5sum = c801b7f9f11f0965677c22e6bbe9281b
 [template-apache-frontend]
 filename = instance-apache-frontend.cfg.in
-md5sum = 74f730a4fb079416118bd412a458cea8
+md5sum = 2903758a104186b7dae9573c3470be78
 [template-caddy-replicate]
 filename = instance-apache-replicate.cfg.in
@@ -30,7 +30,7 @@ md5sum = 491a19d1747bbf795c27b094cf67114d
 [template-slave-list]
 filename = templates/apache-custom-slave-list.cfg.in
-md5sum = 13338a7844f5a4b749f6647ba8163a8d
+md5sum = c33df53e7752f43b89c5fda7e92a5a78
 [template-slave-configuration]
 filename = templates/custom-virtualhost.conf.in
@@ -42,7 +42,7 @@ md5sum = eb9ca67763d60843483d95dab2c301b1
 [template-caddy-frontend-configuration]
 filename = templates/Caddyfile.in
-md5sum = dfec964a9f194293567b09d0f10e4b3d
+md5sum = 908b859ff76469381024947f5c98c891
 [caddy-backend-url-validator]
 filename = templates/caddy-backend-url-validator.in
@@ -54,11 +54,11 @@ md5sum = f20d6c3d2d94fb685f8d26dfca1e822b
 [template-default-slave-virtualhost]
 filename = templates/default-virtualhost.conf.in
-md5sum = 7e21418a03529db22181962ea804da53
+md5sum = 9a984febd7fa14a4ea94599f3e83139c
 [template-cached-slave-virtualhost]
 filename = templates/cached-virtualhost.conf.in
-md5sum = 6ca9a3251830d602cf25e0a0389fc74b
+md5sum = a73839d777fbd548286bbeccf47be335
 [template-log-access]
 filename = templates/template-log-access.conf.in
@@ -74,7 +74,7 @@ md5sum = 8cde04bfd0c0e9bd56744b988275cfd8
 [template-trafficserver-records-config]
 filename = templates/trafficserver/records.config.jinja2
-md5sum = 5ef0ebc37437ada7cc176e663da5f36c
+md5sum = 3c342b0388f94f819b04b05b46744427
 [template-trafficserver-storage-config]
 filename = templates/trafficserver/storage.config.jinja2
software/caddy-frontend/instance-apache-frontend.cfg.in
@@ -264,6 +264,7 @@ extra-context =
   key enable_http2_by_default configuration:enable-http2-by-default
   key global_disable_http2 configuration:global-disable-http2
   key ciphers configuration:ciphers
+  key request_timeout configuration:request-timeout
   key proxy_try_duration configuration:proxy-try-duration
   key proxy_try_interval configuration:proxy-try-interval
   key access_log caddy-configuration:access-log
@@ -442,6 +443,7 @@ synthetic-port = ${configuration:trafficserver-synthetic-port}
 mgmt-port = ${configuration:trafficserver-mgmt-port}
 ram-cache-size = ${configuration:ram-cache-size}
 templates-dir = {{ parameter_dict['trafficserver'] }}/etc/trafficserver/body_factory
+request-timeout = ${configuration:request-timeout}
 [trafficserver-configuration-directory]
 recipe = plone.recipe.command
software/caddy-frontend/instance-caddy-input-schema.json
@@ -107,6 +107,12 @@
       "description": "List of ciphers. Empty defaults to Caddy list of ciphers. See https://caddyserver.com/docs/tls for more information.",
       "title": "Ordered space separated list of ciphers",
       "type": "string"
+    },
+    "request-timeout": {
+      "default": 600,
+      "description": "Timeout for HTTP requests.",
+      "title": "HTTP Request timeout in seconds",
+      "type": "integer"
     }
   },
   "title": "Input Parameters",
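For illustration only (not part of the diff): a client requesting a caddy-frontend partition could now pass the new parameter; apart from 'request-timeout' itself (default 600 in the schema, and set to '12' by the tests further down), the other key below is a placeholder.

# Hypothetical instance parameters sent to a caddy-frontend partition.
instance_parameter_dict = {
    'domain': 'example.com',    # placeholder value
    'request-timeout': '12',    # new knob introduced by this commit
}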
software/caddy-frontend/instance.cfg.in
@@ -116,6 +116,7 @@ configuration.re6st-verification-url = http://[2001:67c:1254:4::1]/index.html
 configuration.enable-http2-by-default = true
 configuration.global-disable-http2 = false
 configuration.ciphers =
+configuration.request-timeout = 600
 configuration.enable-quic = false
 configuration.mpm-graceful-shutdown-timeout = 5
 configuration.monitor-httpd-port = 8072
software/caddy-frontend/templates/Caddyfile.in
@@ -7,8 +7,6 @@ import {{ slave_with_cache_configuration_directory }}/*.conf
 :{{ https_port }} {
   tls {{ master_certificate }} {{ master_certificate }}
   bind {{ local_ipv4 }}
-  # Compress the output
-  gzip
   status 404 /
   log / {{ access_log }} "{remote} - {>REMOTE_USER} [{when}] \"{method} {uri} {proto}\" {status} {size} \"{>Referer}\" \"{>User-Agent}\" {latency_ms}" {
     rotate_size 0
@@ -21,8 +19,6 @@ import {{ slave_with_cache_configuration_directory }}/*.conf
 :{{ http_port }} {
   bind {{ local_ipv4 }}
-  # Compress the output
-  gzip
   status 404 /
   log / {{ access_log }} "{remote} - {>REMOTE_USER} [{when}] \"{method} {uri} {proto}\" {status} {size} \"{>Referer}\" \"{>User-Agent}\" {latency_ms}" {
     rotate_size 0
@@ -36,8 +32,6 @@ import {{ slave_with_cache_configuration_directory }}/*.conf
 # Access to server-status Caddy-style
 https://[{{ global_ipv6 }}]:{{ https_port }}/server-status, https://{{ local_ipv4 }}:{{ https_port }}/server-status {
   tls {{ frontend_configuration['ip-access-certificate'] }} {{ frontend_configuration['ip-access-certificate'] }}
-  # Compress the output
-  gzip
   bind {{ local_ipv4 }}
   basicauth "{{ username }}" {{ password | trim }} {
     "Server Status"
software/caddy-frontend/templates/apache-custom-slave-list.cfg.in
@@ -241,6 +241,7 @@ http_port = {{ dumps('' ~ http_port) }}
 local_ipv4 = {{ dumps('' ~ local_ipv4) }}
 cached_port = {{ dumps('' ~ cached_port) }}
 ssl_cached_port = {{ ('' ~ ssl_cached_port) }}
+request_timeout = {{ ('' ~ request_timeout) }}
 {# BBB: apache_custom_https and apache_custom_http #}
 {% set caddy_custom_http_template = slave_instance.pop('caddy_custom_http', slave_instance.pop('apache_custom_http', '')) %}
 {% set caddy_custom_https_template = slave_instance.pop('caddy_custom_https', slave_instance.pop('apache_custom_https', '')) %}
...
software/caddy-frontend/templates/cached-virtualhost.conf.in
View file @
db5ee6df
...
@@ -17,17 +17,13 @@
...
@@ -17,17 +17,13 @@
# SSL-disabled backends
# SSL-disabled backends
{{ http_backend_host_list|join(', ') }} {
{{ http_backend_host_list|join(', ') }} {
bind {{ slave_parameter['local_ipv4'] }}
bind {{ slave_parameter['local_ipv4'] }}
# Compress the output
gzip
# Rewrite part
# Rewrite part
proxy / {{ slave_parameter.get('backend_url', '') }} {
proxy / {{ slave_parameter.get('backend_url', '') }} {
try_duration {{ slave_parameter['proxy_try_duration'] }}s
try_duration {{ slave_parameter['proxy_try_duration'] }}s
try_interval {{ slave_parameter['proxy_try_interval'] }}ms
try_interval {{ slave_parameter['proxy_try_interval'] }}ms
# As backend is trusting REMOTE_USER header unset it always
header_upstream -REMOTE_USER
transparent
transparent
timeout
600
s
timeout
{{ slave_parameter['request_timeout'] }}
s
{%- if ssl_proxy_verify %}
{%- if ssl_proxy_verify %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
...
@@ -48,15 +44,11 @@
...
@@ -48,15 +44,11 @@
# SSL-enabled backends
# SSL-enabled backends
{{ https_backend_host_list|join(', ') }} {
{{ https_backend_host_list|join(', ') }} {
bind {{ slave_parameter['local_ipv4'] }}
bind {{ slave_parameter['local_ipv4'] }}
# Compress the output
gzip
proxy / {{ slave_parameter.get('https_backend_url', '') }} {
proxy / {{ slave_parameter.get('https_backend_url', '') }} {
try_duration {{ slave_parameter['proxy_try_duration'] }}s
try_duration {{ slave_parameter['proxy_try_duration'] }}s
try_interval {{ slave_parameter['proxy_try_interval'] }}ms
try_interval {{ slave_parameter['proxy_try_interval'] }}ms
# As backend is trusting REMOTE_USER header unset it always
header_upstream -REMOTE_USER
transparent
transparent
timeout
600
s
timeout
{{ slave_parameter['request_timeout'] }}
s
{%- if ssl_proxy_verify %}
{%- if ssl_proxy_verify %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
...
...
software/caddy-frontend/templates/default-virtualhost.conf.in
View file @
db5ee6df
...
@@ -50,8 +50,6 @@
...
@@ -50,8 +50,6 @@
{{ http_host_list|join(', ') }} {
{{ http_host_list|join(', ') }} {
{%- endif %}
{%- endif %}
bind {{ slave_parameter['local_ipv4'] }}
bind {{ slave_parameter['local_ipv4'] }}
# Compress the output
gzip
{%- if tls %}
{%- if tls %}
tls {{ slave_parameter['certificate'] }} {{ slave_parameter['certificate'] }} {
tls {{ slave_parameter['certificate'] }} {{ slave_parameter['certificate'] }} {
{%- if cipher_list %}
{%- if cipher_list %}
...
@@ -110,8 +108,6 @@
...
@@ -110,8 +108,6 @@
without /prefer-gzip
without /prefer-gzip
header_upstream Accept-Encoding gzip
header_upstream Accept-Encoding gzip
{%- endif %} {#- if proxy_name == 'prefer-gzip' #}
{%- endif %} {#- if proxy_name == 'prefer-gzip' #}
# As backend is trusting REMOTE_USER header unset it always
header_upstream -REMOTE_USER
{%- for disabled_cookie in disabled_cookie_list %}
{%- for disabled_cookie in disabled_cookie_list %}
# Remove cookie {{ disabled_cookie }} from client Cookies
# Remove cookie {{ disabled_cookie }} from client Cookies
header_upstream Cookie "(.*)(^{{ disabled_cookie }}=[^;]*; |; {{ disabled_cookie }}=[^;]*|^{{ disabled_cookie }}=[^;]*$)(.*)" "$1 $3"
header_upstream Cookie "(.*)(^{{ disabled_cookie }}=[^;]*; |; {{ disabled_cookie }}=[^;]*|^{{ disabled_cookie }}=[^;]*$)(.*)" "$1 $3"
...
@@ -126,7 +122,7 @@
...
@@ -126,7 +122,7 @@
header_upstream -Pragma
header_upstream -Pragma
{%- endif %} {#- if disable_no_cache_header #}
{%- endif %} {#- if disable_no_cache_header #}
transparent
transparent
timeout
600
s
timeout
{{ slave_parameter['request_timeout'] }}
s
{%- if ssl_proxy_verify %}
{%- if ssl_proxy_verify %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
...
@@ -247,8 +243,6 @@
...
@@ -247,8 +243,6 @@
without /prefer-gzip
without /prefer-gzip
header_upstream Accept-Encoding gzip
header_upstream Accept-Encoding gzip
{%- endif %} {#- if proxy_name == 'prefer-gzip' #}
{%- endif %} {#- if proxy_name == 'prefer-gzip' #}
# As backend is trusting REMOTE_USER header unset it always
header_upstream -REMOTE_USER
{%- for disabled_cookie in disabled_cookie_list %}
{%- for disabled_cookie in disabled_cookie_list %}
# Remove cookie {{ disabled_cookie }} from client Cookies
# Remove cookie {{ disabled_cookie }} from client Cookies
header_upstream Cookie "(.*)(^{{ disabled_cookie }}=[^;]*; |; {{ disabled_cookie }}=[^;]*|^{{ disabled_cookie }}=[^;]*$)(.*)" "$1 $3"
header_upstream Cookie "(.*)(^{{ disabled_cookie }}=[^;]*; |; {{ disabled_cookie }}=[^;]*|^{{ disabled_cookie }}=[^;]*$)(.*)" "$1 $3"
...
@@ -263,7 +257,7 @@
...
@@ -263,7 +257,7 @@
header_upstream -Pragma
header_upstream -Pragma
{%- endif %} {#- if disable_no_cache_header #}
{%- endif %} {#- if disable_no_cache_header #}
transparent
transparent
timeout
600
s
timeout
{{ slave_parameter['request_timeout'] }}
s
{%- if ssl_proxy_verify %}
{%- if ssl_proxy_verify %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
{%- if 'path_to_ssl_proxy_ca_crt' in slave_parameter %}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
ca_certificates {{ slave_parameter['path_to_ssl_proxy_ca_crt'] }}
...
...
software/caddy-frontend/templates/trafficserver/records.config.jinja2
@@ -68,8 +68,8 @@ CONFIG proxy.config.http.uncacheable_requests_bypass_parent INT 1
 ##############################################################################
 CONFIG proxy.config.http.keep_alive_no_activity_timeout_in INT 120
 CONFIG proxy.config.http.keep_alive_no_activity_timeout_out INT 120
-CONFIG proxy.config.http.transaction_no_activity_timeout_in INT 30
-CONFIG proxy.config.http.transaction_no_activity_timeout_out INT 30
+CONFIG proxy.config.http.transaction_no_activity_timeout_in INT {{ ats_configuration['request-timeout'] }}
+CONFIG proxy.config.http.transaction_no_activity_timeout_out INT {{ ats_configuration['request-timeout'] }}
 CONFIG proxy.config.http.transaction_active_timeout_in INT 900
 CONFIG proxy.config.http.transaction_active_timeout_out INT 0
 CONFIG proxy.config.http.accept_no_activity_timeout INT 120
@@ -79,11 +79,13 @@ CONFIG proxy.config.net.default_inactivity_timeout INT 86400
 # Origin server connect attempts. Docs:
 # https://docs.trafficserver.apache.org/records.config#origin-server-connect-attempts
 ##############################################################################
-CONFIG proxy.config.http.connect_attempts_max_retries INT 3
-CONFIG proxy.config.http.connect_attempts_max_retries_dead_server INT 1
+# Try only once to connect (do not retry)
+CONFIG proxy.config.http.connect_attempts_max_retries INT 0
+# Try only once with server marked dead (do not retry)
+CONFIG proxy.config.http.connect_attempts_max_retries_dead_server INT 0
 CONFIG proxy.config.http.connect_attempts_rr_retries INT 3
-CONFIG proxy.config.http.connect_attempts_timeout INT 30
-CONFIG proxy.config.http.post_connect_attempts_timeout INT 1800
+CONFIG proxy.config.http.connect_attempts_timeout INT {{ ats_configuration['request-timeout'] }}
+CONFIG proxy.config.http.post_connect_attempts_timeout INT {{ ats_configuration['request-timeout'] }}
 CONFIG proxy.config.http.down_server.cache_time INT 60
 CONFIG proxy.config.http.down_server.abort_threshold INT 10
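For illustration only (not part of the diff): the template variable above is filled from the trafficserver section shown earlier, so with the default request-timeout of 600 the rendered records.config carries that value. A minimal rendering sketch, assuming the jinja2 package is available:

from jinja2 import Template  # assumed available in the build environment

line = Template(
    "CONFIG proxy.config.http.connect_attempts_timeout INT "
    "{{ ats_configuration['request-timeout'] }}"
).render(ats_configuration={'request-timeout': 600})
print(line)  # -> CONFIG proxy.config.http.connect_attempts_timeout INT 600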
software/caddy-frontend/test/test.py
@@ -42,6 +42,12 @@ from forcediphttpsadapter.adapters import ForcedIPHTTPSAdapter
 import time
 import tempfile
 import ipaddress
+import StringIO
+import gzip
+import base64
+import re
 
 try:
   import lzma
 except ImportError:
@@ -652,9 +658,13 @@ class TestMasterRequestDomain(HttpFrontendTestCase, TestDataMixin):
 class TestHandler(BaseHTTPRequestHandler):
   def do_GET(self):
     timeout = int(self.headers.dict.get('timeout', '0'))
+    compress = int(self.headers.dict.get('compress', '0'))
     time.sleep(timeout)
     self.send_response(200)
+    drop_header_list = []
+    for header in self.headers.dict.get('x-drop-header', '').split():
+      drop_header_list.append(header)
     prefix = 'x-reply-header-'
     length = len(prefix)
     for key, value in self.headers.dict.items():
@@ -664,15 +674,33 @@ class TestHandler(BaseHTTPRequestHandler):
           value.strip()
         )
-    self.send_header("Content-type", "application/json")
-    self.send_header('Set-Cookie', 'secured=value;secure')
-    self.send_header('Set-Cookie', 'nonsecured=value')
+    if 'Content-Type' not in drop_header_list:
+      self.send_header("Content-Type", "application/json")
+    if 'Set-Cookie' not in drop_header_list:
+      self.send_header('Set-Cookie', 'secured=value;secure')
+      self.send_header('Set-Cookie', 'nonsecured=value')
+    if 'x-reply-body' not in self.headers.dict:
+      response = {
+        'Path': self.path,
+        'Incoming Headers': self.headers.dict
+      }
+      response = json.dumps(response, indent=2)
+    else:
+      response = base64.b64decode(self.headers.dict['x-reply-body'])
+    if compress:
+      self.send_header('Content-Encoding', 'gzip')
+      out = StringIO.StringIO()
+      # compress with level 0, to find out if in the middle someting would
+      # like to alter the compression
+      with gzip.GzipFile(fileobj=out, mode="w", compresslevel=0) as f:
+        f.write(response)
+      response = out.getvalue()
+      self.send_header('Backend-Content-Length', len(response))
+    if 'Content-Length' not in drop_header_list:
+      self.send_header('Content-Length', len(response))
     self.end_headers()
-    response = {
-      'Path': self.path,
-      'Incoming Headers': self.headers.dict
-    }
-    self.wfile.write(json.dumps(response, indent=2))
+    self.wfile.write(response)
 
 
 class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
@@ -856,7 +884,6 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
                      headers=None, cookies=None, source_ip=None):
     if headers is None:
       headers = {}
-    headers.setdefault('REMOTE_USER', 'SOME_REMOTE_USER')
     # workaround request problem of setting Accept-Encoding
     # https://github.com/requests/requests/issues/2234
     headers.setdefault('Accept-Encoding', 'dummy')
@@ -881,7 +908,6 @@ class SlaveHttpFrontendTestCase(HttpFrontendTestCase):
                     headers=None):
     if headers is None:
       headers = {}
-    headers.setdefault('REMOTE_USER', 'SOME_REMOTE_USER')
    # workaround request problem of setting Accept-Encoding
     # https://github.com/requests/requests/issues/2234
     headers.setdefault('Accept-Encoding', 'dummy')
@@ -1036,6 +1062,7 @@ http://apachecustomhttpsaccepted.example.com:%%(http_port)s {
       'kedifa_port': KEDIFA_PORT,
       'caucase_port': CAUCASE_PORT,
       'mpm-graceful-shutdown-timeout': 2,
+      'request-timeout': '12',
     }
 
   @classmethod
@@ -1448,7 +1475,7 @@ http://apachecustomhttpsaccepted.example.com:%%(http_port)s {
         self.instance_path, '*', 'var', 'log', 'httpd', '_empty_access_log'
       ))[0]
-    log_regexp = r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3} - SOME_REMOTE_USER ' \
+    log_regexp = r'^\d{1,3}\.\d{1,3}\.\d{1,3}\.\d{1,3} - - ' \
                  r'\[\d{2}\/.{3}\/\d{4}\:\d{2}\:\d{2}\:\d{2} \+\d{4}\] ' \
                  r'"GET \/test-path HTTP\/1.1" 404 \d+ "-" ' \
                  r'"python-requests.*" \d+'
@@ -1485,7 +1512,10 @@ http://apachecustomhttpsaccepted.example.com:%%(http_port)s {
     result = self.fakeHTTPSResult(
       parameter_dict['domain'], parameter_dict['public-ipv4'],
       'test-path/deep/.././deeper',
-      headers={'Timeout': '10'}  # more than default proxy-try-duration == 5
+      headers={
+        'Timeout': '10',  # more than default proxy-try-duration == 5
+        'Accept-Encoding': 'gzip',
+      }
     )
     self.assertEqual(
@@ -1535,6 +1565,63 @@ http://apachecustomhttpsaccepted.example.com:%%(http_port)s {
     self.assertTrue('try_duration 5s' in content)
     self.assertTrue('try_interval 250ms' in content)
 
+  def test_compressed_result(self):
+    parameter_dict = self.assertSlaveBase('Url')
+
+    result_compressed = self.fakeHTTPSResult(
+      parameter_dict['domain'], parameter_dict['public-ipv4'],
+      'test-path/deep/.././deeper',
+      headers={
+        'Accept-Encoding': 'gzip',
+        'Compress': '1',
+      }
+    )
+    self.assertEqual(
+      'gzip',
+      result_compressed.headers['Content-Encoding']
+    )
+
+    # Assert that no tampering was done with the request
+    # (compression/decompression)
+    # Backend compresses with 0 level, so decompression/compression
+    # would change somthing
+    self.assertEqual(
+      result_compressed.headers['Content-Length'],
+      result_compressed.headers['Backend-Content-Length']
+    )
+
+    result_not_compressed = self.fakeHTTPSResult(
+      parameter_dict['domain'], parameter_dict['public-ipv4'],
+      'test-path/deep/.././deeper',
+      headers={
+        'Accept-Encoding': 'gzip',
+      }
+    )
+    self.assertFalse('Content-Encoding' in result_not_compressed.headers)
+
+  def test_no_content_type_alter(self):
+    parameter_dict = self.assertSlaveBase('Url')
+    result = self.fakeHTTPSResult(
+      parameter_dict['domain'], parameter_dict['public-ipv4'],
+      'test-path/deep/.././deeper',
+      headers={
+        'Accept-Encoding': 'gzip',
+        'X-Reply-Body': base64.b64encode(
+          b"""<?xml version="1.0" encoding="UTF-8"?>
+<note>
+  <to>Tove</to>
+  <from>Jani</from>
+  <heading>Reminder</heading>
+  <body>Don't forget me this weekend!</body>
+</note>"""),
+        'X-Drop-Header': 'Content-Type'
+      }
+    )
+
+    self.assertEqual(
+      'text/xml; charset=utf-8',
+      result.headers['Content-Type']
+    )
+
   @skip('Feature postponed')
   def test_url_ipv6_access(self):
     parameter_dict = self.parseSlaveParameterDict('url')
@@ -3265,6 +3352,89 @@ http://apachecustomhttpsaccepted.example.com:%%(http_port)s {
       result_direct_https_backend.headers['Set-Cookie']
     )
 
+  def test_enable_cache_ats_timeout(self):
+    parameter_dict = self.assertSlaveBase('enable_cache')
+    # check that timeout seen by ATS does not result in many queries done
+    # to the backend and that next request works like a charm
+    result = self.fakeHTTPResult(
+      parameter_dict['domain'], parameter_dict['public-ipv4'],
+      'test_enable_cache_ats_timeout', headers={
+        'Timeout': '15',
+        'X-Reply-Header-Cache-Control': 'max-age=1, stale-while-'
+        'revalidate=3600, stale-if-error=3600'})
+
+    # ATS timed out
+    self.assertEqual(
+      httplib.GATEWAY_TIMEOUT,
+      result.status_code
+    )
+
+    caddy_log_file = glob.glob(
+      os.path.join(
+        self.instance_path, '*', 'var', 'log', 'httpd-cache-direct',
+        '_enable_cache_access_log'
+      ))[0]
+
+    matching_line_amount = 0
+    pattern = re.compile(
+      r'.*GET .test_enable_cache_ats_timeout.*" 499.*')
+    with open(caddy_log_file) as fh:
+      for line in fh.readlines():
+        if pattern.match(line):
+          matching_line_amount += 1
+
+    # Caddy used between ATS and the backend received only one connection
+    self.assertEqual(
+      1,
+      matching_line_amount)
+
+    timeout = 5
+    b = time.time()
+    # ATS created squid.log with a delay
+    while True:
+      if (time.time() - b) > timeout:
+        self.fail('Squid log file did not appear in %ss' % (timeout,))
+      ats_log_file_list = glob.glob(
+        os.path.join(
+          self.instance_path, '*', 'var', 'log', 'trafficserver', 'squid.log'
+        ))
+      if len(ats_log_file_list) == 1:
+        ats_log_file = ats_log_file_list[0]
+        break
+      time.sleep(0.1)
+
+    pattern = re.compile(
+      r'.*ERR_READ_TIMEOUT/504 .*test_enable_cache_ats_timeout'
+      '.*TIMEOUT_DIRECT*')
+    timeout = 5
+    b = time.time()
+    # ATS needs some time to flush logs
+    while True:
+      matching_line_amount = 0
+      if (time.time() - b) > timeout:
+        break
+      with open(ats_log_file) as fh:
+        for line in fh.readlines():
+          if pattern.match(line):
+            matching_line_amount += 1
+      if matching_line_amount > 0:
+        break
+      time.sleep(0.1)
+
+    # ATS has only one entry for this query
+    self.assertEqual(
+      1,
+      matching_line_amount)
+
+    # the result is available immediately after
+    result = self.fakeHTTPResult(
+      parameter_dict['domain'], parameter_dict['public-ipv4'],
+      'test-path/deep/.././deeper', headers={
+        'X-Reply-Header-Cache-Control': 'max-age=1, stale-while-'
+        'revalidate=3600, stale-if-error=3600'})
+    self.assertEqualResultJson(result, 'Path', '/test-path/deeper')
+
   def test_enable_cache_disable_no_cache_request(self):
     parameter_dict = self.assertSlaveBase(
       'enable_cache-disable-no-cache-request')
software/neoppod/cluster.cfg.in
 {% import "root_common" as common_macro with context %}
 
-[request-common]
-<= request-common-base
-
 {{ common_macro.request_neo(slapparameter_dict, 'neo', 'node-') }}
 
+[publish-early]
+recipe = slapos.cookbook:publish-early
+-init =
+  neo-admins node-cluster:admins
+  neo-masters node-cluster:masters
+
 [publish]
 recipe = slapos.cookbook:publish.serialised
-neo-masters = ${node-0-final:connection-masters}
-neo-admins = ${node-0-final:connection-admins}
+-extends = publish-early
 
 {{ common_macro.common_section() }}
software/neoppod/instance-common.cfg.in
...
@@ -42,5 +42,6 @@ extra-context =
key admin_cfg neo-admin:rendered
{%- if mariadb_location is defined %}
raw mariadb_location {{ mariadb_location }}
raw template_mysqld_wrapper {{ template_mysqld_wrapper }}
raw template_neo_my_cnf {{ template_neo_my_cnf }}
{%- endif %}
software/neoppod/instance-neo-admin.cfg.in
...
@@ -13,7 +13,7 @@ ip = ${publish:ip}
port = ${publish:port-admin}
ssl = {{ dumps(bool(slapparameter_dict['ssl'])) }}
cluster = {{ dumps(slapparameter_dict['cluster']) }}
masters = ${publish:masters}
masters = {{ dumps(slapparameter_dict['masters']) }}
[neo-admin-promise]
recipe = slapos.cookbook:check_port_listening
...
software/neoppod/instance-neo-master.cfg.in
...
@@ -21,7 +21,7 @@ autostart = {{ slapparameter_dict['autostart'] }}
# No actual installation takes place at that time
# (slapos.cookbook:neoppod.master raises), but cfg expansion must succeed. So
# this default value is required.
masters = ${publish:masters}
masters = {{ dumps(slapparameter_dict['masters']) }}
[neo-master-promise]
recipe = slapos.cookbook:check_port_listening
...
software/neoppod/instance-neo.cfg.in
...
@@ -7,18 +7,17 @@
{% set mysql = storage_type == 'MySQL' -%}
{% if mysql -%}
[mysqld]
recipe = slapos.cookbook:generic.mysql.wrap_mysqld
output = ${directory:etc_run}/mariadb
binary = ${:mysql-base-directory}/bin/mysqld
configuration-file = ${my-cnf:rendered}
data-directory = ${directory:srv_mariadb}
mysql-install-binary = ${:mysql-base-directory}/scripts/mysql_install_db
mysql-base-directory = {{ mariadb_location }}
[{{ section('mysqld') }}]
recipe = slapos.recipe.template:jinja2
template = {{ template_mysqld_wrapper }}
rendered = ${directory:etc_run}/mariadb
context =
key defaults_file my-cnf:rendered
key datadir my-cnf-parameters:data-directory
[my-cnf-parameters]
socket = ${directory:var_run}/mariadb.sock
data-directory = ${mysqld:data-directory}
data-directory = ${directory:srv}/mariadb
tmp-directory = ${directory:tmp}
pid-file = ${directory:var_run}/mariadb.pid
error-log = ${directory:log}/mariadb_error.log
...
@@ -34,9 +33,9 @@ rendered = ${directory:etc}/mariadb.cnf
template = {{ template_neo_my_cnf }}
context = section parameter_dict my-cnf-parameters
[{{ section('binary-wrap-mysql') }}]
[binary-wrap-mysql]
recipe = slapos.cookbook:wrapper
command-line = ${mysqld:mysql-base-directory}/bin/${:command} --defaults-file=${my-cnf:rendered}
command-line = '{{ mariadb_location }}/bin/${:command}' --defaults-file="${my-cnf:rendered}"
wrapper-path = ${directory:bin}/${:command}
command = mysql
...
@@ -47,17 +46,6 @@ recipe = slapos.cookbook:symbolic.link
target-directory = ${directory:bin}
link-binary = {{ bin_directory }}/neolog
{% set master_list = [] -%}
{% set admin_list = [] -%}
{% for k, v in slapparameter_dict.iteritems() -%}
{% if k.startswith('master-') and v -%}
{% do master_list.append(v) -%}
{% endif -%}
{% if k.startswith('admin-') and v -%}
{% do admin_list.append(v) -%}
{% endif -%}
{% endfor -%}
[publish]
recipe = slapos.cookbook:publish.serialised
# TODO: make port a partition parameter
...
@@ -76,10 +64,6 @@ admin = ${:ip}:${:port-admin}
{% else -%}
admin =
{% endif -%}
masters = {{ ' '.join(sorted(master_list)) }}
{%- if admin_list %}
admins = {{ ' '.join(sorted(admin_list)) }}
{%- endif %}
{#- Hack to deploy SSL certs via instance parameters #}
{%- for name, pem in zip(('ca.crt', 'neo.crt', 'neo.key'),
...
@@ -102,7 +86,7 @@ binary = {{ bin_directory }}/neostorage
ip = ${publish:ip}
ssl = {{ dumps(bool(slapparameter_dict['ssl'])) }}
cluster = {{ dumps(slapparameter_dict['cluster']) }}
masters = ${publish:masters}
masters = {{ dumps(slapparameter_dict['masters']) }}
database-adapter = {{ storage_type }}
wait-database = -1
engine = ${my-cnf-parameters:engine}
...
@@ -122,7 +106,7 @@ database-parameters = root@neo{{ i }}${my-cnf-parameters:socket}
database-parameters = ${directory:db-{{i}}}/db.sqlite
[directory]
db-{{i}} = ${buildout:directory}/srv/{{ storage_id }}
db-{{i}} = ${:srv}/{{ storage_id }}
{%- endif %}
[{{ section('logrotate-storage-' ~ i) }}]
...
@@ -143,9 +127,9 @@ etc_run = ${:etc}/run
var_run = ${:var}/run
log = ${buildout:directory}/var/log
tmp = ${buildout:directory}/tmp
{% if mysql -%}
srv = ${buildout:directory}/srv
srv_mariadb = ${buildout:directory}/srv/mariadb
{% if mysql -%}
[init-script]
recipe = slapos.recipe.template:jinja2
# XXX: is there a better location ?
...
@@ -157,7 +141,7 @@ template = inline:
<= logrotate-entry-base
name = mariadb
log = ${my-cnf-parameters:error-log} ${my-cnf-parameters:slow-query-log}
post = ${mysqld:mysql-base-directory}/bin/mysql --defaults-file="${my-cnf:rendered}" -e "FLUSH LOGS"
post = ${binary-wrap-mysql:command-line} -e "FLUSH LOGS"
{% if runTestSuite_in is defined -%}
# bin/runTestSuite to run NEO tests
...
@@ -170,7 +154,7 @@ context =
section directory directory
section my_cnf_parameters my-cnf-parameters
raw bin_directory {{ bin_directory }}
raw prepend_path ${mysqld:mysql-base-directory}/bin
raw prepend_path {{ mariadb_location }}/bin
{%- endif %}
{%- endif %}
...
software/neoppod/root-common.cfg.in
...
@@ -39,6 +39,10 @@ parts =
{% set section_id_list = [] -%}
[{{ prefix }}request-common]
<= request-common-base
return =
master
admin
config-masters = {{ '${' ~ prefix ~ 'cluster:masters}' }}
config-cluster = {{ parameter_dict['cluster'] }}
{% set replicas = parameter_dict.get('replicas', 0) -%}
config-partitions = {{ dumps(parameter_dict.get('partitions', 12)) }}
...
@@ -62,35 +66,16 @@ config-autostart = {{ dumps(sum(storage_count)) }}
{% set section_id = prefix ~ i -%}
{% do section_id_list.append(section_id) -%}
[{{ section_id }}-base]
[{{section_id}}]
<= {{ prefix }}request-common
name = {{ section_id }}
{% for k, v in node.iteritems() -%}
config-{{ k }} = {{ dumps(v) }}
{% endfor -%}
{{ sla(section_id) }}
[{{ section_id }}]
<= {{ prefix }}request-common
{{ section_id }}-base
return =
master
admin
{% endfor -%}
[final-base]
{% for i, section_id in enumerate(section_id_list) -%}
config-master-{{i}} = {{ '${' + section_id + ':connection-master}' }}
config-admin-{{i}} = {{ '${' + section_id + ':connection-admin}' }}
{% endfor -%}
{% endfor -%}
{% for section_id in section_id_list -%}
[{{ section(section_id + '-final') }}]
<= {{ prefix }}request-common
final-base
{{ section_id }}-base
{% if loop.first -%}
return =
masters
admins
{% endif -%}
{% endfor -%}
[{{section(prefix ~ 'cluster')}}]
recipe = slapos.cookbook:neoppod.cluster
nodes = {{ ' '.join(section_id_list) }}
{% endmacro -%}
software/neoppod/software-common.cfg
...
@@ -94,7 +94,7 @@ mode = 644
recipe = slapos.recipe.template:jinja2
template = ${:_profile_base_location_}/${:_buildout_section_name_}.cfg.in
rendered = ${buildout:directory}/${:_buildout_section_name_}.cfg
md5sum = c0e22816537b56bceef0b4c2b40f6219
md5sum = 0a3a54fcc7be0bbd63cbd64f006ceebc
context =
key bin_directory buildout:bin-directory
key develop_eggs_directory buildout:develop-eggs-directory
...
@@ -107,33 +107,56 @@ context =
${:adapter-context}
adapter-context =
key mariadb_location mariadb:location
key template_mysqld_wrapper template-mysqld-wrapper:rendered
key template_neo_my_cnf template-neo-my-cnf:target
[root-common]
<= download-base-neo
md5sum = 66055aa82f9097c5301864c07e6e5d80
md5sum = 15fa47a59cc3019f59612aaf33bd9ec5
[instance-neo-admin]
<= download-base-neo
md5sum = 4d1ae570b4458e7725454857aabb37f6
md5sum = ce0d9ff9e899bb706351a99df29238a9
[instance-neo-master]
<= download-base-neo
md5sum = 1fee10f02c2fa2a581e21878ca0fd704
md5sum = 4faee020eaf7cd495cd6210dfa4eb0c1
[instance-neo]
<= download-base-neo
md5sum = d4e30d74316e6931da4a1e305f9bbc68
md5sum = 5fc9fcaec3a5387625af34fe686097ae
[template-neo-my-cnf]
<= download-base-neo
url = ${:_profile_base_location_}/my.cnf.in
md5sum = 9f6f8f2b5f4cb0d97d50ffc1d3837e2f
[template-mysqld-wrapper]
recipe = slapos.recipe.template:jinja2
rendered = ${buildout:parts-directory}/${:_buildout_section_name_}/mysqld.in
mode = 644
template =
inline:{% raw %}#!/bin/sh -e
datadir='{{datadir}}'
[ -e "$datadir" ] || {
rm -vrf "$datadir.new"
'${mariadb:location}/scripts/mysql_install_db' \
--defaults-file='{{defaults_file}}' \
--skip-name-resolve \
--basedir='${mariadb:location}' \
--datadir="$datadir.new"
mv -v "$datadir.new" "$datadir"
}
exec '${mariadb:location}/bin/mysqld' \
--defaults-file='{{defaults_file}}' \
"$@"
{% endraw %}
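In other words, the rendered wrapper initializes the MariaDB data directory on first run (into a temporary `.new` directory that is renamed only on success) and then execs mysqld. A rough Python equivalent, for illustration only; the three constants are placeholders standing in for the values substituted by buildout and jinja2, not real option names:

import os
import shutil
import subprocess
import sys

MARIADB = '/path/to/parts/mariadb'        # placeholder for ${mariadb:location}
DEFAULTS_FILE = '/path/to/mariadb.cnf'    # placeholder for {{defaults_file}}
DATADIR = '/path/to/srv/mariadb'          # placeholder for {{datadir}}

if not os.path.exists(DATADIR):
    # Initialize into a temporary directory and rename it only on success, so
    # an interrupted initialization is never mistaken for a valid datadir.
    new = DATADIR + '.new'
    shutil.rmtree(new, ignore_errors=True)
    subprocess.check_call([
        os.path.join(MARIADB, 'scripts', 'mysql_install_db'),
        '--defaults-file=' + DEFAULTS_FILE,
        '--skip-name-resolve',
        '--basedir=' + MARIADB,
        '--datadir=' + new,
    ])
    os.rename(new, DATADIR)

# Replace the current process with mysqld, forwarding any extra arguments.
mysqld = os.path.join(MARIADB, 'bin', 'mysqld')
os.execv(mysqld, [mysqld, '--defaults-file=' + DEFAULTS_FILE] + sys.argv[1:])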
[versions]
BTrees = 4.5.1
ZODB = 4.4.5
coverage = 4.5.1
mock = 3.0.5
ecdsa = 0.13
gitdb2 = 2.0.0
msgpack = 0.5.6
...
@@ -146,7 +169,7 @@ setproctitle = 1.1.10
slapos.recipe.template = 4.3
smmap2 = 2.0.1
transaction = 1.7.0
zodbpickle = 0.6.0
zodbpickle = 1.0.4
zodbtools = 0.0.0.dev4
cython-zstd = 0.2
python-dateutil = 2.7.3
...
@@ -164,10 +187,12 @@ ZEO = 4.3.1+SlapOSPatched001
# ZEO==4.3.1
zdaemon = 4.2.0
# Required by:
# mock = 3.0.5
funcsigs = 1.0.2
# Test Suite: NEO-MASTER ran at 2019/08/28 16:24:58.949371 UTC
# 22 failures, 1 errors, 839 total, status: FAIL
[neoppod-repository]
revision = c681f666c191581551c9d63e1f302270fd6a343d
software/neoppod/software-zodb5.cfg
...
@@ -2,12 +2,13 @@
extends = software.cfg
[neoppod]
eggs += mock
ZEO-patches =
[versions]
ZODB = 5.4.0
ZODB = 5.5.1
ZEO = 5.2.0
transaction = 2.2.1
transaction = 2.4.0
# Required by:
# ZEO==5.2.0
...
software/neoppod/software.cfg
...
@@ -22,7 +22,7 @@ context =
[cluster]
<= download-base-neo
md5sum = ee8401a4e7d82bf488a57e3399f9ce48
md5sum = 5afd326de385563b5aeac81039f23341
[runTestSuite.in]
recipe = slapos.recipe.build:download
...
software/slapos-master/apache-backend.conf.in
...
@@ -131,13 +131,11 @@ SSLCipherSuite ECDH+AESGCM:DH+AESGCM:ECDH+AES256:DH+AES256:ECDH+AES128:DH+AES:EC
SSLSessionCache shmcb:{{ parameter_dict['ssl-session-cache'] }}(512000)
SSLProxyEngine On
# As backend is trusting REMOTE_USER header unset it always
# As backend is trusting Remote-User header unset it always
RequestHeader unset REMOTE_USER
RequestHeader unset Remote-User
RequestHeader unset SSL_CLIENT_SERIAL
{% if parameter_dict['ca-cert'] -%}
SSLVerifyClient optional
RequestHeader set REMOTE_USER %{SSL_CLIENT_S_DN_CN}s
RequestHeader set Remote-User %{SSL_CLIENT_S_DN_CN}s
RequestHeader set SSL_CLIENT_SERIAL "%{SSL_CLIENT_M_SERIAL}s"
SSLCACertificateFile {{ parameter_dict['ca-cert'] }}
{% if not parameter_dict['shared-ca-cert'] %}
{% if parameter_dict['crl'] -%}
...
@@ -168,7 +166,7 @@ Listen {{ ip }}:{{ port }}
{% if enable_authentication and parameter_dict['shared-ca-cert'] and parameter_dict['shared-crl'] -%}
SSLVerifyClient require
# Custom block we use for now different parameters.
RequestHeader set REMOTE_USER %{SSL_CLIENT_S_DN_CN}s
RequestHeader set Remote-User %{SSL_CLIENT_S_DN_CN}s
SSLCACertificateFile {{ parameter_dict['shared-ca-cert'] }}
SSLCARevocationPath {{ parameter_dict['shared-crl'] }}
...
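The hunks above matter because the backend trusts the Remote-User header for authentication, so the frontend strips any client-supplied value and only re-sets it from the verified client certificate. A quick sanity check of that behaviour could look like the following (an illustrative sketch using the requests library; the frontend URL and the header-echoing test backend are assumptions, not part of this repository):

import requests

# Send a spoofed Remote-User header to the frontend. With the configuration
# above, the header is unset before proxying, so a test backend that echoes
# the headers it receives should never report the spoofed value.
response = requests.get(
    'https://frontend.example.com/echo-headers',  # assumed test endpoint
    headers={'Remote-User': 'superuser'},
    verify=False,  # typical for a test setup with self-signed certificates
)
assert 'superuser' not in response.text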
software/slapos-master/buildout.hash.cfg
...
@@ -22,4 +22,4 @@ md5sum = e8033d4fd7b6348b525a6148762ccdb4
[template-apache-backend-conf]
filename = apache-backend.conf.in
md5sum = aff99c44ccf16eaa2ca25430d76d3bd6
md5sum = 48f086ce1acffca7bab942b43d856fb7
stack/erp5/buildout.cfg
...
@@ -820,12 +820,6 @@ unidiff = 0.5.5
jsonpickle = 0.9.6
decorator = 4.3.0
mock = 3.0.4
# Required by:
# mock = 3.0.4
funcsigs = 1.0.2
responses = 0.10.6
# Required by:
...
stack/erp5/buildout.hash.cfg
...
@@ -78,7 +78,7 @@ md5sum = d41d8cd98f00b204e9800998ecf8427e
[template-erp5]
filename = instance-erp5.cfg.in
md5sum = ca5375204bacdc1df30285d3c5d179b1
md5sum = af5d9aeac2bae695220465a4348ae592
[template-zeo]
filename = instance-zeo.cfg.in
...
stack/erp5/instance-erp5.cfg.in
...
@@ -212,7 +212,7 @@ config-test-runner-node-count = {{ dumps(test_runner_node_count) }}
{% if server_type == 'neo' -%}
config-neo-cluster = ${publish-early:neo-cluster}
config-neo-name = {{ server_dict.keys()[0] }}
config-neo-masters = ${neo-0-final:connection-masters}
config-neo-masters = ${publish-early:neo-masters}
{% else -%}
config-zodb-zeo = ${request-zodb:connection-storage-dict}
config-tidstorage-ip = ${request-zodb:connection-tidstorage-ip}
...
@@ -368,10 +368,6 @@ return = site_url
<= monitor-publish
recipe = slapos.cookbook:publish.serialised
-extends = publish-early
{% if 'neo' in storage_dict -%}
neo-masters = ${neo-0-final:connection-masters}
neo-admins = ${neo-0-final:connection-admins}
{% endif -%}
{% if zope_address_list_id_dict -%}
{#
Pick any published hosts-dict, they are expected to be identical - and there is
...
@@ -388,7 +384,6 @@ hosts-dict = {{ '${' ~ zope_address_list_id_dict.keys()[0] ~ ':connection-hosts-
{% endfor -%}
{% endif -%}
[publish-early]
recipe = slapos.cookbook:publish-early
-init =
...
@@ -404,6 +399,8 @@ recipe = slapos.cookbook:publish-early
{%- endif %}
{%- if neo %}
neo-cluster gen-neo-cluster:name
neo-admins neo-cluster:admins
neo-masters neo-cluster:masters
{%- if neo[0] %}
neo-cluster = {{ dumps(neo[0]) }}
{%- endif %}
...
stack/slapos.cfg
...
@@ -135,7 +135,7 @@ pyOpenSSL = 18.0.0
pyparsing = 2.2.0
pytz = 2016.10
requests = 2.13.0
six = 1.11.0
six = 1.12.0
slapos.cookbook = 1.0.119
slapos.core = 1.4.26
slapos.extension.strip = 0.4
...