Compare commits

...

55 Commits

Author SHA1 Message Date
Ulysses Souza
46118bc5e0 "Bump 1.26.0-rc3"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-03-10 16:25:59 +01:00
Ulysses Souza
3288404b24 Merge branch 'master' into 1.26.x 2020-03-10 16:18:46 +01:00
Ulysses Souza
e9dc97fcf3 Merge pull request #7276 from ulyssessouza/conformity-tests
Add conformity tests
2020-03-10 16:17:16 +01:00
Ulysses Souza
0ace76114b Add conformity tests hack
- That can be used with:
./script/test/acceptance

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-03-10 15:41:28 +01:00
Anca Iordache
3c89ff843a Merge pull request #7268 from aiordache/fix_v3.8_schema_support
Fix v3.8 schema support for binaries
2020-03-05 17:03:46 +01:00
Anca Iordache
98abe07646 Fix v3.8 schema support for binaries
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-03-05 15:46:45 +01:00
Ulysses Souza
2882cf56ed Merge pull request #7254 from StefanScherer/update-jenkins-badge
Update Jenkins build status badge
2020-02-27 18:15:43 +01:00
Stefan Scherer
2769c33a7e Update Jenkins build status badge
Signed-off-by: Stefan Scherer <stefan.scherer@docker.com>
2020-02-27 17:23:50 +01:00
Ulysses Souza
6b988aa1f1 Merge pull request #7251 from ulyssessouza/downgrade-idna
Resolve a compatibility issue
2020-02-27 10:58:35 +01:00
Ulysses Souza
cfefeaa6f7 Resolve a compatibility issue
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-27 10:48:37 +01:00
Ulysses Souza
07cab51378 Downgrade gitpython to 2.1.15 and idna to 2.8
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 13:40:16 +01:00
Ulysses Souza
12637904f0 Bump gitpython -> 3.1.0
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 13:10:14 +01:00
Ulysses Souza
d412a1e47f Merge pull request #6937 from apollo13/issue6871
Properly escape values coming from env_files, fixes #6871
2020-02-24 11:52:14 +01:00
Ulysses Souza
160034f678 "Bump 1.26.0-rc1"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 11:42:32 +01:00
Ulysses Souza
61b749cecd Merge pull request #7238 from docker/dependabot/pip/idna-2.9
Bump idna from 2.8 to 2.9
2020-02-21 11:54:50 +01:00
Ulysses Souza
80ba1f07f3 Merge pull request #7224 from docker/dependabot/pip/python-dotenv-0.11.0
Bump python-dotenv from 0.10.5 to 0.11.0
2020-02-18 14:18:52 +01:00
dependabot-preview[bot]
0f8e7a8874 Bump idna from 2.8 to 2.9
Bumps [idna](https://github.com/kjd/idna) from 2.8 to 2.9.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](https://github.com/kjd/idna/compare/v2.8...v2.9)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-17 22:16:41 +00:00
dependabot-preview[bot]
01df88e1eb Bump python-dotenv from 0.10.5 to 0.11.0
Bumps [python-dotenv](https://github.com/theskumar/python-dotenv) from 0.10.5 to 0.11.0.
- [Release notes](https://github.com/theskumar/python-dotenv/releases)
- [Changelog](https://github.com/theskumar/python-dotenv/blob/master/CHANGELOG.md)
- [Commits](https://github.com/theskumar/python-dotenv/compare/v0.10.5...v0.11.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-10 22:15:09 +00:00
Ulysses Souza
bb93a18500 Merge pull request #7217 from aiordache/compose_v3.8_schema_support
Add v3.8 schema support
2020-02-10 16:25:45 +01:00
Anca Iordache
79fe7ca997 add warning when max_replicas_per_node limits scale
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-10 11:52:16 +01:00
Florian Apolloner
5fe0858450 Updated requirements.txt back to the released python-dotenv 0.11.0.
Signed-off-by: Florian Apolloner <florian@apolloner.eu>
2020-02-10 10:48:21 +01:00
Anca Iordache
02d8e9ee14 set min engine version needed for v38 schema support
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
d9b0fabd9b update api version for 3.8
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
09c80ce49b test update - remove 'placement' from unsupported fields
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
391e5a6bc2 Add v3.8 schema support
- service scale bounded by 'max_replicas_per_node' field

Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Ulysses Souza
13bacba2b9 Merge pull request #7202 from aiordache/devtool28_compose_docker_contexts
Implement docker contexts to target different docker engines
2020-02-07 13:35:23 +01:00
Anca Iordache
2dfd85e30e Use docker context interface from docker-py
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-06 12:01:37 +01:00
Ulysses Souza
eb20915c6e Merge pull request #7045 from ndeloof/command_suggest
Suggested command is invalid
2020-02-03 18:01:27 +01:00
Ulysses Souza
4d5a3fdf8f Merge pull request #7211 from ulyssessouza/add-release-script
Add release validation and tagging script release.py
2020-02-03 17:55:50 +01:00
Ulysses Souza
b5c4f4fc0f Add release validation and tagging script release.py
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-03 16:52:35 +01:00
Ulysses Souza
d9af02eddf Merge pull request #7199 from ulyssessouza/fix-exec-dotenv
Remove `None` entries on exec command
2020-01-31 10:47:53 +01:00
Ulysses Souza
9f5f8b4757 Remove None entries on execute command
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-30 19:26:33 +01:00
Ulysses Souza
160b5c1755 Merge pull request #7197 from ulyssessouza/fix-mac-deployment-target
Force MacOS SDK version to "10.11"
2020-01-30 15:54:32 +01:00
Ulysses Souza
b62722a3c5 Force MacOS SDK version to "10.11"
This is due to the fact that the new CI machines are on 10.14

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-29 17:11:00 +01:00
Ulysses Souza
edf27e486a Pass the interpolation value to python-dotenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-29 14:29:32 +01:00
Florian Apolloner
f17e7268b0 Properly escape values coming from env_files, fixes #6871
Signed-off-by: Florian Apolloner <florian@apolloner.eu>
2020-01-28 16:33:28 +01:00
Ulysses Souza
73551d5a92 Merge pull request #7165 from hartwork/setup-py-add-missing-test-dependency-ddt
Add missing test dependency ddt to setup.py
2020-01-27 15:06:33 +01:00
Ulysses Souza
4c9d19cf5c Merge pull request #7190 from docker/dependabot/pip/pytest-5.3.4
Bump pytest from 5.3.2 to 5.3.4
2020-01-27 14:58:56 +01:00
Ulysses Souza
409a9d8207 Merge pull request #7150 from ulyssessouza/add-python-dotenv
Add python-dotenv to delegate `.env` file processing
2020-01-27 14:48:41 +01:00
dependabot-preview[bot]
97b8bff14e Bump pytest from 5.3.2 to 5.3.4
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.2 to 5.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.2...5.3.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-27 13:30:57 +00:00
Ulysses Souza
6d2658ea65 Add python-dotenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-27 13:50:34 +01:00
Ulysses Souza
4bf623d53d Merge pull request #7163 from ndeloof/changelog
Compute changelog by searching previous tag .. even from a tag
2020-01-23 19:01:19 +01:00
Ulysses Souza
ec0f8a8f7c Merge pull request #7177 from docker/ndeloof-patch-1
Tag as `x.y.z` without `v` prefix
2020-01-23 18:59:52 +01:00
Ulysses Souza
8d5023e1ca Enforce Python37 in the creation of virtualenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-23 18:43:36 +01:00
Ulysses Souza
be807a0be8 Merge pull request #7181 from chris-crone/bump-linux-python
Bump Linux and Python
2020-01-23 12:20:41 +01:00
Christopher Crone
a259c48ae9 Bump Linux and Python
* Alpine 3.10 to 3.11
* Debian Stretch 20191118 to 20191224
* Python 3.7.5 to 3.7.6

Signed-off-by: Christopher Crone <christopher.crone@docker.com>
2020-01-23 11:30:26 +01:00
Nicolas De Loof
a9c79bd5b1 Force sha256 file to be ASCII encoded
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-22 18:58:27 +01:00
Nicolas De Loof
8ad480546f Tag as x.y.z without v prefix
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>

Close https://github.com/docker/compose/issues/7168
2020-01-22 13:55:24 +01:00
Sebastian Pipping
9887121c2c setup.py: Expose test dependencies as extra "tests"
Signed-off-by: Sebastian Pipping <sebastian@pipping.org>
2020-01-20 19:54:34 +01:00
Sebastian Pipping
1c28b5a5d0 setup.py: Add missing test dependency ddt
Signed-off-by: Sebastian Pipping <sebastian@pipping.org>
2020-01-20 19:54:30 +01:00
Nicolas De Loof
84dad1a0e6 Compute changelog by searching previous tag .. even from a tag
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-20 16:18:02 +01:00
Ulysses Souza
ebc56c5ade Merge pull request #7133 from docker/jenkins
Automate release process
2020-01-20 15:23:59 +01:00
Nicolas De Loof
b2e9b83d46 update public CI so we run tests on same combinations of python+docker
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-09 16:15:34 +01:00
Nicolas De Loof
7ca5973a71 run release on tag by Jenkinsfile
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-09 16:11:03 +01:00
Nicolas De Loof
657bdef6ff Suggested command is invalid
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2019-11-22 09:25:50 +01:00
45 changed files with 2314 additions and 1716 deletions


@@ -1,9 +1,9 @@
 ARG DOCKER_VERSION=19.03.5
-ARG PYTHON_VERSION=3.7.5
-ARG BUILD_ALPINE_VERSION=3.10
+ARG PYTHON_VERSION=3.7.6
+ARG BUILD_ALPINE_VERSION=3.11
 ARG BUILD_DEBIAN_VERSION=slim-stretch
-ARG RUNTIME_ALPINE_VERSION=3.10.3
-ARG RUNTIME_DEBIAN_VERSION=stretch-20191118-slim
+ARG RUNTIME_ALPINE_VERSION=3.11.3
+ARG RUNTIME_DEBIAN_VERSION=stretch-20191224-slim
 ARG BUILD_PLATFORM=alpine

Jenkinsfile

@@ -1,8 +1,8 @@
 #!groovy
-def dockerVersions = ['19.03.5', '18.09.9']
+def dockerVersions = ['19.03.5']
 def baseImages = ['alpine', 'debian']
-def pythonVersions = ['py27', 'py37']
+def pythonVersions = ['py37']
 pipeline {
     agent none


@@ -56,7 +56,7 @@ Installation and documentation
 Contributing
 ------------
-[![Build Status](https://jenkins.dockerproject.org/buildStatus/icon?job=docker/compose/master)](https://jenkins.dockerproject.org/job/docker/job/compose/job/master/)
+[![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
 Want to help build Compose? Check out our [contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).


@@ -2,7 +2,7 @@
 def dockerVersions = ['19.03.5', '18.09.9']
 def baseImages = ['alpine', 'debian']
-def pythonVersions = ['py27', 'py37']
+def pythonVersions = ['py37']
 pipeline {
     agent none
@@ -72,10 +72,13 @@ pipeline {
         agent {
             label 'mac-python'
         }
+        environment {
+            DEPLOYMENT_TARGET="10.11"
+        }
         steps {
             checkout scm
             sh './script/setup/osx'
-            sh 'tox -e py27,py37 -- tests/unit'
+            sh 'tox -e py37 -- tests/unit'
             sh './script/build/osx'
             dir ('dist') {
                 checksum('docker-compose-Darwin-x86_64')
@@ -112,7 +115,7 @@ pipeline {
         }
         steps {
             checkout scm
-            bat 'tox.exe -e py27,py37 -- tests/unit'
+            bat 'tox.exe -e py37 -- tests/unit'
             powershell '.\\script\\build\\windows.ps1'
             dir ('dist') {
                 checksum('docker-compose-Windows-x86_64.exe')
@@ -159,6 +162,9 @@ pipeline {
         agent {
             label 'linux'
         }
+        environment {
+            GITHUB_TOKEN = credentials('github-release-token')
+        }
         steps {
             checkout scm
             sh 'mkdir -p dist'
@@ -167,7 +173,20 @@ pipeline {
             unstash "bin-linux"
             unstash "bin-win"
             unstash "changelog"
-            githubRelease()
+            sh("""
+                curl -SfL https://github.com/github/hub/releases/download/v2.13.0/hub-linux-amd64-2.13.0.tgz | tar xzv --wildcards 'hub-*/bin/hub' --strip=2
+                ./hub release create --draft --prerelease=${env.TAG_NAME !=~ /v[0-9\.]+/} \\
+                    -a docker-compose-Darwin-x86_64 \\
+                    -a docker-compose-Darwin-x86_64.sha256 \\
+                    -a docker-compose-Darwin-x86_64.tgz \\
+                    -a docker-compose-Darwin-x86_64.tgz.sha256 \\
+                    -a docker-compose-Linux-x86_64 \\
+                    -a docker-compose-Linux-x86_64.sha256 \\
+                    -a docker-compose-Windows-x86_64.exe \\
+                    -a docker-compose-Windows-x86_64.exe.sha256 \\
+                    -a ../script/run/run.sh \\
+                    -F CHANGELOG.md \${TAG_NAME}
+            """)
         }
     }
 }
@@ -175,20 +194,18 @@ pipeline {
         agent {
             label 'linux'
         }
+        environment {
+            PYPIRC = credentials('pypirc-docker-dsg-cibot')
+        }
         steps {
             checkout scm
-            withCredentials([[$class: "FileBinding", credentialsId: 'pypirc-docker-dsg-cibot', variable: 'PYPIRC']]) {
-                sh """
-                    virtualenv venv-publish
-                    source venv-publish/bin/activate
-                    python setup.py sdist bdist_wheel
-                    pip install twine
-                    twine upload --config-file ${PYPIRC} ./dist/docker-compose-${env.TAG_NAME}.tar.gz ./dist/docker_compose-${env.TAG_NAME}-py2.py3-none-any.whl
-                """
-            }
-        }
-        post {
-            sh 'deactivate; rm -rf venv-publish'
+            sh """
+                rm -rf build/ dist/
+                pip install wheel
+                python setup.py sdist bdist_wheel
+                pip install twine
+                ~/.local/bin/twine upload --config-file ${PYPIRC} ./dist/docker-compose-*.tar.gz ./dist/docker_compose-*-py2.py3-none-any.whl
+            """
         }
     }
 }
@@ -268,9 +285,8 @@ def buildRuntimeImage(baseImage) {
 def pushRuntimeImage(baseImage) {
     unstash "compose-${baseImage}"
-    sh 'echo -n "${DOCKERHUB_CREDS_PSW}" | docker login --username "${DOCKERHUB_CREDS_USR}" --password-stdin'
     sh "docker load -i dist/docker-compose-${baseImage}.tar"
-    withDockerRegistry(credentialsId: 'dockerbuildbot-hub.docker.com') {
+    withDockerRegistry(credentialsId: 'dockerhub-dockerdsgcibot') {
         sh "docker push docker/compose:${baseImage}-${env.TAG_NAME}"
         if (baseImage == "alpine" && env.TAG_NAME != null) {
             sh "docker tag docker/compose:alpine-${env.TAG_NAME} docker/compose:${env.TAG_NAME}"
@@ -279,37 +295,10 @@ def pushRuntimeImage(baseImage) {
     }
 }
-def githubRelease() {
-    withCredentials([string(credentialsId: 'github-compose-release-test-token', variable: 'GITHUB_TOKEN')]) {
-        def prerelease = !( env.TAG_NAME ==~ /v[0-9\.]+/ )
-        changelog = readFile "CHANGELOG.md"
-        def data = """{
-            \"tag_name\": \"${env.TAG_NAME}\",
-            \"name\": \"${env.TAG_NAME}\",
-            \"draft\": true,
-            \"prerelease\": ${prerelease},
-            \"body\" : \"${changelog}\"
-        """
-        echo $data
-        def url = "https://api.github.com/repos/docker/compose/releases"
-        def upload_url = sh(returnStdout: true, script: """
-            curl -sSf -H 'Authorization: token ${GITHUB_TOKEN}' -H 'Accept: application/json' -H 'Content-type: application/json' -X POST -d '$data' $url") \\
-            | jq '.upload_url | .[:rindex("{")]'
-        """)
-        sh("""
-            for f in * ; do
-                curl -sf -H 'Authorization: token ${GITHUB_TOKEN}' -H 'Accept: application/json' -H 'Content-type: application/octet-stream' \\
-                -X POST --data-binary @\$f ${upload_url}?name=\$f;
-            done
-        """)
-    }
-}
 def checksum(filepath) {
     if (isUnix()) {
         sh "openssl sha256 -r -out ${filepath}.sha256 ${filepath}"
     } else {
-        powershell "(Get-FileHash -Path ${filepath} -Algorithm SHA256 | % hash) + ' *${filepath}' > ${filepath}.sha256"
+        powershell "(Get-FileHash -Path ${filepath} -Algorithm SHA256 | % hash).ToLower() + ' *${filepath}' | Out-File -encoding ascii ${filepath}.sha256"
     }
 }

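Note on the `checksum()` change above: Windows PowerShell's `>` redirection writes UTF-16 with a BOM and `Get-FileHash` returns uppercase hex, so the generated `.sha256` files could not be checked with `sha256sum -c`; the new line lowercases the digest and forces ASCII output. A minimal Python sketch of the same `<hash> *<filename>` convention (the `write_checksum` helper is illustrative, not part of the diff):

```
import hashlib

def write_checksum(filepath):
    # Lowercase sha256 hex plus " *<filename>", written as plain ASCII so
    # that `sha256sum -c` / `shasum -c` can verify it on any platform.
    with open(filepath, 'rb') as f:
        digest = hashlib.sha256(f.read()).hexdigest()
    with open(filepath + '.sha256', 'w', encoding='ascii') as out:
        out.write('{} *{}\n'.format(digest, filepath))
```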

@@ -1,4 +1,4 @@
 from __future__ import absolute_import
 from __future__ import unicode_literals
-__version__ = '1.26.0dev'
+__version__ = '1.26.0-rc3'


@@ -8,7 +8,6 @@ import re
 import six
 from . import errors
-from . import verbose_proxy
 from .. import config
 from .. import parallel
 from ..config.environment import Environment
@@ -17,10 +16,10 @@ from ..const import LABEL_CONFIG_FILES
 from ..const import LABEL_ENVIRONMENT_FILE
 from ..const import LABEL_WORKING_DIR
 from ..project import Project
-from .docker_client import docker_client
-from .docker_client import get_tls_version
-from .docker_client import tls_config_from_options
-from .utils import get_version_info
+from .docker_client import get_client
+from .docker_client import load_context
+from .docker_client import make_context
 from .errors import UserError
 log = logging.getLogger(__name__)
@@ -48,16 +47,28 @@ def project_from_options(project_dir, options, additional_options=None):
     environment.silent = options.get('COMMAND', None) in SILENT_COMMANDS
     set_parallel_limit(environment)
-    host = options.get('--host')
+    # get the context for the run
+    context = None
+    context_name = options.get('--context', None)
+    if context_name:
+        context = load_context(context_name)
+        if not context:
+            raise UserError("Context '{}' not found".format(context_name))
+    host = options.get('--host', None)
+    if host is not None:
+        if context:
+            raise UserError(
+                "-H, --host and -c, --context are mutually exclusive. Only one should be set.")
+        host = host.lstrip('=')
+        context = make_context(host, options, environment)
     return get_project(
         project_dir,
         get_config_path_from_options(project_dir, options, environment),
         project_name=options.get('--project-name'),
        verbose=options.get('--verbose'),
-        host=host,
-        tls_config=tls_config_from_options(options, environment),
+        context=context,
        environment=environment,
        override_dir=override_dir,
        compatibility=compatibility_from_options(project_dir, options, environment),
@@ -112,25 +123,8 @@ def get_config_path_from_options(base_dir, options, environment):
     return None
-def get_client(environment, verbose=False, version=None, tls_config=None, host=None,
-               tls_version=None):
-    client = docker_client(
-        version=version, tls_config=tls_config, host=host,
-        environment=environment, tls_version=get_tls_version(environment)
-    )
-    if verbose:
-        version_info = six.iteritems(client.version())
-        log.info(get_version_info('full'))
-        log.info("Docker base_url: %s", client.base_url)
-        log.info("Docker version: %s",
-                 ", ".join("%s=%s" % item for item in version_info))
-        return verbose_proxy.VerboseProxy('docker', client)
-    return client
 def get_project(project_dir, config_path=None, project_name=None, verbose=False,
-                host=None, tls_config=None, environment=None, override_dir=None,
+                context=None, environment=None, override_dir=None,
                 compatibility=False, interpolate=True, environment_file=None):
     if not environment:
         environment = Environment.from_env_file(project_dir)
@@ -145,8 +139,7 @@ def get_project(project_dir, config_path=None, project_name=None, verbose=False,
         API_VERSIONS[config_data.version])
     client = get_client(
-        verbose=verbose, version=api_version, tls_config=tls_config,
-        host=host, environment=environment
+        verbose=verbose, version=api_version, context=context, environment=environment
     )
     with errors.handle_connection_errors(client):


@@ -5,17 +5,22 @@ import logging
 import os.path
 import ssl
 import six
 from docker import APIClient
+from docker import Context
+from docker import ContextAPI
+from docker import TLSConfig
 from docker.errors import TLSParameterError
-from docker.tls import TLSConfig
 from docker.utils import kwargs_from_env
 from docker.utils.config import home_dir
+from . import verbose_proxy
 from ..config.environment import Environment
 from ..const import HTTP_TIMEOUT
 from ..utils import unquote_path
 from .errors import UserError
 from .utils import generate_user_agent
+from .utils import get_version_info
 log = logging.getLogger(__name__)
@@ -24,6 +29,33 @@ def default_cert_path():
     return os.path.join(home_dir(), '.docker')
+def make_context(host, options, environment):
+    tls = tls_config_from_options(options, environment)
+    ctx = Context("compose", host=host)
+    if tls:
+        ctx.set_endpoint("docker", host, tls, skip_tls_verify=not tls.verify)
+    return ctx
+def load_context(name=None):
+    return ContextAPI.get_context(name)
+def get_client(environment, verbose=False, version=None, context=None):
+    client = docker_client(
+        version=version, context=context,
+        environment=environment, tls_version=get_tls_version(environment)
+    )
+    if verbose:
+        version_info = six.iteritems(client.version())
+        log.info(get_version_info('full'))
+        log.info("Docker base_url: %s", client.base_url)
+        log.info("Docker version: %s",
+                 ", ".join("%s=%s" % item for item in version_info))
+        return verbose_proxy.VerboseProxy('docker', client)
+    return client
 def get_tls_version(environment):
     compose_tls_version = environment.get('COMPOSE_TLS_VERSION', None)
     if not compose_tls_version:
@@ -87,8 +119,7 @@ def tls_config_from_options(options, environment=None):
     return None
-def docker_client(environment, version=None, tls_config=None, host=None,
-                  tls_version=None):
+def docker_client(environment, version=None, context=None, tls_version=None):
     """
     Returns a docker-py client configured using environment variables
     according to the same logic as the official Docker client.
@@ -101,10 +132,21 @@
         "and DOCKER_CERT_PATH are set correctly.\n"
         "You might need to run `eval \"$(docker-machine env default)\"`")
-    if host:
-        kwargs['base_url'] = host
-    if tls_config:
-        kwargs['tls'] = tls_config
+    if not context:
+        # check env for DOCKER_HOST and certs path
+        host = kwargs.get("base_url", None)
+        tls = kwargs.get("tls", None)
+        verify = False if not tls else tls.verify
+        if host:
+            context = Context("compose", host=host)
+        else:
+            context = ContextAPI.get_current_context()
+        if tls:
+            context.set_endpoint("docker", host=host, tls_cfg=tls, skip_tls_verify=not verify)
+    kwargs['base_url'] = context.Host
+    if context.TLSConfig:
+        kwargs['tls'] = context.TLSConfig
     if version:
         kwargs['version'] = version

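The fallback order implemented above: an explicit context wins, a `DOCKER_HOST` taken from the environment builds an ad-hoc "compose" context, and otherwise the current `docker context` is used. A condensed sketch of that resolution, assuming docker-py >= 4.2 (which, per the diff, exports `Context` and `ContextAPI`); `resolve_context` is an illustrative name, not a function in the diff:

```
from docker import Context
from docker import ContextAPI
from docker.utils import kwargs_from_env

def resolve_context(explicit_context=None):
    if explicit_context:                     # -c/--context on the command line
        return explicit_context
    kwargs = kwargs_from_env()               # reads DOCKER_HOST, DOCKER_TLS_VERIFY, ...
    host = kwargs.get("base_url")
    if host:                                 # an env-provided host beats the stored context
        return Context("compose", host=host)
    return ContextAPI.get_current_context()  # whatever `docker context use` selected
```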

@@ -192,6 +192,7 @@ class TopLevelCommand(object):
       (default: docker-compose.yml)
   -p, --project-name NAME     Specify an alternate project name
       (default: directory name)
+  -c, --context NAME          Specify a context name
   --verbose                   Show more output
   --log-level LEVEL           Set log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
   --no-ansi                   Do not print ANSI control characters
@@ -1201,7 +1202,7 @@ def image_digests_for_project(project):
         if e.needs_push:
             command_hint = (
-                "Use `docker-compose push {}` to push them. "
+                "Use `docker push {}` to push them. "
                 .format(" ".join(sorted(e.needs_push)))
             )
             paras += [
@@ -1212,7 +1213,7 @@ def image_digests_for_project(project):
         if e.needs_pull:
             command_hint = (
-                "Use `docker-compose pull {}` to pull them. "
+                "Use `docker pull {}` to pull them. "
                 .format(" ".join(sorted(e.needs_pull)))
             )
@@ -1466,7 +1467,12 @@ def call_docker(args, dockeropts, environment):
     args = [executable_path] + tls_options + args
     log.debug(" ".join(map(pipes.quote, args)))
-    return subprocess.call(args, env=environment)
+    filtered_env = {}
+    for k, v in environment.items():
+        if v is not None:
+            filtered_env[k] = environment[k]
+    return subprocess.call(args, env=filtered_env)
 def parse_scale_args(options):

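The `filtered_env` block above exists because `subprocess.call` requires every environment value to be a string; after the python-dotenv change, keys declared in a `.env` file without a value can come back as `None`, which used to crash `docker-compose exec`. A small illustration (the key names and values are made up):

```
import subprocess

env = {'PATH': '/usr/bin', 'EMPTY_KEY': None}  # e.g. a bare "EMPTY_KEY" line in .env
# subprocess.call(['env'], env=env)  # TypeError: str expected, not NoneType
safe_env = {k: v for k, v in env.items() if v is not None}
subprocess.call(['env'], env=safe_env)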

@@ -408,7 +408,7 @@ def load(config_details, compatibility=False, interpolate=True):
     configs = load_mapping(
         config_details.config_files, 'get_configs', 'Config', config_details.working_dir
     )
-    service_dicts = load_services(config_details, main_file, compatibility)
+    service_dicts = load_services(config_details, main_file, compatibility, interpolate=interpolate)
     if main_file.version != V1:
         for service_dict in service_dicts:
@@ -460,7 +460,7 @@ def validate_external(entity_type, name, config, version):
             entity_type, name, ', '.join(k for k in config if k != 'external')))
-def load_services(config_details, config_file, compatibility=False):
+def load_services(config_details, config_file, compatibility=False, interpolate=True):
     def build_service(service_name, service_dict, service_names):
         service_config = ServiceConfig.with_abs_paths(
             config_details.working_dir,
@@ -479,7 +479,8 @@ def load_services(config_details, config_file, compatibility=False):
             service_names,
             config_file.version,
             config_details.environment,
-            compatibility
+            compatibility,
+            interpolate
         )
         return service_dict
@@ -679,13 +680,13 @@ class ServiceExtendsResolver(object):
     return filename
-def resolve_environment(service_dict, environment=None):
+def resolve_environment(service_dict, environment=None, interpolate=True):
     """Unpack any environment variables from an env_file, if set.
     Interpolate environment values if set.
     """
     env = {}
     for env_file in service_dict.get('env_file', []):
-        env.update(env_vars_from_file(env_file))
+        env.update(env_vars_from_file(env_file, interpolate))
     env.update(parse_environment(service_dict.get('environment')))
     return dict(resolve_env_var(k, v, environment) for k, v in six.iteritems(env))
@@ -881,11 +882,12 @@ def finalize_service_volumes(service_dict, environment):
     return service_dict
-def finalize_service(service_config, service_names, version, environment, compatibility):
+def finalize_service(service_config, service_names, version, environment, compatibility,
+                     interpolate=True):
     service_dict = dict(service_config.config)
     if 'environment' in service_dict or 'env_file' in service_dict:
-        service_dict['environment'] = resolve_environment(service_dict, environment)
+        service_dict['environment'] = resolve_environment(service_dict, environment, interpolate)
         service_dict.pop('env_file', None)
     if 'volumes_from' in service_dict:
@@ -990,12 +992,17 @@ def translate_deploy_keys_to_container_config(service_dict):
     deploy_dict = service_dict['deploy']
     ignored_keys = [
-        k for k in ['endpoint_mode', 'labels', 'update_config', 'rollback_config', 'placement']
+        k for k in ['endpoint_mode', 'labels', 'update_config', 'rollback_config']
         if k in deploy_dict
     ]
     if 'replicas' in deploy_dict and deploy_dict.get('mode', 'replicated') == 'replicated':
-        service_dict['scale'] = deploy_dict['replicas']
+        scale = deploy_dict.get('replicas', 1)
+        max_replicas = deploy_dict.get('placement', {}).get('max_replicas_per_node', scale)
+        service_dict['scale'] = min(scale, max_replicas)
+        if max_replicas < scale:
+            log.warning("Scale is limited to {} ('max_replicas_per_node' field).".format(
+                max_replicas))
     if 'restart_policy' in deploy_dict:
         service_dict['restart'] = {

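In compatibility mode, the hunk above converts swarm `deploy` settings into a container-level `scale`, now capped by `placement.max_replicas_per_node` (new in the v3.8 schema). A self-contained sketch of that bounding rule (`bounded_scale` is an illustrative extraction, not a function in the diff):

```
import logging

log = logging.getLogger(__name__)

def bounded_scale(deploy_dict):
    # Requested replica count, capped by placement.max_replicas_per_node.
    scale = deploy_dict.get('replicas', 1)
    max_replicas = deploy_dict.get('placement', {}).get('max_replicas_per_node', scale)
    if max_replicas < scale:
        log.warning("Scale is limited to {} ('max_replicas_per_node' field).".format(max_replicas))
    return min(scale, max_replicas)

# bounded_scale({'replicas': 6, 'placement': {'max_replicas_per_node': 2}}) -> 2
```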
File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,12 +1,11 @@
 from __future__ import absolute_import
 from __future__ import unicode_literals
-import codecs
-import contextlib
 import logging
 import os
 import re
+import dotenv
 import six
 from ..const import IS_WINDOWS_PLATFORM
@@ -31,7 +30,7 @@ def split_env(env):
     return key, value
-def env_vars_from_file(filename):
+def env_vars_from_file(filename, interpolate=True):
     """
     Read in a line delimited file of environment variables.
     """
@@ -39,16 +38,10 @@ def env_vars_from_file(filename):
         raise EnvFileNotFound("Couldn't find env file: {}".format(filename))
     elif not os.path.isfile(filename):
         raise EnvFileNotFound("{} is not a file.".format(filename))
-    env = {}
-    with contextlib.closing(codecs.open(filename, 'r', 'utf-8-sig')) as fileobj:
-        for line in fileobj:
-            line = line.strip()
-            if line and not line.startswith('#'):
-                try:
-                    k, v = split_env(line)
-                    env[k] = v
-                except ConfigurationError as e:
-                    raise ConfigurationError('In file {}: {}'.format(filename, e.msg))
+    env = dotenv.dotenv_values(dotenv_path=filename, encoding='utf-8-sig', interpolate=interpolate)
+    for k, v in env.items():
+        env[k] = v if interpolate else v.replace('$', '$$')
     return env

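With env-file parsing delegated to python-dotenv, interpolation of `${VAR}` references now happens inside `dotenv_values`; when interpolation is disabled, the `$` doubling above keeps the raw value inert through Compose's own later interpolation pass. For example (the file contents here are hypothetical):

```
import dotenv

# Given a .env file containing:
#   USER=admin
#   EMAIL=${USER}@example.org
values = dotenv.dotenv_values(dotenv_path='.env', encoding='utf-8-sig', interpolate=True)
# values == {'USER': 'admin', 'EMAIL': 'admin@example.org'}
# With interpolate=False, EMAIL stays '${USER}@example.org' and is re-escaped
# to '$${USER}@example.org' so it survives Compose's config interpolation.
```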

@@ -41,7 +41,10 @@ COMPOSEFILE_V3_4 = ComposeVersion('3.4')
 COMPOSEFILE_V3_5 = ComposeVersion('3.5')
 COMPOSEFILE_V3_6 = ComposeVersion('3.6')
 COMPOSEFILE_V3_7 = ComposeVersion('3.7')
+COMPOSEFILE_V3_8 = ComposeVersion('3.8')
+# minimum DOCKER ENGINE API version needed to support
+# features for each compose schema version
 API_VERSIONS = {
     COMPOSEFILE_V1: '1.21',
     COMPOSEFILE_V2_0: '1.22',
@@ -57,6 +60,7 @@ API_VERSIONS = {
     COMPOSEFILE_V3_5: '1.30',
     COMPOSEFILE_V3_6: '1.36',
     COMPOSEFILE_V3_7: '1.38',
+    COMPOSEFILE_V3_8: '1.38',
 }
 API_VERSION_TO_ENGINE_VERSION = {
@@ -74,4 +78,5 @@ API_VERSION_TO_ENGINE_VERSION = {
     API_VERSIONS[COMPOSEFILE_V3_5]: '17.06.0',
     API_VERSIONS[COMPOSEFILE_V3_6]: '18.02.0',
     API_VERSIONS[COMPOSEFILE_V3_7]: '18.06.0',
+    API_VERSIONS[COMPOSEFILE_V3_8]: '18.06.0',
 }

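The two tables above chain a compose-file schema version to the minimum Engine API version, and that API version to the minimum Engine release; v3.8 reuses the 1.38 floor of v3.7. A quick illustration of the lookup (assuming the `compose.const` names from the diff):

```
from compose.const import API_VERSIONS
from compose.const import API_VERSION_TO_ENGINE_VERSION
from compose.const import COMPOSEFILE_V3_8

api = API_VERSIONS[COMPOSEFILE_V3_8]          # '1.38'
engine = API_VERSION_TO_ENGINE_VERSION[api]   # '18.06.0'
```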

@@ -87,6 +87,11 @@ exe = EXE(pyz,
               'compose/config/config_schema_v3.7.json',
               'DATA'
           ),
+          (
+              'compose/config/config_schema_v3.8.json',
+              'compose/config/config_schema_v3.8.json',
+              'DATA'
+          ),
           (
               'compose/GITSHA',
               'compose/GITSHA',


@@ -96,6 +96,11 @@ coll = COLLECT(exe,
               'compose/config/config_schema_v3.7.json',
               'DATA'
           ),
+          (
+              'compose/config/config_schema_v3.8.json',
+              'compose/config/config_schema_v3.8.json',
+              'DATA'
+          ),
           (
               'compose/GITSHA',
               'compose/GITSHA',


@@ -1,7 +1,9 @@
+Click==7.0
 coverage==5.0.3
 ddt==1.2.2
 flake8==3.7.9
+gitpython==2.1.15
 mock==3.0.5
-pytest==5.3.2; python_version >= '3.5'
+pytest==5.3.4; python_version >= '3.5'
 pytest==4.6.5; python_version < '3.5'
 pytest-cov==2.8.1


@@ -4,7 +4,7 @@ cached-property==1.5.1
 certifi==2019.11.28
 chardet==3.0.4
 colorama==0.4.3; sys_platform == 'win32'
-docker==4.1.0
+docker==4.2.0
 docker-pycreds==0.4.0
 dockerpty==0.4.1
 docopt==0.6.2
@@ -17,6 +17,7 @@ paramiko==2.7.1
 pypiwin32==219; sys_platform == 'win32' and python_version < '3.6'
 pypiwin32==223; sys_platform == 'win32' and python_version >= '3.6'
 PySocks==1.7.1
+python-dotenv==0.11.0
 PyYAML==5.3
 requests==2.22.0
 six==1.12.0


@@ -6,7 +6,7 @@
 #
 #   http://git-scm.com/download/win
 #
-# 2. Install Python 3.7.2:
+# 2. Install Python 3.7.x:
 #
 #    https://www.python.org/downloads/
 #
@@ -39,7 +39,7 @@ if (Test-Path venv) {
 Get-ChildItem -Recurse -Include *.pyc | foreach ($_) { Remove-Item $_.FullName }
 # Create virtualenv
-virtualenv .\venv
+virtualenv -p C:\Python37\python.exe .\venv
 # pip and pyinstaller generate lots of warnings, so we need to ignore them
 $ErrorActionPreference = "Continue"


@@ -1,201 +1,18 @@
 # Release HOWTO
-This file describes the process of making a public release of `docker-compose`.
-Please read it carefully before proceeding!
+The release process is fully automated by `Release.Jenkinsfile`.
-## Prerequisites
+## Usage
-The following things are required to bring a release to a successful conclusion
+1. In the appropriate branch, run `./scripts/release/release tag <version>`
-### Local Docker engine (Linux Containers)
+By appropriate, we mean for a version `1.26.0` or `1.26.0-rc1` you should run the script in the `1.26.x` branch.
-The release script builds images that will be part of the release.
+The script should check the above then ask for changelog modifications.
-### Docker Hub account
+After the executions, you should have a commit with the proper bumps for `docker-compose version` and `run.sh`
-You should be logged into a Docker Hub account that allows pushing to the
-following repositories:
+2. Run `git push --tags upstream <version_branch>`
+This should trigger a new CI build on the new tag. When the CI finishes with the tests and builds a new draft release would be available on github's releases page.
-- docker/compose
-- docker/compose-tests
-### Python
-The release script is written in Python and requires Python 3.3 at minimum.
-### A Github account and Github API token
-Your Github account needs to have write access on the `docker/compose` repo.
-To generate a Github token, head over to the
-[Personal access tokens](https://github.com/settings/tokens) page in your
-Github settings and select "Generate new token". Your token should include
-(at minimum) the following scopes:
-- `repo:status`
-- `public_repo`
-This API token should be exposed to the release script through the
-`GITHUB_TOKEN` environment variable.
-### A Bintray account and Bintray API key
-Your Bintray account will need to be an admin member of the
-[docker-compose organization](https://bintray.com/docker-compose).
-Additionally, you should generate a personal API key. To do so, click your
-username in the top-right hand corner and select "Edit profile" ; on the new
-page, select "API key" in the left-side menu.
-This API key should be exposed to the release script through the
-`BINTRAY_TOKEN` environment variable.
-### A PyPi account
-Said account needs to be a member of the maintainers group for the
-[`docker-compose` project](https://pypi.org/project/docker-compose/).
-Moreover, the `~/.pypirc` file should exist on your host and contain the
-relevant pypi credentials.
-The following is a sample `.pypirc` provided as a guideline:
-```
-[distutils]
-index-servers =
-    pypi
-[pypi]
-username = user
-password = pass
-```
-## Start a feature release
-A feature release is a release that includes all changes present in the
-`master` branch when initiated. It's typically versioned `X.Y.0-rc1`, where
-Y is the minor version of the previous release incremented by one. A series
-of one or more Release Candidates (RCs) should be made available to the public
-to find and squash potential bugs.
-From the root of the Compose repository, run the following command:
-```
-./script/release/release.sh -b <BINTRAY_USERNAME> start X.Y.0-rc1
-```
-After a short initialization period, the script will invite you to edit the
-`CHANGELOG.md` file. Do so by being careful to respect the same format as
-previous releases. Once done, the script will display a `diff` of the staged
-changes for the bump commit. Once you validate these, a bump commit will be
-created on the newly created release branch and pushed remotely.
-The release tool then waits for the CI to conclude before proceeding.
-If failures are reported, the release will be aborted until these are fixed.
-Please refer to the "Resume a draft release" section below for more details.
-Once all resources have been prepared, the release script will exit with a
-message resembling this one:
-```
-You're almost done! Please verify that everything is in order and you are ready
-to make the release public, then run the following command:
-./script/release/release.sh -b user finalize X.Y.0-rc1
-```
-Once you are ready to finalize the release (making binaries and other versioned
-assets public), proceed to the "Finalize a release" section of this guide.
-## Start a patch release
-A patch release is a release that builds off a previous release with discrete
-additions. This can be an RC release after RC1 (`X.Y.0-rcZ`, `Z > 1`), a GA release
-based off the final RC (`X.Y.0`), or a bugfix release based off a previous
-GA release (`X.Y.Z`, `Z > 0`).
-From the root of the Compose repository, run the following command:
-```
-./script/release/release.sh -b <BINTRAY_USERNAME> start --patch=BASE_VERSION RELEASE_VERSION
-```
-The process of starting a patch release is identical to starting a feature
-release except for one difference ; at the beginning, the script will ask for
-PR numbers you wish to cherry-pick into the release. These numbers should
-correspond to existing PRs on the docker/compose repository. Multiple numbers
-should be separated by whitespace.
-Once you are ready to finalize the release (making binaries and other versioned
-assets public), proceed to the "Finalize a release" section of this guide.
-## Finalize a release
-Once you're ready to make your release public, you may execute the following
-command from the root of the Compose repository:
-```
-./script/release/release.sh -b <BINTRAY_USERNAME> finalize RELEASE_VERSION
-```
-Note that this command will create and publish versioned assets to the public.
-As a result, it can not be reverted. The command will perform some basic
-sanity checks before doing so, but it is your responsibility to ensure
-everything is in order before pushing the button.
-After the command exits, you should make sure:
-- The `docker/compose:VERSION` image is available on Docker Hub and functional
-- The `pip install -U docker-compose==VERSION` command correctly installs the
-  specified version
-- The install command on the Github release page installs the new release
-## Resume a draft release
-"Resuming" a release lets you address the following situations occurring before
-a release is made final:
-- Cherry-pick additional PRs to include in the release
-- Resume a release that was aborted because of CI failures after they've been
-  addressed
-- Rebuild / redownload assets after manual changes have been made to the
-  release branch
-- etc.
-From the root of the Compose repository, run the following command:
-```
-./script/release/release.sh -b <BINTRAY_USERNAME> resume RELEASE_VERSION
-```
-The release tool will attempt to determine what steps it's already been through
-for the specified release and pick up where it left off. Some steps are
-executed again no matter what as it's assumed they'll produce different
-results, like building images or downloading binaries.
-## Cancel a draft release
-If issues snuck into your release branch, it is sometimes easier to start from
-scratch. Before a release has been finalized, it is possible to cancel it using
-the following command:
-```
-./script/release/release.sh -b <BINTRAY_USERNAME> cancel RELEASE_VERSION
-```
-This will remove the release branch with this release (locally and remotely),
-close the associated PR, remove the release page draft on Github and delete
-the Bintray repository for it, allowing you to start fresh.
-## Manual operations
-Some common, release-related operations are not covered by this tool and should
-be handled manually by the operator:
-- After any release:
-  - Announce new release on Slack
-- After a GA release:
-  - Close the release milestone
-  - Merge back `CHANGELOG.md` changes from the `release` branch into `master`
-  - Bump the version in `compose/__init__.py` to the *next* minor version
-    number with `dev` appended. For example, if you just released `1.4.0`,
-    update it to `1.5.0dev`
-  - Update compose_version in [github.com/docker/docker.github.io/blob/master/_config.yml](https://github.com/docker/docker.github.io/blob/master/_config.yml) and [github.com/docker/docker.github.io/blob/master/_config_authoring.yml](https://github.com/docker/docker.github.io/blob/master/_config_authoring.yml)
-  - Update the release note in [github.com/docker/docker.github.io](https://github.com/docker/docker.github.io/blob/master/release-notes/docker-compose.md)
-## Advanced options
-You can consult the full list of options for the release tool by executing
-`./script/release/release.sh --help`.
+3. Check and confirm the release on github's release page.

script/release/const.py

@@ -0,0 +1,7 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import os
REPO_ROOT = os.path.join(os.path.dirname(__file__), '..', '..')


@@ -10,7 +10,7 @@ set -x
 git config --add remote.origin.fetch +refs/pull/*/head:refs/remotes/origin/pull/*
 git fetch origin
-RANGE=${1:-"$(git describe --tags --abbrev=0)..HEAD"}
+RANGE=${1:-"$(git describe --tags --abbrev=0 HEAD^)..HEAD"}
 echo "Generate changelog for range ${RANGE}"
 echo
@@ -26,14 +26,17 @@ changes=$(pullrequests | uniq)
 echo "pull requests merged within range:"
 echo $changes
-echo '#Features' > CHANGELOG.md
+echo '#Features' > FEATURES.md
+echo '#Bugs' > BUGS.md
 for pr in $changes; do
-    curl -fs -H "Authorization: token ${GITHUB_TOKEN}" https://api.github.com/repos/docker/compose/pulls/${pr} \
-        | jq -r ' select( .labels[].name | contains("kind/feature") ) | "* "+.title' >> CHANGELOG.md
+    curl -fs -H "Authorization: token ${GITHUB_TOKEN}" https://api.github.com/repos/docker/compose/pulls/${pr} -o PR.json
+    cat PR.json | jq -r ' select( .labels[].name | contains("kind/feature") ) | "- "+.title' >> FEATURES.md
+    cat PR.json | jq -r ' select( .labels[].name | contains("kind/bug") ) | "- "+.title' >> BUGS.md
 done
-echo '#Bugs' >> CHANGELOG.md
-for pr in $changes; do
-    curl -fs -H "Authorization: token ${GITHUB_TOKEN}" https://api.github.com/repos/docker/compose/pulls/${pr} \
-        | jq -r ' select( .labels[].name | contains("kind/bug") ) | "* "+.title' >> CHANGELOG.md
-done
+echo ${TAG_NAME} > CHANGELOG.md
+echo >> CHANGELOG.md
+cat FEATURES.md >> CHANGELOG.md
+echo >> CHANGELOG.md
+cat BUGS.md >> CHANGELOG.md

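The loop above now fetches each PR once (into `PR.json`) and sorts its title into FEATURES.md or BUGS.md by label before assembling CHANGELOG.md. For readers less at home in bash/jq, a rough Python equivalent of that filtering (a hypothetical sketch — the real script stays in bash and, like it, needs `GITHUB_TOKEN` set):

```
import os
import requests

def titles_by_label(pr_numbers, label):
    titles = []
    for pr in pr_numbers:
        r = requests.get(
            'https://api.github.com/repos/docker/compose/pulls/{}'.format(pr),
            headers={'Authorization': 'token ' + os.environ['GITHUB_TOKEN']})
        data = r.json()
        if any(label in lbl['name'] for lbl in data.get('labels', [])):
            titles.append('- ' + data['title'])
    return titles

features = titles_by_label([7276, 7268], 'kind/feature')
```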

@@ -1,74 +0,0 @@
#!/bin/bash
#
# Create the official release
#
. "$(dirname "${BASH_SOURCE[0]}")/utils.sh"
function usage() {
>&2 cat << EOM
Publish a release by building all artifacts and pushing them.
This script requires that 'git config branch.${BRANCH}.release' is set to the
release version for the release branch.
EOM
exit 1
}
BRANCH="$(git rev-parse --abbrev-ref HEAD)"
VERSION="$(git config "branch.${BRANCH}.release")" || usage
if [ -z "$(command -v jq 2> /dev/null)" ]; then
>&2 echo "$0 requires https://stedolan.github.io/jq/"
>&2 echo "Please install it and make sure it is available on your \$PATH."
exit 2
fi
API=https://api.github.com/repos
REPO=docker/compose
GITHUB_REPO=git@github.com:$REPO
# Check the build status is green
sha=$(git rev-parse HEAD)
url=$API/$REPO/statuses/$sha
build_status=$(curl -s $url | jq -r '.[0].state')
if [ -n "$SKIP_BUILD_CHECK" ]; then
echo "Skipping build status check..."
elif [[ "$build_status" != "success" ]]; then
>&2 echo "Build status is $build_status, but it should be success."
exit -1
fi
echo "Tagging the release as $VERSION"
git tag $VERSION
git push $GITHUB_REPO $VERSION
echo "Uploading the docker image"
docker push docker/compose:$VERSION
echo "Uploading the compose-tests image"
docker push docker/compose-tests:latest
docker push docker/compose-tests:$VERSION
echo "Uploading package to PyPI"
./script/build/write-git-sha
python setup.py sdist bdist_wheel
if [ "$(command -v twine 2> /dev/null)" ]; then
twine upload ./dist/docker-compose-${VERSION/-/}.tar.gz ./dist/docker_compose-${VERSION/-/}-py2.py3-none-any.whl
else
python setup.py upload
fi
echo "Testing pip package"
deactivate || true
virtualenv venv-test
source venv-test/bin/activate
pip install docker-compose==$VERSION
docker-compose version
deactivate
rm -rf venv-test
echo "Now publish the github release, and test the downloads."
echo "Email maintainers@dockerproject.org and engineering@docker.com about the new release."


@@ -1,38 +0,0 @@
#!/bin/bash
#
# Move the "bump to <version>" commit to the HEAD of the branch
#
. "$(dirname "${BASH_SOURCE[0]}")/utils.sh"
function usage() {
>&2 cat << EOM
Move the "bump to <version>" commit to the HEAD of the branch
This script requires that 'git config branch.${BRANCH}.release' is set to the
release version for the release branch.
EOM
exit 1
}
BRANCH="$(git rev-parse --abbrev-ref HEAD)"
VERSION="$(git config "branch.${BRANCH}.release")" || usage
COMMIT_MSG="Bump $VERSION"
sha="$(git log --grep "$COMMIT_MSG\$" --format="%H")"
if [ -z "$sha" ]; then
>&2 echo "No commit with message \"$COMMIT_MSG\""
exit 2
fi
if [[ "$sha" == "$(git rev-parse HEAD)" ]]; then
>&2 echo "Bump commit already at HEAD"
exit 0
fi
commits=$(git log --format="%H" "$sha..HEAD" | wc -l | xargs echo)
git rebase --onto $sha~1 HEAD~$commits $BRANCH
git cherry-pick $sha


@@ -1,387 +1,126 @@
 #!/usr/bin/env python3
 from __future__ import absolute_import
 from __future__ import print_function
 from __future__ import unicode_literals
-import argparse
-import os
-import shutil
-import sys
-import time
+import re
-from jinja2 import Template
-from release.bintray import BintrayAPI
-from release.const import BINTRAY_ORG
-from release.const import NAME
-from release.const import REPO_ROOT
-from release.downloader import BinaryDownloader
-from release.images import ImageManager
-from release.images import is_tag_latest
-from release.pypi import check_pypirc
-from release.pypi import pypi_upload
-from release.repository import delete_assets
-from release.repository import get_contributors
-from release.repository import Repository
-from release.repository import upload_assets
-from release.utils import branch_name
-from release.utils import compatibility_matrix
-from release.utils import read_release_notes_from_changelog
-from release.utils import ScriptError
-from release.utils import update_init_py_version
-from release.utils import update_run_sh_version
-from release.utils import yesno
+import click
+from git import Repo
+from utils import update_init_py_version
+from utils import update_run_sh_version
+from utils import yesno
+VALID_VERSION_PATTERN = re.compile(r"^\d+\.\d+\.\d+(-rc\d+)?$")
-def create_initial_branch(repository, args):
-    release_branch = repository.create_release_branch(args.release, args.base)
-    if args.base and args.cherries:
-        print('Detected patch version.')
-        cherries = input('Indicate (space-separated) PR numbers to cherry-pick then press Enter:\n')
-        repository.cherry_pick_prs(release_branch, cherries.split())
+class Version(str):
+    def matching_groups(self):
+        match = VALID_VERSION_PATTERN.match(self)
+        if not match:
+            return False
-    return create_bump_commit(repository, release_branch, args.bintray_user, args.bintray_org)
+        return match.groups()
+    def is_ga_version(self):
+        groups = self.matching_groups()
+        if not groups:
+            return False
+        rc_suffix = groups[1]
+        return not rc_suffix
+    def validate(self):
+        return len(self.matching_groups()) > 0
+    def branch_name(self):
+        if not self.validate():
+            return None
+        rc_part = self.matching_groups()[0]
+        ver = self
+        if rc_part:
+            ver = ver[:-len(rc_part)]
+        tokens = ver.split(".")
+        tokens[-1] = 'x'
+        return ".".join(tokens)
-def create_bump_commit(repository, release_branch, bintray_user, bintray_org):
-    with release_branch.config_reader() as cfg:
-        release = cfg.get('release')
-    print('Updating version info in __init__.py and run.sh')
-    update_run_sh_version(release)
-    update_init_py_version(release)
+def create_bump_commit(repository, version):
     print('Creating bump commit...')
+    repository.commit('-a', '-s', '-m "Bump {}"'.format(version), '--no-verify')
+def validate_environment(version, repository):
+    if not version.validate():
+        print('Version "{}" has an invalid format. This should follow D+.D+.D+(-rcD+). '
+              'Like: 1.26.0 or 1.26.0-rc1'.format(version))
+        return False
+    expected_branch = version.branch_name()
+    if str(repository.active_branch) != expected_branch:
+        print('Cannot tag in this branch with version "{}". '
+              'Please checkout "{}" to tag'.format(version, version.branch_name()))
+        return False
+    return True
+@click.group()
+def cli():
+    pass
+@cli.command()
+@click.argument('version')
+def tag(version):
+    """
+    Updates the version related files and tag
+    """
+    repo = Repo(".")
+    version = Version(version)
+    if not validate_environment(version, repo):
+        return
+    update_init_py_version(version)
+    update_run_sh_version(version)
     input('Please add the release notes to the CHANGELOG.md file, then press Enter to continue.')
-    proceed = None
+    proceed = False
     while not proceed:
-        print(repository.diff())
+        print(repo.git.diff())
         proceed = yesno('Are these changes ok? y/N ', default=False)
-    if repository.diff():
-        repository.create_bump_commit(release_branch, release)
-    repository.push_branch_to_remote(release_branch)
-    bintray_api = BintrayAPI(os.environ['BINTRAY_TOKEN'], bintray_user)
-    if not bintray_api.repository_exists(bintray_org, release_branch.name):
-        print('Creating data repository {} on bintray'.format(release_branch.name))
-        bintray_api.create_repository(bintray_org, release_branch.name, 'generic')
+    if repo.git.diff():
+        create_bump_commit(repo.git, version)
     else:
-        print('Bintray repository {} already exists. Skipping'.format(release_branch.name))
+        print('No changes to commit. Exiting...')
+        return
+    repo.create_tag(version)
+    print('Please, check the changes. If everything is OK, you just need to push with:\n'
+          '$ git push --tags upstream {}'.format(version.branch_name()))
-def monitor_pr_status(pr_data):
-    print('Waiting for CI to complete...')
-    last_commit = pr_data.get_commits().reversed[0]
-    while True:
-        status = last_commit.get_combined_status()
-        if status.state == 'pending' or status.state == 'failure':
-            summary = {
-                'pending': 0,
-                'success': 0,
-                'failure': 0,
-                'error': 0,
-            }
-            for detail in status.statuses:
-                if detail.context == 'dco-signed':
-                    # dco-signed check breaks on merge remote-tracking ; ignore it
-                    continue
-                if detail.state in summary:
-                    summary[detail.state] += 1
-            print(
-                '{pending} pending, {success} successes, {failure} failures, '
-                '{error} errors'.format(**summary)
-            )
-            if summary['failure'] > 0 or summary['error'] > 0:
-                raise ScriptError('CI failures detected!')
-            elif summary['pending'] == 0 and summary['success'] > 0:
-                # This check assumes at least 1 non-DCO CI check to avoid race conditions.
-                # If testing on a repo without CI, use --skip-ci-check to avoid looping eternally
-                return True
-            time.sleep(30)
-        elif status.state == 'success':
-            print('{} successes: all clear!'.format(status.total_count))
-            return True
+@cli.command()
+@click.argument('version')
+def push_latest(version):
+    """
+    TODO Pushes the latest tag pointing to a certain GA version
+    """
+    raise NotImplementedError
-def check_pr_mergeable(pr_data):
-    if pr_data.mergeable is False:
-        # mergeable can also be null, in which case the warning would be a false positive.
-        print(
-            'WARNING!! PR #{} can not currently be merged. You will need to '
-            'resolve the conflicts manually before finalizing the release.'.format(pr_data.number)
-        )
-    return pr_data.mergeable is True
-def create_release_draft(repository, version, pr_data, files):
-    print('Creating Github release draft')
-    with open(os.path.join(os.path.dirname(__file__), 'release.md.tmpl'), 'r') as f:
-        template = Template(f.read())
-    print('Rendering release notes based on template')
-    release_notes = template.render(
-        version=version,
-        compat_matrix=compatibility_matrix(),
-        integrity=files,
-        contributors=get_contributors(pr_data),
-        changelog=read_release_notes_from_changelog(),
-    )
-    gh_release = repository.create_release(
-        version, release_notes, draft=True, prerelease='-rc' in version,
-        target_commitish='release'
-    )
-    print('Release draft initialized')
-    return gh_release
-def print_final_instructions(args):
-    print(
-        "You're almost done! Please verify that everything is in order and "
-        "you are ready to make the release public, then run the following "
-        "command:\n{exe} -b {user} finalize {version}".format(
-            exe='./script/release/release.sh', user=args.bintray_user, version=args.release
-        )
-    )
-def distclean():
-    print('Running distclean...')
-    dirs = [
-        os.path.join(REPO_ROOT, 'build'), os.path.join(REPO_ROOT, 'dist'),
-        os.path.join(REPO_ROOT, 'docker-compose.egg-info')
-    ]
-    files = []
-    for base, dirnames, fnames in os.walk(REPO_ROOT):
-        for fname in fnames:
-            path = os.path.normpath(os.path.join(base, fname))
-            if fname.endswith('.pyc'):
-                files.append(path)
-            elif fname.startswith('.coverage.'):
-                files.append(path)
-        for dirname in dirnames:
-            path = os.path.normpath(os.path.join(base, dirname))
-            if dirname == '__pycache__':
-                dirs.append(path)
-            elif dirname == '.coverage-binfiles':
-                dirs.append(path)
-    for file in files:
-        os.unlink(file)
-    for folder in dirs:
-        shutil.rmtree(folder, ignore_errors=True)
-def resume(args):
-    try:
-        distclean()
-        repository = Repository(REPO_ROOT, args.repo)
-        br_name = branch_name(args.release)
-        if not repository.branch_exists(br_name):
-            raise ScriptError('No local branch exists for this release.')
-        gh_release = repository.find_release(args.release)
-        if gh_release and not gh_release.draft:
-            print('WARNING!! Found non-draft (public) release for this version!')
-            proceed = yesno(
-                'Are you sure you wish to proceed? Modifying an already '
-                'released version is dangerous! y/N ', default=False
-            )
-            if proceed.lower() is not True:
-                raise ScriptError('Aborting release')
-        release_branch = repository.checkout_branch(br_name)
-        if args.cherries:
-            cherries = input('Indicate (space-separated) PR numbers to cherry-pick then press Enter:\n')
-            repository.cherry_pick_prs(release_branch, cherries.split())
-        create_bump_commit(repository, release_branch, args.bintray_user, args.bintray_org)
-        pr_data = repository.find_release_pr(args.release)
-        if not pr_data:
-            pr_data = repository.create_release_pull_request(args.release)
-        check_pr_mergeable(pr_data)
-        if not args.skip_ci:
-            monitor_pr_status(pr_data)
-        downloader = BinaryDownloader(args.destination)
-        files = downloader.download_all(args.release)
-        if not gh_release:
-            gh_release = create_release_draft(repository, args.release, pr_data, files)
-        delete_assets(gh_release)
-        upload_assets(gh_release, files)
-        tag_as_latest = is_tag_latest(args.release)
-        img_manager = ImageManager(args.release, tag_as_latest)
-        img_manager.build_images(repository)
-    except ScriptError as e:
-        print(e)
-        return 1
-    print_final_instructions(args)
-    return 0
-def cancel(args):
-    try:
-        repository = Repository(REPO_ROOT, args.repo)
-        repository.close_release_pr(args.release)
-        repository.remove_release(args.release)
-        repository.remove_bump_branch(args.release)
-        bintray_api = BintrayAPI(os.environ['BINTRAY_TOKEN'], args.bintray_user)
-        print('Removing Bintray data repository for {}'.format(args.release))
-        bintray_api.delete_repository(args.bintray_org, branch_name(args.release))
-        distclean()
-    except ScriptError as e:
-        print(e)
-        return 1
-    print('Release cancellation complete.')
-    return 0
-def start(args):
-    distclean()
-    try:
-        repository = Repository(REPO_ROOT, args.repo)
-        create_initial_branch(repository, args)
-        pr_data = repository.create_release_pull_request(args.release)
-        check_pr_mergeable(pr_data)
-        if not args.skip_ci:
-            monitor_pr_status(pr_data)
-        downloader = BinaryDownloader(args.destination)
-        files = downloader.download_all(args.release)
-        gh_release = create_release_draft(repository, args.release, pr_data, files)
-        upload_assets(gh_release, files)
-        tag_as_latest = is_tag_latest(args.release)
-        img_manager = ImageManager(args.release, tag_as_latest)
-        img_manager.build_images(repository)
-    except ScriptError as e:
-        print(e)
-        return 1
-    print_final_instructions(args)
-    return 0
-def finalize(args):
-    distclean()
-    try:
-        check_pypirc()
-        repository = Repository(REPO_ROOT, args.repo)
-        tag_as_latest = is_tag_latest(args.release)
-        img_manager = ImageManager(args.release, tag_as_latest)
-        pr_data = repository.find_release_pr(args.release)
-        if not pr_data:
-            raise ScriptError('No PR found for {}'.format(args.release))
-        if not check_pr_mergeable(pr_data):
-            raise ScriptError('Can not finalize release with an unmergeable PR')
-        if not img_manager.check_images():
-            raise ScriptError('Missing release image')
-        br_name = branch_name(args.release)
-        if not repository.branch_exists(br_name):
-            raise ScriptError('No local branch exists for this release.')
-        gh_release = repository.find_release(args.release)
-        if not gh_release:
-            raise ScriptError('No Github release draft for this version')
-        repository.checkout_branch(br_name)
-        os.system('python {setup_script} sdist bdist_wheel'.format(
-            setup_script=os.path.join(REPO_ROOT, 'setup.py')))
-        merge_status = pr_data.merge()
-        if not merge_status.merged and not args.finalize_resume:
-            raise ScriptError(
-                'Unable to merge PR #{}: {}'.format(pr_data.number, merge_status.message)
-            )
-        pypi_upload(args)
-        img_manager.push_images()
-        repository.publish_release(gh_release)
-    except ScriptError as e:
-        print(e)
-        return 1
-    return 0
-ACTIONS = [
-    'start',
-    'cancel',
-    'resume',
-    'finalize',
-]
-EPILOG = '''Example uses:
-* Start a new feature release (includes all changes currently in master)
-    release.sh -b user start 1.23.0
-* Start a new patch release
-    release.sh -b user --patch 1.21.0 start 1.21.1
-* Cancel / rollback an existing release draft
-    release.sh -b user cancel 1.23.0
-* Restart a previously aborted patch release
-    release.sh -b user -p 1.21.0 resume 1.21.1
-'''
-def main():
-    if 'GITHUB_TOKEN' not in os.environ:
-        print('GITHUB_TOKEN environment variable must be set')
-        return 1
-    if 'BINTRAY_TOKEN' not in os.environ:
-        print('BINTRAY_TOKEN environment variable must be set')
-        return 1
-    parser = argparse.ArgumentParser(
-        description='Orchestrate a new release of docker/compose. This tool assumes that you have '
-                    'obtained a Github API token and Bintray API key and set the GITHUB_TOKEN and '
-                    'BINTRAY_TOKEN environment variables accordingly.',
-        epilog=EPILOG, formatter_class=argparse.RawTextHelpFormatter)
-    parser.add_argument(
-        'action', choices=ACTIONS, help='The action to be performed for this release'
-    )
-    parser.add_argument('release', help='Release number, e.g. 1.9.0-rc1, 2.1.1')
-    parser.add_argument(
-        '--patch', '-p', dest='base',
-        help='Which version is being patched by this release'
-    )
-    parser.add_argument(
-        '--repo', '-r', dest='repo', default=NAME,
-        help='Start a release for the given repo (default: {})'.format(NAME)
-    )
-    parser.add_argument(
-        '-b', dest='bintray_user', required=True, metavar='USER',
-        help='Username associated with the Bintray API key'
-    )
-    parser.add_argument(
-        '--bintray-org', dest='bintray_org', metavar='ORG', default=BINTRAY_ORG,
-        help='Organization name on bintray where the data repository will be created.'
-    )
-    parser.add_argument(
-        '--destination', '-o', metavar='DIR', default='binaries',
-        help='Directory where release binaries will be downloaded relative to the project root'
-    )
-    parser.add_argument(
-        '--no-cherries', '-C', dest='cherries', action='store_false',
-        help='If set, the program will not prompt the user for PR numbers to cherry-pick'
-    )
-    parser.add_argument(
-        '--skip-ci-checks', dest='skip_ci', action='store_true',
-        help='If set, the program will not wait for CI jobs to complete'
-    )
-    parser.add_argument(
-        '--finalize-resume', dest='finalize_resume', action='store_true',
-        help='If set, finalize will continue through steps that have already been completed.'
-    )
-    args = parser.parse_args()
-    if args.action == 'start':
-        return start(args)
-    elif args.action == 'resume':
-        return resume(args)
-    elif args.action == 'cancel':
-        return cancel(args)
-    elif args.action == 'finalize':
-        return finalize(args)
-    print('Unexpected action "{}"'.format(args.action), file=sys.stderr)
-    return 1
+@cli.command()
+@click.argument('version')
+def ghtemplate(version):
+    """
+    TODO Generates the github release page content
+    """
+    version = Version(version)
+    raise NotImplementedError
 if __name__ == '__main__':
-    sys.exit(main())
+    cli()

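The heart of the new `release.py` is the `Version` helper: a tag must match `\d+.\d+.\d+` with an optional `-rcN` suffix, carries no `v` prefix, and may only be created from the matching `x` branch. A condensed, runnable sketch of that mapping (`branch_for` is an illustrative stand-in for `Version.branch_name()`):

```
import re

VALID_VERSION_PATTERN = re.compile(r"^\d+\.\d+\.\d+(-rc\d+)?$")

assert VALID_VERSION_PATTERN.match("1.26.0-rc3")
assert not VALID_VERSION_PATTERN.match("v1.26.0")  # tags carry no 'v' prefix now

def branch_for(version):
    # "1.26.0-rc3" and "1.26.0" both belong to the "1.26.x" branch.
    rc_part = VALID_VERSION_PATTERN.match(version).group(1)
    base = version[:-len(rc_part)] if rc_part else version
    tokens = base.split(".")
    tokens[-1] = "x"
    return ".".join(tokens)

assert branch_for("1.26.0-rc3") == "1.26.x"
```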

@@ -1,13 +0,0 @@
#!/bin/sh
if test -d ${VENV_DIR:-./.release-venv}; then
true
else
./script/release/setup-venv.sh
fi
if test -z "$*"; then
args="--help"
fi
${VENV_DIR:-./.release-venv}/bin/python ./script/release/release.py "$@"


@@ -1,50 +0,0 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import json
import requests
from .const import NAME
class BintrayAPI(requests.Session):
def __init__(self, api_key, user, *args, **kwargs):
super(BintrayAPI, self).__init__(*args, **kwargs)
self.auth = (user, api_key)
self.base_url = 'https://api.bintray.com/'
def create_repository(self, subject, repo_name, repo_type='generic'):
url = '{base}repos/{subject}/{repo_name}'.format(
base=self.base_url, subject=subject, repo_name=repo_name,
)
data = {
'name': repo_name,
'type': repo_type,
'private': False,
'desc': 'Automated release for {}: {}'.format(NAME, repo_name),
'labels': ['docker-compose', 'docker', 'release-bot'],
}
return self.post_json(url, data)
def repository_exists(self, subject, repo_name):
url = '{base}repos/{subject}/{repo_name}'.format(
base=self.base_url, subject=subject, repo_name=repo_name,
)
result = self.get(url)
if result.status_code == 404:
return False
result.raise_for_status()
return True
def delete_repository(self, subject, repo_name):
url = '{base}repos/{subject}/{repo_name}'.format(
base=self.base_url, subject=subject, repo_name=repo_name,
)
return self.delete(url)
def post_json(self, url, data, **kwargs):
if 'headers' not in kwargs:
kwargs['headers'] = {}
kwargs['headers']['Content-Type'] = 'application/json'
return self.post(url, data=json.dumps(data), **kwargs)
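A minimal sketch of driving this client (editorial illustration only; the user name and repository name are placeholders, BINTRAY_TOKEN is the variable release.py checks, and BINTRAY_ORG comes from const.py below):

import os
api = BintrayAPI(os.environ['BINTRAY_TOKEN'], 'someuser')
repo_name = 'bump-1.26.0-rc3'  # same shape branch_name(version) produces
if not api.repository_exists(BINTRAY_ORG, repo_name):
    api.create_repository(BINTRAY_ORG, repo_name).raise_for_status()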


@@ -1,10 +0,0 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import os
REPO_ROOT = os.path.join(os.path.dirname(__file__), '..', '..', '..')
NAME = 'docker/compose'
COMPOSE_TESTS_IMAGE_BASE_NAME = NAME + '-tests'
BINTRAY_ORG = 'docker-compose'


@@ -1,73 +0,0 @@
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
import hashlib
import os
import requests
from .const import BINTRAY_ORG
from .const import NAME
from .const import REPO_ROOT
from .utils import branch_name
class BinaryDownloader(requests.Session):
base_bintray_url = 'https://dl.bintray.com/{}'.format(BINTRAY_ORG)
base_appveyor_url = 'https://ci.appveyor.com/api/projects/{}/artifacts/'.format(NAME)
def __init__(self, destination, *args, **kwargs):
super(BinaryDownloader, self).__init__(*args, **kwargs)
self.destination = destination
os.makedirs(self.destination, exist_ok=True)
def download_from_bintray(self, repo_name, filename):
print('Downloading {} from bintray'.format(filename))
url = '{base}/{repo_name}/{filename}'.format(
base=self.base_bintray_url, repo_name=repo_name, filename=filename
)
full_dest = os.path.join(REPO_ROOT, self.destination, filename)
return self._download(url, full_dest)
def download_from_appveyor(self, branch_name, filename):
print('Downloading {} from appveyor'.format(filename))
url = '{base}dist%2F{filename}?branch={branch_name}'.format(
base=self.base_appveyor_url, filename=filename, branch_name=branch_name
)
full_dest = os.path.join(REPO_ROOT, self.destination, filename)
return self._download(url, full_dest)
def _download(self, url, full_dest):
m = hashlib.sha256()
r = self.get(url, stream=True)
r.raise_for_status()  # fail early rather than writing an error page to disk
with open(full_dest, 'wb') as f:
for chunk in r.iter_content(chunk_size=1024 * 600, decode_unicode=False):
print('.', end='', flush=True)
m.update(chunk)
f.write(chunk)
print(' download complete')
hex_digest = m.hexdigest()
with open(full_dest + '.sha256', 'w') as f:
f.write('{} {}\n'.format(hex_digest, os.path.basename(full_dest)))
return full_dest, hex_digest
def download_all(self, version):
files = {
'docker-compose-Darwin-x86_64.tgz': None,
'docker-compose-Darwin-x86_64': None,
'docker-compose-Linux-x86_64': None,
'docker-compose-Windows-x86_64.exe': None,
}
for filename in files.keys():
if 'Windows' in filename:
files[filename] = self.download_from_appveyor(
branch_name(version), filename
)
else:
files[filename] = self.download_from_bintray(
branch_name(version), filename
)
return files
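A hypothetical end-to-end use of the downloader (sketch only; the version string is a placeholder, and 'binaries' mirrors the --destination default):

downloader = BinaryDownloader('binaries')
artifacts = downloader.download_all('1.26.0-rc3')
for filename, (path, sha256) in artifacts.items():
    print('{}  {}'.format(sha256, path))  # the same pairing written to the .sha256 files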


@@ -1,157 +0,0 @@
from __future__ import absolute_import
from __future__ import print_function
from __future__ import unicode_literals
import base64
import json
import os
import docker
from enum import Enum
from .const import NAME
from .const import REPO_ROOT
from .utils import ScriptError
from .utils import yesno
from .const import COMPOSE_TESTS_IMAGE_BASE_NAME
class Platform(Enum):
ALPINE = 'alpine'
DEBIAN = 'debian'
def __str__(self):
return self.value
# Check whether this version matches the GA format ('x.y.z'), i.e. is not an RC
def is_tag_latest(version):
ga_version = all(n.isdigit() for n in version.split('.')) and version.count('.') == 2
return ga_version and yesno('Should this release be tagged as \"latest\"? [Y/n]: ', default=True)
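# For example (editorial note): '1.26.0' passes the format check and triggers the
# prompt, while '1.26.0-rc3' fails it and is never tagged "latest".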
class ImageManager(object):
def __init__(self, version, latest=False):
self.docker_client = docker.APIClient(**docker.utils.kwargs_from_env())
self.version = version
self.latest = latest
if 'HUB_CREDENTIALS' in os.environ:
print('HUB_CREDENTIALS found in environment, issuing login')
credentials = json.loads(base64.urlsafe_b64decode(os.environ['HUB_CREDENTIALS']))
self.docker_client.login(
username=credentials['Username'], password=credentials['Password']
)
def _tag(self, image, existing_tag, new_tag):
existing_repo_tag = '{image}:{tag}'.format(image=image, tag=existing_tag)
new_repo_tag = '{image}:{tag}'.format(image=image, tag=new_tag)
self.docker_client.tag(existing_repo_tag, new_repo_tag)
def get_full_version(self, platform=None):
return '{}-{}'.format(self.version, platform) if platform else self.version
def get_runtime_image_tag(self, tag):
return '{image_base_image}:{tag}'.format(
image_base_image=NAME,
tag=self.get_full_version(tag)
)
def build_runtime_image(self, repository, platform):
git_sha = repository.write_git_sha()
compose_image_base_name = NAME
print('Building {image} image ({platform} based)'.format(
image=compose_image_base_name,
platform=platform
))
full_version = self.get_full_version(platform)
build_tag = self.get_runtime_image_tag(platform)
logstream = self.docker_client.build(
REPO_ROOT,
tag=build_tag,
buildargs={
'BUILD_PLATFORM': platform.value,
'GIT_COMMIT': git_sha,
},
decode=True
)
for chunk in logstream:
if 'error' in chunk:
raise ScriptError('Build error: {}'.format(chunk['error']))
if 'stream' in chunk:
print(chunk['stream'], end='')
if platform == Platform.ALPINE:
self._tag(compose_image_base_name, full_version, self.version)
if self.latest:
self._tag(compose_image_base_name, full_version, platform)
if platform == Platform.ALPINE:
self._tag(compose_image_base_name, full_version, 'latest')
def get_ucp_test_image_tag(self, tag=None):
return '{image}:{tag}'.format(
image=COMPOSE_TESTS_IMAGE_BASE_NAME,
tag=tag or self.version
)
# Used for producing a test image for UCP
def build_ucp_test_image(self, repository):
print('Building test image (debian based for UCP e2e)')
git_sha = repository.write_git_sha()
ucp_test_image_tag = self.get_ucp_test_image_tag()
logstream = self.docker_client.build(
REPO_ROOT,
tag=ucp_test_image_tag,
target='build',
buildargs={
'BUILD_PLATFORM': Platform.DEBIAN.value,
'GIT_COMMIT': git_sha,
},
decode=True
)
for chunk in logstream:
if 'error' in chunk:
raise ScriptError('Build error: {}'.format(chunk['error']))
if 'stream' in chunk:
print(chunk['stream'], end='')
self._tag(COMPOSE_TESTS_IMAGE_BASE_NAME, self.version, 'latest')
def build_images(self, repository):
self.build_runtime_image(repository, Platform.ALPINE)
self.build_runtime_image(repository, Platform.DEBIAN)
self.build_ucp_test_image(repository)
def check_images(self):
for name in self.get_images_to_push():
try:
self.docker_client.inspect_image(name)
except docker.errors.ImageNotFound:
print('Expected image {} was not found'.format(name))
return False
return True
def get_images_to_push(self):
tags_to_push = {
"{}:{}".format(NAME, self.version),
self.get_runtime_image_tag(Platform.ALPINE),
self.get_runtime_image_tag(Platform.DEBIAN),
self.get_ucp_test_image_tag(),
self.get_ucp_test_image_tag('latest'),
}
if is_tag_latest(self.version):
tags_to_push.add("{}:latest".format(NAME))
return tags_to_push
def push_images(self):
tags_to_push = self.get_images_to_push()
print('Build tags to push {}'.format(tags_to_push))
for name in tags_to_push:
print('Pushing {} to Docker Hub'.format(name))
logstream = self.docker_client.push(name, stream=True, decode=True)
for chunk in logstream:
if 'status' in chunk:
print(chunk['status'])
if 'error' in chunk:
raise ScriptError(
'Error pushing {name}: {err}'.format(name=name, err=chunk['error'])
)
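A release run would drive this class roughly as follows (sketch; `repository` is an instance of the Repository helper further down, and get_images_to_push() may prompt interactively via is_tag_latest()):

img_manager = ImageManager('1.26.0-rc3')
img_manager.build_images(repository)
if img_manager.check_images():
    img_manager.push_images()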


@@ -1,44 +0,0 @@
from __future__ import absolute_import
from __future__ import unicode_literals
from configparser import Error
from requests.exceptions import HTTPError
from twine.commands.upload import main as twine_upload
from twine.utils import get_config
from .utils import ScriptError
def pypi_upload(args):
print('Uploading to PyPI')
try:
rel = args.release.replace('-rc', 'rc')
twine_upload([
'dist/docker_compose-{}*.whl'.format(rel),
'dist/docker-compose-{}*.tar.gz'.format(rel)
])
except HTTPError as e:
if e.response.status_code == 400 and 'File already exists' in str(e):
if not args.finalize_resume:
raise ScriptError(
'Package already uploaded on PyPI.'
)
print('Skipping PyPI upload - package already uploaded')
else:
raise ScriptError('Unexpected HTTP error uploading package to PyPI: {}'.format(e))
def check_pypirc():
try:
config = get_config()
except Error as e:
raise ScriptError('Failed to parse .pypirc file: {}'.format(e))
if config is None:
raise ScriptError('Failed to parse .pypirc file')
if 'pypi' not in config:
raise ScriptError('Missing [pypi] section in .pypirc file')
if not (config['pypi'].get('username') and config['pypi'].get('password')):
raise ScriptError('Missing login/password pair for pypi repo')
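For reference, a ~/.pypirc of roughly this shape satisfies check_pypirc() (credentials are placeholders):

[pypi]
username = someuser
password = s3cret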


@@ -1,246 +0,0 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import os
import tempfile
import requests
from git import GitCommandError
from git import Repo
from github import Github
from .const import NAME
from .const import REPO_ROOT
from .utils import branch_name
from .utils import read_release_notes_from_changelog
from .utils import ScriptError
class Repository(object):
def __init__(self, root=None, gh_name=None):
if root is None:
root = REPO_ROOT
if gh_name is None:
gh_name = NAME
self.git_repo = Repo(root)
self.gh_client = Github(os.environ['GITHUB_TOKEN'])
self.gh_repo = self.gh_client.get_repo(gh_name)
def create_release_branch(self, version, base=None):
print('Creating release branch {} based on {}...'.format(version, base or 'master'))
remote = self.find_remote(self.gh_repo.full_name)
br_name = branch_name(version)
remote.fetch()
if self.branch_exists(br_name):
raise ScriptError(
"Branch {} already exists locally. Please remove it before "
"running the release script, or use `resume` instead.".format(
br_name
)
)
if base is not None:
base = self.git_repo.tag('refs/tags/{}'.format(base))
else:
base = 'refs/remotes/{}/master'.format(remote.name)
release_branch = self.git_repo.create_head(br_name, commit=base)
release_branch.checkout()
self.git_repo.git.merge('--strategy=ours', '--no-edit', '{}/release'.format(remote.name))
with release_branch.config_writer() as cfg:
cfg.set_value('release', version)
return release_branch
def find_remote(self, remote_name=None):
if not remote_name:
remote_name = self.gh_repo.full_name
for remote in self.git_repo.remotes:
for url in remote.urls:
if remote_name in url:
return remote
return None
def create_bump_commit(self, bump_branch, version):
print('Creating bump commit...')
bump_branch.checkout()
self.git_repo.git.commit('-a', '-s', '-m', 'Bump {}'.format(version), '--no-verify')
def diff(self):
return self.git_repo.git.diff()
def checkout_branch(self, name):
return self.git_repo.branches[name].checkout()
def push_branch_to_remote(self, branch, remote_name=None):
print('Pushing branch {} to remote...'.format(branch.name))
remote = self.find_remote(remote_name)
remote.push(refspec=branch, force=True)
def branch_exists(self, name):
return name in [h.name for h in self.git_repo.heads]
def create_release_pull_request(self, version):
return self.gh_repo.create_pull(
title='Bump {}'.format(version),
body='Automated release for docker-compose {}\n\n{}'.format(
version, read_release_notes_from_changelog()
),
base='release',
head=branch_name(version),
)
def create_release(self, version, release_notes, **kwargs):
return self.gh_repo.create_git_release(
tag=version, name=version, message=release_notes, **kwargs
)
def find_release(self, version):
print('Retrieving release draft for {}'.format(version))
releases = self.gh_repo.get_releases()
for release in releases:
if release.tag_name == version and release.title == version:
return release
return None
def publish_release(self, release):
release.update_release(
name=release.title,
message=release.body,
draft=False,
prerelease=release.prerelease
)
def remove_release(self, version):
print('Removing release draft for {}'.format(version))
releases = self.gh_repo.get_releases()
for release in releases:
if release.tag_name == version and release.title == version:
if not release.draft:
print(
'The release at {} is no longer a draft. If you TRULY intend '
'to remove it, please do so manually.'.format(release.url)
)
continue
release.delete_release()
def remove_bump_branch(self, version, remote_name=None):
name = branch_name(version)
if not self.branch_exists(name):
return False
print('Removing local branch "{}"'.format(name))
if self.git_repo.active_branch.name == name:
print('Active branch is about to be deleted. Checking out to master...')
try:
self.checkout_branch('master')
except GitCommandError:
raise ScriptError(
'Unable to checkout master. Try stashing local changes before proceeding.'
)
self.git_repo.branches[name].delete(self.git_repo, name, force=True)
print('Removing remote branch "{}"'.format(name))
remote = self.find_remote(remote_name)
try:
remote.push(name, delete=True)
except GitCommandError as e:
if 'remote ref does not exist' in str(e):
return False
raise ScriptError(
'Error trying to remove remote branch: {}'.format(e)
)
return True
def find_release_pr(self, version):
print('Retrieving release PR for {}'.format(version))
name = branch_name(version)
open_prs = self.gh_repo.get_pulls(state='open')
for pr in open_prs:
if pr.head.ref == name:
print('Found matching PR #{}'.format(pr.number))
return pr
print('No open PR for this release branch.')
return None
def close_release_pr(self, version):
print('Retrieving and closing release PR for {}'.format(version))
name = branch_name(version)
open_prs = self.gh_repo.get_pulls(state='open')
count = 0
for pr in open_prs:
if pr.head.ref == name:
print('Found matching PR #{}'.format(pr.number))
pr.edit(state='closed')
count += 1
if count == 0:
print('No open PR for this release branch.')
return count
def write_git_sha(self):
with open(os.path.join(REPO_ROOT, 'compose', 'GITSHA'), 'w') as f:
f.write(self.git_repo.head.commit.hexsha[:7])
return self.git_repo.head.commit.hexsha[:7]
def cherry_pick_prs(self, release_branch, ids):
if not ids:
return
release_branch.checkout()
for i in ids:
try:
i = int(i)
except ValueError as e:
raise ScriptError('Invalid PR id: {}'.format(e))
print('Retrieving PR#{}'.format(i))
pr = self.gh_repo.get_pull(i)
patch_data = requests.get(pr.patch_url).text
self.apply_patch(patch_data)
def apply_patch(self, patch_data):
with tempfile.NamedTemporaryFile(mode='w', prefix='_compose_cherry', encoding='utf-8') as f:
f.write(patch_data)
f.flush()
self.git_repo.git.am('--3way', f.name)
def get_prs_in_milestone(self, version):
milestones = self.gh_repo.get_milestones(state='open')
milestone = None
for ms in milestones:
if ms.title == version:
milestone = ms
break
if not milestone:
print('Didn\'t find a milestone matching "{}"'.format(version))
return None
issues = self.gh_repo.get_issues(milestone=milestone, state='all')
prs = []
for issue in issues:
if issue.pull_request is not None:
prs.append(issue.number)
return sorted(prs)
def get_contributors(pr_data):
commits = pr_data.get_commits()
authors = {}
for commit in commits:
if not commit or not commit.author or not commit.author.login:
continue
author = commit.author.login
authors[author] = authors.get(author, 0) + 1
return [x[0] for x in sorted(list(authors.items()), key=lambda x: x[1])]
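# NOTE (editorial): authors are sorted by ascending commit count, so the most
# prolific contributor ends up last in the returned list.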
def upload_assets(gh_release, files):
print('Uploading binaries and hash sums')
for filename, filedata in files.items():
print('Uploading {}...'.format(filename))
gh_release.upload_asset(filedata[0], content_type='application/octet-stream')
gh_release.upload_asset('{}.sha256'.format(filedata[0]), content_type='text/plain')
print('Uploading run.sh...')
gh_release.upload_asset(
os.path.join(REPO_ROOT, 'script', 'run', 'run.sh'), content_type='text/plain'
)
def delete_assets(gh_release):
print('Removing previously uploaded assets')
for asset in gh_release.get_assets():
print('Deleting asset {}'.format(asset.name))
asset.delete_asset()
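Taken together, the finalize path exercises these helpers roughly like this (sketch; assumes GITHUB_TOKEN is set, a release draft exists, and `files` is the mapping returned by BinaryDownloader.download_all()):

repository = Repository()
gh_release = repository.find_release('1.26.0-rc3')
if gh_release is None:
    raise ScriptError('No release draft found for this version')
delete_assets(gh_release)
upload_assets(gh_release, files)
repository.publish_release(gh_release)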


@@ -1,47 +0,0 @@
#!/bin/bash
debian_based() { test -f /etc/debian_version; }
if test -z "$VENV_DIR"; then
VENV_DIR=./.release-venv
fi
if test -z "$PYTHONBIN"; then
PYTHONBIN=$(which python3)
if test -z "$PYTHONBIN"; then
PYTHONBIN=$(which python)
fi
fi
VERSION=$($PYTHONBIN -c "import sys; print('{}.{}'.format(*sys.version_info[0:2]))")
if test "$(echo "$VERSION" | cut -d. -f1)" -lt 3; then
echo "Python 3.3 or above is required"
exit 1
fi
if test "$(echo "$VERSION" | cut -d. -f2)" -lt 3; then
echo "Python 3.3 or above is required"
exit 1
fi
# Debian / Ubuntu workaround:
# https://askubuntu.com/questions/879437/ensurepip-is-disabled-in-debian-ubuntu-for-the-system-python
if debian_based; then
VENV_FLAGS="$VENV_FLAGS --without-pip"
fi
$PYTHONBIN -m venv $VENV_DIR $VENV_FLAGS
VENV_PYTHONBIN=$VENV_DIR/bin/python
if debian_based; then
curl https://bootstrap.pypa.io/get-pip.py -o $VENV_DIR/get-pip.py
$VENV_PYTHONBIN $VENV_DIR/get-pip.py
fi
$VENV_PYTHONBIN -m pip install -U Jinja2==2.10 \
PyGithub==1.39 \
GitPython==2.1.9 \
requests==2.18.4 \
setuptools==40.6.2 \
twine==1.11.0
$VENV_PYTHONBIN setup.py develop
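The interpreter can be overridden the same way when several Pythons are installed, e.g. `PYTHONBIN=/usr/bin/python3.7 ./script/release/setup-venv.sh` (the path is illustrative).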


@@ -4,36 +4,7 @@ from __future__ import unicode_literals
import os
import re
from .const import REPO_ROOT
from compose import const as compose_const
section_header_re = re.compile(r'^[0-9]+\.[0-9]+\.[0-9]+ \([0-9]{4}-[01][0-9]-[0-3][0-9]\)$')
class ScriptError(Exception):
pass
def branch_name(version):
return 'bump-{}'.format(version)
def read_release_notes_from_changelog():
with open(os.path.join(REPO_ROOT, 'CHANGELOG.md'), 'r') as f:
lines = f.readlines()
i = 0
while i < len(lines):
if section_header_re.match(lines[i]):
break
i += 1
j = i + 1
while j < len(lines):
if section_header_re.match(lines[j]):
break
j += 1
return ''.join(lines[i + 2:j - 1])
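# Illustrative check (editorial addition, not part of the original file): the
# header pattern above matches GA-style section headers only, so RC entries
# never delimit the extracted release notes.
assert section_header_re.match('1.26.0 (2020-03-10)')
assert not section_header_re.match('1.26.0-rc3 (2020-03-10)')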
def update_init_py_version(version):
@@ -54,15 +25,6 @@ def update_run_sh_version(version):
f.write(contents)
def compatibility_matrix():
result = {}
for engine_version in compose_const.API_VERSION_TO_ENGINE_VERSION.values():
result[engine_version] = []
for fmt, api_version in compose_const.API_VERSIONS.items():
result[compose_const.API_VERSION_TO_ENGINE_VERSION[api_version]].append(fmt.vstring)
return result
def yesno(prompt, default=None):
"""
Prompt the user for a yes or no.


@@ -15,7 +15,7 @@
set -e
VERSION="1.25.1"
VERSION="1.26.0-rc3"
IMAGE="docker/compose:$VERSION"


@@ -17,9 +17,9 @@ OPENSSL_VERSION=1.1.1d
OPENSSL_URL=https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz
OPENSSL_SHA1=056057782325134b76d1931c48f2c7e6595d7ef4
PYTHON_VERSION=3.7.5
PYTHON_VERSION=3.7.6
PYTHON_URL=https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz
PYTHON_SHA1=8b0311d4cca19f0ea9181731189fa33c9f5aedf9
PYTHON_SHA1=4642680fbf9a9a5382597dc0e9faa058fdfd94e2
#
# Install prerequisites.

script/test/acceptance Executable file

@@ -0,0 +1,3 @@
#!/usr/bin/env bash
pytest --conformity --binary "${1:-docker-compose}" tests/acceptance/


@@ -44,6 +44,7 @@ install_requires = [
tests_require = [
'ddt >= 1.2.2, < 2',
'pytest < 6',
]
@@ -59,6 +60,7 @@ extras_require = {
'ipaddress >= 1.0.16, < 2'],
':sys_platform == "win32"': ['colorama >= 0.4, < 1'],
'socks': ['PySocks >= 1.5.6, != 1.5.7, < 2'],
'tests': tests_require,
}


@@ -38,6 +38,8 @@ from tests.integration.testcases import v2_2_only
from tests.integration.testcases import v2_only
from tests.integration.testcases import v3_only
DOCKER_COMPOSE_EXECUTABLE = 'docker-compose'
ProcessResult = namedtuple('ProcessResult', 'stdout stderr')
@@ -65,7 +67,7 @@ COMPOSE_COMPATIBILITY_DICT = {
def start_process(base_dir, options):
proc = subprocess.Popen(
['docker-compose'] + options,
[DOCKER_COMPOSE_EXECUTABLE] + options,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
@@ -1711,6 +1713,17 @@ services:
assert stderr == ""
assert stdout == "/\n"
@mock.patch.dict(os.environ)
def test_exec_novalue_var_dotenv_file(self):
os.environ['MYVAR'] = 'SUCCESS'
self.base_dir = 'tests/fixtures/exec-novalue-var'
self.dispatch(['up', '-d'])
assert len(self.project.containers()) == 1
stdout, stderr = self.dispatch(['exec', '-T', 'nginx', 'env'])
assert 'CHECK_VAR=SUCCESS' in stdout
assert not stderr
def test_exec_detach_long_form(self):
self.base_dir = 'tests/fixtures/links-composefile'
self.dispatch(['up', '--detach', 'console'])


@@ -0,0 +1,48 @@
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import unicode_literals
import os
import shutil
import unittest
from docker import ContextAPI
from tests.acceptance.cli_test import dispatch
class ContextTestCase(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.docker_dir = os.path.join(os.environ.get("HOME", "/tmp"), '.docker')
if not os.path.exists(cls.docker_dir):
os.makedirs(cls.docker_dir)
with open(os.path.join(cls.docker_dir, "config.json"), "w") as f:
f.write("{}")
cls.docker_config = os.path.join(cls.docker_dir, "config.json")
os.environ['DOCKER_CONFIG'] = cls.docker_config
ContextAPI.create_context("testcontext", host="tcp://doesnotexist:8000")
@classmethod
def tearDownClass(cls):
shutil.rmtree(cls.docker_dir, ignore_errors=True)
def setUp(self):
self.base_dir = 'tests/fixtures/simple-composefile'
self.override_dir = None
def dispatch(self, options, project_options=None, returncode=0, stdin=None):
return dispatch(self.base_dir, options, project_options, returncode, stdin)
def test_help(self):
result = self.dispatch(['help'], returncode=0)
assert '-c, --context NAME' in result.stdout
def test_fail_on_both_host_and_context_opt(self):
result = self.dispatch(['-H', 'unix://', '-c', 'default', 'up'], returncode=1)
assert '-H, --host and -c, --context are mutually exclusive' in result.stderr
def test_fail_run_on_inexistent_context(self):
result = self.dispatch(['-c', 'testcontext', 'up', '-d'], returncode=1)
assert "Couldn't connect to Docker daemon" in result.stderr

tests/conftest.py Normal file

@@ -0,0 +1,243 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import pytest
import tests.acceptance.cli_test
# FIXME: acceptance tests listed below are skipped when running with `--conformity`
non_conformity_tests = [
"test_build_failed",
"test_build_failed_forcerm",
"test_build_log_level",
"test_build_memory_build_option",
"test_build_no_cache",
"test_build_no_cache_pull",
"test_build_override_dir",
"test_build_override_dir_invalid_path",
"test_build_parallel",
"test_build_plain",
"test_build_pull",
"test_build_rm",
"test_build_shm_size_build_option",
"test_build_with_buildarg_cli_override",
"test_build_with_buildarg_from_compose_file",
"test_build_with_buildarg_old_api_version",
"test_config_compatibility_mode",
"test_config_compatibility_mode_from_env",
"test_config_compatibility_mode_from_env_and_option_precedence",
"test_config_default",
"test_config_external_network",
"test_config_external_network_v3_5",
"test_config_external_volume_v2",
"test_config_external_volume_v2_x",
"test_config_external_volume_v3_4",
"test_config_external_volume_v3_x",
"test_config_list_services",
"test_config_list_volumes",
"test_config_quiet",
"test_config_quiet_with_error",
"test_config_restart",
"test_config_stdin",
"test_config_v1",
"test_config_v3",
"test_config_with_dot_env",
"test_config_with_dot_env_and_override_dir",
"test_config_with_env_file",
"test_config_with_hash_option",
"test_create",
"test_create_with_force_recreate",
"test_create_with_force_recreate_and_no_recreate",
"test_create_with_no_recreate",
"test_down",
"test_down_invalid_rmi_flag",
"test_down_signal",
"test_down_timeout",
"test_env_file_relative_to_compose_file",
"test_events_human_readable",
"test_events_json",
"test_exec_custom_user",
"test_exec_detach_long_form",
"test_exec_novalue_var_dotenv_file",
"test_exec_service_with_environment_overridden",
"test_exec_without_tty",
"test_exec_workdir",
"test_exit_code_from_signal_stop",
"test_expanded_port",
"test_forward_exitval",
"test_help",
"test_help_nonexistent",
"test_home_and_env_var_in_volume_path",
"test_host_not_reachable",
"test_host_not_reachable_volumes_from_container",
"test_host_not_reachable_volumes_from_container",
"test_images",
"test_images_default_composefile",
"test_images_tagless_image",
"test_images_use_service_tag",
"test_kill",
"test_kill_signal_sigstop",
"test_kill_stopped_service",
"test_logs_default",
"test_logs_follow",
"test_logs_follow_logs_from_new_containers",
"test_logs_follow_logs_from_restarted_containers",
"test_logs_invalid_service_name",
"test_logs_on_stopped_containers_exits",
"test_logs_tail",
"test_logs_timestamps",
"test_pause_no_containers",
"test_pause_unpause",
"test_port",
"test_port_with_scale",
"test_ps",
"test_ps_all",
"test_ps_alternate_composefile",
"test_ps_default_composefile",
"test_ps_services_filter_option",
"test_ps_services_filter_status",
"test_pull",
"test_pull_can_build",
"test_pull_with_digest",
"test_pull_with_ignore_pull_failures",
"test_pull_with_include_deps",
"test_pull_with_no_deps",
"test_pull_with_parallel_failure",
"test_pull_with_quiet",
"test_quiet_build",
"test_restart",
"test_restart_no_containers",
"test_restart_stopped_container",
"test_rm",
"test_rm_all",
"test_rm_stop",
"test_run_detached_connects_to_network",
"test_run_does_not_recreate_linked_containers",
"test_run_env_values_from_system",
"test_run_handles_sighup",
"test_run_handles_sigint",
"test_run_handles_sigterm",
"test_run_interactive_connects_to_network",
"test_run_label_flag",
"test_run_one_off_with_multiple_volumes",
"test_run_one_off_with_volume",
"test_run_one_off_with_volume_merge",
"test_run_rm",
"test_run_service_with_compose_file_entrypoint",
"test_run_service_with_compose_file_entrypoint_and_command_overridden",
"test_run_service_with_compose_file_entrypoint_and_empty_string_command",
"test_run_service_with_compose_file_entrypoint_overridden",
"test_run_service_with_dependencies",
"test_run_service_with_dockerfile_entrypoint",
"test_run_service_with_dockerfile_entrypoint_and_command_overridden",
"test_run_service_with_dockerfile_entrypoint_overridden",
"test_run_service_with_environment_overridden",
"test_run_service_with_explicitly_mapped_ip_ports",
"test_run_service_with_explicitly_mapped_ports",
"test_run_service_with_links",
"test_run_service_with_map_ports",
"test_run_service_with_scaled_dependencies",
"test_run_service_with_unset_entrypoint",
"test_run_service_with_use_aliases",
"test_run_service_with_user_overridden",
"test_run_service_with_user_overridden_short_form",
"test_run_service_with_workdir_overridden",
"test_run_service_with_workdir_overridden_short_form",
"test_run_service_without_links",
"test_run_service_without_map_ports",
"test_run_unicode_env_values_from_system",
"test_run_with_custom_name",
"test_run_with_expose_ports",
"test_run_with_no_deps",
"test_run_without_command",
"test_scale",
"test_scale_v2_2",
"test_shorthand_host_opt",
"test_shorthand_host_opt_interactive",
"test_start_no_containers",
"test_stop",
"test_stop_signal",
"test_top_processes_running",
"test_top_services_not_running",
"test_top_services_running",
"test_unpause_no_containers",
"test_up",
"test_up_attached",
"test_up_detached",
"test_up_detached_long_form",
"test_up_external_networks",
"test_up_handles_abort_on_container_exit",
"test_up_handles_abort_on_container_exit_code",
"test_up_handles_aborted_dependencies",
"test_up_handles_force_shutdown",
"test_up_handles_sigint",
"test_up_handles_sigterm",
"test_up_logging",
"test_up_logging_legacy",
"test_up_missing_network",
"test_up_no_ansi",
"test_up_no_services",
"test_up_no_start",
"test_up_no_start_remove_orphans",
"test_up_scale_reset",
"test_up_scale_scale_down",
"test_up_scale_scale_up",
"test_up_scale_to_zero",
"test_up_with_attach_dependencies",
"test_up_with_default_network_config",
"test_up_with_default_override_file",
"test_up_with_duplicate_override_yaml_files",
"test_up_with_extends",
"test_up_with_external_default_network",
"test_up_with_force_recreate",
"test_up_with_force_recreate_and_no_recreate",
"test_up_with_healthcheck",
"test_up_with_ignore_remove_orphans",
"test_up_with_links_v1",
"test_up_with_multiple_files",
"test_up_with_net_is_invalid",
"test_up_with_net_v1",
"test_up_with_network_aliases",
"test_up_with_network_internal",
"test_up_with_network_labels",
"test_up_with_network_mode",
"test_up_with_network_static_addresses",
"test_up_with_networks",
"test_up_with_no_deps",
"test_up_with_no_recreate",
"test_up_with_override_yaml",
"test_up_with_pid_mode",
"test_up_with_timeout",
"test_up_with_volume_labels",
"test_fail_on_both_host_and_context_opt",
"test_fail_run_on_inexistent_context",
]
def pytest_addoption(parser):
parser.addoption(
"--conformity",
action="store_true",
default=False,
help="Only runs tests that are not black listed as non conformity test. "
"The conformity tests check for compatibility with the Compose spec."
)
parser.addoption(
"--binary",
default=tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE,
help="Forces the execution of a binary in the PATH. Default is `docker-compose`."
)
def pytest_collection_modifyitems(config, items):
if not config.getoption("--conformity"):
return
if config.getoption("--binary"):
tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE = config.getoption("--binary")
print("Binary -> {}".format(tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE))
skip_non_conformity = pytest.mark.skip(reason="skipping because that's not a conformity test")
for item in items:
if item.name in non_conformity_tests:
print("Skipping '{}' when running in compatibility mode".format(item.name))
item.add_marker(skip_non_conformity)
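With these hooks in place the acceptance suite can be pointed at any binary on the machine, e.g. `pytest --conformity --binary /usr/local/bin/docker-compose tests/acceptance/` (the path is illustrative); without `--conformity`, collection is left untouched.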

tests/fixtures/env/three.env vendored Normal file

@@ -0,0 +1,2 @@
FOO=NO $ENV VAR
DOO=NO ${ENV} VAR


@@ -0,0 +1,6 @@
version: '3'
services:
nginx:
image: nginx
environment:
- CHECK_VAR=${MYVAR}


@@ -3637,7 +3637,6 @@ class InterpolationTest(unittest.TestCase):
assert 'labels' in warn_message
assert 'endpoint_mode' in warn_message
assert 'update_config' in warn_message
assert 'placement' in warn_message
assert 'resources.reservations.cpus' in warn_message
assert 'restart_policy.delay' in warn_message
assert 'restart_policy.window' in warn_message
@@ -5420,15 +5419,19 @@ class SerializeTest(unittest.TestCase):
'environment': {
'CURRENCY': '$'
},
'env_file': ['tests/fixtures/env/three.env'],
'entrypoint': ['$SHELL', '-c'],
}
}
}
config_dict = config.load(build_config_details(cfg), interpolate=False)
config_dict = config.load(build_config_details(cfg, working_dir='.'), interpolate=False)
serialized_config = yaml.safe_load(serialize_config(config_dict, escape_dollar=False))
serialized_service = serialized_config['services']['web']
assert serialized_service['environment']['CURRENCY'] == '$'
# Values coming from env_files are not allowed to have variables
assert serialized_service['environment']['FOO'] == 'NO $$ENV VAR'
assert serialized_service['environment']['DOO'] == 'NO $${ENV} VAR'
assert serialized_service['command'] == 'echo $FOO'
assert serialized_service['entrypoint'][0] == '$SHELL'


@@ -8,15 +8,18 @@ import os
import shutil
import tempfile
import pytest
from ddt import data
from ddt import ddt
from ddt import unpack
from compose.config.environment import env_vars_from_file
from compose.config.environment import Environment
from compose.config.errors import ConfigurationError
from tests import unittest
@ddt
class EnvironmentTest(unittest.TestCase):
def test_get_simple(self):
env = Environment({
'FOO': 'bar',
@@ -28,12 +31,14 @@ class EnvironmentTest(unittest.TestCase):
assert env.get('BAR') == '1'
assert env.get('BAZ') == ''
def test_get_undefined(self):
env = Environment({
'FOO': 'bar'
})
assert env.get('FOOBAR') is None
def test_get_boolean(self):
env = Environment({
'FOO': '',
@@ -48,20 +53,18 @@ class EnvironmentTest(unittest.TestCase):
assert env.get_boolean('FOOBAR') is True
assert env.get_boolean('UNDEFINED') is False
def test_env_vars_from_file_bom(self):
@data(
('unicode exclude test', '\ufeffPARK_BOM=박봄\n', {'PARK_BOM': '박봄'}),
('export prefixed test', 'export PREFIXED_VARS=yes\n', {"PREFIXED_VARS": "yes"}),
('quoted vars test', "QUOTED_VARS='yes'\n", {"QUOTED_VARS": "yes"}),
('double quoted vars test', 'DOUBLE_QUOTED_VARS="yes"\n', {"DOUBLE_QUOTED_VARS": "yes"}),
('extra spaces test', 'SPACES_VARS = "yes"\n', {"SPACES_VARS": "yes"}),
)
@unpack
def test_env_vars(self, test_name, content, expected):
tmpdir = tempfile.mkdtemp('env_file')
self.addCleanup(shutil.rmtree, tmpdir)
with codecs.open('{}/bom.env'.format(str(tmpdir)), 'w', encoding='utf-8') as f:
f.write('\ufeffPARK_BOM=박봄\n')
assert env_vars_from_file(str(os.path.join(tmpdir, 'bom.env'))) == {
'PARK_BOM': '박봄'
}
def test_env_vars_from_file_whitespace(self):
tmpdir = tempfile.mkdtemp('env_file')
self.addCleanup(shutil.rmtree, tmpdir)
with codecs.open('{}/whitespace.env'.format(str(tmpdir)), 'w', encoding='utf-8') as f:
f.write('WHITESPACE =yes\n')
with pytest.raises(ConfigurationError) as exc:
env_vars_from_file(str(os.path.join(tmpdir, 'whitespace.env')))
assert 'environment variable' in exc.exconly()
file_abs_path = str(os.path.join(tmpdir, ".env"))
with codecs.open(file_abs_path, 'w', encoding='utf-8') as f:
f.write(content)
assert env_vars_from_file(file_abs_path) == expected, '"{}" Failed'.format(test_name)