Compare commits

...

88 Commits

Author SHA1 Message Date
Ulysses Souza
eefe0d319f "Bump 1.26.2"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-07-02 18:09:01 +02:00
Ulysses Souza
76e20a8ff2 Enforce pip3 and python3 on Release Jenkins
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-07-02 18:09:01 +02:00
Ulysses Souza
bfa7406872 Apply more specific filtering for CI machines
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-07-02 15:33:27 +02:00
Sebastiaan van Stijn
757f7b784d setup.py: fix minimum docker-py requirement to 4.2.2
Signed-off-by: Sebastiaan van Stijn <github@gone.nl>
2020-07-02 13:41:23 +02:00
Ulysses Souza
f216ddbf05 "Bump 1.26.1"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-30 20:13:01 +02:00
Ulysses Souza
afabfbdf8d Expect failure of test_create_container_with_blkio_config
On Linux kernels >= 5.3.x, at least, the daemon prints two warnings:
"Your kernel does not support cgroup blkio weight"
"Your kernel does not support cgroup blkio weight_device"

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-30 20:13:01 +02:00
aiordache
bff44e6943 Check that the context in use targets a Docker engine
Signed-off-by: aiordache <anca.iordache@docker.com>
2020-06-30 19:01:56 +02:00
Ulysses Souza
b61fb7ef90 Enforce docker>=4.2.1 on pip install
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-04 17:00:15 +02:00
Ulysses Souza
d445165955 "Bump 1.26.0"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-03 16:10:04 +02:00
Ulysses Souza
3d94f44217 "Bump 1.26.0-rc5"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-03 10:32:47 +02:00
Ulysses Souza
1386a85571 Merge pull request #7485 from ulyssessouza/fix-https-daemon
Bump docker-py
2020-06-03 10:08:36 +02:00
Ulysses Souza
8034c96d66 Bump docker-py
This should fix HTTPS problems when running against a remote daemon

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-03 01:00:10 +02:00
Ulysses Souza
48d093697d Fix flake8 errors
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-03 01:00:10 +02:00
Ulysses Souza
28bf47ce45 Pin wcwidth==0.1.9
wcwidth versions 0.2.0 through at least 0.2.3 result in:
[745] Failed to execute script pyi_rth_pkgres
Traceback (most recent call last):
	File "site-packages/PyInstaller/loader/rthooks/pyi_rth_pkgres.py", line 13, in <module>
	File "/code/.tox/py37/lib/python3.7/site-packages/PyInstaller/loader/pyimod03_importers.py", line 623, in exec_module
		exec(bytecode, module.__dict__)
	File "site-packages/pkg_resources/__init__.py", line 86, in <module>
ModuleNotFoundError: No module named 'pkg_resources.py2_warn'

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-06-03 00:32:34 +02:00
Ulysses Souza
d279b7a891 "Bump 1.26.0-rc4"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-29 12:27:55 +02:00
Ulysses Souza
83371df2d1 Merge branch 'master' into 1.26.x 2020-04-27 19:56:02 +02:00
Ulysses Souza
9c5351cf27 Merge pull request #7389 from ulyssessouza/general-bumps-for-1_26
General bumps to prepare 1.26.0
2020-04-27 19:46:56 +02:00
Ulysses Souza
836e2b7c4d General bumps
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-27 17:34:36 +02:00
Ulysses Souza
266d287eca Merge pull request #7390 from ulyssessouza/remove-unused-resources
Remove unused files
2020-04-27 17:29:28 +02:00
Ulysses Souza
d64f3f3912 Remove unused files
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-21 11:04:10 +02:00
Ulysses Souza
4e310a945e Merge pull request #7386 from ulyssessouza/bump-python-dotenv-1_13_0
Bump python-dotenv from 0.11.0 to 0.13.0
2020-04-21 09:55:06 +02:00
Ulysses Souza
d52b51e8ea Bump python-dotenv from 0.11.0 to 0.13.0
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-20 18:31:52 +02:00
Ulysses Souza
64a6a48bd9 Merge pull request #7345 from docker/dependabot/pip/certifi-2020.4.5.1
Bump certifi from 2019.11.28 to 2020.4.5.1
2020-04-19 21:19:32 +02:00
Ulysses Souza
0979c7a1fe Merge pull request #7380 from joehattori/simplify-code
Simplify code in compose/config/config.py
2020-04-19 16:57:36 +02:00
Joe Hattori
ce782b592f Simplify code in compose/config/config.py
Signed-off-by: Joe Hattori <joe2ninja21@gmail.com>
2020-04-19 01:52:19 +09:00
dependabot-preview[bot]
b7d6dc7941 Bump certifi from 2019.11.28 to 2020.4.5.1
Bumps [certifi](https://github.com/certifi/python-certifi) from 2019.11.28 to 2020.4.5.1.
- [Release notes](https://github.com/certifi/python-certifi/releases)
- [Commits](https://github.com/certifi/python-certifi/commits)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-04-17 15:48:12 +00:00
Ulysses Souza
c22a25105e Merge pull request #7374 from ulyssessouza/fix-distro-guess
Add "distro" package
2020-04-17 17:46:48 +02:00
Ulysses Souza
a62a1e1d62 Add "distro" package
This package implements the 'platform.linux_distribution' method, which was removed in Python 3.8

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-16 20:04:05 +02:00
Ulysses Souza
afc5d20510 Merge pull request #7371 from ulyssessouza/bump-openssl-111f
Bump OPENSSL to 1.1.1f
2020-04-16 15:30:12 +02:00
Ulysses Souza
4d2afc07da Merge pull request #7372 from ulyssessouza/update-changelog
Update changelog to 1.25.5
2020-04-16 15:29:58 +02:00
Ulysses Souza
27d43ed041 Update changelog to 1.25.5
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-16 10:09:21 +02:00
Ulysses Souza
23889a8f88 Bump OPENSSL to 1.1.1f
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-04-16 10:00:40 +02:00
Ulysses Souza
440c94ea7a Merge pull request #7328 from luHub/7242-fix-cli-no-labels
Bug fix #7242: docker-compose with BuildKit does not insert container labels
2020-03-31 03:01:27 +02:00
luHub
1a688289b4 add labels to CLIbuilder
Signed-off-by: luHub <lucioguerchi16f@gmail.com>
2020-03-28 18:00:22 +01:00
Ulysses Souza
aaef2d5aa7 Merge pull request #7287 from ulyssessouza/add-python-dotenv-setup
Fix pip install by adding python-dotenv to setup.py
2020-03-13 15:13:05 +01:00
Ulysses Souza
1b9855c1c2 Fix pip install by adding python-dotenv to setup.py
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-03-13 14:44:45 +01:00
Ulysses Souza
46118bc5e0 "Bump 1.26.0-rc3"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-03-10 16:25:59 +01:00
Ulysses Souza
3288404b24 Merge branch 'master' into 1.26.x 2020-03-10 16:18:46 +01:00
Ulysses Souza
e9dc97fcf3 Merge pull request #7276 from ulyssessouza/conformity-tests
Add conformity tests
2020-03-10 16:17:16 +01:00
Ulysses Souza
0ace76114b Add conformity tests hack
- That can be used with:
./script/test/acceptance

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-03-10 15:41:28 +01:00
Anca Iordache
3c89ff843a Merge pull request #7268 from aiordache/fix_v3.8_schema_support
Fix v3.8 schema support for binaries
2020-03-05 17:03:46 +01:00
Anca Iordache
98abe07646 Fix v3.8 schema support for binaries
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-03-05 15:46:45 +01:00
Ulysses Souza
2882cf56ed Merge pull request #7254 from StefanScherer/update-jenkins-badge
Update Jenkins build status badge
2020-02-27 18:15:43 +01:00
Stefan Scherer
2769c33a7e Update Jenkins build status badge
Signed-off-by: Stefan Scherer <stefan.scherer@docker.com>
2020-02-27 17:23:50 +01:00
Ulysses Souza
6b988aa1f1 Merge pull request #7251 from ulyssessouza/downgrade-idna
Resolve a compatibility issue
2020-02-27 10:58:35 +01:00
Ulysses Souza
cfefeaa6f7 Resolve a compatibility issue
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-27 10:48:37 +01:00
Ulysses Souza
07cab51378 Downgrade gitpython to 2.1.15 and idna to 2.8
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 13:40:16 +01:00
Ulysses Souza
12637904f0 Bump gitpython -> 3.1.0
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 13:10:14 +01:00
Ulysses Souza
d412a1e47f Merge pull request #6937 from apollo13/issue6871
Properly escape values coming from env_files, fixes #6871
2020-02-24 11:52:14 +01:00
Ulysses Souza
160034f678 "Bump 1.26.0-rc1"
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-24 11:42:32 +01:00
Ulysses Souza
61b749cecd Merge pull request #7238 from docker/dependabot/pip/idna-2.9
Bump idna from 2.8 to 2.9
2020-02-21 11:54:50 +01:00
Ulysses Souza
80ba1f07f3 Merge pull request #7224 from docker/dependabot/pip/python-dotenv-0.11.0
Bump python-dotenv from 0.10.5 to 0.11.0
2020-02-18 14:18:52 +01:00
dependabot-preview[bot]
0f8e7a8874 Bump idna from 2.8 to 2.9
Bumps [idna](https://github.com/kjd/idna) from 2.8 to 2.9.
- [Release notes](https://github.com/kjd/idna/releases)
- [Changelog](https://github.com/kjd/idna/blob/master/HISTORY.rst)
- [Commits](https://github.com/kjd/idna/compare/v2.8...v2.9)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-17 22:16:41 +00:00
dependabot-preview[bot]
01df88e1eb Bump python-dotenv from 0.10.5 to 0.11.0
Bumps [python-dotenv](https://github.com/theskumar/python-dotenv) from 0.10.5 to 0.11.0.
- [Release notes](https://github.com/theskumar/python-dotenv/releases)
- [Changelog](https://github.com/theskumar/python-dotenv/blob/master/CHANGELOG.md)
- [Commits](https://github.com/theskumar/python-dotenv/compare/v0.10.5...v0.11.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-02-10 22:15:09 +00:00
Ulysses Souza
bb93a18500 Merge pull request #7217 from aiordache/compose_v3.8_schema_support
Add v3.8 schema support
2020-02-10 16:25:45 +01:00
Anca Iordache
79fe7ca997 add warning when max_replicas_per_node limits scale
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-10 11:52:16 +01:00
Florian Apolloner
5fe0858450 Updated requirements.txt back to the released python-dotenv 0.11.0.
Signed-off-by: Florian Apolloner <florian@apolloner.eu>
2020-02-10 10:48:21 +01:00
Anca Iordache
02d8e9ee14 set min engine version needed for v38 schema support
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
d9b0fabd9b update api version for 3.8
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
09c80ce49b test update - remove 'placement' from unsupported fields
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Anca Iordache
391e5a6bc2 Add v3.8 schema support
- service scale bounded by 'max_replicas_per_node' field

Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-07 13:50:06 +01:00
Ulysses Souza
13bacba2b9 Merge pull request #7202 from aiordache/devtool28_compose_docker_contexts
Implement docker contexts to target different docker engines
2020-02-07 13:35:23 +01:00
Anca Iordache
2dfd85e30e Use docker context interface from docker-py
Signed-off-by: Anca Iordache <anca.iordache@docker.com>
2020-02-06 12:01:37 +01:00
Ulysses Souza
eb20915c6e Merge pull request #7045 from ndeloof/command_suggest
Suggested command is invalid
2020-02-03 18:01:27 +01:00
Ulysses Souza
4d5a3fdf8f Merge pull request #7211 from ulyssessouza/add-release-script
Add release validation and tagging script release.py
2020-02-03 17:55:50 +01:00
Ulysses Souza
b5c4f4fc0f Add release validation and tagging script release.py
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-02-03 16:52:35 +01:00
Ulysses Souza
d9af02eddf Merge pull request #7199 from ulyssessouza/fix-exec-dotenv
Remove `None` entries on exec command
2020-01-31 10:47:53 +01:00
Ulysses Souza
9f5f8b4757 Remove None entries on execute command
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-30 19:26:33 +01:00
Ulysses Souza
160b5c1755 Merge pull request #7197 from ulyssessouza/fix-mac-deployment-target
Force MacOS SDK version to "10.11"
2020-01-30 15:54:32 +01:00
Ulysses Souza
b62722a3c5 Force MacOS SDK version to "10.11"
This is needed because the new CI machines are on macOS 10.14

Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-29 17:11:00 +01:00
Ulysses Souza
edf27e486a Pass the interpolation value to python-dotenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-29 14:29:32 +01:00
Florian Apolloner
f17e7268b0 Properly escape values coming from env_files, fixes #6871
Signed-off-by: Florian Apolloner <florian@apolloner.eu>
2020-01-28 16:33:28 +01:00
Ulysses Souza
73551d5a92 Merge pull request #7165 from hartwork/setup-py-add-missing-test-dependency-ddt
Add missing test dependency ddt to setup.py
2020-01-27 15:06:33 +01:00
Ulysses Souza
4c9d19cf5c Merge pull request #7190 from docker/dependabot/pip/pytest-5.3.4
Bump pytest from 5.3.2 to 5.3.4
2020-01-27 14:58:56 +01:00
Ulysses Souza
409a9d8207 Merge pull request #7150 from ulyssessouza/add-python-dotenv
Add python-dotenv to delegate `.env` file processing
2020-01-27 14:48:41 +01:00
dependabot-preview[bot]
97b8bff14e Bump pytest from 5.3.2 to 5.3.4
Bumps [pytest](https://github.com/pytest-dev/pytest) from 5.3.2 to 5.3.4.
- [Release notes](https://github.com/pytest-dev/pytest/releases)
- [Changelog](https://github.com/pytest-dev/pytest/blob/master/CHANGELOG.rst)
- [Commits](https://github.com/pytest-dev/pytest/compare/5.3.2...5.3.4)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2020-01-27 13:30:57 +00:00
Ulysses Souza
6d2658ea65 Add python-dotenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-27 13:50:34 +01:00
Ulysses Souza
4bf623d53d Merge pull request #7163 from ndeloof/changelog
Compute changelog by searching previous tag .. even from a tag
2020-01-23 19:01:19 +01:00
Ulysses Souza
ec0f8a8f7c Merge pull request #7177 from docker/ndeloof-patch-1
Tag as `x.y.z` without `v` prefix
2020-01-23 18:59:52 +01:00
Ulysses Souza
8d5023e1ca Enforce Python37 in the creation of virtualenv
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2020-01-23 18:43:36 +01:00
Ulysses Souza
be807a0be8 Merge pull request #7181 from chris-crone/bump-linux-python
Bump Linux and Python
2020-01-23 12:20:41 +01:00
Christopher Crone
a259c48ae9 Bump Linux and Python
* Alpine 3.10 to 3.11
* Debian Stretch 20191118 to 20191224
* Python 3.7.5 to 3.7.6

Signed-off-by: Christopher Crone <christopher.crone@docker.com>
2020-01-23 11:30:26 +01:00
Nicolas De Loof
a9c79bd5b1 Force sha256 file to be ASCII encoded
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-22 18:58:27 +01:00
Nicolas De Loof
8ad480546f Tag as x.y.z without v prefix
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>

Close https://github.com/docker/compose/issues/7168
2020-01-22 13:55:24 +01:00
Sebastian Pipping
9887121c2c setup.py: Expose test dependencies as extra "tests"
Signed-off-by: Sebastian Pipping <sebastian@pipping.org>
2020-01-20 19:54:34 +01:00
Sebastian Pipping
1c28b5a5d0 setup.py: Add missing test dependency ddt
Signed-off-by: Sebastian Pipping <sebastian@pipping.org>
2020-01-20 19:54:30 +01:00
Nicolas De Loof
84dad1a0e6 Compute changelog by searching previous tag .. even from a tag
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2020-01-20 16:18:02 +01:00
Nicolas De Loof
657bdef6ff Suggested command is invalid
Signed-off-by: Nicolas De Loof <nicolas.deloof@gmail.com>
2019-11-22 09:25:50 +01:00
40 changed files with 2513 additions and 371 deletions


@@ -1,14 +0,0 @@
# Generated by FOSSA CLI (https://github.com/fossas/fossa-cli)
# Visit https://fossa.io to learn more
version: 2
cli:
server: https://app.fossa.io
fetcher: custom
project: git@github.com:docker/compose
analyze:
modules:
- name: .
type: pip
target: .
path: .


@@ -1,6 +1,123 @@
Change log
==========
1.26.2 (2020-07-02)
-------------------
### Bugs
- Enforce `docker-py` 4.2.2 as minimum version when installing with pip
1.26.1 (2020-06-30)
-------------------
### Features
- Bump `docker-py` from 4.2.1 to 4.2.2
### Bugs
- Enforce `docker-py` 4.2.1 as minimum version when installing with pip
- Fix context load for non-docker endpoints
1.26.0 (2020-06-03)
-------------------
### Features
- Add `docker context` support
- Add missing test dependency `ddt` to `setup.py`
- Add `--attach-dependencies` to command `up` for attaching to dependencies
- Allow compatibility option with `COMPOSE_COMPATIBILITY` environment variable
- Bump `Pytest` to 5.3.4 and refactor for compatibility with the new version
- Bump `OpenSSL` from 1.1.1f to 1.1.1g
- Bump `docker-py` from 4.2.0 to 4.2.1
### Bugs
- Properly escape values coming from env_files
- Sync compose-schemas with upstream (docker/cli)
- Remove `None` entries on exec command
- Add `python-dotenv` to delegate `.env` file processing
- Don't adjust output on terminal width when piped into another command
- Show an error message when `version` attribute is malformed
- Fix HTTPS connection when DOCKER_HOST is remote
1.25.5 (2020-02-04)
-------------------
### Features
- Bump OpenSSL from 1.1.1d to 1.1.1f
- Add 3.8 compose version
1.25.4 (2020-01-23)
-------------------
### Bugfixes
- Fix CI script to enforce the minimal MacOS version to 10.11
- Fix docker-compose exec for keys with no value
1.25.3 (2020-01-23)
-------------------
### Bugfixes
- Fix CI script to enforce the compilation with Python3
- Fix binary's sha256 in the release page
1.25.2 (2020-01-20)
-------------------
### Features
- Allow compatibility option with `COMPOSE_COMPATIBILITY` environment variable
- Bump PyInstaller from 3.5 to 3.6
- Bump pysocks from 1.6.7 to 1.7.1
- Bump websocket-client from 0.32.0 to 0.57.0
- Bump urllib3 from 1.24.2 to 1.25.7
- Bump jsonschema from 3.0.1 to 3.2.0
- Bump PyYAML from 4.2b1 to 5.3
- Bump certifi from 2017.4.17 to 2019.11.28
- Bump coverage from 4.5.4 to 5.0.3
- Bump paramiko from 2.6.0 to 2.7.1
- Bump cached-property from 1.3.0 to 1.5.1
- Bump minor Linux and MacOSX dependencies
### Bugfixes
- Validate version format on formats 2+
- Assume infinite terminal width when not running in a terminal
1.25.1 (2020-01-06)
-------------------


@@ -1,9 +1,9 @@
ARG DOCKER_VERSION=19.03.5
ARG PYTHON_VERSION=3.7.5
ARG BUILD_ALPINE_VERSION=3.10
ARG DOCKER_VERSION=19.03.8
ARG PYTHON_VERSION=3.7.7
ARG BUILD_ALPINE_VERSION=3.11
ARG BUILD_DEBIAN_VERSION=slim-stretch
ARG RUNTIME_ALPINE_VERSION=3.10.3
ARG RUNTIME_DEBIAN_VERSION=stretch-20191118-slim
ARG RUNTIME_ALPINE_VERSION=3.11.5
ARG RUNTIME_DEBIAN_VERSION=stretch-20200414-slim
ARG BUILD_PLATFORM=alpine


@@ -1,15 +0,0 @@
FROM s390x/alpine:3.10.1
ARG COMPOSE_VERSION=1.16.1
RUN apk add --update --no-cache \
python \
py-pip \
&& pip install --no-cache-dir docker-compose==$COMPOSE_VERSION \
&& rm -rf /var/cache/apk/*
WORKDIR /data
VOLUME /data
ENTRYPOINT ["docker-compose"]

Jenkinsfile

@@ -1,6 +1,6 @@
#!groovy
def dockerVersions = ['19.03.5']
def dockerVersions = ['19.03.8']
def baseImages = ['alpine', 'debian']
def pythonVersions = ['py37']


@@ -2,7 +2,7 @@ Docker Compose
==============
![Docker Compose](logo.png?raw=true "Docker Compose Logo")
## :exclamation: The docker-compose project announces that as Python 2 reaches its EOL, versions 1.25.x will be the last to support it. For more information, please refer to this [issue](https://github.com/docker/compose/issues/6890).
## :exclamation: The docker-compose project announces that as Python 2 has reached its EOL, versions 1.26.x will be the last to support it. For more information, please refer to this [issue](https://github.com/docker/compose/issues/6890).
Compose is a tool for defining and running multi-container Docker applications.
With Compose, you use a Compose file to configure your application's services.
@@ -56,7 +56,7 @@ Installation and documentation
Contributing
------------
[![Build Status](https://jenkins.dockerproject.org/buildStatus/icon?job=docker/compose/master)](https://jenkins.dockerproject.org/job/docker/job/compose/job/master/)
[![Build Status](https://ci-next.docker.com/public/buildStatus/icon?job=compose/master)](https://ci-next.docker.com/public/job/compose/job/master/)
Want to help build Compose? Check out our [contributing documentation](https://github.com/docker/compose/blob/master/CONTRIBUTING.md).


@@ -1,6 +1,6 @@
#!groovy
def dockerVersions = ['19.03.5', '18.09.9']
def dockerVersions = ['19.03.8', '18.09.9']
def baseImages = ['alpine', 'debian']
def pythonVersions = ['py37']
@@ -20,7 +20,7 @@ pipeline {
parallel {
stage('alpine') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
buildImage('alpine')
@@ -28,7 +28,7 @@ pipeline {
}
stage('debian') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
buildImage('debian')
@@ -55,7 +55,7 @@ pipeline {
}
stage('Generate Changelog') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
checkout scm
@@ -72,6 +72,9 @@ pipeline {
agent {
label 'mac-python'
}
environment {
DEPLOYMENT_TARGET="10.11"
}
steps {
checkout scm
sh './script/setup/osx'
@@ -89,7 +92,7 @@ pipeline {
}
stage('linux binary') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
checkout scm
@@ -125,7 +128,7 @@ pipeline {
}
stage('alpine image') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
buildRuntimeImage('alpine')
@@ -133,7 +136,7 @@ pipeline {
}
stage('debian image') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
buildRuntimeImage('debian')
@@ -148,7 +151,7 @@ pipeline {
parallel {
stage('Pushing images') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
steps {
pushRuntimeImage('alpine')
@@ -157,7 +160,7 @@ pipeline {
}
stage('Creating Github Release') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
environment {
GITHUB_TOKEN = credentials('github-release-token')
@@ -189,7 +192,7 @@ pipeline {
}
stage('Publishing Python packages') {
agent {
label 'linux'
label 'linux && docker && ubuntu-2004'
}
environment {
PYPIRC = credentials('pypirc-docker-dsg-cibot')
@@ -198,9 +201,9 @@ pipeline {
checkout scm
sh """
rm -rf build/ dist/
pip install wheel
python setup.py sdist bdist_wheel
pip install twine
pip3 install wheel
python3 setup.py sdist bdist_wheel
pip3 install twine
~/.local/bin/twine upload --config-file ${PYPIRC} ./dist/docker-compose-*.tar.gz ./dist/docker_compose-*-py2.py3-none-any.whl
"""
}
@@ -296,6 +299,6 @@ def checksum(filepath) {
if (isUnix()) {
sh "openssl sha256 -r -out ${filepath}.sha256 ${filepath}"
} else {
powershell "(Get-FileHash -Path ${filepath} -Algorithm SHA256 | % hash) + ' *${filepath}' > ${filepath}.sha256"
powershell "(Get-FileHash -Path ${filepath} -Algorithm SHA256 | % hash).ToLower() + ' *${filepath}' | Out-File -encoding ascii ${filepath}.sha256"
}
}


@@ -1,4 +1,4 @@
from __future__ import absolute_import
from __future__ import unicode_literals
__version__ = '1.26.0dev'
__version__ = '1.26.2'


@@ -8,7 +8,6 @@ import re
import six
from . import errors
from . import verbose_proxy
from .. import config
from .. import parallel
from ..config.environment import Environment
@@ -17,10 +16,10 @@ from ..const import LABEL_CONFIG_FILES
from ..const import LABEL_ENVIRONMENT_FILE
from ..const import LABEL_WORKING_DIR
from ..project import Project
from .docker_client import docker_client
from .docker_client import get_tls_version
from .docker_client import tls_config_from_options
from .utils import get_version_info
from .docker_client import get_client
from .docker_client import load_context
from .docker_client import make_context
from .errors import UserError
log = logging.getLogger(__name__)
@@ -48,16 +47,28 @@ def project_from_options(project_dir, options, additional_options=None):
environment.silent = options.get('COMMAND', None) in SILENT_COMMANDS
set_parallel_limit(environment)
host = options.get('--host')
# get the context for the run
context = None
context_name = options.get('--context', None)
if context_name:
context = load_context(context_name)
if not context:
raise UserError("Context '{}' not found".format(context_name))
host = options.get('--host', None)
if host is not None:
if context:
raise UserError(
"-H, --host and -c, --context are mutually exclusive. Only one should be set.")
host = host.lstrip('=')
context = make_context(host, options, environment)
return get_project(
project_dir,
get_config_path_from_options(project_dir, options, environment),
project_name=options.get('--project-name'),
verbose=options.get('--verbose'),
host=host,
tls_config=tls_config_from_options(options, environment),
context=context,
environment=environment,
override_dir=override_dir,
compatibility=compatibility_from_options(project_dir, options, environment),
@@ -112,25 +123,8 @@ def get_config_path_from_options(base_dir, options, environment):
return None
def get_client(environment, verbose=False, version=None, tls_config=None, host=None,
tls_version=None):
client = docker_client(
version=version, tls_config=tls_config, host=host,
environment=environment, tls_version=get_tls_version(environment)
)
if verbose:
version_info = six.iteritems(client.version())
log.info(get_version_info('full'))
log.info("Docker base_url: %s", client.base_url)
log.info("Docker version: %s",
", ".join("%s=%s" % item for item in version_info))
return verbose_proxy.VerboseProxy('docker', client)
return client
def get_project(project_dir, config_path=None, project_name=None, verbose=False,
host=None, tls_config=None, environment=None, override_dir=None,
context=None, environment=None, override_dir=None,
compatibility=False, interpolate=True, environment_file=None):
if not environment:
environment = Environment.from_env_file(project_dir)
@@ -145,8 +139,7 @@ def get_project(project_dir, config_path=None, project_name=None, verbose=False,
API_VERSIONS[config_data.version])
client = get_client(
verbose=verbose, version=api_version, tls_config=tls_config,
host=host, environment=environment
verbose=verbose, version=api_version, context=context, environment=environment
)
with errors.handle_connection_errors(client):


@@ -5,17 +5,22 @@ import logging
import os.path
import ssl
import six
from docker import APIClient
from docker import Context
from docker import ContextAPI
from docker import TLSConfig
from docker.errors import TLSParameterError
from docker.tls import TLSConfig
from docker.utils import kwargs_from_env
from docker.utils.config import home_dir
from . import verbose_proxy
from ..config.environment import Environment
from ..const import HTTP_TIMEOUT
from ..utils import unquote_path
from .errors import UserError
from .utils import generate_user_agent
from .utils import get_version_info
log = logging.getLogger(__name__)
@@ -24,6 +29,33 @@ def default_cert_path():
return os.path.join(home_dir(), '.docker')
def make_context(host, options, environment):
tls = tls_config_from_options(options, environment)
ctx = Context("compose", host=host, tls=tls.verify if tls else False)
if tls:
ctx.set_endpoint("docker", host, tls, skip_tls_verify=not tls.verify)
return ctx
def load_context(name=None):
return ContextAPI.get_context(name)
def get_client(environment, verbose=False, version=None, context=None):
client = docker_client(
version=version, context=context,
environment=environment, tls_version=get_tls_version(environment)
)
if verbose:
version_info = six.iteritems(client.version())
log.info(get_version_info('full'))
log.info("Docker base_url: %s", client.base_url)
log.info("Docker version: %s",
", ".join("%s=%s" % item for item in version_info))
return verbose_proxy.VerboseProxy('docker', client)
return client
def get_tls_version(environment):
compose_tls_version = environment.get('COMPOSE_TLS_VERSION', None)
if not compose_tls_version:
@@ -87,8 +119,7 @@ def tls_config_from_options(options, environment=None):
return None
def docker_client(environment, version=None, tls_config=None, host=None,
tls_version=None):
def docker_client(environment, version=None, context=None, tls_version=None):
"""
Returns a docker-py client configured using environment variables
according to the same logic as the official Docker client.
@@ -101,10 +132,26 @@ def docker_client(environment, version=None, tls_config=None, host=None,
"and DOCKER_CERT_PATH are set correctly.\n"
"You might need to run `eval \"$(docker-machine env default)\"`")
if host:
kwargs['base_url'] = host
if tls_config:
kwargs['tls'] = tls_config
if not context:
# check env for DOCKER_HOST and certs path
host = kwargs.get("base_url", None)
tls = kwargs.get("tls", None)
verify = False if not tls else tls.verify
if host:
context = Context("compose", host=host, tls=verify)
else:
context = ContextAPI.get_current_context()
if tls:
context.set_endpoint("docker", host=host, tls_cfg=tls, skip_tls_verify=not verify)
if not context.is_docker_host():
raise UserError(
"The platform targeted with the current context is not supported.\n"
"Make sure the context in use targets a Docker Engine.\n")
kwargs['base_url'] = context.Host
if context.TLSConfig:
kwargs['tls'] = context.TLSConfig
if version:
kwargs['version'] = version
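
A hedged sketch of how the helpers introduced in the hunks above fit together (the module paths, the 'my-remote' context name, and the ssh host are assumptions for illustration, not part of the diff):

from compose.cli.docker_client import get_client, load_context, make_context
from compose.config.environment import Environment

environment = Environment.from_env_file('.')        # read .env the usual way
context = load_context('my-remote')                 # resolve a named `docker context`
if not context:
    # no such context: build an ad-hoc one from a -H/--host style URL instead
    context = make_context('ssh://user@host', {}, environment)
client = get_client(environment, context=context)   # docker-py client bound to that context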


@@ -192,6 +192,7 @@ class TopLevelCommand(object):
(default: docker-compose.yml)
-p, --project-name NAME Specify an alternate project name
(default: directory name)
-c, --context NAME Specify a context name
--verbose Show more output
--log-level LEVEL Set log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
--no-ansi Do not print ANSI control characters
@@ -1079,7 +1080,7 @@ class TopLevelCommand(object):
log.error(
"The image for the service you're trying to recreate has been removed. "
"If you continue, volume data could be lost. Consider backing up your data "
"before continuing.\n".format(e.explanation)
"before continuing.\n"
)
res = yesno("Continue with the new image? [yN]", False)
if res is None or not res:
@@ -1201,7 +1202,7 @@ def image_digests_for_project(project):
if e.needs_push:
command_hint = (
"Use `docker-compose push {}` to push them. "
"Use `docker push {}` to push them. "
.format(" ".join(sorted(e.needs_push)))
)
paras += [
@@ -1212,7 +1213,7 @@ def image_digests_for_project(project):
if e.needs_pull:
command_hint = (
"Use `docker-compose pull {}` to pull them. "
"Use `docker pull {}` to pull them. "
.format(" ".join(sorted(e.needs_pull)))
)
@@ -1466,7 +1467,12 @@ def call_docker(args, dockeropts, environment):
args = [executable_path] + tls_options + args
log.debug(" ".join(map(pipes.quote, args)))
return subprocess.call(args, env=environment)
filtered_env = {}
for k, v in environment.items():
if v is not None:
filtered_env[k] = environment[k]
return subprocess.call(args, env=filtered_env)
def parse_scale_args(options):
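
The filtered_env change above matters because subprocess rejects environment values that are not strings, and a variable declared with no value in a .env file reaches this point as None. A minimal sketch of the same filtering (the variable names are hypothetical):

environment = {'PATH': '/usr/bin', 'CHECK_VAR': None}  # None comes from a no-value .env entry
filtered_env = {k: v for k, v in environment.items() if v is not None}
# subprocess.call(args, env=filtered_env) now only ever sees string values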


@@ -9,6 +9,7 @@ import ssl
import subprocess
import sys
import distro
import docker
import six
@@ -73,7 +74,7 @@ def is_mac():
def is_ubuntu():
return platform.system() == 'Linux' and platform.linux_distribution()[0] == 'Ubuntu'
return platform.system() == 'Linux' and distro.linux_distribution()[0] == 'Ubuntu'
def is_windows():
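
The hunk above swaps platform.linux_distribution(), which was removed in Python 3.8, for the distro package pinned in requirements.txt. A minimal sketch of the substitution, assuming distro >= 1.5.0 is installed:

import platform
import distro

def is_ubuntu():
    # distro.linux_distribution() returns a (name, version, codename) tuple,
    # mirroring the tuple the removed platform function used to return
    return platform.system() == 'Linux' and distro.linux_distribution()[0] == 'Ubuntu'

print(is_ubuntu())  # True on an Ubuntu host, False elsewhere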


@@ -408,7 +408,7 @@ def load(config_details, compatibility=False, interpolate=True):
configs = load_mapping(
config_details.config_files, 'get_configs', 'Config', config_details.working_dir
)
service_dicts = load_services(config_details, main_file, compatibility)
service_dicts = load_services(config_details, main_file, compatibility, interpolate=interpolate)
if main_file.version != V1:
for service_dict in service_dicts:
@@ -460,7 +460,7 @@ def validate_external(entity_type, name, config, version):
entity_type, name, ', '.join(k for k in config if k != 'external')))
def load_services(config_details, config_file, compatibility=False):
def load_services(config_details, config_file, compatibility=False, interpolate=True):
def build_service(service_name, service_dict, service_names):
service_config = ServiceConfig.with_abs_paths(
config_details.working_dir,
@@ -479,7 +479,8 @@ def load_services(config_details, config_file, compatibility=False):
service_names,
config_file.version,
config_details.environment,
compatibility
compatibility,
interpolate
)
return service_dict
@@ -504,9 +505,7 @@ def load_services(config_details, config_file, compatibility=False):
file.get_service_dicts() for file in config_details.config_files
]
service_config = service_configs[0]
for next_config in service_configs[1:]:
service_config = merge_services(service_config, next_config)
service_config = functools.reduce(merge_services, service_configs)
return build_services(service_config)
@@ -679,13 +678,13 @@ class ServiceExtendsResolver(object):
return filename
def resolve_environment(service_dict, environment=None):
def resolve_environment(service_dict, environment=None, interpolate=True):
"""Unpack any environment variables from an env_file, if set.
Interpolate environment values if set.
"""
env = {}
for env_file in service_dict.get('env_file', []):
env.update(env_vars_from_file(env_file))
env.update(env_vars_from_file(env_file, interpolate))
env.update(parse_environment(service_dict.get('environment')))
return dict(resolve_env_var(k, v, environment) for k, v in six.iteritems(env))
@@ -881,11 +880,12 @@ def finalize_service_volumes(service_dict, environment):
return service_dict
def finalize_service(service_config, service_names, version, environment, compatibility):
def finalize_service(service_config, service_names, version, environment, compatibility,
interpolate=True):
service_dict = dict(service_config.config)
if 'environment' in service_dict or 'env_file' in service_dict:
service_dict['environment'] = resolve_environment(service_dict, environment)
service_dict['environment'] = resolve_environment(service_dict, environment, interpolate)
service_dict.pop('env_file', None)
if 'volumes_from' in service_dict:
@@ -990,12 +990,17 @@ def translate_deploy_keys_to_container_config(service_dict):
deploy_dict = service_dict['deploy']
ignored_keys = [
k for k in ['endpoint_mode', 'labels', 'update_config', 'rollback_config', 'placement']
k for k in ['endpoint_mode', 'labels', 'update_config', 'rollback_config']
if k in deploy_dict
]
if 'replicas' in deploy_dict and deploy_dict.get('mode', 'replicated') == 'replicated':
service_dict['scale'] = deploy_dict['replicas']
scale = deploy_dict.get('replicas', 1)
max_replicas = deploy_dict.get('placement', {}).get('max_replicas_per_node', scale)
service_dict['scale'] = min(scale, max_replicas)
if max_replicas < scale:
log.warning("Scale is limited to {} ('max_replicas_per_node' field).".format(
max_replicas))
if 'restart_policy' in deploy_dict:
service_dict['restart'] = {
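
A self-contained illustration of the cap introduced above (the numbers are hypothetical): when deploy.replicas exceeds placement.max_replicas_per_node, the scale is trimmed and the warning is logged.

deploy_dict = {'replicas': 6, 'placement': {'max_replicas_per_node': 2}}
scale = deploy_dict.get('replicas', 1)
max_replicas = deploy_dict.get('placement', {}).get('max_replicas_per_node', scale)
print(min(scale, max_replicas))  # -> 2, and "Scale is limited to 2 ..." is warned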

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,12 +1,11 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import codecs
import contextlib
import logging
import os
import re
import dotenv
import six
from ..const import IS_WINDOWS_PLATFORM
@@ -31,7 +30,7 @@ def split_env(env):
return key, value
def env_vars_from_file(filename):
def env_vars_from_file(filename, interpolate=True):
"""
Read in a line delimited file of environment variables.
"""
@@ -39,16 +38,10 @@ def env_vars_from_file(filename):
raise EnvFileNotFound("Couldn't find env file: {}".format(filename))
elif not os.path.isfile(filename):
raise EnvFileNotFound("{} is not a file.".format(filename))
env = {}
with contextlib.closing(codecs.open(filename, 'r', 'utf-8-sig')) as fileobj:
for line in fileobj:
line = line.strip()
if line and not line.startswith('#'):
try:
k, v = split_env(line)
env[k] = v
except ConfigurationError as e:
raise ConfigurationError('In file {}: {}'.format(filename, e.msg))
env = dotenv.dotenv_values(dotenv_path=filename, encoding='utf-8-sig', interpolate=interpolate)
for k, v in env.items():
env[k] = v if interpolate else v.replace('$', '$$')
return env
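
A minimal sketch of the new behaviour, assuming python-dotenv >= 0.13.0 and a hypothetical web.env file containing BASE=busybox and TAG=${BASE}:latest:

import dotenv

resolved = dotenv.dotenv_values(dotenv_path='web.env', encoding='utf-8-sig', interpolate=True)
# -> {'BASE': 'busybox', 'TAG': 'busybox:latest'}
raw = dotenv.dotenv_values(dotenv_path='web.env', encoding='utf-8-sig', interpolate=False)
# -> {'BASE': 'busybox', 'TAG': '${BASE}:latest'}; env_vars_from_file then doubles
#    the '$' so that later Compose-file interpolation keeps the value literal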


@@ -41,7 +41,10 @@ COMPOSEFILE_V3_4 = ComposeVersion('3.4')
COMPOSEFILE_V3_5 = ComposeVersion('3.5')
COMPOSEFILE_V3_6 = ComposeVersion('3.6')
COMPOSEFILE_V3_7 = ComposeVersion('3.7')
COMPOSEFILE_V3_8 = ComposeVersion('3.8')
# minimum DOCKER ENGINE API version needed to support
# features for each compose schema version
API_VERSIONS = {
COMPOSEFILE_V1: '1.21',
COMPOSEFILE_V2_0: '1.22',
@@ -57,6 +60,7 @@ API_VERSIONS = {
COMPOSEFILE_V3_5: '1.30',
COMPOSEFILE_V3_6: '1.36',
COMPOSEFILE_V3_7: '1.38',
COMPOSEFILE_V3_8: '1.38',
}
API_VERSION_TO_ENGINE_VERSION = {
@@ -74,4 +78,5 @@ API_VERSION_TO_ENGINE_VERSION = {
API_VERSIONS[COMPOSEFILE_V3_5]: '17.06.0',
API_VERSIONS[COMPOSEFILE_V3_6]: '18.02.0',
API_VERSIONS[COMPOSEFILE_V3_7]: '18.06.0',
API_VERSIONS[COMPOSEFILE_V3_8]: '18.06.0',
}


@@ -1157,7 +1157,7 @@ class Service(object):
container_name = build_container_name(
self.project, service_name, number, slug,
)
ext_links_origins = [l.split(':')[0] for l in self.options.get('external_links', [])]
ext_links_origins = [link.split(':')[0] for link in self.options.get('external_links', [])]
if container_name in ext_links_origins:
raise DependencyError(
'Service {0} has a self-referential external link: {1}'.format(
@@ -1792,6 +1792,7 @@ class _CLIBuilder(object):
command_builder.add_list("--cache-from", cache_from)
command_builder.add_arg("--file", dockerfile)
command_builder.add_flag("--force-rm", forcerm)
command_builder.add_params("--label", labels)
command_builder.add_arg("--memory", container_limits.get("memory"))
command_builder.add_flag("--no-cache", nocache)
command_builder.add_arg("--progress", self._progress)


@@ -87,6 +87,11 @@ exe = EXE(pyz,
'compose/config/config_schema_v3.7.json',
'DATA'
),
(
'compose/config/config_schema_v3.8.json',
'compose/config/config_schema_v3.8.json',
'DATA'
),
(
'compose/GITSHA',
'compose/GITSHA',


@@ -96,6 +96,11 @@ coll = COLLECT(exe,
'compose/config/config_schema_v3.7.json',
'DATA'
),
(
'compose/config/config_schema_v3.8.json',
'compose/config/config_schema_v3.8.json',
'DATA'
),
(
'compose/GITSHA',
'compose/GITSHA',


@@ -1,7 +1,9 @@
Click==7.0
coverage==5.0.3
ddt==1.2.2
flake8==3.7.9
gitpython==2.1.15
mock==3.0.5
pytest==5.3.2; python_version >= '3.5'
pytest==5.3.4; python_version >= '3.5'
pytest==4.6.5; python_version < '3.5'
pytest-cov==2.8.1


@@ -1,10 +1,11 @@
backports.shutil_get_terminal_size==1.0.0
backports.ssl-match-hostname==3.5.0.1; python_version < '3'
cached-property==1.5.1
certifi==2019.11.28
certifi==2020.4.5.1
chardet==3.0.4
colorama==0.4.3; sys_platform == 'win32'
docker==4.1.0
distro==1.5.0
docker==4.2.2
docker-pycreds==0.4.0
dockerpty==0.4.1
docopt==0.6.2
@@ -17,10 +18,12 @@ paramiko==2.7.1
pypiwin32==219; sys_platform == 'win32' and python_version < '3.6'
pypiwin32==223; sys_platform == 'win32' and python_version >= '3.6'
PySocks==1.7.1
python-dotenv==0.13.0
PyYAML==5.3
requests==2.22.0
six==1.12.0
subprocess32==3.5.4; python_version < '3.2'
texttable==1.6.2
urllib3==1.25.7; python_version == '3.3'
wcwidth==0.1.9
websocket-client==0.57.0


@@ -6,7 +6,7 @@
#
# http://git-scm.com/download/win
#
# 2. Install Python 3.7.2:
# 2. Install Python 3.7.x:
#
# https://www.python.org/downloads/
#
@@ -39,7 +39,7 @@ if (Test-Path venv) {
Get-ChildItem -Recurse -Include *.pyc | foreach ($_) { Remove-Item $_.FullName }
# Create virtualenv
virtualenv .\venv
virtualenv -p C:\Python37\python.exe .\venv
# pip and pyinstaller generate lots of warnings, so we need to ignore them
$ErrorActionPreference = "Continue"


@@ -4,6 +4,15 @@ The release process is fully automated by `Release.Jenkinsfile`.
## Usage
1. edit `compose/__init__.py` to set release version number
1. commit and tag as `v{major}.{minor}.{patch}`
1. edit `compose/__init__.py` again to set next development version number
1. In the appropriate branch, run `./scripts/release/release tag <version>`
By appropriate, we mean that for a version `1.26.0` or `1.26.0-rc1` you should run the script in the `1.26.x` branch.
The script checks the above and then asks for changelog modifications.
After the execution, you should have a commit with the proper bumps for `docker-compose version` and `run.sh`.
2. Run `git push --tags upstream <version_branch>`
This should trigger a new CI build on the new tag. When the CI finishes the tests and builds, a new draft release will be available on GitHub's releases page.
3. Check and confirm the release on GitHub's releases page.

script/release/const.py

@@ -0,0 +1,7 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import os
REPO_ROOT = os.path.join(os.path.dirname(__file__), '..', '..')


@@ -10,7 +10,7 @@ set -x
git config --add remote.origin.fetch +refs/pull/*/head:refs/remotes/origin/pull/*
git fetch origin
RANGE=${1:-"$(git describe --tags --abbrev=0)..HEAD"}
RANGE=${1:-"$(git describe --tags --abbrev=0 HEAD^)..HEAD"}
echo "Generate changelog for range ${RANGE}"
echo

script/release/release.py

@@ -0,0 +1,126 @@
#!/usr/bin/env python3
from __future__ import absolute_import
from __future__ import unicode_literals
import re
import click
from git import Repo
from utils import update_init_py_version
from utils import update_run_sh_version
from utils import yesno
VALID_VERSION_PATTERN = re.compile(r"^\d+\.\d+\.\d+(-rc\d+)?$")
class Version(str):
def matching_groups(self):
match = VALID_VERSION_PATTERN.match(self)
if not match:
return False
return match.groups()
def is_ga_version(self):
groups = self.matching_groups()
if not groups:
return False
rc_suffix = groups[1]
return not rc_suffix
def validate(self):
return len(self.matching_groups()) > 0
def branch_name(self):
if not self.validate():
return None
rc_part = self.matching_groups()[0]
ver = self
if rc_part:
ver = ver[:-len(rc_part)]
tokens = ver.split(".")
tokens[-1] = 'x'
return ".".join(tokens)
def create_bump_commit(repository, version):
print('Creating bump commit...')
repository.commit('-a', '-s', '-m "Bump {}"'.format(version), '--no-verify')
def validate_environment(version, repository):
if not version.validate():
print('Version "{}" has an invalid format. This should follow D+.D+.D+(-rcD+). '
'Like: 1.26.0 or 1.26.0-rc1'.format(version))
return False
expected_branch = version.branch_name()
if str(repository.active_branch) != expected_branch:
print('Cannot tag in this branch with version "{}". '
'Please checkout "{}" to tag'.format(version, version.branch_name()))
return False
return True
@click.group()
def cli():
pass
@cli.command()
@click.argument('version')
def tag(version):
"""
Updates the version related files and tag
"""
repo = Repo(".")
version = Version(version)
if not validate_environment(version, repo):
return
update_init_py_version(version)
update_run_sh_version(version)
input('Please add the release notes to the CHANGELOG.md file, then press Enter to continue.')
proceed = False
while not proceed:
print(repo.git.diff())
proceed = yesno('Are these changes ok? y/N ', default=False)
if repo.git.diff():
create_bump_commit(repo.git, version)
else:
print('No changes to commit. Exiting...')
return
repo.create_tag(version)
print('Please, check the changes. If everything is OK, you just need to push with:\n'
'$ git push --tags upstream {}'.format(version.branch_name()))
@cli.command()
@click.argument('version')
def push_latest(version):
"""
TODO Pushes the latest tag pointing to a certain GA version
"""
raise NotImplementedError
@cli.command()
@click.argument('version')
def ghtemplate(version):
"""
TODO Generates the github release page content
"""
version = Version(version)
raise NotImplementedError
if __name__ == '__main__':
cli()

script/release/utils.py

@@ -0,0 +1,47 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import os
import re
from const import REPO_ROOT
def update_init_py_version(version):
path = os.path.join(REPO_ROOT, 'compose', '__init__.py')
with open(path, 'r') as f:
contents = f.read()
contents = re.sub(r"__version__ = '[0-9a-z.-]+'", "__version__ = '{}'".format(version), contents)
with open(path, 'w') as f:
f.write(contents)
def update_run_sh_version(version):
path = os.path.join(REPO_ROOT, 'script', 'run', 'run.sh')
with open(path, 'r') as f:
contents = f.read()
contents = re.sub(r'VERSION="[0-9a-z.-]+"', 'VERSION="{}"'.format(version), contents)
with open(path, 'w') as f:
f.write(contents)
def yesno(prompt, default=None):
"""
Prompt the user for a yes or no.
Can optionally specify a default value, which will only be
used if they enter a blank line.
Unrecognised input (anything other than "y", "n", "yes",
"no" or "") will return None.
"""
answer = input(prompt).strip().lower()
if answer == "y" or answer == "yes":
return True
elif answer == "n" or answer == "no":
return False
elif answer == "":
return default
else:
return None


@@ -15,7 +15,7 @@
set -e
VERSION="1.25.1"
VERSION="1.26.2"
IMAGE="docker/compose:$VERSION"


@@ -13,13 +13,13 @@ if ! [ ${DEPLOYMENT_TARGET} == "$(macos_version)" ]; then
SDK_SHA1=dd228a335194e3392f1904ce49aff1b1da26ca62
fi
OPENSSL_VERSION=1.1.1d
OPENSSL_VERSION=1.1.1g
OPENSSL_URL=https://www.openssl.org/source/openssl-${OPENSSL_VERSION}.tar.gz
OPENSSL_SHA1=056057782325134b76d1931c48f2c7e6595d7ef4
OPENSSL_SHA1=b213a293f2127ec3e323fb3cfc0c9807664fd997
PYTHON_VERSION=3.7.5
PYTHON_VERSION=3.7.7
PYTHON_URL=https://www.python.org/ftp/python/${PYTHON_VERSION}/Python-${PYTHON_VERSION}.tgz
PYTHON_SHA1=8b0311d4cca19f0ea9181731189fa33c9f5aedf9
PYTHON_SHA1=8e9968663a214aea29659ba9dfa959e8a7d82b39
#
# Install prerequisites.

script/test/acceptance

@@ -0,0 +1,3 @@
#!/usr/bin/env bash
pytest --conformity --binary ${1:-docker-compose} tests/acceptance/


@@ -36,14 +36,17 @@ install_requires = [
'requests >= 2.20.0, < 3',
'texttable >= 0.9.0, < 2',
'websocket-client >= 0.32.0, < 1',
'docker[ssh] >= 3.7.0, < 5',
'distro >= 1.5.0, < 2',
'docker[ssh] >= 4.2.2, < 5',
'dockerpty >= 0.4.1, < 1',
'six >= 1.3.0, < 2',
'jsonschema >= 2.5.1, < 4',
'python-dotenv >= 0.13.0, < 1',
]
tests_require = [
'ddt >= 1.2.2, < 2',
'pytest < 6',
]
@@ -59,6 +62,7 @@ extras_require = {
'ipaddress >= 1.0.16, < 2'],
':sys_platform == "win32"': ['colorama >= 0.4, < 1'],
'socks': ['PySocks >= 1.5.6, != 1.5.7, < 2'],
'tests': tests_require,
}


@@ -38,6 +38,8 @@ from tests.integration.testcases import v2_2_only
from tests.integration.testcases import v2_only
from tests.integration.testcases import v3_only
DOCKER_COMPOSE_EXECUTABLE = 'docker-compose'
ProcessResult = namedtuple('ProcessResult', 'stdout stderr')
@@ -65,7 +67,7 @@ COMPOSE_COMPATIBILITY_DICT = {
def start_process(base_dir, options):
proc = subprocess.Popen(
['docker-compose'] + options,
[DOCKER_COMPOSE_EXECUTABLE] + options,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
@@ -1711,6 +1713,17 @@ services:
assert stderr == ""
assert stdout == "/\n"
@mock.patch.dict(os.environ)
def test_exec_novalue_var_dotenv_file(self):
os.environ['MYVAR'] = 'SUCCESS'
self.base_dir = 'tests/fixtures/exec-novalue-var'
self.dispatch(['up', '-d'])
assert len(self.project.containers()) == 1
stdout, stderr = self.dispatch(['exec', '-T', 'nginx', 'env'])
assert 'CHECK_VAR=SUCCESS' in stdout
assert not stderr
def test_exec_detach_long_form(self):
self.base_dir = 'tests/fixtures/links-composefile'
self.dispatch(['up', '--detach', 'console'])


@@ -0,0 +1,48 @@
# -*- coding: utf-8 -*-
from __future__ import absolute_import
from __future__ import unicode_literals
import os
import shutil
import unittest
from docker import ContextAPI
from tests.acceptance.cli_test import dispatch
class ContextTestCase(unittest.TestCase):
@classmethod
def setUpClass(cls):
cls.docker_dir = os.path.join(os.environ.get("HOME", "/tmp"), '.docker')
if not os.path.exists(cls.docker_dir):
os.makedirs(cls.docker_dir)
f = open(os.path.join(cls.docker_dir, "config.json"), "w")
f.write("{}")
f.close()
cls.docker_config = os.path.join(cls.docker_dir, "config.json")
os.environ['DOCKER_CONFIG'] = cls.docker_config
ContextAPI.create_context("testcontext", host="tcp://doesnotexist:8000")
@classmethod
def tearDownClass(cls):
shutil.rmtree(cls.docker_dir, ignore_errors=True)
def setUp(self):
self.base_dir = 'tests/fixtures/simple-composefile'
self.override_dir = None
def dispatch(self, options, project_options=None, returncode=0, stdin=None):
return dispatch(self.base_dir, options, project_options, returncode, stdin)
def test_help(self):
result = self.dispatch(['help'], returncode=0)
assert '-c, --context NAME' in result.stdout
def test_fail_on_both_host_and_context_opt(self):
result = self.dispatch(['-H', 'unix://', '-c', 'default', 'up'], returncode=1)
assert '-H, --host and -c, --context are mutually exclusive' in result.stderr
def test_fail_run_on_inexistent_context(self):
result = self.dispatch(['-c', 'testcontext', 'up', '-d'], returncode=1)
assert "Couldn't connect to Docker daemon" in result.stderr

tests/conftest.py

@@ -0,0 +1,243 @@
from __future__ import absolute_import
from __future__ import unicode_literals
import pytest
import tests.acceptance.cli_test
# FIXME Skipping all the acceptance tests when in `--conformity`
non_conformity_tests = [
"test_build_failed",
"test_build_failed_forcerm",
"test_build_log_level",
"test_build_memory_build_option",
"test_build_no_cache",
"test_build_no_cache_pull",
"test_build_override_dir",
"test_build_override_dir_invalid_path",
"test_build_parallel",
"test_build_plain",
"test_build_pull",
"test_build_rm",
"test_build_shm_size_build_option",
"test_build_with_buildarg_cli_override",
"test_build_with_buildarg_from_compose_file",
"test_build_with_buildarg_old_api_version",
"test_config_compatibility_mode",
"test_config_compatibility_mode_from_env",
"test_config_compatibility_mode_from_env_and_option_precedence",
"test_config_default",
"test_config_external_network",
"test_config_external_network_v3_5",
"test_config_external_volume_v2",
"test_config_external_volume_v2_x",
"test_config_external_volume_v3_4",
"test_config_external_volume_v3_x",
"test_config_list_services",
"test_config_list_volumes",
"test_config_quiet",
"test_config_quiet_with_error",
"test_config_restart",
"test_config_stdin",
"test_config_v1",
"test_config_v3",
"test_config_with_dot_env",
"test_config_with_dot_env_and_override_dir",
"test_config_with_env_file",
"test_config_with_hash_option",
"test_create",
"test_create_with_force_recreate",
"test_create_with_force_recreate_and_no_recreate",
"test_create_with_no_recreate",
"test_down",
"test_down_invalid_rmi_flag",
"test_down_signal",
"test_down_timeout",
"test_env_file_relative_to_compose_file",
"test_events_human_readable",
"test_events_json",
"test_exec_custom_user",
"test_exec_detach_long_form",
"test_exec_novalue_var_dotenv_file",
"test_exec_service_with_environment_overridden",
"test_exec_without_tty",
"test_exec_workdir",
"test_exit_code_from_signal_stop",
"test_expanded_port",
"test_forward_exitval",
"test_help",
"test_help_nonexistent",
"test_home_and_env_var_in_volume_path",
"test_host_not_reachable",
"test_host_not_reachable_volumes_from_container",
"test_host_not_reachable_volumes_from_container",
"test_images",
"test_images_default_composefile",
"test_images_tagless_image",
"test_images_use_service_tag",
"test_kill",
"test_kill_signal_sigstop",
"test_kill_stopped_service",
"test_logs_default",
"test_logs_follow",
"test_logs_follow_logs_from_new_containers",
"test_logs_follow_logs_from_restarted_containers",
"test_logs_invalid_service_name",
"test_logs_on_stopped_containers_exits",
"test_logs_tail",
"test_logs_timestamps",
"test_pause_no_containers",
"test_pause_unpause",
"test_port",
"test_port_with_scale",
"test_ps",
"test_ps_all",
"test_ps_alternate_composefile",
"test_ps_default_composefile",
"test_ps_services_filter_option",
"test_ps_services_filter_status",
"test_pull",
"test_pull_can_build",
"test_pull_with_digest",
"test_pull_with_ignore_pull_failures",
"test_pull_with_include_deps",
"test_pull_with_no_deps",
"test_pull_with_parallel_failure",
"test_pull_with_quiet",
"test_quiet_build",
"test_restart",
"test_restart_no_containers",
"test_restart_stopped_container",
"test_rm",
"test_rm_all",
"test_rm_stop",
"test_run_detached_connects_to_network",
"test_run_does_not_recreate_linked_containers",
"test_run_env_values_from_system",
"test_run_handles_sighup",
"test_run_handles_sigint",
"test_run_handles_sigterm",
"test_run_interactive_connects_to_network",
"test_run_label_flag",
"test_run_one_off_with_multiple_volumes",
"test_run_one_off_with_volume",
"test_run_one_off_with_volume_merge",
"test_run_rm",
"test_run_service_with_compose_file_entrypoint",
"test_run_service_with_compose_file_entrypoint_and_command_overridden",
"test_run_service_with_compose_file_entrypoint_and_empty_string_command",
"test_run_service_with_compose_file_entrypoint_overridden",
"test_run_service_with_dependencies",
"test_run_service_with_dockerfile_entrypoint",
"test_run_service_with_dockerfile_entrypoint_and_command_overridden",
"test_run_service_with_dockerfile_entrypoint_overridden",
"test_run_service_with_environment_overridden",
"test_run_service_with_explicitly_mapped_ip_ports",
"test_run_service_with_explicitly_mapped_ports",
"test_run_service_with_links",
"test_run_service_with_map_ports",
"test_run_service_with_scaled_dependencies",
"test_run_service_with_unset_entrypoint",
"test_run_service_with_use_aliases",
"test_run_service_with_user_overridden",
"test_run_service_with_user_overridden_short_form",
"test_run_service_with_workdir_overridden",
"test_run_service_with_workdir_overridden_short_form",
"test_run_service_without_links",
"test_run_service_without_map_ports",
"test_run_unicode_env_values_from_system",
"test_run_with_custom_name",
"test_run_with_expose_ports",
"test_run_with_no_deps",
"test_run_without_command",
"test_scale",
"test_scale_v2_2",
"test_shorthand_host_opt",
"test_shorthand_host_opt_interactive",
"test_start_no_containers",
"test_stop",
"test_stop_signal",
"test_top_processes_running",
"test_top_services_not_running",
"test_top_services_running",
"test_unpause_no_containers",
"test_up",
"test_up_attached",
"test_up_detached",
"test_up_detached_long_form",
"test_up_external_networks",
"test_up_handles_abort_on_container_exit",
"test_up_handles_abort_on_container_exit_code",
"test_up_handles_aborted_dependencies",
"test_up_handles_force_shutdown",
"test_up_handles_sigint",
"test_up_handles_sigterm",
"test_up_logging",
"test_up_logging_legacy",
"test_up_missing_network",
"test_up_no_ansi",
"test_up_no_services",
"test_up_no_start",
"test_up_no_start_remove_orphans",
"test_up_scale_reset",
"test_up_scale_scale_down",
"test_up_scale_scale_up",
"test_up_scale_to_zero",
"test_up_with_attach_dependencies",
"test_up_with_default_network_config",
"test_up_with_default_override_file",
"test_up_with_duplicate_override_yaml_files",
"test_up_with_extends",
"test_up_with_external_default_network",
"test_up_with_force_recreate",
"test_up_with_force_recreate_and_no_recreate",
"test_up_with_healthcheck",
"test_up_with_ignore_remove_orphans",
"test_up_with_links_v1",
"test_up_with_multiple_files",
"test_up_with_net_is_invalid",
"test_up_with_net_v1",
"test_up_with_network_aliases",
"test_up_with_network_internal",
"test_up_with_network_labels",
"test_up_with_network_mode",
"test_up_with_network_static_addresses",
"test_up_with_networks",
"test_up_with_no_deps",
"test_up_with_no_recreate",
"test_up_with_override_yaml",
"test_up_with_pid_mode",
"test_up_with_timeout",
"test_up_with_volume_labels",
"test_fail_on_both_host_and_context_opt",
"test_fail_run_on_inexistent_context",
]
def pytest_addoption(parser):
parser.addoption(
"--conformity",
action="store_true",
default=False,
help="Only runs tests that are not black listed as non conformity test. "
"The conformity tests check for compatibility with the Compose spec."
)
parser.addoption(
"--binary",
default=tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE,
help="Forces the execution of a binary in the PATH. Default is `docker-compose`."
)
def pytest_collection_modifyitems(config, items):
if not config.getoption("--conformity"):
return
if config.getoption("--binary"):
tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE = config.getoption("--binary")
print("Binary -> {}".format(tests.acceptance.cli_test.DOCKER_COMPOSE_EXECUTABLE))
skip_non_conformity = pytest.mark.skip(reason="skipping because that's not a conformity test")
for item in items:
if item.name in non_conformity_tests:
print("Skipping '{}' when running in compatibility mode".format(item.name))
item.add_marker(skip_non_conformity)

tests/fixtures/env/three.env

@@ -0,0 +1,2 @@
FOO=NO $ENV VAR
DOO=NO ${ENV} VAR


@@ -0,0 +1,6 @@
version: '3'
services:
nginx:
image: nginx
environment:
- CHECK_VAR=${MYVAR}


@@ -223,6 +223,9 @@ class ServiceTest(DockerClientTestCase):
service.start_container(container)
assert container.get('HostConfig.ReadonlyRootfs') == read_only
@pytest.mark.xfail(True, reason='Getting "Your kernel does not support '
'cgroup blkio weight and weight_device" on daemon start '
'on Linux kernel 5.3.x')
def test_create_container_with_blkio_config(self):
blkio_config = {
'weight': 300,
@@ -985,6 +988,23 @@ class ServiceTest(DockerClientTestCase):
self.addCleanup(self.client.remove_image, service.image_name)
assert self.client.inspect_image('composetest_web')
def test_build_cli_with_build_labels(self):
base_dir = tempfile.mkdtemp()
self.addCleanup(shutil.rmtree, base_dir)
with open(os.path.join(base_dir, 'Dockerfile'), 'w') as f:
f.write("FROM busybox\n")
service = self.create_service('web',
build={
'context': base_dir,
'labels': {'com.docker.compose.test': 'true'}},
)
service.build(cli=True)
self.addCleanup(self.client.remove_image, service.image_name)
image = self.client.inspect_image('composetest_web')
assert image['Config']['Labels']['com.docker.compose.test']
def test_up_build_cli(self):
base_dir = tempfile.mkdtemp()
self.addCleanup(shutil.rmtree, base_dir)


@@ -3637,7 +3637,6 @@ class InterpolationTest(unittest.TestCase):
assert 'labels' in warn_message
assert 'endpoint_mode' in warn_message
assert 'update_config' in warn_message
assert 'placement' in warn_message
assert 'resources.reservations.cpus' in warn_message
assert 'restart_policy.delay' in warn_message
assert 'restart_policy.window' in warn_message
@@ -5420,15 +5419,19 @@ class SerializeTest(unittest.TestCase):
'environment': {
'CURRENCY': '$'
},
'env_file': ['tests/fixtures/env/three.env'],
'entrypoint': ['$SHELL', '-c'],
}
}
}
config_dict = config.load(build_config_details(cfg), interpolate=False)
config_dict = config.load(build_config_details(cfg, working_dir='.'), interpolate=False)
serialized_config = yaml.safe_load(serialize_config(config_dict, escape_dollar=False))
serialized_service = serialized_config['services']['web']
assert serialized_service['environment']['CURRENCY'] == '$'
# Values coming from env_files are not allowed to have variables
assert serialized_service['environment']['FOO'] == 'NO $$ENV VAR'
assert serialized_service['environment']['DOO'] == 'NO $${ENV} VAR'
assert serialized_service['command'] == 'echo $FOO'
assert serialized_service['entrypoint'][0] == '$SHELL'


@@ -8,15 +8,18 @@ import os
import shutil
import tempfile
import pytest
from ddt import data
from ddt import ddt
from ddt import unpack
from compose.config.environment import env_vars_from_file
from compose.config.environment import Environment
from compose.config.errors import ConfigurationError
from tests import unittest
@ddt
class EnvironmentTest(unittest.TestCase):
@classmethod
def test_get_simple(self):
env = Environment({
'FOO': 'bar',
@@ -28,12 +31,14 @@ class EnvironmentTest(unittest.TestCase):
assert env.get('BAR') == '1'
assert env.get('BAZ') == ''
@classmethod
def test_get_undefined(self):
env = Environment({
'FOO': 'bar'
})
assert env.get('FOOBAR') is None
@classmethod
def test_get_boolean(self):
env = Environment({
'FOO': '',
@@ -48,20 +53,18 @@ class EnvironmentTest(unittest.TestCase):
assert env.get_boolean('FOOBAR') is True
assert env.get_boolean('UNDEFINED') is False
def test_env_vars_from_file_bom(self):
@data(
('unicode exclude test', '\ufeffPARK_BOM=박봄\n', {'PARK_BOM': '박봄'}),
('export prefixed test', 'export PREFIXED_VARS=yes\n', {"PREFIXED_VARS": "yes"}),
('quoted vars test', "QUOTED_VARS='yes'\n", {"QUOTED_VARS": "yes"}),
('double quoted vars test', 'DOUBLE_QUOTED_VARS="yes"\n', {"DOUBLE_QUOTED_VARS": "yes"}),
('extra spaces test', 'SPACES_VARS = "yes"\n', {"SPACES_VARS": "yes"}),
)
@unpack
def test_env_vars(self, test_name, content, expected):
tmpdir = tempfile.mkdtemp('env_file')
self.addCleanup(shutil.rmtree, tmpdir)
with codecs.open('{}/bom.env'.format(str(tmpdir)), 'w', encoding='utf-8') as f:
f.write('\ufeffPARK_BOM=박봄\n')
assert env_vars_from_file(str(os.path.join(tmpdir, 'bom.env'))) == {
'PARK_BOM': '박봄'
}
def test_env_vars_from_file_whitespace(self):
tmpdir = tempfile.mkdtemp('env_file')
self.addCleanup(shutil.rmtree, tmpdir)
with codecs.open('{}/whitespace.env'.format(str(tmpdir)), 'w', encoding='utf-8') as f:
f.write('WHITESPACE =yes\n')
with pytest.raises(ConfigurationError) as exc:
env_vars_from_file(str(os.path.join(tmpdir, 'whitespace.env')))
assert 'environment variable' in exc.exconly()
file_abs_path = str(os.path.join(tmpdir, ".env"))
with codecs.open(file_abs_path, 'w', encoding='utf-8') as f:
f.write(content)
assert env_vars_from_file(file_abs_path) == expected, '"{}" Failed'.format(test_name)