Mirror of https://github.com/docker/compose.git (synced 2026-02-12 03:29:27 +08:00)
Compare commits

114 commits (5becea4ca9 ... 205d520805)
@@ -17,7 +17,7 @@
     sha: v1.3.4
     hooks:
     - id: reorder-python-imports
-      language_version: 'python3.9'
+      language_version: 'python3.7'
       args:
       - --py3-plus
 - repo: https://github.com/asottile/pyupgrade
CHANGELOG.md (197 lines changed)
@@ -1,6 +1,203 @@
 Change log
 ==========
 
+1.29.2 (2021-05-10)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/59?closed=1)
+
+### Miscellaneous
+
+- Remove advertisement for `docker compose` in the `up` command to avoid annoyance
+
+- Bump `py` to `1.10.0` in `requirements-indirect.txt`
+
+1.29.1 (2021-04-13)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/58?closed=1)
+
+### Bugs
+
+- Fix for invalid handler warning on Windows builds
+
+- Fix config hash to trigger container recreation on IPC mode updates
+
+- Fix conversion map for `placement.max_replicas_per_node`
+
+- Remove extra scan suggestion on build
+
+1.29.0 (2021-04-06)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/56?closed=1)
+
+### Features
+
+- Add profile filter to `docker-compose config`
+
+- Add a `depends_on` condition to wait for successful service completion
+
+### Miscellaneous
+
+- Add image scan message on build
+
+- Update warning message for `--no-ansi` to mention `--ansi never` as alternative
+
+- Bump docker-py to 5.0.0
+
+- Bump PyYAML to 5.4.1
+
+- Bump python-dotenv to 0.17.0
+
+1.28.6 (2021-03-23)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/57?closed=1)
+
+### Bugs
+
+- Make `--env-file` relative to the current working directory and error out for invalid paths. Environment file paths set with `--env-file` are relative to the current working directory while the default `.env` file is located in the project directory which by default is the base directory of the Compose file.
+
+- Fix missing service property `storage_opt` by updating the compose schema
+
+- Fix build `extra_hosts` list format
+
+- Remove extra error message on `exec`
+
+### Miscellaneous
+
+- Add `compose.yml` and `compose.yaml` to default filename list
+
+1.28.5 (2021-02-25)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/55?closed=1)
+
+### Bugs
+
+- Fix OpenSSL version mismatch error when shelling out to the ssh client (via bump to docker-py 4.4.4 which contains the fix)
+
+- Add missing build flags to the native builder: `platform`, `isolation` and `extra_hosts`
+
+- Remove info message on native build
+
+- Avoid fetching logs when service logging driver is set to 'none'
+
+1.28.4 (2021-02-18)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/54?closed=1)
+
+### Bugs
+
+- Fix SSH port parsing by bumping docker-py to 4.4.3
+
+### Miscellaneous
+
+- Bump Python to 3.7.10
+
+1.28.3 (2021-02-17)
+-------------------
+
+[List of PRs / issues for this release](https://github.com/docker/compose/milestone/53?closed=1)
+
+### Bugs
+
+- Fix SSH hostname parsing when it contains leading s/h, and remove the quiet option that was hiding the error (via docker-py bump to 4.4.2)
+
+- Fix key error for '--no-log-prefix' option
+
+- Fix incorrect CLI environment variable name for service profiles: `COMPOSE_PROFILES` instead of `COMPOSE_PROFILE`
+
+- Fix fish completion
+
+### Miscellaneous
+
+- Bump cryptography to 3.3.2
+
+- Remove log driver filter
+
+1.28.2 (2021-01-26)
+-------------------
+
+### Miscellaneous
+
+- CI setup update
+
+1.28.1 (2021-01-25)
+-------------------
+
+### Bugs
+
+- Revert to Python 3.7 bump for Linux static builds
+
+- Add bash completion for `docker-compose logs|up --no-log-prefix`
+
+1.28.0 (2021-01-20)
+-------------------
+
+### Features
+
+- Support for Nvidia GPUs via device requests
+
+- Support for service profiles
+
+- Change the SSH connection approach to the Docker CLI's via shellout to the local SSH client (old behaviour enabled by setting `COMPOSE_PARAMIKO_SSH` environment variable)
+
+- Add flag to disable log prefix
+
+- Add flag for ansi output control
+
+### Bugs
+
+- Make `parallel_pull=True` by default
+
+- Bring back warning for configs in non-swarm mode
+
+- Take `--file` in account when defining `project_dir`
+
+- On `compose up`, attach only to services we read logs from
+
+### Miscellaneous
+
+- Make COMPOSE_DOCKER_CLI_BUILD=1 the default
+
+- Add usage metrics
+
+- Sync schema with COMPOSE specification
+
+- Improve failure report for missing mandatory environment variables
+
+- Bump attrs to 20.3.0
+
+- Bump more_itertools to 8.6.0
+
+- Bump cryptography to 3.2.1
+
+- Bump cffi to 1.14.4
+
+- Bump virtualenv to 20.2.2
+
+- Bump bcrypt to 3.2.0
+
+- Bump gitpython to 3.1.11
+
+- Bump docker-py to 4.4.1
+
+- Bump Python to 3.9
+
+- Linux: bump Debian base image from stretch to buster (required for Python 3.9)
+
+- macOS: OpenSSL 1.1.1g to 1.1.1h, Python 3.7.7 to 3.9.0
+
+- Bump pyinstaller 4.1
+
+- Loosen restriction on base images to latest minor
+
+- Updates of READMEs
+
+
 1.27.4 (2020-09-24)
 -------------------
Dockerfile (14 lines changed)
@@ -1,13 +1,13 @@
 ARG DOCKER_VERSION=19.03
-ARG PYTHON_VERSION=3.9.0
+ARG PYTHON_VERSION=3.7.10
 
 ARG BUILD_ALPINE_VERSION=3.12
 ARG BUILD_CENTOS_VERSION=7
-ARG BUILD_DEBIAN_VERSION=slim-buster
+ARG BUILD_DEBIAN_VERSION=slim-stretch
 
 ARG RUNTIME_ALPINE_VERSION=3.12
 ARG RUNTIME_CENTOS_VERSION=7
-ARG RUNTIME_DEBIAN_VERSION=buster-slim
+ARG RUNTIME_DEBIAN_VERSION=stretch-slim
 
 ARG DISTRO=alpine
@@ -38,7 +38,7 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
     git \
     libc-dev \
     libffi-dev \
-    libgcc-8-dev \
+    libgcc-6-dev \
     libssl-dev \
     make \
     openssl \
@@ -68,8 +68,8 @@ WORKDIR /code/
 COPY docker-compose-entrypoint.sh /usr/local/bin/
 COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
 RUN pip install \
-    virtualenv==20.2.2 \
-    tox==3.20.1
+    virtualenv==20.4.0 \
+    tox==3.21.2
 COPY requirements-dev.txt .
 COPY requirements-indirect.txt .
 COPY requirements.txt .
@@ -79,7 +79,7 @@ COPY tox.ini .
 COPY setup.py .
 COPY README.md .
 COPY compose compose/
-RUN tox --notest
+RUN tox -e py37 --notest
 COPY . .
 ARG GIT_COMMIT=unknown
 ENV DOCKER_COMPOSE_GITSHA=$GIT_COMMIT
Jenkinsfile (vendored, 14 lines changed)
@@ -2,7 +2,7 @@
 
 def dockerVersions = ['19.03.13']
 def baseImages = ['alpine', 'debian']
-def pythonVersions = ['py39']
+def pythonVersions = ['py37']
 
 pipeline {
     agent none
@@ -23,7 +23,7 @@ pipeline {
         parallel {
             stage('alpine') {
                 agent {
-                    label 'ubuntu && amd64 && !zfs'
+                    label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
                 }
                 steps {
                     buildImage('alpine')
@@ -31,7 +31,7 @@ pipeline {
             }
             stage('debian') {
                 agent {
-                    label 'ubuntu && amd64 && !zfs'
+                    label 'ubuntu-2004 && amd64 && !zfs && cgroup1'
                 }
                 steps {
                     buildImage('debian')
@@ -62,7 +62,7 @@ pipeline {
 
 def buildImage(baseImage) {
     def scmvar = checkout(scm)
-    def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+    def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
     image = docker.image(imageName)
 
     withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@ -87,9 +87,9 @@ def buildImage(baseImage) {
 def runTests(dockerVersion, pythonVersion, baseImage) {
     return {
         stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
-            node("ubuntu && amd64 && !zfs") {
+            node("ubuntu-2004 && amd64 && !zfs && cgroup1") {
                 def scmvar = checkout(scm)
-                def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+                def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
                 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
                 echo "Using local system's storage driver: ${storageDriver}"
                 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@ -99,6 +99,8 @@ def runTests(dockerVersion, pythonVersion, baseImage) {
                     --privileged \\
                     --volume="\$(pwd)/.git:/code/.git" \\
                     --volume="/var/run/docker.sock:/var/run/docker.sock" \\
+                    --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
+                    -e "DOCKER_TLS_CERTDIR=" \\
                     -e "TAG=${imageName}" \\
                     -e "STORAGE_DRIVER=${storageDriver}" \\
                     -e "DOCKER_VERSIONS=${dockerVersion}" \\
@@ -2,7 +2,7 @@
 
 def dockerVersions = ['19.03.13', '18.09.9']
 def baseImages = ['alpine', 'debian']
-def pythonVersions = ['py39']
+def pythonVersions = ['py37']
 
 pipeline {
     agent none
@@ -23,7 +23,7 @@ pipeline {
         parallel {
             stage('alpine') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     buildImage('alpine')
@@ -31,7 +31,7 @@ pipeline {
             }
             stage('debian') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     buildImage('debian')
@@ -41,7 +41,7 @@ pipeline {
             }
             stage('Test') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     // TODO use declarative 1.5.0 `matrix` once available on CI
@@ -61,7 +61,7 @@ pipeline {
             }
             stage('Generate Changelog') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     checkout scm
@@ -98,7 +98,7 @@ pipeline {
             }
             stage('linux binary') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     checkout scm
@@ -134,7 +134,7 @@ pipeline {
             }
             stage('alpine image') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     buildRuntimeImage('alpine')
@@ -142,7 +142,7 @@ pipeline {
             }
             stage('debian image') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     buildRuntimeImage('debian')
@@ -157,7 +157,7 @@ pipeline {
         parallel {
             stage('Pushing images') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 steps {
                     pushRuntimeImage('alpine')
@@ -166,7 +166,7 @@ pipeline {
             }
             stage('Creating Github Release') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 environment {
                     GITHUB_TOKEN = credentials('github-release-token')
@@ -198,7 +198,7 @@ pipeline {
             }
             stage('Publishing Python packages') {
                 agent {
-                    label 'linux && docker && ubuntu-2004'
+                    label 'linux && docker && ubuntu-2004 && amd64 && cgroup1'
                 }
                 environment {
                     PYPIRC = credentials('pypirc-docker-dsg-cibot')
@@ -222,7 +222,7 @@ pipeline {
 
 def buildImage(baseImage) {
     def scmvar = checkout(scm)
-    def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+    def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
     image = docker.image(imageName)
 
     withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@ -247,9 +247,9 @@ def buildImage(baseImage) {
 def runTests(dockerVersion, pythonVersion, baseImage) {
     return {
         stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
-            node("linux && docker && ubuntu-2004") {
+            node("linux && docker && ubuntu-2004 && amd64 && cgroup1") {
                 def scmvar = checkout(scm)
-                def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
+                def imageName = "dockerpinata/compose:${baseImage}-${scmvar.GIT_COMMIT}"
                 def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()
                 echo "Using local system's storage driver: ${storageDriver}"
                 withDockerRegistry(credentialsId:'dockerbuildbot-index.docker.io') {
@@ -259,6 +259,8 @@ def runTests(dockerVersion, pythonVersion, baseImage) {
                     --privileged \\
                     --volume="\$(pwd)/.git:/code/.git" \\
                     --volume="/var/run/docker.sock:/var/run/docker.sock" \\
+                    --volume="\${DOCKER_CONFIG}/config.json:/root/.docker/config.json" \\
+                    -e "DOCKER_TLS_CERTDIR=" \\
                     -e "TAG=${imageName}" \\
                     -e "STORAGE_DRIVER=${storageDriver}" \\
                     -e "DOCKER_VERSIONS=${dockerVersion}" \\
@@ -1 +1 @@
-__version__ = '1.28.0dev'
+__version__ = '1.29.2'
@@ -129,7 +129,7 @@ def get_profiles_from_options(options, environment):
     if profile_option:
         return profile_option
 
-    profiles = environment.get('COMPOSE_PROFILE')
+    profiles = environment.get('COMPOSE_PROFILES')
     if profiles:
         return profiles.split(',')
 
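The hunk above fixes the environment variable name read for service profiles; the resolution order it implements can be sketched like this (a simplified stand-in for the real function, with hypothetical argument names):

```python
def get_profiles(profile_option, environment):
    """Resolve active profiles: an explicit --profile option wins;
    otherwise fall back to the COMPOSE_PROFILES environment variable,
    split on commas; otherwise no profiles are active."""
    if profile_option:
        return profile_option
    profiles = environment.get('COMPOSE_PROFILES')
    if profiles:
        return profiles.split(',')
    return []
```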
@@ -158,10 +158,8 @@ class QueueItem(namedtuple('_QueueItem', 'item is_stop exc')):
 
 
 def tail_container_logs(container, presenter, queue, log_args):
-    generator = get_log_generator(container)
-
     try:
-        for item in generator(container, log_args):
+        for item in build_log_generator(container, log_args):
             queue.put(QueueItem.new(presenter.present(container, item)))
     except Exception as e:
         queue.put(QueueItem.exception(e))
@@ -171,20 +169,6 @@ def tail_container_logs(container, presenter, queue, log_args):
         queue.put(QueueItem.stop(container.name))
 
 
-def get_log_generator(container):
-    if container.has_api_logs:
-        return build_log_generator
-    return build_no_log_generator
-
-
-def build_no_log_generator(container, log_args):
-    """Return a generator that prints a warning about logs and waits for
-    container to exit.
-    """
-    yield "WARNING: no logs are available with the '{}' log driver\n".format(
-        container.log_driver)
-
-
 def build_log_generator(container, log_args):
     # if the container doesn't have a log_stream we need to attach to container
     # before log printer starts running
@@ -23,6 +23,7 @@ from ..config import resolve_build_args
 from ..config.environment import Environment
 from ..config.serialize import serialize_config
 from ..config.types import VolumeSpec
+from ..const import IS_LINUX_PLATFORM
 from ..const import IS_WINDOWS_PLATFORM
 from ..errors import StreamParseError
 from ..metrics.decorator import metrics
@@ -78,8 +79,10 @@ def main():  # noqa: C901
     try:
         command_func = dispatch()
         command_func()
+        if not IS_LINUX_PLATFORM and command == 'help':
+            print("\nDocker Compose is now in the Docker CLI, try `docker compose` help")
     except (KeyboardInterrupt, signals.ShutdownException):
-        exit_with_metrics(command, "Aborting.", status=Status.FAILURE)
+        exit_with_metrics(command, "Aborting.", status=Status.CANCELED)
     except (UserError, NoSuchService, ConfigurationError,
             ProjectError, OperationFailedError) as e:
         exit_with_metrics(command, e.msg, status=Status.FAILURE)
@@ -98,7 +101,10 @@ def main():  # noqa: C901
                     e.service.name), status=Status.FAILURE)
     except NoSuchCommand as e:
         commands = "\n".join(parse_doc_section("commands:", getdoc(e.supercommand)))
-        exit_with_metrics(e.command, "No such command: {}\n\n{}".format(e.command, commands))
+        if not IS_LINUX_PLATFORM:
+            commands += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`"
+        exit_with_metrics("", log_msg="No such command: {}\n\n{}".format(
+            e.command, commands), status=Status.FAILURE)
     except (errors.ConnectionError, StreamParseError):
         exit_with_metrics(command, status=Status.FAILURE)
     except SystemExit as e:
@@ -116,6 +122,10 @@ def main():  # noqa: C901
         code = 0
         if isinstance(e.code, int):
             code = e.code
+
+        if not IS_LINUX_PLATFORM and not command:
+            msg += "\n\nDocker Compose is now in the Docker CLI, try `docker compose`"
+
         exit_with_metrics(command, log_msg=msg, status=status,
                           exit_code=code)
@@ -128,7 +138,7 @@ def get_filtered_args(args):
 
 
 def exit_with_metrics(command, log_msg=None, status=Status.SUCCESS, exit_code=1):
-    if log_msg:
+    if log_msg and command != 'exec':
         if not exit_code:
             log.info(log_msg)
         else:
@@ -162,7 +172,8 @@ def dispatch():
     if options.get("--no-ansi"):
         if options.get("--ansi"):
            raise UserError("--no-ansi and --ansi cannot be combined.")
-        log.warning('--no-ansi option is deprecated and will be removed in future versions.')
+        log.warning('--no-ansi option is deprecated and will be removed in future versions. '
+                    'Use `--ansi never` instead.')
         ansi_mode = AnsiMode.NEVER
 
     setup_console_handler(console_handler,
@@ -381,6 +392,7 @@ class TopLevelCommand:
         --no-interpolate         Don't interpolate environment variables.
         -q, --quiet              Only validate the configuration, don't print
                                  anything.
+        --profiles               Print the profile names, one per line.
        --services               Print the service names, one per line.
        --volumes                Print the volume names, one per line.
        --hash="*"               Print the service config hash, one per line.
@@ -400,6 +412,15 @@ class TopLevelCommand:
         if options['--quiet']:
             return
 
+        if options['--profiles']:
+            profiles = set()
+            for service in compose_config.services:
+                if 'profiles' in service:
+                    for profile in service['profiles']:
+                        profiles.add(profile)
+            print('\n'.join(sorted(profiles)))
+            return
+
         if options['--services']:
             print('\n'.join(service['name'] for service in compose_config.services))
             return
@@ -691,7 +712,7 @@ class TopLevelCommand:
         -t, --timestamps     Show timestamps.
         --tail="all"         Number of lines to show from the end of the logs
                              for each container.
-        --no-log-prefix      Don't print prefix in logs.
+        --no-log-prefix      Don't print prefix in logs.
         """
         containers = self.project.containers(service_names=options['SERVICE'], stopped=True)
@@ -1109,7 +1130,7 @@ class TopLevelCommand:
                                    container. Implies --abort-on-container-exit.
         --scale SERVICE=NUM        Scale SERVICE to NUM instances. Overrides the
                                    `scale` setting in the Compose file if present.
-        --no-log-prefix            Don't print prefix in logs.
+        --no-log-prefix            Don't print prefix in logs.
         """
         start_deps = not options['--no-deps']
         always_recreate_deps = options['--always-recreate-deps']
@@ -1121,7 +1142,7 @@ class TopLevelCommand:
         detached = options.get('--detach')
         no_start = options.get('--no-start')
         attach_dependencies = options.get('--attach-dependencies')
-        keep_prefix = not options['--no-log-prefix']
+        keep_prefix = not options.get('--no-log-prefix')
 
         if detached and (cascade_stop or exit_value_from or attach_dependencies):
             raise UserError(
@@ -1482,7 +1503,7 @@ def log_printer_from_project(
     keep_prefix=True,
 ):
     return LogPrinter(
-        containers,
+        [c for c in containers if c.log_driver not in (None, 'none')],
         build_log_presenters(project.service_names, monochrome, keep_prefix),
         event_stream or project.events(),
         cascade_stop=cascade_stop,
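The `log_printer_from_project` change above stops the log printer from attaching to containers whose logging driver cannot produce logs. A minimal sketch of that filter, with `(name, driver)` tuples standing in for the real Container objects:

```python
def loggable(containers):
    """Keep only containers whose logging driver can yield logs;
    'none' (or an unset driver) is skipped so the log printer never
    attaches to them. Each container is modeled as (name, log_driver)."""
    return [c for c in containers if c[1] not in (None, 'none')]
```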
@@ -188,7 +188,7 @@
     "properties": {
         "condition": {
             "type": "string",
-            "enum": ["service_started", "service_healthy"]
+            "enum": ["service_started", "service_healthy", "service_completed_successfully"]
         }
     },
     "required": ["condition"]
@@ -335,7 +335,6 @@
     "read_only": {"type": "boolean"},
     "restart": {"type": "string"},
     "runtime": {
-        "deprecated": true,
         "type": "string"
     },
     "scale": {
@@ -367,6 +366,7 @@
     "stdin_open": {"type": "boolean"},
     "stop_grace_period": {"type": "string", "format": "duration"},
     "stop_signal": {"type": "string"},
+    "storage_opt": {"type": "object"},
     "tmpfs": {"$ref": "#/definitions/string_or_list"},
     "tty": {"type": "boolean"},
     "ulimits": {
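After the schema change, three `depends_on` conditions are valid. A small sketch of the allowed set (illustrative names, not the real jsonschema machinery):

```python
# The enum values accepted by the updated schema for a depends_on entry.
ALLOWED_CONDITIONS = {
    'service_started',
    'service_healthy',
    'service_completed_successfully',
}

def validate_condition(condition):
    """Reject any condition string outside the schema's enum."""
    if condition not in ALLOWED_CONDITIONS:
        raise ValueError('invalid depends_on condition: {}'.format(condition))
    return condition
```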
@@ -10,7 +10,11 @@ from operator import attrgetter
 from operator import itemgetter
 
 import yaml
-from cached_property import cached_property
+
+try:
+    from functools import cached_property
+except ImportError:
+    from cached_property import cached_property
 
 from . import types
 from ..const import COMPOSE_SPEC as VERSION
@@ -149,9 +153,14 @@ DOCKER_VALID_URL_PREFIXES = (
 SUPPORTED_FILENAMES = [
     'docker-compose.yml',
     'docker-compose.yaml',
+    'compose.yml',
+    'compose.yaml',
 ]
 
-DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml', 'docker-compose.override.yaml')
+DEFAULT_OVERRIDE_FILENAMES = ('docker-compose.override.yml',
+                              'docker-compose.override.yaml',
+                              'compose.override.yml',
+                              'compose.override.yaml')
 
 
 log = logging.getLogger(__name__)
@@ -304,7 +313,16 @@ def find(base_dir, filenames, environment, override_dir=None):
     if filenames:
         filenames = [os.path.join(base_dir, f) for f in filenames]
     else:
+        # search for compose files in the base dir and its parents
         filenames = get_default_config_files(base_dir)
+        if not filenames and not override_dir:
+            # none found in base_dir and no override_dir defined
+            raise ComposeFileNotFound(SUPPORTED_FILENAMES)
+        if not filenames:
+            # search for compose files in the project directory and its parents
+            filenames = get_default_config_files(override_dir)
+            if not filenames:
+                raise ComposeFileNotFound(SUPPORTED_FILENAMES)
 
     log.debug("Using configuration files: {}".format(",".join(filenames)))
     return ConfigDetails(
@@ -335,7 +353,7 @@ def get_default_config_files(base_dir):
     (candidates, path) = find_candidates_in_parent_dirs(SUPPORTED_FILENAMES, base_dir)
 
     if not candidates:
-        raise ComposeFileNotFound(SUPPORTED_FILENAMES)
+        return None
 
     winner = candidates[0]
@@ -556,8 +574,7 @@ def process_config_section(config_file, config, section, environment, interpolate):
             config_file.version,
             config,
             section,
-            environment
-        )
+            environment)
     else:
         return config
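`get_default_config_files` relies on a parent-directory walk to locate the first directory containing a supported compose file. A simplified, testable sketch of that lookup (the injectable `exists` parameter is an assumption added for testing, not part of the real helper):

```python
import os

SUPPORTED_FILENAMES = [
    'docker-compose.yml', 'docker-compose.yaml',
    'compose.yml', 'compose.yaml',
]

def find_in_parent_dirs(filenames, path, exists=os.path.exists):
    """Walk from `path` toward the filesystem root; return the first
    (candidates, directory) pair where any supported filename exists,
    or ([], None) when the root is reached without a match."""
    candidates = [f for f in filenames if exists(os.path.join(path, f))]
    if candidates:
        return candidates, path
    parent = os.path.dirname(os.path.abspath(path))
    if parent == os.path.abspath(path):  # reached the filesystem root
        return [], None
    return find_in_parent_dirs(filenames, parent, exists)
```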
@@ -54,9 +54,10 @@ class Environment(dict):
         if base_dir is None:
             return result
         if env_file:
-            env_file_path = os.path.join(base_dir, env_file)
-        else:
-            env_file_path = os.path.join(base_dir, '.env')
-        return cls(env_vars_from_file(env_file_path))
+            env_file_path = os.path.join(os.getcwd(), env_file)
+            return cls(env_vars_from_file(env_file_path))
+
+        env_file_path = os.path.join(base_dir, '.env')
+        try:
+            return cls(env_vars_from_file(env_file_path))
+        except EnvFileNotFound:
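The hunk above makes a `--env-file` path resolve against the current working directory, while the default `.env` stays anchored to the project directory. Sketched in isolation (the `cwd` parameter is added here only for testability):

```python
import os

def resolve_env_file(base_dir, env_file=None, cwd=None):
    """An explicit --env-file is resolved against the working directory;
    the implicit .env still lives in the project directory (base_dir)."""
    cwd = cwd or os.getcwd()
    if env_file:
        return os.path.join(cwd, env_file)
    return os.path.join(base_dir, '.env')
```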
@@ -243,6 +243,7 @@ class ConversionMap:
         service_path('healthcheck', 'disable'): to_boolean,
         service_path('deploy', 'labels', PATH_JOKER): to_str,
         service_path('deploy', 'replicas'): to_int,
+        service_path('deploy', 'placement', 'max_replicas_per_node'): to_int,
         service_path('deploy', 'resources', 'limits', "cpus"): to_float,
         service_path('deploy', 'update_config', 'parallelism'): to_int,
         service_path('deploy', 'update_config', 'max_failure_ratio'): to_float,
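The added entry coerces `placement.max_replicas_per_node` to an integer before it reaches the Engine API. The path-keyed converter idea in miniature (`CONVERSIONS` and `convert` are illustrative names, not upstream API):

```python
def to_int(v):
    return int(v)

def to_float(v):
    return float(v)

# Values parsed from YAML arrive as strings; each config path that needs
# a typed value gets a converter, mirroring ConversionMap's approach.
CONVERSIONS = {
    ('deploy', 'replicas'): to_int,
    ('deploy', 'placement', 'max_replicas_per_node'): to_int,  # the fixed entry
    ('deploy', 'resources', 'limits', 'cpus'): to_float,
}

def convert(path, value):
    converter = CONVERSIONS.get(path)
    return converter(value) if converter else value
```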
@@ -5,6 +5,7 @@ from .version import ComposeVersion
 DEFAULT_TIMEOUT = 10
 HTTP_TIMEOUT = 60
 IS_WINDOWS_PLATFORM = (sys.platform == "win32")
+IS_LINUX_PLATFORM = (sys.platform == "linux")
 LABEL_CONTAINER_NUMBER = 'com.docker.compose.container-number'
 LABEL_ONE_OFF = 'com.docker.compose.oneoff'
 LABEL_PROJECT = 'com.docker.compose.project'
@@ -186,11 +186,6 @@ class Container:
     def log_driver(self):
         return self.get('HostConfig.LogConfig.Type')
 
-    @property
-    def has_api_logs(self):
-        log_type = self.log_driver
-        return not log_type or log_type in ('json-file', 'journald', 'local')
-
     @property
     def human_readable_health_status(self):
         """ Generate UP status string with up time and health
@@ -204,11 +199,7 @@ class Container:
         return status_string
 
     def attach_log_stream(self):
-        """A log stream can only be attached if the container uses a
-        json-file, journald or local log driver.
-        """
-        if self.has_api_logs:
-            self.log_stream = self.attach(stdout=True, stderr=True, stream=True)
+        self.log_stream = self.attach(stdout=True, stderr=True, stream=True)
 
     def get(self, key):
         """Return a value from the container or None if the value is not set.
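The removed `has_api_logs` property classified log drivers as shown below; after this change, the equivalent filtering happens in the log printer (by `log_driver not in (None, 'none')`) instead of guarding each attach. A standalone sketch of the removed check:

```python
def has_api_logs(log_driver):
    """The Engine API can stream logs when no driver is set (engine
    default applies) or when the driver stores logs locally."""
    return not log_driver or log_driver in ('json-file', 'journald', 'local')
```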
@@ -27,3 +27,8 @@ class NoHealthCheckConfigured(HealthCheckException):
             service_name
         )
     )
+
+
+class CompletedUnsuccessfully(Exception):
+    def __init__(self, container_id, exit_code):
+        self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code)
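The new `CompletedUnsuccessfully` error backs the `service_completed_successfully` condition: a dependent only starts when the dependency exited with code 0. A sketch of that gate (`check_completed` is an illustrative helper, not upstream code):

```python
class CompletedUnsuccessfully(Exception):
    """Raised when a depends_on dependency with condition
    service_completed_successfully exits with a nonzero code."""
    def __init__(self, container_id, exit_code):
        self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code)

def check_completed(container_id, exit_code):
    # Exit code 0 lets dependents proceed; anything else blocks them.
    if exit_code != 0:
        raise CompletedUnsuccessfully(container_id, exit_code)
    return True
```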
@@ -36,7 +36,7 @@ class MetricsCommand(requests.Session):
                  context_type=None, status=Status.SUCCESS,
                  source=MetricsSource.CLI, uri=None):
         super().__init__()
-        self.command = "compose " + command if command else "compose --help"
+        self.command = ("compose " + command).strip() if command else "compose --help"
         self.context = context_type or ContextAPI.get_current_context().context_type or 'moby'
         self.source = source
         self.status = status.value
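The `MetricsCommand` fix normalizes the recorded command string with `.strip()`. In isolation:

```python
def metrics_command(command):
    """Build the command string recorded by usage metrics: .strip()
    removes stray leading/trailing whitespace from the subcommand,
    and a missing command still reports "compose --help"."""
    return ("compose " + command).strip() if command else "compose --help"
```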
@@ -16,6 +16,7 @@ from compose.cli.colors import green
|
||||
from compose.cli.colors import red
|
||||
from compose.cli.signals import ShutdownException
|
||||
from compose.const import PARALLEL_LIMIT
|
||||
from compose.errors import CompletedUnsuccessfully
|
||||
from compose.errors import HealthCheckFailed
|
||||
from compose.errors import NoHealthCheckConfigured
|
||||
from compose.errors import OperationFailedError
|
||||
@@ -61,7 +62,8 @@ def parallel_execute_watch(events, writer, errors, results, msg, get_name, fail_
|
||||
elif isinstance(exception, APIError):
|
||||
errors[get_name(obj)] = exception.explanation
|
||||
writer.write(msg, get_name(obj), 'error', red)
|
||||
elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured)):
|
||||
elif isinstance(exception, (OperationFailedError, HealthCheckFailed, NoHealthCheckConfigured,
|
||||
CompletedUnsuccessfully)):
|
||||
errors[get_name(obj)] = exception.msg
|
||||
writer.write(msg, get_name(obj), 'error', red)
|
||||
elif isinstance(exception, UpstreamError):
|
||||
@@ -241,6 +243,12 @@ def feed_queue(objects, func, get_deps, results, state, limiter):
|
||||
'not processing'.format(obj)
|
||||
)
|
||||
results.put((obj, None, e))
|
||||
            except CompletedUnsuccessfully as e:
                log.debug(
                    'Service(s) upstream of {} did not complete successfully - '
                    'not processing'.format(obj)
                )
                results.put((obj, None, e))

        if state.is_done():
            results.put(STOP)

@@ -490,8 +490,6 @@ class Project:
                log.info('%s uses an image, skipping' % service.name)

        if cli:
            log.info("Building with native build. Learn about native build in Compose here: "
                     "https://docs.docker.com/go/compose-native-build/")
        if parallel_build:
            log.warning("Flag '--parallel' is ignored when building with "
                        "COMPOSE_DOCKER_CLI_BUILD=1")

@@ -651,10 +649,6 @@ class Project:
            override_options=None,
    ):

        if cli:
            log.info("Building with native build. Learn about native build in Compose here: "
                     "https://docs.docker.com/go/compose-native-build/")

        self.initialize()
        if not ignore_orphans:
            self.find_orphan_containers(remove_orphans)

@@ -1,6 +1,5 @@
import enum
import itertools
import json
import logging
import os
import re

@@ -45,6 +44,7 @@ from .const import LABEL_VERSION
from .const import NANOCPUS_SCALE
from .const import WINDOWS_LONGPATH_PREFIX
from .container import Container
from .errors import CompletedUnsuccessfully
from .errors import HealthCheckFailed
from .errors import NoHealthCheckConfigured
from .errors import OperationFailedError

@@ -112,6 +112,7 @@ HOST_CONFIG_KEYS = [

CONDITION_STARTED = 'service_started'
CONDITION_HEALTHY = 'service_healthy'
CONDITION_COMPLETED_SUCCESSFULLY = 'service_completed_successfully'


class BuildError(Exception):

@@ -712,6 +713,7 @@ class Service:
            'image_id': image_id(),
            'links': self.get_link_names(),
            'net': self.network_mode.id,
            'ipc_mode': self.ipc_mode.mode,
            'networks': self.networks,
            'secrets': self.secrets,
            'volumes_from': [

@@ -753,6 +755,8 @@ class Service:
                configs[svc] = lambda s: True
            elif config['condition'] == CONDITION_HEALTHY:
                configs[svc] = lambda s: s.is_healthy()
            elif config['condition'] == CONDITION_COMPLETED_SUCCESSFULLY:
                configs[svc] = lambda s: s.is_completed_successfully()
            else:
                # The config schema already prevents this, but it might be
                # bypassed if Compose is called programmatically.
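The branch added above extends the condition-to-predicate mapping Compose consults when resolving `depends_on`. A self-contained sketch of that dispatch pattern (the fake service class and function name are illustrative, not Compose's real API):

```python
CONDITION_STARTED = 'service_started'
CONDITION_HEALTHY = 'service_healthy'
CONDITION_COMPLETED_SUCCESSFULLY = 'service_completed_successfully'

def build_condition_checks(dependencies):
    # Map each dependency to a readiness predicate, as the code above does.
    checks = {}
    for svc, config in dependencies.items():
        condition = config['condition']
        if condition == CONDITION_STARTED:
            checks[svc] = lambda s: True
        elif condition == CONDITION_HEALTHY:
            checks[svc] = lambda s: s.is_healthy()
        elif condition == CONDITION_COMPLETED_SUCCESSFULLY:
            checks[svc] = lambda s: s.is_completed_successfully()
        else:
            raise ValueError('depends_on condition "{}" is invalid.'.format(condition))
    return checks

class FakeService:
    def is_healthy(self):
        return True

    def is_completed_successfully(self):
        return True

checks = build_condition_checks({
    'db': {'condition': CONDITION_HEALTHY},
    'migrate': {'condition': CONDITION_COMPLETED_SUCCESSFULLY},
})
print(all(check(FakeService()) for check in checks.values()))  # True
```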
@@ -1103,8 +1107,9 @@ class Service:
                'Impossible to perform platform-targeted builds for API version < 1.35'
            )

        builder = self.client if not cli else _CLIBuilder(progress)
        build_output = builder.build(
        builder = _ClientBuilder(self.client) if not cli else _CLIBuilder(progress)
        return builder.build(
            service=self,
            path=path,
            tag=self.image_name,
            rm=rm,

@@ -1125,30 +1130,7 @@ class Service:
            gzip=gzip,
            isolation=build_opts.get('isolation', self.options.get('isolation', None)),
            platform=self.platform,
        )

        try:
            all_events = list(stream_output(build_output, output_stream))
        except StreamOutputError as e:
            raise BuildError(self, str(e))

        # Ensure the HTTP connection is not reused for another
        # streaming command, as the Docker daemon can sometimes
        # complain about it
        self.client.close()

        image_id = None

        for event in all_events:
            if 'stream' in event:
                match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', ''))
                if match:
                    image_id = match.group(1)

        if image_id is None:
            raise BuildError(self, event if all_events else 'Unknown')

        return image_id
            output_stream=output_stream)

    def get_cache_from(self, build_opts):
        cache_from = build_opts.get('cache_from', None)

@@ -1304,6 +1286,21 @@ class Service:
            raise HealthCheckFailed(ctnr.short_id)
        return result

    def is_completed_successfully(self):
        """ Check that all containers for this service have completed successfully.

        Returns False if at least one container has not exited yet, and
        raises a CompletedUnsuccessfully exception if at least one container
        exited with a non-zero exit code.
        """
        result = True
        for ctnr in self.containers(stopped=True):
            ctnr.inspect()
            if ctnr.get('State.Status') != 'exited':
                result = False
            elif ctnr.exit_code != 0:
                raise CompletedUnsuccessfully(ctnr.short_id, ctnr.exit_code)
        return result
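The semantics of this check reduce to a small decision over (status, exit code) pairs: not-yet-exited means "keep waiting", a non-zero exit means "fail the dependents". A sketch over plain tuples rather than inspected containers (illustrative only):

```python
class CompletedUnsuccessfully(Exception):
    def __init__(self, container_id, exit_code):
        self.msg = 'Container "{}" exited with code {}.'.format(container_id, exit_code)

def all_completed_successfully(states):
    # states: iterable of (container_id, status, exit_code) tuples.
    result = True
    for container_id, status, exit_code in states:
        if status != 'exited':
            result = False  # still running: dependents must keep waiting
        elif exit_code != 0:
            raise CompletedUnsuccessfully(container_id, exit_code)
    return result

print(all_completed_successfully([('aaa', 'exited', 0)]))   # True
print(all_completed_successfully([('aaa', 'running', 0)]))  # False
```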

    def _parse_proxy_config(self):
        client = self.client
        if 'proxies' not in client._general_configs:

@@ -1790,20 +1787,77 @@ def rewrite_build_path(path):
    return path


class _CLIBuilder:
    def __init__(self, progress):
        self._progress = progress
class _ClientBuilder:
    def __init__(self, client):
        self.client = client

    def build(self, path, tag=None, quiet=False, fileobj=None,
    def build(self, service, path, tag=None, quiet=False, fileobj=None,
              nocache=False, rm=False, timeout=None,
              custom_context=False, encoding=None, pull=False,
              forcerm=False, dockerfile=None, container_limits=None,
              decode=False, buildargs=None, gzip=False, shmsize=None,
              labels=None, cache_from=None, target=None, network_mode=None,
              squash=None, extra_hosts=None, platform=None, isolation=None,
              use_config_proxy=True):
              use_config_proxy=True, output_stream=sys.stdout):
        build_output = self.client.build(
            path=path,
            tag=tag,
            nocache=nocache,
            rm=rm,
            pull=pull,
            forcerm=forcerm,
            dockerfile=dockerfile,
            labels=labels,
            cache_from=cache_from,
            buildargs=buildargs,
            network_mode=network_mode,
            target=target,
            shmsize=shmsize,
            extra_hosts=extra_hosts,
            container_limits=container_limits,
            gzip=gzip,
            isolation=isolation,
            platform=platform)

        try:
            all_events = list(stream_output(build_output, output_stream))
        except StreamOutputError as e:
            raise BuildError(service, str(e))

        # Ensure the HTTP connection is not reused for another
        # streaming command, as the Docker daemon can sometimes
        # complain about it
        self.client.close()

        image_id = None

        for event in all_events:
            if 'stream' in event:
                match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', ''))
                if match:
                    image_id = match.group(1)

        if image_id is None:
            raise BuildError(service, event if all_events else 'Unknown')

        return image_id
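The event-scanning loop above recovers the image id from the classic builder's streamed output; isolated, it is just a regex scan over JSON build events:

```python
import re

def extract_image_id(events):
    # Return the last image id announced by a "Successfully built" line,
    # or None when no such event was streamed.
    image_id = None
    for event in events:
        if 'stream' in event:
            match = re.search(r'Successfully built ([0-9a-f]+)', event.get('stream', ''))
            if match:
                image_id = match.group(1)
    return image_id

events = [
    {'stream': 'Step 2/2 : RUN true\n'},
    {'stream': 'Successfully built 0123abcd\n'},
]
print(extract_image_id(events))  # 0123abcd
```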


class _CLIBuilder:
    def __init__(self, progress):
        self._progress = progress

    def build(self, service, path, tag=None, quiet=False, fileobj=None,
              nocache=False, rm=False, timeout=None,
              custom_context=False, encoding=None, pull=False,
              forcerm=False, dockerfile=None, container_limits=None,
              decode=False, buildargs=None, gzip=False, shmsize=None,
              labels=None, cache_from=None, target=None, network_mode=None,
              squash=None, extra_hosts=None, platform=None, isolation=None,
              use_config_proxy=True, output_stream=sys.stdout):
        """
        Args:
            service (Service): Service to be built
            path (str): Path to the directory containing the Dockerfile
            buildargs (dict): A dictionary of build arguments
            cache_from (:py:class:`list`): A list of images used for build
@@ -1852,10 +1906,11 @@ class _CLIBuilder:
                configuration file (``~/.docker/config.json`` by default)
                contains a proxy configuration, the corresponding environment
                variables will be set in the container being built.
            output_stream (writer): stream to use for build logs
        Returns:
            A generator for the build output.
        """
        if dockerfile:
        if dockerfile and os.path.isdir(path):
            dockerfile = os.path.join(path, dockerfile)
        iidfile = tempfile.mktemp()

@@ -1873,35 +1928,29 @@ class _CLIBuilder:
        command_builder.add_arg("--tag", tag)
        command_builder.add_arg("--target", target)
        command_builder.add_arg("--iidfile", iidfile)
        command_builder.add_arg("--platform", platform)
        command_builder.add_arg("--isolation", isolation)

        if extra_hosts:
            if isinstance(extra_hosts, dict):
                extra_hosts = ["{}:{}".format(host, ip) for host, ip in extra_hosts.items()]
            for host in extra_hosts:
                command_builder.add_arg("--add-host", "{}".format(host))

        args = command_builder.build([path])

        magic_word = "Successfully built "
        appear = False
        with subprocess.Popen(args, stdout=subprocess.PIPE,
        with subprocess.Popen(args, stdout=output_stream, stderr=sys.stderr,
                              universal_newlines=True) as p:
            while True:
                line = p.stdout.readline()
                if not line:
                    break
                if line.startswith(magic_word):
                    appear = True
                yield json.dumps({"stream": line})

            p.communicate()
            if p.returncode != 0:
                raise StreamOutputError()
                raise BuildError(service, "Build failed")

        with open(iidfile) as f:
            line = f.readline()
            image_id = line.split(":")[1].strip()
        os.remove(iidfile)

        # In case of `DOCKER_BUILDKIT=1`
        # there is no success message already present in the output.
        # Since that's the way `Service::build` gets the `image_id`
        # it has to be added `manually`
        if not appear:
            yield json.dumps({"stream": "{}{}\n".format(magic_word, image_id)})
        return image_id
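When building through the CLI, the image id comes from the `--iidfile`, which holds a single `sha256:<digest>` reference; the `split(":")` above extracts the bare digest. A standalone sketch (the sample digest is made up):

```python
def parse_iid(line):
    # "--iidfile" content looks like "sha256:<digest>"; keep only the digest.
    return line.split(":")[1].strip()

print(parse_iid("sha256:0123abcd\n"))  # 0123abcd
```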


class _CommandBuilder:

@@ -138,7 +138,7 @@ _docker_compose_config() {
            ;;
    esac

    COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) )
    COMPREPLY=( $( compgen -W "--hash --help --no-interpolate --profiles --quiet -q --resolve-image-digests --services --volumes" -- "$cur" ) )
}


@@ -172,6 +172,10 @@ _docker_compose_docker_compose() {
            COMPREPLY=( $( compgen -W "debug info warning error critical" -- "$cur" ) )
            return
            ;;
        --profile)
            COMPREPLY=( $( compgen -W "$(__docker_compose_q config --profiles)" -- "$cur" ) )
            return
            ;;
        --project-directory)
            _filedir -d
            return

@@ -294,7 +298,7 @@ _docker_compose_logs() {

    case "$cur" in
        -*)
            COMPREPLY=( $( compgen -W "--follow -f --help --no-color --tail --timestamps -t" -- "$cur" ) )
            COMPREPLY=( $( compgen -W "--follow -f --help --no-color --no-log-prefix --tail --timestamps -t" -- "$cur" ) )
            ;;
        *)
            __docker_compose_complete_services

@@ -549,7 +553,7 @@ _docker_compose_up() {

    case "$cur" in
        -*)
            COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
            COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-log-prefix --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
            ;;
        *)
            __docker_compose_complete_services

@@ -618,10 +622,11 @@ _docker_compose() {
        --tlskey
    "

    # These options are require special treatment when searching the command.
    # These options require special treatment when searching the command.
    local top_level_options_with_args="
        --ansi
        --log-level
        --profile
    "

    COMPREPLY=()
@@ -22,6 +22,6 @@ complete -c docker-compose -l tlskey -r -d 'Path to TLS key fi
complete -c docker-compose -l tlsverify -d 'Use TLS and verify the remote'
complete -c docker-compose -l skip-hostname-check -d "Don't check the daemon's hostname against the name specified in the client certificate (for example if your docker host is an IP address)"
complete -c docker-compose -l no-ansi -d 'Do not print ANSI control characters'
complete -c docker-compose -l ansi -a never always auto -d 'Control when to print ANSI control characters'
complete -c docker-compose -l ansi -a 'never always auto' -d 'Control when to print ANSI control characters'
complete -c docker-compose -s h -l help -d 'Print usage'
complete -c docker-compose -s v -l version -d 'Print version and exit'

@@ -1,5 +1,5 @@
Click==7.1.2
coverage==5.2.1
coverage==5.5
ddt==1.4.1
flake8==3.8.3
gitpython==3.1.11

@@ -7,4 +7,3 @@ mock==3.0.5
pytest==6.0.1; python_version >= '3.5'
pytest==4.6.5; python_version < '3.5'
pytest-cov==2.10.1
PyYAML==5.3.1

@@ -3,7 +3,7 @@ appdirs==1.4.4
attrs==20.3.0
bcrypt==3.2.0
cffi==1.14.4
cryptography==3.2.1
cryptography==3.3.2
distlib==0.3.1
entrypoints==0.3
filelock==3.0.12

@@ -11,9 +11,9 @@ gitdb2==4.0.2
mccabe==0.6.1
more-itertools==8.6.0; python_version >= '3.5'
more-itertools==5.0.0; python_version < '3.5'
packaging==20.4
packaging==20.9
pluggy==0.13.1
py==1.9.0
py==1.10.0
pycodestyle==2.6.0
pycparser==2.20
pyflakes==2.2.0

@@ -23,6 +23,6 @@ pyrsistent==0.16.0
smmap==3.0.4
smmap2==3.0.1
toml==0.10.1
tox==3.20.1
virtualenv==20.2.2
tox==3.21.2
virtualenv==20.4.0
wcwidth==0.2.5

@@ -1,10 +1,10 @@
backports.shutil_get_terminal_size==1.0.0
cached-property==1.5.1
cached-property==1.5.1; python_version < '3.8'
certifi==2020.6.20
chardet==3.0.4
colorama==0.4.3; sys_platform == 'win32'
distro==1.5.0
docker==4.4.1
docker==5.0.0
docker-pycreds==0.4.0
dockerpty==0.4.1
docopt==0.6.2

@@ -13,9 +13,9 @@ ipaddress==1.0.23
jsonschema==3.2.0
paramiko==2.7.1
PySocks==1.7.1
python-dotenv==0.14.0
python-dotenv==0.17.0
pywin32==227; sys_platform == 'win32'
PyYAML==5.3.1
PyYAML==5.4.1
requests==2.24.0
texttable==1.6.2
urllib3==1.25.10; python_version == '3.3'

@@ -3,7 +3,7 @@
set -ex

CODE_PATH=/code
VENV="${CODE_PATH}"/.tox/py39
VENV="${CODE_PATH}"/.tox/py37

cd "${CODE_PATH}"
mkdir -p dist

0
script/release/release.py
Normal file → Executable file
@@ -15,7 +15,7 @@

set -e

VERSION="1.26.1"
VERSION="1.29.2"
IMAGE="docker/compose:$VERSION"


@@ -11,7 +11,7 @@ docker run --rm \
    "$TAG" tox -e pre-commit

get_versions="docker run --rm
    --entrypoint=/code/.tox/py39/bin/python
    --entrypoint=/code/.tox/py37/bin/python
    $TAG
    /code/script/test/versions.py docker/docker-ce,moby/moby"

@@ -22,7 +22,7 @@ elif [ "$DOCKER_VERSIONS" == "all" ]; then
fi

BUILD_NUMBER=${BUILD_NUMBER-$USER}
PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py39}
PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py37}

for version in $DOCKER_VERSIONS; do
    >&2 echo "Running tests against Docker $version"

@@ -38,17 +38,23 @@ for version in $DOCKER_VERSIONS; do

    trap "on_exit" EXIT

    repo="dockerswarm/dind"

    docker run \
        -d \
        --name "$daemon_container" \
        --privileged \
        --volume="/var/lib/docker" \
        "$repo:$version" \
        -e "DOCKER_TLS_CERTDIR=" \
        "docker:$version-dind" \
        dockerd -H tcp://0.0.0.0:2375 $DOCKER_DAEMON_ARGS \
        2>&1 | tail -n 10

    docker exec "$daemon_container" sh -c "apk add --no-cache git"

    # copy docker config from host for authentication with Docker Hub
    docker exec "$daemon_container" sh -c "mkdir /root/.docker"
    docker cp /root/.docker/config.json $daemon_container:/root/.docker/config.json
    docker exec "$daemon_container" sh -c "chmod 644 /root/.docker/config.json"

    docker run \
        --rm \
        --tty \
4
setup.py
@@ -25,14 +25,13 @@ def find_version(*file_paths):


install_requires = [
    'cached-property >= 1.2.0, < 2',
    'docopt >= 0.6.1, < 1',
    'PyYAML >= 3.10, < 6',
    'requests >= 2.20.0, < 3',
    'texttable >= 0.9.0, < 2',
    'websocket-client >= 0.32.0, < 1',
    'distro >= 1.5.0, < 2',
    'docker[ssh] >= 4.4.0, < 5',
    'docker[ssh] >= 5',
    'dockerpty >= 0.4.1, < 1',
    'jsonschema >= 2.5.1, < 4',
    'python-dotenv >= 0.13.0, < 1',

@@ -50,6 +49,7 @@ if sys.version_info[:2] < (3, 4):

extras_require = {
    ':python_version < "3.5"': ['backports.ssl_match_hostname >= 3.5, < 4'],
    ':python_version < "3.8"': ['cached-property >= 1.2.0, < 2'],
    ':sys_platform == "win32"': ['colorama >= 0.4, < 1'],
    'socks': ['PySocks >= 1.5.6, != 1.5.7, < 2'],
    'tests': tests_require,

@@ -237,6 +237,11 @@ class CLITestCase(DockerClientTestCase):
        result = self.dispatch(['-H=tcp://doesnotexist:8000', 'ps'], returncode=1)
        assert "Couldn't connect to Docker daemon" in result.stderr

    def test_config_list_profiles(self):
        self.base_dir = 'tests/fixtures/config-profiles'
        result = self.dispatch(['config', '--profiles'])
        assert set(result.stdout.rstrip().split('\n')) == {'debug', 'frontend', 'gui'}

    def test_config_list_services(self):
        self.base_dir = 'tests/fixtures/v2-full'
        result = self.dispatch(['config', '--services'])

15
tests/fixtures/config-profiles/docker-compose.yml
vendored
Normal file
@@ -0,0 +1,15 @@
version: '3.8'
services:
  frontend:
    image: frontend
    profiles: ["frontend", "gui"]
  phpmyadmin:
    image: phpmyadmin
    depends_on:
      - db
    profiles:
      - debug
  backend:
    image: backend
  db:
    image: mysql
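The `config --profiles` test added alongside this fixture expects the union of every service's declared profiles. A sketch of that aggregation over plain dicts (not Compose's loader):

```python
# Services mirroring the fixture above; profile-less services contribute nothing.
services = {
    'frontend': {'profiles': ['frontend', 'gui']},
    'phpmyadmin': {'profiles': ['debug']},
    'backend': {},
    'db': {},
}

# Union of all declared profile names, sorted for stable output.
profiles = sorted({p for svc in services.values() for p in svc.get('profiles', [])})
print(profiles)  # ['debug', 'frontend', 'gui']
```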
1
tests/fixtures/env-file-override/.env
vendored
Normal file
@@ -0,0 +1 @@
WHEREAMI=default

@@ -1,5 +1,6 @@
import tempfile

import pytest
from ddt import data
from ddt import ddt

@@ -8,6 +9,7 @@ from ..acceptance.cli_test import dispatch
from compose.cli.command import get_project
from compose.cli.command import project_from_options
from compose.config.environment import Environment
from compose.config.errors import EnvFileNotFound
from tests.integration.testcases import DockerClientTestCase


@@ -55,13 +57,36 @@ services:
class EnvironmentOverrideFileTest(DockerClientTestCase):
    def test_env_file_override(self):
        base_dir = 'tests/fixtures/env-file-override'
        # '--env-file' is relative to the current working dir
        env = Environment.from_env_file(base_dir, base_dir+'/.env.override')
        dispatch(base_dir, ['--env-file', '.env.override', 'up'])
        project = get_project(project_dir=base_dir,
                              config_path=['docker-compose.yml'],
                              environment=Environment.from_env_file(base_dir, '.env.override'),
                              environment=env,
                              override_dir=base_dir)
        containers = project.containers(stopped=True)
        assert len(containers) == 1
        assert "WHEREAMI=override" in containers[0].get('Config.Env')
        assert "DEFAULT_CONF_LOADED=true" in containers[0].get('Config.Env')
        dispatch(base_dir, ['--env-file', '.env.override', 'down'], None)

    def test_env_file_not_found_error(self):
        base_dir = 'tests/fixtures/env-file-override'
        with pytest.raises(EnvFileNotFound) as excinfo:
            Environment.from_env_file(base_dir, '.env.override')

        assert "Couldn't find env file" in excinfo.exconly()

    def test_dot_env_file(self):
        base_dir = 'tests/fixtures/env-file-override'
        # '.env' is relative to the project_dir (base_dir)
        env = Environment.from_env_file(base_dir, None)
        dispatch(base_dir, ['up'])
        project = get_project(project_dir=base_dir,
                              config_path=['docker-compose.yml'],
                              environment=env,
                              override_dir=base_dir)
        containers = project.containers(stopped=True)
        assert len(containers) == 1
        assert "WHEREAMI=default" in containers[0].get('Config.Env')
        dispatch(base_dir, ['down'], None)

@@ -25,6 +25,7 @@ from compose.const import COMPOSE_SPEC as VERSION
from compose.const import LABEL_PROJECT
from compose.const import LABEL_SERVICE
from compose.container import Container
from compose.errors import CompletedUnsuccessfully
from compose.errors import HealthCheckFailed
from compose.errors import NoHealthCheckConfigured
from compose.project import Project

@@ -1899,6 +1900,106 @@ class ProjectTest(DockerClientTestCase):
        with pytest.raises(NoHealthCheckConfigured):
            svc1.is_healthy()

    def test_project_up_completed_successfully_dependency(self):
        config_dict = {
            'version': '2.1',
            'services': {
                'svc1': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'true'
                },
                'svc2': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'top',
                    'depends_on': {
                        'svc1': {'condition': 'service_completed_successfully'},
                    }
                }
            }
        }
        config_data = load_config(config_dict)
        project = Project.from_config(
            name='composetest', config_data=config_data, client=self.client
        )
        project.up()

        svc1 = project.get_service('svc1')
        svc2 = project.get_service('svc2')

        assert 'svc1' in svc2.get_dependency_names()
        assert svc2.containers()[0].is_running
        assert len(svc1.containers()) == 0
        assert svc1.is_completed_successfully()

    def test_project_up_completed_unsuccessfully_dependency(self):
        config_dict = {
            'version': '2.1',
            'services': {
                'svc1': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'false'
                },
                'svc2': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'top',
                    'depends_on': {
                        'svc1': {'condition': 'service_completed_successfully'},
                    }
                }
            }
        }
        config_data = load_config(config_dict)
        project = Project.from_config(
            name='composetest', config_data=config_data, client=self.client
        )
        with pytest.raises(ProjectError):
            project.up()

        svc1 = project.get_service('svc1')
        svc2 = project.get_service('svc2')
        assert 'svc1' in svc2.get_dependency_names()
        assert len(svc2.containers()) == 0
        with pytest.raises(CompletedUnsuccessfully):
            svc1.is_completed_successfully()

    def test_project_up_completed_differently_dependencies(self):
        config_dict = {
            'version': '2.1',
            'services': {
                'svc1': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'true'
                },
                'svc2': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'false'
                },
                'svc3': {
                    'image': BUSYBOX_IMAGE_WITH_TAG,
                    'command': 'top',
                    'depends_on': {
                        'svc1': {'condition': 'service_completed_successfully'},
                        'svc2': {'condition': 'service_completed_successfully'},
                    }
                }
            }
        }
        config_data = load_config(config_dict)
        project = Project.from_config(
            name='composetest', config_data=config_data, client=self.client
        )
        with pytest.raises(ProjectError):
            project.up()

        svc1 = project.get_service('svc1')
        svc2 = project.get_service('svc2')
        svc3 = project.get_service('svc3')
        assert ['svc1', 'svc2'] == svc3.get_dependency_names()
        assert svc1.is_completed_successfully()
        assert len(svc3.containers()) == 0
        with pytest.raises(CompletedUnsuccessfully):
            svc2.is_completed_successfully()

    def test_project_up_seccomp_profile(self):
        seccomp_data = {
            'defaultAction': 'SCMP_ACT_ALLOW',

@@ -8,7 +8,6 @@ from docker.errors import APIError

from compose.cli.log_printer import build_log_generator
from compose.cli.log_printer import build_log_presenters
from compose.cli.log_printer import build_no_log_generator
from compose.cli.log_printer import consume_queue
from compose.cli.log_printer import QueueItem
from compose.cli.log_printer import wait_on_exit

@@ -75,14 +74,6 @@ def test_wait_on_exit_raises():
    assert expected in wait_on_exit(mock_container)


def test_build_no_log_generator(mock_container):
    mock_container.has_api_logs = False
    mock_container.log_driver = 'none'
    output, = build_no_log_generator(mock_container, None)
    assert "WARNING: no logs are available with the 'none' log driver\n" in output
    assert "exited with code" not in output


class TestBuildLogGenerator:

    def test_no_log_stream(self, mock_container):

@@ -2397,7 +2397,8 @@ web:
            'image': 'busybox',
            'depends_on': {
                'app1': {'condition': 'service_started'},
                'app2': {'condition': 'service_healthy'}
                'app2': {'condition': 'service_healthy'},
                'app3': {'condition': 'service_completed_successfully'}
            }
        }
        override = {}

@@ -2409,11 +2410,12 @@ web:
            'image': 'busybox',
            'depends_on': {
                'app1': {'condition': 'service_started'},
                'app2': {'condition': 'service_healthy'}
                'app2': {'condition': 'service_healthy'},
                'app3': {'condition': 'service_completed_successfully'}
            }
        }
        override = {
            'depends_on': ['app3']
            'depends_on': ['app4']
        }

        actual = config.merge_service_dicts(base, override, VERSION)

@@ -2422,7 +2424,8 @@ web:
            'depends_on': {
                'app1': {'condition': 'service_started'},
                'app2': {'condition': 'service_healthy'},
                'app3': {'condition': 'service_started'}
                'app3': {'condition': 'service_completed_successfully'},
                'app4': {'condition': 'service_started'},
            }
        }

@@ -3567,9 +3570,11 @@ class InterpolationTest(unittest.TestCase):
    @mock.patch.dict(os.environ)
    def test_config_file_with_options_environment_file(self):
        project_dir = 'tests/fixtures/default-env-file'
        # env-file is relative to current working dir
        env = Environment.from_env_file(project_dir, project_dir + '/.env2')
        service_dicts = config.load(
            config.find(
                project_dir, None, Environment.from_env_file(project_dir, '.env2')
                project_dir, None, env
            )
        ).services

@@ -5233,6 +5238,8 @@ class GetDefaultConfigFilesTestCase(unittest.TestCase):
    files = [
        'docker-compose.yml',
        'docker-compose.yaml',
        'compose.yml',
        'compose.yaml',
    ]

    def test_get_config_path_default_file_in_basedir(self):

@@ -5266,8 +5273,10 @@ def get_config_filename_for_files(filenames, subdir=None):
            base_dir = tempfile.mkdtemp(dir=project_dir)
        else:
            base_dir = project_dir
        filename, = config.get_default_config_files(base_dir)
        return os.path.basename(filename)
        filenames = config.get_default_config_files(base_dir)
        if not filenames:
            raise config.ComposeFileNotFound(config.SUPPORTED_FILENAMES)
        return os.path.basename(filenames[0])
    finally:
        shutil.rmtree(project_dir)
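The revised helper above relies on `get_default_config_files` returning every supported filename found, in preference order, instead of exactly one. A standalone sketch of that lookup (the filename list comes from the test's `files` fixture; the function name here is assumed, not Compose's real API):

```python
import os
import tempfile

# Preference order assumed from the test fixture above.
SUPPORTED_FILENAMES = [
    'docker-compose.yml',
    'docker-compose.yaml',
    'compose.yml',
    'compose.yaml',
]

def default_config_files(base_dir):
    # Return the supported config files present in base_dir, keeping
    # the preference order of SUPPORTED_FILENAMES.
    return [os.path.join(base_dir, name)
            for name in SUPPORTED_FILENAMES
            if os.path.exists(os.path.join(base_dir, name))]

with tempfile.TemporaryDirectory() as d:
    open(os.path.join(d, 'compose.yaml'), 'w').close()
    found = default_config_files(d)
    print([os.path.basename(f) for f in found])  # ['compose.yaml']
```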

@@ -221,34 +221,6 @@ class ContainerTest(unittest.TestCase):
        container = Container(None, self.container_dict, has_been_inspected=True)
        assert container.short_id == self.container_id[:12]

    def test_has_api_logs(self):
        container_dict = {
            'HostConfig': {
                'LogConfig': {
                    'Type': 'json-file'
                }
            }
        }

        container = Container(None, container_dict, has_been_inspected=True)
        assert container.has_api_logs is True

        container_dict['HostConfig']['LogConfig']['Type'] = 'none'
        container = Container(None, container_dict, has_been_inspected=True)
        assert container.has_api_logs is False

        container_dict['HostConfig']['LogConfig']['Type'] = 'syslog'
        container = Container(None, container_dict, has_been_inspected=True)
        assert container.has_api_logs is False

        container_dict['HostConfig']['LogConfig']['Type'] = 'journald'
        container = Container(None, container_dict, has_been_inspected=True)
        assert container.has_api_logs is True

        container_dict['HostConfig']['LogConfig']['Type'] = 'foobar'
        container = Container(None, container_dict, has_been_inspected=True)
        assert container.has_api_logs is False


class GetContainerNameTestCase(unittest.TestCase):


@@ -330,7 +330,7 @@ class ServiceTest(unittest.TestCase):
        assert service.options['environment'] == environment

        assert opts['labels'][LABEL_CONFIG_HASH] == \
            '689149e6041a85f6fb4945a2146a497ed43c8a5cbd8991753d875b165f1b4de4'
            '6da0f3ec0d5adf901de304bdc7e0ee44ec5dd7adb08aebc20fe0dd791d4ee5a8'
        assert opts['environment'] == ['also=real']

    def test_get_container_create_options_sets_affinity_with_binds(self):

@@ -700,6 +700,7 @@ class ServiceTest(unittest.TestCase):
        config_dict = service.config_dict()
        expected = {
            'image_id': 'abcd',
            'ipc_mode': None,
            'options': {'image': 'example.com/foo'},
            'links': [('one', 'one')],
            'net': 'other',

@@ -723,6 +724,7 @@ class ServiceTest(unittest.TestCase):
        config_dict = service.config_dict()
        expected = {
            'image_id': 'abcd',
            'ipc_mode': None,
            'options': {'image': 'example.com/foo'},
            'links': [],
            'networks': {},