Compare commits


24 Commits
v1 ... 1.28.2

Author SHA1 Message Date
aiordache
67630359cf "Bump 1.28.2"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:23:53 +01:00
aiordache
c99c1556aa Add cgroup1 label to Release.Jenkinsfile
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:15 +01:00
aiordache
0e529bf29b "Bump 1.28.1"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:15 +01:00
Harald Albers
27d039d39a Fix formatting of help output for up|logs --no-log-prefix
Signed-off-by: Harald Albers <github@albersweb.de>
2021-01-26 20:15:15 +01:00
Harald Albers
ad1baff1b3 Add bash completion for logs|up --no-log-prefix
This adds bash completion for https://github.com/docker/compose/pull/7435

Signed-off-by: Harald Albers <github@albersweb.de>
2021-01-26 20:15:15 +01:00
Chris Crone
59e9ebe428 build.linux: Revert to Python 3.7
This allows us to revert from Debian Buster to Stretch, which lets us
relax the glibc version requirements.

Signed-off-by: Chris Crone <christopher.crone@docker.com>
2021-01-26 20:15:15 +01:00
aiordache
90373e9e63 "Bump 1.28.0"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:15 +01:00
Ulysses Souza
786822e921 Update compose-spec
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2021-01-26 20:15:15 +01:00
Ulysses Souza
95c6adeecf Remove restriction on docker version
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2021-01-26 20:15:15 +01:00
Mike Seplowitz
b6ddddc31a Improve control over ANSI output (#6858)
* Move global console_handler into function scope

Signed-off-by: Mike Seplowitz <mseplowitz@bloomberg.net>

* Improve control over ANSI output

- Disabled parallel logger ANSI output if not attached to a tty.
  The console handler and progress stream already checked whether the
  output stream is a tty, but ParallelStreamWriter did not.

- Added --ansi=(never|always|auto) option to allow clearer control over
  ANSI output. Since --no-ansi is the same as --ansi=never, --no-ansi is
  now deprecated.

Signed-off-by: Mike Seplowitz <mseplowitz@bloomberg.net>
2021-01-26 20:15:15 +01:00
aiordache
e1fb1e9a3a "Bump 1.28.0-rc3"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:15 +01:00
Mark Gallagher
c27c73efae Remove duplicate values check for build.cache_from
The `docker` command accepts duplicate values, so there is no benefit to
performing this check.

Fixes #7342.

Signed-off-by: Mark Gallagher <mark@fts.scot>
2021-01-26 20:15:15 +01:00
Sebastiaan van Stijn
a5863de31a Make COMPOSE_DOCKER_CLI_BUILD=1 the default
This changes compose to use "native" build through the CLI
by default. With this, docker-compose can take advantage of
BuildKit (which is now enabled by default on Docker Desktop
2.5 and up).

Users who want to use the Python client for building can
opt out of this feature by setting COMPOSE_DOCKER_CLI_BUILD=0.

Signed-off-by: Sebastiaan van Stijn <github@gone.nl>
2021-01-26 20:15:14 +01:00
guillaume.tardif
97056552dc Support Windows npipe, set content type & correct URL /usage. Also fixed socket name for Docker Desktop for Mac
Signed-off-by: guillaume.tardif <guillaume.tardif@gmail.com>
2021-01-26 20:15:14 +01:00
Ulysses Souza
318741ca5e Add metrics
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2021-01-26 20:15:14 +01:00
aiordache
aa8b7bb392 "Bump 1.28.0-rc2"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:14 +01:00
Daniil Sigalov
a8ffcfaefb Only attach services we'll read logs from in up
When 'up' is run with an explicit list of services, compose will
start them together with their dependencies. It will attach to all
started services, but won't read output from the dependencies (their
logs are not printed by 'up'), so the receive buffer of the
dependencies will fill and at some point start blocking those
services. Fix that by only attaching to the services given in the
list.
To do that, move the logic for choosing which services to attach from
cli/main.py to utils.py and use it from project.py to decide whether a
service should be attached.

Fixes #6018

Signed-off-by: Daniil Sigalov <asterite@seclab.cs.msu.ru>
2021-01-26 20:15:14 +01:00
Ulysses Souza
97e009a8cb Avoid setting unsupported parameter for subprocess.Popen on Windows
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2021-01-26 20:15:14 +01:00
aiordache
186e3913f0 "Bump 1.28.0-rc1"
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:14 +01:00
dependabot-preview[bot]
7bc945654f Bump virtualenv from 20.0.30 to 20.2.2
Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
(cherry picked from commit 8785279ffd)
Signed-off-by: Ulysses Souza <ulyssessouza@gmail.com>
2021-01-26 20:15:14 +01:00
dependabot-preview[bot]
cc299f5cd5 Bump bcrypt from 3.1.7 to 3.2.0
Bumps [bcrypt](https://github.com/pyca/bcrypt) from 3.1.7 to 3.2.0.
- [Release notes](https://github.com/pyca/bcrypt/releases)
- [Changelog](https://github.com/pyca/bcrypt/blob/master/release.py)
- [Commits](https://github.com/pyca/bcrypt/compare/3.1.7...3.2.0)

Signed-off-by: dependabot-preview[bot] <support@dependabot.com>
2021-01-26 20:15:14 +01:00
Anca Iordache
536bea0859 Revert "Bump virtualenv from 20.0.30 to 20.2.1" (#7975)
This reverts commit 8785279ffd.

Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:14 +01:00
Anca Iordache
db7b666e40 Revert "Bump gitpython from 3.1.7 to 3.1.11" (#7974)
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:14 +01:00
aiordache
945123145f Bump docker-py in setup.py
Signed-off-by: aiordache <anca.iordache@docker.com>
2021-01-26 20:15:14 +01:00
40 changed files with 685 additions and 138 deletions


@@ -17,7 +17,7 @@
sha: v1.3.4
hooks:
- id: reorder-python-imports
language_version: 'python3.9'
language_version: 'python3.7'
args:
- --py3-plus
- repo: https://github.com/asottile/pyupgrade


@@ -1,6 +1,86 @@
Change log
==========
1.28.2 (2021-01-26)
-------------------
### Miscellaneous
- CI setup update
1.28.1 (2021-01-25)
-------------------
### Bugs
- Revert Linux static builds to Python 3.7
- Add bash completion for `docker-compose logs|up --no-log-prefix`
1.28.0 (2021-01-20)
-------------------
### Features
- Support for Nvidia GPUs via device requests
- Support for service profiles
- Match the Docker CLI's SSH connection approach by shelling out to the local SSH client (set the `COMPOSE_PARAMIKO_SSH` environment variable to restore the old behaviour)
- Add flag to disable log prefix
- Add flag for ansi output control
### Bugs
- Make `parallel_pull=True` by default
- Bring back warning for configs in non-swarm mode
- Take `--file` into account when defining `project_dir`
- On `compose up`, attach only to services we read logs from
### Miscellaneous
- Make COMPOSE_DOCKER_CLI_BUILD=1 the default
- Add usage metrics
- Sync schema with the Compose specification
- Improve failure report for missing mandatory environment variables
- Bump attrs to 20.3.0
- Bump more_itertools to 8.6.0
- Bump cryptography to 3.2.1
- Bump cffi to 1.14.4
- Bump virtualenv to 20.2.2
- Bump bcrypt to 3.2.0
- Bump gitpython to 3.1.11
- Bump docker-py to 4.4.1
- Bump Python to 3.9
- Linux: bump Debian base image from stretch to buster (required for Python 3.9)
- macOS: OpenSSL 1.1.1g to 1.1.1h, Python 3.7.7 to 3.9.0
- Bump pyinstaller to 4.1
- Loosen restriction on base images to latest minor
- Updates of READMEs
1.27.4 (2020-09-24)
-------------------


@@ -1,13 +1,13 @@
ARG DOCKER_VERSION=19.03
ARG PYTHON_VERSION=3.9.0
ARG PYTHON_VERSION=3.7.9
ARG BUILD_ALPINE_VERSION=3.12
ARG BUILD_CENTOS_VERSION=7
ARG BUILD_DEBIAN_VERSION=slim-buster
ARG BUILD_DEBIAN_VERSION=slim-stretch
ARG RUNTIME_ALPINE_VERSION=3.12
ARG RUNTIME_CENTOS_VERSION=7
ARG RUNTIME_DEBIAN_VERSION=buster-slim
ARG RUNTIME_DEBIAN_VERSION=stretch-slim
ARG DISTRO=alpine
@@ -38,7 +38,7 @@ RUN apt-get update && apt-get install --no-install-recommends -y \
git \
libc-dev \
libffi-dev \
libgcc-8-dev \
libgcc-6-dev \
libssl-dev \
make \
openssl \
@@ -68,8 +68,8 @@ WORKDIR /code/
COPY docker-compose-entrypoint.sh /usr/local/bin/
COPY --from=docker-cli /usr/local/bin/docker /usr/local/bin/docker
RUN pip install \
virtualenv==20.2.1 \
tox==3.20.1
virtualenv==20.4.0 \
tox==3.21.2
COPY requirements-dev.txt .
COPY requirements-indirect.txt .
COPY requirements.txt .
@@ -79,7 +79,7 @@ COPY tox.ini .
COPY setup.py .
COPY README.md .
COPY compose compose/
RUN tox --notest
RUN tox -e py37 --notest
COPY . .
ARG GIT_COMMIT=unknown
ENV DOCKER_COMPOSE_GITSHA=$GIT_COMMIT

Jenkinsfile (vendored)

@@ -2,7 +2,7 @@
def dockerVersions = ['19.03.13']
def baseImages = ['alpine', 'debian']
def pythonVersions = ['py39']
def pythonVersions = ['py37']
pipeline {
agent none


@@ -2,7 +2,7 @@
def dockerVersions = ['19.03.13', '18.09.9']
def baseImages = ['alpine', 'debian']
def pythonVersions = ['py39']
def pythonVersions = ['py37']
pipeline {
agent none
@@ -23,7 +23,7 @@ pipeline {
parallel {
stage('alpine') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
buildImage('alpine')
@@ -31,7 +31,7 @@ pipeline {
}
stage('debian') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
buildImage('debian')
@@ -41,7 +41,7 @@ pipeline {
}
stage('Test') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
// TODO use declarative 1.5.0 `matrix` once available on CI
@@ -61,7 +61,7 @@ pipeline {
}
stage('Generate Changelog') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
checkout scm
@@ -98,7 +98,7 @@ pipeline {
}
stage('linux binary') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
checkout scm
@@ -134,7 +134,7 @@ pipeline {
}
stage('alpine image') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
buildRuntimeImage('alpine')
@@ -142,7 +142,7 @@ pipeline {
}
stage('debian image') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
buildRuntimeImage('debian')
@@ -157,7 +157,7 @@ pipeline {
parallel {
stage('Pushing images') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
steps {
pushRuntimeImage('alpine')
@@ -166,7 +166,7 @@ pipeline {
}
stage('Creating Github Release') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
environment {
GITHUB_TOKEN = credentials('github-release-token')
@@ -198,7 +198,7 @@ pipeline {
}
stage('Publishing Python packages') {
agent {
label 'linux && docker && ubuntu-2004'
label 'linux && docker && ubuntu-2004 && cgroup1'
}
environment {
PYPIRC = credentials('pypirc-docker-dsg-cibot')
@@ -247,7 +247,7 @@ def buildImage(baseImage) {
def runTests(dockerVersion, pythonVersion, baseImage) {
return {
stage("python=${pythonVersion} docker=${dockerVersion} ${baseImage}") {
node("linux && docker && ubuntu-2004") {
node("linux && docker && ubuntu-2004 && cgroup1") {
def scmvar = checkout(scm)
def imageName = "dockerbuildbot/compose:${baseImage}-${scmvar.GIT_COMMIT}"
def storageDriver = sh(script: "docker info -f \'{{.Driver}}\'", returnStdout: true).trim()


@@ -1 +1 @@
__version__ = '1.28.0dev'
__version__ = '1.28.2'


@@ -1,3 +1,6 @@
import enum
import os
from ..const import IS_WINDOWS_PLATFORM
NAMES = [
@@ -12,6 +15,21 @@ NAMES = [
]
@enum.unique
class AnsiMode(enum.Enum):
"""Enumeration for when to output ANSI colors."""
NEVER = "never"
ALWAYS = "always"
AUTO = "auto"
def use_ansi_codes(self, stream):
if self is AnsiMode.ALWAYS:
return True
if self is AnsiMode.NEVER or os.environ.get('CLICOLOR') == '0':
return False
return stream.isatty()
def get_pairs():
for i, name in enumerate(NAMES):
yield (name, str(30 + i))
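
For illustration, a minimal sketch (not part of the diff above) of how the new AnsiMode enum behaves, assuming compose 1.28 is importable; io.StringIO stands in for a non-tty stream:

    import io
    import os
    import sys

    from compose.cli.colors import AnsiMode

    # AUTO emits ANSI codes only when the stream is a tty and CLICOLOR is not "0"
    print(AnsiMode.AUTO.use_ansi_codes(sys.stderr))     # True in an interactive terminal
    print(AnsiMode.AUTO.use_ansi_codes(io.StringIO()))  # False: StringIO is not a tty

    # ALWAYS ignores both the stream and CLICOLOR; NEVER always answers False
    os.environ['CLICOLOR'] = '0'
    print(AnsiMode.ALWAYS.use_ansi_codes(sys.stderr))   # True
    print(AnsiMode.AUTO.use_ansi_codes(sys.stderr))     # False: CLICOLOR=0 disables AUTO
    print(AnsiMode.NEVER.use_ansi_codes(sys.stderr))    # False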


@@ -17,10 +17,16 @@ class DocoptDispatcher:
self.command_class = command_class
self.options = options
@classmethod
def get_command_and_options(cls, doc_entity, argv, options):
command_help = getdoc(doc_entity)
opt = docopt_full_help(command_help, argv, **options)
command = opt['COMMAND']
return command_help, opt, command
def parse(self, argv):
command_help = getdoc(self.command_class)
options = docopt_full_help(command_help, argv, **self.options)
command = options['COMMAND']
command_help, options, command = DocoptDispatcher.get_command_and_options(
self.command_class, argv, self.options)
if command is None:
raise SystemExit(command_help)
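
For illustration, a hedged sketch (not part of the diff above) of why the parsing was split out: main() can now learn the command name before dispatch, so metrics can be reported even when the command later fails. Assumes compose 1.28 is importable:

    from compose.cli.docopt_command import DocoptDispatcher
    from compose.cli.main import TopLevelCommand

    # Pre-parse argv just to extract the command name; dispatch() still performs
    # the full parse later and builds the actual handler.
    _, options, command = DocoptDispatcher.get_command_and_options(
        TopLevelCommand, ['ps', '--all'], {'options_first': True})
    print(command)          # 'ps'
    print(options['ARGS'])  # ['--all'], parsed further by the sub-command itself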


@@ -2,7 +2,6 @@ import contextlib
import functools
import json
import logging
import os
import pipes
import re
import subprocess
@@ -26,6 +25,8 @@ from ..config.serialize import serialize_config
from ..config.types import VolumeSpec
from ..const import IS_WINDOWS_PLATFORM
from ..errors import StreamParseError
from ..metrics.decorator import metrics
from ..parallel import ParallelStreamWriter
from ..progress_stream import StreamOutputError
from ..project import get_image_digests
from ..project import MissingDigests
@@ -38,6 +39,8 @@ from ..service import ConvergenceStrategy
from ..service import ImageType
from ..service import NeedsBuildError
from ..service import OperationFailedError
from ..utils import filter_attached_for_up
from .colors import AnsiMode
from .command import get_config_from_options
from .command import get_project_dir
from .command import project_from_options
@@ -52,60 +55,122 @@ from .log_printer import LogPrinter
from .utils import get_version_info
from .utils import human_readable_file_size
from .utils import yesno
from compose.metrics.client import MetricsCommand
from compose.metrics.client import Status
if not IS_WINDOWS_PLATFORM:
from dockerpty.pty import PseudoTerminal, RunOperation, ExecOperation
log = logging.getLogger(__name__)
console_handler = logging.StreamHandler(sys.stderr)
def main():
def main(): # noqa: C901
signals.ignore_sigpipe()
command = None
try:
command = dispatch()
command()
_, opts, command = DocoptDispatcher.get_command_and_options(
TopLevelCommand,
get_filtered_args(sys.argv[1:]),
{'options_first': True, 'version': get_version_info('compose')})
except Exception:
pass
try:
command_func = dispatch()
command_func()
except (KeyboardInterrupt, signals.ShutdownException):
log.error("Aborting.")
sys.exit(1)
exit_with_metrics(command, "Aborting.", status=Status.FAILURE)
except (UserError, NoSuchService, ConfigurationError,
ProjectError, OperationFailedError) as e:
log.error(e.msg)
sys.exit(1)
exit_with_metrics(command, e.msg, status=Status.FAILURE)
except BuildError as e:
reason = ""
if e.reason:
reason = " : " + e.reason
log.error("Service '{}' failed to build{}".format(e.service.name, reason))
sys.exit(1)
exit_with_metrics(command,
"Service '{}' failed to build{}".format(e.service.name, reason),
status=Status.FAILURE)
except StreamOutputError as e:
log.error(e)
sys.exit(1)
exit_with_metrics(command, e, status=Status.FAILURE)
except NeedsBuildError as e:
log.error("Service '{}' needs to be built, but --no-build was passed.".format(e.service.name))
sys.exit(1)
exit_with_metrics(command,
"Service '{}' needs to be built, but --no-build was passed.".format(
e.service.name), status=Status.FAILURE)
except NoSuchCommand as e:
commands = "\n".join(parse_doc_section("commands:", getdoc(e.supercommand)))
log.error("No such command: %s\n\n%s", e.command, commands)
sys.exit(1)
exit_with_metrics(e.command, "No such command: {}\n\n{}".format(e.command, commands))
except (errors.ConnectionError, StreamParseError):
sys.exit(1)
exit_with_metrics(command, status=Status.FAILURE)
except SystemExit as e:
status = Status.SUCCESS
if len(sys.argv) > 1 and '--help' not in sys.argv:
status = Status.FAILURE
if command and len(sys.argv) >= 3 and sys.argv[2] == '--help':
command = '--help ' + command
if not command and len(sys.argv) >= 2 and sys.argv[1] == '--help':
command = '--help'
msg = e.args[0] if len(e.args) else ""
code = 0
if isinstance(e.code, int):
code = e.code
exit_with_metrics(command, log_msg=msg, status=status,
exit_code=code)
def get_filtered_args(args):
if args[0] in ('-h', '--help'):
return []
if args[0] == '--version':
return ['version']
def exit_with_metrics(command, log_msg=None, status=Status.SUCCESS, exit_code=1):
if log_msg:
if not exit_code:
log.info(log_msg)
else:
log.error(log_msg)
MetricsCommand(command, status=status).send_metrics()
sys.exit(exit_code)
def dispatch():
setup_logging()
console_stream = sys.stderr
console_handler = logging.StreamHandler(console_stream)
setup_logging(console_handler)
dispatcher = DocoptDispatcher(
TopLevelCommand,
{'options_first': True, 'version': get_version_info('compose')})
options, handler, command_options = dispatcher.parse(sys.argv[1:])
ansi_mode = AnsiMode.AUTO
try:
if options.get("--ansi"):
ansi_mode = AnsiMode(options.get("--ansi"))
except ValueError:
raise UserError(
'Invalid value for --ansi: {}. Expected one of {}.'.format(
options.get("--ansi"),
', '.join(m.value for m in AnsiMode)
)
)
if options.get("--no-ansi"):
if options.get("--ansi"):
raise UserError("--no-ansi and --ansi cannot be combined.")
log.warning('--no-ansi option is deprecated and will be removed in future versions.')
ansi_mode = AnsiMode.NEVER
setup_console_handler(console_handler,
options.get('--verbose'),
set_no_color_if_clicolor(options.get('--no-ansi')),
ansi_mode.use_ansi_codes(console_handler.stream),
options.get("--log-level"))
setup_parallel_logger(set_no_color_if_clicolor(options.get('--no-ansi')))
if options.get('--no-ansi'):
setup_parallel_logger(ansi_mode)
if ansi_mode is AnsiMode.NEVER:
command_options['--no-color'] = True
return functools.partial(perform_command, options, handler, command_options)
@@ -127,23 +192,23 @@ def perform_command(options, handler, command_options):
handler(command, command_options)
def setup_logging():
def setup_logging(console_handler):
root_logger = logging.getLogger()
root_logger.addHandler(console_handler)
root_logger.setLevel(logging.DEBUG)
# Disable requests logging
# Disable requests and docker-py logging
logging.getLogger("urllib3").propagate = False
logging.getLogger("requests").propagate = False
logging.getLogger("docker").propagate = False
def setup_parallel_logger(noansi):
if noansi:
import compose.parallel
compose.parallel.ParallelStreamWriter.set_noansi()
def setup_parallel_logger(ansi_mode):
ParallelStreamWriter.set_default_ansi_mode(ansi_mode)
def setup_console_handler(handler, verbose, noansi=False, level=None):
if handler.stream.isatty() and noansi is False:
def setup_console_handler(handler, verbose, use_console_formatter=True, level=None):
if use_console_formatter:
format_class = ConsoleWarningFormatter
else:
format_class = logging.Formatter
@@ -195,7 +260,8 @@ class TopLevelCommand:
-c, --context NAME Specify a context name
--verbose Show more output
--log-level LEVEL Set log level (DEBUG, INFO, WARNING, ERROR, CRITICAL)
--no-ansi Do not print ANSI control characters
--ansi (never|always|auto) Control when to print ANSI control characters
--no-ansi Do not print ANSI control characters (DEPRECATED)
-v, --version Print version and exit
-H, --host HOST Daemon socket to connect to
@@ -253,6 +319,7 @@ class TopLevelCommand:
environment_file = self.toplevel_options.get('--env-file')
return Environment.from_env_file(self.project_dir, environment_file)
@metrics()
def build(self, options):
"""
Build or rebuild services.
@@ -272,8 +339,6 @@ class TopLevelCommand:
--no-rm Do not remove intermediate containers after a successful build.
--parallel Build images in parallel.
--progress string Set type of progress output (auto, plain, tty).
EXPERIMENTAL flag for native builder.
To enable, run with COMPOSE_DOCKER_CLI_BUILD=1)
--pull Always attempt to pull a newer version of the image.
-q, --quiet Don't print anything to STDOUT
"""
@@ -287,7 +352,7 @@ class TopLevelCommand:
)
build_args = resolve_build_args(build_args, self.toplevel_environment)
native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD')
native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True)
self.project.build(
service_names=options['SERVICE'],
@@ -304,6 +369,7 @@ class TopLevelCommand:
progress=options.get('--progress'),
)
@metrics()
def config(self, options):
"""
Validate and view the Compose file.
@@ -353,6 +419,7 @@ class TopLevelCommand:
print(serialize_config(compose_config, image_digests, not options['--no-interpolate']))
@metrics()
def create(self, options):
"""
Creates containers for a service.
@@ -381,6 +448,7 @@ class TopLevelCommand:
do_build=build_action_from_opts(options),
)
@metrics()
def down(self, options):
"""
Stops containers and removes containers, networks, volumes, and images
@@ -449,6 +517,7 @@ class TopLevelCommand:
print(formatter(event))
sys.stdout.flush()
@metrics("exec")
def exec_command(self, options):
"""
Execute a command in a running container
@@ -525,6 +594,7 @@ class TopLevelCommand:
sys.exit(exit_code)
@classmethod
@metrics()
def help(cls, options):
"""
Get help on a command.
@@ -538,6 +608,7 @@ class TopLevelCommand:
print(getdoc(subject))
@metrics()
def images(self, options):
"""
List images used by the created containers.
@@ -592,6 +663,7 @@ class TopLevelCommand:
])
print(Formatter.table(headers, rows))
@metrics()
def kill(self, options):
"""
Force stop service containers.
@@ -606,6 +678,7 @@ class TopLevelCommand:
self.project.kill(service_names=options['SERVICE'], signal=signal)
@metrics()
def logs(self, options):
"""
View output from containers.
@@ -618,7 +691,7 @@ class TopLevelCommand:
-t, --timestamps Show timestamps.
--tail="all" Number of lines to show from the end of the logs
for each container.
--no-log-prefix Don't print prefix in logs.
--no-log-prefix Don't print prefix in logs.
"""
containers = self.project.containers(service_names=options['SERVICE'], stopped=True)
@@ -637,11 +710,12 @@ class TopLevelCommand:
log_printer_from_project(
self.project,
containers,
set_no_color_if_clicolor(options['--no-color']),
options['--no-color'],
log_args,
event_stream=self.project.events(service_names=options['SERVICE']),
keep_prefix=not options['--no-log-prefix']).run()
@metrics()
def pause(self, options):
"""
Pause services.
@@ -651,6 +725,7 @@ class TopLevelCommand:
containers = self.project.pause(service_names=options['SERVICE'])
exit_if(not containers, 'No containers to pause', 1)
@metrics()
def port(self, options):
"""
Print the public port for a port binding.
@@ -672,6 +747,7 @@ class TopLevelCommand:
options['PRIVATE_PORT'],
protocol=options.get('--protocol') or 'tcp') or '')
@metrics()
def ps(self, options):
"""
List containers.
@@ -728,6 +804,7 @@ class TopLevelCommand:
])
print(Formatter.table(headers, rows))
@metrics()
def pull(self, options):
"""
Pulls images for services defined in a Compose file, but does not start the containers.
@@ -751,6 +828,7 @@ class TopLevelCommand:
include_deps=options.get('--include-deps'),
)
@metrics()
def push(self, options):
"""
Pushes images for services.
@@ -765,6 +843,7 @@ class TopLevelCommand:
ignore_push_failures=options.get('--ignore-push-failures')
)
@metrics()
def rm(self, options):
"""
Removes stopped service containers.
@@ -809,6 +888,7 @@ class TopLevelCommand:
else:
print("No stopped containers")
@metrics()
def run(self, options):
"""
Run a one-off command on a service.
@@ -869,6 +949,7 @@ class TopLevelCommand:
self.toplevel_options, self.toplevel_environment
)
@metrics()
def scale(self, options):
"""
Set number of containers to run for a service.
@@ -897,6 +978,7 @@ class TopLevelCommand:
for service_name, num in parse_scale_args(options['SERVICE=NUM']).items():
self.project.get_service(service_name).scale(num, timeout=timeout)
@metrics()
def start(self, options):
"""
Start existing containers.
@@ -906,6 +988,7 @@ class TopLevelCommand:
containers = self.project.start(service_names=options['SERVICE'])
exit_if(not containers, 'No containers to start', 1)
@metrics()
def stop(self, options):
"""
Stop running containers without removing them.
@@ -921,6 +1004,7 @@ class TopLevelCommand:
timeout = timeout_from_opts(options)
self.project.stop(service_names=options['SERVICE'], timeout=timeout)
@metrics()
def restart(self, options):
"""
Restart running containers.
@@ -935,6 +1019,7 @@ class TopLevelCommand:
containers = self.project.restart(service_names=options['SERVICE'], timeout=timeout)
exit_if(not containers, 'No containers to restart', 1)
@metrics()
def top(self, options):
"""
Display the running processes
@@ -962,6 +1047,7 @@ class TopLevelCommand:
print(container.name)
print(Formatter.table(headers, rows))
@metrics()
def unpause(self, options):
"""
Unpause services.
@@ -971,6 +1057,7 @@ class TopLevelCommand:
containers = self.project.unpause(service_names=options['SERVICE'])
exit_if(not containers, 'No containers to unpause', 1)
@metrics()
def up(self, options):
"""
Builds, (re)creates, starts, and attaches to containers for a service.
@@ -1022,7 +1109,7 @@ class TopLevelCommand:
container. Implies --abort-on-container-exit.
--scale SERVICE=NUM Scale SERVICE to NUM instances. Overrides the
`scale` setting in the Compose file if present.
--no-log-prefix Don't print prefix in logs.
--no-log-prefix Don't print prefix in logs.
"""
start_deps = not options['--no-deps']
always_recreate_deps = options['--always-recreate-deps']
@@ -1049,7 +1136,7 @@ class TopLevelCommand:
for excluded in [x for x in opts if options.get(x) and no_start]:
raise UserError('--no-start and {} cannot be combined.'.format(excluded))
native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD')
native_builder = self.toplevel_environment.get_boolean('COMPOSE_DOCKER_CLI_BUILD', True)
with up_shutdown_context(self.project, service_names, timeout, detached):
warn_for_swarm_mode(self.project.client)
@@ -1071,6 +1158,7 @@ class TopLevelCommand:
renew_anonymous_volumes=options.get('--renew-anon-volumes'),
silent=options.get('--quiet-pull'),
cli=native_builder,
attach_dependencies=attach_dependencies,
)
try:
@@ -1098,7 +1186,7 @@ class TopLevelCommand:
log_printer = log_printer_from_project(
self.project,
attached_containers,
set_no_color_if_clicolor(options['--no-color']),
options['--no-color'],
{'follow': True},
cascade_stop,
event_stream=self.project.events(service_names=service_names),
@@ -1120,6 +1208,7 @@ class TopLevelCommand:
sys.exit(exit_code)
@classmethod
@metrics()
def version(cls, options):
"""
Show version information and quit.
@@ -1401,13 +1490,11 @@ def log_printer_from_project(
def filter_attached_containers(containers, service_names, attach_dependencies=False):
if attach_dependencies or not service_names:
return containers
return [
container
for container in containers if container.service in service_names
]
return filter_attached_for_up(
containers,
service_names,
attach_dependencies,
lambda container: container.service)
@contextlib.contextmanager
@@ -1583,7 +1670,3 @@ def warn_for_swarm_mode(client):
"To deploy your application across the swarm, "
"use `docker stack deploy`.\n"
)
def set_no_color_if_clicolor(no_color_flag):
return no_color_flag or os.environ.get('CLICOLOR') == "0"


@@ -87,7 +87,7 @@
"dockerfile": {"type": "string"},
"args": {"$ref": "#/definitions/list_or_dict"},
"labels": {"$ref": "#/definitions/list_or_dict"},
"cache_from": {"$ref": "#/definitions/list_of_strings"},
"cache_from": {"type": "array", "items": {"type": "string"}},
"network": {"type": "string"},
"target": {"type": "string"},
"shm_size": {"type": ["integer", "string"]},
@@ -330,7 +330,7 @@
"privileged": {"type": "boolean"},
"profiles": {"$ref": "#/definitions/list_of_strings"},
"pull_policy": {"type": "string", "enum": [
"always", "never", "if_not_present"
"always", "never", "if_not_present", "build"
]},
"read_only": {"type": "boolean"},
"restart": {"type": "string"},


@@ -113,13 +113,13 @@ class Environment(dict):
)
return super().get(key, *args, **kwargs)
def get_boolean(self, key):
def get_boolean(self, key, default=False):
# Convert a value to a boolean using "common sense" rules.
# Unset, empty, "0" and "false" (i-case) yield False.
# All other values yield True.
value = self.get(key)
if not value:
return False
return default
if value.lower() in ['0', 'false']:
return False
return True
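
For illustration, a standalone mirror (not the compose Environment class itself) of the get_boolean rules, showing why an unset COMPOSE_DOCKER_CLI_BUILD now enables the native builder:

    def get_boolean(env, key, default=False):
        # Mirror of Environment.get_boolean: unset/empty -> default,
        # "0"/"false" (any case) -> False, anything else -> True.
        value = env.get(key)
        if not value:
            return default
        return value.lower() not in ('0', 'false')

    # cli/main.py now calls get_boolean('COMPOSE_DOCKER_CLI_BUILD', True):
    assert get_boolean({}, 'COMPOSE_DOCKER_CLI_BUILD', default=True) is True
    assert get_boolean({'COMPOSE_DOCKER_CLI_BUILD': '0'}, 'COMPOSE_DOCKER_CLI_BUILD', default=True) is False
    assert get_boolean({'COMPOSE_DOCKER_CLI_BUILD': 'false'}, 'COMPOSE_DOCKER_CLI_BUILD', default=True) is False
    assert get_boolean({'COMPOSE_DOCKER_CLI_BUILD': '1'}, 'COMPOSE_DOCKER_CLI_BUILD', default=True) is True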


compose/metrics/client.py (new file)

@@ -0,0 +1,64 @@
import os
from enum import Enum
import requests
from docker import ContextAPI
from docker.transport import UnixHTTPAdapter
from compose.const import IS_WINDOWS_PLATFORM
if IS_WINDOWS_PLATFORM:
from docker.transport import NpipeHTTPAdapter
class Status(Enum):
SUCCESS = "success"
FAILURE = "failure"
CANCELED = "canceled"
class MetricsSource:
CLI = "docker-compose"
if IS_WINDOWS_PLATFORM:
METRICS_SOCKET_FILE = 'npipe://\\\\.\\pipe\\docker_cli'
else:
METRICS_SOCKET_FILE = 'http+unix:///var/run/docker-cli.sock'
class MetricsCommand(requests.Session):
"""
Representation of a command in the metrics.
"""
def __init__(self, command,
context_type=None, status=Status.SUCCESS,
source=MetricsSource.CLI, uri=None):
super().__init__()
self.command = "compose " + command if command else "compose --help"
self.context = context_type or ContextAPI.get_current_context().context_type or 'moby'
self.source = source
self.status = status.value
self.uri = uri or os.environ.get("METRICS_SOCKET_FILE", METRICS_SOCKET_FILE)
if IS_WINDOWS_PLATFORM:
self.mount("http+unix://", NpipeHTTPAdapter(self.uri))
else:
self.mount("http+unix://", UnixHTTPAdapter(self.uri))
def send_metrics(self):
try:
return self.post("http+unix://localhost/usage",
json=self.to_map(),
timeout=.05,
headers={'Content-Type': 'application/json'})
except Exception as e:
return e
def to_map(self):
return {
'command': self.command,
'context': self.context,
'source': self.source,
'status': self.status,
}


@@ -0,0 +1,21 @@
import functools
from compose.metrics.client import MetricsCommand
from compose.metrics.client import Status
class metrics:
def __init__(self, command_name=None):
self.command_name = command_name
def __call__(self, fn):
@functools.wraps(fn,
assigned=functools.WRAPPER_ASSIGNMENTS,
updated=functools.WRAPPER_UPDATES)
def wrapper(*args, **kwargs):
if not self.command_name:
self.command_name = fn.__name__
result = fn(*args, **kwargs)
MetricsCommand(self.command_name, status=Status.SUCCESS).send_metrics()
return result
return wrapper
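
For illustration, a standalone sketch (not part of the diff above) of the decorator pattern used here: the command name falls back to the wrapped function's name; report() is a hypothetical stand-in for MetricsCommand(...).send_metrics():

    import functools

    def report(command_name, status):
        # hypothetical stub standing in for MetricsCommand(command_name, status=...).send_metrics()
        print('metrics:', command_name, status)

    class metrics:
        def __init__(self, command_name=None):
            self.command_name = command_name

        def __call__(self, fn):
            @functools.wraps(fn)
            def wrapper(*args, **kwargs):
                if not self.command_name:
                    self.command_name = fn.__name__  # e.g. "up", "ps"
                result = fn(*args, **kwargs)
                report(self.command_name, 'success')
                return result
            return wrapper

    @metrics("exec")   # explicit name, as used for exec_command in cli/main.py
    def exec_command():
        pass

    @metrics()         # falls back to the function name: "up"
    def up():
        pass

    exec_command()     # metrics: exec success
    up()               # metrics: up success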


@@ -11,6 +11,7 @@ from threading import Thread
from docker.errors import APIError
from docker.errors import ImageNotFound
from compose.cli.colors import AnsiMode
from compose.cli.colors import green
from compose.cli.colors import red
from compose.cli.signals import ShutdownException
@@ -83,10 +84,7 @@ def parallel_execute(objects, func, get_name, msg, get_deps=None, limit=None, fa
objects = list(objects)
stream = sys.stderr
if ParallelStreamWriter.instance:
writer = ParallelStreamWriter.instance
else:
writer = ParallelStreamWriter(stream)
writer = ParallelStreamWriter.get_or_assign_instance(ParallelStreamWriter(stream))
for obj in objects:
writer.add_object(msg, get_name(obj))
@@ -259,19 +257,37 @@ class ParallelStreamWriter:
to jump to the correct line, and write over the line.
"""
noansi = False
lock = Lock()
default_ansi_mode = AnsiMode.AUTO
write_lock = Lock()
instance = None
instance_lock = Lock()
@classmethod
def set_noansi(cls, value=True):
cls.noansi = value
def get_instance(cls):
return cls.instance
def __init__(self, stream):
@classmethod
def get_or_assign_instance(cls, writer):
cls.instance_lock.acquire()
try:
if cls.instance is None:
cls.instance = writer
return cls.instance
finally:
cls.instance_lock.release()
@classmethod
def set_default_ansi_mode(cls, ansi_mode):
cls.default_ansi_mode = ansi_mode
def __init__(self, stream, ansi_mode=None):
if ansi_mode is None:
ansi_mode = self.default_ansi_mode
self.stream = stream
self.use_ansi_codes = ansi_mode.use_ansi_codes(stream)
self.lines = []
self.width = 0
ParallelStreamWriter.instance = self
def add_object(self, msg, obj_index):
if msg is None:
@@ -285,7 +301,7 @@ class ParallelStreamWriter:
return self._write_noansi(msg, obj_index, '')
def _write_ansi(self, msg, obj_index, status):
self.lock.acquire()
self.write_lock.acquire()
position = self.lines.index(msg + obj_index)
diff = len(self.lines) - position
# move up
@@ -297,7 +313,7 @@ class ParallelStreamWriter:
# move back down
self.stream.write("%c[%dB" % (27, diff))
self.stream.flush()
self.lock.release()
self.write_lock.release()
def _write_noansi(self, msg, obj_index, status):
self.stream.write(
@@ -310,17 +326,10 @@ class ParallelStreamWriter:
def write(self, msg, obj_index, status, color_func):
if msg is None:
return
if self.noansi:
self._write_noansi(msg, obj_index, status)
else:
if self.use_ansi_codes:
self._write_ansi(msg, obj_index, color_func(status))
def get_stream_writer():
instance = ParallelStreamWriter.instance
if instance is None:
raise RuntimeError('ParallelStreamWriter has not yet been instantiated')
return instance
else:
self._write_noansi(msg, obj_index, status)
def parallel_operation(containers, operation, options, message):


@@ -39,6 +39,7 @@ from .service import Service
from .service import ServiceIpcMode
from .service import ServiceNetworkMode
from .service import ServicePidMode
from .utils import filter_attached_for_up
from .utils import microseconds_from_time_nano
from .utils import truncate_string
from .volume import ProjectVolumes
@@ -489,7 +490,8 @@ class Project:
log.info('%s uses an image, skipping' % service.name)
if cli:
log.warning("Native build is an experimental feature and could change at any time")
log.info("Building with native build. Learn about native build in Compose here: "
"https://docs.docker.com/go/compose-native-build/")
if parallel_build:
log.warning("Flag '--parallel' is ignored when building with "
"COMPOSE_DOCKER_CLI_BUILD=1")
@@ -645,11 +647,13 @@ class Project:
silent=False,
cli=False,
one_off=False,
attach_dependencies=False,
override_options=None,
):
if cli:
log.warning("Native build is an experimental feature and could change at any time")
log.info("Building with native build. Learn about native build in Compose here: "
"https://docs.docker.com/go/compose-native-build/")
self.initialize()
if not ignore_orphans:
@@ -671,12 +675,17 @@ class Project:
one_off=service_names if one_off else [],
)
def do(service):
services_to_attach = filter_attached_for_up(
services,
service_names,
attach_dependencies,
lambda service: service.name)
def do(service):
return service.execute_convergence_plan(
plans[service.name],
timeout=timeout,
detached=detached,
detached=detached or (service not in services_to_attach),
scale_override=scale_override.get(service.name),
rescale=rescale,
start=start,
@@ -780,7 +789,9 @@ class Project:
return
try:
writer = parallel.get_stream_writer()
writer = parallel.ParallelStreamWriter.get_instance()
if writer is None:
raise RuntimeError('ParallelStreamWriter has not yet been instantiated')
for event in strm:
if 'status' not in event:
continue


@@ -174,3 +174,18 @@ def truncate_string(s, max_chars=35):
if len(s) > max_chars:
return s[:max_chars - 2] + '...'
return s
def filter_attached_for_up(items, service_names, attach_dependencies=False,
item_to_service_name=lambda x: x):
"""This function contains the logic of choosing which services to
attach when doing docker-compose up. It may be used both with containers
and services, and any other entities that map to service names -
this mapping is provided by item_to_service_name."""
if attach_dependencies or not service_names:
return items
return [
item
for item in items if item_to_service_name(item) in service_names
]
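
For illustration, a hedged usage sketch (not part of the diff above) of the shared helper, assuming compose 1.28 is importable; the namedtuple is only a stand-in for compose's container and service objects:

    from collections import namedtuple

    from compose.utils import filter_attached_for_up

    Container = namedtuple('Container', 'name service')
    containers = [
        Container('web_1', 'web'),
        Container('db_1', 'db'),   # dependency started by `up web`, but not attached
    ]

    # `docker-compose up web`: attach only to the services named on the command line
    attached = filter_attached_for_up(
        containers, ['web'], item_to_service_name=lambda c: c.service)
    assert [c.name for c in attached] == ['web_1']

    # `--attach-dependencies` (or no explicit service list) keeps everything attached
    assert filter_attached_for_up(containers, ['web'], attach_dependencies=True) == containers
    assert filter_attached_for_up(containers, []) == containers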


@@ -164,6 +164,10 @@ _docker_compose_docker_compose() {
_filedir "y?(a)ml"
return
;;
--ansi)
COMPREPLY=( $( compgen -W "never always auto" -- "$cur" ) )
return
;;
--log-level)
COMPREPLY=( $( compgen -W "debug info warning error critical" -- "$cur" ) )
return
@@ -290,7 +294,7 @@ _docker_compose_logs() {
case "$cur" in
-*)
COMPREPLY=( $( compgen -W "--follow -f --help --no-color --tail --timestamps -t" -- "$cur" ) )
COMPREPLY=( $( compgen -W "--follow -f --help --no-color --no-log-prefix --tail --timestamps -t" -- "$cur" ) )
;;
*)
__docker_compose_complete_services
@@ -545,7 +549,7 @@ _docker_compose_up() {
case "$cur" in
-*)
COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
COMPREPLY=( $( compgen -W "--abort-on-container-exit --always-recreate-deps --attach-dependencies --build -d --detach --exit-code-from --force-recreate --help --no-build --no-color --no-deps --no-log-prefix --no-recreate --no-start --renew-anon-volumes -V --remove-orphans --scale --timeout -t" -- "$cur" ) )
;;
*)
__docker_compose_complete_services
@@ -616,6 +620,7 @@ _docker_compose() {
# These options are require special treatment when searching the command.
local top_level_options_with_args="
--ansi
--log-level
"


@@ -21,5 +21,7 @@ complete -c docker-compose -l tlscert -r -d 'Path to TLS certif
complete -c docker-compose -l tlskey -r -d 'Path to TLS key file'
complete -c docker-compose -l tlsverify -d 'Use TLS and verify the remote'
complete -c docker-compose -l skip-hostname-check -d "Don't check the daemon's hostname against the name specified in the client certificate (for example if your docker host is an IP address)"
complete -c docker-compose -l no-ansi -d 'Do not print ANSI control characters'
complete -c docker-compose -l ansi -a never always auto -d 'Control when to print ANSI control characters'
complete -c docker-compose -s h -l help -d 'Print usage'
complete -c docker-compose -s v -l version -d 'Print version and exit'


@@ -342,6 +342,7 @@ _docker-compose() {
'--verbose[Show more output]' \
'--log-level=[Set log level]:level:(DEBUG INFO WARNING ERROR CRITICAL)' \
'--no-ansi[Do not print ANSI control characters]' \
'--ansi=[Control when to print ANSI control characters]:when:(never always auto)' \
'(-H --host)'{-H,--host}'[Daemon socket to connect to]:host:' \
'--tls[Use TLS; implied by --tlsverify]' \
'--tlscacert=[Trust certs signed only by this CA]:ca path:' \


@@ -1,7 +1,7 @@
altgraph==0.17
appdirs==1.4.4
attrs==20.3.0
bcrypt==3.1.7
bcrypt==3.2.0
cffi==1.14.4
cryptography==3.2.1
distlib==0.3.1
@@ -23,6 +23,6 @@ pyrsistent==0.16.0
smmap==3.0.4
smmap2==3.0.1
toml==0.10.1
tox==3.20.1
virtualenv==20.2.1
tox==3.21.2
virtualenv==20.4.0
wcwidth==0.2.5


@@ -4,7 +4,7 @@ certifi==2020.6.20
chardet==3.0.4
colorama==0.4.3; sys_platform == 'win32'
distro==1.5.0
docker==4.4.0
docker==4.4.1
docker-pycreds==0.4.0
dockerpty==0.4.1
docopt==0.6.2


@@ -3,7 +3,7 @@
set -ex
CODE_PATH=/code
VENV="${CODE_PATH}"/.tox/py39
VENV="${CODE_PATH}"/.tox/py37
cd "${CODE_PATH}"
mkdir -p dist


@@ -16,7 +16,7 @@
#
# 4. In Powershell, run the following commands:
#
# $ pip install 'virtualenv==20.2.1'
# $ pip install 'virtualenv==20.2.2'
# $ Set-ExecutionPolicy -Scope CurrentUser RemoteSigned
#
# 5. Clone the repository:

script/release/release.py (changed from normal file to executable, no content changes)


@@ -15,7 +15,7 @@
set -e
VERSION="1.26.1"
VERSION="1.28.2"
IMAGE="docker/compose:$VERSION"


@@ -36,7 +36,7 @@ if ! [ -x "$(command -v python3)" ]; then
brew install python3
fi
if ! [ -x "$(command -v virtualenv)" ]; then
pip3 install virtualenv==20.2.1
pip3 install virtualenv==20.2.2
fi
#


@@ -11,7 +11,7 @@ docker run --rm \
"$TAG" tox -e pre-commit
get_versions="docker run --rm
--entrypoint=/code/.tox/py39/bin/python
--entrypoint=/code/.tox/py37/bin/python
$TAG
/code/script/test/versions.py docker/docker-ce,moby/moby"
@@ -21,9 +21,8 @@ elif [ "$DOCKER_VERSIONS" == "all" ]; then
DOCKER_VERSIONS=$($get_versions -n 2 recent)
fi
BUILD_NUMBER=${BUILD_NUMBER-$USER}
PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py39}
PY_TEST_VERSIONS=${PY_TEST_VERSIONS:-py37}
for version in $DOCKER_VERSIONS; do
>&2 echo "Running tests against Docker $version"


@@ -32,7 +32,7 @@ install_requires = [
'texttable >= 0.9.0, < 2',
'websocket-client >= 0.32.0, < 1',
'distro >= 1.5.0, < 2',
'docker[ssh] >= 4.3.1, < 5',
'docker[ssh] >= 4.4.0, < 5',
'dockerpty >= 0.4.1, < 1',
'jsonschema >= 2.5.1, < 4',
'python-dotenv >= 0.13.0, < 1',


@@ -58,13 +58,16 @@ COMPOSE_COMPATIBILITY_DICT = {
}
def start_process(base_dir, options):
def start_process(base_dir, options, executable=None, env=None):
executable = executable or DOCKER_COMPOSE_EXECUTABLE
proc = subprocess.Popen(
[DOCKER_COMPOSE_EXECUTABLE] + options,
[executable] + options,
stdin=subprocess.PIPE,
stdout=subprocess.PIPE,
stderr=subprocess.PIPE,
cwd=base_dir)
cwd=base_dir,
env=env,
)
print("Running process: %s" % proc.pid)
return proc
@@ -78,9 +81,10 @@ def wait_on_process(proc, returncode=0, stdin=None):
return ProcessResult(stdout.decode('utf-8'), stderr.decode('utf-8'))
def dispatch(base_dir, options, project_options=None, returncode=0, stdin=None):
def dispatch(base_dir, options,
project_options=None, returncode=0, stdin=None, executable=None, env=None):
project_options = project_options or []
proc = start_process(base_dir, project_options + options)
proc = start_process(base_dir, project_options + options, executable=executable, env=env)
return wait_on_process(proc, returncode=returncode, stdin=stdin)
@@ -783,7 +787,11 @@ services:
assert BUILD_CACHE_TEXT not in result.stdout
assert BUILD_PULL_TEXT in result.stdout
@mock.patch.dict(os.environ)
def test_build_log_level(self):
os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
os.environ['DOCKER_BUILDKIT'] = '0'
self.test_env_file_relative_to_compose_file()
self.base_dir = 'tests/fixtures/simple-dockerfile'
result = self.dispatch(['--log-level', 'warning', 'build', 'simple'])
assert result.stderr == ''
@@ -845,13 +853,17 @@ services:
for c in self.project.client.containers(all=True):
self.addCleanup(self.project.client.remove_container, c, force=True)
@mock.patch.dict(os.environ)
def test_build_shm_size_build_option(self):
os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
pull_busybox(self.client)
self.base_dir = 'tests/fixtures/build-shm-size'
result = self.dispatch(['build', '--no-cache'], None)
assert 'shm_size: 96' in result.stdout
@mock.patch.dict(os.environ)
def test_build_memory_build_option(self):
os.environ['COMPOSE_DOCKER_CLI_BUILD'] = '0'
pull_busybox(self.client)
self.base_dir = 'tests/fixtures/build-memory'
result = self.dispatch(['build', '--no-cache', '--memory', '96m', 'service'], None)


@@ -0,0 +1,125 @@
import logging
import os
import socket
from http.server import BaseHTTPRequestHandler
from http.server import HTTPServer
from threading import Thread
import requests
from docker.transport import UnixHTTPAdapter
from tests.acceptance.cli_test import dispatch
from tests.integration.testcases import DockerClientTestCase
TEST_SOCKET_FILE = '/tmp/test-metrics-docker-cli.sock'
class MetricsTest(DockerClientTestCase):
test_session = requests.sessions.Session()
test_env = None
base_dir = 'tests/fixtures/v3-full'
@classmethod
def setUpClass(cls):
super().setUpClass()
MetricsTest.test_session.mount("http+unix://", UnixHTTPAdapter(TEST_SOCKET_FILE))
MetricsTest.test_env = os.environ.copy()
MetricsTest.test_env['METRICS_SOCKET_FILE'] = TEST_SOCKET_FILE
MetricsServer().start()
@classmethod
def test_metrics_help(cls):
# root `docker-compose` command is considered as a `--help`
dispatch(cls.base_dir, [], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose --help", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['help', 'run'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose help", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['--help'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose --help", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['run', '--help'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose --help run", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['up', '--help', 'extra_args'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose --help up", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
@classmethod
def test_metrics_simple_commands(cls):
dispatch(cls.base_dir, ['ps'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose ps", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['version'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose version", "context": "moby", ' \
b'"source": "docker-compose", "status": "success"}'
dispatch(cls.base_dir, ['version', '--yyy'], env=MetricsTest.test_env)
assert cls.get_content() == \
b'{"command": "compose version", "context": "moby", ' \
b'"source": "docker-compose", "status": "failure"}'
@staticmethod
def get_content():
resp = MetricsTest.test_session.get("http+unix://localhost")
print(resp.content)
return resp.content
def start_server(uri=TEST_SOCKET_FILE):
try:
os.remove(uri)
except OSError:
pass
httpd = HTTPServer(uri, MetricsHTTPRequestHandler, False)
sock = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
sock.bind(TEST_SOCKET_FILE)
sock.listen(0)
httpd.socket = sock
print('Serving on ', uri)
httpd.serve_forever()
sock.shutdown(socket.SHUT_RDWR)
sock.close()
os.remove(uri)
class MetricsServer:
@classmethod
def start(cls):
t = Thread(target=start_server, daemon=True)
t.start()
class MetricsHTTPRequestHandler(BaseHTTPRequestHandler):
usages = []
def do_GET(self):
self.client_address = ('',) # avoid exception in BaseHTTPServer.py log_message()
self.send_response(200)
self.end_headers()
for u in MetricsHTTPRequestHandler.usages:
self.wfile.write(u)
MetricsHTTPRequestHandler.usages = []
def do_POST(self):
self.client_address = ('',) # avoid exception in BaseHTTPServer.py log_message()
content_length = int(self.headers['Content-Length'])
body = self.rfile.read(content_length)
print(body)
MetricsHTTPRequestHandler.usages.append(body)
self.send_response(200)
self.end_headers()
if __name__ == '__main__':
logging.getLogger("urllib3").propagate = False
logging.getLogger("requests").propagate = False
start_server()


@@ -948,7 +948,12 @@ class ServiceTest(DockerClientTestCase):
with open(os.path.join(base_dir, 'Dockerfile'), 'w') as f:
f.write("FROM busybox\n")
service = self.create_service('web', build={'context': base_dir})
service = self.create_service('web',
build={'context': base_dir},
environment={
'COMPOSE_DOCKER_CLI_BUILD': '0',
'DOCKER_BUILDKIT': '0',
})
service.build()
self.addCleanup(self.client.remove_image, service.image_name)
@@ -964,7 +969,6 @@ class ServiceTest(DockerClientTestCase):
service = self.create_service('web',
build={'context': base_dir},
environment={
'COMPOSE_DOCKER_CLI_BUILD': '1',
'DOCKER_BUILDKIT': '1',
})
service.build(cli=True)
@@ -1015,7 +1019,6 @@ class ServiceTest(DockerClientTestCase):
web = self.create_service('web',
build={'context': base_dir},
environment={
'COMPOSE_DOCKER_CLI_BUILD': '1',
'DOCKER_BUILDKIT': '1',
})
project = Project('composetest', [web], self.client)


@@ -61,6 +61,7 @@ class DockerClientTestCase(unittest.TestCase):
@classmethod
def tearDownClass(cls):
cls.client.close()
del cls.client
def tearDown(self):


@@ -0,0 +1,56 @@
import os
import pytest
from compose.cli.colors import AnsiMode
from tests import mock
@pytest.fixture
def tty_stream():
stream = mock.Mock()
stream.isatty.return_value = True
return stream
@pytest.fixture
def non_tty_stream():
stream = mock.Mock()
stream.isatty.return_value = False
return stream
class TestAnsiModeTestCase:
@mock.patch.dict(os.environ)
def test_ansi_mode_never(self, tty_stream, non_tty_stream):
if "CLICOLOR" in os.environ:
del os.environ["CLICOLOR"]
assert not AnsiMode.NEVER.use_ansi_codes(tty_stream)
assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream)
os.environ["CLICOLOR"] = "0"
assert not AnsiMode.NEVER.use_ansi_codes(tty_stream)
assert not AnsiMode.NEVER.use_ansi_codes(non_tty_stream)
@mock.patch.dict(os.environ)
def test_ansi_mode_always(self, tty_stream, non_tty_stream):
if "CLICOLOR" in os.environ:
del os.environ["CLICOLOR"]
assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream)
assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream)
os.environ["CLICOLOR"] = "0"
assert AnsiMode.ALWAYS.use_ansi_codes(tty_stream)
assert AnsiMode.ALWAYS.use_ansi_codes(non_tty_stream)
@mock.patch.dict(os.environ)
def test_ansi_mode_auto(self, tty_stream, non_tty_stream):
if "CLICOLOR" in os.environ:
del os.environ["CLICOLOR"]
assert AnsiMode.AUTO.use_ansi_codes(tty_stream)
assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream)
os.environ["CLICOLOR"] = "0"
assert not AnsiMode.AUTO.use_ansi_codes(tty_stream)
assert not AnsiMode.AUTO.use_ansi_codes(non_tty_stream)


@@ -137,21 +137,20 @@ class TestCLIMainTestCase:
class TestSetupConsoleHandlerTestCase:
def test_with_tty_verbose(self, logging_handler):
def test_with_console_formatter_verbose(self, logging_handler):
setup_console_handler(logging_handler, True)
assert type(logging_handler.formatter) == ConsoleWarningFormatter
assert '%(name)s' in logging_handler.formatter._fmt
assert '%(funcName)s' in logging_handler.formatter._fmt
def test_with_tty_not_verbose(self, logging_handler):
def test_with_console_formatter_not_verbose(self, logging_handler):
setup_console_handler(logging_handler, False)
assert type(logging_handler.formatter) == ConsoleWarningFormatter
assert '%(name)s' not in logging_handler.formatter._fmt
assert '%(funcName)s' not in logging_handler.formatter._fmt
def test_with_not_a_tty(self, logging_handler):
logging_handler.stream.isatty.return_value = False
setup_console_handler(logging_handler, False)
def test_without_console_formatter(self, logging_handler):
setup_console_handler(logging_handler, False, use_console_formatter=False)
assert type(logging_handler.formatter) == logging.Formatter


@@ -669,7 +669,7 @@ class ConfigTest(unittest.TestCase):
assert 'Invalid service name \'mong\\o\'' in excinfo.exconly()
def test_config_duplicate_cache_from_values_validation_error(self):
def test_config_duplicate_cache_from_values_no_validation_error(self):
with pytest.raises(ConfigurationError) as exc:
config.load(
build_config_details({
@@ -681,7 +681,7 @@ class ConfigTest(unittest.TestCase):
})
)
assert 'build.cache_from contains non-unique items' in exc.exconly()
assert 'build.cache_from contains non-unique items' not in exc.exconly()
def test_load_with_multiple_files_v1(self):
base_file = config.ConfigFile(


@@ -0,0 +1,36 @@
import unittest
from compose.metrics.client import MetricsCommand
from compose.metrics.client import Status
class MetricsTest(unittest.TestCase):
@classmethod
def test_metrics(cls):
assert MetricsCommand('up', 'moby').to_map() == {
'command': 'compose up',
'context': 'moby',
'status': 'success',
'source': 'docker-compose',
}
assert MetricsCommand('down', 'local').to_map() == {
'command': 'compose down',
'context': 'local',
'status': 'success',
'source': 'docker-compose',
}
assert MetricsCommand('help', 'aci', Status.FAILURE).to_map() == {
'command': 'compose help',
'context': 'aci',
'status': 'failure',
'source': 'docker-compose',
}
assert MetricsCommand('run', 'ecs').to_map() == {
'command': 'compose run',
'context': 'ecs',
'status': 'success',
'source': 'docker-compose',
}


@@ -3,6 +3,7 @@ from threading import Lock
from docker.errors import APIError
from compose.cli.colors import AnsiMode
from compose.parallel import GlobalLimit
from compose.parallel import parallel_execute
from compose.parallel import parallel_execute_iter
@@ -156,7 +157,7 @@ def test_parallel_execute_alignment(capsys):
def test_parallel_execute_ansi(capsys):
ParallelStreamWriter.instance = None
ParallelStreamWriter.set_noansi(value=False)
ParallelStreamWriter.set_default_ansi_mode(AnsiMode.ALWAYS)
results, errors = parallel_execute(
objects=["something", "something more"],
func=lambda x: x,
@@ -172,7 +173,7 @@ def test_parallel_execute_ansi(capsys):
def test_parallel_execute_noansi(capsys):
ParallelStreamWriter.instance = None
ParallelStreamWriter.set_noansi()
ParallelStreamWriter.set_default_ansi_mode(AnsiMode.NEVER)
results, errors = parallel_execute(
objects=["something", "something more"],
func=lambda x: x,


@@ -1,5 +1,5 @@
[tox]
envlist = py39,pre-commit
envlist = py37,py39,pre-commit
[testenv]
usedevelop=True