Attaching to a new container, docker compose up misses the initial logs
Description
With docker-compose up, attaching to a new container misses the initial log lines for that container.
While testing #8859, I noticed that when a container was stopped, removed, then recreated, the first few log lines were missing.
Steps to reproduce the issue:
```yaml
version: "3.8"
services:
  aaa:
    image: ubuntu:focal
    init: true
    command: bash -c 'echo Start aaa; while echo aaa; do sleep 5; done | cat -n'
  bbb:
    image: ubuntu:focal
    init: true
    command: bash -c 'echo Start bbb; while echo bbb; do sleep 5; done | cat -n'
```
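In each service's `command`, only the `while` loop is piped through `cat -n`, so the initial `Start` line is unnumbered and every later line is numbered. A bounded stand-in (no Docker needed) shows the shape of the output:

```shell
# Same pipeline shape as the service command, but with a finite loop and no
# sleeps so it terminates immediately; the numbering makes missing lines obvious.
bash -c 'echo Start bbb; for i in 1 2 3; do echo bbb; done | cat -n'
# First line is "Start bbb" (unnumbered), followed by "bbb" lines numbered 1 to 3.
```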
1. Start the stack with the above compose file:
   ```shell
   docker-compose up
   ```
2. In another terminal, stop and remove one of the containers:
   ```shell
   docker-compose rm -fs bbb
   ```
3. Start a new container:
   ```shell
   docker-compose up -d bbb
   ```
4. The first few log lines of the new container are not shown.

Steps 2 and 3 can also be replaced by `docker-compose up -d --scale bbb=1`.
Describe the results you received:
The logs from `docker-compose up` will look something like this:
```
Attaching to test-aaa-1, test-bbb-1
test-bbb-1 | Start bbb
test-bbb-1 | 1 bbb
test-aaa-1 | Start aaa
test-aaa-1 | 1 aaa
test-bbb-1 | 2 bbb
test-aaa-1 | 2 aaa
test-bbb-1 | 3 bbb
test-aaa-1 | 3 aaa
test-bbb-1 exited with code 143
test-aaa-1 | 4 aaa
test-bbb-1 | 2 bbb
test-aaa-1 | 5 aaa
test-bbb-1 | 3 bbb
```
Describe the results you expected:
The logs should contain a second instance of "Start bbb" and "1 bbb".
Note that `docker-compose logs -f` shows the full logs with no missing lines.
```
Attaching to test-aaa-1, test-bbb-1
test-aaa-1 | Start aaa
test-aaa-1 | 1 aaa
test-bbb-1 | Start bbb
test-bbb-1 | 1 bbb
test-aaa-1 | 2 aaa
test-bbb-1 | 2 bbb
test-aaa-1 | 3 aaa
test-bbb-1 | 3 bbb
test-bbb-1 exited with code 143
test-bbb-1 | Start bbb   <--- missing
test-bbb-1 | 1 bbb       <--- missing
test-aaa-1 | 4 aaa
test-bbb-1 | 2 bbb
test-aaa-1 | 5 aaa
test-bbb-1 | 3 bbb
```
Additional information you deem important (e.g. issue happens only occasionally):
This is not yet reproducible in v2.1.0, because of #8747, but can be reproduced in the current v2 branch. Hence the non-release version number below.
Output of `docker compose version`:

```
Docker Compose version v2.1.0-7-g125752c1
```
Output of `docker info`:

```
Client:
 Context: default
 Debug Mode: false
 Plugins:
  app: Docker App (Docker Inc., v0.9.1-beta3)
  buildx: Build with BuildKit (Docker Inc., v0.6.3-docker)
  scan: Docker Scan (Docker Inc., v0.9.0)

Server:
 Containers: 14
  Running: 12
  Paused: 0
  Stopped: 2
 Images: 68
 Server Version: 20.10.10
 Storage Driver: overlay2
  Backing Filesystem: extfs
  Supports d_type: true
  Native Overlay Diff: true
  userxattr: false
 Logging Driver: json-file
 Cgroup Driver: cgroupfs
 Cgroup Version: 1
 Plugins:
  Volume: local
  Network: bridge host ipvlan macvlan null overlay
  Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
 Swarm: inactive
 Runtimes: io.containerd.runc.v2 io.containerd.runtime.v1.linux runc
 Default Runtime: runc
 Init Binary: docker-init
 containerd version: 5b46e404f6b9f661a205e28d59c982d3634148f8
 runc version: v1.0.2-0-g52b36a2
 init version: de40ad0
 Security Options:
  apparmor
  seccomp
   Profile: default
 Kernel Version: 5.4.0-89-generic
 Operating System: Ubuntu 20.04.3 LTS
 OSType: linux
 Architecture: x86_64
 CPUs: 4
 Total Memory: 14.6GiB
 Name: tunip
 ID: ZDRE:BL7Z:RPRJ:MYVQ:42AQ:SVGM:W3CS:6SMU:GTON:LXAB:FET7:WWZB
 Docker Root Dir: /var/lib/docker
 Debug Mode: false
 Username: stephenthirlwall
 Registry: https://index.docker.io/v1/
 Labels:
 Experimental: false
 Insecure Registries:
  127.0.0.0/8
 Live Restore Enabled: false

WARNING: No swap limit support
```
Additional environment details:
It seems I've mis-characterised this issue. It can be reproduced more simply, and in v2.1.0:
1. Start the stack with the above compose file:
   ```shell
   docker-compose up
   ```
2. In another terminal, run:
   ```shell
   docker-compose restart bbb
   ```

This gives the same missing logs.
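As a loose analogy (this is not Compose's actual implementation), an attach-style reader behaves like `tail -n 0`: it only sees lines written after it connects, whereas `docker-compose logs` replays the stream from the beginning:

```shell
# Hypothetical stand-in for a container's log stream (just a temp file); this
# only illustrates attach-vs-logs semantics, not Compose internals.
log=$(mktemp)
printf 'Start bbb\n1 bbb\n' > "$log"   # lines emitted before we connect

# attach-style: start at the current end of the stream, missing earlier lines
attach_view=$(tail -n 0 "$log")

# logs-style: replay everything from the beginning
logs_view=$(cat "$log")

echo "attach sees: [$attach_view]"     # empty - the first lines are lost
echo "logs sees:   [$logs_view]"
rm -f "$log"
```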
This issue has been automatically marked as stale because it has not had recent activity. It will be closed if no further activity occurs. Thank you for your contributions.
This issue has been automatically closed because it had not recent activity during the stale period.