testcontainers-node
Error: No host port found for host IP
Expected Behaviour Container should start up.
Actual Behaviour
⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯ Unhandled Error ⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯⎯
Error: No host port found for host IP
❯ resolveHostPortBinding ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/utils/bound-ports.ts:74:9
❯ ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/utils/bound-ports.ts:55:46
❯ Function.fromInspectResult ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/utils/bound-ports.ts:54:41
❯ GenericContainer.startContainer ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/generic-container/generic-container.ts:177:35
❯ processTicksAndRejections node:internal/process/task_queues:95:5
❯ createNewReaper ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/reaper/reaper.ts:84:28
❯ ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/reaper/reaper.ts:38:14
❯ withFileLock ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/common/file-lock.ts:14:12
❯ getReaper ../../node_modules/.pnpm/[email protected]/node_modules/testcontainers/src/reaper/reaper.ts:29:12
Testcontainer Logs
testcontainers [TRACE] Container runtime info:
testcontainers {
testcontainers "node": {
testcontainers "version": "v20.16.0",
testcontainers "architecture": "arm64",
testcontainers "platform": "darwin"
testcontainers },
testcontainers "containerRuntime": {
testcontainers "host": "localhost",
testcontainers "hostIps": [
testcontainers {
testcontainers "address": "::1",
testcontainers "family": 6
testcontainers }
testcontainers ],
testcontainers "remoteSocketPath": "/var/run/docker.sock",
testcontainers "indexServerAddress": "https://index.docker.io/v1/",
testcontainers "serverVersion": "27.1.1",
testcontainers "operatingSystem": "Docker Desktop",
testcontainers "operatingSystemType": "linux",
testcontainers "architecture": "aarch64",
testcontainers "cpus": 12,
testcontainers "memory": 8219254784,
testcontainers "runtimes": [
testcontainers "io.containerd.runc.v2",
testcontainers "runc"
testcontainers ]
testcontainers },
testcontainers "compose": {
testcontainers "version": "2.29.1-desktop.1",
testcontainers "compatability": "v2"
testcontainers }
testcontainers }
Steps to Reproduce
import { PostgreSqlContainer } from "@testcontainers/postgresql";

const postgresContainer = await new PostgreSqlContainer().start();
process.env.DB_PORT = postgresContainer.getPort().toString();
process.env.DB_USER = postgresContainer.getUsername();
process.env.DB_PASS = postgresContainer.getPassword();
process.env.DB_NAME = postgresContainer.getDatabase();
Environment Information
- Operating System: MacOS 14.5
- Docker Version: 4.33.0 (160616)
- Node version: 20.16.0
- Testcontainers version:
"@testcontainers/postgresql": "10.11.0"
This is a co-worker's environment. I've checked their MacOS network settings - nothing seems out of the ordinary. They have an IPv4 setup with IPv6 set to automatic (my own working environment is more extensive, with custom IPv6). Their docker configuration settings are the default for a docker desktop installation. We also tried stopping all existing containers before re-running without any luck.
One thing I noticed between my own working environment and theirs is the following:
testcontainers "containerRuntime": {
testcontainers "host": "localhost",
testcontainers "hostIps": [
testcontainers {
testcontainers "address": "::1",
testcontainers "family": 6
testcontainers },
testcontainers {
testcontainers "address": "127.0.0.1",
testcontainers "family": 4
testcontainers }
testcontainers ],
Mine has an IPv4 address recognized while theirs does not:
testcontainers "containerRuntime": {
testcontainers "host": "localhost",
testcontainers "hostIps": [
testcontainers {
testcontainers "address": "::1",
testcontainers "family": 6
testcontainers }
testcontainers ],
We found a workaround:
TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1
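For anyone applying the workaround: the variable has to be set in the shell that launches the test runner. A minimal sketch (`printenv` here only verifies the variable is visible to child processes such as the test runner):

```shell
# Force Testcontainers to resolve the host as the IPv4 loopback
# instead of the detected ::1.
export TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1

# Verify it is visible to child processes (e.g. your test command).
printenv TESTCONTAINERS_HOST_OVERRIDE
```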
Testcontainers for Node has supported IPv6 port mappings for a long time.
Interestingly, we just closed this IPv6 related PR in tc-go yesterday (https://github.com/testcontainers/testcontainers-go/pull/2403), because Docker/Moby now supports consistent IPv4/IPv6 port mappings (since Docker 27, see https://github.com/moby/moby/pull/47871).
Do you have an idea what is happening here @cristianrgreco? Given the upstream change in Moby, does it maybe make sense to remove the conditional IPv4/IPv6 logic from tc-node, to simplify the code?
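To illustrate where the error comes from, here is a hypothetical simplification of the lookup in bound-ports.ts (not the actual tc-node source): the resolver has to match one of the detected host IPs against Docker's reported port bindings. If the runtime only reports `::1` (family 6) but Docker only published an IPv4 binding (`0.0.0.0`), no pair matches and the lookup throws:

```typescript
// Hypothetical simplification of resolveHostPortBinding from bound-ports.ts.
interface HostIp { address: string; family: 4 | 6; }
interface PortBinding { hostIp: string; hostPort: number; }

function resolveHostPortBinding(hostIps: HostIp[], bindings: PortBinding[]): number {
  for (const { family } of hostIps) {
    // Crude family check: IPv6 binding addresses (e.g. "::") contain a colon.
    const binding = bindings.find((b) =>
      family === 4 ? !b.hostIp.includes(":") : b.hostIp.includes(":")
    );
    if (binding) return binding.hostPort;
  }
  throw new Error("No host port found for host IP");
}

// The failing environment: only ::1 detected, but Docker bound 0.0.0.0 only.
const hostIps: HostIp[] = [{ address: "::1", family: 6 }];
const bindings: PortBinding[] = [{ hostIp: "0.0.0.0", hostPort: 54321 }];

try {
  resolveHostPortBinding(hostIps, bindings);
} catch (e) {
  console.log((e as Error).message); // "No host port found for host IP"
}
```

With `TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1`, the host-IP list effectively becomes IPv4, which is why the workaround unblocks the lookup.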
I'm hitting the same issue on Windows with Node 22.12.0. It comes from https://github.com/nodejs/node/issues/56137. I've temporarily downgraded to 22.11.0 and am waiting for the fix to be released to LTS.
We found a workaround:
TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1
Hello,
I updated to the latest docker desktop today (4.42.0). Since I did that, I have been getting this error when running test containers: No host port found for host IP
When I try the workaround of setting the host override above, I get Expected Reaper to map exposed port 8080. My current workaround to this problem is to downgrade to docker desktop 4.41.2. Then I can run my tests again.
Environment Information
- Operating System: MacOS 14.6.1
- Docker Version: 4.42.0
- Node version: v22.14.0
- Testcontainers version: "@testcontainers/postgresql": "11.0.0"
Let me know if you would like any other information!
We found a workaround:
TESTCONTAINERS_HOST_OVERRIDE=127.0.0.1
Hello,
I updated to the latest docker desktop today (4.42.0). Since I did that, I have been getting this error when running test containers:
No host port found for host IP
When I try the workaround of setting the host override above, I get Expected Reaper to map exposed port 8080. My current workaround to this problem is to downgrade to docker desktop 4.41.2. Then I can run my tests again.
Can confirm, same thing for me.
I'm looking into it. I can confirm the issue with Docker Desktop 4.42.0. It looks like Docker is no longer binding the host port. I'm liaising with the other Testcontainers maintainers to see if they're also affected. Will update this issue with any findings.
Seems to be a combination of things. If I disable the resource reaper, I'm able to get things working after making a few changes to the way ports are exposed (always specifying HostIp, always specifying the random port ourselves instead of deferring to Docker, always specifying the protocol as TCP). But there seems to be a separate issue with the resource reaper, which after these changes is still unable to bind the host port.
You can see from this video that Docker is behaving a bit unexpectedly, in that it shows that the port is bound and then it isn't:
https://github.com/user-attachments/assets/84695a24-45e7-429b-8378-150d606c553a
Can see the split second the port was bound:
I can confirm that this is happening after the Docker Desktop v4.42.0 update. Unfortunately the update has been controlled by our IT dept, so I'm not sure I can downgrade it.
MacOS 15.5
Edit: I was able to downgrade, and get the tests working again.
Is this only affecting Mac users?
Is this only affecting Mac users?
I don't have another way of testing it unfortunately.
I can confirm that this is happening after the Docker Desktop v4.42.0 update. Unfortunately the update has been controlled by our IT dept, so I'm not sure I can downgrade it.
MacOS 15.5
Experiencing this as well. Same versions of Docker / MacOS
I can confirm that this is happening after the Docker Desktop v4.42.0 update. Unfortunately the update has been controlled by our IT dept, so I'm not sure I can downgrade it. MacOS 15.5
Experiencing this as well. Same versions of Docker / MacOS
You can download the previous version (if you've installed manually) from here. Adding TESTCONTAINERS_HOST_OVERRIDE does not work unfortunately.
Thanks - yes I'd been down the override route first before landing here. Have downgraded for now, will just have to ignore the MDM nags to upgrade again for a while until it gets resolved!
The issue has been reported to the Docker team and they're working on it. The issue appears with Docker Engine 28.1.1.
I'm trying to get more info (to see if I can release a workaround) as well as gauge timelines for the fix. Will keep you updated.
@cristianrgreco was an issue opened in one of the Docker repos? If so, do you mind linking it here as well?
The docker ticket is internal unfortunately, but I'll update when a PR has been raised. It's possible it will take a few days for them to resolve, so it may be worth rolling back if you haven't done so already. I've been told Docker have paused the rollout of 4.42.0 because of this issue. Apparently there is a race condition in Docker Desktop when starting a container and making the public ephemeral ports available.
See https://github.com/docker/for-mac/issues/7693
Thanks for the responses. We saw the same error when starting testcontainers from inside of WSL on our end, so your suggestion that it's an issue internal to Docker Engine makes sense for our case. We've rolled back for the time being.
Just want to share, that we highlighted this as a Known Issue in the release notes of the 4.42.0 Docker Desktop release.
As I understand the issue, I currently don't believe this is an issue within the Docker Engine itself, but more related to Docker Desktop (and the VM networking) specifically.
This release contains a regression with docker port, resulting in "No host port found for host IP" errors when using testcontainers-node. See testcontainers/testcontainers-node#818
The scope of the issue is much wider than just testcontainers-node, as can be seen here: https://github.com/docker/for-mac/issues/7693.
For today, I would give the following guidance to tc-node users: The issue will be fixed in a future patch release, but you should downgrade to 4.41.2 for now.
I'm able to fix the issue 100% of the time by adding a delay before getting the container inspect result here: https://github.com/testcontainers/testcontainers-node/blob/7f0501c475083b647dacda6df27f8dbbe24ece32/packages/testcontainers/src/generic-container/generic-container.ts#L200
Seems we're too fast 😄
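A more robust alternative to a fixed delay would be polling the inspect result until the binding actually appears. This is only a sketch, not the fix that shipped in 11.0.2; `inspect` stands in for the real Dockerode inspect call:

```typescript
// Hypothetical polling helper: retries an inspect-like function until the
// host port binding is populated, instead of sleeping a fixed amount.
type Inspect = () => Promise<{ hostPort?: number }>;

async function waitForHostPort(
  inspect: Inspect,
  timeoutMs = 5000,
  intervalMs = 100
): Promise<number> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const { hostPort } = await inspect();
    if (hostPort !== undefined) return hostPort;
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error("No host port found for host IP");
}

// Simulate the Docker Desktop 4.42.0 race: the binding only shows up
// on the third inspect call.
let calls = 0;
const flakyInspect: Inspect = async () => (++calls < 3 ? {} : { hostPort: 54321 });
waitForHostPort(flakyInspect).then((port) => console.log(port)); // logs 54321
```

Polling bounds the wait by a timeout rather than guessing a delay, so it stays correct on fast and slow hosts alike.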
A fix has been released in [email protected]. Please test and confirm
A fix has been released in [email protected]. Please test and confirm
Updating to [email protected] solved the problem for me.
I'm on Windows with the same problem. 11.0.2 also fixes the problem for me.
With Docker Desktop v4.42.0, I upgraded the testcontainers version to 11.0.2 (macOS Sequoia 15.5) but am still getting the error.
v4.42.0 upgraded testcontainers version to 11.0.2 but still getting the error.
Did you run npm i?
I'm on Windows with the same problem. 11.0.2 also fixes the problem for me.
@NilsMoller, how do you update just the testcontainers on Windows?
I'm on Windows with the same problem. 11.0.2 also fixes the problem for me.
@NilsMoller, how do you update just the testcontainers on Windows?
@nazhussain By simply updating the package version in package.json and installing it.
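To make that concrete, the update is the same on any OS: bump the versions in package.json and reinstall. A sketch (only the two testcontainers entries are relevant; your other dependencies stay untouched):

```json
{
  "devDependencies": {
    "@testcontainers/postgresql": "^11.0.2",
    "testcontainers": "^11.0.2"
  }
}
```

Then run `npm install` (or `pnpm install` / `yarn install`, matching your package manager) and rerun the tests.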