Maximum event size exceeded when passing inline images via @posthog/ai
Bug description
I am using the PostHog client to proxy my Gemini/OpenAI requests. When sending inline images to the API, the PostHog request fails. Fortunately it does not fail the whole request, only the PostHog request (so I still get the answer back from the API). Example error:
Error while flushing PostHog: message=HTTP error while fetching PostHog: status=413, reqByteLength=1252075, response body=maximum event size exceeded: Event rejected by kafka during send PostHogFetchHttpError: HTTP error while fetching PostHog: status=413, reqByteLength=1252075
at retriable (XXXnode_modules/posthog-core/src/index.ts:1123:17)
at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
at async retriable (XXXnode_modules/posthog-core/src/utils.ts:38:19)
at async PostHog.fetchWithRetry (XXXnode_modules/posthog-core/src/index.ts:1106:12)
at async PostHog._flush (XXXnode_modules/posthog-core/src/index.ts:1046:9) {
response: Response {
status: 413,
statusText: 'Payload Too Large',
headers: Headers {
date: 'Wed, 02 Jul 2025 17:41:58 GMT',
'content-type': 'text/plain; charset=utf-8',
'transfer-encoding': 'chunked',
connection: 'keep-alive',
vary: 'origin, access-control-request-method, access-control-request-headers, Accept-Encoding',
'access-control-allow-credentials': 'true',
'x-envoy-upstream-service-time': '21',
'content-encoding': 'gzip',
server: 'envoy',
'strict-transport-security': 'max-age=31536000; includeSubDomains'
},
body: ReadableStream { locked: true, state: 'closed', supportsBYOB: true },
bodyUsed: true,
ok: false,
redirected: false,
type: 'basic',
url: 'https://eu.i.posthog.com/batch/'
},
reqByteLength: 1252075
}
Example request:
```ts
import { PostHog } from "posthog-node";
import { OpenAI } from "@posthog/ai"; // wrapped OpenAI client that accepts the `posthog` option

const posthogClient = new PostHog(POSTHOG_PUBLIC_KEY, { host: "https://eu.i.posthog.com" });
const geminiClient = new OpenAI({
  apiKey: GEMINI_API_KEY,
  baseURL: "https://generativelanguage.googleapis.com/v1beta/openai",
  posthog: posthogClient,
});

let base64Image = "xxxx"; // use a valid image here; in theory the Gemini API supports up to 20MB of inline images
const messages: any = [
  {
    role: "user",
    content: [
      {
        type: "text",
        text: "Describe the image",
      },
      { type: "image_url", image_url: { url: `data:image/jpeg;base64,${base64Image}` } },
    ],
  },
];

const response = await geminiClient.chat.completions.create({
  model: "models/gemini-2.5-flash-lite-preview-06-17",
  messages: messages,
  max_tokens: 100,
});
```
More details about passing images can be found here: https://ai.google.dev/gemini-api/docs/image-understanding#inline-image
Note: I suspect this will also happen when passing audio and video, and possibly documents as well.
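For context on the payload sizes above: base64 encoding inflates binary data by roughly 4/3, and the captured event also carries the rest of the request and response, so even a modest image can push an event over the ingestion limit. A quick sketch of the arithmetic (the ~1MB limit is an assumption inferred from the 413 responses in this thread, not a documented figure):

```typescript
// Rough payload-size estimate for an inline base64 image.
// NOTE: the ~1MB event limit below is an assumption inferred from the
// 413 "maximum event size exceeded" responses in this thread, not a
// documented PostHog constant.
const ASSUMED_EVENT_LIMIT_BYTES = 1_000_000;

function base64Length(rawBytes: number): number {
  // base64 encodes every 3 bytes as 4 ASCII characters, with padding.
  return Math.ceil(rawBytes / 3) * 4;
}

const imageBytes = 900_000; // a ~900KB JPEG
const encoded = base64Length(imageBytes);
console.log(encoded); // 1200000
console.log(encoded > ASSUMED_EVENT_LIMIT_BYTES); // true
```

So a ~900KB image alone already produces a ~1.2MB event body, consistent with the `reqByteLength=1252075` in the error above.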
Related sub-libraries
- [ ] All of them
- [ ] posthog-web
- [ ] posthog-node
- [x] posthog-ai
- [ ] posthog-react-native
- [ ] posthog-nextjs-config
cc @k11kirky
From the looks of it, PostHog's event ingestion is giving you this error. While the Gemini API might allow you to send an inline image up to 20MB, PostHog events can't be that big (or even close to it).
The Gemini API also allows you to upload the file to the Files API, and then use a reference to that file. Would that work for you?
Here's a reference to that: https://ai.google.dev/gemini-api/docs/image-understanding#upload-image
@Radu-Raicea I know that it is possible to use the Files API.
Wouldn't a better solution be to strip out the image data (the base64 string) automatically in the PostHog method, so that a user/developer doesn't have to change their code logic? That way, developers could continue to simply replace their OpenAI import without touching the rest of their code, and PostHog tracking would work out of the box as intended.
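For illustration, stripping could look something like the sketch below, operating on chat-completions-style messages. The `redactInlineImages` helper and the `[base64 image redacted]` placeholder are hypothetical names made up here, not part of @posthog/ai:

```typescript
// Hypothetical sketch of stripping inline base64 image data from
// chat-completions-style messages before capturing a PostHog event.
// `redactInlineImages` and the placeholder string are illustrative,
// not @posthog/ai APIs.
type ContentPart =
  | { type: "text"; text: string }
  | { type: "image_url"; image_url: { url: string } };

type Message = { role: string; content: string | ContentPart[] };

function redactInlineImages(messages: Message[]): Message[] {
  return messages.map((m) => {
    // Plain string content cannot contain an image part; leave it as-is.
    if (!Array.isArray(m.content)) return m;
    return {
      ...m,
      content: m.content.map((part) =>
        // Only data URLs are redacted; regular https image URLs stay intact.
        part.type === "image_url" && part.image_url.url.startsWith("data:")
          ? { ...part, image_url: { url: "[base64 image redacted]" } }
          : part
      ),
    };
  });
}
```

Text parts and non-data image URLs pass through untouched, so only the oversized inline payload is dropped from the captured event.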
@RaminGe I've shipped the changes that strip the inline base64 encoded image data: https://github.com/PostHog/posthog-js/pull/2217
This change is included in versions >6.1.1 of the library.
For those using the Python SDK, the same changes will be made as part of this issue: https://github.com/PostHog/posthog-python/issues/305
Has this shipped? I seem to be getting the same error with:
"posthog-js": "^1.279.0",
"posthog-node": "^5.10.2",
Error:
Error while flushing PostHog: message=HTTP error while fetching PostHog: status=413, reqByteLength=868722, response body=maximum event size exceeded: Event rejected by kafka during send PostHogFetchHttpError: HTTP error while fetching PostHog: status=413, reqByteLength=868722
at retriable (file:///node_modules/@posthog/core/dist/posthog-core-stateless.mjs:593:77)
at processTicksAndRejections (node:internal/process/task_queues:105:5)
at retriable (file:///node_modules/@posthog/core/dist/utils/index.mjs:22:25)
at PostHog.fetchWithRetry (file:///node_modules/@posthog/core/dist/posthog-core-stateless.mjs:582:16)
at PostHog._flush (file:///node_modules/@posthog/core/dist/posthog-core-stateless.mjs:549:17) {
response: Response {
status: 413,
statusText: 'Payload Too Large',
headers: Headers {
date: 'Wed, 22 Oct 2025 11:03:35 GMT',
'content-type': 'text/plain; charset=utf-8',
'transfer-encoding': 'chunked',
connection: 'keep-alive',
vary: 'origin, access-control-request-method, access-control-request-headers, Accept-Encoding',
'access-control-allow-credentials': 'true',
'x-envoy-upstream-service-time': '17',
'content-encoding': 'gzip',
server: 'envoy',
'strict-transport-security': 'max-age=31536000; includeSubDomains'
},
body: ReadableStream { locked: true, state: 'closed', supportsBYOB: true },
bodyUsed: true,
ok: false,
redirected: false,
type: 'basic',
url: 'https://us.i.posthog.com/batch/'
},
reqByteLength: 868722
}
Edit
Or is it only for versions >6.1.1?
But:
% npm outdated
Package Current Wanted Latest Location Depended by
posthog-js 1.279.0 1.279.3 1.279.3 node_modules/posthog-js ...
posthog-node 5.10.2 5.10.3 5.10.3 node_modules/posthog-node ...
This version doesn't seem to exist according to npm outdated...
@maximedupre Hey! That version refers to this package: https://www.npmjs.com/package/@posthog/ai
If you're using the manual capture method, you should make sure to remove the base64 messages before sending the events.
We're also working on a new ingestion pipeline for LLM events, which will allow you to send much larger events.
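For manual capture, a redaction sketch along these lines could be applied to Responses-API-style input before the event is sent. Note that `input_image` parts carry the data URL directly as a string here, unlike the nested `image_url.url` object in chat completions; `redactResponsesInput` and the placeholder are hypothetical names, not @posthog/ai APIs:

```typescript
// Hypothetical sketch for the Responses API input shape, where
// `input_image` parts hold the data URL directly in `image_url`.
// Names here are illustrative, not @posthog/ai APIs.
type InputPart =
  | { type: "input_text"; text: string }
  | { type: "input_image"; image_url: string; detail?: string };

type InputItem = { role: string; content: InputPart[] };

function redactResponsesInput(input: InputItem[]): InputItem[] {
  return input.map((item) => ({
    ...item,
    content: item.content.map((part) =>
      // Only inline data URLs are replaced; text parts and remote URLs stay.
      part.type === "input_image" && part.image_url.startsWith("data:")
        ? { ...part, image_url: "[base64 image redacted]" }
        : part
    ),
  }));
}
```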
@Radu-Raicea Thanks for the response!
Sorry, I forgot to mention that I'm using "@posthog/ai": "^6.4.4", and the error is indeed coming from LLM calls...
@maximedupre Do you have any non-text content in your LLM input/output? Or, do you have a really big input (think 300k tokens)?
@Radu-Raicea Yes, I am sending base64 images to openai. But per your previous comment, that should be stripped out, right?
Sending images like this:
```ts
input: [
  {
    role: 'user',
    content: [
      {
        type: 'input_text',
        text: `...`
      },
      ...chunks.map((chunk) => ({
        type: 'input_image' as const,
        image_url: `data:image/jpeg;base64,${chunk.toString('base64')}`,
        detail: 'high' as const
      }))
    ]
  }
],
```
@Radu-Raicea Should we reopen this issue?
Running into this with OpenAI as well; it seems like it's only fixed for Gemini?
Edit: it seems to specifically be missing for the Responses API in OpenAI, if I'm reading the source code correctly. I've attempted a PR for this, linked to the ticket.
I've reopened this issue and we'll take a look at it, thanks for flagging this!