@posthog/ai throws "Class extends value undefined is not a constructor or null" when using OpenAI (Anthropic import issue)
Bug description
Using @posthog/ai with OpenAI results in:
TypeError: Class extends value undefined is not a constructor or null
at Object.<anonymous> (node_modules/@posthog/ai/src/anthropic/index.ts:46:56)
This happens even though I’m not using Anthropic. The error points to PostHogAnthropic extends AnthropicOriginal, but AnthropicOriginal is undefined. It looks like @posthog/ai imports a default from @anthropic-ai/sdk that doesn’t exist or isn’t being resolved properly.
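For illustration, a minimal sketch of the failing pattern; the names mirror the error message, and the actual code in @posthog/ai may differ:

```ts
// Sketch only: if the default export of @anthropic-ai/sdk resolves to
// undefined under the consumer's module/interop settings, the subclass
// declaration below throws at import time with
// "TypeError: Class extends value undefined is not a constructor or null".
import AnthropicOriginal from "@anthropic-ai/sdk";

class PostHogAnthropic extends AnthropicOriginal {}
```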
How to reproduce
- Install dependencies: `npm install openai @posthog/ai posthog-node`
- Add the following code:

import { OpenAI } from '@posthog/ai'
import { PostHog } from 'posthog-node'

const phClient = new PostHog(
  "POSTHOG_KEY",
  { host: "https://us.i.posthog.com" }
);

const openai = new OpenAI({
  apiKey: "POSTHOG_KEY",
  posthog: phClient,
});

- Run the file with `node` → crash with `TypeError: Class extends value undefined is not a constructor or null`.
Related sub-libraries
- [ ] All of them
- [ ] posthog-js (web)
- [ ] posthog-js-lite (web lite)
- [ ] posthog-node
- [ ] posthog-react-native
- [ ] @posthog/react
- [x] @posthog/ai
- [ ] @posthog/nextjs-config
Additional context
- I am not using Anthropic, but the package fails during import because `PostHogAnthropic` extends an undefined `AnthropicOriginal`.
- Installing `@anthropic-ai/sdk` manually did not fix the issue.
- Seems like the import in `@posthog/ai/src/anthropic/index.ts` is broken or should be optional when Anthropic isn't used.
Thank you for your bug report – we love squashing them!
I am also getting this same error when using the Google Vertex AI provider with this library.
From my analysis, this issue is caused by the way the library imports the Anthropic SDK; that part of the code should not be triggered, but for some reason it is.
The odd thing for me is that it works fine when running as a normal Next.js process; I only get this issue when running automated tests with Jest, with this config:
import nextJest from 'next/jest.js';
/** @type {import('jest').Config} */
const createJestConfig = nextJest({
// Provide the path to your Next.js app to load next.config.js and .env files in your test environment
dir: './',
});
// Add any custom config to be passed to Jest
const config = {
testEnvironment: 'node', // Use node environment for server-side tests
transform: {
'^.+\\.(js|jsx|ts|tsx)$': ['babel-jest', { presets: ['next/babel'] }],
},
moduleNameMapper: {
// Handle module aliases (if you use them in your project)
'^@/(.*)$': '<rootDir>/src/$1',
},
testPathIgnorePatterns: ['<rootDir>/node_modules/', '<rootDir>/.next/'],
detectOpenHandles: true,
};
// createJestConfig is exported this way to ensure that next/jest can load the Next.js config which is async
export default createJestConfig(config);
I'm using this with NestJS (a backend framework); it throws this error at startup.
@posthog-bot Any updates?
@Radu-Raicea @carlos-marchal-ph your input would be greatly appreciated on this matter.
I'm using this for the Vercel gateway; the error is thrown in the Anthropic client. Changing the import from `@posthog/ai` to `@posthog/ai/vercel` fixed it for me:
import { withTracing } from "@posthog/ai/vercel";
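For context, calls through that entry point then look roughly like this, following PostHog's Vercel AI SDK integration docs; the `@ai-sdk/openai` model and the `posthogDistinctId` option are illustrative and may differ by version:

```ts
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";
import { withTracing } from "@posthog/ai/vercel";
import { PostHog } from "posthog-node";

const phClient = new PostHog("POSTHOG_KEY", { host: "https://us.i.posthog.com" });

// Wrap the Vercel AI SDK model so generations are captured in PostHog.
const model = withTracing(openai("gpt-4o-mini"), phClient, {
  posthogDistinctId: "user_123", // optional; assumed option name
});

const { text } = await generateText({ model, prompt: "Say hello" });
console.log(text);

// Flush queued events before the process exits.
await phClient.shutdown();
```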
Good advice, I got around this by changing this:
import { GoogleGenAI } from '@posthog/ai'
import { PostHog } from 'posthog-node'
to this
import { PostHogGoogleGenAI as GoogleGenAI } from '@posthog/ai/gemini'
import { PostHog } from 'posthog-node'
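In case it helps others, a sketch of how that wrapper is then used, assuming it mirrors the `@google/genai` client surface; the model name and `models.generateContent` call are illustrative:

```ts
import { PostHogGoogleGenAI as GoogleGenAI } from "@posthog/ai/gemini";
import { PostHog } from "posthog-node";

const phClient = new PostHog("POSTHOG_KEY", { host: "https://us.i.posthog.com" });

// The wrapper takes the PostHog client alongside the usual Gemini options.
const gemini = new GoogleGenAI({
  apiKey: "GEMINI_API_KEY",
  posthog: phClient,
});

// Assumes the wrapper forwards to @google/genai's models.generateContent.
const response = await gemini.models.generateContent({
  model: "gemini-2.0-flash",
  contents: "Say hello",
});
console.log(response.text);

await phClient.shutdown();
```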
@PostHog/team-llm-analytics
I got this working by importing it from `@posthog/ai/openai`. Also use `node16` or later for `moduleResolution` in TypeScript; I used `"module": "nodenext"` in `tsconfig.json`.
import OpenAI from "@posthog/ai/openai";
import { PostHog } from 'posthog-node';
const phClient = new PostHog(
"POSTHOG_KEY",
{ host: "https://us.i.posthog.com" }
);
const openai = new OpenAI({
apiKey: "OPEN_AI_KEY",
posthog: phClient,
});
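Continuing that snippet, a call then goes through the usual OpenAI surface; the extra `posthogDistinctId` / `posthogProperties` fields below are taken from PostHog's LLM analytics docs and should be treated as assumptions for your installed version:

```ts
// Continues the snippet above (reuses `openai` and `phClient`).
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Say hello" }],
  posthogDistinctId: "user_123",          // assumed analytics field
  posthogProperties: { source: "repro" }, // assumed analytics field
});
console.log(completion.choices[0].message.content);

// Flush queued events before the process exits.
await phClient.shutdown();
```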
Still broken for me...
Use named imports like `import OpenAI from "@posthog/ai/openai"` for whichever provider you're working with; it's a temporary workaround until the PostHog team fixes it.
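For reference, the provider-specific entry points mentioned in this thread; the Anthropic subpath is an assumption based on the same naming pattern and should be verified against your installed version:

```ts
// Subpath imports reported to work in this thread.
import OpenAI from "@posthog/ai/openai";
import { PostHogGoogleGenAI } from "@posthog/ai/gemini";
import { withTracing } from "@posthog/ai/vercel";
// Assumed to exist by analogy; not confirmed in this thread:
// import Anthropic from "@posthog/ai/anthropic";
```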
> Use named imports like `import OpenAI from "@posthog/ai/openai"` for whichever provider you're working with; it's a temporary workaround until the PostHog team fixes it.
Thanks, I had some issues with that solution. I ended up creating a patch with @dsinghvi's code in the PR above and it is working for me.
Still broken for me too on Anthropic here
so we fixing this or nah?
> Good advice, I got around this by changing this:
> import { GoogleGenAI } from '@posthog/ai'
> import { PostHog } from 'posthog-node'
> to this
> import { PostHogGoogleGenAI as GoogleGenAI } from '@posthog/ai/gemini'
> import { PostHog } from 'posthog-node'
I've tried this, but it seems PostHog is not logging Gemini's thoughtsToken, only the output and input. I can't tell if it's a different bug or related to this workaround.
I'm having this issue when using the actual Anthropic wrapper.
/app/node_modules/.pnpm/@[email protected]_@[email protected]_@[email protected]_@opentel_3d63026c5d7eeb563f4d68eb3f58318e/node_modules/@posthog/ai/dist/index.cjs:2469
class WrappedMessages extends AnthropicOriginal.Messages {
^
TypeError: Class extends value undefined is not a constructor or null
at Object.<anonymous> (/app/node_modules/.pnpm/@[email protected]_@[email protected]_@[email protected]_@opentel_3d63026c5d7eeb563f4d68eb3f58318e/node_modules/@posthog/ai/dist/index.cjs:2469:49)
at Module._compile (node:internal/modules/cjs/loader:1761:14)
at Object..js (node:internal/modules/cjs/loader:1893:10)
at Module.load (node:internal/modules/cjs/loader:1481:32)
at Module._load (node:internal/modules/cjs/loader:1300:12)
at TracingChannel.traceSync (node:diagnostics_channel:328:14)
at wrapModuleLoad (node:internal/modules/cjs/loader:245:24)
at Module.require (node:internal/modules/cjs/loader:1504:12)
at require (node:internal/modules/helpers:152:16)
at Object.<anonymous> (/app/dist/agents/normalizationAgent/normalizationAgent.js:4:14)
Node.js v24.12.0
Versions:
- @anthropic-ai/sdk: ^0.71.2
- @posthog/ai: ^7.3.0