
Error when running npm start, even after creating the .env file and adding the API key there

Open narinderkmaurya opened this issue 2 years ago • 2 comments

This is the error:

Error: Request failed with status code 429
    at createError (C:\Users\Administrator\Desktop\projects\chatgpt-chatbot\node_modules\axios\lib\core\createError.js:16:15)
    at settle (C:\Users\Administrator\Desktop\projects\chatgpt-chatbot\node_modules\axios\lib\core\settle.js:17:12)
    at IncomingMessage.handleStreamEnd (C:\Users\Administrator\Desktop\projects\chatgpt-chatbot\node_modules\axios\lib\adapters\http.js:322:11)
    at IncomingMessage.emit (node:events:525:35)
    at endReadableNT (node:internal/streams/readable:1359:12)
    at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
  config: {
    transitional: { silentJSONParsing: true, forcedJSONParsing: true, clarifyTimeoutError: false },
    adapter: [Function: httpAdapter],
    transformRequest: [ [Function: transformRequest] ],
    transformResponse: [ [Function: transformResponse] ],
    timeout: 0,
    xsrfCookieName: 'XSRF-TOKEN',
    xsrfHeaderName: 'X-XSRF-TOKEN',
    maxContentLength: -1,
    maxBodyLength: -1,
    validateStatus: [Function: validateStatus],
    headers: {
      Accept: 'application/json, text/plain, */*',
      'Content-Type': 'application/json',
      'User-Agent': 'OpenAI/NodeJS/3.3.0',
      Authorization: 'Bearer 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',
      'Content-Length': 103
    },
    method: 'post',
    data: '{"model":"gpt-3.5-turbo","messages":[{"role":"user","content":"hello there help me with react today"}]}',
    url: 'https://api.openai.com/v1/chat/completions'
  },
  request: <ref *1> ClientRequest { ... },
  response: {
    status: 429,
    statusText: 'Too Many Requests',
    headers: {
      date: 'Sat, 01 Jul 2023 11:45:29 GMT',
      'content-type': 'application/json; charset=utf-8',
      'content-length': '206',
      connection: 'close',
      vary: 'Origin',
      'x-request-id': '102755ba14bff310271e273989ece66d',
      'strict-transport-security': 'max-age=15724800; includeSubDomains',
      'cf-cache-status': 'DYNAMIC',
      server: 'cloudflare',
      'cf-ray': '7dfe67308850f29e-BOM',
      'alt-svc': 'h3=":443"; ma=86400'
    },
    config: { ... },
    request: <ref *1> ClientRequest { ... },
    data: { error: [Object] }
  },
  isAxiosError: true,
  toJSON: [Function: toJSON]
}

narinderkmaurya avatar Jul 01 '23 11:07 narinderkmaurya

A 429 response means "Too Many Requests". In this case it usually means the account behind the API key you provided has no remaining quota or credit, so the API rejects the request. You will need to add credit to your OpenAI account.
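To confirm which case it is, the 429 body carries an error.type field; as far as I can tell, an exhausted account reports insufficient_quota, while genuine per-minute throttling reports a rate-limit type. A small hypothetical helper (the name and wording are mine, not from this repo):

    // Hypothetical helper: tell an exhausted quota apart from plain request throttling.
    function explain429(err) {
      const apiError = err.response && err.response.data && err.response.data.error;
      if (!apiError) return 'HTTP 429 with no error body';
      if (apiError.type === 'insufficient_quota') {
        return 'The account behind this API key has no credit left - add billing or credit.';
      }
      return `Rate limited by the API: ${apiError.message}`;
    }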

bradtraversy avatar Jul 01 '23 12:07 bradtraversy

A 429 response means "You've reached your usage limit. See your usage dashboard and billing settings for more details."
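If the 429 really is throttling rather than an empty quota, waiting and retrying is usually enough. A minimal backoff sketch, reusing the chat() function from the earlier snippet (so it inherits those assumptions):

    // Minimal sketch: retry a call with exponential backoff, but only on HTTP 429.
    async function withBackoff(fn, retries = 3) {
      for (let attempt = 0; ; attempt++) {
        try {
          return await fn();
        } catch (err) {
          const status = err.response && err.response.status;
          if (status !== 429 || attempt >= retries) throw err;
          const delayMs = 1000 * 2 ** attempt; // wait 1s, 2s, 4s, ...
          await new Promise((resolve) => setTimeout(resolve, delayMs));
        }
      }
    }

    // usage: const reply = await withBackoff(() => chat('hello there help me with react today'));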

Bigthinz avatar Jul 28 '23 00:07 Bigthinz