node-bunyan
QUESTION: Does bunyan have MDC support like Java's logback and log4j?
We do not want any logic that depends on the protocol (http, gRPC), and we keep our client API and server API 100% the same. In fact, we want to share the API between both, generating the client and generating the mapping of http request to controller call like we have done in Java in the past. HOWEVER, we also want a filter that stamps a unique request id AND takes the username from a cookie and puts it in MDC.put("userId", username) as well. Then, in plain old TypeScript with no access to the http request/response (since it could be thrift, http, or gRPC re-using the same plain old code), we want this log
log.info("starting)
to print out that requestId and userId.
Looking at the example here https://cloud.google.com/logging/docs/setup/nodejs, it looks like one has to have access to the request object, which leaks http into the plain old TypeScript code.
Our code, though, is a plain TypeScript request DTO and response DTO, plain old TypeScript that knows nothing about the protocol, as seen here -> https://github.com/deanhiller/ts-prototype/blob/main/server/src/controllers/baseController.ts
This BaseController implements BaseApi, and the React code re-uses the same BaseApi, so if BaseApi changes, both server and client change in 100% the same fashion.
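In other words, the goal is for protocol-agnostic code like the following to emit requestId and userId without ever touching a request object (a sketch of the desired end state, not the project's actual controller):

```typescript
import Logger from "bunyan";

export class UserController {
  // No http/gRPC/thrift types anywhere in this class.
  constructor(private readonly log: Logger) {}

  async fetchMe(): Promise<void> {
    // Desired JSON output: {"msg":"starting","requestId":"...","userId":"...", ...}
    this.log.info("starting");
  }
}
```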
Is there another way???
ANOTHER IDEA: What if we had a Map of 'requestId' -> MDCContextMap? The downside is I have to pass the requestId throughout; the upside is the filter at the top:
Filter ->
// MDCContextMap is just: class MDCContextMap extends Map<string, string> {}
// globalMap should live at module level so downstream code can read it
const globalMap = new Map<string, MDCContextMap>();
const requestId = generateNewId();
globalMap.set(requestId, new MDCContextMap());
//user defined MDC fields
globalMap.get(requestId)!.set("userName", getCurrentLoggedInUserName());
globalMap.get(requestId)!.set("subRequest", getSomeOther());
//any other context can be added later in the code as well, usually from a batch for-loop later.
try {
return await invokeController(request, requestId);
} finally {
globalMap.delete(requestId);
}
Kind of hacky, though, as every log call has to be log.info(requestId, "Log something"),
but now the logger's JSON output has access to all fields in the MDCContextMap that a user adds.
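To make that concrete, every call site would end up looking roughly like this (a sketch only, assuming the globalMap and MDCContextMap from the filter above):

```typescript
import Logger from "bunyan";

// The same module-level map the filter populates.
declare const globalMap: Map<string, Map<string, string>>;

function doWork(requestId: string, log: Logger): void {
  const mdc = globalMap.get(requestId);
  // Every call must pass requestId and spread the per-request context by hand.
  log.info({ requestId, ...(mdc ? Object.fromEntries(mdc) : {}) }, "Log something");
}
```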
Is there somewhere in core TypeScript/JavaScript features where we can request that they port the slick solution from Twitter's Promise.scala / Context.scala? It combined ThreadLocal with passing the Map context over the async boundary so it could be re-instated on the callback. It was very slick, so in Java I never pass around requestId or loggers; I just create loggers and log.
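Node actually ships something close to that pattern already: AsyncLocalStorage from node:async_hooks carries a store across await/callback boundaries, much like the ThreadLocal-plus-context-passing approach above. A minimal sketch (the names are illustrative, not from the project):

```typescript
import { AsyncLocalStorage } from "node:async_hooks";

// Plays the role of the MDC / ThreadLocal.
const mdc = new AsyncLocalStorage<Map<string, string>>();

async function handleRequest(): Promise<void> {
  const context = new Map<string, string>([["requestId", "abc-123"]]);
  // Everything awaited inside run() sees the same store.
  await mdc.run(context, () => somePlainBusinessLogic());
}

async function somePlainBusinessLogic(): Promise<void> {
  // No request object in sight; the context follows the async call chain.
  console.log(`requestId=${mdc.getStore()?.get("requestId")}`);
}
```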
I found some references to AsyncLocalStorage, and it works when I use it directly, but when set up for bunyan it is not logging my values. I have the following code:
const logger: Logger = bunyan.createLogger({
  // The JSON payload of the log as it appears in Cloud Logging
  // will contain "name": "my-service"
  name: 'my-service',
  streams: streams,
  serializers: {
    req: () => {
      const store = asyncLocalStorage.getStore();
      const value = store ? store.get('requestId') : undefined;
      console.log("we are here. store=" + store + " email=" + value);
      return {
        method: store ? store.get('method') : undefined,
        url: store ? store.get('url') : undefined,
        remoteAddr: store ? store.get('remoteAddr') : undefined,
        // Include the requestId in each log entry
        requestId: store ? store.get('requestId') : undefined,
        userId: store ? store.get('userId') : undefined
      };
    }
  }
});
I have this code, and I am expecting bunyan to log userId since console.log logs it when I manually get it from the async storage ->
```typescript
async fetchMe(request: FetchUserRequest): Promise<FetchUserResponse> {
  const store = asyncLocalStorage.getStore();
  const userId = store.get('userId');
  console.log(`userId=${userId}`); //logs it correctly
  this._logger.info("Should magically log it userId"); //does not log userId
  this._logger.info({}, "or this should magically log it??"); //does not log userId
```
My final solution leveraging asyncLocalStorage is below, though really a logger should support this the way loggers in many other languages do.
A clientId is logged IF on a request thread. An email is logged IF on a request thread and the user is logged in.
(The context is dependent on each app, but most have clientId, requestId, and email as a start.) Usually there is a context map and all of the context is transferred out to the JSON, but with bunyan you must do this manually.
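For context, the asyncLocalStorage used in the solution below is populated by an HTTP filter before any controller runs (in the project it is exported from inversify.config). A sketch of what that middleware might look like, assuming Express; the email lookup is a placeholder, not the project's actual code:

```typescript
import { AsyncLocalStorage } from "node:async_hooks";
import { randomUUID } from "node:crypto";
import { NextFunction, Request, Response } from "express";

export const asyncLocalStorage = new AsyncLocalStorage<Map<string, string>>();

export function mdcMiddleware(req: Request, res: Response, next: NextFunction): void {
  const store = new Map<string, string>();
  store.set("requestId", randomUUID());
  store.set("method", req.method);
  store.set("url", req.url);
  store.set("remoteAddr", req.ip ?? "");
  // email/clientId would come from the auth cookie or session; placeholder only.
  // store.set("email", lookupEmailFromCookie(req));
  asyncLocalStorage.run(store, () => next());
}
```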
```typescript
import Logger, {Stream} from "bunyan";
import * as os from "node:os";
import {credential} from "firebase-admin";
import * as admin from "firebase-admin";
import {LoggingBunyan} from "@google-cloud/logging-bunyan";
import chalk from 'chalk';
import {asyncLocalStorage} from "./inversify.config";
function throwExc(msg: string): string { throw new Error(msg); }
export enum Environment { EMPTY, PRODUCTION, DEV_CLOUD, DEV_LOCAL }
export abstract class AppConfig {
abstract projectId(): string;
abstract environment(): Environment;
abstract sendGridApiKey(): string;
abstract oauthClientId(): string;
abstract oauthClientSecret(): string;
imageBucketName(): string {
return this.projectId()+"-images";
}
firestoreUrl(): string {
return `https://${this.projectId()}.firebaseio.com`
}
firebaseDefaultBucket(): string {
const name = `${this.projectId()}.firebasestorage.app`;
return name;
}
abstract databaseUrl(logger: Logger): string;
fetchStreamWrapper(): Stream {
const originalStream = this.fetchStreams();
return {
level: 'info',
stream: {
write: (log: Object) => {
const logString = log as string; // Assert log as a string
const logObj = JSON.parse(logString);
const store = asyncLocalStorage.getStore();
if(store) {
const requestId = store ? store.get('requestId') : undefined;
const method = store.get('method');
const url = store.get('url');
const email = store.get('email');
const remoteAddr = store.get('remoteAddr');
const clientId = store.get('clientId');
// Add additional fields for logging
logObj.extraInfo = {
hostname: os.hostname(),
timestamp: new Date().toISOString(),
...(requestId ? {'requestId': requestId} : {}),
...(method ? {'method': method} : {}),
...(url ? {'url': url} : {}),
...(remoteAddr ? {'remoteAddr': remoteAddr} : {}),
...(email ? {'email': email} : {}),
...(clientId ? {'clientId': clientId} : {})
};
}
// Log the modified object
const modifiedLogString = JSON.stringify(logObj);
// Proxy the log to the original stream
if (originalStream.stream && typeof originalStream.stream.write === 'function') {
originalStream.stream.write(modifiedLogString);
}
},
},
};
}
abstract fetchStreams(): Stream;
abstract attachmentsStorePath(chatId: string): string;
abstract chatsCollectionPath(): string;
abstract gptChatsCollectionPath(): string;
abstract statusCollectionPath(): string;
abstract gptAssistantPrefix(): string;
abstract firestoreCredential(): credential.Credential | undefined;
abstract firestoreInitialization(): void;
abstract baseUrl(rawServer?: boolean): string;
abstract emailDomain(): string;
abstract trainerSignupListId(): string;
abstract trainerOnboardListId(): string;
abstract allowedOrigins(): string[];
}
export class ProdConfig extends AppConfig {
baseUrl(rawServer?: boolean): string {
//Trainers of Advancement, Mastery, and Innovation
return "https://tami.biltup.com";
}
gptAssistantPrefix(): string {
return "Production ";
}
projectId(): string {
return "biltup-community";
}
oauthClientId(): string {
return "309934505077-kkp3v1t9479jnmgqdeh2rerk9nab94mo.apps.googleusercontent.com";
}
oauthClientSecret(): string {
return "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";
}
attachmentsStorePath(chatId: string): string {
return `mail/${chatId}`;
}
chatsCollectionPath(): string {
return "chats";
}
gptChatsCollectionPath(): string {
return "gpt-chats";
}
statusCollectionPath(): string {
return "session-status";
}
environment(): Environment {
return Environment.PRODUCTION;
}
sendGridApiKey(): string {
return process.env.SENDGRID_API_KEY ?? throwExc("Missing definition of SENDGRID_API_KEY");
}
databaseUrl(logger:Logger): string {
logger.info("Running in production mode");
return process.env.DATABASE_URL ?? throwExc("Missing definition of DATABASE_URL");
}
fetchStreams(): Stream {
console.log("Running logging in production");
const loggingBunyan = new LoggingBunyan();
return loggingBunyan.stream('info');
}
firestoreCredential(): credential.Credential | undefined {
return admin.credential.applicationDefault();
}
firestoreInitialization(): void {
//do nothing, this is for development only for emulator to work and hide dev code
}
emailDomain(): string {
//Trainers of Advancement, Mastery, and Innovation
return "tami.biltup.com";
}
trainerSignupListId(): string {
return "a975c784-cc84-4f59-918d-259906664643";
}
trainerOnboardListId(): string {
return "7495fb38-3a0d-4a96-986d-a50d382983de";
}
allowedOrigins(): string[] {
const allowedOrigins = ['https://tami.biltup.com', 'app.biltup.com'];
return allowedOrigins;
}
}
export class LocalConfig extends AppConfig {
baseUrl(rawServer?: boolean): string {
if(rawServer) {
return "http://localhost:8080";
} else {
return "http://localhost:4200";
}
}
projectId(): string {
return "biltup-dev-local";
}
oauthClientId(): string {
return "964035523793-mime2tmcluuqe7a93in5a3hhf4ptv512.apps.googleusercontent.com";
}
oauthClientSecret(): string {
return "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";
}
sendGridApiKey(): string {
return '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';
}
gptAssistantPrefix(): string {
const username = os.userInfo().username;
return `${username}'s `
}
attachmentsStorePath(chatId: string): string {
const username = os.userInfo().username;
return `mail/${username}/${chatId}`;
}
chatsCollectionPath(): string {
//each developer has their own space...
const username = os.userInfo().username;
return `developers/${username}/chats`;
}
gptChatsCollectionPath(): string {
//each developer has their own space...
const username = os.userInfo().username;
return `developers/${username}/gpt-chats`;
}
statusCollectionPath(): string {
//each developer has their own space...
const username = os.userInfo().username;
return `developers/${username}/session-status`;
}
environment(): Environment {
return Environment.DEV_LOCAL;
}
databaseUrl(logger:Logger): string {
const databaseUrl = "postgresql://app_user:password@localhost:5432/biltup";
logger.info("Running dev mode. url=" + databaseUrl);
return databaseUrl;
}
fetchStreams(): Stream {
console.log("Running logging in development");
const stream: Stream =
{
level: 'info',
stream: {
write: (log: Object) => {
const str = log as string; // Assert it as a string
const logObj = JSON.parse(str);
const levelNames = ['TRACE', 'DEBUG', 'INFO', 'WARN', 'ERROR', 'FATAL'] as const;
// Extract date and time
const timestamp = new Date(logObj.time);
const day = timestamp.getDate();
const time = `${timestamp.toTimeString().split(' ')[0]}.${timestamp.getMilliseconds()}`; // HH:MM:SS.mmm format
let clientId = logObj.extraInfo?.clientId || '';
let email = logObj.extraInfo?.email || '';
if(email.startsWith("biltup")) {
email = email.substring("biltup".length);
}
if(email.endsWith("biltup.com")) {
email = email.substring(0, email.length - "biltup.com".length);
}
if(clientId) {
clientId = clientId.substring(0, 6);
}
type LogLevel = (typeof levelNames)[number]; // 'TRACE' | 'DEBUG' | 'INFO' | 'WARN' | 'ERROR' | 'FATAL'
// Infer level from logObj.level
const level: LogLevel = levelNames[logObj.level / 10 - 1] || 'INFO';
// Map log levels to colors
const levelColor: Record<LogLevel, chalk.Chalk> = {
TRACE: chalk.gray,
DEBUG: chalk.green,
INFO: chalk.cyanBright,
WARN: chalk.yellow,
ERROR: chalk.red,
FATAL: chalk.bgRed.white,
};
const paddedLevel = level.padEnd(5); // Align levels like "INFO ", "ERROR", "DEBUG"
// Format the log message with colors
let logMsg = `${levelColor[level](`[${paddedLevel}]`)} ${chalk.gray(day)}T${chalk.gray(time)} [${clientId},${email}]: ${logObj.msg}`;
// If the log contains an error with a stack trace, format it as multiline
// If the log contains an error, include its details
if (logObj.err) {
const err = logObj.err;
logMsg += `\n${chalk.red('Error Details:')}`;
logMsg += `\n ${chalk.white('Message:')} ${chalk.yellow(err.message)}`;
logMsg += `\n ${chalk.white('Name:')} ${chalk.yellow(err.name)}`;
if (err.stack) {
logMsg += `\n ${chalk.red('Stack Trace:')}\n${chalk.redBright(err.stack)}`;
}
}
console.log(logMsg); // Output as plain text
},
}
};
return stream;
}
firestoreCredential(): credential.Credential | undefined {
return undefined;
}
firestoreInitialization(): void {
process.env['FIRESTORE_EMULATOR_HOST'] = 'localhost:8088';
process.env['FIREBASE_AUTH_EMULATOR_HOST'] = 'localhost:9099';
}
emailDomain(): string {
return "devapp.biltup.com";
}
trainerSignupListId(): string {
return "ffce68a7-e9e5-41d6-add3-4a1b2142379e";
}
trainerOnboardListId(): string {
return "dbcc1bb6-68d7-440f-9967-584da485cb6c";
}
allowedOrigins(): string[] {
const allowedOrigins = ['http://localhost:8080', 'http://localhost:4200'];
return allowedOrigins;
}
}
export class CloudDevConfig extends LocalConfig {
projectId(): string {
return "biltup-dev-cloud";
}
environment(): Environment {
return Environment.DEV_CLOUD;
}
firestoreCredential(): credential.Credential | undefined {
const pathToFile = `${process.cwd()}/firestore/biltup-dev-cloud-06a6eaebed28.json`;
const serviceAccount = require(pathToFile);
return admin.credential.cert(serviceAccount);
}
firestoreInitialization(): void {
//do not use emulator so do nothing here
}
}
```