
Logging Research (how logs can be exported/piped to the user)

Open jscyo opened this issue 2 years ago • 2 comments

How other Auth providers handle logging:

Auth0:

There are 3 methods for consuming Auth0 logs:

  • Viewing the log events from the Auth0 Dashboard
  • Retrieving log events using the Management API, which lets you specify search criteria
  • Log streaming. Auth0 offers a number of integrations (available in their marketplace, e.g. Papertrail, AWS CloudWatch) that let users stream their logs to external services

FusionAuth

  • They provide a number of APIs to retrieve log information from their cloud service
  • There don't seem to be any options for streaming logs the way Auth0 does.

Tools we can use:

Configure Docker's logging driver:

  • You can configure Docker's logging driver to write logs to a file or even forward them to a service
  • Maybe write logs to a file which can then be retrieved via an API call.
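
As a sketch, the json-file driver (Docker's default) can be tuned in /etc/docker/daemon.json; the option values below are illustrative, not recommendations:

```json
{
  "log-driver": "json-file",
  "log-opts": {
    "max-size": "10m",
    "max-file": "3"
  }
}
```

With this driver, each container's logs end up as JSON files on the host, which a retrieval API could then read back.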

Logspout

A popular library that allows for log streaming; it can integrate with Papertrail, among other services

Notes:

  • Papertrail
  • AWS CloudWatch
  • NXLog
  • Splunk
  • Docker built-in dual logging and forwarding

jscyo · Nov 08 '22 06:11

Additional Research:

We want to use log streaming.

  • How would that work?

The webhook method is where we ask the user to give us an HTTP URL. The core would call that URL and stream log info to it. This allows the user to choose any service to consume the logs.
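
As a rough sketch of the idea (the JSON shape, field names, and URL below are assumptions for illustration, not the actual core behavior):

```shell
# Hypothetical payload the core could send to a user-supplied webhook URL.
PAYLOAD='{"timestamp":"2022-11-14T08:00:00Z","level":"INFO","message":"API called: /recipe/session"}'

# Delivery would be a plain HTTP POST (the URL is a placeholder):
#   curl -X POST -H "Content-Type: application/json" -d "$PAYLOAD" https://example.com/log-sink
echo "$PAYLOAD"
```

Because the sink is just an HTTP endpoint, users could point it at any service of their choosing.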

  • Can you achieve webhook behavior with logspout?

    • The most common method for using logspout is by redirecting the logs to a remote syslog server
  • What would the output look like? Can you configure the output in logspout

    • You can configure the output type, JSON lines seem to be a popular type.
    • The default output is just the raw output from the SuperTokens core.
  • Also how often should logs be sent?

    • With log streaming, it will be sent continuously.
  • Should there be config options to filter logs?

    • Logspout does allow you to filter logs, but I think most people would normally filter logs in their own logging software.

jscyo · Nov 14 '22 08:11

Using Logspout for logging and log streaming:

Logspout is a log router that runs as a Docker container, listens to other Docker containers, and can stream their logs to popular services like AWS CloudWatch, Elasticsearch, and Papertrail, or to any syslog server.

Using logspout to expose an endpoint that will stream logs

Running the following will create an endpoint, http://localhost:3000/logs, which will stream logs:

docker run -d --name="logspout" \
        --volume=/var/run/docker.sock:/var/run/docker.sock \
        --publish=127.0.0.1:3000:80 \
        gliderlabs/logspout

This can be tested by running the above command and then running curl http://localhost:3000/logs. If you now run the SuperTokens core, you should see its logs appear in the curl output (screenshot: curl_logs).
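
The streaming endpoint can also be scoped to a single container. Assuming the core's container name contains "supertokens" (the name pattern here is hypothetical), something like:

```shell
# Stream logs only from containers whose name matches the pattern
curl http://localhost:3000/logs/name:supertokens
```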

Logspout filtering

Logspout allows you to create filters that restrict logs by container name, label, or source.
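
For example, a route can be restricted by container name via query parameters on the route URI. A sketch (the name pattern and the Papertrail endpoint below are placeholders):

```shell
docker run -d --name="logspout" \
        --volume=/var/run/docker.sock:/var/run/docker.sock \
        gliderlabs/logspout \
        'syslog+tls://logs.papertrailapp.com:55555?filter.name=*supertokens*'
```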

Connecting to other Log collation services:

The main purpose of logspout is to route logs to services like Papertrail or AWS CloudWatch by supplying a syslog TLS endpoint from the service.

For example, connecting with Papertrail:

$ docker run --name="logspout" \
	--volume=/var/run/docker.sock:/var/run/docker.sock \
	gliderlabs/logspout \
	syslog+tls://logs.papertrailapp.com:55555

Customizing log format

As mentioned in their docs, you should be able to customize the log format
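
One sketch, using logspout's RAW_FORMAT environment variable (a Go template) with the raw adapter to emit one JSON object per line; the destination address and the chosen fields are placeholders, not recommendations:

```shell
docker run -d --name="logspout" \
        --volume=/var/run/docker.sock:/var/run/docker.sock \
        -e RAW_FORMAT='{ "container": "{{ .Container.Name }}", "data": "{{ .Data }}" }' \
        gliderlabs/logspout \
        raw://192.0.2.10:5000
```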

Testing

  • [x] What happens when the service which is logging fails? (Test for all cases: if the service restarts will it log again?)
    • Since logspout listens to the output of Docker containers, it does not matter if the container it is logging starts and stops.
    • If logspout restarts, the endpoint will continue to stream logs from that point onwards
  • [ ] Is there a health check API to know if logspout is active?
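
Until that's answered, one workaround (an assumption on my part, not a documented logspout feature) is to ask Docker itself whether the logspout container is still running:

```shell
docker inspect -f '{{ .State.Running }}' logspout
```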

jscyo · Nov 21 '22 08:11