aspnet-core-logging
Description
This repository shows ASP.NET Core 8 logging in action; it also serves as a learning, experimenting and teaching path for .NET, Azure Pipelines and other technologies & tools.
:exclamation: Currently this web API uses JSON web tokens (JWT) for authentication & authorization purposes, but for the time being the mechanism used for generating these tokens has been greatly simplified, to the point of being naive, since my focus is on other topics; I do intend to provide a more realistic implementation in the not too distant future.
This project has several posts associated with it:
- Structured logging in ASP.NET Core using Serilog and Seq
- Use Docker Compose when running integration tests with Azure Pipelines
- Use Docker when running integration tests with Azure Pipelines
- Build an ASP.NET Core application using Azure Pipelines
- Logging HTTP context in ASP.NET Core
Build
Build Server | Operating System | Status |
---|---|---|
Azure Pipelines | Linux | |
Azure Pipelines | macOS | |
Azure Pipelines | Windows | |
Code quality
Provider | Badge |
---|---|
Codacy | |
FOSSA | |
SonarCloud | |
Setup local development environment
In order to run this application locally, you need to set up several things first: run PostgreSQL and pgAdmin via Docker Compose, create a PostgreSQL database using EF Core database migrations, etc.
Setup local persistence services
This ASP.NET Core web API uses PostgreSQL as persistent storage and pgAdmin as database manager, all running locally via Docker Compose.
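These instructions assume Docker Engine together with the Compose v2 plugin is already installed on your machine; a quick check:
# Both commands should print a version if Docker and the Compose plugin are installed
docker --version
docker compose version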
Create Docker volumes
These volumes are needed to store data outside the Docker containers running the PostgreSQL databases and their manager.
- Volume used by the local development database
docker volume create --name=aspnet-core-logging-dev_data
- Volume used by the integration tests when run locally
docker volume create --name=aspnet-core-logging-it_data
- Volume used by pgAdmin tool
docker volume create --name=pgadmin_data
- Volume used by Seq tool
docker volume create --name=seq_data
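As a quick sanity check, you can list the existing Docker volumes and confirm the four volumes above are present:
# Lists all Docker volumes present on this machine; the four volumes created above should appear
docker volume ls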
Create .env file
The .env file is used by Docker Compose to avoid storing sensitive data inside the docker-compose.yml file.
Create a new file named .env inside the folder where you have checked out this git repository and add the following lines:
# Environment variables used by 'aspnet-core-logging-dev' service
# suppress inspection "UnusedProperty"
DB_DEV_POSTGRES_USER=<DB_DEV_USER>
# suppress inspection "UnusedProperty"
DB_DEV_POSTGRES_PASSWORD=<DB_DEV_PASSWORD>
# Environment variables used by 'aspnet-core-logging-it' service
# suppress inspection "UnusedProperty"
DB_IT_POSTGRES_USER=<DB_IT_USER>
# suppress inspection "UnusedProperty"
DB_IT_POSTGRES_PASSWORD=<DB_IT_PASSWORD>
# Environment variables used by 'pgadmin' service
# suppress inspection "UnusedProperty"
PGADMIN_DEFAULT_EMAIL=<PGADMIN_EMAIL_ADDRESS>
# suppress inspection "UnusedProperty"
PGADMIN_DEFAULT_PASSWORD=<PGADMIN_PASSWORD>
Make sure you replace all of the above <DB_DEV_USER>, <DB_DEV_PASSWORD>, ..., <PGADMIN_PASSWORD> tokens with the appropriate values.
Compose commands
All of the commands below must be run from the folder where you have checked out this git repository.
This folder contains a docker-compose.yml file describing the aforementioned compose services.
Run compose services
# The -d flag instructs Docker Compose to run services in the background
docker compose up -d
Stop compose services
docker compose stop
Start compose services
docker compose start
Display compose service logs
# The -f flag instructs Docker Compose to display and follow the log entries of the 'pgadmin' service
docker compose logs -f pgadmin
Destroy compose services
The command below will not delete the Docker volumes!
docker compose down
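The volumes created earlier were created outside Docker Compose, so if you ever want to wipe all locally persisted data, they have to be removed explicitly; this is destructive, so use it with care:
# Remove the compose containers and network first
docker compose down
# Then permanently delete the named volumes and all data stored in them
docker volume rm aspnet-core-logging-dev_data aspnet-core-logging-it_data pgadmin_data seq_data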
Setup pgAdmin
Once the services have been started using the docker compose up command, pgAdmin UI is ready to be used.
Open pgAdmin UI
Open your browser and navigate to http://localhost:8080.
In order to start using pgAdmin, you need to authenticate: use the PGADMIN_DEFAULT_EMAIL and PGADMIN_DEFAULT_PASSWORD properties found in your .env file to log in.
Register your local database server
When asked about a PostgreSQL server to register, populate the fields found inside the Connection tab as below:
- Host name/address = aspnet-core-logging-dev - the compose service name and not the container name (the Docker Compose networking page is a little bit misleading, as it mentions container name; that's why the services found inside the docker-compose.yml file are named differently than their containers)
- Port = 5432 - the Docker internal port
- Username = the value of the ${DB_DEV_POSTGRES_USER} property from the local .env file
- Password = the value of the ${DB_DEV_POSTGRES_PASSWORD} property from the local .env file
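As an optional check outside pgAdmin, you can also open a psql session directly inside the development database container; this is a minimal sketch which assumes the aspnet-core-logging-dev service runs the official postgres image (which bundles the psql client) and that the database name matches the one used by the connection strings in the next section:
# Open an interactive psql session against the local development database;
# replace <DB_DEV_USER> with the value from your .env file
docker compose exec aspnet-core-logging-dev psql --username=<DB_DEV_USER> --dbname=aspnet-core-logging-dev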
Setup environment variables
Since storing sensitive data inside configuration files put under source control is not a very good idea, the following environment variables must be defined on your local development machine:
Name | Value | Description |
---|---|---|
CONNECTIONSTRINGS__TODO | Server=localhost; Port=5432; Database=aspnet-core-logging-dev; Username=satrapu; Password=***; | The connection string pointing to the local development database |
CONNECTIONSTRINGS__TODOFORINTEGRATIONTESTS | Server=localhost; Port=5433; Database=aspnet-core-logging-it; Username=satrapu; Password=***; | The connection string pointing to the integration tests database |
GENERATEJWT__SECRET | <YOUR_JWT_SECRET> | The secret used for generating JSON web tokens for experimenting purposes only |
The connection strings above use the same username and password pairs found in the local .env file.
The port from each connection string represents the host port declared inside the local docker-compose.yml file; see more about ports here.
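As an example, on Linux or macOS these variables can be exported from your shell profile (the values below are placeholders to be replaced with your own); on Windows, use setx or the System Properties dialog instead:
# Example only: replace the placeholder values with the ones from your local .env file
export CONNECTIONSTRINGS__TODO="Server=localhost; Port=5432; Database=aspnet-core-logging-dev; Username=<DB_DEV_USER>; Password=<DB_DEV_PASSWORD>;"
export CONNECTIONSTRINGS__TODOFORINTEGRATIONTESTS="Server=localhost; Port=5433; Database=aspnet-core-logging-it; Username=<DB_IT_USER>; Password=<DB_IT_PASSWORD>;"
export GENERATEJWT__SECRET="<YOUR_JWT_SECRET>"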
Setup local development database
In order to run the application locally, you need to have an online PostgreSQL database whose schema is up-to-date. The database will be started using the aforementioned Docker Compose commands, while its schema will be updated via one of the options below.
Option 1: Manually run database migrations
In order to create and update the local development database, you need to install EF Core CLI tools; the reference documentation can be found here. I also recommend reading about database migrations here. All of the commands below should be executed from the folder where you have checked out this git repository.
- Install dotnet-ef
dotnet tool install dotnet-ef --global
:exclamation: Please restart the terminal after running the above command to ensure the following dotnet ef commands do not fail.
- Update dotnet-ef to latest version, if requested to do so
dotnet tool update dotnet-ef --global
- Add a new database migration
dotnet ef migrations add <MIGRATION_NAME> --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- List existing database migrations
dotnet ef migrations list --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- Update database to the last migration
dotnet ef database update --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
- Drop existing database
dotnet ef database drop --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
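As an example, creating and applying the very first migration could look like this (the migration name is hypothetical; which database gets updated depends on the connection string resolved by the Todo.WebApi startup project):
# Add a migration named 'InitialSchema' (example name only)
dotnet ef migrations add InitialSchema --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence
# Apply all pending migrations to the target database
dotnet ef database update --startup-project ./Sources/Todo.WebApi --project ./Sources/Todo.Persistence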
Option 2: Run database migrations at application startup
Ensure the MigrateDatabase configuration property is set to true.
See more about applying EF Core migrations at runtime here.
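One possible way to do this locally (a sketch which assumes MigrateDatabase is a top-level configuration key read from environment variables, and that no other configuration provider overrides it) is:
# Instruct the application to apply pending EF Core migrations when it starts
export MIGRATEDATABASE=true
dotnet run --project ./Sources/Todo.WebApi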
Inspect log events using Seq
In order to inspect application log events generated via Serilog, navigate to http://localhost:8888, which will open Seq UI.
Inspect traces using Jaeger
In order to inspect application traces, navigate to http://localhost:16686/search, which will open Jaeger UI.
To see Jaeger metrics, navigate to http://localhost:14269/metrics.
To see Jaeger health status, navigate to http://localhost:14269/.