This is a fork of Chris Williams' https://github.com/chriswill/serilog-sinks-azureblobstorage
This fork focuses specifically on file name formatting/date patterns, allowing granularity down to the millisecond, e.g. {yyyy}/{MM}/{dd}/MyApplication{HHmmssfff}.log. This is done to support Azure Event Grid triggering: with Azure Event Grid triggering, a whole log file can be routed to a logging aggregator service.
End fork comments
Writes to a file in Windows Azure Blob Storage.
Azure Blob Storage offers append blobs, which allow you to add content quickly to a single blob without locking it for updates. For this reason, append blobs are ideal for logging applications.
The AzureBlobStorage sink appends data to the blob in text format. Here's a sample line:
[2018-10-17 23:03:56 INF] Hello World!
Package - Serilog.Sinks.AzureBlobStorage | Platforms - .NET 4.5, .NET Standard 1.3
Usage
var storageAccount = CloudStorageAccount.Parse("<your storage connection string>");

var log = new LoggerConfiguration()
    .WriteTo.AzureBlobStorage(storageAccount)
    .CreateLogger();
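With the logger created, events are written as usual; the sample line shown above, for instance, would be produced by:
log.Information("Hello World!");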
In addition to the storage connection, you can also specify the following (a short example appears after the list):
- Message line format (default: [{Timestamp:yyyy-MM-dd HH:mm:ss} {Level:u3}] {Message:lj}{NewLine}{Exception})
- Blob container (default: logs)
- Blob filename (default: log.txt)
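As a sketch, and assuming the positional argument order used in the examples further down (minimum level, then container name, then file name), a non-default container could be targeted like this; connectionString stands for the storage connection string as in those examples, and the container name app-logs is only a placeholder:
.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, "app-logs", "log.txt")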
By default, the log file name is log.txt, but you can add date substitutions to create a rolling file implementation. These are shown more fully in the unit test project, but as an example, you can create a log file name like this: {yyyy}/{MM}/{dd}/log.txt
.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}/log.txt")
On December 15, 2018 (when this was written), log files would appear in a folder structure as shown below:
\2018
    \12
        \15
            log.txt
In the file name, the values must appear in descending order, e.g.: yy MM dd hh mm, although it is not required to include all date elements.
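To illustrate the millisecond-level naming this fork adds, the pattern from the introduction can be supplied the same way (a sketch using the same positional overload as the example above; MyApplication is just a placeholder name):
.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, "{yyyy}/{MM}/{dd}/MyApplication{HHmmssfff}.log")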
By default, whenever there is a new event to post, the Azure Blob Storage sink will send it to Azure storage. For cost-management or performance reasons, you can choose to "batch" the posting of new log events.
Batched posting is handled by the AzureBatchingBlobStorageSink class, which inherits from PeriodicBatchingSink.
An example configuration is:
.WriteTo.AzureBlobStorage(connectionString, Serilog.Events.LogEventLevel.Information, null, null, null, true, TimeSpan.FromSeconds(15), 10)
This configuration would post a new batch of events every 15 seconds, unless there were 10 or more events to post, in which case they would post before the time limit.
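For readability, the same call is repeated below with a comment on each positional argument; the roles are inferred from the description above and from the earlier file name example, so treat them as assumptions rather than the definitive signature:
.WriteTo.AzureBlobStorage(
    connectionString,                           // storage connection
    Serilog.Events.LogEventLevel.Information,   // minimum level
    null,                                       // container name (default: logs)
    null,                                       // blob file name (default: log.txt)
    null,                                       // message line format (assumed; default shown earlier)
    true,                                       // enable batched posting
    TimeSpan.FromSeconds(15),                   // how often a batch is posted
    10)                                         // post early once this many events are waiting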
It is possible to configure the sink using Serilog.Settings.Configuration by specifying the folder name, file name, and connection string in appsettings.json:
"Serilog": {
"WriteTo": [
{"Name": "AzureBlobStorage", "Args": {"connectionString": "", "storageFolderName": "", "storageFileName": ""}}
]
}
JSON configuration must be enabled using ReadFrom.Configuration(); see the documentation of the JSON configuration package for details.
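A minimal sketch of that wiring, assuming the Microsoft.Extensions.Configuration packages are also installed:
var configuration = new ConfigurationBuilder()
    .SetBasePath(Directory.GetCurrentDirectory())
    .AddJsonFile("appsettings.json")
    .Build();

var log = new LoggerConfiguration()
    .ReadFrom.Configuration(configuration)
    .CreateLogger();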
To use this sink with the Serilog.Settings.AppSettings package, first install that package if you haven't already done so:
Install-Package Serilog.Settings.AppSettings
Instead of configuring the logger in code, call ReadFrom.AppSettings():
var log = new LoggerConfiguration()
.ReadFrom.AppSettings()
.CreateLogger();
In your application's App.config or Web.config file, specify the sink assembly and connection string under the <appSettings> node:
<configuration>
  <appSettings>
    <add key="serilog:using:AzureBlobStorage" value="Serilog.Sinks.AzureBlobStorage" />
    <add key="serilog:write-to:AzureBlobStorage.connectionString" value="DefaultEndpointsProtocol=https;AccountName=ACCOUNT_NAME;AccountKey=KEY;EndpointSuffix=core.windows.net" />
    <add key="serilog:write-to:AzureBlobStorage.formatter" value="Serilog.Formatting.Compact.CompactJsonFormatter, Serilog.Formatting.Compact" />
  </appSettings>
</configuration>
Unfortunately, the Azure Storage emulator does not support append blobs, so I've omitted unit tests from this project. I'd love to have unit tests, but I'd like them to be able to run on Azure DevOps hosted build agents. Suggestions?
This is a fork of the Serilog Azure Table storage sink. Thanks and acknowledgements to the original authors of that work.