
Global exception handling and custom logging in AspNet Core with MongoDB

In this post, we will look into custom logging and global exception handling in an AspNet Core application, including proper registration of the logger and the global exception filter.

Custom logging


The first step is to create a data model for the information we want to save into the database.

Error log data model
These are a few properties used for logging, which can be extended or reduced as needed.

 public class ErrorLog  
 {  
      /// <summary>  
      /// Gets or sets the Error log identifier.  
      /// </summary>  
      /// <value>  
      /// The Error log identifier.  
      /// </value>  
      [BsonRepresentation(BsonType.ObjectId)]  
      public ObjectId Id { get; set; }  
      /// <summary>  
      /// Gets or sets the date.  
      /// </summary>  
      /// <value>  
      /// The date.  
      /// </value>  
      public DateTime Date { get; set; }  
      /// <summary>  
      /// Gets or sets the thread.  
      /// </summary>  
      /// <value>  
      /// The thread.  
      /// </value>  
      public string Thread { get; set; }  
      /// <summary>  
      /// Gets or sets the level.  
      /// </summary>  
      /// <value>  
      /// The level.  
      /// </value>  
      public string Level { get; set; }  
      /// <summary>  
      /// Gets or sets the logger.  
      /// </summary>  
      /// <value>  
      /// The logger.  
      /// </value>  
      public string Logger { get; set; }  
      /// <summary>  
      /// Gets or sets the message.  
      /// </summary>  
      /// <value>  
      /// The message.  
      /// </value>  
      public string Message { get; set; }  
      /// <summary>  
      /// Gets or sets the exception.  
      /// </summary>  
      /// <value>  
      /// The exception.  
      /// </value>  
      public string Exception { get; set; }  
      /// <summary>  
      /// Gets or sets the user agent.  
      /// </summary>  
      /// <value>  
      /// The user agent.  
      /// </value>  
      public string UserAgent { get; set; }  
      /// <summary>  
      /// Gets or sets the ip address.  
      /// </summary>  
      /// <value>  
      /// The ip address.  
      /// </value>  
      public string IpAddress { get; set; }  
      /// <summary>  
      /// Gets or sets the URL.  
      /// </summary>  
      /// <value>  
      /// The URL.  
      /// </value>  
      public string Url { get; set; }  
      /// <summary>  
      /// Gets or sets the referrer.  
      /// </summary>  
      /// <value>  
      /// The referrer.  
      /// </value>  
      public string Referrer { get; set; }  
      /// <summary>  
      /// Gets or sets the name of the user.  
      /// </summary>  
      /// <value>  
      /// The name of the user.  
      /// </value>  
      public string UserName { get; set; }  
      /// <summary>  
      /// Gets or sets the name of the server.  
      /// </summary>  
      /// <value>  
      /// The name of the server.  
      /// </value>  
      public string ServerName { get; set; }  
 }  

The logger, which collects log information and inserts it into the database

It has to implement ILogger to provide the logger definition.


 using Microsoft.AspNetCore.Http;  
 using Microsoft.Extensions.DependencyInjection;  
 using Microsoft.Extensions.Logging;  
 using Microsoft.Extensions.Options;  
 using MongoDB.Driver;  
 using System;  
 using System.Collections;  
 using System.Collections.Generic;  
 using System.Linq;  
 using System.Text;  
 /// <summary>  
 /// Mongo DB logger  
 /// </summary>  
 /// <typeparam name="TLog">The type of the log.</typeparam>  
 /// <seealso cref="Microsoft.Extensions.Logging.ILogger" />  
 public class MongoLogger<TLog> : ILogger  
   where TLog : ErrorLog, new()  
 {  
   /// <summary>  
   /// The indentation  
   /// </summary>  
   private const int Indentation = 2;  
   /// <summary>  
   /// The filter for log level  
   /// </summary>  
   private readonly Func<string, LogLevel, bool> Filter;  
   /// <summary>  
   /// The logger requester name  
   /// </summary>  
   private readonly string Name;  
   /// <summary>  
   /// The services  
   /// </summary>  
   private readonly IServiceProvider Services;  
   /// <summary>  
   /// The _mongo database  
   /// </summary>  
   private IMongoDatabase _mongoDb;  
   /// <summary>  
   /// Initializes a new instance of the <see cref="MongoLogger{TLog}"/> class.  
   /// </summary>  
   /// <param name="name">The name.</param>  
   /// <param name="filter">The filter.</param>  
   /// <param name="serviceProvider">The service provider.</param>  
   public MongoLogger(string name, Func<string, LogLevel, bool> filter, IServiceProvider serviceProvider)  
   {  
     Name = name;  
     Filter = filter ?? GetFilter(serviceProvider.GetService<IOptions<MongoLoggerOption>>());  
     Services = serviceProvider;  
   }  
   /// <summary>  
   /// Gets the database log.  
   /// </summary>  
   /// <value>  
   /// The database log.  
   /// </value>  
   protected IMongoCollection<ErrorLog> DbLog  
   {  
     get  
     {  
       _mongoDb = _mongoDb ?? Services.GetService<IMongoDatabase>();  
       return _mongoDb.GetCollection<ErrorLog>(nameof(ErrorLog));  
     }  
   }  
   /// <summary>  
   /// Begins a logical operation scope.  
   /// </summary>  
   /// <typeparam name="TState"></typeparam>  
   /// <param name="state">The identifier for the scope.</param>  
   /// <returns>  
   /// An IDisposable that ends the logical operation scope on dispose.  
   /// </returns>  
   public IDisposable BeginScope<TState>(TState state)  
   {  
     return null;  
   }  
   /// <summary>  
   /// Checks if the given <paramref name="logLevel" /> is enabled.  
   /// </summary>  
   /// <param name="logLevel">level to be checked.</param>  
   /// <returns>  
   ///  <c>true</c> if enabled.  
   /// </returns>  
   public bool IsEnabled(LogLevel logLevel)  
   {  
     return Filter(Name, logLevel);  
   }  
   /// <summary>  
   /// Writes a log entry.  
   /// </summary>  
   /// <typeparam name="TState"></typeparam>  
   /// <param name="logLevel">Entry will be written on this level.</param>  
   /// <param name="eventId">Id of the event.</param>  
   /// <param name="state">The entry to be written. Can be also an object.</param>  
   /// <param name="exception">The exception related to this entry.</param>  
   /// <param name="formatter">Function to create a <c>string</c> message of the <paramref name="state" /> and <paramref name="exception" />.</param>  
   public void Log<TState>(LogLevel logLevel, EventId eventId, TState state, Exception exception,  
     Func<TState, Exception, string> formatter)  
   {  
     if (!IsEnabled(logLevel))  
     {  
       return;  
     }  
     var message = string.Empty;  
     var values = state as IReadOnlyList<KeyValuePair<string, object>>;  
     if (formatter != null)  
     {  
       message = formatter(state, exception);  
     }  
     else if (values != null)  
     {  
       var builder = new StringBuilder();  
       FormatLogValues(  
         builder,  
         values,  
         level: 1,  
         bullet: false);  
       message = builder.ToString();  
       if (exception != null)  
       {  
         message += Environment.NewLine + exception;  
       }  
     }  
     else  
     {  
        message = $"{Convert.ToString(state)} [Check formatting]";  
     }  
     if (string.IsNullOrEmpty(message))  
     {  
       return;  
     }  
     var log = new TLog  
     {  
       Date = DateTime.UtcNow,  
       Level = logLevel.ToString(),  
       Logger = Name,  
       Message = message,  
       Thread = eventId.ToString(),  
     };  
     if (exception != null)  
     {  
       log.Exception = exception.ToString();  
     }  
     var httpContext = Services.GetRequiredService<IHttpContextAccessor>()?.HttpContext;  
     if (httpContext != null)  
     {  
       log.UserAgent = httpContext.Request.Headers["User-Agent"];  
       log.UserName = httpContext.User.Identity.Name;  
       try  
       {  
         log.IpAddress = httpContext.Connection.LocalIpAddress?.ToString();  
       }  
       catch (ObjectDisposedException)  
       {  
         log.IpAddress = "Disposed";  
       }  
       log.Url = httpContext.Request.Path;  
       log.ServerName = httpContext.Request.Host.Value;  
       log.Referrer = httpContext.Request.Headers["Referer"];  
     }  
     DbLog.InsertOne(log);  
   }  
   /// <summary>  
   /// Formats the log values.  
   /// </summary>  
   /// <param name="builder">The builder.</param>  
   /// <param name="logValues">The log values.</param>  
   /// <param name="level">The level.</param>  
   /// <param name="bullet">if set to <c>true</c> new item insert.</param>  
   private void FormatLogValues(StringBuilder builder, IReadOnlyList<KeyValuePair<string, object>> logValues, int level, bool bullet)  
   {  
     if (logValues == null)  
     {  
       return;  
     }  
     var isFirst = true;  
     foreach (var kvp in logValues)  
     {  
       builder.AppendLine();  
       if (bullet && isFirst)  
       {  
         builder.Append(' ', level * Indentation - 1)  
             .Append('-');  
       }  
       else  
       {  
         builder.Append(' ', level * Indentation);  
       }  
       builder.Append(kvp.Key)  
           .Append(": ");  
       if (kvp.Value is IEnumerable && !(kvp.Value is string))  
       {  
         foreach (var value in (IEnumerable)kvp.Value)  
         {  
           if (value is IReadOnlyList<KeyValuePair<string, object>>)  
           {  
             FormatLogValues(  
               builder,  
               (IReadOnlyList<KeyValuePair<string, object>>)value,  
               level + 1,  
               bullet: true);  
           }  
           else  
           {  
             builder.AppendLine()  
                 .Append(' ', (level + 1) * Indentation)  
                 .Append(value);  
           }  
         }  
       }  
       else if (kvp.Value is IReadOnlyList<KeyValuePair<string, object>>)  
       {  
         FormatLogValues(  
           builder,  
           (IReadOnlyList<KeyValuePair<string, object>>)kvp.Value,  
           level + 1,  
           bullet: false);  
       }  
       else  
       {  
         builder.Append(kvp.Value);  
       }  
       isFirst = false;  
     }  
   }  
   /// <summary>  
   /// Gets the filter.  
   /// </summary>  
   /// <param name="options">The options.</param>  
   /// <returns>Filtered item based on request.</returns>  
   private Func<string, LogLevel, bool> GetFilter(IOptions<MongoLoggerOption> options)  
   {  
     if (options != null)  
     {  
       return ((category, level) => GetFilter(options.Value, category, level));  
     }  
     else  
       return ((category, level) => true);  
   }  
   /// <summary>  
   /// Gets the filter.  
   /// </summary>  
   /// <param name="options">The options.</param>  
   /// <param name="category">The category.</param>  
   /// <param name="level">The level.</param>  
   /// <returns>Filtered item based on request.</returns>  
   private bool GetFilter(MongoLoggerOption options, string category, LogLevel level)  
   {  
     if (options.Filters != null)  
     {  
       var filter = options.Filters.Keys.FirstOrDefault(p => category.StartsWith(p));  
       if (filter != null)  
         return (int)options.Filters[filter] <= (int)level;  
       else return true;  
     }  
     return true;  
   }  
 }  

The Log&lt;TState&gt; method is where the actual logging logic is written; it uses FormatLogValues to format structured log values and exceptions.

GetFilter uses MongoLoggerOption to determine which log levels are enabled, which the IsEnabled function then consults.

 /// <summary>  
 /// Mongo DB logger options.  
 /// </summary>  
 public class MongoLoggerOption  
 {  
   /// <summary>  
   /// Gets or sets the filters for logging.  
   /// </summary>  
   /// <value>  
   /// The filters for logging.  
   /// </value>  
   public Dictionary<string, LogLevel> Filters { get; set; }  
 }  
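
The filters can be wired up during startup through the options pattern. A minimal sketch, assuming `services.Configure` from Microsoft.Extensions.Options; the category prefixes and minimum levels below are illustrative only:

```csharp
// In Startup.ConfigureServices — a sketch; categories and levels are examples.
services.Configure<MongoLoggerOption>(options =>
{
    options.Filters = new Dictionary<string, LogLevel>
    {
        // Log framework categories only from Warning upwards...
        ["Microsoft"] = LogLevel.Warning,
        ["System"] = LogLevel.Warning,
        // ...but keep Information and above from our own namespace.
        ["MyProject"] = LogLevel.Information
    };
});
```

Because GetFilter matches by `StartsWith` on the category name, a prefix such as "Microsoft" covers every framework logger under that namespace.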


So far we have created the model and the logger implementation. Now it is time to create the configuration to register the logger. For registration, we create a logger provider and an extension method to register the logger at startup.

Logger provider
Its sole purpose is to create a logger instance based on the filter and the service provider.

 using Microsoft.Extensions.Logging;  
 using System;  
 /// <summary>  
 /// Mongo DB logger provider  
 /// </summary>  
 /// <typeparam name="TLog">The type of the log.</typeparam>  
 /// <seealso cref="Microsoft.Extensions.Logging.ILoggerProvider" />  
 public class MongoLoggerProvider<TLog> : ILoggerProvider  
   where TLog : ErrorLog, new()  
 {  
   /// <summary>  
   /// The filter for logging.  
   /// </summary>  
   private readonly Func<string, LogLevel, bool> Filter;  
   /// <summary>  
   /// The service provider  
   /// </summary>  
   private readonly IServiceProvider ServiceProvider;  
   /// <summary>  
   /// Initializes a new instance of the <see cref="MongoLoggerProvider{TLog}"/> class.  
   /// </summary>  
   /// <param name="serviceProvider">The service provider.</param>  
   /// <param name="filter">The filter.</param>  
   public MongoLoggerProvider(IServiceProvider serviceProvider, Func<string, LogLevel, bool> filter)  
   {  
     Filter = filter;  
     ServiceProvider = serviceProvider;  
   }  
   /// <summary>  
   /// Creates the logger.  
   /// </summary>  
   /// <param name="name">The name.</param>  
   /// <returns></returns>  
   public ILogger CreateLogger(string name)  
   {  
     return new MongoLogger<TLog>(name, Filter, ServiceProvider);  
   }  
   /// <summary>  
   /// Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.  
   /// </summary>  
   public void Dispose()  
   {  
     // Nothing to dispose.  
   }  
 }  


An extension method on ILoggerFactory for registration

 using Microsoft.Extensions.Logging;  
 using System;  
 /// <summary>  
 /// Mongo DB logger registration.  
 /// </summary>  
 public static class MongoLoggerFactoryExtensions  
 {  
   /// <summary>  
   /// Adds the Mongo DB logger framework.  
   /// </summary>  
   /// <typeparam name="TLog">The type of the log.</typeparam>  
   /// <param name="factory">The factory.</param>  
   /// <param name="serviceProvider">The service provider.</param>  
   /// <param name="filter">The filter.</param>  
   /// <returns></returns>  
   /// <exception cref="ArgumentNullException"></exception>  
   public static ILoggerFactory AddMongoFramework<TLog>(this ILoggerFactory factory,  
     IServiceProvider serviceProvider, Func<string, LogLevel, bool> filter = null)  
     where TLog : ErrorLog, new()  
   {  
     if (factory == null)  
     {  
       throw new ArgumentNullException(nameof(factory));  
     }  
     factory.AddProvider(new MongoLoggerProvider<TLog>(serviceProvider, filter));  
     return factory;  
   }  
 }  

Startup.cs, logger registration

 public IServiceProvider ServiceProvider { get; set; }  
 // INFO: Just a helper to populate ServiceProvider  
 public void ConfigureServices(IServiceCollection services)  
 {  
      //...  
      ServiceProvider = services.BuildServiceProvider();  
 }  
 public void Configure(IApplicationBuilder app, IHostingEnvironment env, ILoggerFactory loggerFactory)  
 {  
      loggerFactory.AddMongoFramework<ErrorLog>(ServiceProvider);  
      //...  
 }  
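
Note that MongoLogger resolves IMongoDatabase and IHttpContextAccessor from the service provider, so both must be registered in ConfigureServices. A minimal sketch; the connection string and database name are placeholders:

```csharp
// In Startup.ConfigureServices — a sketch; connection details are placeholders.
services.AddSingleton<IHttpContextAccessor, HttpContextAccessor>();
services.AddSingleton<IMongoDatabase>(provider =>
{
    // One MongoClient per application is the recommended pattern.
    var client = new MongoClient("mongodb://localhost:27017");
    return client.GetDatabase("MyProjectLogs");
});
```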

That is all we need for logging. For usage details, have a look at https://docs.asp.net/en/latest/fundamentals/logging.html.
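
Once registered, the logger flows through the standard abstractions, so any injected ILogger&lt;T&gt; also writes to MongoDB. A sketch of typical usage (the controller and message are illustrative):

```csharp
public class HomeController : Controller
{
    private readonly ILogger<HomeController> _logger;

    public HomeController(ILogger<HomeController> logger)
    {
        _logger = logger;
    }

    public IActionResult Index()
    {
        // Ends up in the ErrorLog collection, subject to the configured filters.
        _logger.LogInformation("Index page visited");
        return View();
    }
}
```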


Global exception handling


Since logging is already in place, global exception handling is straightforward and takes only a few easy steps.

Implementation of IExceptionFilter

 using Microsoft.AspNetCore.Mvc;  
 using Microsoft.AspNetCore.Mvc.Filters;  
 using Microsoft.AspNetCore.Routing;  
 using Microsoft.Extensions.Logging;  
 /// <summary>  
 /// Global exception filter for the application.  
 /// </summary>  
 /// <seealso cref="Microsoft.AspNetCore.Mvc.Filters.IExceptionFilter" />  
 public class GlobalExceptionFilter  
   : IExceptionFilter  
 {  
   /// <summary>  
   /// The logger  
   /// </summary>  
   private readonly ILogger<GlobalExceptionFilter> Logger;  
   /// <summary>  
   /// Initializes a new instance of the <see cref="GlobalExceptionFilter"/> class.  
   /// </summary>  
   /// <param name="logger">The logger.</param>  
   public GlobalExceptionFilter(ILogger<GlobalExceptionFilter> logger)  
   {  
     Logger = logger;  
   }  
   /// <summary>  
   /// Called after an action has thrown an <see cref="T:System.Exception" />.  
   /// </summary>  
   /// <param name="context">The <see cref="T:Microsoft.AspNetCore.Mvc.Filters.ExceptionContext" />.</param>  
   public void OnException(ExceptionContext context)  
   {  
     Logger.LogError(new EventId(999, "GlobalException"), context.Exception, "Unhandled Error");  
      context.ExceptionHandled = true;  
      // TODO: Create page/action for the end user based on the parameter.  
      // TIP: It can be further classified for AJAX and page requests to change the message accordingly.  
     context.Result = new RedirectToRouteResult(    
       new RouteValueDictionary(new  
       {  
         action = "Index",  
         controller = "Error",  
         id = "Unknown"  
       }));  
   }  
 }  

Filter factory for injecting logger

 using Microsoft.AspNetCore.Mvc.Filters;  
 using Microsoft.Extensions.Logging;  
 using System;  
 namespace MyProject.Web.Infrastructure.Filter  
 {  
   /// <summary>  
   /// <see cref="GlobalExceptionFilter"/> factory.  
   /// </summary>  
   /// <seealso cref="Microsoft.AspNetCore.Mvc.Filters.IFilterFactory" />  
   public class GlobalExceptionFilterFactory  
     : IFilterFactory  
   {  
     /// <summary>  
     /// Gets a value that indicates if the result of <see cref="M:Microsoft.AspNetCore.Mvc.Filters.IFilterFactory.CreateInstance(System.IServiceProvider)" />  
     /// can be reused across requests.  
     /// </summary>  
     public bool IsReusable => false;  
     /// <summary>  
     /// Creates an instance of the executable filter.  
     /// </summary>  
     /// <param name="serviceProvider">The request <see cref="T:System.IServiceProvider" />.</param>  
     /// <returns>  
     /// An instance of the executable filter.  
     /// </returns>  
     public IFilterMetadata CreateInstance(IServiceProvider serviceProvider)  
     {  
        var logger = (ILogger<GlobalExceptionFilter>)serviceProvider.GetService(typeof(ILogger<GlobalExceptionFilter>));  
        return new GlobalExceptionFilter(logger);  
     }  
   }  
 }  

Global exception filter registration in Startup.cs

 services.AddMvc(option =>  
 {  
   if (!Environment.IsDevelopment())  
   {  
     option.Filters.Add(new GlobalExceptionFilterFactory());  
   }  
 });  


That is all we need to configure global exception handling. I have not added a controller for the error redirection, which can easily be done.
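
As a rough sketch, such a controller could look like the following; the view name and messages are illustrative only:

```csharp
public class ErrorController : Controller
{
    // Handles the redirect from GlobalExceptionFilter, e.g. /Error/Index/Unknown.
    public IActionResult Index(string id)
    {
        ViewBag.Message = id == "Unknown"
            ? "Something went wrong. The error has been logged."
            : $"Error: {id}";
        return View();
    }
}
```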
