
Advanced enum generation for lookup tables through T4

Some time back I wrote a simple T4 template to generate enums for lookup tables: http://vikutech.blogspot.in/2014/01/enumeration-generation-for-lookup-table.html. Later on, I published the same on NuGet: https://www.nuget.org/packages/t4.lookup.

This time I am going to extend it to support multiple enums, enum Ids, configuration through a class, and separate settings for the enum description and value columns.

Here is the entire template code for the generation:

 <#@ template debug="true" hostSpecific="true" #>  
 <#@ output extension=".cs" #>  
 <#@ Assembly Name="System.Data" #>  
 <#@ import namespace="System" #>  
 <#@ import namespace="System.Collections.Generic" #>  
 <#@ include file="EF.Utility.CS.ttinclude"#>  
 <#@ import namespace="System.Data.SqlClient" #>   
 <#  
   // TODO: Look for alternatives to remove connection string  
   var connectString = "data source=.;initial catalog=Test;integrated security=True";  
   var configList = new List<EnumConfiguration>{  
       new EnumConfiguration{  
         TableName = "LookupCountry",  
         EnumName = "Country",    // Can be empty, Get inherited based on table name  
         EnumColumn = new EnumColumnDetail {  
           Key = "Code",  
           EnumDesc = "Name",   // Not Mandatory  
           EnumSequence = "Id"   // Not Mandatory  
           },  
         },  
      new EnumConfiguration{   // Minimal configuration: EnumName defaults to the table name, EnumDesc to the Key column  
         TableName = "LookupCountry",  
         EnumColumn = new EnumColumnDetail {  
           Key = "Code",  
           },  
         }  
       };  
   var code = new CodeGenerationTools(this);  
 #>  
 using System.CodeDom.Compiler;  
 namespace <#=    code.VsNamespaceSuggestion()#>  
 {    
      //------------------------------------------------------------------------------  
      // <auto-generated>  
      //   This code was generated from a template and will be overwritten as soon   
      //       as the template is executed.  
      //  
      //   Changes to this file may cause incorrect behavior and will be lost if  
      //   the code is regenerated.  
      // </auto-generated>  
      //------------------------------------------------------------------------------  
 <#  
   using (SqlConnection connection =  
       new SqlConnection(connectString))  
   {  
     connection.Open();  
     foreach(var config in configList)  
     {  
       if(!config.IsValid()) {  
         throw new ArgumentException("Please check the configuration carefully.");  
       }  
     #>  
   /// <summary>  
   /// <#=config.EnumNameDescription #>  
  /// </summary>  
   [GeneratedCode("T4.Lookup", "2.0")]  
   public enum <#=config.EnumNameDescription #>  
   {  
 <#  
       var command = new SqlCommand(config.ToString(), connection);  
       SqlDataReader reader = command.ExecuteReader();  
       // Call Read before accessing data.   
       while (reader.Read())  
       {  
         var enumId = String.IsNullOrEmpty(config.EnumColumn.EnumSequence) ? String.Empty : (" = " + reader[2]);  
 #>  
     /// <summary>  
     /// <#=          reader[1] #>  
     /// </summary>  
           <#=reader[0] #><#=enumId#>,   
 <#  
       }  
       reader.Close();   
 #>  
   }  
 <#  
     }  
   }  
 #>  
 }  
 <#+  
     /// <summary>  
   /// Enum configuration details  
   /// </summary>  
   internal class EnumConfiguration  
   {  
     /// <summary>  
     /// The enum name  
     /// </summary>  
     private string _enumName;  
     /// <summary>  
     /// The enum name description for XML comment  
     /// </summary>  
     private string _enumNameDesc;  
     public string TableName { get; set; }  
     /// <summary>  
     /// Gets or sets the name of the enum.  
     /// </summary>  
     /// <value>  
     /// The name of the enum.  
     /// </value>  
     public string EnumName  
     {  
       get  
       {  
         return String.IsNullOrEmpty(_enumName) ? TableName : _enumName;  
       }  
       set  
       {  
         _enumName = value;  
       }  
     }  
     /// <summary>  
     /// Gets or sets the enum name description.  
     /// </summary>  
     /// <value>  
     /// The enum name description.  
     /// </value>  
     public string EnumNameDescription  
     {  
       get  
       {  
         return String.IsNullOrEmpty(_enumNameDesc) ? EnumName : _enumNameDesc;  
       }  
       set  
       {  
         _enumNameDesc = value;  
       }  
     }  
     /// <summary>  
     /// Gets or sets the enum column details.  
     /// </summary>  
     /// <value>  
     /// The enum column details.  
     /// </value>  
     public EnumColumnDetail EnumColumn { get; set; }  
     /// <summary>  
     /// Determines whether this instance is valid.  
     /// </summary>  
     /// <returns></returns>  
     public bool IsValid()  
     {  
       return (!String.IsNullOrEmpty(TableName) && EnumColumn != null &&  
         !String.IsNullOrEmpty(EnumColumn.Key));  
     }  
     /// <summary>  
     /// Returns SQL query string.  
     /// </summary>  
     /// <returns>  
     /// Query string for SQL.  
     /// </returns>  
     public override string ToString()  
     {  
       if (IsValid())  
       {  
         // Generate SQL query  
         return String.Format("SELECT {0} AS EnumKey, {1} AS EnumDesc {3} FROM {2} order by {0}",  
           EnumColumn.Key, EnumColumn.EnumDesc, TableName,  
           String.IsNullOrEmpty(EnumColumn.EnumSequence) // Sequence number, if needed  
           ? String.Empty :  
           ", " + EnumColumn.EnumSequence);   
       }  
       return null;  
     }  
   }  
   /// <summary>  
   /// Generating enum column details  
   /// </summary>  
   internal class EnumColumnDetail  
   {  
     /// <summary>  
     /// The enum description  
     /// </summary>  
     private string _enumDesc;  
     /// <summary>  
     /// Gets or sets the key for enum that would be used.  
     /// </summary>  
     /// <value>  
     /// The key.  
     /// </value>  
     public string Key { get; set; }  
     /// <summary>  
     /// Gets or sets the enum description column.  
     /// </summary>  
     /// <value>  
     /// The enum description column.  
     /// </value>  
     public string EnumDesc  
     {  
       get  
       {  
         return String.IsNullOrEmpty(_enumDesc) ? Key : _enumDesc;  
       }  
       set  
       {  
         _enumDesc = value;  
       }  
     }  
     /// <summary>  
     /// Gets or sets the enum sequence Id.  
     /// </summary>  
     /// <value>  
     /// The enum sequence Id.  
     /// </value>  
     public string EnumSequence { get; set; }  
   }  
  #>  

connectString : The SQL connection string used to connect to the database that holds the lookup tables.
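
The connection string is hard-coded in the template for simplicity (see the TODO in the code). As one possible alternative, here is a minimal sketch, not part of the package itself, that reads it from the host project's configuration file through the T4 host; the file name "app.config" and the connection string name "Test" are assumptions:

 <#@ assembly name="System.Configuration" #>  
 <#  
   // Sketch only: resolve the project's app.config through the T4 host  
   // (requires hostSpecific="true") and read a named connection string.  
   var configMap = new System.Configuration.ExeConfigurationFileMap  
   {  
     ExeConfigFilename = this.Host.ResolvePath("app.config")   // assumed file name  
   };  
   var hostConfig = System.Configuration.ConfigurationManager  
     .OpenMappedExeConfiguration(configMap, System.Configuration.ConfigurationUserLevel.None);  
   var connectString = hostConfig.ConnectionStrings  
     .ConnectionStrings["Test"].ConnectionString;               // assumed connection string name  
 #>  

This keeps the credentials out of the template file itself.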

The EnumConfiguration class holds the configuration for enum generation, and configList is initialized with as many configurations as needed.

These are the details to configure enum generation (a sketch of the generated output follows the list):
TableName : The name of the lookup table. [Mandatory]
EnumName : The desired enum name associated with TableName. If not provided, the table name is used. [Not mandatory]
EnumColumn :
       Key : The column of the table used to generate the enum member names. [Mandatory]
       EnumDesc : The column of the table used to generate the XML description for each enum member. If not provided, the Key column is used. [Not mandatory]
       EnumSequence : The column of the table used to generate the underlying enum values. If not provided, the compiler assigns default values. [Not mandatory]
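
For reference, this is roughly what the template would produce for the first configuration above. It is only an illustrative sketch: the namespace comes from the project (via VsNamespaceSuggestion), and the members assume the LookupCountry table contains rows such as (Code = 'IN', Name = 'India', Id = 1) and (Code = 'US', Name = 'United States', Id = 2):

 using System.CodeDom.Compiler;  
 namespace MyProject  
 {  
   /// <summary>  
   /// Country  
   /// </summary>  
   [GeneratedCode("T4.Lookup", "2.0")]  
   public enum Country  
   {  
     /// <summary>  
     /// India  
     /// </summary>  
     IN = 1,  
     /// <summary>  
     /// United States  
     /// </summary>  
     US = 2,  
   }  
 }  

The second configuration would emit a similar enum named LookupCountry from the same table, with Code used for both the member names and the XML comments and no explicit underlying values.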

