
Posts

Showing posts from 2015

Activity logs through Telerik Data Access

Some time back I wrote a blog post on managing history for the required models ( http://vikutech.blogspot.in/2015/08/implementing-automating-audit-logs-in-telerik-data-access.html ). In this example, we will look at maintaining a small summary of information based on the tables affected in a particular context request: which kind of operations were performed on which models, and by whom. The approach is to create a model to store the summary information and attach to the DB context events in the required format. The basic data model for collecting the information holds just a user identifier and a message.

/// <summary>
/// User activity log domain model
/// </summary>
public class UserActivityLog : IPrimaryKey<int>
{
    /// <summary>
    /// Gets or sets the identifier.
    /// </summary>
    /// <value>The identifier.</value>
    public int Id { get; set; }

    /// <summary>
    /// Gets or sets the user identifier.
    /// </summary>
    public string UserId { get; set; }

    /// <summary>
    /// Gets or sets the message.
    /// </summary>
    public string Message { get; set; }
}
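Below is a minimal sketch of attaching the summary logging to the context's change tracking. It assumes a Telerik Data Access context exposing GetChanges() with GetInserts/GetUpdates/GetDeletes, plus a CurrentUserId value set per request; those names, and the SaveChangesWithActivityLog wrapper, are illustrative rather than the post's actual code.

public partial class MyDataContext // assumed to derive from the Telerik Data Access context
{
    // Set per request, e.g. from the authenticated principal (assumption).
    public string CurrentUserId { get; set; }

    public void SaveChangesWithActivityLog()
    {
        // Assumption: GetChanges() exposes the tracked inserts/updates/deletes.
        var changes = this.GetChanges();

        foreach (var inserted in changes.GetInserts<object>())
            LogActivity("Added", inserted);
        foreach (var updated in changes.GetUpdates<object>())
            LogActivity("Updated", updated);
        foreach (var deleted in changes.GetDeletes<object>())
            LogActivity("Deleted", deleted);

        this.SaveChanges();
    }

    private void LogActivity(string operation, object entity)
    {
        // One small summary row: who did what on which model.
        this.Add(new UserActivityLog
        {
            UserId = CurrentUserId,
            Message = string.Format("{0} {1}", operation, entity.GetType().Name)
        });
    }
}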

File upload mechanism to support Local server, File Server and Azure server

Taking a direct dependency on Azure is a real problem if there is any possibility of switching to another platform in the future. A hard Azure dependency also forces the application to be run through the Azure Emulator, which itself takes time to load whenever the application needs to be debugged, and so costs a lot of development time; from an architectural point of view it is not a good idea to have any hard dependency in the application. In this article, we will create a file upload mechanism that can be switched from Azure to a local server or vice versa. The two key parts needed are accessing the uploaded file and uploading the file. The starting point is to create interfaces for accessing the file path and uploading files to the server; this allows us to change the implementation as needed.

Interfaces

IFileUpload : Interface for uploading files

using MyProject.Model.DTO.Upload;
using MyProject.Model.Enumeration;
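As a rough illustration of the abstraction, here is a minimal sketch. The MyProject.Model.DTO.Upload and MyProject.Model.Enumeration types are not shown in the excerpt, so the UploadRequest DTO, ServerType enumeration, IFileAccess interface and LocalFileUpload class below are hypothetical stand-ins; only the local-server implementation is sketched, and an Azure implementation would satisfy the same interfaces.

using System.IO;

public enum ServerType
{
    Local,
    FileServer,
    Azure
}

public class UploadRequest
{
    public string FileName { get; set; }
    public Stream Content { get; set; }
}

public interface IFileUpload
{
    // Saves the file and returns the path (or URL) it can later be accessed from.
    string Upload(UploadRequest request);
}

public interface IFileAccess
{
    // Resolves the stored file name to a path or URL usable by the client.
    string GetFilePath(string storedFileName);
}

public class LocalFileUpload : IFileUpload, IFileAccess
{
    private readonly string rootFolder;

    public LocalFileUpload(string rootFolder)
    {
        this.rootFolder = rootFolder;
    }

    public string Upload(UploadRequest request)
    {
        var fullPath = Path.Combine(rootFolder, request.FileName);
        using (var file = File.Create(fullPath))
        {
            request.Content.CopyTo(file);
        }
        return fullPath;
    }

    public string GetFilePath(string storedFileName)
    {
        return Path.Combine(rootFolder, storedFileName);
    }
}

Because the consumers depend only on the interfaces, switching between the local and Azure implementations becomes a matter of changing a single binding in the DI container or a configuration value.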

Optimization through WeakReference

The generic WeakReference<T> class was added in the .NET Framework 4.5 (the non-generic WeakReference has existed much longer), and the same concept has been available in Java from its early days, so this article can be useful for .NET, Java, and Android developers alike. What is it? It is nothing more than a class, but what is special about it is the way it interacts with the Garbage Collector, which can help with optimization. Scenario: we all know that static members are created once and shared throughout the application's life cycle as a single instance. That can be useful for data that needs to be accessed frequently, but what about data that does not need to be available at all times? Such data might be resource-hungry, so a static member or some kind of caching mechanism seems necessary, yet storing it in static members or a cache increases the application's memory footprint. This is one situation, and there can be others as well. WeakReference is a technique to hook a member up to the Garbage Collector so that the actual object can be retrieved as long as it has not been collected.
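As a concrete illustration, here is a minimal sketch of a WeakReference<T>-backed cache; the ReportData type and LoadReportData method are hypothetical stand-ins for the resource-hungry data described above.

using System;
using System.Collections.Generic;

public class ReportData
{
    public IList<string> Rows { get; set; }
}

public static class ReportCache
{
    private static WeakReference<ReportData> cachedReport;

    public static ReportData GetReport()
    {
        ReportData report;
        // Reuse the previously built data if the GC has not collected it yet.
        if (cachedReport != null && cachedReport.TryGetTarget(out report))
        {
            return report;
        }

        // Rebuild the expensive data and hook it to the GC through a weak reference,
        // so it can be reclaimed under memory pressure instead of living forever.
        report = LoadReportData();
        cachedReport = new WeakReference<ReportData>(report);
        return report;
    }

    private static ReportData LoadReportData()
    {
        // Placeholder for the resource-hungry load (database, file, service call ...).
        return new ReportData { Rows = new List<string>() };
    }
}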

Configuring Ninject, Asp.Net Identity UserManager, DataProtectorTokenProvider with Owin

It can be a bit tricky to configure both Ninject and the ASP.NET Identity UserManager when the UserManager configuration expects values from the DI container. We will look at configuring both and also use the OwinContext to get the UserManager. As usual, all configuration is done in Startup.cs. The file name is just a convention and a different name can be used; the important thing is to decorate the class with the following attribute to make it the Owin start-up class:

[assembly: OwinStartup(typeof(MyProject.Web.Startup))]

Ninject configuration

Configure the Ninject kernel through a method that will be registered with Owin.

Startup.cs

public IKernel CreateKernel()
{
    var kernel = new StandardKernel();
    try
    {
        //kernel.Bind<IHttpModule>().To<HttpApplicationInitializationHttpModule>();
        // TODO: Put any other injection which are required.
        return kernel;
    }
    catch
    {
        kernel.Dispose();
        throw;
    }
}
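The next step is to register the UserManager on the OwinContext so it is created per request and wired to a DataProtectorTokenProvider. The following is a minimal sketch only: ApplicationUser and ApplicationUserManager are assumed placeholder types, and the Ninject kernel is assumed to supply the manager's dependencies (such as the user store).

using Microsoft.AspNet.Identity;
using Microsoft.AspNet.Identity.Owin;
using Ninject;
using Owin;

public partial class Startup
{
    public void Configuration(IAppBuilder app)
    {
        var kernel = CreateKernel(); // the Ninject kernel created above

        // Register a factory so the UserManager is created once per OwinContext and
        // can later be fetched with context.GetUserManager<ApplicationUserManager>().
        app.CreatePerOwinContext<ApplicationUserManager>((options, context) =>
        {
            // The kernel supplies the manager and its dependencies (assumption).
            var manager = kernel.Get<ApplicationUserManager>();

            // Wire the token provider used for password-reset / confirmation tokens
            // to the Owin data-protection provider.
            var dataProtectionProvider = options.DataProtectionProvider;
            if (dataProtectionProvider != null)
            {
                manager.UserTokenProvider = new DataProtectorTokenProvider<ApplicationUser>(
                    dataProtectionProvider.Create("ASP.NET Identity"));
            }

            return manager;
        });
    }
}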

Implementing/Automating audit logs in Telerik Data Access

Maintaining audit logs can be a tedious task if done manually, and a developer can easily forget to update the audit-log implementation at some level; without centralization the same code gets repeated everywhere. There are many approaches to maintaining the change history of a model/table, such as keeping a single history table that records changes for all models, possibly with some flags and the change list stored as JSON data. Here we will look at maintaining a history table per required data model with minimum effort and minimal performance cost. To reduce code, I am going to use T4 to generate the history models automatically from the original models. We will also take care of artificial type values.

Step 1 - Create a custom attribute to mark a model whose history needs to be maintained.

/// <summary>
/// Attribute to maintain history table
/// </summary>
[AttributeUsage(AttributeTargets.Class)]
public class ManageHistoryAttribute : Attribute
{
}
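To make the intent concrete, here is a minimal sketch of how a model is marked with the attribute and how the T4 template (or any reflection-based scan) can discover the marked models before generating the history classes; the Product model and the HistoryModelScanner name are purely illustrative.

using System;
using System.Linq;
using System.Reflection;

[ManageHistory]
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal Price { get; set; }
}

public static class HistoryModelScanner
{
    // Returns every class in the given assembly that asked for a history table.
    public static Type[] GetHistoryModels(Assembly modelAssembly)
    {
        return modelAssembly.GetTypes()
            .Where(t => t.IsClass &&
                        t.GetCustomAttribute<ManageHistoryAttribute>() != null)
            .ToArray();
    }
}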