
Displaying progress status in asynchronous programming

UX matters in any application, and a responsive UI is part of that: long-running work should run in the background so the interface stays free. Sometimes we also need to show useful information about that long-running task to keep the user informed about its progress. In this article, we will look at reporting progress status for a long-running process with the Task Parallel Library (TPL).

In this particular example, we will create a console application that resizes any number of .jpg files under a directory to new image dimensions, but our main focus will be on showing the progress status.

Prior to .NET Framework 4.5, there was no built-in mechanism to report the status of a task. If it was needed, we had to declare an event on the class performing the long-running work, define a custom event argument class to carry the progress data, and subscribe to that event from the consuming class. It was a reasonably tidy approach for getting progress information, but it required a lot of plumbing to make it work.
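For comparison, a minimal sketch of that older event-based pattern might look like the following; the ResizeProgressEventArgs and ProgressChanged names here are illustrative, not taken from the sample project.

   // Illustrative only: progress data carried through a custom EventArgs class.
   public class ResizeProgressEventArgs : EventArgs
   {
     public int TotalFiles { get; set; }
     public int ProcessedFiles { get; set; }
   }

   public class LegacyImageResizer
   {
     // The consuming class subscribes to this event to receive progress updates.
     public event EventHandler<ResizeProgressEventArgs> ProgressChanged;

     protected virtual void OnProgressChanged(ResizeProgressEventArgs e)
     {
       var handler = ProgressChanged;
       if (handler != null)
       {
         handler(this, e);
       }
     }
   }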

In 4.5, an interface named IProgress&lt;T&gt; was introduced with just a single method, Report(T value). Because it is generic, we can use a custom class or any built-in type to report status. It is included in mscorlib.dll. http://msdn.microsoft.com/en-us/library/hh138298(v=vs.110).aspx
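For reference, the interface itself is as small as it gets:

   public interface IProgress<in T>
   {
     void Report(T value);
   }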

Let's roll and explore the new approach.

We will have our own custom class to report the total, skipped, and processed file counts and the percentage of completion.

   public class ImageResizeProgress
   {
     public int TotalFiles { get; set; }
     public int SkippedFiles { get; set; }
     public int ProcessedFiles { get; set; }

     // Returns a fraction between 0 and 1; it is formatted with ToString("p") when displayed.
     public decimal Percentage
     {
       get
       {
         if ((TotalFiles - SkippedFiles) != 0)
         {
           return ProcessedFiles / (decimal)(TotalFiles - SkippedFiles);
         }
         return 1;
       }
     }
   }

The above class is pretty simple, without any kind of noise in it. The Percentage value is not multiplied by 100 because we will use ToString("p") to format it as a percentage when displaying it.

With the progress class complete, we need to populate its members as required inside the processing function for image resizing. If you look at the function below, the ImageResizeProgress class is wrapped by the IProgress&lt;T&gt; interface in the parameter. After updating values on the curProgress object, whenever we need to show the status we just call progress.Report(curProgress), which triggers the callback function we are going to create next.

      public Task ProcessFilesForImageDimension(string fileOrDirectory,
        IProgress<ImageResizeProgress> progress)
      {
        var curProgress = new ImageResizeProgress();
        // Delegate to resize a single image and report progress after each file
        Action<string> resize = (file) =>
        {
          // Artificial delay so the progress updates are easier to observe
          Thread.Sleep(100);
          if (ResizeImage(file))
          {
            curProgress.ProcessedFiles++;
          }
          else
          {
            curProgress.SkippedFiles++;
          }
          progress.Report(curProgress);
        };
        return Task.Factory.StartNew(() =>
        {
          if (Directory.Exists(fileOrDirectory))
          {
            var files = Directory.GetFiles(fileOrDirectory, "*.jpg");
            curProgress.TotalFiles = files.Length;
            progress.Report(curProgress);
            foreach (var file in files)
            {
              resize(file);
            }
          }
          else if (File.Exists(fileOrDirectory))
          {
            // Single file: count it so the percentage calculation stays meaningful
            curProgress.TotalFiles = 1;
            resize(fileOrDirectory);
          }
          else
          {
            // Nothing to process; report the empty progress once
            progress.Report(curProgress);
          }
        });
      }

Let's create the function that will display the status on the console. This static method takes only an ImageResizeProgress as an argument; you can consume the object to show the status in your own way.

     private static void ShowStatus(ImageResizeProgress progress)  
     {  
       Console.Clear();  
       Console.SetCursorPosition(0, 0);  
       Console.WriteLine("Completion percentage: " + progress.Percentage.ToString("p"));  
       Console.WriteLine("Total Files: " + progress.TotalFiles);  
       Console.WriteLine("Processed Files: " + progress.ProcessedFiles);  
       Console.WriteLine("Skipped Files: " + progress.SkippedFiles);  
     }  

Now that everything is set up, we just need to call the ProcessFilesForImageDimension function to initiate the operation. For the IProgress&lt;T&gt; parameter we pass the concrete implementation Progress&lt;T&gt;, which is again defined in mscorlib: http://msdn.microsoft.com/en-us/library/hh194158(v=vs.110).aspx. The Progress&lt;T&gt; constructor takes an Action&lt;T&gt; as a parameter; in our case we pass the ShowStatus method we just defined.

      private static void Main(string[] args)
      {
        ImageDimensionManager dimensionManager = new ImageDimensionManager();
        // TODO: Change location
        Task tskImageResize = dimensionManager.ProcessFilesForImageDimension(
          @"C:\Users\Public\Pictures\Sample Pictures",
          new Progress<ImageResizeProgress>(ShowStatus));
        tskImageResize.Wait();
      }

Now that everything is set, we can run the application and watch the progress status update in the console.
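For example, a run over eight .jpg files where two are skipped would end with output along these lines (the numbers are only illustrative):

      Completion percentage: 100.00 %
      Total Files: 8
      Processed Files: 6
      Skipped Files: 2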

Source code: https://www.dropbox.com/s/tf9vughitrj426s/AsyncProgessbar.zip
