
Displaying progress status in asynchronous programming

UX matters in any application, and a responsive application keeps the UI free by running long-running tasks in the background. Sometimes we also need to surface useful information about a long-running task to keep the user informed about its progress. In this article, we will look into showing progress status for a long-running process with the Task Parallel Library (TPL).

In this particular example, we will create a console application that resizes any number of JPG files under a directory to a new image dimension, but our main focus will be on showing progress status.

Prior to .NET Framework 4.5, there was no built-in mechanism to report the status of a task. If it was needed, we had to expose an event on the class processing the long-running task, with a custom event argument class carrying the progress status, and the consuming class then had to subscribe to that event to receive progress information. It was a reasonably tidy approach, but a lot of plumbing was needed to make it work.
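
A rough sketch of that older pattern (not taken from the sample; the class and event names are illustrative only) could look like this:

   public class ResizeProgressEventArgs : EventArgs
   {
     public int ProcessedFiles { get; set; }
     public int TotalFiles { get; set; }
   }

   public class LegacyImageResizer
   {
     // The consumer subscribes to this event to receive progress updates.
     public event EventHandler<ResizeProgressEventArgs> ProgressChanged;

     protected virtual void OnProgressChanged(ResizeProgressEventArgs e)
     {
       var handler = ProgressChanged;
       if (handler != null)
       {
         handler(this, e);
       }
     }
   }

   // On the consuming side:
   // var resizer = new LegacyImageResizer();
   // resizer.ProgressChanged += (s, e) =>
   //   Console.WriteLine(e.ProcessedFiles + " of " + e.TotalFiles);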

In 4.5, an interface named IProgress&lt;T&gt; was introduced with just a single method, Report(T value). Since it is generic, we can use a custom-defined class or any built-in type to carry the status. It is included in mscorlib.dll: http://msdn.microsoft.com/en-us/library/hh138298(v=vs.110).aspx
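
For reference, this is the entire shape of the interface:

   public interface IProgress<in T>
   {
     void Report(T value);
   }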

Let's roll and explore the new approach.

We will have our own custom class to track the total, skipped, and processed file counts, plus the percentage of completion.

   public class ImageResizeProgress  
   {  
     public int TotalFiles { get; set; }  
     public int SkippedFiles { get; set; }  
     public int ProcessedFiles { get; set; }  
     public decimal Percentage  
     {  
       get  
       {  
         if ((TotalFiles - SkippedFiles) != 0)  
         {  
           return ProcessedFiles / (decimal)(TotalFiles - SkippedFiles);  
         }  
         return 1;  
       }  
     }  
   }  

The above class looks pretty simple, without any kind of noise in it. For the percentage there is no multiplication by 100, because we will be using ToString("p") to display the percentage value.
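
As a quick standalone illustration of the "p" format specifier (not part of the sample code):

     decimal fraction = 0.25m;
     // The "P" standard numeric format multiplies by 100 and appends a percent sign,
     // so this prints something like "25.00 %" (exact spacing and decimals depend on culture).
     Console.WriteLine(fraction.ToString("p"));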

With the progress class in place, we need to populate its members as needed inside the processing function for image resizing. If you look at the function below, the ImageResizeProgress class is wrapped in the IProgress interface as a parameter. After changing values on the curProgress object, whenever the status needs to be shown we just call progress.Report(curProgress), which will trigger the function we are going to create next.

     public Task ProcessFilesForImageDimension(string fileOrDirectory,
       IProgress<ImageResizeProgress> progress)
     {
       var curProgress = new ImageResizeProgress();
       // Delegate to resize image
       Action<string> resize = (file) =>
       {
         Thread.Sleep(100);
         if (ResizeImage(file))
         {
           curProgress.ProcessedFiles++;
         }
         else
         {
           curProgress.SkippedFiles++;
         }
         progress.Report(curProgress);
       };
       return Task.Factory.StartNew(() =>
       {
         if (Directory.Exists(fileOrDirectory))
         {
           var files = Directory.GetFiles(fileOrDirectory, "*.jpg");
           curProgress.TotalFiles = files.Length;
           progress.Report(curProgress);
           foreach (var file in files)
           {
             resize(file);
           }
         }
         else if (File.Exists(fileOrDirectory))
         {
           resize(fileOrDirectory);
         }
         else
         {
           progress.Report(curProgress);
         }
       });
     }
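
The ResizeImage helper itself is not the focus of this article (the complete version is in the attached source code below). A minimal sketch, assuming System.Drawing is referenced and using a hypothetical fixed target size, could look like this:

     private bool ResizeImage(string file)
     {
       try
       {
         using (var original = new Bitmap(file))
         // Hypothetical fixed dimension, used here for illustration only.
         using (var resized = new Bitmap(original, new Size(800, 600)))
         {
           var target = Path.Combine(
             Path.GetDirectoryName(file),
             Path.GetFileNameWithoutExtension(file) + "_resized.jpg");
           resized.Save(target, ImageFormat.Jpeg);
         }
         return true;
       }
       catch (Exception)
       {
         // Treat unreadable or locked files as skipped.
         return false;
       }
     }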

Let's create the function that will display the status on the console. This static method only takes ImageResizeProgress as an argument; you can consume the object to show the status in your own way.

     private static void ShowStatus(ImageResizeProgress progress)  
     {  
       Console.Clear();  
       Console.SetCursorPosition(0, 0);  
       Console.WriteLine("Completion percentage: " + progress.Percentage.ToString("p"));  
       Console.WriteLine("Total Files: " + progress.TotalFiles);  
       Console.WriteLine("Processed Files: " + progress.ProcessedFiles);  
       Console.WriteLine("Skipped Files: " + progress.SkippedFiles);  
     }  

Now that everything is set up, we just need to call the ProcessFilesForImageDimension function to initiate the operation. For the IProgress interface we need to pass a concrete implementation, Progress&lt;T&gt;, which again is defined in mscorlib: http://msdn.microsoft.com/en-us/library/hh194158(v=vs.110).aspx. The Progress class constructor takes an action as a parameter; in our case we pass the ShowStatus method we defined to the constructor.

     private static void Main(string[] args)
     {
       ImageDimensionManager dimensionManager = new ImageDimensionManager();
       // TODO: Change location
       Task tskImageResize = dimensionManager.ProcessFilesForImageDimension
         (@"C:\Users\Public\Pictures\Sample Pictures",
         new Progress<ImageResizeProgress>(ShowStatus));
       tskImageResize.Wait();
     }

Now that everything is set, we can run the application and expect a result something like this.
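
(Illustrative output only; the actual numbers depend on the files in the folder, as printed by ShowStatus.)

     Completion percentage: 100.00 %
     Total Files: 8
     Processed Files: 8
     Skipped Files: 0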

Source code: https://www.dropbox.com/s/tf9vughitrj426s/AsyncProgessbar.zip
