Build Efficient Worker Roles for Optimum Resource Utilization and Scalability

In one of the previous blogs, I discussed building background services that can scale by tenant. There could, however, be scenarios where you want the system to scale by load (e.g., the number of messages waiting in a queue). In such scenarios you often want control over the way the load is generated to avoid redundancy, while still processing each message as soon as it arrives. You also want to maximize CPU utilization during processing to minimize the cost of scaling out. An effective worker role design helps you achieve this efficiency in your background services. The following figure illustrates one such design using the OnStart and Run methods of the RoleEntryPoint class.

You can use the OnStart method to create instances of scheduled services, using utilities such as the Quartz.Net cron scheduler, that run at a pre-defined interval and populate designated queues with messages to be processed. Typically, you want only one of the configured instances to write into the queue, to avoid duplicate messages being processed. The following code shows a typical OnStart implementation. The service being scheduled will use the leased method (discussed in the previous blog post) to place messages in the queue.

public override bool OnStart()
{
    NetworkSettings.SetServicePointManagerDefaultSetting();
    UnityContainer = UnityHelper.ConfigureUnity();
    QueueProvider = UnityContainer.Resolve<IQueueProvider>(); // type argument assumed; lost in the original formatting
    LogService = UnityContainer.Resolve<ILogService>();       // type argument assumed
    ScheduleServices();
    return base.OnStart();
}
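To ensure only one instance acts as the scheduler, a common approach is a lease on an Azure blob: whichever instance acquires the lease does the scheduling, and the rest skip it. The sketch below illustrates the idea; the container and blob names and the helper itself are assumptions for illustration, not the original post's code, and it targets the classic Microsoft.WindowsAzure.Storage SDK.

```csharp
// Hypothetical sketch: electing a single scheduling instance via an Azure blob lease.
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class SchedulerLease
{
    // Returns a lease id if this instance won the election, or null otherwise.
    public static string TryAcquireSchedulerLease(CloudStorageAccount account)
    {
        var container = account.CreateCloudBlobClient().GetContainerReference("leases");
        container.CreateIfNotExists();

        var blob = container.GetBlockBlobReference("scheduler-lock");
        if (!blob.Exists())
            blob.UploadText(string.Empty);

        try
        {
            // Only one instance can hold the lease at a time; contenders get a 409 Conflict.
            return blob.AcquireLease(TimeSpan.FromSeconds(60), proposedLeaseId: null);
        }
        catch (StorageException)
        {
            return null; // another instance owns the lease; skip scheduling
        }
    }
}
```

The lease must be renewed periodically (or re-acquired after expiry) so that scheduling fails over if the owning instance goes down.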

The code inside the ScheduleServices method could look like:

DateTimeOffset runTime = DateBuilder.EvenMinuteDate(DateTime.Now);
JobScheduler scheduler = new JobScheduler();
scheduler.Schedule(runTime,
    "SiteConfigurationSchedule",
    "SiteConfigurationGroup",
    RoleEnvironment.GetConfigurationSettingValue("SiteConfigurationCronSchedule"));
scheduler.Schedule(runTime,
    "AggregationSchedule",
    "AggregationGroup",
    RoleEnvironment.GetConfigurationSettingValue("AggregationCronSchedule"));
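The JobScheduler type above is a thin wrapper over Quartz.Net. Its internals are not shown in the post; a minimal sketch might look like the following, assuming a generic job type parameter (the original wrapper may resolve the job type differently):

```csharp
// Hedged sketch of a JobScheduler wrapper over Quartz.Net; the generic
// signature is an assumption, not the post's actual implementation.
using System;
using Quartz;
using Quartz.Impl;

public class JobScheduler
{
    private readonly IScheduler scheduler = new StdSchedulerFactory().GetScheduler();

    public void Schedule<TJob>(DateTimeOffset runTime, string name, string group, string cronExpression)
        where TJob : IJob
    {
        IJobDetail job = JobBuilder.Create<TJob>()
            .WithIdentity(name, group)
            .Build();

        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity(name + "Trigger", group)
            .StartAt(runTime)
            .WithCronSchedule(cronExpression) // e.g. "0 0/5 * * * ?" fires every five minutes
            .Build();

        scheduler.ScheduleJob(job, trigger);
        scheduler.Start();
    }
}
```

Keeping the cron expressions in role configuration, as the post does, lets you change schedules without redeploying the role.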

These are examples of different cron services that run based on their defined schedules.

The following code illustrates the implementation inside the Run method of a worker role that uses the Task Parallel Library to process multiple queues at the same time.

public override void Run()
{
    while (true)
    {
        try
        {
            Parallel.Invoke(
                () =>
                {
                    ProcessMessages(Constants.QueueNames.SiteConfigurationQueue, (m, e) => m.CreateSiteConfiguration(e));
                },
                () =>
                {
                    var hasMessages = ProcessMessages(Constants.QueueNames.PacketDataQueue, null, (m, e) => m.ComputeSiteMetrics(e));
                    if (!hasMessages)
                        Thread.Sleep(200);
                });
        }
        catch (Exception ex)
        {
            LogService.Error(ex);
        }
    }
}
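The ProcessMessages helper is not shown in the post. A simplified sketch of what it might do, assuming Azure Storage queues from the classic Microsoft.WindowsAzure.Storage SDK, is below; the handler signature is reduced to a single string payload here, whereas the original takes a service and an entity:

```csharp
// A simplified, hypothetical shape for the ProcessMessages helper.
using System;
using Microsoft.WindowsAzure.Storage.Queue;

public static class QueueWorker
{
    public static bool ProcessMessages(CloudQueue queue, Action<string> handler)
    {
        // GetMessage makes the message invisible to other instances while it is processed.
        CloudQueueMessage message = queue.GetMessage();
        if (message == null)
            return false; // queue empty; the caller can back off briefly

        handler(message.AsString);

        // Delete only after successful processing; if the handler throws, the message
        // reappears after the visibility timeout and is retried.
        queue.DeleteMessage(message);
        return true;
    }
}
```

Deleting the message only after the handler succeeds gives at-least-once processing, which is why the short Thread.Sleep back-off on an empty queue matters: it keeps the tight loop from hammering the storage account.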

This design can scale to as many instances as you want, depending on the load on the queue and the expected throughput. The parallel processing ensures that the CPU in the worker role is optimally utilized, and the Run method keeps each instance continuously processing items from the queue. You can also use auto-scale configuration to add or remove instances automatically based on load.

Known Issues

There is one known issue you must be aware of in this design regarding data writes to Azure Table storage. Since multiple instances will be writing to the table, if you are running updates, the data could have been modified between the time you read the record and the time you wrote it back after processing. Azure, by default, rejects such operations. You can, however, force an update by setting the table entity's ETag property to "*". The following code illustrates a generic table entity Save method with forced updates.

public void Save<T>(T entity, bool isUpdate = false) where T : ITableEntity, new()
{
    TableName = typeof(T).Name;
    if (isUpdate)
        entity.ETag = "*";
    operations.Value.Add(isUpdate ? TableOperation.Replace(entity) : TableOperation.Insert(entity));
}
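For illustration, calling the Save method with a forced update might look like the following; the SiteMetricsEntity type and the repository variable are assumptions for the sake of the example:

```csharp
// Hypothetical usage of the Save helper above.
var entity = repository.Get<SiteMetricsEntity>(partitionKey, rowKey);
entity.ProcessedCount += 1;

// Because ETag is set to "*", Azure replaces the row even if another instance
// modified it after we read it (last writer wins), instead of returning 412.
repository.Save(entity, isUpdate: true);
```

The trade-off is that the wildcard ETag disables optimistic concurrency entirely for that write, which leads to the caution below.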

A word of caution, though: this may not be the design to pursue if the system you are building cannot tolerate any degree of data loss, since a forced update silently overwrites concurrent changes.
