
Caching with Weak References in .NET


To cache objects returned from a service, you can use one of several available mechanisms, the simplest of which is to reference the objects indefinitely. However, this approach ultimately suffers if an application caches too much data. Alternatively, you could use the Microsoft Caching Application Block (CAB), an extensible module that facilitates common caching scenarios. But when you’re creating a sophisticated architecture requiring that no more than one copy of a given object (such as a customer record) exist in memory at a given time, the CAB won’t suffice.

One such scenario arises when you’re implementing change notification. To implement change notification in a service-oriented environment, the client framework needs to hold a reference to every client-side object so it can deliver events to them. When using traditional caching techniques such as the CAB, there is a delay between the time the cache releases its reference to an object and the time that object actually gets collected, because the UI is still referencing the formerly cached object. During this window, there is no way for a change notification event to be routed to the object. Also, while the UI is still referencing an object (but the cache is not), the client framework will create another copy of that object the next time the UI asks for it (perhaps on a different screen), violating the one-copy requirement.

Fortunately, you can approach the problem another way: by using the WeakReference class. According to the Microsoft documentation, “…the WeakReference class cannot be treated as an automatic solution to memory management problems. You still have to establish rules for your application, that is, a caching policy, about what entries are kept or removed from a cache.” However, building a cache manager that uses WeakReference objects is a viable approach.

The solution described in this article uses a combination of a WeakReference and a strong reference to create a cache that will meet the application’s demands.

Getting Started
This solution revolves around the CacheReference class. At the heart of this class are two fields:

   class CacheReference<T> where T : class
   {
      private WeakReference m_weakReference;
      private T m_strongReference;
   ...

Initially, the object you want to cache is referenced by m_strongReference. This member is strongly typed (using the generic type parameter T), so it avoids the casting overhead of storing values as object. You use the m_weakReference member to mark the object as ready for collection. The Target property wraps access to these fields, returning the cached value, or null if the object has been garbage collected:

   public T Target
   {
      get
      {
         //Check to see if we have a strong reference
         if (m_strongReference != null)
            return m_strongReference;
         if (m_weakReference == null)
            throw new InvalidOperationException(
               "Unable to return Target when it has been collected");
         if (m_weakReference.IsAlive)
            return (T)m_weakReference.Target;
         //The target has been collected - ditch the weak reference
         m_weakReference = null;
         return null;
      }
      set
      {
         m_weakReference = null;
         m_strongReference = value;
      }
   }

You need to provide a way to specify that the cached object is eligible for garbage collection. To make the cached object collectable, switch from using the strong reference to the weak reference. That change allows the garbage collector to collect the object (if there are no other references to the object). One additional benefit of the CacheReference class is that an object that has been marked as collectible can be promoted back to a strong reference if it has not yet been garbage collected. This behavior is governed by the AllowCollection property (see Listing 1).
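Listing 1 is not reproduced here; as a rough sketch (an assumption about its shape, not the article’s actual listing), the AllowCollection property on CacheReference&lt;T&gt; might toggle between the two fields like this:

```csharp
// Hypothetical sketch of AllowCollection (the article's Listing 1).
// Setting it to true demotes the strong reference to a WeakReference;
// setting it to false promotes the object back to a strong reference
// if it has not yet been garbage collected.
public override Boolean AllowCollection
{
   get { return m_strongReference == null; }
   set
   {
      if (value && m_strongReference != null)
      {
         // Demote: hold only a weak reference so the GC may collect it.
         m_weakReference = new WeakReference(m_strongReference);
         m_strongReference = null;
      }
      else if (!value && m_weakReference != null && m_weakReference.IsAlive)
      {
         // Promote: pin the object again while it is still alive.
         m_strongReference = (T)m_weakReference.Target;
         m_weakReference = null;
      }
   }
}
```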

Checking the Pulse
Finally, you need a way to determine whether the item has been collected or not. The IsAlive property returns true if you have a strong reference; otherwise, it returns the value of the m_weakReference.IsAlive property:

   public override Boolean IsAlive
   {
      get
      {
         if (m_strongReference != null)
            return true;
         if (m_weakReference == null)
            return false;
         return m_weakReference.IsAlive;
      }
   }

You should check this property before attempting to access the cached object to avoid an InvalidOperationException.
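A caller-side guard might look like the following sketch (cacheRef and UseCustomer are hypothetical names introduced for illustration):

```csharp
// Hypothetical guard; cacheRef is a CacheReference<Customer> that was
// previously switched to a weak reference via AllowCollection.
if (cacheRef.IsAlive)
{
   Customer c = cacheRef.Target;  // may still be null if the object was
                                  // collected between the check and the read
   if (c != null)
      UseCustomer(c);
}
```

Note that even with the IsAlive check, the object can be collected between the check and the read, so a null check on Target is still prudent.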

Abstract the Non-Generic Properties
To make caching a list of objects of different types easier, you can move some items to a base class. The following code shows the base class as well as an added field that keeps track of the last time an instance of this object was accessed. The cache manager uses that field value to expire the object:

   abstract class CacheReference
   {
      protected DateTime m_lastAccess;

      public CacheReference()
      {
         m_lastAccess = DateTime.Now;
      }

      public DateTime LastAccess
      {
         get { return m_lastAccess; }
      }

      public abstract Boolean IsAlive { get; }
      public abstract Boolean AllowCollection { get; set; }
   }

Exploring the CacheManager Class
This class is similar to the CacheManager class that is part of the Caching Application Block, but this version is a little less pluggable because it contains a hard-coded scavenging scheme:

   class CacheManager
   {
      private Dictionary<String, CacheReference> m_cache =
         new Dictionary<string, CacheReference>();

The m_cache field is a dictionary of CacheReference objects that will contain all the cached items. You can add CacheReference&lt;T&gt; objects to it because CacheReference&lt;T&gt; inherits from the non-generic CacheReference class.

You will need a method to add an item to the cache. Use a generic method so that you can create a CacheReference&lt;T&gt; instance to hold the item:

   public void Add<T>(String id, T tocache) where T : class
   {
      m_cache[id] = new CacheReference<T>(tocache);
   }

You’ll also need a way to get items back out. The Get method shown below handles item retrieval. You pass in a string ID; the Get method takes care of checking to see if the provided ID exists in the cache. If so, the method verifies that the object is alive. If the item has already been collected (when temp.IsAlive == false), the method removes the item from the cache:

   public T Get<T>(String id) where T : class
   {
      CacheReference temp = null;

      if (m_cache.TryGetValue(id, out temp))
      {
         if (temp.IsAlive)
         {
            //The object is alive and doing well - return it.
            return ((CacheReference<T>)temp).Target;
         }
         else
         {
            //The item has been garbage collected.  Remove it.
            Remove(id);
            return null;
         }
      }
      else
         return null;
   }

Note that the Get method calls a method named Remove, which removes objects from the cache. Remove is a separate method because you’ll also want to be able to explicitly remove an item from the cache:

   public void Remove(String id)
   {
      CacheReference cacheReference;

      if (m_cache.TryGetValue(id, out cacheReference))
      {
         if (cacheReference.IsAlive)
            cacheReference.AllowCollection = true;
         m_cache.Remove(id);
      }
   }

So far, you have only basic cache functionality: adding, retrieving, and removing items.
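Putting those three operations together, basic use of the class might look like this sketch (the two-argument constructor is covered in the next section, and Foo is the sample class introduced later; the values shown are illustrative):

```csharp
// Hypothetical usage of the CacheManager members shown so far.
CacheManager cache = new CacheManager(40000, 500);

Foo customer = new Foo();
customer.Name = "Bar 1";

// Add the object under a string key.
cache.Add<Foo>(customer.Name, customer);

// Retrieve it; returns the same instance, or null if it was collected.
Foo fetched = cache.Get<Foo>("Bar 1");

// Explicitly evict it when it is no longer needed.
cache.Remove("Bar 1");
```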

Taking Out the Garbage
The last piece of the puzzle is to provide an expiration mechanism. A straightforward mechanism to expire objects is to scavenge x number of objects after y objects have been added to the cache. To facilitate this approach, add some code to the CacheManager class that gives you the ability to set those values externally:

   private int m_maxItemsBeforeScavenge;
   private int m_itemsToScavenge;

   public CacheManager(int maxItemsBeforeScavenge, int itemsToScavenge)
   {
      m_maxItemsBeforeScavenge = maxItemsBeforeScavenge;
      m_itemsToScavenge = itemsToScavenge;
   }

A simple scavenging scheme (see Listing 2) attempts to expire a number of objects (m_itemsToScavenge) whenever the cache grows to a predetermined number of objects (m_maxItemsBeforeScavenge). Note that the Scavenge method uses an explicit call to GC.Collect. While calling GC.Collect explicitly is normally frowned upon, calling it after dereferencing an unusually large number of objects is acceptable (and desirable in this case).
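Listing 2 is not reproduced here; a sketch of what such a scavenging method might look like (an assumption consistent with the fields above, not the article’s actual listing) follows. It demotes the least recently used entries to weak references, forces a collection, and then drops the entries that were actually reclaimed:

```csharp
// Hypothetical sketch of the scavenging scheme (the article's Listing 2).
// Presumably invoked from Add once the cache holds more than
// m_maxItemsBeforeScavenge entries.
private void Scavenge()
{
   // Pick the least recently accessed entries as expiration candidates.
   List<String> oldest = m_cache
      .OrderBy(pair => pair.Value.LastAccess)
      .Take(m_itemsToScavenge)
      .Select(pair => pair.Key)
      .ToList();

   // Demote each candidate to a weak reference so the GC may reclaim it.
   foreach (String id in oldest)
      m_cache[id].AllowCollection = true;

   // Normally discouraged, but acceptable after dereferencing a large
   // number of objects at once.
   GC.Collect();

   // Remove the entries whose targets were actually collected; entries
   // still referenced elsewhere (e.g., by the UI) survive.
   foreach (String id in oldest)
   {
      if (!m_cache[id].IsAlive)
         m_cache.Remove(id);
   }
}
```

This design explains the sample output shown later: entries still referenced by the UI stay alive, so each scavenge pass removes only some of the candidates.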

Putting the Cache into Action
The downloadable sample application (see Figure 1) lets you test the cache, using a very small Foo test class:

   class Foo
   {
      public String Name;
   }
Author’s Note: In production, this class wouldn’t be a good candidate for caching using weak references because the WeakReference itself could consume more memory than the object that it’s referencing.

Figure 1. Testing the Cache: The sample application creates and caches a large number of objects and lets you see how many have been garbage collected.

This is the code that generates the objects:

   for (int x = 0; x < 50000; x++)
   {
      Foo foo = new Foo();
      foo.Name = "Bar " + x.ToString();

      if (x == 0)
         m_foo = foo;

      m_cacheManager.Add<Foo>(foo.Name, foo);
   }

Here's the output:

   Scavenging is starting.  There are 40001 items in the list.
   Scavenging is Complete.  There are 39503 items in the list.
   Scavenging is starting.  There are 40001 items in the list.
   Scavenging is Complete.  There are 39502 items in the list.
   Scavenging is starting.  There are 40001 items in the list.
   Scavenging is Complete.  There are 39502 items in the list.
   Scavenging is starting.  There are 40001 items in the list.
   Scavenging is Complete.  There are 39502 items in the list.

The WeakReference object is very powerful, but, just as Microsoft warns, you must combine it with additional logic to make it useful in a caching application. You'll find the effort required to take advantage of the CacheReference class worthwhile if you need to distribute events from a service to client objects.
