Refreshing the Metadata
Metadata has assumed a very important role in the quest to design increasingly generic and extensible software frameworks. Although I've gone into a fair level of detail describing how my company uses metadata to drive data validation rules, Brierley & Partners' CRM (customer relationship management) framework uses metadata on a much broader scale to define the very essence of a customer. In an abstract sense, we define a customer as having a collection of profiles and events. The specific types of profiles and events vary dramatically per implementation and are therefore also defined external to the application. So not only have we removed the business rules surrounding the data from inline code, we've also removed the definition of the customer data itself. As the extent of the metadata repository grows, performance quickly becomes an issue.
To allow the generic code base to perform well in a high-transaction environment, we decided to cache the metadata within each process where it is required. As an instance of our CRM framework is created (once per process), it immediately loads the metadata definition from a database into local memory. This allows very quick access to the information that drives the behavior of the system. Note that by placing all of this system-level knowledge (data structure, database access rules, validation rules, etc.) within the backend framework, we can apply it consistently regardless of the origin of the data. For example, processing could originate from a wireless device, a Web site, an XML Web service, or a batch process, but each shares the same back end. By caching the metadata we have solved the pending performance issue. However, what happens when the business rules change? In a nutshell, the cache becomes invalid and we need to refresh it. We use .NET remoting as the foundation for various systems management tasks, including refreshing the metadata and enabling distributed logging.
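A minimal sketch of this per-process caching scheme follows. The class and method names (MetadataCache, LoadFromDatabase) and the sample rule are hypothetical, not the actual framework code; the point is that the cache is loaded on first access and can be invalidated later:

```csharp
using System.Collections.Generic;

// Hypothetical sketch: per-process metadata cache, loaded once on first access.
public static class MetadataCache
{
    private static readonly object s_lock = new object();
    private static Dictionary<string, string> s_rules;

    // All access to the metadata goes through this static property.
    public static Dictionary<string, string> Rules
    {
        get
        {
            lock (s_lock)
            {
                if (s_rules == null)
                    s_rules = LoadFromDatabase(); // first touch fills the cache
                return s_rules;
            }
        }
    }

    // Called when the cache becomes stale; the next read of Rules reloads it.
    public static void Invalidate()
    {
        lock (s_lock) { s_rules = null; }
    }

    // Stand-in for the real database query that loads the metadata definition.
    private static Dictionary<string, string> LoadFromDatabase()
    {
        return new Dictionary<string, string> { { "Email", "required" } };
    }
}
```

Subsequent reads hit only local memory, which is what makes the approach viable under heavy transaction loads.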
Describing the mechanics of .NET's distributed object infrastructure is beyond the scope of this article. Instead, we'll highlight how to use .NET remoting to solve our invalidated cache scenario since it relates to data validation.
Our systems management server lives in a traditional Windows NT service and makes itself known to the remoting environment within our implementation of the System.ServiceProcess.ServiceBase OnStart method:

Hashtable hshtProps = new Hashtable();
hshtProps["port"] = 8085; // port number is illustrative
BinaryServerFormatterSinkProvider sink = new BinaryServerFormatterSinkProvider();
m_chan = new TcpChannel(hshtProps, null, sink);
ChannelServices.RegisterChannel(m_chan);
// Type and URI names are illustrative
WellKnownServiceTypeEntry entry = new WellKnownServiceTypeEntry(
    typeof(SystemsManager), "SystemsManager", WellKnownObjectMode.Singleton);
RemotingConfiguration.RegisterWellKnownServiceType(entry);
However, in the spirit of extensibility, the recommended approach for defining the characteristics of the server is to externalize these configuration elements into a configuration file and load them with RemotingConfiguration.Configure.
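Such a configuration file might look like the following sketch; the type, assembly, object URI, and port are all hypothetical placeholders, and the file would be loaded at service startup with a call such as RemotingConfiguration.Configure("MyService.exe.config"):

```xml
<configuration>
  <system.runtime.remoting>
    <application>
      <service>
        <!-- Singleton server object; type and assembly names are illustrative -->
        <wellknown mode="Singleton"
                   type="SystemsManagement.SystemsManager, SystemsManagement"
                   objectUri="SystemsManager" />
      </service>
      <channels>
        <channel ref="tcp" port="8085">
          <serverProviders>
            <formatter ref="binary" />
          </serverProviders>
        </channel>
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>
```

The advantage is that the port, channel, and formatter can be changed at deployment time without recompiling the service.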
This server keeps track of each process that needs to be notified when the contents of the metadata have changed. In order to support this functionality, the server implements an interface allowing instances of the framework to register themselves as subscribers. The server, implemented as a Singleton, maintains a list of each subscriber and periodically pings each instance to verify its continued existence. The metadata (e.g., data validation rules) is updated using a GUI client that serves as a publisher. That is, when the metadata is modified, it publishes this event to the systems management server, which in turn iterates over the subscriber list, telling each instance that its metadata is out of date. Within each instance of the framework, all access to the metadata is regulated through a static property; with synchronization objects, the data can be refreshed safely in a multithreaded environment.
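The publish/subscribe contract described above can be sketched as follows. The interface and class names are illustrative, not the actual Brierley & Partners API, and the remoting plumbing is omitted so the fan-out logic stands on its own:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical contract between the systems management server and each
// framework instance.
public interface IMetadataSubscriber
{
    void MetadataChanged(); // server calls this when the rules are republished
    bool Ping();            // lets the server verify continued existence
}

// Singleton server: tracks subscribers and fans out change notifications.
public sealed class SystemsManager
{
    public static readonly SystemsManager Instance = new SystemsManager();

    private readonly List<IMetadataSubscriber> _subscribers =
        new List<IMetadataSubscriber>();
    private readonly object _lock = new object();

    private SystemsManager() { }

    public void Subscribe(IMetadataSubscriber subscriber)
    {
        lock (_lock) { _subscribers.Add(subscriber); }
    }

    // Called by the publishing GUI client after the metadata is modified.
    public void PublishMetadataChange()
    {
        lock (_lock)
        {
            // Drop subscribers that no longer respond, then notify the rest.
            _subscribers.RemoveAll(s =>
            {
                try { return !s.Ping(); }
                catch { return true; }
            });
            foreach (var s in _subscribers)
                s.MetadataChanged();
        }
    }
}
```

In the real system each subscriber would be a remoted object living in a framework process; its MetadataChanged implementation would simply mark the local cache stale so the next metadata access reloads from the database.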
The .NET Framework has allowed Brierley & Partners to build a very extensible CRM platform that is driven by externally defined rules. Through a unique combination of .NET reflection and remoting we have been able to deploy a highly configurable software platform that is also very responsive and suited to high transactional volume.