

Data Validation Using .NET and External Metadata: Page 4

Data validation is a task that we perform every day when writing code. Applying the techniques described here can make your systems more robust and eliminate the constant rebuilding of applications every time a business rule changes.





Refreshing the Metadata
Metadata has assumed an important role in the quest to design increasingly generic and extensible software frameworks. Although I've gone into a fair level of detail describing how my company uses metadata to drive data validation rules, Brierley & Partners' CRM (customer relationship management) framework uses metadata on a much broader scale to define the very essence of a customer. In an abstract sense, we define a customer as having a collection of profiles and events. The specific types of profiles and events vary dramatically per implementation and are therefore also defined externally to the application. So not only have we removed the business rules surrounding the data from inline code, we've also removed the definition of the customer data itself. As the metadata repository grows, performance quickly becomes an issue.

To let the generic code base perform well in a high-transaction environment, we decided to cache the metadata within each process where it is required. When an instance of our CRM framework is created (once per process), it immediately loads the metadata definition from a database into local memory, allowing very quick access to the information that drives the behavior of the system. Note that by placing all of this system-level knowledge (data structure, database access rules, validation rules, etc.) within the back-end framework, we can apply it consistently regardless of the origin of the data. For example, processing could originate from a wireless device, a Web site, an XML Web service, or a batch process, but all share the same back end. By caching the metadata we have solved the pending performance issue. However, what happens when the business rules change? In a nutshell, the cache becomes invalid and we need to refresh it. We use .NET remoting as the foundation for various systems management tasks, including refreshing the metadata and enabling distributed logging.
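The load-once, refresh-on-demand cache can be sketched as follows. This is a minimal illustration with hypothetical names (MetadataCache, LoadFromDatabase); it is not Brierley's actual API. The rules are read from the database on first access, every later access is served from local memory, and a lock keeps the load and any later refresh safe across threads.

```csharp
using System;
using System.Collections.Generic;

public static class MetadataCache
{
    private static readonly object s_sync = new object();
    private static Dictionary<string, string> s_rules;
    public static int Loads;   // for illustration only: counts database hits

    public static Dictionary<string, string> Rules
    {
        get
        {
            lock (s_sync)
            {
                if (s_rules == null)            // first touch in this process
                    s_rules = LoadFromDatabase();
                return s_rules;
            }
        }
    }

    // Called when the systems management server reports a change;
    // the next reader reloads the rules from the database.
    public static void Invalidate()
    {
        lock (s_sync) { s_rules = null; }
    }

    // Stand-in for the real metadata query.
    private static Dictionary<string, string> LoadFromDatabase()
    {
        Loads++;
        return new Dictionary<string, string> { { "Email", @"^\S+@\S+$" } };
    }
}
```

Every caller goes through the Rules property, so the cost of a database round trip is paid once per process, and again only after an explicit invalidation.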

Describing the mechanics of .NET's distributed object infrastructure is beyond the scope of this article. Instead, we'll highlight how to use .NET remoting to solve the invalidated-cache scenario as it relates to data validation.

Our systems management server lives in a traditional Windows NT service and makes itself known to the remoting environment within our override of the System.ServiceProcess.ServiceBase.OnStart method:

   Hashtable hshtProps = new Hashtable();
   hshtProps.Add("port", 8085);

   BinaryServerFormatterSinkProvider sink =
       new BinaryServerFormatterSinkProvider();
   sink.TypeFilterLevel =
       System.Runtime.Serialization.Formatters.TypeFilterLevel.Full;

   m_chan = new TcpChannel(hshtProps, null, sink);
   ChannelServices.RegisterChannel(m_chan);

   WellKnownServiceTypeEntry entry = new WellKnownServiceTypeEntry(
       typeof(SystemsManagement.ServerManager),
       "ServerManager",
       WellKnownObjectMode.Singleton);
   RemotingConfiguration.RegisterWellKnownServiceType(entry);

However, in the spirit of extensibility, the recommended approach for defining the server's characteristics is to externalize these configuration elements into a configuration file and load them with RemotingConfiguration.Configure.
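A configuration file equivalent to the code above might look like the following sketch (the assembly name in the type attribute is an assumption for illustration):

```xml
<configuration>
  <system.runtime.remoting>
    <application>
      <service>
        <wellknown mode="Singleton"
                   type="SystemsManagement.ServerManager, SystemsManagement"
                   objectUri="ServerManager" />
      </service>
      <channels>
        <channel ref="tcp" port="8085">
          <serverProviders>
            <formatter ref="binary" typeFilterLevel="Full" />
          </serverProviders>
        </channel>
      </channels>
    </application>
  </system.runtime.remoting>
</configuration>
```

The service then replaces the hard-coded registration with a single call to RemotingConfiguration.Configure, passing the path of its configuration file, and the port, channel, and well-known object can all be changed without recompiling.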

This server keeps track of each process that needs to be notified when the contents of the metadata have changed. To support this functionality, the server implements an interface that allows instances of the framework to register themselves as subscribers. The server, implemented as a Singleton, maintains a list of subscribers and periodically pings each instance to verify its continued existence. The metadata (e.g., the data validation rules) is updated using a GUI client that serves as a publisher. That is, when the metadata is modified, the client publishes this event to the systems management server, which in turn iterates over the subscriber list, telling each instance that its metadata is out of date. Within each instance of the framework, all access to the metadata is regulated through a static property, and, with synchronization objects, the data can be refreshed safely in a multi-threaded environment.
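The publish/subscribe contract can be sketched like this. The names are hypothetical, and where the real system remotes the Singleton server from the earlier listing and passes MarshalByRefObject subscribers across the TCP channel, the wiring here is shown in-process so the flow is visible:

```csharp
using System;
using System.Collections.Generic;

// Callback contract: "your cached metadata is out of date."
public interface IMetadataSubscriber
{
    void MetadataChanged();
}

public class ServerManager
{
    private readonly List<IMetadataSubscriber> m_subscribers =
        new List<IMetadataSubscriber>();

    // Each framework instance registers itself once per process.
    public void Subscribe(IMetadataSubscriber subscriber)
    {
        lock (m_subscribers) { m_subscribers.Add(subscriber); }
    }

    // Invoked by the GUI client after it saves new rules; every
    // registered instance is told that its metadata is stale.
    public void PublishMetadataChanged()
    {
        lock (m_subscribers)
        {
            foreach (IMetadataSubscriber s in m_subscribers)
                s.MetadataChanged();
        }
    }
}

// A framework instance simply flips a flag; the next metadata
// access through its static property triggers the actual reload.
public class FrameworkInstance : IMetadataSubscriber
{
    public bool CacheIsStale;

    public void MetadataChanged() { CacheIsStale = true; }
}
```

Deferring the reload to the next read, rather than reloading inside the callback, keeps the notification cheap and ensures the database is hit at most once per process no matter how many change events arrive in a burst.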

The .NET Framework has allowed Brierley & Partners to build a very extensible CRM platform that is driven by externally defined rules. Through a unique combination of .NET reflection and remoting we have been able to deploy a highly configurable software platform that is also very responsive and suited to high transactional volume.

Jonathan Schafer is an Architect at Brierley & Partners. Jonathan has 15 years experience designing and developing distributed applications in the Microsoft environment, and two and a half years experience with .NET. Jonathan is currently architecting systems for client applications. Brierley & Partners, Inc. is a direct marketing agency specializing in the design and implementation of customer relationship programs. Since its founding in 1985, Brierley has supported the loyalty program design efforts for 150 of the nation's leading brands, serving as the architect for Hertz #1 Club Gold, Hilton HHonors, and Blockbuster Rewards. Brierley's direct marketing and interactive agency clients include Sony, Hertz, Hilton, Epson, Nokia, MSN, and Vail Resorts. The company's 200 professionals are based in Dallas, Los Angeles, Chicago, London, and San Francisco.