Object-Relational Mapping - Taking the Horror Out of Data Access

What are the benefits of using object-relational mapping as a persistence mechanism rather than .NET DataSets? Real-world use shows that developers who choose to work with persistent objects produce more robust, scalable and maintainable software. And unlike with an OODBMS, object-relational mapping still lets you access the data by connecting directly to the RDBMS. There is a lot to be gained by using O/R mapping technology, and that is exactly what this article is about.



4. Explaining object-relational mapping
Now that we have convinced you of the benefits, let's take a look at object-relational mapping in practice.

Design Goals

Increase:
• Abstraction
• Maintainability
• Scalability

Decrease:
• Database coupling
• Repetitive coding

What is the design goal of an O/R DAL?
The main reason to hide the columns and tables of your relational database is improved productivity. The increased abstraction allows more natural modelling of data and simpler handling of complex relationships between data entities. Thus, the one-class-maps-to-one-table mapping that some mistakenly believe is O/R mapping turns out to be pretty useless – all overhead, no abstraction.
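To make the difference concrete, here is a small illustrative sketch of what the abstraction buys you: the business method navigates a relationship as an object reference instead of juggling foreign keys and column names. The Customer, Order and InvoiceService classes are made-up examples, not the output of any particular tool.

// Illustrative classes such as an O/R DAL might hand back; all names are made up.
public class Customer
{
    public string Name;
    public string Email;
}

public class Order
{
    public Customer Customer;   // the relationship is an object reference,
                                // not a CustomerId column to join on
    public decimal Total;
}

public class InvoiceService
{
    // The business method works in terms of objects and relationships;
    // tables, joins and column names stay hidden inside the O/R DAL.
    public string BuildGreeting(Order order)
    {
        return "Dear " + order.Customer.Name +
               ", your order total is " + order.Total + ".";
    }
}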

The other important goal is to remove the dependency of your business methods and UI on the underlying database and database schema. There are several good reasons to do this. An application might, after extensive use, require refactoring to remove deadlocks or performance bottlenecks. You might want to upgrade or even switch the RDBMS (that never happens, some say, but that is often a confusion of cause and effect: if your application is closely coupled to your database, you would never be tempted to suggest switching it). What this amounts to is greatly simplified maintenance. Any change to the data source requires a limited maintenance effort that is confined to a well-defined part of the application.

What is the penalty?
The benefits of the O/R DAL are all very well, but there is a cost: at design time, during implementation and at run-time. First, you need to know elementary OO modelling techniques, including the fine art of NOT over-engineering your object-oriented data model. Just like painting, which takes a minute to learn and a lifetime to master, OO data modelling is an art that requires experience. But it is great fun, and the alternative, pure relational data modelling, is really not a serious option anymore.

The biggest drawback is the cost of implementing a good O/R DAL. Whereas the business methods will benefit immensely (a typical method being half the size and practically self-documenting), actually writing the O/R DAL takes years of experience and a lot of patience. There are a lot of pitfalls for those who choose a DIY approach, and if you do, make sure you keep the O/R DAL really, really, really simple. Even simple O/R DALs can be useful as long as you don't lose out on the abstraction (remember what we said about one-object-maps-to-one-table). At the end of this article you will find resources that will help you if you choose to write your own O/R DAL.
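If you do go the DIY route, "really simple" can be as modest as the sketch below: one hand-rolled mapper method that turns a row into an object, using System.Data.SqlClient against an assumed Product table. No caching, no lazy load, no change tracking; all names are illustrative.

using System.Data.SqlClient;

// A deliberately minimal hand-rolled mapper: one method that turns a row
// into an object. Nothing more.
public class Product
{
    public int Id;
    public string Name;
    public decimal Price;
}

public class ProductMapper
{
    private readonly string connectionString;

    public ProductMapper(string connectionString)
    {
        this.connectionString = connectionString;
    }

    public Product GetById(int id)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Name, Price FROM Product WHERE Id = @Id", conn))
        {
            cmd.Parameters.AddWithValue("@Id", id);
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (!reader.Read()) return null;   // no such product
                Product p = new Product();
                p.Id = id;
                p.Name = reader.GetString(0);
                p.Price = reader.GetDecimal(1);
                return p;
            }
        }
    }
}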

On the other hand, excellent tools have emerged that generate the O/R DAL for you. Just make sure that the tool you choose also generates the database schema, returns objects in relationships so you can navigate with the OO dot syntax, and gives you full flexibility to generate and map different database schemas. If it isn't flexible enough you could end up in trouble, but if it is you'll be laughing all the way to deadline (on time). Not to mention maintenance, which is a joy with a standardized and accurately documented API.
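What that flexibility typically boils down to is the ability to declare, per class, which table and columns it maps to. The sketch below uses home-made attributes purely as an illustration; real tools have their own mapping formats (XML files, designers or attributes), so do not read this as any particular product's syntax.

// Purely illustrative mapping metadata: the class name and the table name
// are free to differ, and the same class could be mapped to another schema
// by changing the attributes alone.
[TableMapping("tbl_customer")]
public class Customer
{
    [ColumnMapping("cust_name")]  public string Name;
    [ColumnMapping("cust_email")] public string Email;
}

// The attributes themselves are ordinary custom attributes.
public class TableMappingAttribute : System.Attribute
{
    public string Table;
    public TableMappingAttribute(string table) { Table = table; }
}

public class ColumnMappingAttribute : System.Attribute
{
    public string Column;
    public ColumnMappingAttribute(string column) { Column = column; }
}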

Another issue that is raised when introducing an O/R DAL is that of performance. A well-written O/R DAL typically adds less than one percent of overhead compared to the corresponding dynamic SQL or stored procedure (SPROC) calls. However, there are some cases where a SPROC is a lot faster, such as "update all product prices by 10%". Since all data is stored in the RDBMS, you can easily bypass the O/R DAL and write a SPROC to take care of that use case. Just don't forget to ask yourself whether the performance optimisation is really necessary. Even if the update takes a couple of minutes longer through the O/R DAL, it might be such an unusual event that it doesn't matter. The rule is "don't fix what isn't broken", and the priority should always be maintenance over performance, as long as performance is good enough.
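Here is what such a bypass might look like, issuing the set-based statement directly through System.Data.SqlClient (wrapping the same statement in a SPROC works just as well). The connection string and the Product table are assumptions made for the sake of the example.

using System.Data.SqlClient;

public class PriceMaintenance
{
    // A set-based update is one of the few cases where bypassing the
    // O/R DAL pays off: one statement instead of loading every product,
    // changing it and saving it back.
    public static int RaiseAllPricesByTenPercent(string connectionString)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "UPDATE Product SET Price = Price * 1.10", conn))
        {
            conn.Open();
            return cmd.ExecuteNonQuery();   // number of rows updated
        }
    }
}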

Functionality over performance
The design principle is to develop your entire application using the O/R DAL first. When you have verified that you have met your functional requirements, implemented a ton of changes and made sure your customer is happy with the application, then you can solve performance issues. You basically use a Model | Implement | Test | Tune approach.

When you tune your application, you start by identifying the business methods that are performance bottlenecks, estimate how these limitations affect the functionality of the application, and implement a performance fix if you consider it necessary. Since performance fixes are usually geared towards special circumstances, this approach retains flexibility as long as possible.

What about caching?
Data caching is really, really important when creating an O/R DAL, not only because of performance but also to achieve consistency during a transaction. In fact, you will be eating dust if you don't implement data caching and later try to support real transaction management.

Imagine that you display a value, such as the number of page visits, at the top and at the bottom of a web page. If the value isn't cached during the life span of the page, it is quite possible that the two numbers will differ. This is unacceptable. To avoid it you could declare local variables in your ASP/ASP.NET page, but if the O/R DAL supports caching, this is something you don't have to think about. Very neat: it saves LOC and reduces bugs.
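A minimal sketch of such a transaction-scoped cache, with made-up class names: the first read of the page-visit counter hits the database, every later read within the same transaction returns the cached value, so the top and the bottom of the page always show the same number.

using System.Collections;

// A minimal transaction-scoped cache: flushed when the transaction ends,
// never shared between transactions.
public class TransactionCache
{
    private readonly Hashtable items = new Hashtable();

    public object Get(string key)            { return items[key]; }
    public void Put(string key, object value) { items[key] = value; }
    public void Flush()                       { items.Clear(); }
}

public class PageCounterDal
{
    private readonly TransactionCache cache;

    public PageCounterDal(TransactionCache cache) { this.cache = cache; }

    public int GetPageVisits(string pageName)
    {
        object cached = cache.Get("visits:" + pageName);
        if (cached != null) return (int)cached;      // second read: same value

        int visits = ReadVisitsFromDatabase(pageName); // first read only
        cache.Put("visits:" + pageName, visits);
        return visits;
    }

    private int ReadVisitsFromDatabase(string pageName)
    {
        // ... SELECT Visits FROM PageStats WHERE Page = @Page ...
        return 0;   // placeholder for the real query
    }
}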

If the O/R DAL supports global caching, you can improve the performance of your entire application by an order of magnitude when there is a lot of consecutive reading of the same data. Special care should be taken to avoid inconsistency between caches on different servers in a web farm. If you only implement a transaction cache that is flushed after each transaction and kept isolated from other caches, you won't have any problem. On the other hand, if each server has a shared cache, it should probably only be used for read-only data such as articles or user data.
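A per-server shared cache can be as simple as the sketch below; the important design choice is that it only ever holds read-only data, so the servers in the farm cannot disagree about anything that changes.

using System.Collections;

// A per-server shared cache, deliberately restricted to read-only data
// (articles, reference data). Writable data stays in the transaction cache.
public class SharedReadOnlyCache
{
    private static readonly Hashtable items =
        Hashtable.Synchronized(new Hashtable());

    public static object Get(string key)            { return items[key]; }
    public static void Put(string key, object value) { items[key] = value; }
}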

Should we use lazy load?
Lazy loading improves scalability and performance by reducing the amount of data that is read when you are just navigating around the data object hierarchy. The idea behind lazy loading is that any given code snippet only uses a subset of an object's properties, and which subset might change according to run-time conditions. Thus, deciding which data to actually read is something you want to postpone as late as possible. In those cases where you benefit from a chunky recordset, you might choose to bypass the O/R DAL, but don't fix what isn't…
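A lazily loaded property might look like the following sketch (the class and helper names are illustrative): the orders collection is only fetched from the database the first time somebody actually asks for it.

using System.Collections;

public class Customer
{
    private ArrayList orders;   // stays null until the property is first read

    public ArrayList Orders
    {
        get
        {
            if (orders == null)
            {
                orders = LoadOrdersFromDatabase();   // first access triggers the read
            }
            return orders;   // later accesses reuse the loaded data
        }
    }

    private ArrayList LoadOrdersFromDatabase()
    {
        // ... SELECT ... FROM [Order] WHERE CustomerId = @Id ...
        return new ArrayList();   // placeholder for the real query
    }
}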

Supporting OO-concepts
When implementing your own O/R DAL you really want to choose the features you support carefully. Keep it really, really, really simple. Something that is neat is inheritance. This allows reuse within your object-oriented data model and is a great improvement over a non-reusable model.
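For example, a base class can carry the fields that every persistent object shares, so they are declared once instead of being repeated in every mapped class. The names below are illustrative.

// Inheritance in the data model: common persistence fields live in one
// base class and are reused by every mapped class.
public abstract class PersistentObject
{
    public int Id;
    public System.DateTime Created;
    public System.DateTime Modified;
}

public class Article : PersistentObject
{
    public string Title;
    public string Body;
}

public class Comment : PersistentObject
{
    public string Author;
    public string Text;
}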

Another neat feature is polymorphism. Without it you lose the possibility of creating generic methods that process objects inherited from the same base class.
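A small sketch of that idea, again with made-up names: the Save method validates and persists any object derived from the base class without knowing its concrete type, and each subclass supplies its own rules.

public abstract class PersistentObject
{
    public abstract void Validate();   // each subclass supplies its own rules
}

public class Article : PersistentObject
{
    public string Title;

    public override void Validate()
    {
        if (Title == null || Title.Length == 0)
            throw new System.Exception("An article must have a title.");
    }
}

// A generic method: it validates and saves any PersistentObject,
// whatever the concrete type happens to be.
public class UnitOfWork
{
    public void Save(PersistentObject obj)
    {
        obj.Validate();
        // ... persist the object through the O/R DAL ...
    }
}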

Late binding lets you store any object in a property or collection, allowing, for example, a generic bookmark feature.
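Such a bookmark might be nothing more than a label and an object-typed reference, as in this illustrative sketch:

using System.Collections;

// A generic bookmark: because the target is typed as object, the same
// feature works for articles, products or anything else the O/R DAL loads.
public class Bookmark
{
    public string Label;
    public object Target;   // late bound: any persistent object
}

public class BookmarkList
{
    private readonly ArrayList bookmarks = new ArrayList();

    public void Add(string label, object target)
    {
        Bookmark b = new Bookmark();
        b.Label = label;
        b.Target = target;
        bookmarks.Add(b);
    }
}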

Finally, introspection allows you to write more flexible, and more robust, applications by letting you dynamically inspect the structure of a given class. This could be useful if you want to implement a generic service that takes an object, searches for a certain set of properties in that object and manipulates only those.
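As an example, the sketch below uses .NET reflection to find string properties whose names end in "Html" (an assumed naming convention, purely for illustration) and sanitise them, whatever the concrete class happens to be.

using System;
using System.Reflection;

// Introspection: a generic service that inspects any object at run-time
// and manipulates only the properties it recognises by convention.
public class HtmlSanitiser
{
    public void Sanitise(object obj)
    {
        PropertyInfo[] properties = obj.GetType().GetProperties();
        foreach (PropertyInfo property in properties)
        {
            if (property.PropertyType == typeof(string)
                && property.Name.EndsWith("Html")
                && property.CanRead && property.CanWrite)
            {
                string value = (string)property.GetValue(obj, null);
                if (value != null)
                {
                    // naive example transformation; the point is the reflection
                    property.SetValue(obj, value.Replace("<script", "&lt;script"), null);
                }
            }
        }
    }
}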


