Pitfalls on the Road to Enterprise Architecture Maturity

Here are some ways that an enterprise architecture gets lost on the way to maturity and some alternate paths to get back on track.



There are several definitions of enterprise architecture (EA) maturity. Many -- if not most -- are modeled on the Capability Maturity Model Integration (CMMI). Perhaps because the precise definition of the maturity levels (not to mention the number of levels) varies between enterprise architecture models, most enterprises end up setting their own maturity level a little higher, and claiming it a little sooner, than their chosen model would seem to justify.

Whereas the CMMI has five stages of maturity, a compact model could have only four (Initiated > Defined > Measurable > Optimized) and still be recognizable as the same idea. The US DoC ACMM Framework, as referenced by TOGAF, has six stages because it starts at Level 0 (none), which could be interpreted as a sincere attempt to incorporate the real world into the model. For models that start at Level 0, the first stage is the decision to pursue an enterprise architecture and architecture maturity at all. Models that start at Level 1 both create architecture maturity as a byproduct and pursue it as a goal.



In this article, I will identify some assumptions based on the goals of an enterprise architecture strategy and consider how those assumptions map to the reality experienced in various enterprises.

How Most Enterprises Measure Architectural Maturity

Most enterprises use some kind of scoring system to measure their progress along the path to architectural maturity. The United States Department of Commerce's ACMM v1.2 contains a scoring system that is clear, concise, and consistent. It has six levels, each measured across nine areas, and a summary scoring system that boils it all down to a scale of 1 to 5.
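The mechanics of such a summary score can be sketched in a few lines. This is a minimal illustration, not the ACMM's actual method: it assumes nine equally weighted assessment areas, each rated 0 to 5, averaged into a single figure. The area names below are hypothetical placeholders, not the official ACMM categories.

```python
# Sketch of an ACMM-style summary score: nine assessment areas,
# each rated 0-5, averaged into one maturity figure.
# Area names are illustrative placeholders, not the official ACMM list.
AREAS = [
    "process", "development", "business_linkage", "management_involvement",
    "unit_participation", "communication", "security", "governance",
    "investment",
]

def summary_score(ratings: dict) -> float:
    """Average the nine area ratings into one maturity score (0-5)."""
    missing = [a for a in AREAS if a not in ratings]
    if missing:
        raise ValueError("unrated areas: %s" % missing)
    for area in AREAS:
        if not 0 <= ratings[area] <= 5:
            raise ValueError("%s: rating outside the 0-5 scale" % area)
    return round(sum(ratings[a] for a in AREAS) / len(AREAS), 2)

ratings = {a: 2 for a in AREAS}   # an enterprise scoring 2 across the board
ratings["governance"] = 3         # ...except slightly ahead on governance
print(summary_score(ratings))     # 2.11
```

The single averaged number is exactly what makes the "We are here" marker discussed below so easy to produce -- and, as the rest of this article argues, so easy to inflate when the underlying ratings are self-reported.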

The number of levels, the areas evaluated, the clarity of the evaluation system, and the scale used to measure the current level of architectural maturity can all vary greatly from one enterprise to another when examined in detail. Still, when considering the high-level definitions of what maturity means (and enterprise architecture is a high-level discipline), the variations appear fairly minimal.

The scale begins either with the enterprise architecture being non-existent or with a best guess at its current state. The levels then move up the scale toward some desired end state. The goal is to place a marker along the path to architectural maturity that says "We are here."

With such a relatively clear understanding of the architectural maturity path and where it leads, the maturity level at which an enterprise perceives itself should (allowing for minor variations) look the same from inside the enterprise as it does to an external observer. So why is this so often not the case?

The Dangers of Spreadsheets for EA Data Gathering

It would be difficult to imagine working in IT without spreadsheets. They are an excellent tool for outlining and organizing the data that complex activities produce. However, despite the temptation and convenience they offer, spreadsheets are a dangerous tool for gathering the very data they present so well.

One of the major challenges for enterprise architects is obtaining the information they need to do their jobs. Even after they get past the common hurdles of gaining acceptance and cooperation, they still face the universal speed bump of getting enough time and attention from managers to supply the data their work requires.

Carrying the process one step further, managers (especially in the earlier stages of architecture maturity) are reluctant to take time away from their employees to obtain the necessary low-level inputs that lead to accurate measurements. Finally, the employees themselves tend to rush when providing inputs and (unless the information requests are very carefully formatted) will rate their efforts as high as they believe they can without being challenged.

For these reasons (and usually others), the data for the spreadsheets are gathered using spreadsheets, which creates accuracy issues. Looking more closely at the processes used to gather the data behind a maturity assessment, it is easy to see how a Level 2 enterprise could believe it has achieved a higher level. It is also understandable that what an outsider would perceive as an organization just barely starting Level 2 gets presented in a slide deck (another very useful and frequently abused tool) as a company well on its way to Level 4.

Two issues with using spreadsheets as the input for measuring architectural maturity are immediately obvious. The first is that spreadsheets are great tools for summarizing, but respondents need a thorough understanding of how to assess the value of the data they are providing -- and embedding that documentation in the spreadsheet itself usually produces a hard-to-read format. The second is that providing a separate document explaining how to complete the spreadsheet may seem like a simple solution, but it is just as simple for respondents to ignore that document -- especially respondents trying to finish quickly.

Most Web-based surveys have the same challenges as the spreadsheet; they just look better. The bottom line is that solving one problem (here, simplifying the information-gathering process) can create others, such as inaccurate data due to haste, misinterpretation of what is being requested, or both.
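One way to blunt the haste-and-inflation problem is to validate answers at collection time rather than after the fact. The sketch below is a hypothetical illustration, not a prescribed tool: each self-rating above an evidence threshold must carry a short justification, so inflated answers at least leave a visible trail instead of being silently accepted. The threshold value and function names are assumptions for the example.

```python
# Hypothetical collection-time validation of self-assessment answers.
# A rating above the evidence threshold must include a justification,
# making inflated self-ratings visible instead of silently accepted.
EVIDENCE_THRESHOLD = 3  # ratings above this require supporting detail

def validate_response(area: str, rating: int, justification: str = "") -> list:
    """Return a list of problems with one survey answer (empty if clean)."""
    problems = []
    if not 0 <= rating <= 5:
        problems.append("%s: rating %d is outside the 0-5 scale" % (area, rating))
    elif rating > EVIDENCE_THRESHOLD and not justification.strip():
        problems.append("%s: rating %d requires a justification" % (area, rating))
    return problems

print(validate_response("governance", 4))
# ['governance: rating 4 requires a justification']
print(validate_response("governance", 4, "quarterly review board minutes"))
# []
```

Checks like these do not make respondents honest, but they shift the accuracy problem from silent data corruption to an auditable gap the architect can follow up on.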


