Nine ASP.NET Site Navigation Problem Solutions: Part 2

Part one of this series introduced the first seven common site map navigation problems and their solutions. This part explores more advanced techniques with the final two problems:

  • Hiding unauthorized pages.
  • Including database-driven content in site map data.

The solution to the first problem requires a brief review of ASP.NET 2.0 authorization and page-level security, while solving the second involves extending the site map provider model and caching dynamic content using ASP.NET 2.0’s new SqlCacheDependency class.

#8: Hiding Unauthorized Pages
In ASP.NET 1.1, hiding unauthorized pages involved setting the visibility of LinkButton controls or preventing/enabling the execution of sections of code manually, using a call to User.IsInRole(). In contrast, ASP.NET 2.0 provides a configurable, extensible, no-code approach. Setting it up involves three steps:

  1. Configure the SiteMapProvider to use security trimmings.
  2. Configure the RoleProvider to retrieve roles.
  3. Configure page- or directory-level authorization rules.

I’ll explain each step in the following sections.

Step 1: Enable security trimmings


Enabling security trimmings forces the .NET Framework to limit siteMapNodes exposed by the SiteMapDataSource based on authorization information. Configure the SiteMapProvider to use security trimmings by adding a securityTrimmingEnabled=”true” attribute to the XmlSiteMapProvider in the application’s web.config file as shown below:

   <siteMap defaultProvider="XmlSiteMapProvider" enabled="true">
      <providers>
         <add name="XmlSiteMapProvider"
            type="System.Web.XmlSiteMapProvider"
            siteMapFile="Web.sitemap"
            securityTrimmingEnabled="true" />
      </providers>
   </siteMap>

It’s worth noting Microsoft’s warning that “Site-map files with more than 150 nodes can take substantially longer to perform security-trimming operations.” Microsoft recommends using the roles attribute (described at the end of this solution) to help mitigate this potential performance problem.

Step 2: Retrieve Roles
The .NET Framework uses RoleProviders to determine which users are in which roles. ASP.NET 2.0 ships with three pre-packaged role providers, or you can extend the RoleProvider class to handle your own authorization, as described below. The three pre-packaged role providers are:

  • The WindowsTokenRoleProvider retrieves role information from the groups to which a user belongs, but works only with Windows authentication.
  • The AuthorizationStoreRoleProvider works with Microsoft Authorization Manager (AzMan) and can retrieve role information from an Active Directory or XML file.
  • The SqlRoleProvider retrieves authorization information either from a SQL Server Express database located in your App_Data directory or from tables in your SQL Server database, which you can create with the aspnet_regsql.exe command.

Implementing these providers is not complex, but is beyond the scope of this article and is already well documented (see the Related Resources section for more information).
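For reference, wiring up the built-in SqlRoleProvider takes only a few lines in Web.config. This is a minimal sketch; the provider name and the connectionStringName value are illustrative, not taken from this article’s download:

```xml
<system.web>
  <roleManager enabled="true" defaultProvider="AspNetSqlRoleProvider">
    <providers>
      <add name="AspNetSqlRoleProvider"
           type="System.Web.Security.SqlRoleProvider"
           connectionStringName="LocalSqlServer" />
    </providers>
  </roleManager>
</system.web>
```

You can create the backing tables with a command such as aspnet_regsql.exe -E -S localhost -A r, which adds the role-management schema to the specified server.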

A Custom RoleProvider Example
Unfortunately, not all applications can use the built-in role providers. For example, what if you migrate an application from ASP.NET 1.1, or hold authorization information in Oracle or a custom data store?

If you want the benefits of using User.IsInRole() or declarative security, then you probably retrieve data from your data source on a Login page, store the roles in the authentication ticket, and add them as a new GenericPrincipal to HttpContext.Current.User in the Application_AuthenticateRequest event in your Global.asax file. If so, your Login page calls a function that looks like this:

   private void SetAuthenticationTicket(string strUserName) {
      string strRoles =
         GetCommaDelimitedRolesFromDataSource(strUserName);
      FormsAuthenticationTicket tkt = new
         FormsAuthenticationTicket(1, strUserName,
         DateTime.Now, DateTime.Now.AddMinutes(20),
         false, strRoles);
      string cookiestr = FormsAuthentication.Encrypt(tkt);
      HttpCookie ck = new HttpCookie(
         FormsAuthentication.FormsCookieName, cookiestr);
      ck.Path = FormsAuthentication.FormsCookiePath;
      Response.Cookies.Add(ck);
   }
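For context, the Application_AuthenticateRequest handler mentioned above typically looks something like the following sketch. It follows the standard forms-authentication pattern rather than reproducing code from this article’s download:

```csharp
// Global.asax sketch -- rebuild the principal on every request
// so that User.IsInRole() sees the roles stored in the ticket.
protected void Application_AuthenticateRequest(object sender, EventArgs e)
{
    HttpCookie authCookie =
        Request.Cookies[FormsAuthentication.FormsCookieName];
    if (authCookie == null)
        return;   // anonymous request; nothing to do

    FormsAuthenticationTicket authTicket =
        FormsAuthentication.Decrypt(authCookie.Value);

    // the comma-delimited roles were stored in UserData at login
    string[] astrRoles = authTicket.UserData.Split(',');

    // attach a GenericPrincipal carrying those roles
    HttpContext.Current.User = new GenericPrincipal(
        new FormsIdentity(authTicket), astrRoles);
}
```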

To link the above approach into the site map, you must create a new class, inherit from RoleProvider, and override the GetRolesForUser() method. In your custom class, the overridden GetRolesForUser() method will look nearly identical to the AuthenticateRequest logic in your Global.asax file:

   public class CustomRoleProvider : RoleProvider {
      ...
      public override string[] GetRolesForUser(string username) {
         // if the user is authenticated
         if (HttpContext.Current.User != null) {
            // the forms identity already carries the decrypted ticket
            FormsIdentity id =
               (FormsIdentity)HttpContext.Current.User.Identity;
            FormsAuthenticationTicket tkt = id.Ticket;
            // retrieve the list of roles assigned during log in
            string[] astrRoles = tkt.UserData.Split(',');
            return astrRoles;
         }
         return null;
      }
   }

The full CustomRoleProvider class is available in Listing 1 and in the downloadable code that accompanies this article.

Naturally, you need to tell ASP.NET about the existence of your CustomRoleProvider, and you do that in the Web.config file (adjust the type attribute to include your namespace):

   <roleManager enabled="true" defaultProvider="CustomRoleProvider">
      <providers>
         <add name="CustomRoleProvider" type="CustomRoleProvider" />
      </providers>
   </roleManager>

Step 3: Configure Authorization Rules
The last step is to configure which roles have access to which pages or directories. The best approach involves using authorization allow and deny statements in the Web.config file. The .NET Framework uses this information to automatically allow or deny users access to pages. What’s great is that the SiteMapProvider uses the same information to decide which siteMapNodes the SiteMapDataSource should provide (assuming you have enabled security trimmings). Consequently, without writing a line of actual code, you can make a site’s entire navigation display only the pages to which users have access.

Setting the allow and deny configuration statements is the tricky part. If your authorization rules are simple, you can create one directory for each role and allow and deny by directory, using configuration settings such as:

   <location path="AdminOnlyDir">
      <system.web>
         <authorization>
            <deny users="?" />
            <deny roles="ReadOnlyRole" />
            <allow roles="AdminRole" />
         </authorization>
      </system.web>
   </location>

The users=”?” statement refers to anonymous users. A users=”*” statement would refer to all users. Thus, the preceding configuration would deny unauthenticated users, deny users in the ReadOnlyRole, and allow users from the AdminRole.

Now, what if someone created a new role called ClerksRole, and forgot to explicitly deny it access to the AdminOnlyDir? ASP.NET 2.0 assumes that all users have access to all directories and all pages by default. This is the equivalent of putting an <allow users="*" /> at the root of your site, because permissions cascade down directories. The result is that any new roles will have access to the AdminOnlyDir even though they shouldn’t.

One could implement a more pessimistic approach to security by setting <deny users="*" /> at the root of the site, and then use allow statements to permit access to individual pages or directories. Here’s an example:

   <configuration>
      <system.web>
         ...
         <authorization>
            <allow roles="AdminRole" />
            <deny users="*" />
         </authorization>
         ...
      </system.web>
   </configuration>


This approach allows the AdminRole access to everything while denying all other roles access to everything unless they have been granted access explicitly. Specifically, the preceding configuration gives the AdminRole access to the AdminOnlyDir while the ReadOnlyRole (and any new roles) won’t have access. Remember that order is important: deny statements should always follow allow statements.

Roles Attribute
After setting up the Web.config security authorization rules, you may find you would like to display a page that a user does not have access to. For example, this situation might occur for the default page in a site. Users should see it, but it will require them to log in when they try to access it. You can override the security settings in Web.config with the roles attribute of the siteMapNode in the Web.sitemap file:

   <siteMapNode url="Default.aspx" title="Home" roles="*" />

Assuming forms authentication is set up correctly, ASP.NET will redirect users to the login page if they are denied authorization to Default.aspx.

The roles=”*” setting comes in handy in two additional circumstances: referencing external resources and increasing performance. You must set the roles attribute to * for pages that access external resources because ASP.NET cannot retrieve their authorization information from the Web.config. The same technique also increases performance because it bypasses the .NET Framework’s need to check each page’s permissions. This can help reduce the penalty of having more than 150 pages with security trimming enabled.

#9: Keeping Your Navigation Fresh
The problem of adding dynamic content to a site map reveals the full beauty of ASP.NET’s provider model. It requires little effort to extend the provider model to insert database-driven siteMapNodes into any section of the site map and continue to enjoy the benefits of the Web.sitemap approach described in the previous eight solutions. And with a little more effort you can cache the database content, refreshing it only when the database changes. This last solution shows how to surmount these two challenges.

Author’s Note: The caching discussion in the second half of this solution focuses solely on SQL Server 2005 and ASP.NET 2.0’s new dependency technology.

Build a Custom Site Map Provider
While a custom site map provider offers the most flexible and maintainable solution, there are two alternative approaches worth mentioning. First, one could reuse a single page in multiple siteMapNodes, because Web.sitemap supports querystring parameters in its url attribute. The problem is that maintaining the Web.sitemap for pages that change frequently, such as message board posts, would be a Webmaster’s nightmare.

Second, the siteMapNode supports a siteMapFile attribute. This attribute allows a siteMapNode to retrieve its content from a separate .sitemap file. You could employ this technique by creating/overwriting a .sitemap file from inside a database trigger; however, this approach would limit the scalability of your application, because the database and Web server would need to be on the same machine. A more appropriate use of the technique might involve using a manually maintained Web.sitemap that pulls data from a code-generator-maintained .sitemap file (Blue Ink uses this technique; see my bio for details).
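As a sketch, delegating a node to a separate file looks like this (the file name Products.sitemap is illustrative):

```xml
<siteMapNode siteMapFile="Products.sitemap" />
```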


Regardless, the best approach will most likely involve developing a custom site map provider. The process requires developing a new class that inherits from SiteMapProvider, identifying it in the Web.config file, and referencing it from the Web.sitemap. You reference a new provider from Web.sitemap using the final siteMapNode attribute: provider.

   <siteMapNode title="Categories" provider="CategoriesProvider" />

The provider attribute allows a siteMapNode to retrieve its content from a provider specified in Web.config. The value CategoriesProvider shown in the example references the Northwind.CategoriesSiteMapProvider class through a registration such as this:

   <siteMap defaultProvider="XmlSiteMapProvider">
      <providers>
         <add name="CategoriesProvider"
            type="Northwind.CategoriesSiteMapProvider"
            connectionStringName="NorthwindConnectionString" />
      </providers>
   </siteMap>

While a single site map provider must be the default, ASP.NET supports multiple site map providers, and they can reference each other. This approach makes it possible to compartmentalize sections of the site map. After configuring the Web.sitemap and Web.config entries, you must develop the custom provider.

As mentioned previously, each site map provider must ultimately inherit from the SiteMapProvider class; however, an intermediary class called StaticSiteMapProvider provides a partial implementation of SiteMapProvider that simplifies the task of loading site map information into memory. The StaticSiteMapProvider adds several methods to the base SiteMapProvider class, including AddNode(), RemoveNode(), Clear(), and BuildSiteMap().

The functions a custom provider will most likely need to override are BuildSiteMap() and Initialize() (from SiteMapProvider). ASP.NET calls Initialize() once during the lifetime of the application, making it an ideal place to retrieve configuration settings from the Web.config. In contrast, BuildSiteMap() is called frequently and must return the root siteMapNode. To increase performance, a custom site map provider should implement BuildSiteMap() to return a cached copy of the site map when possible. The following pseudocode shows the general flow:

   private SiteMapNode mnodeRoot = null;

   public override SiteMapNode BuildSiteMap() {
      if (site map has already been built)
         return mnodeRoot;

      Clear();
      SiteMapNode nodeRoot = GetRootNode();
      AddNode(nodeRoot);

      open a connection to Northwind
      select all categories

      foreach (category) {
         SiteMapNode nodeCategory = CreateCategoryNode();
         AddNode(nodeCategory, nodeRoot);
      }
   }

Unfortunately, the preceding approach suffers from two major problems. The first is that the site map information will never change unless it is manually invalidated (for instance by setting mnodeRoot to null in a maintenance section of the Web site). Ideally the site map information would automatically refresh if the data it displays changes. I’ll show you how to use the SqlCacheDependency object to solve this problem shortly.

The second problem is more subtle, but is extremely important. ASP.NET 2.0 implements SiteMapProvider (and in fact all providers) using the singleton pattern. Thus ASP.NET 2.0 shares one instance of any custom site map provider among all page requests, and thus among multiple threads. One benefit of this approach is that custom site map providers can store persistent data in member variables, but the disadvantage is that they must consider thread safety in any code that modifies the provider’s state. Therefore, you must modify the approach shown above to account for multiple threads attempting to build the site map at the same time. Here’s a modified example:

   private readonly object mobjLock = new object();

   public override SiteMapNode BuildSiteMap() {
      if (site map has already been built)
         return mnodeRoot;

      lock(mobjLock) {
         // re-check: another thread may have built the
         // site map while this one waited on the lock
         if (site map has already been built)
            return mnodeRoot;

         Clear();
         SiteMapNode nodeRoot = GetRootNode();
         AddNode(nodeRoot);

         loop through data {
            call AddNode()
         }
      }
   }

The code above ensures that no two threads will ever try to update the state of the site map provider at the same time. Notice that the code calls lock() only when the state will be updated; calling it earlier would reduce performance unnecessarily.

Detecting Changed Data
Wouldn’t it be great if the database could notify the site map provider any time its data changed without polling the server or using triggers? In fact, this is exactly what SQL Server 2005 can do with the new SqlCacheDependency class introduced in ASP.NET 2.0. To implement this new technology you must perform four steps:

  1. Configure the environment
  2. Initialize the database connection
  3. Implement the SqlCacheDependency
  4. Implement the call-back logic

Configuring the environment involves setting up the database and modifying Web.config. The documentation makes it sound like setting up the database is as easy as issuing the following statement, which turns on the Service Broker that handles callbacks:

   ALTER DATABASE Northwind SET ENABLE_BROKER

Unfortunately, there are a number of other subtle requirements. The owner property of your SQL Server database must contain a value. You can find the property by right-clicking the database, selecting Properties, and then selecting Files in the Properties dialog. You will most likely want to set the property to sa. Your database must also have a Compatibility Level of 90, which you can set under the Options category of the same dialog. Furthermore, the account your application runs under must be granted a plethora of permissions if it’s not an administrator account (one should hope not!). Here’s a SQL script that grants the required permissions:
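If you prefer to script the two dialog settings above rather than click through Management Studio, something along these lines should work on SQL Server 2005 (the database name Northwind is assumed):

```sql
USE Northwind;
-- give the database a valid owner
EXEC sp_changedbowner 'sa';
-- set the compatibility level to SQL Server 2005 (90)
EXEC sp_dbcmptlevel 'Northwind', 90;
```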

   GRANT CREATE PROCEDURE to [ASPNET];
   GRANT CREATE QUEUE to [ASPNET];
   GRANT CREATE SERVICE to [ASPNET];
   GRANT REFERENCES ON CONTRACT::[PostQueryNotification] to [ASPNET];
   GRANT VIEW DEFINITION to [ASPNET];
   EXEC sp_addrole 'sql_dependency_subscriber'
   GRANT SUBSCRIBE QUERY NOTIFICATIONS TO [ASPNET]
   GRANT RECEIVE ON QueryNotificationErrorsQueue TO [ASPNET]
   GRANT REFERENCES on CONTRACT::[PostQueryNotification] to [ASPNET]
   EXEC sp_addrolemember 'sql_dependency_subscriber', '[ASPNET]'
   GRANT SELECT TO [ASPNET]

You must also modify the Web.config file to notify ASP.NET to use the SqlCacheDependency. To do that, simply add a section along these lines within the <system.web> element:

   <caching>
      <sqlCacheDependency enabled="true" />
   </caching>

After the configuration is complete the custom site map provider must initialize the database connection with a call to SqlDependency.Start(). This initialization must occur once during the lifetime of the application, so clearly the Initialize() method is the place to do it, as shown below:

   public override void Initialize(string name,
      NameValueCollection attributes) {
      base.Initialize(name, attributes);
      string strConnectionName = attributes["connectionStringName"];
      if (String.IsNullOrEmpty(strConnectionName))
         throw new ProviderException(
            "The connectionStringName attribute is required.");
      mstrConnectionString =
         WebConfigurationManager.ConnectionStrings[
            strConnectionName].ConnectionString;
      SqlDependency.Start(mstrConnectionString);
   }

The preceding Initialize() method retrieves the connectionStringName attribute from Web.config and uses it to retrieve the actual connection string from the connectionStrings section. The call to SqlDependency.Start() prepares the database connection to accept dependency information.

The third step, implementing SqlCacheDependency, involves instantiating the class and passing it a SqlCommand. This must occur prior to opening the connection and using the command, because ASP.NET 2.0 sends the dependency information along with the command’s SQL or stored procedure (SQL Server 2005 can monitor either SQL statements or stored procedures).

   SqlConnection cn = new SqlConnection(mstrConnectionString);
   SqlCommand cm = new SqlCommand(
      "SELECT CategoryId, CategoryName FROM dbo.Categories", cn);
   SqlCacheDependency dependency = new SqlCacheDependency(cm);
   cn.Open();
   SqlDataReader sdr = cm.ExecuteReader();

Any SQL used by the SqlCommand must adhere to certain restrictions. First, it must specify each column explicitly, so SELECT * is not an option. Second, table names must use two-part names. For example, if the table belongs to the dbo schema (as in the preceding example), then the SQL must reference it with the two-part name dbo.Categories.

To complete implementation of the SqlCacheDependency, you need to insert the SqlCacheDependency into the HttpRuntime.Cache:

   HttpRuntime.Cache.Insert(
      "CategoriesDependency",
      new object(),
      dependency,
      Cache.NoAbsoluteExpiration,
      Cache.NoSlidingExpiration,
      CacheItemPriority.NotRemovable,
      new CacheItemRemovedCallback(OnSiteMapChanged)
   );

The preceding code inserts a simple object named CategoriesDependency into the Cache that will expire only when the information in the database changes. When that happens, SQL Server 2005 notifies ASP.NET 2.0, which then removes the object and calls the callback function OnSiteMapChanged().

The last step is to implement the callback logic. If BuildSiteMap rebuilds when mnodeRoot is null, then the implementation of OnSiteMapChanged() is simple:

   public void OnSiteMapChanged(string strKey,
      object item, CacheItemRemovedReason reason) {
      lock (mobjLock) {
         if (strKey.Equals("CategoriesDependency") && reason.Equals(
            CacheItemRemovedReason.DependencyChanged)) {
            Clear();
            mnodeRoot = null;
         }
      }
   }
Author’s Note: Don’t forget to lock the code because the function updates the state.

That completes the process. To confirm that everything works, try inserting the current date into your node titles before testing; the displayed date should change whenever the underlying data changes. You can see the entire class in Listing 2.

A Little Debug Help
There are quite a few moving parts, and since everything occurs asynchronously, troubleshooting the process can be difficult. If you’re having trouble, the first thing to verify is whether the notification subscription was successfully registered in SQL Server, which you can check with a call to:

   select * from sys.dm_qn_subscriptions

If that command fails to return anything, check the SQL Server logs, because they’re likely to describe the reason for failure. SQL Server Books Online contains a section about Troubleshooting Query Notifications.

Prior to ASP.NET 2.0, site navigation was either time consuming to develop or, more likely, involved numerous maintenance hours when the site changed. But thanks to the new features in ASP.NET 2.0, everything from the initial setup through advanced techniques such as security is remarkably easy to implement and maintain. After reviewing the nine solutions presented in this two-part article, you should be well on your way to greatly simplifying the life of your Web site administrator, or perhaps your own.
