Piloting Microsoft's Security Ship

Steve Lipner, Microsoft Director of Security Assurance, led the push earlier this year to retrain Microsoft employees in security principles. Find out what the Secure Windows Initiative is all about from the man who's making it happen.


CODE: What are you involved in at Microsoft, and what changes have been made in the security arena?

Steve Lipner: The Secure Windows Initiative (SWI) was created in early 2000, right about the time we shipped Windows 2000. Despite the name, it has always been focused company-wide with the aim of eliminating vulnerabilities from our products.

The focus is not on adding security features to products; there is another team of developers, a subsystem development group within Windows that does that. We really focused on making sure that the code doesn't have the kinds of vulnerabilities that put customers at risk. The Secure Windows Initiative had operated primarily by working with teams, by doing training, by helping them to conduct code reviews and design reviews, and we have done that through the Windows XP development process, as well as working with other teams.

Steve Lipner, Microsoft Director of Security Assurance
Toward the end of last year, it became apparent to us that there was more we needed to do to get Windows XP Service Pack 1 and Windows .NET Server to the point where we felt they should be.
"I was hired at MS to run the Security Response Center, the team that deals with discovered vulnerabilities, and I came here a little less than 3 years ago. I started working on security for a government contractor called The Mitre Corporation in 1970 and I worked at Digital Equipment and a small company called Trusted Information Systems that had a firewall product and got bought by Network Associates. So, I've been in the security business—computer security, network security—doing just a variety of things like research, commercial products, consulting, government programs at a bunch of places."
—Steve Lipner
At the end of last year, just as the .NET Common Language Runtime was being completed, they had similar concerns because that's a very security-critical component. Before they shipped, they stopped and said, "We're going to do nothing but security until we're comfortable that we've got the vulnerability rate down to where we can ship this product and it will be secure enough for our customers."


That process worked pretty well, and they had kind of a structure with a team of between one and two thousand people.

CODE: You're talking late 2001?

SL: They were actually doing this in November-December, so as we started to look at things we could do for Windows, some of the SWI folks were involved in that process. We talked to them and said, "OK, what does our plan look like for Windows?" For one thing, instead of between one and two thousand people, we had close to nine thousand. We've got a lot more code. In some cases, you've got older code (the CLR was new). But the basic outlines were pretty similar, so we put together a plan and a schedule in December and got buy-off on that up through pretty senior levels. Then in January, we did the organization, built training materials, and built a database to make sure that an individual developer would sign off on every file in Windows.

We put together sign-off criteria: Here are the things you check for. Here are guides, features, internal Web sites, all the things you need to do. The last week of January, a handful of our people—actually about four of them did all of it (Mike Howard was one of them)—trained 8,500 people—everyone who contributes to Windows—on writing secure code. There was a specialized course for program managers, for developers, and for testers. We have a room over in the conference center that holds about 950 people, and we filled it 10 times. Then, starting February 1, everyone stopped and focused entirely on security.

They focused first on threat models because that turns out to be a very important mechanism for finding potential vulnerabilities: do we have countermeasures for all of them? There was also code review, attack testing, and a design review to find places where we can reduce the attack surface of the product. You can make things not run or run with less privilege; you can provide what we call defense in-depth so that it takes two failures rather than one to attack a system successfully. By the end of March, we had made a batch of design changes and understood what we needed to do. We found and started to fix a batch of security vulnerabilities. Then people went back to work doing the actual changes, testing them in the normal product development cycle that culminates with the release of .NET Server late this year. A significant number of those changes are in XP Service Pack 1, which will release pretty soon now.

Editor's Note: The Windows XP Service Pack 1 is now available.
CODE: The people who went through this training, 8,500 or so, were the people primarily focused on Windows XP and .NET Server?

SL: In that exercise, yes. What the press has not picked up on is that we have done the same thing for Visual Studio .NET, SQL Server, Exchange Server, Commerce Server, and Office. Those other teams were staggered out between March and June/July, but with the same basic structure.

CODE: Did these other teams do the same "stop development" thing?

SL: Yes. It wasn't two months in every case; it was between a month and two. One of the things they focused on was threat modeling; some of them, SQL in particular, did their threat modeling first and then their more code-focused and testing-focused activities.

CODE: What ongoing changes affect how you develop your products?

SL: The book, Writing Secure Code, is a good introduction to the concepts. The thing that we are focusing on now is how to integrate that into our development processes. We did the threat modeling and security push late in the development cycle, so the next release of Windows (Longhorn) will probably still do something like the security push toward the end, but we will do threat modeling earlier in the process. That gives us the best leverage in getting the security measures into the product such as attack surface reduction, adding counter-measures, and improving the security by default.

Among the things we did during the security push was to use the threat models to prioritize our efforts. In other words, how exposed is this and how badly can something go wrong? Windows comes with lots and lots of sample code, and we know that end users will take the sample code and do the minimum they can to use it in their applications. So, we treated all the sample code as highest priority for security review during the security push because even though that sample code does not run as part of Windows, it is critical to developers who are building apps to run on our platform. We felt that if we are telling an end user how to build an insecure app, that's about as bad as we can do, so that was a key priority.

Another thing that we did after the security push was to build the operating system, Windows .NET Server, with the Visual C++ 7 /GS flag, to address the runtime buffer overrun problem. That doesn't keep you from writing a buffer overrun and doesn't keep a buffer overrun from interfering with your system, but it does greatly improve the odds against a problem. If someone exploits a buffer overrun, it will cause an error rather than getting hostile code running on your system. That is something that is available to your readers as they are building applications. They just turn on the flag, the performance penalty is inconsequential, and it provides an additional level of comfort with the code that comes out.
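To make the bug class concrete: /GS inserts a stack cookie that detects an overrun at runtime, but avoiding the overrun in the first place is still the developer's job. The sketch below (the `copy_bounded` helper is illustrative, not part of any Windows API) contrasts the unchecked `strcpy` pattern that /GS guards against with a bounded copy that refuses to overflow:

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* An unchecked strcpy() into a fixed-size stack buffer can write past
   the end and smash the return address; /GS places a cookie before the
   return address so such an overrun aborts the process instead of
   handing control to hostile code. A bounded copy like this one avoids
   the overrun entirely. Returns 0 on success, -1 if the source would
   not fit (in which case dst holds a truncated, NUL-terminated copy). */
int copy_bounded(char *dst, size_t dstsize, const char *src) {
    if (dstsize == 0)
        return -1;
    /* snprintf never writes past dstsize and always NUL-terminates;
       its return value is the length the full copy would have needed */
    int needed = snprintf(dst, dstsize, "%s", src);
    return (needed >= 0 && (size_t)needed < dstsize) ? 0 : -1;
}
```

The same discipline applies whether or not /GS is on: the flag is a backstop, not a substitute for bounds checking.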

Of course, with a .NET application you can't write a buffer overrun in the CLR. It has strong type checking and more of a runtime than we provide with the operating system, so the checking is actually integral to the managed code.

CODE: The Secure Windows Initiative started in early 2000, but there are lots of other initiatives being talked about, such as Trustworthy Computing. Is that just a marketing term, or a new department, or something else? Can you give us a road map?

SL: Trustworthy Computing is a company-wide initiative that is much broader than security. The objective is to get computing to the point that it is so solid that you don't have to think about it. When I pick up the phone, no matter whom I'm calling, I just punch numbers and know with a high degree of confidence that the quality is pretty good, the privacy of the call will be acceptable to me, and it's just going to work.

That's different from people's perception and the reality of computing today. The objective is to get to the point where computing is as solid as the phone system, the electric supply, and so forth.

The things we talk about as part of the Trustworthy Computing Initiative are security, privacy, availability, and supplier reliability and integrity. At this point there are major initiatives in place or in process in all of those areas. Last summer and fall, after the Code Red and Nimda worms, we launched what we call the Strategic Technology Protection Program, which was focused on providing management tools and deployment aids to help customers secure their systems.

CODE: The IIS Lockdown Tool and things like that?

SL: Yes, the IIS Lockdown Tool and the Baseline Security Analyzer were things we delivered out of the STPP, as well as the Software Update Service, which allows you to run a copy of Windows Update in your own organization. With Code Red and Nimda, the STPP was a short-term response, but we began to say, "What can we do to get ahead of this?" Out of that, to a large part, we came to the conclusion that this really did justify the level of corporate attention and commitment that the Trustworthy Computing e-mail articulates.

We have focused on security in every development group in the company, and that simply wouldn't have happened without the Trustworthy Computing Initiative.

CODE: Any new products or new branding of initiatives? How about Palladium?

SL: Palladium is a long-term direction for building a higher level of security into products by altering the hardware environment they run on. The hardware has the ability to tell that a piece of software came from Microsoft and has not been modified, for example.

CODE: As a result of the security review, what has changed with the OS products that can give a developer a higher degree of security in running on the Microsoft platform?

SL: The long-term goal is to reduce the number or severity of the security patches that we release. Some of the things you will see are in the area of Secure by Design, Secure by Default, and Secure by Deployment. Secure by Design is about what we do internally to make the systems more robust.

Secure by Default gets pretty visible because features that would have been sitting there running, whether you need them or not, are now disabled unless you need them. Features that would have been running with local system privilege, if they are running at all, are often now running with local service or network service privilege. Even if those services have a vulnerability, if you get into them there is much less that you can do. For example, look at the DLL search order. In previous Windows versions, the search order looked in the working directory for the DLL before the system directory. We have flipped that in SP1 and .NET Server. If I can get a DLL with the same name as a system DLL onto your system in a directory where data is being accessed, the system is still going to look for that DLL in the system directory first. That DLL I laid into your data directory does exactly no good.
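The search-order change can be modeled as a pure function: the loader walks an ordered list of directories and picks the first one containing the requested name. The toy below (the types and `find_dll` helper are illustrative, not the real loader API) shows why putting the system directory ahead of the working directory defeats a planted DLL:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Each entry models one directory in the loader's search order,
   with a NULL-terminated list of the DLL names "present" there. */
typedef struct {
    const char *dir;
    const char **dlls;
} dir_entry;

/* Return the index of the first directory in search order that
   contains the requested DLL, or -1 if none does. The first match
   wins, exactly as in the loader's directory walk. */
int find_dll(const dir_entry *order, size_t ndirs, const char *name) {
    for (size_t i = 0; i < ndirs; i++)
        for (const char **d = order[i].dlls; *d != NULL; d++)
            if (strcmp(*d, name) == 0)
                return (int)i;
    return -1;
}
```

With the pre-SP1 order (working directory first), a DLL planted beside a data file shadows the genuine system copy; with the SP1/.NET Server order (system directory first), the system copy is always found before the planted one is ever considered.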

That is a Defense In-Depth sort of thing. To exploit that, I had to get the DLL onto your system somehow, and I'm not supposed to be able to do that. Even if I did, Defense In-Depth says that it doesn't matter. Secure by Deployment is the things we released with the STPP: things that help you deploy securely for your environment.

Here is the story about Secure by Default. I was running some SP1 build and had to rebuild, so I wanted to copy all my data off to another machine. I created a share and made sure I had the rights to it, then I went to another machine to access the share and got "Access Denied." What was going on? It turns out that the default share permission on an SP1 system is Read (not Full), and permissions apply at the share level as well as to the files, so you have to grant access at both levels if you want to copy data into a share. That's Secure by Default in action.
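As a configuration sketch of the fix the anecdote implies (the share name, path, and grantee are illustrative, and the commands shown use later Windows tooling than the XP-era interview describes), write access has to be granted at both layers:

```shell
:: Share-level permission: the SP1 default is Read, so explicitly
:: grant CHANGE (read/write) when creating the share
net share Data=C:\Data /GRANT:Everyone,CHANGE

:: File-level (NTFS) permission: also grant Modify on the folder,
:: inherited by files (OI) and subfolders (CI); the effective right
:: is the more restrictive of the two layers
icacls C:\Data /grant Everyone:(OI)(CI)M
```

If either layer allows only Read, the copy into the share fails with "Access Denied" exactly as described.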



   