Bjarne Stroustrup Expounds on Concepts and the Future of C++

Danny Kalev asks Bjarne the hard questions about concepts and C++'s future.



Committees and Standardization

DK: You've said more than once that a 100-person committee "isn't a good forum for innovative design". This raises a more fundamental question: perhaps the days of an ISO standards committee are over? Ruby and Python use the "benevolent dictator" model for their standardization; Linux uses a community process. Do you think it's time to consider a different model of standardization that's more suitable for our era, with less red tape and more direct community involvement?

BS: If you read my papers, you'll find concern about the model for standardization/evolution (e.g. HOPL3 paper and HOPL2 paper). However, I'm not at all convinced that "more direct community involvement" will lead to improvement. The average "voice" in a debate about language evolution seems short-sighted, unduly influenced by fashion, and focused on a single feature at a time. Many believe in "simple perfect solutions" for complex issues, and few appear to have much respect for existing code (billions of lines of code) and practices (millions of programmers). Few are willing to stay with the design/standardization project for years or to do the massive tedious work involved in getting the details right. As I see it, "community projects" work much better when there is a well-understood conceptual framework already in place, e.g. as Unix provided for Linux. Choosing among fundamentally different design directions is harder. Also, please remember that C++ standardization is a volunteer effort. Had "C++" been able to afford even a small paid professional staff, things might have been different.



I'm not a great fan of dictatorships, however efficient and benevolent they may appear in the short run. It seems to me that most "benevolent dictators" eventually become corrupted by their powers and use them to push their own hobbyhorses and condemn things they don't like – often with less insight and reasoning than is required from "ordinary mortals."

For C++, the dictatorship model was never an option. After the initial years, there never was a single point of distribution or a single implementation. C++ is more deeply embedded in the fundamental infrastructures of many companies than are the scripting languages. Furthermore, in the late 1980s and early 1990s companies like HP, IBM, Intel, Microsoft, and Sun certainly did not want something as fundamental to their businesses as C++ to be de facto controlled by a single person over whom they had no control—I suspect that's still the case. The C++ standards effort was initiated by major users.

DK: The "design by committee" approach, which some critiques claim was applied in the case of concepts, is often a recipe for a failure. My favorite example is export (I later learned that John Spicer used the same analogy in Berlin. The minutes are available here). Why not restrict the committee's charter to standardizing existing, proven practice? After all, that's how we got STL—and that approach would probably expedite the C++0x standardization process.

BS: I doubt that many would have considered STL "existing, proven practice" in 1996. Also, there is a significant difference between library proposals and language feature proposals: It is relatively easy to try out a good approximation to a library proposal.

In Frankfurt, I pointed out that:

  • Some people insist: "Don't standardize without commercial implementation."
  • Major implementers do not implement without a standard (unless they see a commercial advantage from a proprietary design).

My suggested way of dodging the horns of this dilemma is to provide an experimental implementation as a proof of concept. This merely moves the debate to "What's a sufficiently complete implementation?" and "How accurately does the experimental implementation reflect the detailed standards proposal?"

For concepts, we had an experimental implementation (ConceptGCC) and a working STL using it. For some, that wasn't enough, but for me the actual problem was that too many people focused on implementation issues to the point where design issues were ignored. You need not just a design and an implementation, you need libraries built using them, and projects completed using those libraries. That would be convincing, but there isn't time and funding for such many-year experiments. Who would build projects and products on such an experimental foundation? And if someone did, how could you standardize something different from what was used in those massively expensive early experiments? Instead, we would see a proliferation of incompatible proprietary dialects and languages. The long lag between C++98 and C++0x is part of the reason that companies such as Apple and Microsoft have been pushing proprietary "technologies," but by no means the only one. Companies compete for market share and profits, not for the generality of their language designs.

So, evaluating a new idea is hard in general and harder in the context of a widely used language. I do not consider it impossible, though. I'm usually satisfied by an experimental implementation, even though it cannot be complete in all aspects; maybe a feature is not 100 percent implemented, maybe not all interactions with other features are implemented, maybe performance has not been tuned, maybe debugging support is missing or error messages are incomplete, etc. In every real case, we have to judge what's relevant and important to decide how much work is worthwhile. However, this is by no means all that's needed for a feature to be acceptable. In addition, I try to focus on the design itself, looking at simplicity, completeness, generality, extensibility, usability, and teachability.

We need to explore the design space by examining many variants of a design—often a slight variation of a design makes the resulting feature much more usable in the hands of ordinary programmers. That's the kind of possibility that a focus on implementation tends to miss. I try to construct collections of "typical uses" and "important uses" that can be tried out for the variant designs (the many papers on initializer lists and initialization on the WG21 site provide examples of this approach). Also, I try to write tutorials, and have students use early variants of a design. Invariably, we have to explore interactions with other language features and possible future extensions.
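
[As a concrete illustration of such "typical uses", here is roughly what the initialization forms that came out of that work look like in C++0x; the rules went through many iterations in the WG21 papers, so treat this as a sketch of the accepted design rather than of any one proposal:]

    #include <initializer_list>
    #include <vector>

    // A function taking an initializer list: one of the "typical uses"
    // that candidate designs were tried against.
    int sum(std::initializer_list<int> xs) {
        int s = 0;
        for (int x : xs) s += x;          // C++0x range-for
        return s;
    }

    int main() {
        std::vector<int> a = {1, 2, 3};   // copy-list-initialization
        std::vector<int> b{4, 5, 6};      // direct-list-initialization
        int x{};                          // value-initialization (x == 0)
        int y{7};                         // {}-initialization rejects narrowing
        return sum({x, y}) == 7 ? 0 : 1;  // passing a list to a function
    }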

If you think you have a quick and/or simple alternative to "design by committee" that does not involve either significant funding or corporate control, you are almost certainly wrong. In WG21, we try to minimize the effects of "design by committee" by having most design done by individuals "off line" and leaving larger groups to "polish" the resulting designs. Sometimes, that works.

John Spicer mentioned export in Frankfurt and so did I (and so—I suspect—did others). It is a popular source of examples of all kinds of things. However, I do not see a way to derive a single simple morality tale from it. I mentioned export as a lesson in the danger of trying to straddle a pair of technical alternatives, and in the need to make clear design decisions.

DK: The last sentence is very interesting. Do you mean that accepting both the "inclusion model" of template compilation (which, in practice, has been the only model in use) and exported templates into C++98 was similar to the attempt to merge two different concepts philosophies (Texas and Indiana, as Douglas Gregor calls them) into a consolidated proposal? I presume the lesson from both cases is that such compromises aren't successful. Is that correct?

BS: Yes, the mistake with export was to create a compromise design that supported both the inclusion model and the separate compilation model of template instantiation. Please note that (contrary to your question and to what many people believe) variants of both models were in industrial use. Because of the "variants" part, standardization was badly needed, and there was strong support for both models, so compromises were hard to craft: somebody would get hurt whatever decision we made.

In retrospect, enabling both models without an enforcement mechanism to ensure that every implementation was complete simply allowed vendors to decide the issue by implementing only the inclusion model. EDG implemented export, despite serious reservations, and got burned, because people were not willing to use export as long as major vendors weren't implementing it. I, at least, had not anticipated that outcome and (in retrospect only) think that the committee failed in not making a clear choice. However, before criticizing the committee too harshly, please remember that choosing "pain now" over "possible pain later" is very hard for an individual, and harder still for a group, however talented and well meaning.
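
[For readers who never used it, a rough sketch of the two models; the export variant was implemented essentially only by EDG and was later removed from the language, so only the inclusion-model version compiles with mainstream compilers:]

    // Inclusion model: the full template definition lives in the header and
    // is re-seen (and re-instantiated) by every translation unit using it.

    // twice.h
    template<typename T>
    T twice(T v) { return v + v; }

    // Export model (the C++98 'export' keyword): the header carries only a
    // declaration...
    //
    //   // twice.h
    //   export template<typename T> T twice(T v);
    //
    // ...and the definition is compiled separately, in its own source file:
    //
    //   // twice.cpp
    //   export template<typename T> T twice(T v) { return v + v; }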

I find it hard to characterize the fundamental differences between the concept proposal variants. Others are less uncertain; that's part of the problem. The hard question is "which differences are fundamental?" Certainly, most of the heat has been over the role of concept maps. I see them as a necessary mechanism for mapping features (as described above) only. Others see them as good in themselves (even if empty, adding no types or operations). The pre-Frankfurt concept design allowed for concepts for which concept maps were required even if empty ("explicit concepts") and concepts for which concept maps were optional unless a mapping was needed or an ambiguity needed to be resolved ("automatic concepts"). My contention (in the "simplifying the use of concepts" paper) is that concept maps are not a suitable mechanism for ambiguity resolution. Further, directly addressing your question, the discussions over the proper use of concept maps should be settled in the language rules (by the standards committee), and a single kind of concepts and a specific mechanism for ambiguity resolution should be provided. The alternative is for bodies of code based on different philosophies to emerge, which would not interoperate smoothly, and for the discussions to become serious fights among users about the proper way to program.
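
[To make the vocabulary concrete, a rough sketch of the pre-Frankfurt design, in the style of the C++0x working-paper drafts and ConceptGCC; this syntax was never standardized and does not compile with any shipping compiler:]

    // An "explicit" concept: a type models it only if a concept_map is written.
    concept Monoid<typename T> {
        T identity();
        T operator+(T, T);
    }

    // An "automatic" concept: any type with a suitable operator< models it;
    // a concept_map is optional.
    auto concept LessThanComparable<typename T> {
        bool operator<(const T&, const T&);
    }

    // A concept_map adapts a type to a concept, supplying (mapping) any
    // operations the type does not already provide.
    concept_map Monoid<int> {
        int identity() { return 0; }
    }

    // A constrained template: only types satisfying the concept are accepted.
    template<LessThanComparable T>
    const T& min_of(const T& a, const T& b) { return b < a ? b : a; }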

DK: I sense disappointment and frustration among C++ users (see this discussion for example). Apparently, it's not just concepts. Programmers constantly complain that badly-needed features (garbage collection, reflection, GUI) never make it into C++, whereas too much emphasis is put on features that seem academic and esoteric. Is C++ at a dead-end as some of these commenters claim? Has the language become too complex to allow healthy and timely evolution?

BS: One problem is that there is no agreement as to what constitutes a "badly-needed feature" and some of the loudest voices are in no mood to discuss: They think they know. I'm personally in favor of some suitable flavor of GC, and C++0x does provide a GC ABI. Sadly, I have no hope for a standard GUI because there are too many proprietary ones out there, and too many powerful organizations prefer their own GUI as a lock-in mechanism over any potential standard. My suspicion is that "academic and esoteric" is usually just FUD meaning "something I don't have in some other language that I like." I remember when class hierarchies and virtual functions were popular targets for such claims.

There has never been a shortage of claims that C++ was a dead end—often by people busily imitating it.

Evolving C++ through the standards process is indeed slow and difficult. It can also (as I said above) be worthwhile beyond any simple experiment. In terms of alternative models for evolution, corporate languages evolve faster because they have much more funding, because they don't have to serve multiple communities, and because they don't have compatibility requirements comparable to C++. Remember Visual Basic? C++ was often criticized for not evolving as fast as VB, but C++ has chosen a model of evolution that precludes massive breaking of existing code. I still think that an industrial research lab (think "Bell Labs at the height of its powers") is the best nursery for practical new ideas.

Much of the C++ community has yet to catch up with C++98. In some areas, so have other language communities.

DK: Let's talk about the future. What's going to happen now? Will concepts be added by hook or by crook to the language (say in the form of a Technical Report), or has the time come to shelve concepts for now and focus on features that Joe and Siddhartha Coders would like to see in C++? Has the removal of concepts set a precedent for further radical changes in the WD?

BS: Wow, that's quite a set of leading/biased questions.

First, we complete and ship C++0x with all the improvements we have approved (such as concurrency support, an improved and extended standard library, uniform and general initialization, lambdas (closures), move semantics, general constant expressions, etc.; see my C++0x FAQ). I don't see further major changes coming along for this round of standardization. C++0x will deliver major improvements to the C++ community, and some parts are already available from GCC, Microsoft, and others. It will feel like a new language.
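
[Several of those features fit in a few lines; a minimal sketch, compilable in the C++0x mode of compilers that implement these features:]

    #include <algorithm>
    #include <utility>
    #include <vector>

    // A general constant expression: evaluable at compile time.
    constexpr int square(int x) { return x * x; }

    int main() {
        // Uniform initialization via an initializer list.
        std::vector<int> v = {1, 2, 3, 4, 5};

        // A lambda (closure) capturing a local variable by copy.
        int threshold = 3;
        auto big = std::count_if(v.begin(), v.end(),
                                 [threshold](int x) { return x > threshold; });

        // Move semantics: v's buffer is transferred to w, not copied.
        std::vector<int> w = std::move(v);

        static_assert(square(4) == 16, "evaluated at compile time");
        return (big == 2 && w.size() == 5) ? 0 : 1;
    }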


Secondly, C++ simply cannot follow every fad. To be useful, it must evolve slowly and compatibly. If you want a fashionable language, a language with all the latest bells and whistles, a minimal language, or a language deeply integrated into a proprietary system, etc., you know where to find such things. However, if you want a language that's useful for programs with a lifespan of decades, with flexibility that is up there with the best, with performance ahead of most of the competition, with a generality that is unmatched, and running on just about any hardware, try C++.

Thirdly, many people will try to find a good and proper way to get concepts—in some form or other—into C++. Nobody plans to get concepts "by hook or by crook;" that's not the way C++ standardization works. By building directly on the pre-Frankfurt concepts and applying modifications along the lines I suggested in "simplifying the use of concepts" we could have had something acceptable (IMO) within months, but that will not happen now. Instead, we must re-evaluate the basics of concepts and rebuild more or less from scratch (relying on the experience gained so far)—and experiment. My stated estimate is that that will take on the order of five years, but that it will happen. As ever, "concepts" must compete for resources and attention with other ideas. If people really do prefer (say) reflection in some form, I suspect they will work on that instead.

I do not think concepts are suitable for a TR (Technical Report) because they are too deep in the type system. A good TR topic is something isolated enough to be optional. Type-system issues are "all or nothing."

DK: So overall, you're optimistic about the future of C++?

BS: As ever, I'm cautiously optimistic. I am not—and never was—of the "Rah rah! My language/tool/design-philosophy/whatever is the solution to all your problems and will take over the world tomorrow" school of optimism. I'm quite happy to see C++ "just" support a few million programmers well, and dominate large areas of software infrastructure and advanced applications development. I consider "software infrastructure," embedded systems, systems software, and resource-constrained applications huge and growing areas of concern, and C++'s natural "niche." C++0x will be a better tool for demanding software development than C++98 is.

For now, the committee needs to wrap up and ship C++0x ("x" is becoming hexadecimal) and implementations must be completed (some features and libraries are shipping already), so that people can start benefiting from the committee's hard work.

About Bjarne Stroustrup

Bjarne Stroustrup is the designer and original implementer of C++. He is a founding member of the ISO C++ standards committee and a major contributor to C++0x. He worked at Bell Labs and AT&T Labs and is now a professor at Texas A&M University. He is a member of the USA National Academy of Engineering, an ACM Fellow and an IEEE Fellow. His publication list is as long as your arm, including Programming: Principles and Practice using C++ and The C++ Programming Language.


Danny Kalev is a certified system analyst and software engineer specializing in C++. He was a member of the C++ standards committee between 1997 and 2000 and has since been involved informally in the C++0x standardization process. He is the author of "The ANSI/ISO Professional C++ Programmer's Handbook" and "The Informit C++ Reference Guide: Techniques, Insight, and Practical Advice on C++."