Overdue Proof of Concept
I'm unaware of any in-depth research that tests how efficient OOP actually is compared with procedure-oriented programming, and I'm afraid OOP would fail that comparison far more often than many admit. With the possible exception of GUI components, I've never heard of an OOP success story that, on close inspection, demonstrated OOP's efficiency. OOP does allow you to hide your code from others, but there are non-OOP ways to do that as well. Black-box code does eliminate one category of programming error, but does it create other kinds of bugs?
Computer "science" is littered with the debris of abandoned ideas, and the struggle between those who want practical results and those who love airy theory is hardly new. Nearly three decades ago, the Basic language was introduced as a teaching tool, a way to teach programming to college students. Because its primary goal was clarity, Basic employed diction, punctuation, and syntax as similar to English as possible.
For a while it succeeded, but then things took a turn. In those early days, computer memory was scarce and processors were slow. Processor-intensive programs such as games and CAD had to be written in low-level languages just to compete in the marketplace. To conserve memory and increase execution speed, such programs were written first in assembly language and then in C, languages that conform to the computer's inner structure rather than to the programmer's natural language. For example, people think of addition as 2 + 2, but a computer stack may work faster if the program spells it 2 2 +. Likewise, the computer starts counting from zero, so to the machine little Ashley's first birthday party is her zeroth.
When fast execution and memory conservation were more essential than clarity, zero-based indices, reverse-polish notation, and all kinds of bizarre punctuation and diction rose up into programming languages from the deep structure of the computer hardware itself. Some people don't care about the man-centuries of unnecessary debugging these inefficiencies have caused. I do.
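The two machine-born conventions just described, postfix notation and zero-based indexing, can be sketched in a few lines. This is an illustrative toy, not any real language's implementation; the function name `eval_postfix` is invented for the example.

```python
def eval_postfix(expr):
    """Evaluate a postfix (reverse-Polish) expression such as '2 2 +'.

    A stack machine pushes operands and applies each operator to the
    top two stack entries, so it needs no parentheses or precedence
    rules -- convenient for the hardware, alien to the human reader.
    """
    stack = []
    for tok in expr.split():
        if tok == "+":
            b = stack.pop()
            a = stack.pop()
            stack.append(a + b)
        else:
            stack.append(int(tok))
    return stack.pop()

print(eval_postfix("2 2 +"))  # the machine-friendly spelling of 2 + 2

# Zero-based indexing: the first element lives at offset 0 from the
# start of the array, which is why "little Ashley's first birthday"
# is, to the machine, her zeroth.
birthdays = ["zeroth", "first", "second"]
print(birthdays[0])
```

Both conventions exist because they mirror how the hardware works (offsets from a base address, operands already on the stack), not how people think.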
The day when a programming language needed to accommodate the machine rather than the human has long since passed. Indeed, in Microsoft's Visual Studio .NET suite of languages, the source compiles to the same intermediate language whether you write in C#, C++, or Basic. But professionals and academics either haven't heard this news or are simply dismissing it. They continue to favor C++ and the other offspring of C, so colleges now turn out thousands of programmers annually who don't even know that serious alternatives to C++ or OOP exist. Countless academics point to OOP as the reason C++ is superior to C, neglecting to mention that C was an inherently painful language to use and that almost any abstraction would have been an improvement. C++ is a difficult language too; it's just not as difficult as C. That's faint praise.
Efficiency is the stated goal of C-style languages and OOP, but the result is too often the opposite:
- Programming has become bloated—ten lines of code are now needed where one used to suffice.
- Wrapping and mapping often use up programmer and execution time as OOP code struggles with various data stores.
- Massive API code libraries are "organized" into often-inexplicable structures, requiring programmers to waste time just figuring out where a function (method) is located and how to employ it.
- The peculiar, inhuman grammatical features in C++ and OOP's gratuitous taxonomies continue to waste enormous amounts of programming time.
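The bloat complaint in the first bullet can be made concrete with a deliberately contrived sketch. The `Greeter` class below is invented for illustration; it shows the kind of ceremony the author objects to, not any particular library's design.

```python
# Procedural style: one line does the job.
print("Hello, world")


# OOP style: the same output wrapped in class ceremony.
class Greeter:
    """A class whose sole purpose is to emit a single greeting."""

    def __init__(self, message):
        # Store the greeting as instance state.
        self.message = message

    def greet(self):
        # Return the stored greeting.
        return self.message


greeter = Greeter("Hello, world")
print(greeter.greet())
```

Defenders of OOP would reply that the extra structure pays off in large programs; the author's point is that it is imposed even where, as here, it buys nothing.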
The Future of OOP
At this point, it's difficult to predict whether OOP will fade rapidly like some intellectual fads or persist like the long, bad dream of Aristotelianism. However, with so many true believers in positions of power, OOP now has the fervor of a religion, its followers busy in every corner of contemporary computing. Many thousands of drone programmers labor under its spell because their places of work offer no alternative. And the profession is now guarded by a priest class that benefits from OOP's murk and mystery—the fewer people who can communicate with computers, the more secure their jobs.
If the professors introduce a new, enticing theory, perhaps OOP will subside. But I've been around long enough to know that the new theory could be even less efficient than OOP. To me, hope resides in the computer itself, not in us foolish humans. I expect the machine eventually to be capable of interpreting human instructions in human languages. When that happy day arrives, most OOP dogma will likely seem bizarre, wasteful, and irrational, just one more dead end in our fumbling efforts to communicate with intelligent machines.
Author Note: Mr. B. Jacobs has an excellent Web site devoted to debunking OOP. Take a look.