Application System
by: George North
Principles of Operating Systems I
CSCI 4401
Fall 1995
Instructor: M. Rasit Eskicioglu
The history of computing is punctuated by the increasing complexity of its
problem domain. That is to say, we expect that computer science
can and will solve society's everyday problems; the goal is to make computing
ubiquitous. Evidence of this over the last ten years is the (at first
reluctant, then) general acceptance of graphical user interfaces. This success
has placed computers in the hands of a significant percentage of the world's
population. Still, much more progress is needed before computers become
our everyday, unconscious surrogate servants. The largest problem to conquer
is acquiring the ability to develop software that is both extendible and
reusable.
Taligent argues that these problems can be solved by the use of object-oriented
design methods and frameworks. Taligent proposes an application
development environment built around its "Application System Framework."
TABLE OF CONTENTS
I. A Brief History of Computing
II. The Software Crisis
III. Taligent's History
IV. A Typical Development Model
V. The Problems, Restated
VI. Application Systems
VII. How Taligent is Different
VIII. Frameworks -- Shifting the Burden of Complexity
References
Taligent's Application System
a review by George North
"Success in the world of computing in the 21st century requires the
pragmatic application of object technology to solve real customer problems."
-- Joe Guglielmi, Chairman and CEO, Taligent, Inc.
I. A Brief History of Computing ...
Computers are called computers by accident. By some quirk of fate, it happened
that the first tasks put to computers were number-based calculations ...
computing. If those first tasks had been drawing, music editing, simulation,
or language processing, we would know this device by some other name. Information
engine (infoEngine) is a more general concept. More often than not, computers
are used to publish. Thinking about a computer as we think about a book
results in a much more general idea of what this device is.
Like the first books, the first computers were expensive and difficult to
access. And like books, over time computers have become less expensive and
easier to use. The task of making a computer useful is called programming.
Loosely, programming can be broken into two broad areas ... systems programming
and applications programming. More on applications programming later, first
let us consider systems programming.
An operating system is the control program for a computer. Its primary purpose
is to make the computer convenient to use. Its secondary purpose is efficient
operation. Operating Systems (OS) make a computer available to end users.
Convenience and efficiency, easy and fast, are often contradictory goals.
As we shall see, this is the number one problem facing computer science
today, and is the problem that Taligent is trying to address.
In 1963, Ivan Sutherland's Sketchpad was the first computer program to demonstrate
the possibilities of interactive computing. In 1968, Doug Engelbart demonstrated
the most important improvement in the computer interface to date, the mouse.
But in the 1960s, it was not generally accepted that computers needed
to be easy to use. After all, with computers being very expensive and very scarce,
why would it be necessary for the average person to feel comfortable interacting
with them?
One company, Xerox, did fund a research project -- Xerox PARC. The key insight
of this project came from an investigation into how people learn. A study
of children showed that the sense of touch predominates at a very early age;
vision predominates at ages 4, 5, and 6; and our ability for symbolic
reasoning develops in later years. These studies showed that touch
and vision are always employed first when learning a new task. Learning
to use computers was difficult because symbolic reasoning dominated the
task, and because touch and vision were of only secondary value. This work
led directly to the development of a computer with an interactive operating
environment simulating a world that was already understood. By the early 1970s,
the Alto was ready, at $45,000 a copy. But Xerox was not, and the
product never reached market.
The 1970's would be a turning point. In 1971, the first CPU on a single
chip was developed -- the microprocessor. In January 1975, the Altair 8800
was offered for sale ... a complete computer for $500.00. This was letting
the Genie out of the bottle. By the next year, Steve Jobs and Steve Wozniak
formed Apple Computer and became the father(s) of personal computing. The
Apple II was an affordable and useful computer, but not easy to learn and not
easy to use. In 1979, Steve Jobs visited Xerox PARC, and returned the next
day ... telling his employees: "our future just changed."
By 1981, IBM had decided that maybe there was a future in microcomputers
and introduced the IBM PC. IBM and Apple became arch competitors. With its
marketing clout, IBM quickly came to dominate the personal computer market.
But IBM really didn't care about the operating system. Having licensed DOS
from Bill Gates, with no OS development project of its own, IBM lost control
of its microcomputer business as quickly as it had created it. Still, PC-DOS
grew to market dominance, in a market increasingly controlled by Gates'
company, Microsoft.
In 1984, Apple introduced the Macintosh -- the first computer using an interactive
interface ... the first computer that was truly easy to learn and to use.
The Mac found a following among new computer users. It was not so successful
within the computing establishment. For many years it was strongly criticized
as a toy, with a silly little mouse, and its stupid graphical user interface
(GUI). It was being criticized for its best features.
Through the rest of the '80s, Apple and IBM remained arch competitors. But
the environment was changing. In the 1990s, Microsoft emerged as the controlling
force in computing. By fitting a GUI environment around DOS, Microsoft
legitimized the Mac's look and feel. Computer hardware was getting cheaper
and more powerful, but application software was becoming more
complex. At the very time when computers were becoming more affordable and
easier to use, software was becoming more difficult and more expensive to
develop.
A new generation of applications software was needed.
II. The Software Crisis ...
Taligent was founded in 1992 by Apple Computer and IBM to help both companies
address what is sometimes called the "software crisis": software
applications take too long to create and maintain and are too difficult
to learn and use.
This has been a familiar refrain since the earliest days of computing, and
such criticism is part of the evolution of any new technology. But the software
crisis has come to a head in the 1990s. New customer needs in the areas
of workgroup collaboration, workflow automation, information management,
multimedia, and data access and visualization are driving market growth
for 32-bit, multitasking computers and related hardware and software, to
the tune of half a trillion dollars so far in this decade. Unfortunately,
most application software for these new computers hasn't caught up with
their capabilities. Customers are clamoring for new kinds of solutions.
They're not looking for a better spreadsheet, word processor, or other "personal
productivity" software; they're looking for integrated enterprise solutions
that can handle the accelerating information needs of a global economy.
The applications that customers want are very difficult to create with traditional
software development technology. Sophisticated applications take years to
develop and millions of lines of code, and when they finally get finished
they are difficult to maintain and modify. At the same time, the economics
of commercial software development are exacerbating the problem. Only the
largest application companies are making money.
"In fiscal 1994, for example, Microsoft's revenue from the applications
portion of its business increased by more than $650 million to about $2.9
billion, while the total revenue for the software applications industry--including
Microsoft--increased by only about $550 million to about $8.3 billion (Dataquest
estimates quoted by O'Connor, 1994). In other words, the rest of the industry
experienced a collective decrease in revenue of about $100 million. Small
companies are being gobbled up by larger ones or going out of business,
and even some of the larger ones are facing severe losses. Both the risks
and the price of entry for software developers who want to bring innovative
technology to market are getting higher every day. This is a problem for
the entire computer industry."
Commercial software developers face a mature and unforgiving software market.
Innovative new products are becoming more and more difficult to develop
and market successfully. The software needs of many users aren't being met.
In many ways, the state of commercial software development reflects the
state of the industry as a whole.
Check out a computer trade magazine. What are the software mail-order houses
selling these days? Version 6.0 of a popular word processor. Version 4.0
of one presentation program, version 5.0 of another. New versions of two
or three big spreadsheet applications, of page-layout applications, of low-
and high-end graphics programs, of utilities and communications packages
and financial applications. But how many listings can you find for brand
new, version 1.0 applications?
How many new application categories have emerged in the past few years?
For example, how many new kinds of applications solve group and enterprise
problems? Is everyone so busy revising their current applications or porting
them to new platforms that they don't have time to develop new ones? Have
all the important applications already been written?
Why such big applications? Because each new version is loaded with more
and more new features. Most users don't use or even know about the added
features despite forests of dialog boxes and menus and submenus that hit
the bottom of the screen. All those new features have to be documented,
so of course manuals keep getting bigger too. You don't hear companies boasting
about their featherweight manuals anymore.
As the complexity of system software and applications increases, so does
the time it takes to bring a new product or a new release of an existing
product to market. The time it takes to get a 1.0 application to market
can easily exceed the funding window of a startup company with a good idea
and specialized understanding of a particular market. This is especially
true of applications for business computing environments. Another consequence
of the length of time between releases is that upgrade prices are approaching
what applications themselves used to cost. Companies that used to send out
upgrades free may now charge a hundred dollars or more for each one. Prices
have risen because the effort of producing an upgrade to a successful commercial
application is beginning to approach and even exceed the effort required
to get the original release out the door.
Ten years ago, a couple of typical programmers with a good idea could create
an application and have a chance of establishing a successful commercial-grade
product. Now it's more common to see 10-, 20-, or even 100-person teams
working on applications. Good applications still emerge from time to time
that are written by one or two gurus -- but there have never been
and never will be enough gurus to write all the software that needs to be
written.
Another reason applications are getting so big is that software
developers are often forced to add what amount to operating system capabilities.
No matter how quickly system software vendors like Apple, IBM, or Microsoft
develop system software extensions and refinements, they can never meet
all the specific system software needs of the big commercial developers.
Some developers claim that over half of their application code is really
system software needed to get their applications to work. They have to extend
the existing operating system, or work around bugs, or shield themselves
from parts of the system that work against what they're trying to do. Rather
than hiring people with expertise in a particular domain, such as publishing
or accounting, developers have to hire computer systems experts to write
user interface elements, printer drivers, line layout systems, fixed-point
graphics models, help systems, and so on. Application teams routinely
include more specialists in various areas of software engineering than in
the tasks that an application is supposed to perform.
The economics of software development are beginning to dictate the kinds
of applications that will and won't be developed. The cost of developing,
testing, documenting, duplicating, and legally protecting a major commercial-grade
horizontal or vertical application can exceed $50 million. That's just for
development; marketing and support can require similar expenditures. In
general, a successful mainstream application requires something on the order
of 100,000 units per year in sales to amortize the development costs, which
implies a market of, say, a million units per year if the application claims
a 10 percent share of a given segment. This means that certain kinds of
applications are likely to get written, and other kinds don't stand a chance.
Most software companies can't afford to target markets of 10,000 or 1,000
or 100 units per year.
Yet there are many more potential customers in focused, vertical
markets than in traditional horizontal markets. Despite the personal
computer revolution of the past decade or so, over half of the business
workers in the United States still don't even have computers on their desks.
This is likely due to the absence of software adapted to their specific
business needs.
The logical consequence of these trends is fewer and fewer profitable software
companies. Over the past several years, software market share has become
increasingly consolidated within a few large companies. A very small percentage
of software titles command the vast majority of total software sales. Big
companies are buying up little companies at an increasing rate, because
only the biggest companies can afford the scale of organization and expenditure
it takes to develop and sustain successful applications for today's computer
systems.
It's not hard to imagine where the commercial software industry is headed
if these trends continue to their logical extreme. Eventually, only one
software company will be making money--the company selling version 10.0
of its single all-purpose application, WordSheet. WordSheet 10.0 is developed
by the top 100 software gurus in the world, who all work for the company.
Tested by 250 test engineers, four years between releases, and costing $500
per upgrade, WordSheet 10.0 doesn't fit on floppies anymore. You have to
buy it on a CD.
These trends are bad enough for commercial software developers, but they
spell disaster for corporate software developers and developers in other
institutions, such as laboratories and universities, that create their own
software. In-house development teams most commonly consist of just two or
three people per project, although they can range up to thousands of programmers
in rare cases. Whatever their size, these teams usually develop custom programs
for use only within their own companies or as part of their company's products,
which may not sell in the high volumes common in the retail software market.
Fast turnaround is critical. They can't take two or three years to get an
application developed. Product cycles as short as three to six months are
desired.
Frequently late, over budget, and in many cases obsolete before they ever
get used, corporate software projects are notorious for creating bottlenecks.
When software is an essential part of a product, the slow pace of development
directly impedes a company's ability to differentiate itself and compete
in new markets. An article by W. Wayt Gibbs in the September 1994 issue
of Scientific American lists some of the more spectacular software disasters
in recent history, including the California Department of Motor Vehicles'
attempt to merge the state's driver and vehicle registration systems, which
cost over $43 million and was abandoned without ever being used; the failed
$165 million attempt by American Airlines to link its flight-reservation
software with the reservation systems of Marriott, Hilton, and Budget; and
the Denver airport's computerized baggage system, which contributed to the
delay in the airport's opening at a cost of hundreds of millions of dollars.
These are extreme examples, but similar disasters on a smaller scale are
now part of everyday life in America. The software crisis affects everyone,
including millions of people who never touch a computer.
Macintosh examples -- some examples from the history of the Macintosh computer,
itself an innovative hardware platform, illustrate this problem. The Macintosh
II, which Apple introduced in 1987, was the first Macintosh computer that
included a memory management unit (MMU). An MMU allows a computer to use
virtual memory, which involves treating hard disk space as if it were additional
RAM. But it was several years before Macintosh system software used this
capability. Millions of MMU chips went unused until System 7 was introduced
in 1991. Even then, support for virtual memory was only partial, because
System 7 didn't provide protected address spaces, shared memory, memory-mapped
files, locking of pages, or other features that take full advantage of the
MMU.
These kinds of hardware support problems aren't the fault of Apple's system
software engineers, who have learned to perform miracles with the legacies
they must live with. The problems arise because the original Macintosh system
software didn't anticipate the need for supporting new hardware capabilities.
For example, it didn't support color, multiple monitors of varying size,
hard disks, bus slots, or advanced networks. Despite these limitations,
Apple engineers came up with inventive solutions that support all of these
features today. But this support comes at a cost in terms of both the engineering
effort and the complexity of the solutions required.
III. Taligent's History
When Taligent was founded in early 1992, Apple and IBM had already committed
substantial resources to the exploration of object-oriented programming
(OOP) as the basis for a new generation of applications. At Apple, this
commitment went at least as far back as the early 1980s and the Lisa Toolkit.
Other object-oriented projects included the Object Pascal programming language,
the MacApp application framework, and (in spirit, if not in detail) the
HyperCard "software construction kit."
After furious internal debates in the late 1980s, a new project, code-named
"Pink," emerged. It was called Pink because, at a meeting in 1988,
key Apple engineers and managers settled on a direction for the company
by jotting down ideas on index cards and pinning the cards to the wall in
two groups: blue cards, representing technologies that could be supported
as extensions to the current system software for Macintosh computers,
and pink cards, representing technologies for a future dream system. The
technologies listed on the blue cards eventually formed the core of System
7, Apple's current system software. The pink cards listed precursors of
Taligent's object-oriented system software. IBM's involvement with objects
also extended back to the early 1980s and a series of pioneering projects
that were incorporated into software for System/38(tm) and eventually the
AS/400, one of IBM's most important hardware platforms.
Apple and IBM joined forces to form Taligent because they shared an interest
in catalyzing the development of a new generation of applications, based
on object-oriented technology, that would work the same way across an entire
organization, regardless of the underlying hardware platforms. In early
1994 Hewlett-Packard decided that it could also benefit from this approach
and became Taligent's third major investor and partner.
IV. A Typical Development Model
Today's software development techniques evolved from high-level procedural
languages and structured programming techniques that first achieved widespread
acceptance 25 years ago. At the time, procedural techniques involved a new
learning curve for programmers who were used to older styles of programming.
Nevertheless, the new approach succeeded because the practical advantages
it provided far outweighed the retraining costs involved.
Like procedural programming in its time, object-oriented programming is
creating a new generation of programming techniques, radically changing
the way programmers work, the kinds of software they produce, and the way
they maintain and upgrade software products.
When the Pink project first got under way at Apple in 1988, few existing
operating systems could meet the new system's basic requirements, which
included preemptive multitasking, lightweight threads, fast interprocess
communication (IPC), large numbers of protected address spaces, dynamic
shared libraries, and the like. The team therefore began to develop both
a new programming model for application development and a new microkernel-based
operating environment that could drive power-hungry applications by taking
full advantage of 32-bit (and higher) multitasking computers.
Since that time the low-level capabilities of a number of existing operating
systems have caught up with the basic processing requirements of the new
kinds of applications that customers want. Taligent's investors have continued
to develop their own operating systems, sometimes using Taligent technology
and sometimes supplying technology to Taligent for further development.
Most important, customers interested in Taligent technology have made it
clear that in addition to their need for new application capabilities, they
need to preserve their investment in existing data and associated applications
and operating systems. Any new kinds of applications must not only run on
a variety of existing operating systems but also be capable of interoperating
with existing procedural applications.
Although Taligent is still focused on its original goal of making applications
easier to create and use, the company's strategy has evolved in response
to the expressed needs of its customers. Instead of developing a
traditional operating system from the microkernel up, Taligent has created
a new category of system software, called an application system, that provides
a comprehensive set of services for building flexible, portable
application solutions.
V. The Problems, Restated
The sad stories in the previous section (evidence of a software crisis) can
all be traced to the same problem. The problem isn't the hardware, which
keeps getting cheaper and faster and more reliable. It's not the users,
who are more computer literate than they've ever been. It's not application
developers, who generally have many more ideas than they can successfully
develop. A single problem lies behind nearly all the difficulties faced
by the computer industry: the dominant paradigm for developing,
marketing, and deploying application software is stifling innovation.
Increased frustration and slower productivity are common as a technology
matures. It was equally true of the technology that preceded today's procedural
development methods. Twenty-five or thirty years ago, most computer programs
basically consisted of a long list of instructions and data. The processor
would start at the beginning of the list, follow all the instructions, and
output a result. This worked well for solving problems such as preparing
a payroll or performing a series of complex calculations, but not for other
tasks that people wanted computers to perform. As the hardware became more
sophisticated and more powerful, this kind of programming also became increasingly
difficult for programmers, who often wrote programs in relatively low-level
languages or even assembly language.
These frustrations led to some dramatic changes in programming techniques.
Within a few years, high-level procedural languages and structured programming
techniques, previously the domain of academics and a few teams of researchers,
achieved general acceptance in the marketplace. Rather than writing a list
of arcane instructions for the machine to read from top to bottom, programmers
could work in higher-level languages that made it easier to work with logical
abstractions such as loops and data structures.
Personal computer operating systems, such as CP/M, ProDOS, and MS-DOS,
began to emerge in the late 1970s and early 1980s. At first these operating
systems consisted of abstractions that represented hardware devices, such
as the terminal, printer, or disk drive. For example, instead of writing
a lot of code to move a specific kind of disk drive head from place to place
on a disk, programmers could write simpler code that used an operating system
abstraction to write a file to the disk. They could concentrate more on
the needs of the applications they were writing, using the operating system
to manipulate the hardware as necessary.
This led to the three-tier arrangement with clear-cut barriers between the
hardware, the operating system, and an application. To communicate across
the barrier between the application and the operating system, programmers
learned to use application program interfaces (APIs), the commands and definitions
supported by the operating system. This arrangement provided high-level
abstractions of the hardware for programmers to use instead of dealing with
it directly. The operating system took care of the hardware, so the application
didn't have to.
As time passed, operating system designers began to add more and more abstractions
to handle file management, printing, graphics, and other programming tasks.
Programmers in turn began to create more complex applications that took
advantage of these capabilities and added new ones. In the 1980s system
software such as Mac OS, Windows, and OS/2 expanded to include application
libraries that developers could use to add new capabilities to their applications,
including menus, windows, dialog boxes, networking, telecommunications,
interapplication communication, and so on.
For example, a Mac OS programmer needs to call just one function to display
a standard dialog box that allows users to scroll through lists of files,
directories, and disk drives; responds appropriately when the user clicks
or double-clicks various fields and buttons within the dialog box; and returns
control to the application only after the user has made an appropriate selection.
Although some tasks may require calling lower-level operating system functions
directly, in general the applications written for this kind of system software
include more calls to application libraries than to the underlying operating
system libraries.
As the application libraries and the underlying operating systems continued
to mature and acquire more capabilities, the number of APIs and programming
reference manuals and the complexity of the programming task began
to increase exponentially. But if an application needed more specialized
capabilities than some part of the system happened to provide (for example
a printer driver or a high-end graphics engine), the programmer usually
didn't have the ability to modify the underlying library. Instead, the programmer
either had to live with it as is or had to create a new one from scratch,
essentially replacing part of the operating system.
For example, Aldus Persuasion writes all its printer drivers from scratch.
Adobe Illustrator, which runs on both the Mac OS and Windows, ignores QuickDraw
(the Mac OS graphics engine) and GDI (the Windows graphics engine) and uses
its own PostScript graphics system instead. Developers spend a lot of time
trying to make their programs work with (or around) the underlying operating
system, and therefore less of their time working on the tasks their application
performs for the customer. Whether it calls application libraries or the
underlying operating system, half or more of a modern application's code
is essentially a form of system software.
VI. Application Systems
All this has resulted in the large operating systems and large applications
that dominate the market today. In effect, application developers
are pushing down into the domain of system software, spending more and more
energy filling in the pieces they need to build advanced applications
. At the same time, system software designers continue to expand their application
libraries to meet new user demands, pushing up into the application domain.
As the system expands, it becomes more difficult for proliferating teams
of software engineers to make all the pieces work together seamlessly, so
costs and time to market keep increasing, especially for multiplatform development
efforts. Complex new system solutions don't address the portability needs
of distributed applications.
It's as if system software engineers have built a wonderful house of cards
as high as it can possibly be built. Now only a few gurus have hands steady
enough to reach into the interior and change something without knocking
the whole structure down. And if you can't get a guru when you want to insert
a new card, you have to hire armies of people to hold all the cards steady.
The tension between the needs of developers and the increasing complexity
of operating systems is leading in the 1990s toward a new category of system
software that Taligent calls an application system, as suggested by the
right side of Figure 1. Instead of requiring programmers to incorporate
system-related code into their programs, an application system provides
a comprehensive set of integrated application and distributed services capabilities,
freeing the programmer to concentrate on code related to a single application's
problem domain.
Environments such as HyperCard for the Mac OS, IBM's CICS, Lotus Notes,
and Smalltalk each resemble certain aspects of an application system, because
they allow application developers to create applications without necessarily
using the underlying operating system directly. Although the lines between
these application environments and their underlying operating systems aren't
always clear-cut, they provide many default behaviors that save application
developers time and effort and ultimately benefit users as well.
Traditional operating systems abstracted the underlying hardware for application
developers, and in some cases, such as the UNIX system, allowed the
applications to be portable across different hardware platforms. Taligent's
application system abstracts the underlying operating system for application
developers and thereby allows applications created with those abstractions
to be portable across different operating systems.
Taligent's CommonPoint application system is portable across multiple host
operating systems and capable of interoperating with applications running
on those systems. The CommonPoint system provides robust support for distributed
computing and a user environment designed for people working together over
a network. CommonPoint application developers develop software by using
either host development tools or tools in the CommonPoint Developer Series.
The CommonPoint Developer Series includes the cpProfessional(tm) development
environment for prototyping and development and the cpConstructor(tm) user
interface builder for creating user interface elements.
The CommonPoint application system supports a comprehensive set of features,
including compound documents, 2-D and 3-D graphics, and real-time document
sharing and collaboration. It also provides a complete programming model
that is designed to work the same way on all host operating systems. Taligent's
investors plan to ship the CommonPoint application system with AIX, OS/2,
HP-UX, and Mac OS systems, and Taligent plans to sell it as a separate
software package that runs on other 32-bit systems such as Windows NT and
Windows 95. Taligent is vigorously pursuing ports to all popular 32-bit
operating systems and is building a network of OEM and distribution partners
to deliver its products.
Whenever the CommonPoint system (or a CommonPoint application) needs something
from the host, such as file system or networking services, it uses the standard
calling APIs provided by OS Services. These APIs are represented in Figure
2 by the gray layer below the CommonPoint and CommonPoint Developer Series
boxes. The CommonPoint implementation for each host translates those API
calls into the appropriate calls to the host system's APIs.
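This translation layer can be sketched in C++ as an abstract interface that portable code programs against, with one host-specific implementation per platform. The class and function names below are illustrative assumptions, not Taligent's actual OS Services API:

```cpp
// Hypothetical sketch of the OS-abstraction pattern described above.
// Portable CommonPoint-style code calls only OSServices; each host
// supplies an implementation that translates into its own system calls.
#include <cassert>
#include <cstdio>
#include <string>

// The portable interface: the "gray layer" of OS Services.
class OSServices {
public:
    virtual ~OSServices() {}
    // Report whether a file can be opened; portable code never
    // touches the host file system directly.
    virtual bool CanOpenFile(const std::string& path) = 0;
};

// One implementation per host translates the portable call into the
// host's own API (here, the C standard library stands in for a host).
class PosixServices : public OSServices {
public:
    bool CanOpenFile(const std::string& path) override {
        std::FILE* f = std::fopen(path.c_str(), "r");
        if (f) { std::fclose(f); return true; }
        return false;
    }
};

// Application code is written once, against the abstraction only,
// so it recompiles unchanged on any host with an OSServices port.
bool LoadDocument(OSServices& os, const std::string& path) {
    return os.CanOpenFile(path);
}
```

Porting then means implementing one new `OSServices` subclass per host, while `LoadDocument` and everything above it stays source-compatible.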
No matter what host it is running on, the CommonPoint system requires only
the lowest portion of host system software--file system calls, driver calls,
kernel calls, and so on. It doesn't rely on any of the host system's higher-level
features, such as graphics packages. Programmers writing applications for
the CommonPoint system don't need to call the host APIs directly (though
they can), and applications developed using the Taligent programming model
can be designed to be source-code compatible across all host operating systems.
Taligent's goal is to allow developers who follow the rules to port their
software simply by recompiling their code, although as a practical matter
some testing on different platforms will still be necessary.
In addition, applications written for the host operating systems can run
at the same time as CommonPoint applications and can interoperate with them,
thus preserving customers' software investments. Both the host operating
system and the CommonPoint application system use the same low-level host
protocols to communicate with the underlying 32-bit (and higher) hardware.
VII. How Taligent is Different
Taligent products facilitate new ways of working for both application and
system programmers. Most importantly, everyone plays by the same set of
rules. Code written by a developer for the CommonPoint application system
stands on the same technical footing as any code written by Taligent or
by Apple, IBM, or Hewlett-Packard for their versions of the system. Taligent's
use of object-oriented technology throughout all of its products, especially
its use of frameworks, makes this possible.
All Taligent products are built from OOP abstractions known as frameworks.
A framework is an extensible library of cooperating classes that make
up a reusable design solution for a given problem domain. Taligent uses
frameworks not only for application domains such as user interface, text,
and documents, but also for traditional system domains such as graphics,
multimedia, fonts, and printing; low-level services such as drivers and
network protocols; and development tools. Virtually everything that is implemented
as a library in a traditional operating system is implemented as a framework
in the CommonPoint system.
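The defining property of a framework, as opposed to an ordinary library, is that the framework supplies the generic flow of control and calls back into code the developer supplies by subclassing. A minimal sketch, with purely illustrative names:

```cpp
// Minimal sketch of a framework's inversion of control: the base
// class owns the generic design; a developer customizes it by
// overriding one hook. Names are hypothetical, not Taligent's.
#include <cassert>
#include <string>

// Framework-provided base class: the reusable design solution.
class Document {
public:
    virtual ~Document() {}
    // The framework drives this sequence; developers reuse it as-is.
    std::string Open() {
        return ReadContents() + " [rendered by framework]";
    }
protected:
    // The customization point the developer overrides.
    virtual std::string ReadContents() = 0;
};

// Developer code: only the domain-specific piece is written.
class TextDocument : public Document {
protected:
    std::string ReadContents() override { return "hello"; }
};
```

With a plain class library the developer's code calls the library; here the relationship is inverted, and that inversion is what lets the framework provide default behavior for free.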
Nearly everything in the real world can be described in terms of conceptual
frameworks--that is, within a frame of reference. Shopping in a grocery
store, driving a car, playing baseball, buying a house, starting a business,
or flying the space shuttle all involve working within clearly defined rules
and relationships while permitting great flexibility. You can buy a lot
of groceries or just a few; shop at a big supermarket, a corner store, or
a fruit stand at the side of the highway; pay with cash, check, credit card,
or coupons. Despite such variations, everyone knows what shopping for groceries
means, how it works, and how to interact with the other people involved.
Similarly, an experienced driver renting an unfamiliar car takes only a
few minutes to adjust the seat and mirrors and adapt to the new locations
of controls and dials, the feel of the steering, and the location of the
car's edges in relation to other vehicles and the edge of the road. By following
some simple rules and paying attention, the driver can interact safely with
other drivers of different kinds of vehicles going in different directions
at different speeds and navigate many different roads and freeways to reach
one particular destination among many possibilities.
Everything, that is, except software, and everyone except software developers.
It's as if programmers have to grow the food, build the store, stock the
shelves, and invent a monetary system each time they want to buy groceries.
Computer software may be the last major industrial product that's
still built from scratch each time.
Although the preceding comparisons may seem exaggerated, most programmers
would agree that developing software requires much more work than it should.
Taligent is attempting to change the programming paradigm by using object-oriented
frameworks for everything--not only for developing applications but also
for developing the CommonPoint system itself.
Taligent frameworks provide generic code at all levels of the system that
programmers can reuse and customize easily. Instead of using thousands of
interrelated APIs to write monolithic applications for a monolithic operating
system, developers can use individual framework APIs to create smaller,
more specialized programs that are fully integrated with the rest of the
system and with other programs. And instead of writing only applications
that sit on top of the operating system, developers can customize the application
system itself to create new kinds of tools, utilities, drivers, file formats,
networking protocols, data types, graphics primitives, drawing algorithms,
and the like, any of which can be products in their own right.
VIII. Frameworks -- Shifting the Burden of Complexity
Frameworks were first developed to free application programmers from the
chores involved in displaying menus, windows, dialog boxes, and other standard
user interface elements for personal computers. Frameworks also represent
a change in the way programmers think about the interaction between the
code they write and code written by others. Frameworks represent the next
level of abstraction beyond class libraries ... shifting the burden of complexity.
Larry Tesler, one of Apple's best-known OOP gurus, once pointed out that
you can never actually make things easier in software. You can only shift
the burden of complexity from one place to another. For example, the development
of very-large-scale integration (VLSI) technology shifted much of the complexity
that was formerly the domain of computer system engineers to the domain
of integrated circuit designers. This in turn freed computer system engineers
to create more complex kinds of hardware. Eventually, the building-block
approach that VLSI made possible led to the pervasive use of integrated
circuits in ways that were previously inconceivable, for example in appliances,
cars, toys and many other products.
Taligent wants to shift much of the complexity of software development
from application developers to application system engineers. Twenty
years ago, a typical operating system for a personal computer was relatively
simple, as shown by the leftmost column in Figure 4. An operating system
like MS-DOS or CP/M could run in 64K of RAM and from a floppy disk. Because
they had to fit in the same 64K of RAM, applications were also simple. But
users had to work hard to use the applications. They had to memorize commands,
keep track of obscure file names, and learn all kinds of esoteric details
and tricks to accomplish relatively simple tasks. Moving data from one application
to another was always difficult and often impossible. Users had to bear
much of the burden of complexity involved in running applications.
After the Macintosh computer and operating systems with graphical user interfaces
arrived on the scene, these relationships changed, as shown in the second
column in Figure 4. System software companies realized that if they could
make computers easier for users to use, many new users might buy their products.
To this end, they added user interface capabilities to their underlying
system software in the form of application libraries or "toolboxes."
Now applications could take advantage of windows, dialog boxes, input devices
like the mouse, and other innovations that greatly simplified things from
the user's point of view. But the complexity didn't just disappear; instead,
it was shifted to the application developers, who had to work a lot harder
to create applications, and to system software engineers, who had to provide
the new capabilities. The work was worth it though, because more users could
use computers with less effort, and applications could meet a wider variety
of user needs, leading to a net increase in the total complexity of the
problems that the applications could be used to solve.
Taligent maintains that half or more of what application developers
are doing today is system software work that no longer has to be their
responsibility. The computer industry can't expect this situation to continue if it intends
to meet customer demands for innovative, distributed applications that run
in networked environments. Taligent would like to shift the complexity of
computing once again, away from the application developer and into the application
system. The third column in Figure 4 illustrates one consequence of this
approach. Users still don't have to bother with the technical details of
the system, and the applications they use provide the same kinds of capabilities
as before, but small engineering teams can deliver a product in perhaps
6 to 12 months that might take a bigger team two years or more to develop
for the previous generation of operating systems. This by itself can dramatically
change the economics of software development.
One way or another, the software crisis must be solved. The ever-increasing
flow of information on which our society depends is overwhelming our ability
to process it. Computing has become so fundamental to the way we live and
work that software limitations affect everyone, whether directly because
of system failures or indirectly because the high cost of software development
inflates the cost of goods and services. However, technology by itself is
not enough to create the new kinds of applications customers are demanding.
REFERENCES
Cotter, Sean, with Mike Potel (Vice President, Technology Development,
Taligent, Inc.). Inside Taligent Technology. Addison-Wesley Publishing
Company, 1995.
The Machine that Changed the World. WGBH Boston and BBC TV, 1992.