
Related Web Works


http://acm.org:82/siglink/ECHT94TR/Engelbart.html

Long Distance Perspectives on Hypermedia

Keynote Address at ECHT94 by Douglas C. Engelbart and Christina Engelbart, Bootstrap Institute

By: V. Balasubramanian Hoffmann-La Roche & Rutgers University Email: bala@pegasus.rutgers.edu

Helen Ashman Defence Science and Technology Organization Email: helen.ashman@dsto.defence.gov.au ---------------------------------------------------------------------------- The keynote address by Doug Engelbart was one of the best-attended and most thought-provoking events at ECHT94. He is one of the early pioneers of hypermedia, having realized its potential as early as 1968. In sharp contrast to Ted Nelson's animated presentation at Hypertext '93, Doug Engelbart was soft-spoken and demonstrated his ideas with great clarity. Participants who attended both conferences could see the marked difference in the working styles of two great visionaries in the field.

The emphasis of the speech was computer support to boost the collective IQ of communities or the collective intelligence of corporations. Engelbart said this has been the focus of his research ever since the development of Augment/NLS in 1968. He said that the theme behind NLS was to help people operate effectively and efficiently within the domain of complex information structures. He mentioned that research stopped in the mid-70s because the research community believed that he was going in the wrong direction. Between 1978 and 1988, a number of corporations were using NLS. In 1988, the project ran out of resources and could not be supported anymore. However, the Augment system was kept alive by Engelbart on a DEC 20. He said he had just received some funds from ARPA to port Augment using VisualWorks for the front-end.

The objective behind the formation of the Bootstrap Institute is to enable the emergence of high-performance organizations. In order to boost collective IQ, Engelbart emphasized that we need a revolution and cannot stick to short-term, market-oriented strategies. He proceeded to explain in detail the concepts behind the Bootstrap Paradigm. The central theme of the paradigm is CoDIAK, which stands for Concurrent Development, Integration, and Application of Knowledge. It is expensive for organizations to change human systems comprising paradigms, organizations, procedures, customs, methods, language, knowledge, training, and attitudes. To augment the capabilities of human systems, we should focus on the capability infrastructure, that is, the co-evolution of human systems with tool systems such as facilities, media, tools, machinery, vehicles, etc. He felt that today's tool systems match the needs of human systems only to a very limited degree, whereas boosting the collective IQ requires a match many times greater in magnitude.

The bootstrap concept comes from the fact that in order to boost an intellectual activity A, we need a quality-driven process called B, which in turn needs a boost through another activity C. Such an approach would reduce the improvement-cycle time in intellectual activities. Organizations deal with the external environment by scanning, ingesting, and interacting. They also work with various sources of information such as recorded dialog (memos, minutes, rationales, commentary, reports), intelligence collection (articles, books, proceedings, brochures, surveys), and knowledge products (plans, proposals, reports, specifications, source code, etc.). All these diverse sources should become part of the hypermedia or CoDIAK environment. Documents must have implicit, intrinsic addressability to support linking. He stressed the concept of universal addressability, similar to Ted Nelson's "docuverse" or "all for one and one for all" concept. This calls for the design of an Open Hypermedia System (OHS). Such an OHS should provide the necessary infrastructure for collaborative knowledge work, providing structure, object linking, viewing, browsing, shared screens, scripting, etc.

Engelbart felt that while everyone is excited today about e-mail, conferencing, etc., these concepts were demonstrated 12 to 13 years before the invention of the PC! He felt that there was a decade of lost opportunities due to the proliferation of PCs and DOS. Engelbart cautioned designers not to design systems that are merely easy to use. Ease of use most often limits the potential of a system. The underlying system should be geared toward high-performance organizations and individuals and should not be constrained just because it must be easy to use. He was of the opinion that if we do not explore new opportunities now, as we become increasingly global, we will all collectively run off the track and crash! He encouraged researchers to collectively talk about collective IQ and drum up research funds. He felt that his greatest contribution to the field would be to change our vocabulary so that we think and talk about collective IQ.

Engelbart demonstrated the concepts of universal addressability, unlimited jump facilities, backtracking, outlining, and command-sequence scripting by showing Augment/NLS, the way it was implemented 26 years ago. He also showed the use of a chord keyset to optimize finger movements and accomplish more by combining command sequences. He added that people should be able to mail links along with documents.

To share with the audience more details of his ideas, Engelbart later conducted an informal session. During this session, he showed a film from 1968 demonstrating the rich hypermedia features of Augment/NLS. It had many present-day hypermedia features such as static links, basic search capabilities, an outline or overview mode (of documents), and the presentation of pictorial information. The video showed that the demonstration was taking place over a pair of networked computers, one of the very first such connections. It seemed that many of the hypertext systems existing today have very little more to offer than this very early prototype, and this includes the current versions of World-Wide Web browsers. Most certainly, many of the fundamental principles embodied in today's hypermedia systems were already present in Engelbart's prototype of 1968.

Engelbart's research group was also responsible for a number of other important inventions, such as the computer mouse, the chord keyset, the word processor, and the point-and-click user interface. He showed an amusing slide of the very first mouse ever made, and explained why his two-wheeled mouse was superior to the present-day trackball mouse. The two-wheeled mouse had one wheel each for horizontal and vertical movement. Hence, moving the mouse on a single wheel meant that perfect horizontal or vertical motion could easily be achieved, unlike the present-day mouse, where the pointer seems to stray with even the slightest hand movement.

It certainly appears that a great deal of innovative work has come from his research laboratory over the past three decades. Many of these innovations have taken some time to appear in general use, and of those that have appeared, few are recognized as having originated from Engelbart's laboratory. The informal session showed us just how much of our computer heritage we owe to his early vision and experimentation. It was regrettable that attendance at this informal session was so poor, owing to the whiskey-tasting event that was being held simultaneously.

We were all delighted to have someone of Engelbart's caliber and excellent reputation to give the keynote presentation, and Ian Ritchie and the Conference Committee must be commended for arranging it. However, Douglas Engelbart's speech left us all wondering: how far have we really progressed in the twenty-six years since Engelbart's pioneering work? How much more is there to do before we can fully realize his vision? ---------------------------------------------------------------------------- Keith Instone / instone@acm.org / 94-12-20

Project Bootstrap's Goal

By: Walter Vannini Human Systems Email: vannini@uqbar.cirfid.unibo.it ---------------------------------------------------------------------------- Professor Douglas Engelbart opened the Conference with a speech titled "Long-Term Perspectives in Hypermedia." Engelbart's speech described his current "Bootstrap" project's goal, no less visionary than hypertext must have seemed thirty years ago: augmenting the intelligence of society as a whole-people, companies and countries. Whole kit and caboodle.

The basis of this goal is simple: intelligence in an individual (system) is not only the brains (computing power) but the nervous system as well (information transport and distribution). Also, in the case of human social systems, people, tools and goals are not independent variables. Thus, he proposed to tackle the problem from both sides, tools and people. 

Regarding tools, Engelbart stressed the importance of improving the "nervous system" of our computers, their ability to collect, retrieve and distribute information. As far as people are concerned, software should instead be rethought from the ground up, with effectiveness for the user and for the task as the main goals. The conclusion, unspoken but apparent, is that pure computing power is no longer the key issue; it is often overabundant, while too much software is designed the wrong way, multiplying automated menial tasks while not even trying to reduce the time necessary to perform complex ones.

He pointed out how almost all software takes on only the most cognitively trivial tasks; for instance, word processors are great at typographical tricks, but are absolutely worthless at helping people refine vague ideas so that they not only print well, but convey meaning.

Engelbart also had something to say regarding companies, which he deemed incapable of any serious long-term strategy. This short-sightedness is responsible for an enormous waste of effort, because short-term progress hardly ever matures into long-term progress. As an example, Engelbart pointed to networking technologies: initially developed during the '60s and '70s and very well grounded in the research community, they were completely ignored by the PC industry, only to be rediscovered in the past few years.

Not content with saying things many people would rather not hear, Engelbart went further and suggested that one of the many spinoffs of his Bootstrap Project, the "ABC methodology for improvement," could be used to bring some perspective into industrial development. The method works like this: A is any company activity or group; B is a group in charge of improving A's production time; C is another group in charge of improving B's improvement-of-A time. Trivial? No. As he pointed out, C is independent of either A or B. This means that C-level groups might collaborate across different departments, companies or even countries. Now, this is what I call long-term perspective!
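The ABC levels can be sketched in a few lines of code. This is purely a toy illustration of the idea as reported above, not anything Engelbart presented; the class names, the multiplicative "gain" model, and the numbers are all invented for the sketch.

```python
# Toy model of the ABC improvement method (all names and numbers invented).

class Activity:
    """An A-level activity: some line of work with a cycle time (in days)."""
    def __init__(self, name, cycle_time):
        self.name = name
        self.cycle_time = cycle_time

class Improver:
    """A B-level group: improves one specific activity's cycle time."""
    def __init__(self, activity, gain=0.9):
        self.activity = activity
        self.gain = gain  # each improvement pass multiplies cycle time by this
    def improve(self):
        self.activity.cycle_time *= self.gain

class MetaImprover:
    """A C-level group: improves how B-level groups improve. Crucially,
    it is independent of any particular A, so one C group can serve
    B groups across departments, companies, or countries."""
    def improve(self, improvers):
        for b in improvers:
            b.gain *= 0.95  # make every B group's improvement step stronger

a1 = Activity("shipping", 10.0)
a2 = Activity("billing", 20.0)
b1, b2 = Improver(a1), Improver(a2)
c = MetaImprover()

c.improve([b1, b2])        # one C group sharpens both B groups at once
b1.improve(); b2.improve() # then each B group improves its own activity
print(round(a1.cycle_time, 3), round(a2.cycle_time, 3))
```

The point of the sketch is structural: `MetaImprover` never touches an `Activity` directly, which is exactly why C-level work can be shared across otherwise unrelated A's.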

Engelbart closed his speech mentioning the "interface problem", which seems to be at the center of today's debates. He pointed out how "user-friendliness" might be more a pitfall than a useful concept in application design as "user-friendly" usually means "friendly to the novice user:" skilled users are actually penalized by interfaces conceived for casual, naive users. Tough tasks remain tough, with the added complication of facing programs which treat you as if you were the dumb one.

Engelbart's speech was a perfect keynote for the ECHT Conference, because it pointed out what the hidden, unspoken feeling of the Conference would be: technology is not the issue anymore. We already have plenty of that. The point now is to build a technological culture to: 1) help developers solve real-world problems for real people and 2) help people see the enormous changes this technology will bring, not as a curse but as another step in the evolution of society.

The only weak point I saw in the speech was the demonstration of his NLS/AUGMENT system. It was cool, and it made you shiver to think it's more than twenty years old, but it added almost nothing to what Engelbart had already said. His arguments were at such a high level that I felt the demo even had a trivializing effect. Douglas Engelbart once more did what he's always been good at: showing us bits of our own future and ways to build it. Now the problem is, we won't be able to say we didn't hear. ---------------------------------------------------------------------------- Keith Instone / instone@acm.org / 94-12-14

Too Much Hypertext or Too Little?

ECHT94 Keynote Address by Jakob Nielsen, SunSoft

By: Keith Instone Bowling Green State University Email: instone@cs.bgsu.edu ---------------------------------------------------------------------------- The middle keynote address at ECHT94 took place on the morning of Thursday, September 22. Jakob Nielsen, a recognized leader in the hypertext community, author of the "Green Bible" Hypertext & Hypermedia and several books on usability, gave a presentation entitled "Too Much Hypertext or Too Little?"

The title of Nielsen's presentation deals with the idea that if you give someone "too much" hypertext, they will become lost; if you give them "too little", they will not even be able to get started. His talk was in two parts-the first being about usability and the second about the future of the user interface.

The first half of Nielsen's keynote can be summed up as "Usability engineering: you can do it, too, even on the Web". He described the rapid usability work he did to come up with a Web interface to Sun's internal information. The key word here is rapid, as the internal Sun Web was growing quickly, and even a two-week delay could have led to problems, as different departments were starting to do their own things. The testing involved four studies to categorize the different areas and to design icons for them. To categorize areas, he made an index card for each small piece of the information space and had people group the cards into similar piles. The iterative icon design involved making an icon for each group, and then asking people what they thought the icon stood for. Misunderstood icons were redesigned.
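The card-grouping step described above is what is now usually called an open card sort, and its results can be aggregated mechanically. The following sketch is not Nielsen's procedure (the article describes it only informally); the card names, the data, and the majority threshold are invented for illustration.

```python
# Hypothetical aggregation of open card-sort results: count how often each
# pair of cards lands in the same pile across participants.
from collections import defaultdict
from itertools import combinations

# Each participant groups the same cards into piles however they like.
sorts = [
    [{"pricing", "orders"}, {"jobs", "benefits"}],
    [{"pricing", "orders", "jobs"}, {"benefits"}],
    [{"pricing", "orders"}, {"jobs"}, {"benefits"}],
]

# Co-occurrence counts: (card_a, card_b) -> number of participants who
# placed the two cards in the same pile.
together = defaultdict(int)
for piles in sorts:
    for pile in piles:
        for a, b in combinations(sorted(pile), 2):
            together[(a, b)] += 1

# Pairs grouped together by a majority of participants suggest a category
# (and hence a candidate area needing its own icon).
majority = {pair for pair, n in together.items() if n > len(sorts) / 2}
print(sorted(majority))
```

With this toy data only "pricing" and "orders" clear the majority threshold, suggesting they belong under one area; a real study would of course use many more cards and participants.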

[Image] The final version of the Sun internal web home page 

The little stories Nielsen told of the evolution of individual icons were interesting. For example, one icon he tried for the "Marketing, Sales and PR" area looked like a television with the Sun logo on the screen. People found it intuitive-they associated it with the marketing area. But no one actually liked the icon, so a better one was made (the search light).

But the true value of this part of Nielsen's presentation was his explanation by example. He was not just up there touting discount usability engineering. He was showing you that he himself does it for real-world tasks, and that it is not rocket science-anyone can do it. There is a lot of development going on right now with the World-Wide Web (and other parts of the hypertext field), and most (if not all) of it could be done better if they followed Nielsen's example of discount usability engineering.

For the second half of his keynote, Nielsen became more forward-thinking and talked about user interfaces of the future. He claimed that the "Age of Great Applications" will come to an end; huge applications ("FatWare") with tons of features will cease to exist. Instead, flexible modules would be developed, so that users can mix and match software to meet their needs. In a cute analogy, he said current software is a "Tyrannosaurus Apps" which will become extinct and be replaced by small, modular, mobile "mammals."

Small objects with rich attributes and multiple views will be the norm. Instead of WYSIWYG, we will have multiple views for our objects. Objects will have to know how to display themselves under different circumstances, such as over the phone, on a workstation, or based on various levels of interest.

The implication for hypertext is that it will no longer be a special application; instead, it will be integrated into the entire environment. And as information becomes more modular, we will need more hypertext to access it.

Norbert Streitz introduced Nielsen by pointing out his great ability to act as a communicator of ideas. With his keynote address at ECHT94, Jakob Nielsen continued to communicate ideas of how to improve hypertext today and what hypertext will be like in the future. ---------------------------------------------------------------------------- Keith Instone / instone@acm.org / 94-12-14

World Wide Hypermedia

ECHT94 Keynote Address by Tim Berners-Lee, MIT & CERN

By: Walter Vannini Human Systems Email: vannini@uqbar.cirfid.unibo.it ---------------------------------------------------------------------------- The two souls of the hypertext community met each other during the closing speech by Tim Berners-Lee, the father of the World-Wide Web. On one side was the strictly scholarly view of hypertext as a pure research subject, the source of endless research papers on the alleged virtues of systems so experimental as to never have gotten out of their inventors' labs-and sometimes, even their minds. On the other side was the practitioners' view of hypertext as a tool, a means to many practical ends. It wasn't hard to tell which side Tim Berners-Lee, with his zero research papers and several million users, was on. Apart from the facade of amicability and professional partnership, more than one face in the audience could have been taken straight out of a soap opera, complete with evil looks and icy smiles.

The success of WWW has been sweeping and undeniable, the reasons being the immediacy of its model, its unorthodox, simplifying use (or abuse) of the SGML standard and its hardware-and-software-independent freeware approach. It is clear how such a success could not help but raise more than an eyebrow.

Berners-Lee started his speech by pointing out the context where the Web was conceived:

* a highly distributed community of researchers (high-energy physicists) requiring a fast and efficient exchange of experimental information among workgroups
* researchers with the most diverse hardware allowances, from dumb terminals to personal computers to workstations
* the impossibility of centralizing or otherwise regulating the flow of information: each researcher shared information with whoever could be of help at the moment, on an ad hoc basis
* the inability to forecast who could need some information at a given time
* the need for each researcher to be able to produce his or her own information in the needed format and to relate it to the information already available.

In a context like this, Berners-Lee tried out a solution based on the technological lowest common denominator: a system that could run natively on any platform with any kind of interface, either graphical or character-based; a system that was distributed, non-centralized, and user-customisable. A system that, instead of trying to seek a standard at the hardware or software level, did so at the data level.

Thus was the Web born. Berners-Lee recalled the role of the Internet and its community culture in the success of the Web: the original code was released for free on the net, and programmers around the world provided voluntary help by porting the original NeXT code to other platforms. From that moment on, it's history. The Internet now sees the Web as its second nature. What had originally been conceived as a tool for a limited community of researchers is now the tool of choice of "Mr. Smith Goes to The Internet."

The hypertext community was taken completely by surprise. The ongoing debate on models and standards for data interchange was brought back to square one. Out of the blue, the standard was not only there, but already in use by millions of people.

Of course, as Berners-Lee himself pointed out, the Web is not the last word. In its earnest, grass-roots approach the Web ignored most of the research on hypertext, and this resulted in a number of weaknesses, both as a model and as a tool. The two weak spots most often cited by us critics are:

* the Web recognizes only one kind of hypertext link: binary, unidirectional, untyped;
* the data format is based on the SGML standard, but the standard has been applied in such a reductive fashion that the Web cannot access the gigabytes of information already stored as SGML code.
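The first weakness can be made concrete with a small sketch. This is an illustrative data model only, not the internals of any actual system; the class and document names are invented.

```python
# Illustrative contrast between the Web's link model and the richer links
# discussed in hypertext research (all names invented for this sketch).
from dataclasses import dataclass

@dataclass(frozen=True)
class WebLink:
    """Roughly what an HTML anchor expresses: one source, one destination,
    no stored type, and no built-in way to traverse in reverse."""
    source: str
    target: str

@dataclass(frozen=True)
class RichLink:
    """A link as in many research systems: typed, and traversable both
    ways, because the link itself is a first-class object."""
    type: str             # e.g. "cites", "supports", "example-of"
    endpoints: frozenset  # no privileged direction

w = WebLink("review.html", "spec.html")
r = RichLink("cites", frozenset({"review.html", "spec.html"}))

# A WebLink can only be followed forward:
assert w.target == "spec.html"

# But "who links to spec.html?" requires a global crawl in the Web model,
# whereas a RichLink can be indexed by any of its endpoints:
def links_touching(doc, links):
    return [l for l in links if doc in l.endpoints]

print(len(links_touching("spec.html", [r])))
```

The design point is that making the link a first-class, typed object (rather than an attribute buried in the source document) is what enables backlinks and typed traversal.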

Now that Berners-Lee is in charge of an international MIT/CERN committee on Web development, there is good hope that the Web will evolve to include more of the results of hypertext research.

Despite the all too easy criticisms, it must be said that the Web did what no other hypertext system has done (with the possible notable exception of HyperCard): it brought hypertext to the general public. And it did so in a pretty nice way, too; I guess not even the harshest critic can deny that the Web is the closest thing to the Xanadu dream other than Xanadu itself; only, there is no Ted Nelson behind it.

Berners-Lee spoke with a very pragmatic, down-to-earth attitude. The Web, he said, is proof that all that was said and written on hypertext in these years is true. Ordinary people, with an average computer literacy, are perfectly capable of using hypertext tools that traverse the Internet in their everyday activity. This demonstrates that pure technology is not the main issue any longer; now we also have to develop a culture to help these people appreciate how these very tools change the way information is organized, how work is organized, how life is organized. Without changes in culture, the impulse to progress brought by mere technology will hardly be significant: cars were not an important invention because they were faster or cheaper than horse-powered vehicles (and in the beginning they were not, anyway) but because they changed the way people looked at and thought of travel and transport.

The builders of the new electronic world, continued Berners-Lee, must be aware that they are also building a society which will inhabit it, and that a society cannot live without universally accepted rules. Berners-Lee said it is necessary to build so-called "semantic objects" such as contracts, letters of payment, but also conceptual dependencies and personal opinions. These semantic objects should also be machine-interpretable, fostering the growth of electronic, distributed social and economical relations in cyberspace as a replacement for today's paper-based, centralized ones. Berners-Lee has called for the definition of a "constitution" for cyberspace; the Internet, he said, is not only a means of communication, but also a whole, growing, social system. Actions made in cyberspace, he continued, have more and more consequences on one's physical life. A whole new society is being born, and this society needs rules of behaviour (and technologies) to develop a cyberspace-based, online economy upon which to thrive.

Whoever believes the Internet is the next step in the evolution of society could not have heard better words. ---------------------------------------------------------------------------- Keith Instone / instone@acm.org / 94-12-14

Spatial User Interface Metaphors in Hypermedia Systems

ECHT94 Workshop

By: Andreas Dieberger Vienna University of Technology Email: dieberger.chi@xerox.com

Keith Andrews Graz University of Technology Email: kandrews@iicm.tu-graz.ac.at ---------------------------------------------------------------------------- The workshop "Spatial User Interface Metaphors in Hypermedia Systems" was held on September 18, 1994 in Edinburgh in conjunction with ECHT94. Participants were selected on the basis of position papers, which were then distributed before the workshop. To promote discussion in advance of the workshop an e-mail discussion list was set up. In all, there were 11 participants at the workshop.

Spatial user interface metaphors essentially try to utilise human abilities to use space to organise objects and to navigate within structures like houses or cities. The use of space in a particular application depends on the goal of that application-be it to make structure explicit, to support navigation, or to support organisation. Well-known examples of spatial user interface metaphors are the desktop metaphor, the Xerox Information Visualizer, and most virtual reality systems.

The position papers and the first round of discussion made clear that the main interests of many participants lay in easing information access and navigation using spatial user interface metaphors.

As a starting point for the workshop, Keith Andrews from Graz University of Technology gave a presentation of the Harmony client for Hyper-G. Harmony supports both hyperlinking within and between three-dimensional scenes and navigation through a dynamically generated information landscape.

Several topics were isolated as being relevant to the field of spatial user interface metaphors. Due to time constraints, not all of these issues could be discussed during the workshop.

Layout

How should spatial structures be laid out? Should hand-crafted or automatically generated layouts be used? This issue is related to the goal of a system. Systems where users modify the spatial layout sometimes use spatial relationships to communicate with the machine or other users. An example of such a system, VIKI, was presented in an ECHT94 paper session. Automatically crafted spatial layouts can be used to show structure inherent to the information in the system.

Large Dynamic Multi-User Environments

How should large, dynamic, multi-user environments be supported? An advantage of spatial metaphors is the ease with which people are able to communicate about spatial relationships, allowing collaborative navigation in such systems. Communication about the environment and navigation relies partly on static features (points of orientation) in the system: therefore systems should not be too dynamic.

What to Represent Spatially?

Spatial layout expresses a (spatial) relationship: what is the meaning of this relationship? Hyperlink relationships map to a topological space, composite structures to a graph/set, and search results (from information retrieval) to a vector space. The content or attributes of nodes may be used to embellish the space, or the contents may themselves display some spatially meaningful structure.

Evaluation

Although many spatial metaphors have been proposed or even implemented, it seems that few systems have been evaluated on the usability of their spatial metaphor - at least in the field of hypermedia. This lack of evaluation makes the field of spatial user interface metaphors seem very immature.

Conventions

In order to promote the transfer of knowledge between systems, "conventions" on the meaning of spatial layouts seem necessary. Conventions also should clarify the meaning of spatial icons or widgets (user interface elements). Spatial user interfaces and their elements should also show clear affordances (that is: they should visually communicate their use). Local neighbourhoods might have their own conventions, metrics, and customs.

"Magic Features"

User interface metaphors are typically not an identical mapping from a source domain to a target domain (for instance, objects not falling off a virtual desktop). Features which extend the metaphor beyond the source domain (sometimes called "magic features") often make a metaphor particularly useful. In the context of spatial metaphors such magic features are especially interesting in navigation. The use and realisation of such features-especially for supporting navigation-has so far not been very well addressed in the literature.

Applicability of Spatial Metaphors

Perhaps the most important issue concerning spatial metaphors is to decide when to use them and when not to. As user interfaces today tend to be very realistic, a spatial conception often seems appropriate. However, spatial user interfaces are not a universal solution to navigation problems.

The results of the discussions clearly point to the need for detailed evaluation of spatial user interface metaphors. They might be useful in a smaller range of applications than is believed today. As with all other user interface issues, these evaluations will probably show that there is no universally valid spatial metaphor. This is especially true when social and cultural aspects of the concept of "space" are also taken into account: most spatial metaphors today (including most of those discussed at the workshop) are quite detached from the social beings who are supposed to use them.

Spatial metaphors have to bring real advantages for the system: it is hard to sell a metaphor alone. Despite the advances in virtual reality systems, spatial metaphors are very much in their infancy. One of the conclusions of the workshop was that we should stop talking about spatial metaphors, start building and, above all, evaluating them.  ---------------------------------------------------------------------------- Keith Instone / instone@acm.org / 95-01-03

----------------------------------------------------------------------------

This was scanned in from my dog-eared copy of BYTE Magazine's 20 Year Special Issue, Sept 1995

Dreaming of the Future

By Douglas Engelbart

Digital technology could help make this a better world. But we've also got to change our way of thinking


Despite the rapid progression of computing technology, the world faces incredible hazards as we enter a common economic-political vehicle, travelling at an ever-accelerating pace through increasingly complex terrain. Our headlights are much too dim and blurry, and we have totally inadequate steering and braking controls.

Many years ago, I dreamed that digital technology could greatly augment our collective human capabilities for dealing with complex, urgent problems. Computers, high-speed communications, displays, interfaces-it's as if suddenly, in an evolutionary sense, we're getting a super new nervous system to upgrade our collective social organisms. I dreamed that people were talking seriously about the potential of harnessing that technological and social nervous system to improve the collective IQ of our various organisations.

Then I dreamed that we got strategic and began to form cooperative alliances of organizations, employing advanced networked computer tools and methods to develop and apply new collective knowledge. Call these alliances NICs (Networked Improvement Communities). This seemed eminently sensible. The new technologies could enable much more effective distributed collaboration, and the potential for shared risk and multiplied benefits seemed promising. In the dream, the solution involves giving high priority to the collective capability for a distributed community (or organization) to develop, integrate, and apply new knowledge. We already had this capability, of course; organizations handle new collective problems all the time. But yes, it would be nice if we could be a lot more effective at it. In the dream, this collaborative capability was called CoDIAK, for Concurrent Development, Integration, and Application of Knowledge.

Sounds great. The better we get, the better we get at getting better. Call it bootstrapping. And just think of the important role for technologists. Although exciting new technology innovations have indeed been introduced within the NICs, the technology efforts have been overshadowed by the concurrent efforts in "human-system" innovation. This includes new skills, methods, collaborative organizational structures, telecommuting, knowledge-worker teams, distributed goal setting, planning and management processes. One of the ideas computer-oriented folks have contributed is the open hyperdocument system. For this to make a difference, we must shed our outdated concept of a document. We need to think in terms of flexible jumping and viewing options. The objects assembled into a document should be dealt with explicitly as representations of kernel concepts in the authors' minds, and explicit structuring options have to be utilized to provide a much enhanced mapping of the source concept structures.

The Web/HTML (Hypertext Markup Language) publishing-browsing landslide has moved steadily toward a highly structured, object-oriented architecture with integrated editor-browser tool sets. But this needs to become the way the majority of people do all their work. Draft notes, E-mail, plans, source code, to-do lists, what have you-all can be hyperdocument pieces, instantly and intrinsically linkable, and with work processes involving fewer and fewer hard-copy printouts.

It has been exciting to watch the emergence of total-quality management, process reengineering, the NII (National Information Infrastructure), the World Wide Web, and so forth. But it pains me that we haven't yet put up an explicit CoDIAK target, nor explored how NICs could fly. Since the first of these dreams got fixed in my head, decades ago, I've struggled with the realization that the sooner the world gets serious about pursuing the possibilities, the greater the chance that we can reduce the hazards facing this careening vessel carrying us along.

If the dream of improving human destiny doesn't move people, how about the thought that the companies that adopt the best CoDIAK-improvement strategy will have a significant competitive advantage? Wouldn't you want your group to have the highest collective IQ? I confess that I am a dreamer. Someone once called me "just a dreamer." That offended me, the "just" part; being a real dreamer is hard work. It really gets hard when you start believing in your dreams.

As a researcher and inventor in the late 1950s and early 1960s, Douglas Engelbart envisioned most of the computing concepts we now take for granted. He heads the Bootstrap Institute. You can reach him by sending E-mail to engelbart@bootstrap.org.

----------------------------------------------------------------------------

The man who sees the future

Doug Engelbart built the mouse; he may alter computing again

Doug Engelbart is seated on a sofa in a quiet corner of his home in Atherton, Calif. On the shelf above him is a book titled The Magic of Thinking Big--which is what he is best known for. Today, he is thinking of a complex system, a freeway at rush hour. But his thoughts are not confined to the present as he explains the difficulty of expressing concepts for which past experiences offer no frame of reference. "You realize when you merge onto a freeway that you're betting your life on what you see in that little rearview mirror, and yet you don't think anything of it," Engelbart says in his soft, grandfatherly voice. "Now, think back to 1905 and what people used mirrors for. Nobody at that time would have ever thought of betting their life on what they could grok at 60 miles an hour with a few glances in a vanity mirror."

Grok is a favorite term of the 71-year-old Engelbart. It means to comprehend immediately, and it first appeared in the 1960s at a time when, it could be said, Engelbart was grokking the digital future. The mouse, on-screen windows, hypertext and many of the other innovations that define computing today are the direct result of Engelbart's work back then. "Yet," says Paul Saffo, a director of the Institute for the Future, "all the brilliant things he has produced are mere baubles compared to the ideas he's trying to get across."

Those ideas are based on the premise that the complexity and urgency of the world's problems are increasing at a rate that is greater than mankind's ability to cope. Engelbart's solution is bound up in his idea of augmenting the collective IQ of organizations through a process known as bootstrapping. Though these ideas have driven him for the better part of 45 years, to the outside world they remain largely ungrokked.

Visionary. That's why Engelbart is relatively unknown beyond the high-tech community. Yet ask almost any executive in Silicon Valley and he or she will describe him as a visionary so far ahead of his time that he has often had difficulty explaining his concepts to those rooted firmly in the present. "I've always thought of [Engelbart] as the father of personal computing," says Alan Kay, a senior fellow at Apple Computer who is often referred to by that title.

In his 1985 book, Tools for Thought, Howard Rheingold called his chapter on Engelbart "The Loneliness of a Long-Distance Thinker." That sums up what has been Engelbart's blessing and curse. It all began in 1951. Fresh out of the Navy, with a fiancee and a new engineering job at the precursor of today's National Aeronautics and Space Administration, Engelbart saw his future as a featureless hallway going on indefinitely. "I just thought," he recalls, "I've got a steady job, I'm going to get married and live happily ever after, and oh, my God, are all my life's goals already met?"

The thought caused Engelbart to calculate how many minutes he would invest in his working life until age 60 (roughly 5.5 million) and what return he wanted on that investment. He considered changing careers. Then he hit upon the idea that would change his life: computers. "It was February 1951 when I had the picture," says Engelbart. "To think then of computers being devoted to supporting individuals sitting there interacting with them was crazy."

Things might have been different had Engelbart's vision been confined to one or two notions about office automation. But his was a worldview, an overarching idea of how computers and humans could interact and of how information could be displayed, networked, organized, cross-referenced and logged to augment the collective IQ of organizations. Engelbart was finally able to pursue his dream in 1963, when he founded the Augmentation Research Center at Stanford Research Institute. That's when the innovations started to flow; from the mouse, to help menus, to becoming the second node on the Internet, much of what defines computing today was developed at ARC by Engelbart and his colleagues.

In the cold. When the funding for ARC gave out in 1977, Engelbart took his crusade to the private sector, but he found himself in an intellectual wilderness where few understood his ideas and even fewer were willing to back them. "The rate at which a person can mature is directly proportional to the embarrassment he can tolerate," Engelbart said in a 1994 interview. "I've tolerated lots."

Yet today, two Silicon Valley companies, Sun Microsystems and Netscape Communications, are working closely with Engelbart to create an alliance of business, government and civic organizations that will act as a prototype for many of his ideas. At its heart is the concept of bootstrapping, an engineering term describing a process in which the results of an action are fed back to achieve greater results more quickly with less effort.

In practice, the alliance will involve organizations coming together in a dialogue that will be tracked and cross-referenced to build up a knowledge base. The dialogue will range from organizations communicating their needs to computer companies to more abstract ideas about the coevolution of computers and humanity.

There is also a technological component. Although the alliance will be open to all, the enabling technologies will be Netscape's World Wide Web browser and Sun's Java language, which lets programs run over the Net. The idea is for users to migrate onto the Web so all data on an individual's desktop computer will be accessible at any time, from anywhere, using any PC.

Engelbart remains hopeful. "In the late '50s and so many years afterward, I'd wake up in a sweat and say, 'God, why am I doing this?'" he says. "But on the other hand, you think that as soon as the world learns how to get this value, it will make a big difference." And so Doug Engelbart, who grokked the digital future more than four decades ago, continues to wait for that future to grok him.

BY ERIC RANSDELL IN SAN FRANCISCO


-------------------------------------------------------------

© Copyright U.S. News & World Report, Inc. All rights reserved.



