Multimedia - A Q&A Topical Approach (Part II)

 1. Printed books do not have hypertext ability. Do you believe hypertext makes documents so powerful as to render printed books obsolete? If so, what kinds? All books or just certain kinds? 

Answer: This is a deeply philosophical question. I can easily argue on both sides of "the fence." On one hand, I, like many people alive at this point in the history of humanity, was raised and educated within a 'paper-based society.' For many, many centuries (actually millennia), books were associated with paper and its predecessors, such as the papyrus of the Egyptians and the clay tablets of the Assyro-Babylonians, thousands of years ago. Before that, people used the walls of their dwellings (rather, caves) or clay tablets to record important information to be passed along from generation to generation. It must have been a "culture shock" to move from that level of 'technology' to the level of paper. And we humans have lived through millennia with paper as the technology, the medium on which to record information (and, unfortunately, such an easily destroyed medium!).

But computers, the Internet (the Web), and now hypertext/hypermedia technology are once again rapidly changing our society! And it is going to happen regardless of what some of us like, prefer, or believe! On the other hand, I do like books. I have actually loved and appreciated them since I was very young. More than that, I do not know how many people have noticed that the 'goodness' of a book is sometimes indicated by its smell!…

Technology is changing that, and we are lucky to witness the techno-socio-cultural transition. We are, in a way, as lucky as our human counterparts were when the transition from clay tablets to papyrus and paper occurred! Within our lifetime, we as a society will become paperless, so books will be electronic and in hypertext format. For some of us, keeping the paper-based version will be of merely sentimental value. Non-paper 'books' will be the standard, and they will greatly enhance the processes of studying, learning and reading. Our children, for sure, will see or feel no difference, because they will be raised that way. They will not know another way, nor will they feel any particular emotion toward paper-based books or journals.

Hypertext can provide people with a new type of interactive learning experience. However, the introduction of hypertext can easily become a barrier to those who are not familiar with the technology. For example, people need to understand the electronic context, learn the visual symbols, and know the mechanics of accessing, reading and writing nonlinear texts. First, people learn to read hypertexts by interacting with online documents created in the HyperCard program. Second, people read about hypertext in a traditionally printed booklet.

In the 1990s, there are a variety of working hypertext models. According to Heim (1993), "hypertext is a mode of interacting with texts, not a specific tool for a single purpose. You can realize what hypertext is, or can be, only by sitting down with it for half an hour. Once caught in the interactive nature of the thing, you can begin to imagine an immense range of possible applications" (pp. 29-30). Today, the term hypertext has been expanded to include a wide range of computer applications such as interactive books, encyclopedias, online reference indexes, and other forms of nonlinear reading and writing which are created by means of computer technology.

Blazing trails through information space is "analogous to the trail of mental association in the [people's] mind" (Nyce & Kahn, 1991, p. 58). "Instead of a linear, page-by-page, line-by-line, book- by-book approach, the user connects information in an intuitive, associative manner. Hypertext fosters a literacy that is prompted by jumps of intuition and association" (Heim, 1993, p. 30).

By jumping to different passages in a single text or to passages in different texts, multilinear hypertext presents multiple points of view. "No piece of hypertext ever sings solo; it always collaborates in a (cacophonous?) choir with all of the other nodes of the network in which it is implicated" (Fowler, 1994, p. 18). Multilinear reading requires people to make critical choices about which passage of text or point of view they will access next. The complexity of these decisions depends on the type of document and the design of the hypertext system. For example, HyperCard documents that reside on a single computer have fewer connecting texts than networked systems.
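To make the node-and-link idea concrete, the following is a minimal sketch in Python (the passage titles and link labels are invented for illustration, not taken from any cited system) of how a hypertext might represent passages as nodes joined by named links, so that the reading path emerges from the reader's choices rather than from a fixed page order.

# Minimal hypertext sketch: passages ("nodes") joined by named links.
# Titles and link labels below are hypothetical examples.
class Node:
    def __init__(self, title, text):
        self.title = title
        self.text = text
        self.links = {}              # link label -> target Node

    def link(self, label, target):
        self.links[label] = target

intro = Node("Overview", "Hypertext joins passages nonlinearly.")
history = Node("History", "Bush's memex imagined associative trails.")
critique = Node("Critique", "Multilinear texts juxtapose viewpoints.")

intro.link("origins", history)
intro.link("opposing views", critique)
history.link("back to overview", intro)

def read(start, choices):
    """Follow a sequence of link labels and return the titles visited."""
    node, path = start, [start.title]
    for label in choices:
        node = node.links[label]
        path.append(node.title)
    return path

print(read(intro, ["origins", "back to overview", "opposing views"]))
# -> ['Overview', 'History', 'Overview', 'Critique']

Each reader who chooses a different sequence of links effectively assembles a different document, which is the nonlinearity described above.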

Replacing ambiguous choices with predictable ones improves reading in general. According to Smith (1985), prediction is an essential reading skill. We predict because the world is full of ambiguity and we can become overwhelmed with possibilities. A second "reason for prediction is that there would otherwise be far too many alternatives to choose among" (p. 77). Hypertext designs eliminate alternatives by linking together texts based on a theme or topic. By enabling people to select texts based on a related topic, hypertext keeps them from becoming overwhelmed with information. Instead, people can gain a greater sense of control that will maintain their interest in the texts.

Interest in texts can be sustained because the process of manually searching for the cross-referenced texts listed in a book is eliminated. Cross-references are automatically linked to a document. With a click of the mouse button, a linked source or author's note will appear on the screen. "The reader can follow the link to another text indicated by the note and thus move entirely outside the scholarly article itself" (Landow, 1992, p. 5). This type of access to multiple texts improves critical thinking skills because readers can decide whether a note warrants careful reading or whether they should return to the main text. Moreover, people have access to multiple texts with different points of view, whereas printed books tend to present a single vision.

To access multiple texts in a hypertext system, people need to learn how to navigate through electronic space. Navigation is a nonlinear process with multiple paths. In contrast, traditional books are linear, with a single path. Books are linear because they are structured by the written word. Nelson (1987) states: "Ordinary writing is sequential for two reasons. First, it grew out of speech and speech-making, which have to be sequential; and second, because books are not convenient to read except in a sequence" (p. DM 29). In contrast, hypertexts are nonlinear because they are based on computer technology. "Computer storage and screen display mean that we no longer have to have things in sequence; totally arbitrary structures are possible" (p. DM 29).

The nonlinear character of hypertext creates a new type of reading and writing environment, one that supports the development of interactive learning materials. These materials include reference works, electronic magazines, and instructional how-to tutorials. For example, hypertexts are currently being used by the Advertising & Communications Department (ADCOM) in the course Multimedia Computing for Advertising Communications.

The benefits of learning to use hypertexts include giving people a more active role than is possible with traditional textbooks. People are required to make decisions about the information they are reading and accessing through the computer. People using hypertexts must be mentally active while they are interacting with the information. Consequently, learning with hypertext becomes more people-centered because the emphasis of hypertext is on an active reader. Because hypertext is people-centered, hypertext systems are generally called learning systems rather than teaching systems.

However, some hypertexts are written to present information in a continuous progression. For example, several software companies distribute hypertext tutorials along with their programs to help users learn the software. While these tutorials are visually engaging and encourage self-paced instruction, they are also "locked" or "fixed": people cannot incorporate their own notes or easily modify the information. Educators introducing hypertexts into the classroom should be aware that some hypertexts are read-only and others are for reading and writing. Generally, commercially distributed hypertexts, such as software tutorials and encyclopedias on CD-ROM, are read-only texts. In contrast, academic projects (Bolter, 1992; Landow, 1992) can incorporate both hypertext reading and writing skills.

References:

Barrett, E. (1989). The society of text: Hypertext, hypermedia, and the social construction of information. Cambridge: MIT Press.

Bolter, D.J. (1992). Literature in the electronic writing space. In M.C. Tuman (Ed.), Literacy online, pp. 19-42. Pittsburgh: University of Pittsburgh Press.

Bunting, T. (1993). Designing interactive writing classes in HyperCard. In Technology for new learning environments: Presentations at the Conference on Instructional Technologies, June 9-11, 1993, pp. 19-21. Sponsored by the State University of New York, Faculty Access to Computing Technology (SUNY FACT).

Bush, V. (1991). As we may think. In J.M. Nyce & P. Kahn (Eds.), From Memex to hypertext, pp. 85-110. Boston: Academic Press.

Delany, P. & Landow G.P. (1991). Hypermedia and literary studies. Cambridge: MIT Press.

Dickey, W. (1991). Poem descending a staircase: Hypertext and simultaneity of experience. In P. Delany & G.P. Landow (Eds.), Hypermedia and literary studies, pp. 143-152. Cambridge: MIT Press.

Fowler, R.M. (1994, July). How the secondary orality of the electronic age can awaken us to the primary orality of antiquity or what hypertext can teach us about the Bible. Interpersonal Computing and Technology: An Electronic Journal for the 21st Century, pp. 12-46. (Archived as FOWLER IPCTV2N3 on LISTSERV@GUVM.)

He, Peter Wei & Knapp, Sara D. (1994). The teaching of the Internet and its influence on instruction. In New faculty roles in new learning environments: Presentations at the Third Conference on Instructional Technologies, June 1-3, 1994, University at Albany, State University of New York, pp. 31-34. Sponsored by SUNY FACT.

Heim, M. (1993). The metaphysics of virtual reality. New York: Oxford University Press.

Landow, G.P. (1991). The rhetoric of hypermedia: Some rules for authors. In P. Delany & G.P. Landow (Eds.), Hypermedia and literary studies, pp. 81-103. Cambridge: MIT Press.

__________ (1992). Hypertext: The convergence of contemporary critical theory and technology. Baltimore: The Johns Hopkins University Press.

Lanham, Richard A. (1993). The electronic word: Democracy, technology, and the arts. Chicago: University of Chicago Press.

Lynch, P.J. (1990, September/October). Graphic design and multimedia. Syllabus, pp. 9-10.

Nelson, T. (1987). Computer lib/dream machines, (Revised Edition). Redmond, WA: Tempus Books of Microsoft Press.

__________. (1992). Opening hypertext: A memoir. In M.C. Tuman (Ed.), Literacy online: The promise (and peril) of reading and writing with computers, pp. 43-57. Pittsburgh: University of Pittsburgh Press.

Nielsen, J. (1990). Hypertext & hypermedia. Boston: Academic Press.

Nyce, J.M. & Kahn, P. (1991). From Memex to hypertext: Vannevar Bush and the mind's machine. Boston: Academic Press.

Santoro, G.M. (1994, April). The Internet: An overview. Communication Education, 43(2), 71-86.

Shneiderman, B. & Kearsley, G. (1989). Hypertext hands-on! Reading, MA: Addison-Wesley Publishing Company.

Smith, F. (1985). Reading without nonsense. New York: Teachers College Press.

Tuman, M.C. (Ed.) (1992). Literacy online: The promise (and peril) of reading and writing with computers. Pittsburgh: University of Pittsburgh Press.

2. Of all the different kinds of occupations you can think of, which ones need multimedia the most? The least? What is your chosen occupation? Why will you need to know about multimedia to do well in your line of work?

Answer: The world of multimedia is one of dynamic change. Trying to understand the future of human society, what occupations will exist in the future, and how technology, and multimedia in particular, will impact them is a challenging set of questions. To answer these questions we need to understand what the future of multimedia will be, and that is like "shooting at a moving target." No sector of modern society is changing with the rapidity of today's multimedia technology and its market. There exists a maelstrom of technologies such as CD-ROM, CD-R, DVD, MPEG-1, MPEG-2, the WWW, VRML, Java and a host of other acronyms swirling around us with such force that even the experts are quite often broadsided by new and unexpected developments.

It is hard to think of a profession which will not, to some extent, make use of multimedia-like capabilities on a desktop or in a networked, distributed environment, because we as a society are becoming more and more technology-based. Consider our own country, which is very representative of our civilization as a whole. At the beginning of this century we were still basically an agricultural society; according to statistics, as much as 80% of the population was involved in producing food for domestic consumption and export. Around that time, Henry Ford, one of the greatest entrepreneurs this century has known, introduced the assembly line for manufacturing automobiles. His idea went beyond the automobile industry, and within a decade or so it propelled us into a world industrial powerhouse. World War I and then World War II further strengthened our position. The percentage of the population involved in agriculture dropped dramatically, replaced by manufacturing jobs. From the '50s and '60s up to the mid '70s the proportion reversed, in the sense that 70% to 80% of the population was involved in manufacturing and less and less in producing food. But starting in the '70s, another major revolution occurred: computers and information technology! Since then, with all the 'socio-economic turmoil,' the percentages of U.S. workers have turned around again: today 70% to 80% of us are "information or knowledge workers," with fewer and fewer involved in manufacturing activities. The latter are actually being "exported" to places where goods can be produced at a lower price. As for agricultural jobs, less than 2% of Americans are involved in them, and they produce food for all of us and a good part of the rest of the world! We are becoming more and more a technology- and knowledge-based society.

Coming back to multimedia, and to some extent to the somewhat related field of virtual reality, it would be hard to identify a single profession not impacted by these newer technologies. We are talking about the Virtual Office (a new term for telecommuting), the N/C (network computer), and eventually the V/C (a term I believe I coined about a year ago and repeat here), the Virtual Computer.

In my daily line of work, multimedia and related technologies, i.e., CD-ROM, video conferencing, and visualization (virtual reality), as presented elsewhere in these test questions, already have a major role. I do not see how I and my co-workers, mostly engineers and scientists, would efficiently do our highly technology-based tasks without them. And, as said above, I cannot think of any profession or occupation that will not take advantage of multimedia in the future. Right now you can think of many that do not, but that is only temporary in nature.

References:

Barrett, E. (1989). The society of text: Hypertext, hypermedia, and the social construction of information. Cambridge: MIT Press.

Burger, Jeff. (1991). Hypertext/Hypermedia Handbook. New York: Intertext Publications: McGraw-Hill.

Keyes, Jessica (1996). The Ultimate Multimedia Handbook. Second Edition. New York: Intertext Publications: McGraw-Hill.

Lynch, P.J. (1990, September/October). Graphic design and multimedia. Syllabus, pp. 9-10.

Nielsen, J. (1990). Hypertext & hypermedia. Boston: Academic Press.

Rabb, Margaret Y. (1993). The presentation design book: tips, techniques & advice for creating effective, attractive slides, overheads, multimedia presentations, screen shows & more. 2nd ed., Chapel Hill, NC: Ventana Press.

Tuman, M.C. (Ed.) (1992). Literacy online: The promise (and peril) of reading and writing with computers. Pittsburgh: University of Pittsburgh Press.

3. How does virtual reality differ from multimedia? 

Answer: My understanding of Virtual Reality (VR) is that it is an extension of Multimedia (MM). And these days there is a constant tendency to move toward more and more of "virtual anything"… We have been hearing for some time now about the N/C (Network Computer), which, by the way, raises the question: isn't it a return to the '50s, '60s, or even '70s, except that instead of a mainframe computer we now have the network, and instead of a dumb terminal we now have an intelligent, GUI-based computer? The keyboard may go away, since voice recognition and touch screens are coming on strong. Instead of a monitor, the TV may be the medium, or the other way around; either way, one of the two (the TV or the PC monitor) has to go. So, are we headed toward a VC (Virtual Computer)?

Multimedia started as an extension of the PC toward other devices we have around us: the TV, the radio, the CD player, etc. Every day it gets closer to the point where it will replace them entirely. Virtual reality, itself an extension of something else, namely what in the old days we called 'simulation,' has more and more technology needs that usually fall under the multimedia arena.

Assorted computer visionaries and starry-eyed analysts, along with several awful movies, have combined to give virtual reality a kind of flaky reputation. But despite some technology evangelists' best efforts to ruin VR, the technology actually is beginning to make serious inroads into business. Many managers still equate virtual reality with helmeted, spandex-clad space cadets flailing about in a semi-darkened room. But practical VR applications tend to be far more down-to-earth and somewhat less frenzied. Virtual reality has a "growing number of practical uses in such fields as manufacturing, marketing and training," according to Gordon Richardson, director of technology and product development at the consultancy Arthur D. Little, Inc. in Cambridge, Mass. He explains that managers should regard VR simply as an extension of existing computer technology. VR is an umbrella term for technologies that give users access to artificially created environments. "In a VR world, users can test procedures quickly, safely and inexpensively or visualize information in an entirely new way," Richardson says. He adds that there are almost as many VR technologies as there are potential uses. "Users can experience the technology with something as sophisticated as a full-immersion holodeck-like platform from Star Trek or with a desktop Pentium PC that lets them view and manipulate objects with an ordinary monitor and mouse." Pretty much as on a multimedia-equipped PC, I would add.

For instance, one of the most exciting real applications of VR is the 3-D arena. Until now, it has been a simple fact of life: medical doctors must work in three dimensions, yet medical images generated by CT and MRI scans are captured in two dimensions. "In their minds, doctors have had to reconstruct and consolidate the cross-sectional images into 3-D models," said Ronald Summers, M.D., Ph.D., a clinical radiologist in the diagnostic radiology department at the National Institutes of Health in Bethesda, MD.

That is starting to change. While completing a fellowship at Duke University, Summers saw a demonstration of a "virtual colonoscopy," in which the presenter used virtual reality techniques to examine the colon. "I saw that the approach had real potential for non-invasive diagnostics," Summers said.

The output from the MRI and CT scanners Summers and his colleagues routinely use is uploaded to the Internet. From there, the data sets, which range from 70 MB to 120 MB, are downloaded to Summers' SGI workstation. Summers strips the header information and generates a 3-D uniform lattice to use as input for the software. "With the 3-D model and the Fly Viewer, I can move around and look for lesions of the bronchial wall. I can also look for areas of narrowing, and I can make measurements of bronchial size," he said. "The measurements cannot be made accurately with the bronchoscope used in the standard procedure because the angle is too wide."
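As a rough illustration of the kind of processing described above (this is not Summers' actual pipeline; the array sizes and threshold are invented, and the NumPy library is assumed), cross-sectional slices can be stacked into a uniform 3-D lattice that a viewer can then navigate and measure:

# Rough sketch: stack 2-D cross-sectional slices into a 3-D volume lattice.
# Slice dimensions, slice count, and threshold are placeholder values.
import numpy as np

slices = [np.random.rand(256, 256) for _ in range(120)]   # stand-in scan data
volume = np.stack(slices, axis=0)                         # shape (120, 256, 256)

# A toy "measurement": fraction of voxels above an intensity threshold,
# the sort of quantity a real tool might report for a wall or lesion.
threshold = 0.9
fraction = float((volume > threshold).mean())
print("volume shape:", volume.shape, "fraction above threshold:", round(fraction, 3))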

In addition to "flying through" the data on the screen, Summers can capture the visualization on standard VHS videotape or create 35 mm slides for presentation.

There is no doubt that 3-D visualization presents a compelling visual display of the medical image data. "It is spurring us to think about diagnostic images in a new way. The key is getting more information out of the data."

As another example, Xerox may have failed to capitalize on the revolutionary graphical user interface developed by its Palo Alto Research Center (PARC), but the company is now pioneering a set of 3-D interfaces designed to help users see complex information in more imaginative ways.

InXight, a Xerox company formed in December 1996 to commercialize PARC's innovations, is delivering interface components called VizControls that display large numbers of data objects -- the results of a Web search for instance -- in 3-D formats that help users see the importance of objects' relationships. (Web address: www.inxight.com)

"The problem that we are trying to address is information overload," said Ramana Rao, who is the chief technology officer at InXight. One VixControl, a Hyperbolic Tree, is a navigational aid that displays file structures in the form of a spherical tree. As users rotate the sphere, the perspective on the links changes. "That kind of visual environment makes finding information in a directory tree much easier to do," said John Robb, an analyst at Forrester Research, in Cambridge, Mass.

Another control is the Perspective Wall, a 3-D image of a wall that helps users see relationships between documents measured across time and some other property. For instance, doctors could view patient records to spot disease clusters in geographical regions.

VizControls are being integrated into Web-search engines such as InfoSeek and into decision-support applications such as data-warehousing software from ComShare.

Controls available later this year will include Web Forager, which displays documents using a metaphor based on books and rooms; Table Lens, which can be used to scan large spreadsheets for statistical spikes or anomalies; and Document Lens, which helps users keep track of the main document they are working on.

"We're are trying to take advantage of humans' ability to process information in a variety of ways," said Mohan Trikka, InXight's president.

Whether it is being used to investigate the structure of proteins or the presence of oil in deeply buried rock samples, an innovative 3D display environment at Brookhaven National Laboratory on Long Island, New York, is helping to bring researchers into the heart of their microscopic data.

In the conference-room-like setting of Brookhaven's new 3D visualization center, scientists wearing ordinary 3D visualization glasses are able to explore some of the smallest molecular structures known to man. Their journeys are made possible by a combination of an imaging procedure called computed microtomography, commercial visualization software, projection algorithms, and a rear-projection stereographic display system.

With the new 3D visualization setup, says Brookhaven senior scientist Arnold Peskin, "[the geologists] can use the mouse to actually navigate through the pores and see how well the 'Swiss cheese' is connected, which gives them a better idea as to whether they can get the things on the inside out. A 3D picture just shows holes and blotches."

In addition to examining rock samples, the Brookhaven researchers have used the visualization system to create 3D projection images of protein structures, engineering diagrams ("It's a natural for CAD/CAM," says Peskin), and even the complex design of a collider detector prior to construction.

Another group now using the facility includes doctors who administer radiation treatments for cancerous tumors. "These doctors use one picture that shows the anatomy, including the tumor, then they use another picture that shows the dosage distribution of the nuclear medicine. They superimpose the two pictures to make sure the medicine is where the tumor is, and that it's not destroying healthy tissue," says Peskin.

Because the 3D visualization facility is easy to replicate, Peskin envisions networks of facilities that will allow physicians or scientists to teleconference from remote sites while viewing the same image.

References:

Andleigh, Prabhat K. (1996). Multimedia Systems Design. Upper Saddle River, NJ: Prentice Hall PTR.

Burger, Jeff. (1993). The desktop multimedia bible. Reading, Mass.: Addison-Wesley Pub. Co.

Chorafas, Dimitris N. (1994). Intelligent multimedia databases: from object orientation and fuzzy engineering to intentional database structures. Englewood, N.J.: Prentice-Hall.

Heim, M. (1993). The metaphysics of virtual reality. New York: Oxford University Press.

Iovine, John (1995). Step Into Virtual Reality. New York: Intertext Publications: McGraw-Hill.

Levy, Joseph, R. & Bjelland, Harley. (1995). Create Your Own Virtual Reality System. New York: Intertext Publications: McGraw-Hill.

Pimentel, Ken & Teixeira, Kevin. (1995). Virtual Reality: Through the New Looking Glass. Second Edition. New York: Intertext Publications: McGraw-Hill.

Demo

 1. Describe the capabilities and limitations of three multimedia CD-ROM educational titles from the point of view of your organization. Include a list of references. 

Answer: The following three CD-ROM educational titles are produced by NASA/JPL and are being used by my organization. My analysis contains a short description of each title, followed by its capabilities and limitations. At the end of my answer to this question, I also review some of the key terms and technologies involved with these CD-ROM titles.

Welcome to the Planets (WTP) - CD-ROM Information

CD-ROM Description

NASA, the Planetary Data System, and the Data Distribution Laboratory are providing the educational CD-ROM entitled "Welcome to the Planets" consisting of 190 selected images acquired over approximately 20 years of NASA planetary exploration. Each image is accompanied by information about Solar System bodies and various spacecraft that explored them.

The CD-ROM contains two versions of WTP.

1. The WTP Macintosh and PC Windows standalone version comes complete with narration, images, data and a customized kiosk-style interface.

2. The HTML version allows you to view the WTP pages without a network connection. You can simply place the CD-ROM in your drive, and use your WWW client's 'Open Local' or 'Open File' options to access the pages on the CD-ROM. 
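As a small illustration of the 'Open Local' idea (a hedged sketch only: the drive letter and path below are hypothetical, and Python's standard webbrowser module stands in for the manual menu option), pointing a browser at the CD-ROM's HTML pages uses a file:// URL rather than a network address:

# Open a local HTML page from the CD-ROM in the default Web browser.
# The drive letter and file name are assumed for illustration.
import webbrowser

cdrom_page = "file:///D:/index.htm"      # hypothetical location of the WTP pages
webbrowser.open(cdrom_page)              # no network connection is required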

Hardware Requirements 

"Welcome to the Planets (TM)" for Macintosh computers requires operating System 7.0 or higher, with 8 Mbytes RAM, and a 640 X 480 screen with 8-bit color. WTP.MAC runs directly from the CD-ROM by double-clicking on the program icon in the top level directory of the CD-ROM. "Welcome to the Planets (TM)" for Windows computers requires 8 Mbytes RAM, and a 640 X 480 screen with 8-bit color. To achieve best performance on Windows machines, PC users may run the installation program, PCINSTAL.EXE, which copies the executable to a user-selected subdirectory of the hard disk. After PCINSTAL.EXE has been run, "Welcome to the Planets (TM)" should be run by opening this executable from the hard disk. Most actions are accomplished by pushing buttons, moving sliders, or clicking on an item of interest.

Ordering Information

Planetary scientists may order WTP CD-ROM through the Planetary Data System.

The general public may purchase the Welcome to the Planets CD-ROM through the National Space Science Data Center (NSSDC).

Known Bugs

There are confirmed reports of users not being able to launch external applications (like LVIEW, etc.) when accessing the WTP pages directly off the CD-ROM using NCSA Mosaic. Netscape seems to handle the images correctly. The producers of the CD-ROM are investigating a workaround. These known "bugs" are part of the limitations of this CD-ROM. Other limitations include the lack of availability for so-called "high-end" machines capable of special graphics handling, such as midrange systems and workstations. For better results, even on PC or Macintosh machines, more powerful hardware and more memory are required; the listed requirements are the minimum needed to run the CD-ROM.

The TOPEX/POSEIDON Informational CD-ROM: Perspectives on an Ocean Planet

This CD-ROM provides a comprehensive look at the TOPEX/POSEIDON mission and satellite.

The TOPEX/POSEIDON mission was designed to provide information about the changing topography of the world's oceans which, in turn, helps scientists to understand the ocean's role in the global climate. The TOPEX/POSEIDON satellite was launched in August 1992 and is expected to operate through at least September, 1998. TOPEX/POSEIDON measures the global ocean topography every 10 days.

The CD-ROM contains over an hour of digital video, audio, images, and text captions which describe everything from the impetus for the mission to the science results obtained in the first three years. The CD-ROM is divided into the following seven sections:

 · Mission History and Background

· Mission Description

· Spacecraft

· How the Measurement System Works

· Mission Operations

· What We've Learned

· Science Data.

The CD-ROM is designed for a general audience. It can also be used for educational purposes (the material is targeted for Grades 9 and up). The software and data on the CD-ROM are formatted such that they can be accessed by Macintosh, PC, and other systems. For each of these platforms, the following minimum requirements should be met:

 · Macintosh- System 7.1, 68030 processor, 5MB FREE RAM, 256 color 640x480 pixel monitor, 2X CD-ROM drive - QuickTime 2.1 (included on CD)

· PC- Windows 3.1, 80386 processor, 8MB FREE RAM, - 256 color 640x480 pixel monitor, 2X CD-ROM drive, audio capability, QuickTime for Windows (included on CD)

· Other - WWW browser, 256 color 640x480 pixel monitor, 2X CD-ROM drive, audio capability.

As for the limitations of the product, the same comments as above apply: the listed requirements are very much a minimum, and the more capable the machine, the better and more enjoyable this CD-ROM is. Another limitation is its availability on workstations beyond the Macintosh and PC. Even though most people use such machines, here at JPL there are engineers and scientists who do their work on workstations such as HP, Sun, DEC Alpha, and SGI, to name the most popular. These machines usually require special software to be installed to be able to run a CD-ROM. But, as mentioned above, the performance of such machines, in terms of speed and visualization capabilities, is superior to an 'average' PC or Macintosh.

Geomorphology From Space: A Global Overview of Regional Landforms - CD-ROM Information

CD-ROM Description

"Geomorphology From Space: A Global Overview of Regional Landforms," an out-of-print, limited edition will be available on new media, CD ROM and the Web, in Spring, 1997. Geomorphology From Space, published in 1986, discusses various Earth and planetary landforms and landscapes, including their description, classification, origin, and development, through a plethora of space images.

The book is composed of 237 plates, each treating some geographic region where a particular landform is exemplified. The bulk of the images were taken by the Multispectral Scanner and the Thematic Mapper on the Landsat satellites. The remaining space images consist of photographs taken by the astronauts and radar images. These invaluable images with commentary are being placed on CD ROM and on the Web to serve the high school and college earth science educational community.

The web version will be available in the Education section of the NASA Goddard DAAC Home Page. CDs are available from both the JPL Data Distribution Lab and the Goddard Distributed Active Archive Center (DAAC).

Hardware Requirements

The HTML version allows you to view the Geomorphology From Space pages without a network connection. You can simply place the CD-ROM in your drive, and use your WWW client's 'Open Local' or 'Open File' options to access the pages on the CD-ROM.

PC- Windows 3.1, 80386 processor, 8MB RAM, - 256 color 640x480 pixel monitor, 2X CD-ROM drive, audio capability, QuickTime for Windows (included on CD)

Ordering Information

CDs are available from both the JPL Data Distribution Lab and the Goddard Distributed Active Archive Center (DAAC), as noted above.

All the above comments related to limitations apply in this case, too. However, for anyone who has access to all three CD-ROMs I reviewed above, I can assure him or her that it will be a worthwhile, uncommon experience. Outer space and the related technology are indeed the ultimate frontier! And CD-ROM technology is still an inexpensive way to experience it and to make it popular and accessible to many people, now that a PC or Macintosh has become affordable even to the casual buyer.

CD technology continues to consolidate its place in the information storage market as CD towers and jukeboxes offer advantages in speed, configurability and connectivity. In early 1997, manufacturers were enthusiastically touting 8X drives as standard in some models; just three months later, 12X drives are taking over. Those looking for replacement 4X drives will have to rely on existing inventory, because new ones are no longer being made. Manufacturers are also offering a wider range of configuration options in their systems. For example, a tower might include a CD recorder and a minichanger in the bays to complement the CD-ROM drives. Features such as mail slots, hot-swappable drives and intelligent magazines make the systems easier than ever to use. Finally, connectivity enhancements, via a combination of hardware and software, are allowing CD storage technology to perform effectively in a heterogeneous environment of PC, Macintosh and UNIX operating systems, and to connect directly to networks and the Internet.
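To put the drive-speed figures in perspective, here is a back-of-the-envelope calculation (assuming the usual 1X CD-ROM data rate of roughly 150 KB per second and a 650 MB disc; exact figures vary by drive and disc):

# Approximate CD-ROM throughput and full-disc read time at various speeds.
BASE_KB_PER_SEC = 150          # rough 1X data rate
DISC_MB = 650                  # typical CD-ROM capacity

for speed in (4, 8, 12):
    rate = speed * BASE_KB_PER_SEC                 # KB per second
    minutes = DISC_MB * 1024 / rate / 60           # time to read the whole disc
    print(f"{speed:>2}X: about {rate} KB/s, full disc in roughly {minutes:.0f} minutes")

By this estimate a 12X drive moves data about three times as fast as the 4X drives that are disappearing from the market, which is why manufacturers highlight the speed rating so prominently.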

CD publishing is becoming automated by the incorporation of recorders and integrated disc printers into jukeboxes. For all the delay and confusion that has surrounded DVD (Digital Video Disk), manufacturers are optimistic about incorporating it into their systems as soon as mass production begins.

The other rapidly emerging CD-based technology is COLD (Computer Output to Laser Disc). Poorly understood when it first came onto the scene, COLD had as its first goal simply to replace thousands to millions of feet of microfilm. That is to say, rather than sending millions of documents to microfilm, the goal was to store the bulky information on high-capacity optical drives while maintaining the document indexes in databases on faster magnetic storage.

And the third path of evolving CD technology is the merging of the widely known advantages of CD-R (CD-Recordable) with COLD. CD-R is quickly gaining popularity as an archival and distribution medium for COLD.

Bibliography

Buddine, L. and Young, E. (1995). The Brady Guide to CD-ROM. New York, NY: Brady Books.

Lambert, S. and Ropiequet, S. (1994). CD-ROM: The new papyrus. Redmond, Washington: Microsoft Press.

Lambert, S. and Ropiequet, S. (1995). CD-ROM: Optical publishing. Redmond, Washington: Microsoft Press.

Oberlin, S. and Cox, J. (1996). Microsoft CD-ROM Yearbook. Redmond, Washington: Microsoft Press.

The 24 Hours in Cyberspace book and CD-ROM by Que(R) of Macmillan Computer Publishing. ( http://www.cyber24.com).

2. Please read about and learn about CU-SeeMe. Connect to at least 3 sites where there is activity. For each site, write up the following:

- IP address (and location if known)

- How many others were connected and what was going on at their site

- How many FPS and KBPS were supported by the connection

- Any text shown in the windows

- An overview of any audio received.

Alternate Assignment: find 3 interesting multimedia web sites. Write up a summary of how they use multimedia effectively. Include the URL of the site. Some sample sites are: http://www.bev.net, http://www.mediadome.com, http://www.lotus.com/screencam.

Answer: I will attempt to cover both suggested questions for this assignment. First, I will partially cover the initially suggested assignment. I am still not sure I understand the question. However, here are some findings on CU-SeeMe.

Part 1. I first encountered CU-SeeMe about 3 years ago. One of the universities I was teaching for as an adjunct faculty member (West Coast University, Los Angeles, CA) was experimenting with a program called RIISE (Remote Interactive Instructional Student Environment). It was simply a way to try an alternative approach to delivering education: an exploration of existing technology, but with limited resources (West Coast is not a large school). Furthermore, they have locations (regional centers) throughout Southern California. For easily understood economic reasons, they wanted to offer the same class each term (every 9 weeks) simultaneously to their 4 or 5 centers, rather than hiring one instructor for each center. I went through this experiment several times. I was lecturing (on computer science topics) in front of an empty classroom, just in front of a camera and microphone installed on a Macintosh running an early version of CU-SeeMe. The students were either in the same building, in a specially designed area of cubicles (a lab), or remote at other academic centers. The experiment usually did not go well, mostly for two reasons: the technology employed (it required expensive equipment) and the topic (computer science/math, where plenty of "derivation" goes on at the board, and therefore little discussion or talk occurs). I had a monitor for each student (local or remote) so I could see them. Each student had a monitor and a speaker so they could see and talk to me. The image as well as the sound broke off quite often, especially at remote locations. There were usually 10 students in the class; when more than one tried to talk, the technical difficulties showed up again. Also, the students could not see each other, so there was little interaction beyond listening to one another. We used typical telephone lines, so not much throughput was available. As said, the image quality was poor (even though it was in color) and not really on-line, real-time, but rather still frames that updated once every 5 to 10 seconds. Audio was probably the best-working of all the "multimedia" components. We knew after a couple of experiments that an ISDN backbone, supported by FDDI (fiber optics), would make a difference. But the economics overcame us.

Now, a couple of words and examples about newer versions and more successful implementations of CU-SeeMe.

Enhanced CU-SeeMe is White Pine's (http://www.wpine.com) desktop videoconferencing software for real time person-to-person or group conferencing. CU-SeeMe was originally developed at Cornell University and is widely available on the Internet. It is the only true multipoint program among many other stand-alone packages. You can use CU-SeeMe over the Internet or any TCP/IP network, giving you the power to communicate globally without expensive hardware. This software-only solution runs on both Windows and Macintosh computers, offering full-color video, audio, chat window, and white board communications. You can participate in 'Live over the Internet' conferences, broadcasts or chats. CU-SeeMe can be launched directly from Web pages with your favorite Web browser. All of this and more over your 28.8k modem, ISDN link or better.

For audio-only telephony use, CU-SeeMe works effectively over a 14.4-Kbps modem. White Pine's Enhanced CU-SeeMe is the leading desktop video conferencing software solution providing group conferencing over the Internet or other TCP/IP networks. Whether you are an instructor conducting a training class, a business manager communicating with a customer, or someone wanting to speak with a friend, Enhanced CU-SeeMe is a simple, software-only solution. Enhanced CU-SeeMe requires a notebook with at least 8MB of RAM and 10MB of hard-disk space. The CD-ROM disc includes both a 16-bit version for Windows 3.x and a 32-bit version for Windows 95/NT. For remote connections, the software requires a 14.4-Kbps modem for audio only, a 28.8-Kbps modem for audio and video, and PPP software, which is included on the CD-ROM. For direct connections, the software also supports Ethernet and ISDN. It will work with any digital camera and video digitizer. For audio capabilities, a microphone, sound card, drivers, and speakers are needed.
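The connection figures above suggest a simple way to think about what a given link can carry; the following toy check (the function name and tiers are my own paraphrase of the quoted requirements, not part of the product) maps a link speed onto the kind of CU-SeeMe session it can support:

# Toy capability check based on the figures quoted above:
# roughly 14.4 kbps for audio-only use, 28.8 kbps or better for audio and video.
def cuseeme_capability(link_kbps):
    if link_kbps >= 28.8:
        return "audio and video conferencing"
    if link_kbps >= 14.4:
        return "audio-only telephony"
    return "below the stated minimum"

for link in (9.6, 14.4, 28.8, 128.0):     # common modem and ISDN speeds, in kbps
    print(f"{link:>6} kbps -> {cuseeme_capability(link)}")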

One application of White Pine's CU-SeeMe videoconferencing software I found was at The Bailey Group (http://baileygrp.com), which is revolutionizing nursing homes with the use of this software. Its Visual Care offering links residents of nursing homes with their families through the World Wide Web or Internet using CU-SeeMe videoconferencing software. "We are pleased CU-SeeMe videoconferencing software is being used for applications such as Visual Care," said Brian Lichorowic, Vice President of Marketing at White Pine. "It is people that make the Internet so great, a global medium where families and friends can stay in touch at a minimal cost. We applaud the Bailey Group's effort to pioneer this intriguing and important use of CU-SeeMe."

As another example, the National Hockey League (NHL), announced the Cybercast of the 1997 NHL ICE All-Star pre-game activities. In a move to provide live Internet coverage and real time communication amongst fans and players, the NHL is using White Pine's Enhanced CU-SeeMe videoconferencing software to heat up the event. "We wanted to provide Internet coverage to fans worldwide. The logical choice was Enhanced CU-SeeMe. It will cybercast the event, and it will allow fans to interact with each other and the celebrities," stated Charlie Schmitt, Chief Executive Officer of NHL Ice Hockey. "Fans will be excited at the opportunity to participate in the pre-game party, watch the concert and interact with the players."

"Enhanced CU-SeeMe continues to be used for various cybercasts and videoconferences. We are pleased to be working with the NHL for this event," stated Brian Lichorowic, Vice President of Marketing at White Pine. "We are encouraging Internet users to download Enhanced CU-SeeMe for free and participate in the NHL cyber-party!

And finally, NTT International Corporation (NTTI), a subsidiary of Nippon Telegraph and Telephone Corporation (NTT), the largest telecommunications company in Japan, announced joint plans with White Pine for technical collaboration and focused distribution of Enhanced CU-SeeMe videoconferencing software. "We are excited NTTI selected White Pine as a technology partner for videoconferencing over the Internet and corporate Intranets," according to Carl Koppel, Vice President of Sales at White Pine. He continues, "NTT, NTTI's parent company, is one of the world's prominent communications leaders. These companies will benefit from joint collaboration and technology. As a matter of fact, White Pine is already involved in field trial projects with NTTI and NTT in Japan to deliver the Enhanced CU-SeeMe technology to large corporations and government agencies throughout Japan."

Part 2. Here is my assessment of the use of multimedia capabilities at the three suggested Web sites.

a) http://www.bev.net. This is a simple but information-rich Web site (an Electronic Village, as it calls itself). It does not have any special capabilities: simple graphics, search capabilities, a frame-based structure; pretty much all the common components of a modern Web page. The most useful part I found is the ACE Educational Seminar area, which contains numerous interesting links. Some interaction is provided through searching for a specific topic and through on-line membership sign-up. Other nice features I found were maps, local weather and news from the Blacksburg, Virginia geographic area. The site covers various areas of the community: Arts, Organizations, Religion, Sports, Education, People, Government, etc., which can be useful to any visitor. I found the navigation through the various options quite easy. The performance (response time) was quite good for a Friday around 6:00 P.M. PST.

b) http://www.mediadome.com. This Web site was quite impressive: a truly multimedia site that combines plenty of graphics, animation and useful information. In particular, I found the pages on the Titanic Legend, Sites, Ship Disasters, etc. interesting. Also, the links to Intel's site, with information on their latest microprocessor, the Pentium II, were very informative, accompanied by animation and great graphics. Other impressive capabilities were the way you submit e-mail to subscribe to various services and the survey to be completed. The comics and games area provided quite a bit of animation and use of multimedia capabilities. In my opinion it is a complete multimedia site, with offerings of movies, music and animation, and informative content on technology (Intel's chips) and various personalities. The name they chose for their site, Mediadome, could not be more appropriate!

c) http://www.lotus.com/screencam. Yet another truly multimedia site! And that is to be expected, coming from such a respected corporation as Lotus (now part of IBM). Again, plenty of animation, graphics and, especially useful, educational information. However, the emphasis here is on the commercial aspect: offering products for sale or trial (demos, free downloads, etc.), especially Lotus's flagship products, SmartSuite '97 and Domino. I found the ScreenCam product, which I had read about in various magazines, quite interesting, and now I had an opportunity to take a closer look at it.

Technical

1. What are some problems associated with the implementation of distributed networked multimedia applications? Illustrate your explanation with some examples from your workplace.

Answer: Any distributed (networked) system-based application suffers from a general issue: complexity! Neither very expensive hardware nor very sophisticated (and expensive) software will necessarily solve it; perhaps only within given conditions, and for a limited time!

Multimedia is a very resource-demanding technology even for a stand-alone desktop machine. When it needs to run in a distributed environment, the complexity and the hardware/software needs do not just skyrocket; they follow a very aggressive curve.

A first, and relatively cheaper, solution for networked and distributed multimedia applications is the implementation of the latest state-of-the-art technology offerings, such as ATM (Asynchronous Transfer Mode). ATM is the only technology at this point, although still not widely affordable (read: expensive), that can offer the level of throughput, speed, security and performance required by such a complex requirement: distributed multimedia applications.

In my workplace, some multimedia applications already have been adapted to an ATM network infrastructure. These include desktop-video collaboration, distance learning, news and entertainment video distribution (video-on-demand), multimedia kiosks, and medical imaging.

The following are some technical details, both of a general nature and of a practical one, on how it was implemented.

Desktop-video collaboration (DVC) is the fastest growing multimedia application. It includes application sharing and videoconferencing. DVC significantly improves worker productivity by streamlining the decision-making process, thereby reducing time to market for new products and services. Face-to-face meetings by key decision makers, that today often require transoceanic air travel and considerable downtime, are being replaced by high-quality videoconferences from desktop to desktop.

Delivering multimedia applications to employees' desktops, however, is a big networking challenge. Today's computer networks have been optimized for bursty data traffic. Multimedia applications, however, rely upon continuously flowing streams of compressed digital audio and video information. The streaming nature of multimedia applications is at odds with the contention schemes employed by today's computer networks. When the network becomes busy, everything slows down, creating bottlenecks for real-time audio and video.

Distance learning and remote classroom applications extend desktop videoconferencing over a metropolitan or wide-area network. An instructor in one location can teach classrooms of people in remote locations. Typically, the classrooms are equipped with two-way communications so the lessons can be interactive. High-quality video images, low-delay, and point-to-multipoint capabilities are required. In addition, a video archive is often required to replay recorded classes for people who were unable to attend the live class.

Broadcast-quality video delivery bridges the gap between content providers and consumers of professional video programming (such as television stations and cable operators). Video-content providers can upload their video programs directly into the video-information provider's (VIP) video archiver, opening up a vast market of prospective customers for the content provider. A VIP network can provide a single, convenient source of digital broadcast video to multi-service operators (MSOs), replacing a multitude of other sources of video information such as satellite and microwave feeds, couriers, and express mail delivery.

The VIP network can deliver broadcast-quality video. It gives television programmers and news organizations unprecedented control, speed, and economy in obtaining video programming such as late-breaking news stories or live news coverage from around the world. Producers and programmers can search the VIP's video archiver using key words and preview the video program before deciding to download it. Downloading video programming can even occur in real time, going directly into an MSO's head-end in the time slot allocated for the program.

What makes addressing the ATM question in monolithic terms so difficult is the fact that ATM has potential in so many different markets. These include carriers and CATV multiservice operators (MSOs), Internet service providers (ISPs) and enterprise networks. For the carrier/MSO market, the new rules of competition (i.e., global trends toward privatization/liberalization and telecom reform) are a potent business driver. For the ISP market, today's business driver is solving the bandwidth bottlenecks occurring in the WAN-backbone infrastructure. For enterprise network users, supporting the corporation's strategic competitive advantage is the business driver. Many companies, including the company I work for right now, JPL, have well-advertised success stories using ATM to support both multimedia (e.g., videoconferencing and CAD/CAM) and high-throughput data applications. These visionary companies are redefining their respective industries with business applications enabled by ATM. Other enterprise network users are being driven to ATM solely to solve the bandwidth bottleneck in their campus backbones.

ATM protagonists have been barraged by criticism of the technology since its beginning. The challenge is to address these criticisms, and provide effective solutions. Although ATM provides the best price/performance when looking at the cost per bit per second, product costs must be lowered. ATM adopters are making significant gains in cost reduction and, as a result, ATM desktop solutions are now approaching the costs of switched Ethernet.

There is still room for improvement, however. ATM services must have reasonable tariffing. Carriers are reluctant to commit to public pricing until ATM is functioning in their networks and they have gained experience with customers in the ATM environment. Mainstream users must be convinced that ATM produces cost and performance benefits. Users must be assured of guaranteed interoperability of ATM products and services and they must understand the migration path.

Underlying these business challenges are the technical challenges: interoperability, interworking with legacy systems, and backward compatibility. ATM Forum initiatives are making the public aware of ATM successes, ATM specifications are helping to guarantee interoperability and simplify migration paths, and ATM solutions are becoming more widespread and less expensive.

Implementation of video applications requires great networking speed. The network bandwidth requirements are an order of magnitude greater than for data alone. For example, a real-time, full-screen, full-motion color MPEG2 video stream requires from 4 to 60 MBPS of network bandwidth, depending on the required quality level. This clearly exceeds the available bandwidth on most existing shared-media networks. For non-real-time image applications, digital video files can be in the 10 to 20 Mbytes range, or as high as 100 Mbytes per minute of playback time, depending on the compression scheme, window and frame size, and bit depth.
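The storage figures follow directly from the bit rate; a quick conversion (using the MPEG rates cited in this section, and ignoring container overhead) shows how fast video fills a disk:

# Convert a video bit rate (in megabits per second) into storage per minute.
def mbytes_per_minute(mbps):
    return mbps * 60 / 8          # megabits/s * 60 s, divided by 8 bits per byte

for label, rate in [("MPEG1, VHS quality", 1.5),
                    ("MPEG2, low end", 4.0),
                    ("MPEG2, high end", 60.0)]:
    print(f"{label:>18}: about {mbytes_per_minute(rate):6.1f} MB per minute of video")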

One of the most important factors affecting the quality of video delivery is end-to-end or absolute delay. While delays greater than 200 milliseconds are annoying, a delay greater than 400 msec is intolerable. Since most video coders/decoders (codecs) require between 50 and 250 msec to perform compression and decompression, the wide-area connection must introduce less than a 50 msec end-to-end delay to be considered usable for real-time video applications.

In addition to absolute delay, delay variation or "jitter" is also important in determining reliability in delivering real-time video. Jitter not only interferes with visual quality, but can also contribute to an irritating lack of synchronization between audio and video streams. While delay variations in excess of 500 microsec are considered annoying, variations in excess of 650 microsec are intolerable. A "video-enabled" network must deliver a continuous stream of data that arrives at the destination at a fixed rate, even as the network becomes heavily loaded with multiple users and video streams. Currently, only ATM offers the bandwidth guarantee and quality of service (QoS) required for real-time video applications.
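The delay and jitter thresholds quoted above can be combined into a simple budget check; the sketch below (thresholds taken from this section, sample figures invented) classifies a codec-plus-network combination the way the text does:

# Delay/jitter budget check using the thresholds quoted in the text:
# end-to-end delay above ~200 ms is annoying and above ~400 ms intolerable;
# jitter above ~500 microseconds is annoying and above ~650 intolerable.
def assess(codec_ms, network_ms, jitter_us):
    total = codec_ms + network_ms
    if total > 400 or jitter_us > 650:
        verdict = "intolerable"
    elif total > 200 or jitter_us > 500:
        verdict = "annoying"
    else:
        verdict = "acceptable"
    return f"{total} ms end-to-end, {jitter_us} us jitter -> {verdict}"

print(assess(codec_ms=100, network_ms=50, jitter_us=300))    # acceptable
print(assess(codec_ms=250, network_ms=50, jitter_us=600))    # annoying
print(assess(codec_ms=250, network_ms=200, jitter_us=300))   # intolerable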

ATM is emerging as the networking technology of choice for multimedia applications because it can support high-bandwidth communications with low latency. In addition, ATM offers the capacity to statistically multiplex available bandwidth from one stream to another and one address to another. Although great progress has been made in recent years in the area of video compression standards, there are still many proprietary video-encoding techniques on the market. For example, many of the LAN-based videoconferencing applications use proprietary techniques that have been optimized for the LAN and are not well-suited for wide-area connections. MPEG2 is emerging as the leading standard for high-quality video applications such as video broadcasting and video entertainment distribution. MPEG video/audio compression standards are defined by the ISO. While MPEG1 delivers VHS quality in 1.5 to 2.0 MBPS, MPEG2 can deliver up to theater quality using from 4 to 60 MBPS. The MPEG2 encoding process requires sophisticated and expensive technology (an encoder can cost $50,000) and introduces a delay of hundreds of milliseconds. As a result, it is unlikely that MPEG2 will be used for desktop videoconferencing in the near future.

To address the requirements of real-time video transmission over ATM, the ATM Forum has endorsed using the real-time AAL5 variable bit rate (VBR) class of service, even though an MPEG2 stream actually consists of fixed, 188-byte packets running at a constant bit rate (CBR). Real-time VBR was chosen over CBR since MPEG2 already has its own time base, the Program Clock Reference, in its transport stream and does not need the time stamp provided in the AAL1 CBR class of service. Furthermore, AAL5 is more efficient than CBR since it has lower overhead, and the full 48 bytes per cell of payload are usable.
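A back-of-the-envelope look at the cell arithmetic shows why packing matters (this assumes the standard 8-byte AAL5 trailer and the 48-byte cell payload mentioned above; the packing choices themselves are illustrative, not a statement of what any particular product does):

# How many 48-byte ATM cell payloads does an AAL5 frame of 188-byte MPEG2
# transport-stream (TS) packets need? AAL5 adds an 8-byte trailer and pads
# the frame out to a whole number of cells; each cell is 53 bytes on the wire.
import math

CELL_PAYLOAD, CELL_TOTAL, AAL5_TRAILER, TS_PACKET = 48, 53, 8, 188

def cells_for(ts_packets):
    frame = ts_packets * TS_PACKET + AAL5_TRAILER
    cells = math.ceil(frame / CELL_PAYLOAD)
    efficiency = ts_packets * TS_PACKET / (cells * CELL_TOTAL)
    return cells, efficiency

for n in (1, 2):
    cells, eff = cells_for(n)
    print(f"{n} TS packet(s) per AAL5 frame: {cells} cells, "
          f"{eff:.0%} of wire bytes carry video")

Packing two transport packets per frame fills the cells exactly (376 + 8 = 384 = 8 x 48 bytes), which is one reason AAL5's low overhead pays off compared with the cell-by-cell overhead of AAL1.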

Using real-time VBR network connections, the proper amount of ATM network bandwidth can be allocated using QoS parameters to ensure reliable transmission of each video connection. Using real-time VBR, jitter and latency can be explicitly specified per user contract for each video connection, along with the allowable cell-loss and cell-error rates. Although video is very sensitive to jitter and latency, it is more tolerant of cell loss and errors since the decoding process can easily compensate for lost picture frames.

JPEG, a predecessor of MPEG, is an International Telecommunication Union (ITU) encoding standard for still pictures that was adapted to handle video, resulting in Motion-JPEG. A Motion-JPEG video stream actually consists of a sequence of JPEG pictures at a rate of 24 or 30 frames per second. At these rates, however, each picture is very much like the picture immediately preceding or following it, a redundancy Motion-JPEG cannot exploit, so compression is less than optimal. A Motion-JPEG video stream occupies from 10 to 20 MBPS. MPEG, on the other hand, performs interframe compression by computing some frames from other frames, reducing the amount of information that needs to be sent.
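The intraframe-versus-interframe distinction can be illustrated with a toy difference coder: Motion-JPEG codes every frame independently, while MPEG predicts some frames from others. Real MPEG uses motion-compensated prediction and DCT coding, so the sketch below is only an analogy.

```python
def delta_encode(previous_frame: list[int], current_frame: list[int]) -> list[int]:
    """Toy interframe coding: transmit only the pixel differences from the
    previous frame instead of the whole frame."""
    return [cur - prev for prev, cur in zip(previous_frame, current_frame)]


previous = [10, 10, 10, 12, 12]
current = [10, 10, 11, 12, 12]
print(delta_encode(previous, current))  # [0, 0, 1, 0, 0] -- runs of zeros compress well
```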

Today, the majority of desktops in large enterprises are equipped with two types of communications terminals: the PC and the telephone. They are supported by two entirely separate network connections, the LAN and the PBX, each providing a single service. The telephone network delivers bi-directional streams of data at 64 KBPS (typically digitally encoded voice), while the LAN delivers packets of data on a best-effort basis. The technologies behind the telephone network and the LAN are very different. Each has evolved over the years, but both continue to provide essentially the same single service they have always provided.

From the beginning, ATM was designed to integrate voice, video, and data communications over a single network. Since early 1996, ATM has been able to provide practical, standardized solutions for data networking, including transparent interoperability with Ethernet and token-ring LANs and support for existing LAN-based applications. Standards that allow voice and video applications to run over ATM are expected by the end of 1996. While it is clear that ATM is the best solution for integrating voice, video, and data throughout the enterprise, today's enterprise networks are based on older technologies such as Ethernet and token ring, typically linked by routers and running internetworking protocols such as the Internet Protocol (IP). IP originated as a pure data communications protocol and was never designed to handle real-time voice and video. Upgrading router-based internetworks to carry voice and video requires radical surgery, and even then the quality of communication would never match that of the public telephone network. By contrast, ATM was designed from the outset to handle real-time video and voice as well as bursty data traffic; both are carried as payload directly in ATM cells. There is no need for IP to define network-layer addressing, since ATM has its own addressing scheme with a far larger address space than that of IP. ATM's support for point-to-multipoint switched virtual circuits (SVCs) also handles the requirements for multicasting voice and video far more elegantly and completely than IP's multicast protocol: an ATM network user can elect to receive any number of multicast streams, and receive only selected streams. ATM solves the multicast problem at Layer 2 and requires no Layer 3 protocol for multicasting voice and video.

The current computer/desktop communications focus is on data, but emerging applications include a much wider range of requirements that current solutions cannot meet. Multimedia, multi-application requirements with guaranteed quality of service and bandwidth will drive product and service offerings toward ATM. Although equipment technical specifications have been developed to implement ATM on the desktop, the "convergence" software or application program interfaces (APIs) that make applications aware of ATM are lagging behind. The ATM forum's API group is working with other industry organizations to adapt their APIs to ATM based on the ATM forum's API specification, the Native ATM Service Semantic Description.

Although detractors have said that there really aren't any applications driving ATM demand, applications that demand bandwidth and/or quality of service for a mix of real-time (delay-sensitive) and non-real-time applications are emerging. These include medical imaging; CAD/CAM file transfer; interconnecting high-volume, geographically dispersed LANs; server farm support; ATM backbones to improve performance of IP intra- and internetworks, and enterprise networks. The pace will really start to pick up with products and services based on the ATM forum's Anchorage Accord.

As far as private enterprise networks are concerned, this area has historically exhibited the most urgent need and has seen the most dramatic changes. Many LAN switch/hub vendors and players in the LAN and WAN markets have reported tremendous sales. More information about growing sales and applications is available from the ATM forum.

In the area of public networks, although the public carriers have seemed quiet on ATM, they have been planning and installing the network equipment needed before ATM services could be offered, while waiting for regulatory changes to take place around the world. ATM-based services are available today from 14 carriers in selected markets in the United States, and even more are becoming available globally. Services are primarily PVCs, with SVC services starting later this year. Carriers have not disclosed many details of their plans, first because they were waiting for closure on telecom reform, and second because a large number of key ATM forum specifications have only recently been completed or will be completed shortly. Why is this important? Many gaps in the standards and implementation agreements had to be addressed before ATM network solutions could be made available, and the ATM forum is working diligently to fill them.

Early video-on-demand trials to bring ATM to the home have been completed. Trial costs were high and user interest was weaker than service providers would have liked. Since then, use of the Internet has grown significantly. World Wide Web sites have become increasingly rich in still images and video animation. Internet users are increasingly frustrated downloading Web pages and need a bigger bandwidth pipe to the home. Enabling technologies like asymmetric digital subscriber line (ADSL) allow ATM to be brought to the home over the existing copper plant, breaking the bandwidth bottleneck for Internet access. These facilities can also support video-on-demand in the future. Meanwhile, technical work continues on defining the in-home ATM LAN (HAN).

ISPs are seeing significant growth in the numbers of residential and corporate customers, and many customers are demanding T1 and higher rate access to the Internet. ISPs are making significant investments to upgrade access capabilities and the aggregate backbone capabilities of their networks using ATM. For example, one leading provider just announced it will be upgrading its Internet backbone from 155 MBPS to 622 MBPS ATM over the next six months.

2. Explain the term Quality of Service (QoS). Why is the QoS architecture an important step in the creation of distributed multimedia systems?

Answer: Network Performance (NP) and Quality of Service (QoS) fall under a larger umbrella called Network Management (NM). Network performance is tightly related to network traffic control, particularly within a distributed multimedia applications environment. As the load on a network increases, a region of mild congestion is reached, where the queuing delays at the nodes result in increased end-to-end delay and a reduced capability to provide the desired throughput. When a point of severe congestion is reached, the classic queuing response results in dramatic growth in delays and a collapse in throughput. Clearly these catastrophic events must be avoided, which is the task of congestion control. The object of all congestion control techniques is to limit queue lengths at the frame handlers so as to avoid throughput collapse.

There are three general traffic control mechanisms: Flow Control, Congestion Control, and Deadlock Avoidance. Flow Control is concerned with the regulation of the rate of data transmission between two points. The basic purpose of flow control is to enable the receiver to control the rate at which it receives data, so that it is not overwhelmed. Typically, flow control is exercised with some sort of sliding-window technique.
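As a concrete illustration of the receiver-controlled pacing described above, the following is a minimal sliding-window sketch; the class and its window size are illustrative and not taken from any particular protocol.

```python
class SlidingWindowSender:
    """Toy sliding-window flow control: at most `window` unacknowledged
    packets may be outstanding at any time."""

    def __init__(self, window: int):
        self.window = window
        self.next_seq = 0   # next sequence number to send
        self.base = 0       # oldest unacknowledged sequence number

    def can_send(self) -> bool:
        return self.next_seq - self.base < self.window

    def send(self) -> int:
        assert self.can_send(), "window closed; wait for an acknowledgement"
        seq, self.next_seq = self.next_seq, self.next_seq + 1
        return seq

    def ack(self, seq: int) -> None:
        """Cumulative acknowledgement: everything up to `seq` was received."""
        self.base = max(self.base, seq + 1)


sender = SlidingWindowSender(window=3)
print([sender.send() for _ in range(3)])  # [0, 1, 2]
print(sender.can_send())                  # False -- the receiver controls the pace
sender.ack(1)
print(sender.can_send())                  # True -- two more packets may now be sent
```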

A quite different type of traffic control is referred to as Congestion Control. The objective here is to maintain the number of packets within the network below the level at which performance falls off dramatically. A problem equally serious to that of congestion is Deadlock, a condition in which a set of nodes are unable to forward packets because no buffers are available. This condition can occur even without a heavy load. Deadlock avoidance techniques are used to design the network in such a way that deadlock cannot occur.

The ATM standard promises to offer quality of service (QoS) connections from end to end. Although many ATM product vendors promise this remarkable feature, few have delivered on their promises. Recently, there has been much talk about similar protocols over Ethernet, which may have some people wondering why they should bother to invest in ATM. The Resource Reservation Protocol (RSVP) and Real-time Transport Protocol (RTP) let applications request that an amount of bandwidth be set aside for a given application. As traffic starts to flow on a frame-based network, data is queued in a first-in, first-out (FIFO) buffer. Since the size of an Ethernet frame can range from 64 bytes to about 1,500 bytes, there is no guarantee that the rate of delivery will be consistent. If two bandwidth-reserved frames arrive at an Ethernet switch at the same time, the one entering the buffer first will be transmitted first. Because the whole frame must be transmitted, the second stream has to be buffered behind it.

Contrast this technology with a cell-based switch. With 53-byte ATM cells, multiple priority queues can be set up to allow different classes of traffic to be balanced while maintaining a constant rate of flow. Because the queues can be serviced independently, individual cells can be serviced alternately, allowing the ATM switch to maintain true QoS to both destinations.

With Ethernet, the entire frame must be sent before the second queue can be serviced. Thus, though bandwidth has been reserved, true quality of service is lost because there is no fixed frame size for a given packet.
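The contrast between frame-based FIFO queuing and per-stream cell queues can be illustrated with a toy scheduler. This is only a sketch under simplified assumptions (unit-sized "cells", strict round-robin service); real ATM switches use more sophisticated scheduling.

```python
from collections import deque


def fifo_frame_order(frames):
    """Frame-based FIFO: (stream, size) entries go out strictly in arrival
    order, so a large frame blocks everything queued behind it."""
    return [stream for stream, _size in frames]


def round_robin_cells(per_stream_queues):
    """Cell-based switch: per-stream queues are serviced one cell at a time,
    so streams interleave at a fine (53-byte) granularity."""
    queues = {name: deque(cells) for name, cells in per_stream_queues.items()}
    order = []
    while any(queues.values()):
        for name, q in queues.items():
            if q:
                order.append((name, q.popleft()))
    return order


# Stream A's 1500-byte frame arrived just ahead of stream B's small frame ...
print(fifo_frame_order([("A", 1500), ("B", 64)]))        # ['A', 'B'] -- B must wait
# ... whereas with per-stream cell queues, B's cell goes out between A's cells.
print(round_robin_cells({"A": ["a1", "a2", "a3"], "B": ["b1"]}))
```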

Quality of Service (QoS) over ATM is broken up into several different classes. Rather than attempting to grab a certain amount of bandwidth, ATM devices can negotiate for the best-available rate. As network dynamics change, the device can renegotiate its connection for more or less bandwidth, as necessary. These two factors give ATM a clear edge over Ethernet QoS protocols.

As mentioned above, in a distributed multimedia applications environment one of the critical issues is the overall performance of the integrated environment, which is what Quality of Service (QoS) attempts to capture. The performance, reliability, and productivity gains actually achieved by a company will determine the usefulness of the "tool" to network users. High quality is required before the tool becomes useful and produces the desired behavior change that improves employees' productivity. Meanwhile, system reliability is critical: if video connections experience frequent dropouts during high network usage, employees will be unlikely to depend on the tool for anything other than unimportant meetings. The underlying capabilities of the video transport and switching system must support these stringent requirements; this is key to the system's reliability. ATM can deliver the high bandwidth and low delay required by video collaboration, but it can only do so with the following capabilities:

Quality of Service Guarantees per Video Stream. The ATM wide-area switch should support traffic policing for individual video streams to ensure that each stream receives the required quality of service across the network. Jitter and latency should be explicitly specified per user contract for each video stream, along with the allowable cell-loss and cell-error rates. Per-stream traffic policing that relies upon a sophisticated queuing and buffer management architecture should be used.
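Per-stream policing of this kind is often modelled as a token bucket (ATM's formal mechanism is the Generic Cell Rate Algorithm). The sketch below is a simplified illustration, not the GCRA itself, and the rate and burst parameters are hypothetical.

```python
class TokenBucketPolicer:
    """Toy per-stream policer: cells that conform to the contracted rate pass,
    excess cells are flagged for tagging or discard. (ATM's formal mechanism
    is the Generic Cell Rate Algorithm; this is only a simplified model.)"""

    def __init__(self, cells_per_second: float, burst_cells: float):
        self.rate = cells_per_second
        self.capacity = burst_cells
        self.tokens = burst_cells
        self.last_arrival = 0.0

    def conforms(self, arrival_time: float) -> bool:
        elapsed = arrival_time - self.last_arrival
        self.last_arrival = arrival_time
        self.tokens = min(self.capacity, self.tokens + elapsed * self.rate)
        if self.tokens >= 1.0:
            self.tokens -= 1.0
            return True
        return False  # non-conforming cell: tag or discard


policer = TokenBucketPolicer(cells_per_second=10, burst_cells=2)
print([policer.conforms(t) for t in (0.00, 0.01, 0.02, 0.03)])  # [True, True, False, False]
```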

Per-Video-Stream Buffering. The ATM wide-area switch should support per-video-stream buffering to ensure fairness to all video streams and to firewall one stream from another. Per-stream buffering allows the network to sustain extended bursts from one stream without affecting the level of performance delivered to other streams.

Multiple ATM Service Classes for Video. The ATM wide-area switch should support various service classes for video—from CBR with minimal cell delay variation to VBR with less stringent cell delay. While the ATM forum has endorsed using the real-time AAL5 variable bit rate class of service for real-time video streams, non-real-time transfer of video streams should utilize the available bit rate (ABR) service class to use network bandwidth efficiently. The ATM switch should allocate unused, spare bandwidth for ABR video files—bandwidth that is not needed by the VBR video streams. The ATM switch should maintain firewalls between each service class to ensure fairness and prevent misbehavior in one class from affecting the performance of another class.
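As a rough illustration of how spare bandwidth might be handed to ABR traffic once CBR and VBR reservations are honored, the sketch below divides the leftover capacity equally among ABR connections; real ABR uses closed-loop rate feedback rather than a static split, so this is only a simplification.

```python
def abr_share_mbps(link_mbps: float, cbr_mbps: float, vbr_mbps: float,
                   abr_connections: int) -> float:
    """Bandwidth left over after CBR and VBR reservations, split equally
    among ABR connections (a static simplification of ABR's feedback loop)."""
    spare = max(0.0, link_mbps - cbr_mbps - vbr_mbps)
    return spare / abr_connections if abr_connections else 0.0


# A 155 MBPS link carrying 60 MBPS of CBR and 50 MBPS of VBR video:
print(abr_share_mbps(155, 60, 50, abr_connections=3))  # 15.0 MBPS for each ABR connection
```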

Reliability and Serviceability. The ATM wide-area switch should be designed for maximum reliability through hardware redundancy. It should automatically reroute video streams around network failures to provide high network resiliency. All switch modules should be able to be removed and re-inserted without powering down the switch and without affecting the operation of other modules. The switch should have a software-based architecture which allows feature enhancements and bug fixes to be downloaded from a central site as a background task and then activated when desired. The switch should collect statistics for each video stream, provide open interfaces to this information, and continuously monitor resource performance.

The video content of communications will continue to increase as collaborative computing, videoconferencing, application sharing, and remote training become commonplace. Rather than increasing the speed of existing networks to accommodate video, companies are turning to ATM because it can support high-bandwidth multimedia traffic with low latency. Thanks to the efforts of the ATM forum, standards are starting to emerge that address these requirements. Early use of multimedia ATM networks has demonstrated the benefits of ATM's efficiency in delivering a mixture of video, audio, and data over one homogeneous network.

References:

1. Decina, M., and Trecordi, V., editors. Special Issue on Traffic Management and Congestion Control for ATM Networks. IEEE Network, September 1992.

2. Minet, P. "Broadband ISDN and Asynchronous Transfer Mode (ATM)." IEEE Communications Magazine, September 1989.

3. Prycker, M. (1991). Asynchronous Transfer Mode: Solution for Broadband ISDN. New York: Ellis Horwood.

Longer research paper

Prepare a report (12-15 pages) delineating the strategies, techniques and guidelines for using a commercially available presentation program or authoring tool to develop a short multimedia presentation for your work place. Include a list of references as well as a drawing of several sample screens. Be sure to address the following topics:

- rationale for selection of the authoring tool or presentation program

- hardware and software requirements

- budgetary constraints

- project definition, goals and objectives

- target audience

- content and design guidelines

- project benefits and limitations

- testing and debugging

- criteria for evaluating the success of the project.

Answer: For this assignment I chose an area related to presentation and authoring systems that involves the use of multimedia, the Web, and collaborative computing. This allows us, the engineers and scientists here at the Jet Propulsion Laboratory in Pasadena, California, to work better, faster, and more efficiently, and to be more productive in our quest for the secrets of the Universe.

Overview

 

The multimedia and networking group at the Jet Propulsion Laboratory offers a variety of tools and services for the Collaborative Environment to the NASA community and its industry partners. All of the group's solutions are grounded in industry standards and maintain the highest level of interoperability.

What is "CE?"

CE is an acronym for a multimedia, network (Web)-based Collaborative Environment: a revolutionary new way to communicate and collaborate in a virtual environment where you can work with others in real time and exchange information, whether you are separated by a single wall or by several continents.

It's not enough just to hear words spoken over a telephone; you need a context, a connection to someone and something. You need to know that 'yes' means 'yes, I agree,' not 'yes, I understand you.' A multimedia, Web-based Collaborative Environment can help you differentiate between the two.

Today, with every aspect of our business lives becoming increasingly global, the CE provides the means for groups and individuals to share information spontaneously, compressing both time and distance.

Tools

Video Conferencing

The multimedia and networking group's solutions range from inexpensive personal desktop systems running over ISDN, ATM, or LAN/Internet networks to medium or large room systems running over ISDN or ATM networks. The group maintains a number of Conferencing Servers which allow for the establishment and management of multiple conferences at a variety of data rates, with multiple users in each. Finally, the group offers Gateway Servers to allow transparent bi-directional interoperability among the heterogeneous network infrastructures involved.

Collaborative Tools

The multimedia and networking group enables users to work together virtually by assembling a wide variety of tools. Its solutions provide for the sharing and control of remote applications and/or the entire remote desktop.

Other tools provide the facility for the exchange of files, the ability to send "flash" text messages to a remote desktop in real-time, simple design collaboration via shared whiteboard, or simple communication with text-based chat or point-to-point audio conferencing.

Collaborative Environment: Toolset

Video Conferencing

· PictureTel PCS100 Video Conferencing Client (PC/Software/Hardware)

· Video & Audio served via: 2x56Kb RS-449 NISN circuits or 2x64Kb carrier provided ISDN (GPT T/A and CDSCC PBX) (NISN bandwidth currently shared with SlowScan Video)

· Primary (NISN) connection to JPL’s MultiWay Video Conferencing Bridge

· Bridge Connects to outside world using ISDN

· Bridge supports up to 16 simultaneous users in video conference

· Bridge scheduled via World Wide Web (aside from the dedicated ports assigned to each Complex)

· MDSCC, CDSCC, & GDSCC are active full-time on the Bridge

Collaborative Tools

 · Remote Desktop control or observation, file exchange, chat and "flash" messages served via: 48Kb NISN LAN to JPL.

 · Allows for application sharing as both participants can modify/save document/drawings

· User definition with the ability to specify privilege level

 · Includes password protection

 · Smart Board & Projector

 · Provides 60" "touchable" screen

 · Canon VCC1 camera

 · Provides controllable view with presets

 · Canon MK650 Document Camera

 · Provides means for document conferencing with hardcopy or visual hardware troubleshooting and inspection.

 DSN Collaborative Environment: Toolset

One of the first beneficiaries of this new technology is the DSN (Deep Space Network) project at NASA's Jet Propulsion Laboratory. The following outlines the specific toolset employed by this group.

Video Conferencing Tools

· PictureTel PCS100 Video Conferencing Client (PC/Software/Hardware)

· Video & Audio served via:

· RS-449 NISN circuits (2x56Kb)

· Carrier-provided ISDN (2x64Kb)

· ATM (2x56Kb or 2x64Kb or 384Kb)

· Phonebook to be maintained by NTD

· DSCCs (Deep Space Communications Complexes) use the NISN connection to JPL’s MultiWay Video Conferencing Bridge

· Bridge Connects to outside world using ISDN

· Bridge supports up to 16 simultaneous users in video conference

· Bridge scheduled via World Wide Web (aside from the dedicated port assigned to each Complex)

· MDSCC, CDSCC, & GDSCC (will be) active full-time on the Bridge w/ one additional port per site. (Please see attached implementation map)

 Collaborative Tools

· Remote Desktop control or observation, file exchange, chat and "flash" messages served via LAN and/or Internet

· Allows for application sharing as both participants can modify/save document/drawings

· User definition with the ability to specify privilege level

· Includes password protection

· Phonebook to be maintained by NTD

· Smart Board & Projector

· Provides 60" "touchable" screen (soon to be 72")

· Canon VC-C1 camera

· Provides high-quality controllable view with presets

· Canon MK650 Document Camera

· Provides means for document conferencing with hardcopy or visual hardware troubleshooting and inspection

DSN Collaborative Environment: How to Use It

As an example, the following shows how the tools listed above are used to create a collaborative environment for the Deep Space Network (DSN) project.

Video Conferencing

· PictureTel's PCS100 Video Conferencing Client.

· View the H.320 ISDN Numbers associated with the CDSCC site at the Video Client Lookup Directory and communicate them to the other participants.

Or,

· Use the Conferencing Server Scheduling form to schedule time and the parties that will be involved. Follow the directions on the form.

· At the specified time, start PictureTel Live100 (on the Desktop). The video will follow voice: the other participants will see whoever speaks.

Collaborative Tools

· Timbuktu Pro Collaborative Software

· View the IP address associated with the client you wish to collaborate with at the CE Tools Client Lookup Directory

· Communicate with the party to establish Username/Password access

The following steps describe a simple use of this environment: (please see attached pictures)

1. Start Timbuktu.

2. Add an entry to the personal list. Click TCP/IP and enter the IP address associated with the client you wish to collaborate with. Then, click Add.

3. You should see a dialog similar to the following shortly thereafter. If not, either the IP address needs to be added to the Router Access List (see Ross Murray) or the other end isn't functioning.

4. Click OK. Click the "Personal" tab. Select the newly created entry and click the function you wish to perform: Send, Exchange, Control or Observe. Although the interface is intuitive, you may want to see the online help provided with the package if you need to define user access or anything more advanced. Simply click the Help menu item.

The multimedia-based collaborative environment described above is greatly facilitated by the existence of a network environment. This environment, the backbone of the multimedia system presented above, is a subset of the Internet and the Web, often called an intranet. Following are some of the "business" considerations that led to the effective design and deployment of the Web in my organization.

Effectively deploying the Web as a distributed multimedia system within a large enterprise requires planning, collaboration, and technical work to make the Web easily used by non-technical staff and to manage the large amounts of information that will become available. The following outlines the rationale and framework for using the Web in my work environment, as well as general guidelines for a successful deployment. Both an organizational and a technical and management perspective are presented, with specific applications.

A Web design needs to consider the needs of users, providers, and developers, and the impact of the Web on the enterprise’s infrastructure, while dealing with the concerns of management. This requires different skills. In many larger organizations (such as my workplace) there are several people, usually from different organizations, who collectively form the Web Team.

Internal Web Use

The Web addresses many of the issues that we face as Internet use explodes. Now we can move beyond "how to connect" to how to use information. Companies are realizing the benefits of using the Web and starting to deal with the challenges:

1. Managing large numbers of clients

2. Connecting people to the information they need

3. Security

4. Preventing information overload

5. Ensuring information integrity/management

6. Helping providers publish information

7. Reducing redundant work

8. Getting non-technical users started

9. Continual client software changes/improvements.

The Web provides the basic mechanism to change how information is shared within enterprises, but it is just the framework. Work needs to be done to make information available, to provide users with tools and training, and to help information providers publish their information. This is the challenge for the Web.

We need to understand that this is a difficult task. Chaos is very possible; so is (cultural) inertia. The Webmaster is responsible for ensuring the successful deployment and use of the Web within the enterprise. If we can "manage" growth, add new sources and users, and avoid getting out of control, the benefits can be tremendous.

Keys To Be Successful

The Webmaster is critical to the successful and managed growth of the Web. As we've seen, the Webmaster's job is a challenging one that requires juggling the needs of several different communities. The position requires a combination of technical, management, planning, and organizational skills. In most cases it will be filled by a team of people with complementary skills who collectively are the corporate "Webmaster."

So, how does the Webmaster address the different demands put upon him or her? Here’s a list that outlines an approach the Webmaster can take to help increase the chances of succeeding.

1. Focus

2. Enlist senior management support.

3. Manage organizational and technical challenges

4. Identify and implement the mechanisms needed to support wide-scale use.

5. Sell, sell, sell

6. Watch, listen, improve.

This is a very simple view. In reality, the specific steps needed will factor in where you are in use of the Web, your corporate culture, and other environment-specific factors. These steps are provided as a guide. It’s strongly believed that someone, or some team, needs to think through which of these steps, or other work not identified here, needs to be done.

Focus

In larger enterprises there will be many opportunities to use the Web - too many in most cases. It would be difficult, if not impossible, to meet the needs of all of these different groups. It’s suggested that the team narrowly focus on one or two groups to start, understand their needs in detail, and provide solutions that meet the needs of the target group.

The wise Webmaster will keep the needs of the larger community in mind while developing specific solutions, and will generalize solutions where possible.

The recommendation is:

1. Identify and focus on one target group

2. Understand how they could use the Web

3. Write a vision

4. Test and revise as needed

5. Write down all that has to be done

6. Get the right groups involved

7. Develop a strategy and agree to tasks

8. Identify how you can measure progress

9. Measure progress weekly

10. Update vision, strategy, and tasks as needed.

Some of these steps may not apply to every situation, and some organizations will need to address additional issues. But, in general, these are viable first steps toward setting up a successful Web environment.

One of the dynamics in a large environment is the number of different groups, who all have different information needs. If you try to meet all of these needs, you will probably fail. So pick one, or some small number, of groups to focus on to start. Some of the information these people need will be useful to other groups. Once you have a base of users, growing the information base becomes easier, and you will have developed a support base.

Steps 5 to 7 above address the need to think through all the work that will be required. You will need help from several different organizations: networks may need to be upgraded, information will need to be sourced, clients will need to be identified and helped, and training or documentation may be needed. Identify what is needed and get the right people involved. This may require senior management support, so figure out when and how to get them involved.

Finally, measuring success and identifying problem areas is a challenge in a widely distributed service like this. Identify simple ways you can keep track of how things are working, looking for areas that could be improved. Check usage of key pages and track them over time, or monitor mail sent from users asking for help - whatever gives you a sense of whether the Web is being used widely.

Enlist Management Support

This is essential for success. Eventually you will need management support to deal with:

1. Funding (especially infrastructure and Web work)

2. Information availability

3. The (inevitable) technology turf wars

4. Access to users.

Most companies start with "underground" efforts that create great excitement at the grass roots. Moving beyond staff frenzy to a focused effort will require management support, funding, and help. You may need to sell key managers by showing them what is possible and what the future may be. Continue to focus on the key user group: get them to help, and emphasize the benefits to these people. The technical team will need a business sponsor; this is about helping the business, not neat technology. Prepare for the ROI question.

Sure, you have heard this before. While it may seem obvious, there is sometimes reluctance to do so in some companies, especially where Web use starts as an underground effort and, over time, a group of Web enthusiasts grows and prospers, doing all sorts of interesting things. At some point, however, they will need to go public. Perhaps funding is needed for dedicated staff, servers, or networks. There may be a reluctance to "go public" and expose the informal work and group to the more formal, and more risk-averse, organization. But it will need to happen to move the work into the mainstream.

You may want to be preparing for this by thinking through how to "sell" the right people in the company on the value of the Web to their staff. For example, how will having Web access help the sales team? What information do they need? Put together a demo of an environment that might address this opportunity, and show it to sales management.

Again, we recognize that you need to factor in your organization's culture, philosophy and state. But as we have said before, think it through and don't leave it to chance.

To start, the following is a model which focuses on the mechanisms and services that are needed to support large-scale Web use, i.e. it addresses the question "what do we need to do to support thousands of internal users?". As you go through it keep in mind that this is a general model - you will want to modify it to meet the specific needs of your organization, and to reflect your organization's culture and state of Web use.

There are three types of people we considered: the users (people in the organization who will use the Web to get information), the information providers (internal groups with information of interest to others), and the developers (people who develop tools, applications, or gateways to applications across the organization). Each of these classes of people has different needs.

To meet the needs of these three groups seven specific types of mechanisms should be considered:

1. Kits/configurations of tools for users

2. Web applications (or gateways to existing applications)

3. Navigation systems

4. Shared or reference pages

5. Web toolbox for developers and publishers

6. Publishing systems

7. Information archive (or repository).

The final set of ideas in the model are that you should develop services which leverage the mechanisms you develop. For example, if you develop a common set of pages, incorporate them into a training strategy, document them in a user's guide, make them showcases for your consulting team.

While this may not be the exact right model for you, we hope it gives you some ideas and helps you to develop a model that will work in your situation.

Information Archives

How do you keep the core business information current? How can users be confident the information being used is the right information (most timely, accurate, valid)?

To be widely used by non-technical people your Web will need to eventually provide access to "corporate data". The Web Information Archives (or Information Repository) are a collection of shared information used by the enterprise (e.g. on the external Web server) as well as internally.

It will need to include security, levels of access, and distribution mechanisms. How you source this will depend on where your information resides. Connecting to existing information is the best strategy.

Archive tools may need to support extensions, different security levels, multiple feeds, filtering and be able to support daily updates. This may be a significant effort!

Cost savings: Reduced cost of information sourcing. Other benefits: improved information quality, improved ability to manage information. Costs: development/testing/production costs.

A Web-based information network model

Many questions have been raised. To deal with them you will need a plan of action. To help in this we've developed a reference model to address the key issues facing successful, wide-spread Web deployment, including:

a. generating interest and enthusiasm in using the web

b. helping people find information

c. making most relevant information easy to find

d. getting new users started

e. helping new groups publish on the web

f. minimizing ongoing support costs

g. keeping up with new information sources

h. sharing information about tools and technologies

i. ensuring internal information is valid, up to date, and secure

j. leveraging scarce resources

k. making sure the infrastructure can support needs.

You'll need to develop your own model, and identify the mechanisms and services your organization needs, based on your specific requirements, organizational culture and state of Web deployment.

Kits

Problem: There are many different Web tools. How do people get started? How do they keep up with constant changes to software? How can they be sure viewers work with browsers? How can you support what's being used?

Kits Benefits

Kit configuration management (i.e., providing tested, supported collections of user tools) is critical for large-scale deployment. Tested and supported kits make support, documentation, upgrading, and licensing much easier to manage, and give users one place they can go for a "trusted" set of tools.

For large-scale deployment this should include a simple upgrade process - for example every user could have an upgrade disk that runs a script that goes to the network and gets the latest kit. The component software should be managed from a "corporate" software library.
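A kit-upgrade script of the kind described above might look like the following sketch. The server URL, manifest format, and local version file are hypothetical placeholders; a real deployment would add authentication, integrity checks, and the actual installation step.

```python
"""Minimal sketch of a kit-upgrade check. The server URL, manifest format,
and local version file are hypothetical placeholders."""
import json
import urllib.request

KIT_MANIFEST_URL = "http://intranet.example.com/kits/manifest.json"  # placeholder
LOCAL_VERSION_FILE = "kit_version.txt"                               # placeholder


def latest_manifest() -> dict:
    """Fetch the published kit manifest, e.g. {"version": "2.3", "files": [...]}."""
    with urllib.request.urlopen(KIT_MANIFEST_URL) as response:
        return json.load(response)


def upgrade_needed() -> bool:
    """Compare the locally recorded kit version against the published one."""
    with open(LOCAL_VERSION_FILE) as f:
        local_version = f.read().strip()
    return latest_manifest()["version"] != local_version


if __name__ == "__main__":
    print("Upgrade available" if upgrade_needed() else "Kit is current")
```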

As hard (or unpopular) as this may be, try to have as few different supported tools within the corporation as possible, and package pieces together. Without this, support becomes a nightmare, especially as tools continue to be revised. (Note that several vendors, e.g., Netscape, are now providing complete suites of tools.) This is not to say that there can't be other tools in use; this is about supported tools.

The potential cost savings are significant: reduced wasted time, a simpler update process, assured licensing, lower documentation and support costs, and help with volume discounting.

Lessons Learned

Top lessons we've learned about large-scale internal Web deployment can be summarized here:

1. Web use has a very different dynamic than other services - publishing and browsing are much easier, base technology is simpler, so many will use it.

2. Your focus will need to be on how to sustain this work over the long term, and for many people.

3. A combination of risk-taking, planning and ongoing attention are needed.

4. Focusing on one group (at least to start) helps to simplify/focus.

5. This is more than just a technical challenge, and will require a cross-functional team.

6. End users want it to be as simple and straight-forward as possible, and won't share your enthusiasm for details, neat tools.

7. Anticipate new roles and skills - information finders, information seekers, information designers, graphics people, and integration experts.

8. Keep up with the constant changes in technology.

9. Continually push thinking - see how others are using new tools (e.g., personalized newspapers, subscriptions, audio, movie, new categories)

10. Sell, sell, sell - be an evangelist! Listen to what people need! "Just do it!"

Possible Web Applications

Identify the applications needed, where they will be done, and how they will be staged. The following is a potential list: Product Information, Knowledge Preservation, Project Information, Official Travel Guide, Access to Data Warehouse, Existing Catalogs, Product Support Databases, Employee Infobases, Training and Registration, Employee Property Management, Newswire Clippings, Policies and Procedures, Software Libraries, Jobs, Phone Directory, Benefits, Conference Room Reservations, Literature Ordering, Libraries, Stock Quotes, Subscription Services, Performance Tracking, Engineering Groups and Information, Surveillance, Sharing Design Drawings, Application Front-ends, Employee and Group Information, Whiteboard, Conferencing, Historical Information, Events Diary, Technology Centers, Art Libraries, Sales Support Centers, Directions, Competitive Analysis, Maps, Strategies, Indexing Engines, Financial-Management Query, Information Catalogs, Corporate Newsletters.

Every day we see new uses for the Web, or new tools being introduced that will let us do something new. These are some of the existing applications. Some of these are new applications that take advantage of the Web, while others are gateways to existing applications.

There could be hundreds of internal servers, and thousands of interesting external servers. How do people find information? How do they know what is available and avoid Web thrashing?

Helping users find information will be one of your biggest challenges. There are different types of navigation tools to consider:

1. Page navigation aids.

2. Consistent tools to help people travel through Web space: a navigation metaphor, desktops, newspapers - whatever the audience feels comfortable with.

3. Index of internal servers. Similar to WebCrawler, using tools to periodically gather index information from known servers.

4. Announcement directories. One or more directories to store and retrieve information about shared Web resources (name, URL, keywords, description). It could be a simple file or something more elaborate, similar in concept to YAHOO, receiving and storing announcements for later retrieval by others. (A minimal sketch of such a directory follows this list.)

5. Search engines. Tools to support different search strategies. These tools complement external search tools (YAHOO, etc.). Make them easy to use (search, browse) and easy to publish to (rule-enforcing, forms-based). Decide who can post; consider allowing everyone, with the Webmaster as moderator.
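A minimal sketch of such an announcement directory, combined with a simple keyword search, is shown below. The class, its fields, and the example URL are illustrative only; a production system would sit behind a forms-based front end and enforce posting rules.

```python
class AnnouncementDirectory:
    """Toy announcement directory: publishers register a page (name, URL,
    keywords, description) and users retrieve entries by keyword."""

    def __init__(self):
        self.entries = []

    def announce(self, name: str, url: str, keywords: list[str], description: str) -> None:
        self.entries.append({"name": name, "url": url,
                             "keywords": [k.lower() for k in keywords],
                             "description": description})

    def search(self, keyword: str) -> list[dict]:
        kw = keyword.lower()
        return [e for e in self.entries
                if kw in e["keywords"] or kw in e["name"].lower()]


directory = AnnouncementDirectory()
directory.announce("Benefits pages", "http://hr.example.com/benefits",   # placeholder URL
                   ["benefits", "HR"], "Employee benefits summaries")
print([e["url"] for e in directory.search("benefits")])
```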

Therefore, a new expert is needed: the "information seeker." Potential cost savings: information is visible; reduced time searching for information; reusable software components.

Web Pages

There is an overwhelming number of places to go on the Web. How do people get started? How do they keep up with constant changes to information? Create and maintain a core set of Web pages covering the classes of information of most interest across the enterprise, with a consistent set of navigation tools. These may not be used by everyone, but they should provide a good starting point and ongoing reference for users. This simplifies using the Web for non-technical people: because the information is already there and neatly categorized, they can quickly see what is available.

Monitor usage to see what is of most interest, or what is not being accessed, and redesign to improve navigation. The Webmaster needs to keep the pages current and effective. Potential cost savings: reduced page creation and maintenance costs across the enterprise; pages can be reused and referenced by other enterprise servers.

How do technical people learn what is available? How can they share experiences and tools? How do we help people get started? The Web Toolbox is a collection place for descriptions, reviews and links to anything that makes creating or using the Web easier.

Some of the categories that may be included: Adobe Acrobat, Indexing tools, Announcements, Information Retrieval Tools, Authoring Tools, Messengers, Browsers, Real Audio, Converters, Robots, Data Base Access, Searching tools, Diagnostic Tools, Security, Editors, Scripts, Filters, Sound Players, Firewall, Spiders, Forms (how to do), Statistics/tracking tools, Gateways, Verifiers, Graphics Tips, Viewers, Icon libraries, VRML, Image maps, Other toolboxes, News sources.

Some Organization Issues

Here’s a list of potential organizational issues:

- Is there a need for central philosophy?

- Scope of their responsibility?

- Some obvious cost savings

- Size, purpose, organization

- Needs senior management support

- Publishing standards. Some guidelines are needed

- Can anyone publish?

- External information should be closely managed

- What are publishers' responsibilities?

- Some information needs to come from content

- Who is responsible for authorized sources?

- May need to push 'right' people to publish

- Are there standards or guidelines?

- Clarify publisher responsibilities

- Educate publishers on these responsibilities - accuracy, timeliness, ongoing maintenance

- Consistent look and feel?

- Internal standards/guidelines control?

- External pages must have a level of consistency!

- Each may have very different look and feel

- Need to support interface evolution

- Should internal pages look different from external ones so people know which is which?

- Too much control will cause avoidance where they are

- Try minimal standard: logo, author, date, navigation bars, and art work

- Does this need to be controlled?

- Provide what you want to see

- Data quality and management: need a data management strategy

- Is the information "right"?

- Move key legacy information to Web

- Should there be a "UL" approval stamp?

- Is it up to date? Accurate?

- How about informal information?

- Who manages Web data quality?

- The Web is a new source of cultural power

- Dynamics will change over time.

Issues

How can we make it easy to create new Web information sites? How can we best share tools, icons and experiences? How do we publish "standards"?

Create a publishing system (or center) with a collection of tools and pages to help groups create and maintain their sites. It helps new users get on-line quickly and gives you a place to put new tools (a page-template sketch follows the list):

1. HTML editors and converters, HTML templates

2. Links to the Web archives, search mechanisms

3. Feedback forms

4. Navigation Aids (icon, HTML code).
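A publishing center's HTML template might be as simple as the sketch below, which bakes in the minimal standard elements suggested earlier (logo, author, date, navigation bar). The logo path and navigation links are placeholders, not actual site URLs.

```python
"""Minimal sketch of a shared HTML page template; the logo path and the
navigation links are placeholders."""
from datetime import date

PAGE_TEMPLATE = """<html>
<head><title>{title}</title></head>
<body>
<img src="/images/corporate-logo.gif" alt="logo">
<h1>{title}</h1>
{body}
<hr>
<p>Author: {author} | Last updated: {updated}</p>
<p>[ <a href="/">Home</a> | <a href="/search">Search</a> | <a href="/feedback">Feedback</a> ]</p>
</body>
</html>"""


def render_page(title: str, body: str, author: str) -> str:
    """Fill the template so every published page gets the same standard elements."""
    return PAGE_TEMPLATE.format(title=title, body=body, author=author,
                                updated=date.today().isoformat())


print(render_page("Project Status", "<p>Weekly status summary.</p>", "Web Team"))
```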

Include examples of existing internal Web sites, process descriptions, and information taxonomies. New tools are becoming available in this area, e.g., PageMill and FrontPage. Potential cost savings: startup costs, shared tools, shared information, reduced development and support costs. Other benefits: common look and feel, sharing of new tools and information.

Summary

The Web can save costs and increase productivity, and is essential to be competitive in the future.

Effective long-term, wide-scale use of the Web will not happen without care and support. Initial use and growth could, but the issues will become too complex and inter-related. To use it most effectively in large enterprises we need:

- an understanding of the challenges involved

- a cross-organizational plan for its use

- a technical strategy in support of this plan

- support tools and services

- a client and information deployment strategy.

References

Krol, Ed. (1994). The Whole Internet User’s Guide & Catalog. Cambridge, MA: O’Reilly & Associates.

Peck, Susan and Mui, Linda. (1995). WebSite. Palo Alto, CA: O’Reilly & Associates.

Dowd, Kevin. (1995). Getting Connected: Establishing a Presence on the Internet. Cambridge, MA: O’Reilly & Associates.

Niederst, Jennifer, Freeman, Eddie and Peebles, Kathy. (1995). Web Design for Designers. Palo Alto, CA: O’Reilly & Associates.