The Fifth International World Wide Web Conference was the largest Web-dedicated international event of the year. It was held May 6-10, 1996, in Paris, France, at the CNIT, shown in the picture below, one of the largest conference and exhibition centers in Europe. Multiple sessions provided over 2,000 attendees with a comprehensive view of where and how the Web is progressing today. The conference proceedings and other on-line information can be found at the URLs http://www.w3.org/hypertext/Conferences/WWW5/ and http://www5conf.inria.fr/.
Outside the conference center in Paris.
One of the latest innovations on the Web is the Internet Phone, which in the short term provides a quick and low-cost alternative to the high cost of long-distance telecommunications. Its maker, VocalTec, bills Internet Phone(TM) as "the first voice communications product for PC users on the Internet," with which "users can conduct unlimited long distance and international conversations for the cost of an Internet connection." The current bandwidth of the Internet is low (many time-outs, high packet loss, and retransmissions) and makes real-time audio (and video) adequate only for moderate usage. This, however, should change in the near future when the Internet backbone provides much higher bandwidth. See URL http://www.vocaltec.com/ for more details.
There is also talk of an Internet box, which is a World Wide Web (WWW)-browser/computer without the operating system: a machine or dumb-terminal that speaks Java and can access the Internet. Television manufacturers are talking about putting Internet access into the television set. Cellular phones are being integrated with PCs. Nobody knows how much and in what direction the Internet will be a part of our lives.
The Web has been used to support mass retrieval of documents and resources with limited interactivity, but Tim Berners-Lee stressed that the original intent was collaboration and two-way sharing of information. "Interactivity is not clicking on the submit button," he said, but the dynamic sharing of information among multiple people. An example might be a virtual conference room with inline videos, an overhead projector, a whiteboard, and the ability for people to interact with both the "room" and each other. One working project going in this direction is the Grassroots project at Stanford; a paper about it, "Grassroots: A System Providing a Uniform Framework for Communicating, Structuring, Sharing Information, and Organizing People," was presented at the conference. (See URL http://www-pcd.stanford.edu/Grassroots/WWW96/) Another such project, CoWeb, is discussed below. Tim Berners-Lee ended his opening talk with "Think it's cool? Y'aint seen nothin' yet."
John Patrick, vice president of Internet Technology at IBM, talked about the growth of the Web, saying that "the Web is in VisiCalc 1.0," meaning the Web is an infant and commerce will fuel continued growth. The Internet naming protocol is being extended to Version 6 to provide a much larger address space. The Internet is a full-fledged parallel economy, and an organization without the Web is like an organization without a FAX machine. At the last Web conference, WWW4, in Boston, Massachusetts, it was predicted that the Web would collapse by May 1996, but this has not happened. The Internet will grow with integrated switched routers and increased backbone capabilities to handle the increasing bandwidth needs. See URL http://www.ibm.com/patrick/ for a philosophy of how to use the Internet to transform an organization and to make it more accessible and approachable, along with other related information.
Steve McGeady of Intel gave an inspiring talk about looking beyond the current client-server model, beginning with examples of the staggering cost of running a popular Web site.
The Web is centralizing the Internet because popularity kills small sites that do not have the resources to keep up with the demand. Only large organizations with large amounts of resources (people and computers) will be able to keep up with the exponential growth. Servers are increasingly becoming like mainframes, and servers do not scale very well. There are tens of millions of clients but only hundreds of thousands of servers. Peer-to-peer computing, in which every computer is a peer (both a server and a client), should replace the current client-server model. Most computers are idle, and a distributed computing model can use all the available computing power. For example, all of the computers in a given office could be integrated so that frequently used information is cached close to its users, with caches working in both directions. Everything would be multicast and distributed. Basically, it amounts to smarter ways to implement proxy servers and integrate client resources.
Rob Glaser, president and CEO of Progressive Networks, gave a presentation about the history of mass media, with radio and television as precursors to the growth of the Internet. Both radio and television had slow initial starts, with 40,000 radio listeners in 1920 and 10,000 television viewers in 1945, but grew exponentially, to two million radio listeners in 1925 and three out of four families with a television set by 1952. Mr. Glaser demonstrated RealAudio with real-time radio broadcasts from radio stations all over the world via the Internet. See URL http://www.timecast.com/ to listen to RealAudio broadcasts with the freely available software, which requires Netscape Navigator Version 2.0 on PC, Macintosh, Solaris, SunOS, and IRIX platforms.
There is a battle among the major technology companies over who is going to make Java work. At present, Sun admits that Java does not live up to its propaganda, but "it will." James Gosling of Sun says that "Java is like C++ without guns and knives" and may end up controlling your cellular phone, microwave, and toaster oven, although the needed safety mechanisms are not yet in place (mostly because of the browser implementations of the language). The Java implementors are rushing to implement capabilities without worrying about little details such as security. It may be possible to run Java applications in a "sandbox" or "padded cell" where Java can create and access local files but cannot reach anything outside this protected environment. Real Java applications need access outside this "sandbox," and digital signatures will be needed to run applications from "trusted" companies. Security will have to be defined with fine tuning for network and file access. The Future of Java panel suggested that Sun and the Java creators not forget the lessons learned by modern programming languages (C, C++, and others) and that no matter how impressive the Graphical User Interface (GUI) looks, PERFORMANCE MATTERS. Interpreted languages are inherently slower than compiled languages, and users will want their applications to run as fast as possible. Tuning of Java, in which the portable byte codes are translated directly into machine code, could achieve up to three times the performance. There will be a move back to the server, with Java applets or Servlets running on the server and communicating with Java clients. The client cannot and should not do everything, and server-side computation is preferred in some situations. The Java panel ended with the question, "Will the Next Internet Worm Be Written in Java?" to which the Sun representative replied, "not by me," meaning that it might be possible no matter how safe Java or its various implementations claim to be.
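The essence of the "padded cell" idea is a mediator that checks every file request against a single permitted directory before granting access. The sketch below is only an illustration of that concept, not an API from any Java release of the time; the class and method names are made up.

```java
import java.io.File;
import java.io.IOException;

// Hypothetical sketch of the "sandbox" idea: an untrusted program may
// touch files only inside one sandbox directory; everything else is denied.
public class Sandbox {
    private final File root;

    public Sandbox(File root) throws IOException {
        // Canonicalize once so later comparisons are against a fixed base.
        this.root = root.getCanonicalFile();
    }

    // Returns true only if the requested path resolves inside the sandbox;
    // canonicalization rejects "../secret"-style escape attempts.
    public boolean mayAccess(File requested) throws IOException {
        String path = requested.getCanonicalPath();
        String base = root.getPath() + File.separator;
        return path.equals(root.getPath()) || path.startsWith(base);
    }

    public static void main(String[] args) throws IOException {
        File tmp = new File(System.getProperty("java.io.tmpdir"));
        Sandbox box = new Sandbox(tmp);
        System.out.println("inside allowed:  " + box.mayAccess(new File(tmp, "applet.dat")));
        System.out.println("outside allowed: " + box.mayAccess(new File("/outside/secret.txt")));
    }
}
```

Digital signatures, as discussed by the panel, would then widen this check: a "trusted" signed application could be handed a larger root, or bypass the mediator entirely.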
Microsoft introduced the ActiveX/COM Java interface to distributed objects and applications on the desktop, a network-based extension of OLE. It claims to be the glue that ties Java to C++ and other languages. ActiveX and Java enable direct access to system services such as data bases, multimedia, and graphics. Sun has developed a similar project called Java Objects Everywhere (JOE) with access to distributed Common Object Request Broker Architecture (CORBA) objects.
The Twinpeaks Project of ILOG uses Java as a dynamic add-on to C++ applications, with Java used for changing classes and C++ for stable, high-performance class libraries. This may be more an intranet than an Internet usage, since security was not discussed.
Mark Pesce gave a non-technical talk about Virtual Reality Modeling Language (VRML) and the emergent social forms on the Web. "VRML is a social infrastructure," he said, and the development of VRML is due to a "collective intelligence" of technical and artistic people who first met at a Birds-of-a-Feather (BOF) meeting at the first WWW conference and now collaborate via the VRML mailing list. The VRML architecture group (VAG) has defined the VRML specifications (VRML 1.x and VRML 2.0), and a VRML consortium is currently being formed. Microsoft has been bashed for its aggressive push into VRML's future, and SGI's proposal was selected for the VRML 2.0 specification. The VRML community stresses a cooperative environment where all major players share and contribute to the benefit of the group. The talk is available on line at URL http://www.hyperreal.com/~mpesce/, and more information about VRML can be found at URL http://vag.vrml.org/. Also, a tutorial called "VRML: Three Dimensional Visualization and Interactivity" was conducted and is discussed in more detail below.
Software agents can improve communication, although nobody will give a definition of what agents really are; today, "agent" is yet another over-used term. The AI Lab at AT&T presented a referral agent that keeps track of and indexes mail messages from internal AT&T users in order to find the "expert" on a given field or topic. Each agent has an expertise/interest profile of its owner, with access to the owner's private E-mail and files. A user may issue a query such as who knows the name of the printer across the hall or the nearest bicycle shop. Agents can handle large numbers of messages, and users are shielded from many irrelevant messages.
The following tutorials were given to bring the audience up to speed in various Web areas:
Authentication, Privacy, and Access Control on WWW.
VRML: Three Dimensional Visualization and Interactivity.
Putting Media into Hypermedia.
Access to Legacy Data.
Designing and Maintaining a Usable Site.
Effective Internet Searching Strategies.
Collecting and Serving Information.
The Web and Lotus Notes in the Enterprise.
Audio and Video.
Web Document Engineering.
The tutorial about VRML was given by Mark D. Pesce, Jan Hardenbergh, and Anthony Parisi. This was an intensive course in the fundamentals and syntax of VRML as a language in addition to its application to the Web, a survey of available VRML browsers, authoring and publishing tools, optimization strategies, and design considerations. In addition, fundamentals of 3D computer graphics were covered as well as the three VRML nodes (LOD, WWWAnchor, WWWInline), which are most important for the creation of effective VRML worlds.
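As a hedged illustration of those three nodes (the file names and URL below are made up, and the syntax follows the VRML 1.0 specification), a minimal world might read:

```
#VRML V1.0 ascii
Separator {
    # WWWInline fetches a sub-world from another file on demand
    WWWInline { name "terrain.wrl" }

    # WWWAnchor makes its children a hyperlink, like <A HREF> in HTML
    WWWAnchor {
        name "http://www.example.com/next-room.wrl"
        Cube { width 2 height 2 depth 2 }
    }

    # LOD (level of detail) swaps representations with viewer distance
    # to keep rendering time down in large worlds
    LOD {
        range [ 10, 100 ]
        Sphere { radius 1 }                 # near: detailed shape
        Cube { width 1 height 1 depth 1 }   # middle distance: cruder shape
        Separator { }                       # far away: draw nothing
    }
}
```

The LOD ranges and the choice of stand-in shapes are exactly the "optimization strategies" the tutorial refers to: the browser renders the cheapest representation the viewer cannot distinguish from the detailed one.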
The tutorial "Putting Media into Hypermedia" by Kevin Hughes discussed ways to integrate graphics, audio, and video into a World Wide Web site. Topics included techniques for editing, manipulating, and converting digital media, with numerous examples and case studies. For example, Mr. Hughes mentioned that many sites today attempt to be on the cutting edge with all manner of plug-ins and other programs but forget low-bandwidth constraints, text-only users, and people using the Web with display devices other than large glass color monitors. Some basic tips to help make things easier for text-only users include the following:
Interlace GIF images.
Use inline JPEGs when appropriate.
Include file sizes in graphics and movie thumbnails with links to the media.
Create alternative pages for users.
Consider 14.4k as the low end. Ideally, users should not have to wait over 20 seconds for any operation to complete.
Keep the total size of all media on a page to around 50k.
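The thumbnail tip, for instance, might look like the following HTML fragment (the file names and sizes here are invented for illustration): the small inline image links to the full movie, the ALT text serves text-only browsers, and the stated download size lets low-bandwidth users decide whether to follow the link.

```
<!-- Thumbnail links to the full media; ALT text and a stated file size
     serve text-only and low-bandwidth users -->
<A HREF="eiffel-tower.mpg">
  <IMG SRC="eiffel-thumb.gif" ALT="[Movie: Eiffel Tower fly-by]"
       WIDTH="120" HEIGHT="90">
</A> Eiffel Tower fly-by (MPEG movie, 1.2 MB)
```

The explicit WIDTH and HEIGHT attributes also let the browser lay out the page before the image arrives, which helps meet the 20-second target above on a 14.4k link.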
These tutorials will shortly be available from the conference home page.
Jason Mathews of the NSSDC presented a paper, "Electronic Management of the Peer Review Process," which was acknowledged in the opening ceremony. NASA/Goddard Space Flight Center (GSFC) has supported both the WWW4 and WWW5 conferences with an electronic system that manages the paper submission and peer review processes. The support was briefly discussed in the NSSDC News March 1996 issue. The presentation described what makes up the peer review process and how an electronic management system (EMS) automates the entire process through a Web-based interface. The technical paper in the conference proceedings is available at URL http://www5conf.inria.fr/fich_html/papers/P55/Overview.html, and the slides for the talk are available from the conference home page.
The best paper of the conference, selected as the most interesting, was "Measuring the Web" by Tim Bray, senior vice president of the Open Text Corporation. This paper answered such questions as the following:
How big is the Web?
What is the "average page" like?
How richly connected is it?
What are the biggest and most visible sites?
What data formats are being used?
What does the WWW look like?
VRML was used in this paper to visualize the complex spatial relations among Web sites. The paper is available at URL http://www5conf.inria.fr/fich_html/papers/P9/Overview.html.
The paper "Filling HTML Forms Simultaneously: CoWeb - Architecture and Functionality" presented a good example of collaboration via the Web where multiple users can simultaneously annotate a given document or image for a real-time discussion. See the paper at URL http://www5conf.inria.fr/fich_html/papers/P43/Overview.html.
According to Illustra, Web publishing requires a data base that understands, manages, and presents data in a whole new way: data are no longer just text and numbers but also graphics, video, images, audio samples, animation, and formatted text. Illustra's extensible architecture accommodates all these data types. Snap-in software modules, called DataBlade modules, teach the Illustra data base server to understand new types of data, new ways of accessing them, and useful operations to perform on them. A Web site powered by Illustra enables a user to create and manage multimedia content in an easy, active, and intelligent way. See URL http://www.informix.com/ for details.
One of the most difficult aspects of running a Web site is keeping track of all the links, nooks, and crannies that develop. InContext offers a commercial product, WebAnalyzer, that allows users to see a graphical representation of a Web site and to generate detailed reports of the HTML documents, external links, graphics, mailto's, and audio and video files that make up their sites. WebAnalyzer competes against products like NetCarta's CyberPilot, which creates site maps; SurfBot, which can check all a user's links; and Web Whacker, which downloads entire sites to a user's hard disk. WebAnalyzer runs only on Windows 95 computers, but it can be used to analyze any site on the Web. See URL http://www.incontext.com/ for details.
It was announced at the closing ceremony that the next three international WWW conferences will be held April 7-12, 1997, at the Santa Clara Conference Center; April 13-17, 1998, in Brisbane, Australia; and in 1999 in Toronto, Canada.
Inside the CNIT Conference Center.
Outside the conference center.
Author: Miranda Beall (firstname.lastname@example.org)