1. Introduction

OBJECTIVES

In this chapter you’ll learn:

• The evolution of the Internet and the World Wide Web.

• What Web 2.0 is and why it’s having such an impact on Internet-based and traditional businesses.

• What Rich Internet Applications (RIAs) are and the key software technologies used to build RIAs.

• How object technology is improving the software development process.

• The importance of JavaScript as the universal client scripting language.

The renaissance of interest in the web that we call Web 2.0 has reached the mainstream.
—Tim O’Reilly

Billions of queries stream across the servers of these Internet services—the aggregate thoughtstream of humankind, online.
—John Battelle, The Search

People are using the web to build things they have not built or written or drawn or communicated anywhere else.
—Tim Berners-Lee

Some people take what we contribute and extend it and contribute it back. That’s really the basic open source success story.
—David Heinemeier Hansson, interviewed by Chris Karr at www.Chicagoist.com

1.1 Introduction

Welcome to Internet and World Wide Web programming and Web 2.0! We’ve worked hard to create what we hope you’ll find to be an informative, entertaining and challenging learning experience. As you read this book, you may want to refer to

www.deitel.com/books/jsfp/

for updates and additional information.

The technologies you’ll learn in this book are appropriate for experienced professionals who build substantial information systems. You’ll find “industrial-strength” code examples. We have attempted to write in a clear and straightforward manner using best programming and documentation practices.

Perhaps most important, the book includes over 100 working code examples and shows the outputs produced when these examples are rendered in browsers or run on computers. We present all concepts in the context of complete working programs. We call this the “live-code approach.” All of the source code is available for download from www.deitel.com/books/jsfp/.

We present a carefully paced introduction to “client-side” web programming, using the popular JavaScript language and the closely related technologies of XHTML (Extensible HyperText Markup Language), CSS (Cascading Style Sheets) and the DOM (Document Object Model). We often refer to “programming” as scripting—for reasons that will soon become clear.

JavaScript is among today’s most popular software development languages for web-based applications. In this book, we present a number of powerful software technologies that will enable you to build such applications. We concentrate on using technologies such as the Extensible HyperText Markup Language (XHTML), JavaScript, CSS and the Extensible Markup Language (XML) to build the portions of web-based applications that reside on the client side (i.e., the portions of applications that typically run in web browsers such as Mozilla’s Firefox, Microsoft’s Internet Explorer, Opera, Google’s Chrome or Apple’s Safari). The server side of web-based applications typically runs on “heavy-duty” computer systems on which organizations’ business-critical websites reside. By mastering the technologies in this book, you’ll be able to build the client side of substantial web-based, client/server, database-intensive, “multitier” applications. Our sister book, Internet & World Wide Web How to Program, 4/e, contains the client-side programming material from JavaScript for Programmers and also presents a variety of server-side programming technologies.

To keep up to date with Internet and web programming developments, and the latest information on JavaScript for Programmers at Deitel & Associates, please register for our free e-mail newsletter, the Deitel® Buzz Online, at

www.deitel.com/newsletter/subscribe.html

Please check out our growing list of Internet and web programming, and Internet business Resource Centers at

www.deitel.com/resourcecenters.html

Each week, we announce our latest Resource Centers in the newsletter. A list of Deitel Resource Centers at the time of this writing is located in the first few pages of the book. The Resource Centers include links to, and descriptions of, key tutorials, demos, free software tools, articles, e-books, white papers, videos, podcasts, blogs, RSS feeds and more that will help you deepen your knowledge of most of the subjects we discuss in this book.

Errata and updates for the book are posted at

www.deitel.com/books/jsfp/

You’re embarking on a challenging and rewarding path. We hope that you’ll enjoy JavaScript for Programmers. As you proceed, if you have any questions, send e-mail to

[email protected]

and we’ll respond promptly.

1.2 History of the Internet and World Wide Web

In the late 1960s, one of the authors (HMD) was a graduate student at MIT. His research at MIT’s Project MAC (now the Laboratory for Computer Science—the home of the World Wide Web Consortium) was funded by ARPA—the Advanced Research Projects Agency of the Department of Defense. ARPA sponsored a conference at which several dozen ARPA-funded graduate students were brought together at the University of Illinois at Urbana-Champaign to meet and share ideas. During this conference, ARPA rolled out the blueprints for networking the main computer systems of about a dozen ARPA-funded universities and research institutions. They were to be connected with communications lines operating at a then-stunning 56 Kbps (i.e., 56,000 bits per second)—this at a time when most people (of the few who could) were connecting over telephone lines to computers at a rate of 110 bits per second. There was great excitement at the conference. Researchers at Harvard talked about communicating with the Univac 1108 “supercomputer” at the University of Utah to handle calculations related to their computer graphics research. Many other intriguing possibilities were raised. Academic research was about to take a giant leap forward. Shortly after this conference, ARPA proceeded to implement the ARPANET, which eventually evolved into today’s Internet.

Communicating Quickly and Easily

Things worked out differently from what was originally planned. Rather than enabling researchers to share each other’s computers, it rapidly became clear that enabling researchers to communicate quickly and easily via what became known as electronic mail (e-mail, for short) was the key early benefit of the ARPANET. This is true even today on the Internet, as e-mail and instant messaging facilitate communications of all kinds among more than a billion people worldwide.

Multiple Users Sending and Receiving Information Simultaneously

One of the primary goals for ARPANET was to allow multiple users to send and receive information simultaneously over the same communications paths (e.g., phone lines). The network operated with a technique called packet switching, in which digital data was sent in small bundles called packets. The packets contained address, error-control and sequencing information. The address information allowed packets to be routed to their destinations. The sequencing information helped in reassembling the packets—which, because of complex routing mechanisms, could actually arrive out of order—into their original order for presentation to the recipient. Packets from different senders were intermixed on the same lines. This packet-switching technique greatly reduced transmission costs, as compared with the cost of dedicated communications lines.
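To make the idea concrete, here is a small JavaScript sketch (purely illustrative; it is not part of any real networking software) showing how sequencing information lets a receiver restore the original order of packets that arrive out of order:

// Each simulated "packet" carries address, sequencing and payload information.
var packets = [
   { destination: "receiverA", sequence: 2, payload: "wide " },
   { destination: "receiverA", sequence: 0, payload: "The " },
   { destination: "receiverA", sequence: 3, payload: "web" },
   { destination: "receiverA", sequence: 1, payload: "world " }
];

// Sort the packets into their original order using the sequence numbers,
// then concatenate the payloads to rebuild the message for the recipient.
function reassemble( packetList )
{
   packetList.sort( function( a, b ) { return a.sequence - b.sequence; } );

   var message = "";

   for ( var i = 0; i < packetList.length; i++ )
      message += packetList[ i ].payload;

   return message;
} // end function reassemble

document.writeln( reassemble( packets ) ); // displays "The world wide web"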

The network was designed to operate without centralized control. If a portion of the network failed, the remaining working portions would still route packets from senders to receivers over alternative paths for reliability.

Protocols for Communication

The protocol for communicating over the ARPANET became known as TCP—the Transmission Control Protocol. TCP ensured that messages were properly routed from sender to receiver and that they arrived intact.

As the Internet evolved, organizations worldwide implemented their own networks for both intraorganization (i.e., within the organization) and interorganization (i.e., between organizations) communications. A wide variety of networking hardware and software appeared. One challenge was to get these different networks to communicate. ARPA accomplished this with the development of IP—the Internet Protocol—truly creating a “network of networks,” the current architecture of the Internet. The combined set of protocols is now commonly called TCP/IP.

Commercial Internet Use

Initially, Internet use was limited to universities and research institutions; then the military began using the Internet. Eventually, the government decided to allow access to the Internet for commercial purposes. Initially, there was resentment in the research and military communities—these groups were concerned that response times would become poor as “the Net” became saturated with users.

In fact, the exact opposite has occurred. Businesses rapidly realized that they could tune their operations and offer new and better services to their clients, so they started spending vast amounts of money to develop and enhance the Internet. This generated fierce competition among communications carriers and hardware and software suppliers to meet this demand. The result is that bandwidth (i.e., the information-carrying capacity) of the Internet has increased tremendously and costs have plummeted.

World Wide Web

The World Wide Web allows computer users to locate and view multimedia-based documents on almost any subject over the Internet. Though the Internet was developed decades ago, the web is a relatively recent creation. In 1989, Tim Berners-Lee of CERN (the European Organization for Nuclear Research) began to develop a technology for sharing information via hyperlinked text documents. Berners-Lee called his invention the HyperText Markup Language (HTML). He also wrote communication protocols to form the backbone of his new information system, which he called the World Wide Web. In particular, he wrote the Hypertext Transfer Protocol (HTTP)—a communications protocol for sending information over the web. Web use exploded with the availability in 1993 of the Mosaic browser, which featured a user-friendly graphical interface. Marc Andreessen, whose team at NCSA (the University of Illinois’ National Center for Supercomputing Applications) developed Mosaic, went on to found Netscape®, the company that many people credit with initiating the explosive Internet economy of the late 1990s. Netscape’s version of the Mosaic browser has been evolved by the Mozilla Corporation into the enormously popular open source Mozilla Firefox browser.

Making Our Work and Lives Easier

In the past, most computer applications ran on computers that were not connected to one another, whereas today’s applications can be written to communicate among the world’s computers. The Internet mixes computing and communications technologies. It makes our work easier. It makes information instantly and conveniently accessible worldwide. It enables individuals and small businesses to get worldwide exposure. It is changing the way business is done. People can search for the best prices on virtually any product or service. Special-interest communities can stay in touch with one another. Researchers can be made instantly aware of the latest breakthroughs. The Internet and the web are surely among humankind’s most profound creations.

1.3 World Wide Web Consortium (W3C)

In October 1994, Tim Berners-Lee founded an organization—called the World Wide Web Consortium (W3C)—devoted to developing nonproprietary, interoperable technologies for the World Wide Web. One of the W3C’s primary goals is to make the web universally accessible—regardless of ability, language or culture. The W3C (www.w3.org) provides extensive resources on Internet and web technologies.

The W3C is also a standardization organization. Web technologies standardized by the W3C are called Recommendations. W3C Recommendations include the Extensible HyperText Markup Language (XHTML), Cascading Style Sheets (CSS), HyperText Markup Language (HTML—now considered a “legacy” technology) and the Extensible Markup Language (XML). A recommendation is not an actual software product, but a document that specifies a technology’s role, syntax rules and so forth.

1.4 Web 2.0

In 2003 there was a noticeable shift in how people and businesses were using the web and developing web-based applications. The term Web 2.0 was coined by Dale Dougherty of O’Reilly® Media¹ in 2003 to describe this trend. Although it became a major media buzzword, few people really know what Web 2.0 means. Generally, Web 2.0 companies use the web as a platform to create collaborative, community-based sites (e.g., social networking sites, blogs, wikis, etc.).

1. O’Reilly, T. “What Is Web 2.0: Design Patterns and Business Models for the Next Generation of Software.” September 2005 <http://www.oreillynet.com/pub/a/oreilly/tim/news/2005/09/30/what-is-web-20.html?page=1>.

Web 1.0

Web 1.0 (the state of the web through the 1990s and early 2000s) was focused on a relatively small number of companies and advertisers producing content for users to access (some people called it the “brochure web”). Web 2.0 involves the user—not only is the content often created by the users, but users help organize it, share it, remix it, critique it, update it, etc. One way to look at Web 1.0 is as a lecture, a small number of professors informing a large audience of students. In comparison, Web 2.0 is a conversation, with everyone having the opportunity to speak and share views.

Architecture of Participation

Web 2.0 is providing new opportunities and connecting people and content in unique ways. Web 2.0 embraces an architecture of participation—a design that encourages user interaction and community contributions. You, the user, are the most important aspect of Web 2.0—so important, in fact, that in 2006, TIME Magazine’s “Person of the Year” was “you.”² The article recognized the social phenomenon of Web 2.0—the shift away from a powerful few to an empowered many. Several popular blogs now compete with traditional media powerhouses, and many Web 2.0 companies are built almost entirely on user-generated content. For websites like MySpace®, Facebook®, Flickr™, YouTube, eBay® and Wikipedia®, users create the content, while the companies provide the platforms. These companies trust their users—without such trust, users cannot make significant contributions to the sites.

2. Grossman, L. “TIME’s Person of the Year: You.” TIME, December 2006 <http://www.time.com/time/magazine/article/0,9171,1569514,00.html>.

Collective Intelligence

The architecture of participation has influenced software development as well. Open source software is available for anyone to use and modify with few or no restrictions. Using collective intelligence—the concept that a large diverse group of people will create smart ideas—communities collaborate to develop software that many people believe is better and more robust than proprietary software. Rich Internet Applications (RIAs) are being developed using technologies (such as Ajax) that have the look and feel of desktop software, enhancing a user’s overall experience. Software as a Service (SaaS)—software that runs on a server instead of a local computer—has also gained prominence because of sophisticated new technologies and increased broadband Internet access.

Search engines, including Google™, Yahoo!®, MSN®, Ask™, and many more, have become essential to sorting through the massive amount of content on the web. Social bookmarking sites such as del.icio.us and Ma.gnolia allow users to share their favorite sites with others. Social media sites such as Digg™, Spotplex™ and Netscape enable the community to decide which news articles are the most significant. The way we find the information on these sites is also changing—people are tagging (i.e., labeling) web content by subject or keyword in a way that helps anyone locate information more effectively.

Web Services

Web services have emerged and, in the process, have inspired the creation of many Web 2.0 businesses. Web services allow you to incorporate functionality from existing applications and websites into your own web applications quickly and easily. For example, using Amazon Web Services™, you can create a specialty bookstore to run your website and earn revenues through the Amazon Associates Program; or, using Google™ Maps web services with eBay web services, you can build location-based “mashup” applications to find auction items in certain geographical areas. Web services, inexpensive computers, abundant high-speed Internet access, open source software and many other elements have inspired new, exciting, lightweight business models that people can launch with only a small investment. Some types of websites with rich and robust functionality that might have required hundreds of thousands or even millions of dollars to build in the 1990s can now be built for nominal amounts of money.

Semantic Web

In the future, we’ll see computers learn to understand the meaning of the data on the web—the beginnings of the Semantic Web are already appearing. Continual improvements in hardware, software and communications technologies will enable exciting new types of applications.

See our Web 2.0 Resource Center at www.deitel.com/web2.0/ for more information on the major characteristics and technologies of Web 2.0, key Web 2.0 companies and Web 2.0 Internet business and monetization models. The Resource Center also includes information on user-generated content, blogging, content networks, social networking, location-based services and more. We have separate Resource Centers on many Web 2.0 concepts and technologies. You can view a list of our Resource Centers in the first few pages of this book and at www.deitel.com/ResourceCenters.html.

1.5 Key Software Trend: Object Technology

One of the authors, HMD, remembers the great frustration felt in the 1960s by software development organizations, especially those working on large-scale projects. During his undergraduate years, he had the privilege of working summers at a leading computer vendor on the teams developing timesharing, virtual-memory operating systems. This was a great experience for a college student. But, in the summer of 1967, reality set in when the company “decommitted” from producing as a commercial product the particular system on which hundreds of people had been working for many years. It was difficult to get this thing called software right—software is “complex stuff.”

Improvements to software technology did emerge, with the benefits of structured programming and the related disciplines of structured systems analysis and design being realized in the 1970s. Not until the technology of object-oriented programming became widely used in the 1990s, though, did software developers feel they had the necessary tools for making major strides in the software development process.

What are objects and why are they special? Actually, object technology is a packaging scheme that helps us create meaningful software units. These can be large and are highly focused on particular application areas. There are date objects, time objects, paycheck objects, invoice objects, audio objects, video objects, file objects, record objects and so on. In fact, almost any noun can be reasonably represented as an object.

We live in a world of objects. Just look around you. There are cars, planes, people, animals, buildings, traffic lights, elevators and the like. Before object-oriented languages appeared, procedural programming languages (such as Fortran, COBOL, Pascal, BASIC and C) were focused on actions (verbs) rather than on things or objects (nouns). Programmers living in a world of objects programmed primarily using verbs. This made it awkward to write programs. Now, with the availability of popular object-oriented languages, such as C++, Java, Visual Basic and C#, programmers continue to live in an object-oriented world and can program in an object-oriented manner. This is a more natural process than procedural programming and has resulted in significant productivity gains.

A key problem with procedural programming is that the program units do not effectively mirror real-world entities, so these units are not particularly reusable. It’s not unusual for programmers to “start fresh” on each new project and have to write similar software “from scratch.” This wastes time and money, as people repeatedly “reinvent the wheel.” With object technology, the software entities created (called classes), if properly designed, tend to be reusable on future projects. Using libraries of reusable componentry can greatly reduce the effort required to implement certain kinds of systems (compared to the effort that would be required to reinvent these capabilities on new projects).
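As a small illustration (a hypothetical Invoice class we made up for this discussion, not code from any particular library), here is how such a reusable software unit might look in JavaScript, the language used throughout this book:

// A reusable "class" implemented as a JavaScript constructor function.
// Once written and tested, it can be dropped into many different projects.
function Invoice( partNumber, description, quantity, pricePerItem )
{
   this.partNumber = partNumber;
   this.description = description;
   this.quantity = quantity;
   this.pricePerItem = pricePerItem;
} // end Invoice constructor

// every Invoice object shares this method through the prototype
Invoice.prototype.getInvoiceAmount = function()
{
   return this.quantity * this.pricePerItem;
}; // end method getInvoiceAmount

// reuse: any application that needs invoices simply creates Invoice objects
var invoice = new Invoice( "12345", "Hammer", 3, 9.95 );
document.writeln( "Amount due: $" + invoice.getInvoiceAmount().toFixed( 2 ) );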

Software Engineering Observation 1.1

Extensive class libraries of reusable software components are available on the Internet. Many of these libraries are free.

Software Engineering Observation 1.2

Some organizations report that the key benefit object-oriented programming gives them is not software that is reusable but, rather, software that is more understandable, better organized and easier to maintain, modify and debug. This can be significant, because perhaps as much as 80 percent of software cost is associated not with the original efforts to develop the software, but with the continued evolution and maintenance of that software throughout its lifetime.

1.6 JavaScript: Object-Based Scripting for the Web

JavaScript is a powerful object-based scripting language with strong support for proper software engineering techniques. You’ll create and manipulate objects from the start in JavaScript. JavaScript is available free in today’s popular web browsers.
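As a small preview (a sketch in the spirit of the examples presented later in the book; the title and text are ours), the following XHTML document uses JavaScript to create a Date object, ask the object for the current hour and display the result in the browser:

<?xml version = "1.0" encoding = "utf-8"?>
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
   "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">

<!-- Creating and manipulating an object with JavaScript. -->
<html xmlns = "http://www.w3.org/1999/xhtml">
   <head>
      <title>A First JavaScript Object</title>
      <script type = "text/javascript">
         var now = new Date(); // create a Date object
         // ask the object for the current hour and write it into the page
         document.writeln( "<p>The current hour is " + now.getHours() + "</p>" );
      </script>
   </head>
   <body>
      <p>This page runs in any of today's popular JavaScript-enabled browsers.</p>
   </body>
</html>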

You’ll see that JavaScript is a portable scripting language and that programs written in JavaScript can run in many web browsers. Actually, portability is an elusive goal.

Portability Tip 1.1

Although it is easier to write portable programs in JavaScript than in many other programming languages, differences among interpreters and browsers make portability difficult to achieve. Simply writing programs in JavaScript does not guarantee portability. You’ll occasionally need to research platform variations and write your code accordingly.
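For instance, browsers have historically differed in how scripts register event handlers. A common defensive pattern (sketched below; addHandler is a helper name we chose for illustration) tests which capability the current browser provides before using it:

// register an event handler in a cross-browser way by testing which
// event model the current browser supports before calling it
function addHandler( element, eventName, handlerFunction )
{
   if ( element.addEventListener ) // W3C DOM event model (Firefox, Safari, Opera, ...)
      element.addEventListener( eventName, handlerFunction, false );
   else if ( element.attachEvent ) // event model of older versions of Internet Explorer
      element.attachEvent( "on" + eventName, handlerFunction );
   else // last resort: assign to the DOM 0 event-handler property
      element[ "on" + eventName ] = handlerFunction;
} // end function addHandler

// usage: the calling code is the same regardless of the browser in use
addHandler( window, "load", function() { window.alert( "Page loaded" ); } );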

Portability Tip 1.2

When writing JavaScript programs without the help of a library, you need to deal directly with cross-browser portability issues. Such issues are hidden by JavaScript libraries, such as Dojo (discussed in Chapter 13), Prototype, Script.aculo.us and ASP.NET Ajax, which provide powerful, ready-to-use capabilities that simplify JavaScript coding by making it cross-browser compatible.

Error-Prevention Tip 1.1

Always test your JavaScript programs on all systems and in all web browsers for which they are intended.

JavaScript was created by Netscape, the company that developed the first widely successful web browser. Both Netscape and Microsoft have been instrumental in standardizing JavaScript through ECMA International (formerly the European Computer Manufacturers Association) as ECMAScript. Adobe Flash uses another scripting language named ActionScript. ActionScript and JavaScript are converging in the next version of the JavaScript standard (JavaScript 2/ECMAScript version 4), currently under development. This will result in a universal client scripting language, greatly simplifying web application development.

1.7 Browser Portability

Ensuring a consistent look and feel on client-side browsers is one of the great challenges of developing web-based applications. Currently, a standard does not exist to which software developers must adhere when creating web browsers. Although browsers share a common set of features, each browser might render pages differently. Browsers are available in many versions and on many different platforms (Microsoft Windows, Apple Macintosh, Linux, UNIX, etc.). Vendors add features to each new version that sometimes cause cross-platform incompatibility issues. Clearly, it is difficult to develop web pages that render correctly on all versions of all browsers. In this book we develop web applications that execute on the Internet Explorer 7 and Firefox 2 (and higher) browsers. Most examples will operate correctly in other recent browsers, such as Opera, Apple’s Safari and Google’s Chrome, but we have not explicitly tested the applications on these other browsers.

Portability Tip 1.3

The web is populated with many different browsers, which makes it difficult for authors and web application developers to create universal solutions. The W3C is working toward the goal of a universal client-side platform.

1.8 Web Resources

www.deitel.com/

Check this site frequently for updates, corrections and additional resources for all Deitel & Associates, Inc., publications.

www.deitel.com/ResourceCenters.html

Check out the complete list of Deitel Resource Centers, including numerous programming, open source, Web 2.0 and Internet business topics.

www.w3.org

The World Wide Web Consortium (W3C) website offers a comprehensive description of web technologies. For each Internet technology with which the W3C is involved, the site provides a description of the technology, its benefits to web designers, the history of the technology and the future goals of the W3C in developing the technology.

www.deitel.com/Ajax/
www.deitel.com/XML/
www.deitel.com/XHTML/
www.deitel.com/CSS21/
www.deitel.com/Dojo/
