SUPPLEMENT D

Information Technology

Computer processing power will double every 18 months

Moore’s Law

The Information Revolution since World War II, dominated by the computer, is now integrated with the latest accounting technologies. Moore’s Law continues to operate, part of a broader evolution that includes mass communications, global business, and access by virtually anyone.
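
The epigraph’s 18-month doubling rule implies exponential growth. A rough sketch of the arithmetic, assuming an 18-month (1.5-year) doubling period and an arbitrary 10-year span chosen only for illustration:

# Rough arithmetic behind the epigraph's 18-month doubling rule.
# The 1.5-year period and the 10-year span are assumptions for illustration.
def growth_factor(years, doubling_period_years=1.5):
    """Total growth after 'years', doubling once per doubling period."""
    return 2 ** (years / doubling_period_years)

# Over one decade: about 10 / 1.5 = 6.7 doublings, roughly a 100-fold increase.
print(round(growth_factor(10)))    # ~102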

A basic characteristic of accounting is repetitive entries: thousands of sales and receivables, inventory, and payroll transactions. Large paper accounting worksheets and No. 2 pencils, special journals, and subsidiary ledgers were early answers. The 20th century saw the mechanization of repetitive transactions using tabulating machines and, mid-century, the computer. Combining advancing technology with networked accounting systems and global access through the Internet, accountants entered the 21st century as part of the Information Revolution.

Mass Communication

The age of mass communication may have started with Gutenberg’s printing press, but of more interest is the near-instantaneous messaging of the telegraph. Samuel Morse invented the telegraph transmitter and receiver in 1837. The telegraph is an electromagnet connected to a battery by a switch (the telegraph key), an on-off switch that allows the timing and spacing of the electric pulses to vary. Morse Code standardized these pulses (dots and dashes) to form letters and numbers. Funded by the federal government, telegraph lines were strung next to the Baltimore and Ohio Railroad track between Washington, DC and Baltimore; the first message, in 1844: “What hath God wrought.” For the first time information was transformed into electrical impulses and transmitted long distances. By 1850, 50 telegraph companies competed in the United States. Western Union was created in 1851 as a combination of a dozen of these companies. Soon after the Civil War, telegraph lines stretched across the continent and, with the Atlantic cable, to Europe and beyond.
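
To make the dot-and-dash encoding concrete, here is a minimal sketch using a partial code table (only the letters needed for the example); it spells out Morse’s first message.

# A small excerpt of the Morse code table (partial; only the letters needed here).
MORSE = {
    "A": ".-", "D": "-..", "G": "--.", "H": "....", "O": "---",
    "R": ".-.", "T": "-", "U": "..-", "W": ".--", " ": "/",
}

def encode(message):
    """Translate letters into dot/dash groups, separating words with '/'."""
    return " ".join(MORSE[ch] for ch in message.upper())

print(encode("What hath God wrought"))
# .-- .... .- - / .... .- - .... / --. --- -.. / .-- .-. --- ..- --. .... -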

The telephone would follow, with Alexander Graham Bell’s first phone demonstrated in 1876. The telephone has a transmitter and receiver, both invented by Bell (and others). Bell’s transmitter used magnetism to change sound into electricity. Bell and others went commercial and the telephone eventually caught on. The company Bell cofounded became American Telephone and Telegraph. Thomas Edison’s carbon transmitter was an improvement and was eventually acquired by Bell. Calls required direct wire connections, soon through switchboards, then dial telephones. Telephones connected to a central office, which transferred calls to other local customers or routed them to other telephone offices and long-distance facilities.

Edison invented the electric light and started electrifying major cities. Movies were another 19th-century Edison invention. Marconi sent the first radio signal in 1895. The potential for the next century looked extraordinary. Early 20th-century communications technology included radio and television prototypes. Instantaneous communications are taken for granted today, but they are the result of inventions over the past century and a half.

Early Computing

Prior to the late-19th century there were a few attempts at mechanical computing. Blaise Pascal and Gottfried Leibniz invented mechanical adding machines in the 17th century, really intellectual curiosities of famous mathematicians. Charles Babbage developed concepts of large calculating machines that could be considered 19th-century computers, but failed to build them despite attempts using his own and government funds.1 Working with Babbage, Ada Lovelace developed concepts of programming the machine to do limited tasks using Jacquard punch cards; she could be considered the first programmer. Thomas de Colmar invented the “Arithmometer” in 1820, and production of the Felt and Tarrant “Comptometer” began in 1885. Perhaps the most successful commercial calculating machines were developed by William Burroughs at American Arithmometer (1886) and later Burroughs Adding Machine. By 1926 Burroughs had shipped over one million calculating machines. The company became part of Unisys Corporation.

The most important machine for accounting potential was the punch-card tabulating machine, invented by Herman Hollerith and based on the early 19th-century Jacquard loom, which used punched cards to create fabric designs. Hollerith taught mechanical engineering at Massachusetts Institute of Technology in the 1880s, where he started work on a tabulating system. His system included punch cards, a tabulator, and a sorter, gaining considerable fame when used for the 1890 U.S. Census, which was finished in just six months. The punch cards used 12 rows and 24 columns to describe each person in the census, a forerunner of the electronic spreadsheet. He founded the Tabulating Machine Company in 1896, acquired by Computing-Tabulating-Recording Company (CTR) in 1911. Hollerith served as a consulting engineer until retiring in 1921. Under the grandiose plans of Thomas Watson, Sr., CTR changed its name to International Business Machines (IBM) in 1924.
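
A hypothetical sketch of the idea of describing a record with a fixed grid of rows and columns; the hole positions and their meanings below are invented for illustration and do not reproduce Hollerith’s actual census layout.

# Model a punch card as a 12-row by 24-column grid of holes (True = punched).
# The holes "punched" below carry invented meanings, purely for illustration.
ROWS, COLS = 12, 24

def blank_card():
    return [[False] * COLS for _ in range(ROWS)]

def punch(card, row, col):
    card[row][col] = True

card = blank_card()
punch(card, 3, 0)     # hypothetical: a hole in column 0 coding a demographic field
punch(card, 7, 5)     # hypothetical: a hole in column 5 coding an age bracket

holes = sum(cell for row in card for cell in row)
print(f"{holes} holes punched out of {ROWS * COLS} positions")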

Hollerith leased the early tabulating machines, primarily because they were so unreliable that customers were reluctant to buy them. The leasing tradition was carried on by IBM. It proved to be an effective strategy during the Great Depression, because most customers continued to make lease payments. The new regulations of the 1930s New Deal made federal agencies big purchasers of IBM machines, as were companies needing the tabulating power to meet the new SEC disclosure requirements. Thomas Watson’s compensation included bonuses based on IBM profit and he became the highest paid executive in America in 1937, at $365,000 (dubbed the “thousand dollar a day man” by the press).

During the 1930s the cheapest IBM installation was the International 50 (for $50 a month). This included a card-sorter, keypunch, and a nonprinting tabulator. The focus was on accounting functions: payroll, address labels, accounts payable and receivable, and inventory. The machines were “programmed” using wires on a plugboard, not much different from an old telephone switchboard.2

The Stone Age

According to computer lore, the Stone Age was the period of the huge electro-mechanical “dinosaurs” of the 1940s and early 1950s. By the early 1930s, MIT professor Vannevar Bush had built his Differential Analyzer, the first analog computer, used to solve equations. Also in the 1930s, British engineers used vacuum tubes as on-off switches in electronic circuits. During World War II, both the United States and United Kingdom built giant computers. The British built Colossus for breaking high-level German ciphers. The Pentagon commissioned several universities to build massive computers for the war effort. The first real computer was the ENIAC (Electronic Numerical Integrator and Computer), built at the University of Pennsylvania by John Mauchly and Presper Eckert. It was completely electronic, using 17,000 vacuum tubes and taking up some 1,000 square feet of floor space. It was built for ballistic tables (calculations for artillery fire), but, because the computer was behind schedule and over budget, the war was over by the time ENIAC was publicly demonstrated in 1946. The key point: ENIAC was a general-purpose computer that could be programmed for any purpose.

Walter Isaacson gave Mauchly and Eckert credit for inventing the computer:

Not because the ideas were all their own but because they had the ability to draw ideas from multiple sources, add their own innovations, execute their vision by building a competent team, and have the most influence on the course of subsequent developments. The machine they built was the first general-purpose electronic computer. … The main lesson to draw from the birth of computers is that innovation is usually a group effort, involving collaboration between visionaries and engineers, and that creativity comes from drawing on many sources.3

Mauchly and Eckert filed a patent application and left the university to make their fortune in the computer business. Eckert-Mauchly’s Electronic Control Company produced the Binary Automatic Computer (BINAC) in 1948 for the Air Force, again behind schedule and over budget. The major effort was on the UNIVAC (Universal Automatic Computer). Mauchly and Eckert ran out of money in 1949 and sold their company to Remington Rand. The first UNIVAC was completed in 1951. The Census Bureau was the first customer. Forty-six UNIVACs would eventually be sold at more than $1 million each.

UNIVAC and all first generation electronic computers were big, used unreliable vacuum tubes, and required operating instructions written for each specific task. Each machine had its own binary-coded machine language, which made them difficult to program. Easy to criticize now, these were remarkable machines for their time.

One of the early purchasers of a UNIVAC I was General Electric (GE) in 1953. GE contracted with Arthur Andersen to do a feasibility study on using a computer for business functions. GE then hired Andersen to install a payroll system for 15,000 employees at their Lexington plant. The installation was time consuming and expensive, and several trouble spots developed. Initially, it took 5 days to compute the weekly payroll. There were too many control systems, logs, and printouts and too little storage capacity. The problems eventually were solved and Andersen had the first practical business software for the UNIVAC. The computer was ready to take over mundane accounting procedures and the Big 8 would become computer specialists.

IBM Dominates the Mainframe Market

The IBM contribution to the World War II effort was the Mark I (in cooperation with Harvard), an electromechanical device largely made up of many tabulators hooked together. IBM was slow to see the potential in the all-electronic computer. It seemed too big, expensive, and unreliable to compete with their tabulators. Top executives saw the computer as a scientific instrument, not for accounting applications. The company had 70 percent of the tabulator and related punch card markets and saw no immediate threat. IBM did build computers, but not for business. In 1947 they started work on the Selective Sequence Electronic Calculator (SSEC), the first electromechanical computer to run on software—using punch cards. IBM’s first “real computer” was the Defense Calculator, to be used for defense applications after the Korean War started in 1950. It was designed as a general-purpose scientific computer and 20 actual and prospective orders were soon received, the innovation being an “assembly-line” computer rather than a one-of-a-kind machine. This later became the IBM 701, which was coming off the production line by the end of 1952.

It was after the 701 that IBM decided to compete head-on with UNIVAC for accounting applications. They were in second place to UNIVAC, having installed about 15 computers to UNIVAC’s 20. By 1954 IBM had 50 orders for the newest machine, the 702. The big innovation was the use of magnetic core memory. By 1956, 87 machines were in operation and 191 ordered, against 41 in operation and 40 on order for all other computer manufacturers. IBM also began producing a small computer, called the 650, which rented for about $4,000 a month.

A new problem became apparent: a profession of programmers and systems analysts was developing. Installing computers and adapting them to the needs of a specific business was much more expensive than the machines themselves. The process also was time consuming (with the rule of thumb that installation and programming always took twice as long and cost twice as much as estimated). At the time there were no college degrees for programmers. By the mid-1950s, Thomas Watson, Jr. was encouraging universities such as MIT to start training computer scientists. IBM also gave large discounts to colleges that used their computers in data processing and scientific computing.

During much of the 1950s UNIVAC was the innovation leader in computers and many firms jumped into computer manufacturing. But IBM was the dominant player in the related tabulator market and dominated the punch-card market, the card being an important computer peripheral. Magnetic tape, used by UNIVAC, was faster and cheaper, but the punch card was a permanent record and had long been used by major corporations. The card was bulky when stored by the millions, but would stay around for decades. IBM launched several computers, including the 700 series—leased, as the tabulators were.

Even with inferior products and high prices, IBM could stay competitive. How was this possible? IBM had the best sales and distribution methods in the business, which they had developed over the decades with their tabulating empire. They called their approach “systems knowledge”: “We consistently outsold people who had better technology because we knew how to put the story before the customer, how to install the machines successfully, and how to hang on to customers once we had them.”4 IBM had an army of experts to handle tough programming problems and a large library of computer programs available to customers for free. Sales staff filled out “call reports” when competing for new customers and when customers were dissatisfied. The company kept close tabs on the success of their competitors. By 1957 IBM revenues hit $1 billion, making it one of the 40 largest companies in America.

UNIVAC was the first to use the transistor for computers,5 the Model 80 in 1958—the first Second Generation Computer. This was followed by the RCA BIZMAC and the Control Data Corp. (CDC) 1604. CDC was founded by disgruntled UNIVAC engineers and the 1604 was the fastest computer in the market. IBM launched the 7000 series in 1959, but became really competitive with the lower priced 1401 in 1960. The IBM 1401 included all the components associated with computers: disk and tape storage, memory, operating systems, shared programs, and printers. Honeywell introduced the 200 Series, which was cheaper and faster than the IBM 1401 and could run 1401-compatible programs. Thus, IBM set the standard, but had to share the market with products both faster and cheaper.

Second generation computers used assembly language rather than machine language, which simplified programming and opened the possibility of software compatibility across different computers. Even so, incompatibility remained a major problem. Growing companies and others changing focus (reorganizations, acquisitions, etc.) had expanding computer needs, and changing computers, even from the same manufacturer, required new software. The solution was a few years off.

Other second-generation competitors included Burroughs and Sperry Rand (Remington Rand, UNIVAC’s parent, had merged with Sperry in 1955; Sperry and Burroughs would merge in 1986 to form Unisys). Big companies including RCA and GE would soon compete. Lots of innovation occurred by 1960: FORTRAN was created at IBM, the integrated circuit was invented at Texas Instruments, and the first minicomputer, the PDP-1, was built by Digital Equipment. Despite the competition, IBM had two-thirds of the market in 1961, the year the 1400 series was introduced. Of the 6,000 computers in the United States, 4,000 were IBM.

The integrated circuit, developed by Jack Kilby at Texas Instruments in 1958, combined many transistors, resistors, and capacitors on a single chip. This was the most significant invention associated with third generation computers, which started in the mid-1960s. Also, disk storage systems were developed and programming languages widely adopted. The obvious results: computers were more powerful, cheaper, and more flexible.

By the early 1960s IBM was successful with eight competing computers, but each had a different internal architecture, which required different software and peripherals. The other big problem: competitors’ new machines were technologically superior. In 1961 IBM made their boldest stroke ever, a $5-billion gamble that the answer was a single, fully compatible computer line, using integrated circuits and other recent innovations, that would span the entire computer market. They called this the System/360 (S/360), for the 360 degrees in a circle, to emphasize they intended to meet the needs of all computer users. This was publicly announced in 1964 and the first System 360/40 was delivered in April 1965. They spent incredible sums of money, were well behind schedule, and delivered the first computers without full testing (and they weren’t very reliable).6 But they were successful and had a period of 30 percent a year growth; IBM’s profits topped $1 billion in 1968. A new industry of plug-compatible peripherals developed, with manufacturers making cheaper disk drives, terminals, and other products that would plug into System/360 computers. Other manufacturers found holes in the product line. CDC, for example, developed the 6600 supercomputer and retained the distinction of producing the world’s fastest computer.

Despite the competition, IBM dominated the mainframe market.7 They supplied applications programs and assisted in writing programs, which encouraged expanded computer use (and buying more hardware) and locked customers into “proprietary systems” belonging to IBM. The Antitrust Division of the Justice Department took a dim view of this and sued IBM in 1969 (following a 1968 CDC case over marketing the 360/90). IBM changed some practices, including the “unbundling” of peripherals and other products. The use of standardized programming languages also limited IBM’s dominance through software.

A major competitive area was the use of service bureaus. These firms could buy IBM computers, add cheaper components made by plug-compatible manufacturers, and lease time to smaller firms that did not need a full-time computer. The service firms developed their own software and systems analysis expertise, operating the computers virtually 24 hours a day. The most famous was Electronic Data Systems (EDS), founded by H. Ross Perot (EDS was later acquired by General Motors).

Digital Equipment Corporation (DEC) introduced the PDP-8 in 1965, the first commercially successful minicomputer. The PDP-8 sold for $18,000, about 20 percent below the price of IBM’s smallest 360. The minicomputer became a growth area. IBM ignored minis until it introduced the System/3 in 1969. However, DEC was the leader here and IBM never became the dominant player.

Fourth generation computers started in the 1970s. The IBM version was the 370 series, an update of the System/360 that remained compatible with the older software and peripherals. The key technological improvement was the large-scale integrated circuit. The concept was placing all the processing circuits on a single chip, which became the microprocessor. The earliest was the Intel 4004, developed in 1971. However, the “Golden Age” of the mainframe was over. Growth peaked in 1968 and overcapacity plagued the industry by the early 1970s. Technology would head in new directions. It was the personal computer that next revolutionized the industry, with IBM a major but not dominant player. IBM still dominates the mainframe market with about a 90 percent market share (such as with its zSeries). These computers store large amounts of information in one location (increasing reliability) and are typically used by big companies, governments, and other large organizations for bulk transaction processing, as platforms for e-commerce, and so on.

The Personal Computer

The idea of the personal computer (PC) is computer power used by the individual at home or office. Several models claimed to be the first PC, including the Kenbak-1 in 1971, the Micral in 1973, and the Altair 8800 in 1975. However, it was the Apple I, designed by Steve Wozniak in 1976 in partnership with Steve Jobs, that had commercial success.

Xerox set up the Palo Alto Research Center (PARC) in 1970 in response to the prospect of the “paperless office.” It was here that many of the best ideas associated with the PC (as well as other technologies) originated, and PARC also had a claim to the first PC. Doug Engelbart’s mouse (invented at SRI) and bit-mapped displays found a home there. Bob Metcalfe invented the Ethernet, originally to link computers to laser printers (a PARC invention). The graphical user interface also was produced at PARC.

The key to PCs is the microprocessor, an integrated circuit on a silicon chip with thousands, then millions, of transistors functioning as the central processing unit (CPU). The first commercial microprocessor was the Intel 4004, pioneered in 1971 with 2,300 transistors (a top-end Intel Xeon has 7.2 billion). This allowed the simultaneous miniaturization and mass production of basic computer components, with the added advantages of being relatively cheap (with more power and lower price). It costs a lot to develop a new semiconductor, but not much to manufacture. The strategy of most chip manufacturers is high volume to recover the development cost (a fixed cost) over millions of chips.
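
The economics described here can be sketched with simple arithmetic; the dollar figures below are invented for illustration only.

# Average cost per chip = fixed development cost / volume + marginal manufacturing cost.
# Both dollar figures are made-up numbers used only to show the effect of volume.
def average_cost(fixed_development, marginal_cost, volume):
    return fixed_development / volume + marginal_cost

DEV_COST = 100_000_000    # hypothetical fixed development cost
UNIT_COST = 5             # hypothetical marginal cost per chip

for volume in (10_000, 1_000_000, 100_000_000):
    print(f"{volume:>11,} chips -> ${average_cost(DEV_COST, UNIT_COST, volume):,.2f} each")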

Beyond the microprocessor are the necessary computer hardware and useful software, including the killer applications that make the PC especially desirable. Different hardware strategies were attempted. Apple tried to maintain proprietary technology. IBM used open standards, setting de facto industry standards—and allowing a gigantic clone market. Early “killer app” software included the electronic spreadsheet (VisiCalc), the database (dBase), and word processing (WordStar). Accountants were major beneficiaries.

The first big commercial success was the Apple II, invented by Steve Wozniak and introduced in 1978. It looked like a PC, with floppy disk drive, color, and video graphics. It was a simple machine, cheap to produce, although sold at the somewhat inflated price of $3,000. Amazingly, it succeeded as a business machine.

Dan Bricklin, a computer programmer attending Harvard Business School, wrote business programs performing financial calculations for class. Attempting to format business calculations in a more general way, he invented calculating procedures based on data and formulas arranged in a matrix of rows and columns. A simple idea—a brilliant idea—the electronic spreadsheet: the killer app accountants loved! Working with Bob Frankston, he developed VisiCalc specifically for the Apple II. It hit the market in 1979. Accountants at big businesses, the people with the room-sized mainframes, saw the potential and were the major buyers.
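
A minimal sketch of the underlying idea: cells arranged by row and column, where some hold data and others hold formulas that reference them. The cell names, figures, and formulas below are assumptions for illustration, not VisiCalc’s actual design.

# Tiny spreadsheet: cells hold either values or formulas (functions of earlier cells).
# Cell names, figures, and formulas are invented for this example.
sheet = {
    "A1": 1200.00,                          # sales
    "A2": 450.00,                           # cost of goods sold
    "A3": lambda s: s["A1"] - s["A2"],      # gross margin
    "A4": lambda s: s["A3"] / s["A1"],      # margin percentage
}

def evaluate(sheet):
    """Evaluate cells in order; a formula may reference any earlier cell."""
    values = {}
    for name, entry in sheet.items():
        values[name] = entry(values) if callable(entry) else entry
    return values

results = evaluate(sheet)
print(results["A3"], results["A4"])    # 750.0 0.625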

The Macintosh was the ideal personal computer for many, embodying Steve Jobs’ vision after he toured Xerox PARC, and it had all the PARC innovations except networking.8 (Microsoft would not come close to the Mac’s graphical user interface for a decade.) It was introduced in 1984 and became a money machine. But Steve Jobs left Apple in 1985 and innovation stopped.

IBM entered the PC race in 1980, about the time the PC market hit a billion dollars; a market now worthy of IBM domination. The strategy was to gather existing hardware and software and market the results in a big metal box with the blue IBM logo. IBM chose the Intel 8088 processor, a somewhat out-of-date product, but one with the related chips needed for a computer. BASIC was acquired from Microsoft and, in a strange twist of fate, so was the operating system that became MS-DOS. CP/M, written by Gary Kildall, was the likely operating system, but when IBM had trouble meeting with Kildall, Bill Gates acquired QDOS from Tim Paterson. Gates signed a codevelopment deal with IBM but retained the rights to DOS, which became the core of Microsoft’s success. The only real IBM proprietary item was the ROM-BIOS chip, linking the hardware to Microsoft’s operating software.

Far from being a great machine, the IBM PC introduced in 1981 became the top selling personal computer by 1983. The PC was retailed in stores, but also sold directly to big business. In 1984 IBM sold over one million PCs to take a 40-percent market share. That same year IBM earned more income than any company ever had, $6.6 billion. The killer app was Mitch Kapor’s Lotus 1–2–3. Kapor borrowed the electronic spreadsheet concept from VisiCalc, but optimized it for the IBM PC. Tough luck for Dan Bricklin; Kapor became rich and (somewhat) famous.

The success of the IBM PC led to IBM clones, machines that were virtually identical to the IBM but often with more power and a lower price. IBM established the de facto PC standards. Clone makers relied on reverse engineering: tear down the machine to its component parts. Most parts were off-the-shelf with eager suppliers willing to sell the same “IBM compatible” components and software. The most difficult part was the ROM-BIOS, first cracked by Compaq Computer. To appear different, the first Compaq was a 28-pound portable with a small monitor. The software was basically the same, including MS-DOS.

Early on, IBM was innovative, introducing the PC-AT in 1984 with the Intel 80286 and a hard drive. Compaq and other clone makers followed, racing along the Moore’s Law curve, with IBM struggling to stay even. Intel’s 386, 486, Pentium, Pentium Pro, and Pentium II followed, the processors that drove Moore’s Law’s price-performance curve. Approaching the end of the millennium, Compaq was the leading PC hardware producer, selling mainly through retail stores; Dell was the biggest direct-mail producer; IBM was a less than dominant player; Apple an also-ran.

Bill Gates saw the potential of the graphical user interface and Microsoft began working on its version in 1982. Windows was first shipped in 1985, but was a clunker. It was built on top of DOS, very slow, with few applications. Windows improved, and Windows 3.0 was introduced in 1990, a product that sold in the millions. With Windows 95 Microsoft became the software leader, with an 85 percent share of PC operating systems. Could the Justice Department be far behind? Nope. Microsoft must be a nasty monopoly and the Antitrust Division sued. Microsoft’s behavior was less than stellar and debate raged on how unfair Microsoft was. When the Justice Department investigates, it’s time to reduce profit. Microsoft created reserve accounts (a standard income smoothing technique) to park what otherwise would be current income. Microsoft survived as a blue chip but stodgy competitor.

IBM had set the PC standard—off-the-shelf hardware and software, which became “Wintel”: Microsoft Windows and Intel chips. Dell Computer tried to dominate as the low-cost producer with some success. IBM faced too many competent (and cheaper) competitors, selling off its PC business to Chinese manufacturer Lenovo in 2005. The PC became an indispensable commoditized product, still more-or-less following Moore’s Law of better, faster, cheaper.

Networks

Networks with the ability to exchange data electronically seem to be one of the recent technological breakthroughs. However, Western Union evolved into a 19th-century national network providing instantaneous information via telegraph, followed by the AT&T telephone system decades later. AT&T’s research arm, Bell Labs, introduced many electronic inventions useful not only to phone service and long distance but also to future computer networks. For example, Bell Labs discovered that signals could travel long distances using microwaves, building microwave towers across the country and, later, communication satellites reaching beyond it. The telecommunication network proved handy for computer networking, in terms of innovation, architecture, and infrastructure.

The concept of a computer network is simple. Hook up several terminals or PCs to a host (perhaps a mainframe computer or server) and connect the host to other networks. The point is sharing technology, computing power, and information. Accomplishing this goal was complicated and required a number of crucial inventions. By the mid-1960s video display and printing terminals were common, linking users to the mainframe across office buildings and beyond. In 1964 IBM developed the SABRE reservation system for American Airlines, the first online transaction processing system.

In 1966 the American Standard Code for Information Interchange (ASCII) was developed to standardize communications and computer code (in some sense, an update of the Morse Code). The Recommended Standard (RS-232C) defined how data moved over a communications link. The modem, an AT&T invention converting digital to analog signals and vice versa, connected computers to telephone lines, generally using the RS-232C interface. Large numbers of terminals and PCs could connect to mainframes and minicomputers over long distances. In 1973 Bob Metcalfe introduced Ethernet technology at Xerox PARC, the first of three major networking protocols (the others being ARCnet and IBM’s Token-Ring).
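
A small sketch of what such standardization means in practice: every character maps to an agreed-upon numeric code, so any two machines interpret the same bytes the same way.

# Each character has a fixed ASCII code, shown here in decimal and 7-bit binary.
for ch in "IBM 1401":
    code = ord(ch)
    print(f"{ch!r} -> {code:3d} -> {code:07b}")

# Decoding reverses the mapping, so text survives transmission between systems.
codes = [80, 65, 89, 82, 79, 76, 76]
print("".join(chr(c) for c in codes))    # PAYROLL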

A basic business network is the local area network (LAN), computers initially connected by cable within a single building. During the 1980s and beyond, integrated word processing, spreadsheet, and database programs were developed, followed by work group programs to search and share databases and analysis. In a server-based LAN, a large computer with a multitasking operating system acted as the host or server and provided application programs, file storage, printing services, and external communication links.
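
A minimal sketch of the host-and-client idea behind a server-based LAN, using a loopback TCP connection on one machine; the port number and message text are arbitrary choices for the example.

# Minimal host/client exchange over TCP on one machine (loopback address).
# The port number and message text are arbitrary choices for this sketch.
import socket
import threading

server = socket.create_server(("127.0.0.1", 50007))   # the LAN "host"

def serve_one_request():
    conn, _ = server.accept()
    with conn:
        request = conn.recv(1024).decode()
        conn.sendall(f"host received: {request}".encode())

threading.Thread(target=serve_one_request, daemon=True).start()

# A "workstation" connects to the host, sends a request, and reads the reply.
with socket.create_connection(("127.0.0.1", 50007)) as client:
    client.sendall(b"print queue status?")
    print(client.recv(1024).decode())

server.close()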

Internet

The Internet is a collection of computer networks tied together by telecommunication lines and standard protocols. It started in 1969 as the Pentagon’s Advanced Research Projects Agency network (ARPANET), a communications system lacking a central authority so it could survive a nuclear war. Ray Tomlinson wrote the e-mail message send and read software in 1972, and this massive computer network became an expensive electronic post office.

When the Transmission Control Protocol/Internet Protocol (TCP/IP) was developed in the mid-1970s, other networks had the ability to link with ARPANET. The result was a network of networks using public domain software. Along with the Evil Empire, ARPANET fell by the wayside in 1989.

With the widespread use of PCs, workstations, and networks, the Internet became virtually free and accessible anywhere. It was an education system, a commercial system, a source of seemingly unlimited information. It also represented a new industry: new high-tech players like Netscape and Yahoo competing with giants such as Microsoft and IBM to dominate this new market. Different strategies developed. Microsoft built Internet access into the Windows operating system. Netscape planned to make Windows irrelevant by using its Navigator as a complete operating system. Becoming a cyber-presence proved to be easier than making a cyber-buck. While most crashed, the big success stories, such as Google and Facebook, joined the ranks of the largest corporations (by market cap) in the world.

Corporate intranets, networks using Internet standards and protocols, limit access with “firewalls” while using the Internet to span multiple locations. When the organization allows proprietary access to its intranet to selected business “partners” such as suppliers or distributors, the intranet becomes an extranet. For many organizations an Internet and intranet game plan proved mandatory, with a web presence essential for advertising, communicating, and selling goods and services. Most Internet and intranet transactions are business-to-business. Consumers can do research on Google, gossip on Facebook, buy books from Amazon, conduct banking transactions, order a custom computer, and buy and finance a new car.

What about accounting information and strategy? All Big 4 firms have stellar websites and extol their systems expertise: enterprise software, web security, e-commerce. Doing business over the net can result in significant cost savings, for example, by minimizing inventories through sharing information with suppliers and distributors. The Internet became a major source of financial information. Most large corporations post extensive management and financial data, the SEC offers its EDGAR system of 10-K filings and proxy statements of public corporations, and many sites have stock market and financial information available for thousands of firms.

Accounting Applications

Until recently, most companies starting to computerize accounting applications would use essentially the same process that General Electric used in 1953. This can be considered a three-stage process: Stage 1—computerize repetitive high-volume transaction areas: payroll, inventory, receivables, and payables. These would be handled independently, each with separate programming, processing, and data files. Early adoptions (especially in the 1950s) required substantial programming effort. Managers were given daily or monthly summary sheets of activity (based on batch processing). Later applications typically used software packages or started with service bureaus.9
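
A simplified sketch of the kind of repetitive, high-volume batch job a Stage 1 system handled, here a weekly payroll run; the employee records and pay rates are invented for illustration.

# Simplified Stage 1 batch job: compute gross pay from a file of weekly time records.
# Employee records and pay rates are invented; a real run added taxes, deductions, etc.
employees = [
    {"id": 101, "name": "A. Smith", "hours": 40, "rate": 18.50},
    {"id": 102, "name": "B. Jones", "hours": 44, "rate": 22.00},   # 4 overtime hours
    {"id": 103, "name": "C. Lee",   "hours": 38, "rate": 19.25},
]

def gross_pay(hours, rate, overtime_multiplier=1.5):
    regular = min(hours, 40) * rate
    overtime = max(hours - 40, 0) * rate * overtime_multiplier
    return regular + overtime

total = 0.0
for emp in employees:
    pay = gross_pay(emp["hours"], emp["rate"])
    total += pay
    print(f"{emp['id']}  {emp['name']:<10} {pay:9.2f}")
print(f"Batch total:      {total:9.2f}")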

Second-stage applications put most transaction processing on computers, and the applications were relatively integrated. A financial application included all cash receipts and disbursements. Credit sales processing included receivables, inventory, and sales. Financial information at this stage was more useful for many management decisions and was often available at the manager’s desk via a workstation.

Third-stage applications were integrated and included communication links across the firm and to both customers and suppliers. Much of this activity could be run on the Internet. Information was stored in sophisticated database files using data management software and shared across the organization. The big business version was sophisticated enterprise software with transactions captured and processed at their source in real time. Decision support software allowed substantial information for virtually all management decision areas.

Small businesses today can computerize using powerful PCs with accounting software. These are integrated systems that have much of the potential associated with third-stage applications and enterprise software, including communications links through the Internet.

The Role of Accounting

The current financial accounting paradigm is not much different from the double-entry system invented by Italian merchants seven centuries ago. Problems are readily apparent: there is information not captured by accounting (such as customer satisfaction and product quality), and summary data based on GAAP may have limited usefulness. Financial report information is not timely, since the focus is on periodic reporting. This paradigm becomes less relevant with advancing technology. However, business processing is making the leap as economic transactions become recognized as part of integrated systems built around resources, events, agents, and locations. The accounting model is lagging the technology, a problem that needs to be recognized.
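
A hedged sketch of the resources-events-agents idea mentioned above: an economic event links a resource to the agents involved, and a conventional journal entry can be derived from the event record; the field names and figures are assumptions for illustration, not a standard schema.

# Minimal resources-events-agents record for one sale event.
# Field names and values are illustrative assumptions, not a standard schema.
from dataclasses import dataclass

@dataclass
class EconomicEvent:
    event_type: str       # e.g., "sale"
    resource: str         # what flowed: inventory, cash, a service
    quantity: int
    unit_price: float
    internal_agent: str   # e.g., the salesperson
    external_agent: str   # e.g., the customer
    location: str

sale = EconomicEvent("sale", "widget SKU-4471", 12, 9.99,
                     internal_agent="Clerk 07",
                     external_agent="Customer 3312",
                     location="Store 14")

# A traditional double entry can be derived from the event record when needed.
amount = round(sale.quantity * sale.unit_price, 2)
print(f"Dr Accounts Receivable {amount}  /  Cr Sales Revenue {amount}")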

Computer, Internet, cloud computing, and related technologies impact accounting and virtually all other professional fields. Progress is so fast that just keeping up is difficult. Big data is an important area, the ability to store vast amounts of data that potentially could be used for improving transparency, financial reporting, auditing, managerial accounting, and so on. Combined with artificial intelligence, new areas of accounting expertise could be on the horizon. Thomas Friedman emphasizes the importance of Moore’s Law:

where the doubling has gotten so big and fast we’re starting to see stuff that is fundamentally different in power and capability from anything we have seen before. … Global flows of commerce, finance, credit, social networks, and connectivity generally are weaving market, media, central banks, companies, schools, communities, and individuals more tightly together than ever.10

Accounting is right in the thick of these changes and the roles of accountants will expand, get more complex, and require increased education and experience.
