12

USING INFORMATION ETHICALLY

IT has created a unique set of ethical issues related to the use and control of information. This chapter addresses those issues from various perspectives using three normative theories (stockholder, stakeholder, and social contract) to understand the responsible use and control of information by business organizations. Social contract theory is extended to evolving issues such as green computing and responsiveness to foreign governments when ethical tensions emerge. At the individual and corporate level, Mason's PAPA (Privacy, Accuracy, Property, Accessibility) framework is applied to information control. The chapter concludes with discussions of the ethical role of managers in today's dynamic world of social business and security controls to keep information safe and accurate.

When TJX Co. discovered the largest data security breach in the history of retailing, it faced an ethical dilemma few companies have confronted. It originally estimated that the credit card accounts of 45.6 million customers worldwide were affected (that number was later revised to 94 million). Given the extent of the breach, multiple state, federal, and foreign jurisdictions dictated how and when TJX must inform affected customers and what corrective steps it must take. Most jurisdictions allowed 45 days to act following determination of the breach; any delay beyond 45 days would incur heavy fines. The ethical question, however, was even more pressing: should TJX inform the affected customers immediately, or wait until the breach was secured and all remedial steps were undertaken, which might take weeks?

As a socially responsible company, TJX makes its obligations to customers a priority. If it informed customers immediately, they could start taking preventive steps to protect themselves from identity theft and avoid resulting financial and psychological losses. However, the breach would then become public knowledge before the remedial steps were taken; more hackers would learn about it and possibly exploit the weakness in TJX's IT infrastructure. Additionally, the financial markets would lose confidence in the company, severely punishing shareholders. Such a loss of image would also affect TJX's ability to attract and retain high-quality employees in the long run. On the other hand, if it waited the full 45 days, the financial stability of many customers could be compromised through misuse of their credit card and other private records. This could result in major class-action litigation, which might permanently damage the company.

As in the case of TJX, information collected in the course of business is important for the conduct of business and can even create valuable competitive advantage. But managers must ask ethical questions about how that information will be used and by whom, whether those uses arise inside or outside the organization. Failing to do so can carry serious consequences. Failing to protect consumer information ultimately can hurt shareholder relationships if costs associated with a breach have a negative impact on the bottom line. Acting responsibly is likely to gain legitimacy in the eyes of key stakeholders. Further, failure to adequately control information can cause a spillover effect with repercussions for an entire industry. For example, following the TJX breach, Massachusetts passed legislation with stringent requirements for any organization maintaining information about its citizens.1 As computer networks and their products come to touch every aspect of people's lives, and as the power, speed, and capabilities of computers expand, managers are increasingly challenged to govern their use and protect the information residing on them in an ethical manner.

In such an environment, managers are called on to manage the information generated and contained within those systems for the benefit not only of the corporation, but also of society as a whole. The predominant issue, which arises due to the omnipresence of corporate IS, concerns the just and ethical use of the information companies collect in the course of everyday operations. Without official guidelines and codes of conduct, who decides how to use this information? More and more, this challenge falls on corporate managers. Managers need to understand societal needs and expectations to determine what they ethically can and cannot do in their quest to learn about their customers, suppliers, and employees, and to provide greater service.

Before managers can deal effectively with issues related to the ethical and moral governance of IS, they need to know what these issues are. Unfortunately, as with many emerging fields, well-accepted guidelines do not exist. Thus, managers bear even greater responsibility as they try to run their businesses and simultaneously develop control methods that meet both corporate imperatives and the needs of society at large. If this challenge appears to be a matter of drafting operating manuals, nothing could be further from the truth.

In a society whose legal standards are continually challenged, managers must serve as guardians of the public and private interest, although many may have no formal legal training and, thus, no firm basis for judgment. This chapter addresses many such concerns. It begins by expanding on the definition of ethical behavior and introduces several heuristics that managers can employ to help them make better decisions. Next this chapter elaborates on the most important issues behind the ethical treatment of information and some newly emerging controversies that will surely test society's resolve concerning the increasing presence of IS in every aspect of life.

This chapter takes a high-level view of ethical issues facing managers in today's environment. It focuses primarily on providing a set of frameworks the manager can apply to a wide variety of ethical issues. Outside the scope of this chapter are several important issues such as the digital divide (the impact of computer technology on the poor or “have-nots,” racial minorities, and third world nations), cyberwar (politically motivated hacking to conduct sabotage and espionage), or social concerns that arise out of artificial intelligence, neural networks, and expert systems. Although these are interesting and important areas for concern, the objective in this chapter is to provide managers with a way to think about the issues of information ethics and corporate responsibility.

RESPONSIBLE COMPUTING

The technological landscape is changing daily. Increasingly, however, technological advances come about in a business domain lacking ethical clarity. Because of its newness, this area of IT often lacks accepted norms of behavior or universally accepted decision-making criteria. Every day, companies encounter ethical dilemmas as they try to use their IS to create and exploit competitive advantages. These ethical dilemmas arise whenever a decision or an action reflects competing moral values that may impair or enhance the well-being of an individual or a group of people, and when there is no one clear way to deal with the issue.

Managers must assess current information initiatives with particular attention to possible ethical issues. Because so many managers have been educated in the current corporate world, they are used to the overriding ethical norms present in their traditional businesses. Managers in the information age need to translate their current ethical norms into terms meaningful for the new electronic corporation. Clearly they need to consider information ethics, or the “ethical issues associated with the development and application of information technologies.”2

Consider three theories of ethical behavior in the corporate environment that managers can develop and apply to the particular challenges they face. These normative theories of business ethics—stockholder theory, stakeholder theory, and social contract theory—are widely applied in traditional business situations. They are “normative” in that they attempt to derive what might be called “intermediate-level” ethical principles: principles expressed in language accessible to the ordinary businessperson, which can be applied to the concrete moral quandaries of the business domain.3 Following is a description of each theory accompanied by an illustration of its application using the TJX example outlined at the beginning of this chapter.

Stockholder Theory

According to stockholder theory, stockholders advance capital to corporate managers, who act as agents in furthering their ends. The nature of this contract binds managers to act in the interest of the shareholders (i.e., to maximize shareholder value). As Milton Friedman wrote, “There is one and only one social responsibility of business: to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition, without deception or fraud.”4

Stockholder theory qualifies the manager's duty in two salient ways. First, managers are bound to employ legal, non-fraudulent means. Second, managers must take the long-term view of shareholder interest (i.e., they are obliged to forgo short-term gains if doing so will maximize value over the long term).

Managers should bear in mind that stockholder theory itself provides a limited framework for moral argument because it assumes the ability of the free market to fully promote the interests of society at large. Yet the singular pursuit of profit on the part of individuals or corporations cannot be said to maximize social welfare. Free markets can foster the creation of monopolies and other circumstances that limit the ability of members of a society to secure the common good. A proponent of stockholder theory might insist that, as agents of stockholders, managers must not use stockholders' money to accomplish goals that do not directly serve the interests of those same stockholders. A critic of stockholder theory would argue that such spending would be just if the money went to further the public interest.

The stipulation under stockholder theory that the pursuit of profits must be legal and non-fraudulent would not limit TJX from waiting to announce the security breach until it had taken corrective action. The delay allowed by law might also have a positive impact on TJX's stock price. The delay would satisfy the test of maximizing shareholder value because it would help keep the price of its stock from dropping. Further, a recent survey has shown that customers are reluctant to shop in stores once data breaches have been announced, so delaying may be important for maintaining a steady stream of revenues for as long as possible. On the other hand, disgruntled customers would definitely stop shopping at its stores if TJX waited too long.5 Any lost revenues would weigh against managers' success in meeting the ethical obligation to work toward maximizing value. In the end it appears that TJX only took the actions necessary to bring its practices in line with those expected in industry.6

Stakeholder Theory

Stakeholder theory holds that managers, although bound by their relation to stockholders, are entrusted also with a responsibility, fiduciary or otherwise, to all those who hold a stake in or a claim on the firm.7 The term “stakeholder” is currently taken to mean any group that vitally affects the survival and success of the corporation or whose interests the corporation vitally affects. Such groups normally include stockholders, customers, employees, suppliers, and the local community, though other groups may also be considered stakeholders, depending on the circumstances. At its most basic level, stakeholder theory states that management must enact and follow policies that balance the rights of all stakeholders without impinging on the rights of any one particular stakeholder.

Stakeholder theory diverges most consequentially from stockholder theory in affirming that the interests of parties other than the stockholders also play a legitimate role in the governance and management of the firm. As a practical matter, it is often difficult, if not impossible, to determine what is in the best interest of each stakeholder group and then balance their conflicting interests.

When stakeholders feel that their interests haven't been considered adequately by the managers making the decisions, their only recourse may be to stop participating in the corporation: Customers can stop buying the company's products, stockholders can sell their stock, and so forth. But some stakeholders are not in a position to stop participating in the corporation. In particular, employees may need to continue working for the corporation, even though they dislike practices of their employers, or experience considerable stress due to their jobs.

Viewed in light of stakeholder theory, the ethical issue facing TJX presents a more complex dilemma. John Philip Coghlan, CEO of Visa USA, noted, "A data breach can put an executive in an exceedingly complex situation, where he must negotiate the often divergent interests of multiple stakeholders."8 TJX's shareholders stood to gain in the short term, but what would be the effects on other stakeholders? One stakeholder group, the customers, definitely could benefit from knowing about the breach as soon as possible because they could take steps to protect themselves. Customers could be informed of the severity of the breach and protective actions that they could take through a special Web page, toll-free information hotlines, or Webcasts. TJX could also offer them free credit-monitoring service and compensate those who are injured. Research has shown that customers who receive adequate compensation after making a complaint are actually more loyal than those without complaints.9 On the other hand, if the breach were not announced, fewer hackers might attempt to break into the systems. Nonetheless, it probably could be shown that the costs to customers outweighed the benefits within the larger stakeholder group.

Social Contract Theory

Social contract theory places social responsibilities on corporate managers to consider the needs of a society. Social contract theorists ask what conditions would have to be met for the members of society to agree to allow the corporation to be formed. Society bestows legal recognition on a corporation to allow it to employ social resources toward given ends, provided it creates more value for the society than it consumes. In return, society charges the corporation to enhance social welfare by satisfying particular interests of consumers and workers in exploiting the advantages of the corporate form.10

The social contract comprises two distinct components: social welfare and justice. The former arises from the belief that corporations must provide greater benefits than their associated costs, or society would not allow their creation. Thus, the social contract obliges managers to pursue profits in ways that are compatible with the well-being of society as a whole. Similarly, the justice component holds that corporations must pursue profits legally, without fraud or deception, and avoid activities that injure society.

Social contract theory meets criticism because no mechanism exists to actuate it. In the absence of a real contract whose terms subordinate profit maximization to social welfare, most critics find it hard to imagine corporations sacrificing profitability in the name of altruism. Yet the strength of the theory lies in its broad assessment of the moral foundations of business activity.

Applied to the TJX case, social contract theory would demand that the manager ask whether the delay in notifying customers about the security breach could compromise fundamental tenets of fairness or social justice. If customers were not apprised of the breach as soon as possible, TJX's actions could be seen as unethical because it would not seem fair to delay notifying them. If, on the other hand, the time prior to notification were used to take corrective action, with the consequence of not only limiting hackers from stealing confidential customer information but also forestalling future attacks that would impact society as a whole, the delay conceivably could be considered ethical.

Although these three normative theories of business ethics possess distinct characteristics, they are not completely incompatible. All offer useful metrics for defining ethical behavior in profit-seeking enterprises under free market conditions. They provide managers with an independent standard by which to judge the ethical nature of superiors' orders as well as their firms' policies and codes of conduct. Upon inspection, the three theories appear to represent concentric circles, with stockholder theory at the center and social contract theory at the outer ring. Stockholder theory is narrowest in scope, stakeholder theory encompasses and expands on it, and social contract theory covers the broadest area. Figure 12.1 summarizes these three theories.

What, ultimately, did TJX do? TJX disclosed the breach in January 2007 but did not release a comprehensive executive summary of the attack until March 2007, when it made a regulatory filing. The preceding December, TJX had noticed suspicious software, at which point it hired IBM and General Dynamics to investigate. Three days later, these investigators determined that TJX's systems had been compromised due to its failure to implement adequate information security procedures and to detect and limit unauthorized access.11 Further, the attacker still had access. Unfortunately, it took TJX 17 months to discover that its computer systems had been breached on numerous occasions on a colossal scale.12 More than a year after the disclosure, on February 29, 2008, President and CEO Carol Meyrowitz wrote a letter to "valued customers" about the breach that had been announced in January 2007. The TJX retail chain agreed to pay $24 million and $41 million in restitution to MasterCard and Visa issuing lenders, respectively, who were affected by the breach. TJX also offered free credit monitoring for cardholders and a $30 store voucher.13 It wasn't until June 2009 that TJX finally reached a settlement of US$9.75 million with 41 states to compensate them for their investigations of the breach.14 Based on the newspaper accounts, one could surmise that TJX's overriding approach was more consistent with stockholder theory than social contract theory. At least one stakeholder group, the customers, was not well served.


FIGURE 12.1 Three normative theories of business ethics.

CORPORATE SOCIAL RESPONSIBILITY

Application of social contract theory helps companies adopt a broader perspective. In this section we take a "big picture" view by exploring two types of corporate social responsibility. We look at a new way of doing business, green computing. We also consider an ethical dilemma that more and more corporations face in our flattening world.

Green Computing

Gartner Inc. continues to put green computing at the top of its list of upcoming strategic technologies, signaling that more and more companies are becoming socially responsible.15 Green computing is concerned with using computing resources efficiently. The need for green computing becomes obvious when considering the amount of power needed to drive the world's PCs, servers, routers, switches, and data centers. Consider, for example, the computing power consumed by the five largest search companies at the peak of energy consumption in 2007. The five companies used about 2 million servers that needed approximately 2.4 gigawatts to run. By comparison, the massive Hoover Dam generates at most about 2 gigawatts. The situation was exacerbated by the cooling systems that companies added to combat the heat generated by the highest-performing systems. Usage has dropped since 2007, most likely because of the 2008 financial crisis, a greater focus on sustainability, and the use of more energy-saving technologies.16 However, the use is still substantial.
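The scale of those figures is easier to appreciate with some back-of-the-envelope arithmetic. The sketch below uses only the numbers quoted above; the per-server result is an average across very different machines, so treat it as illustrative rather than a measurement.

```python
# Back-of-the-envelope check of the figures quoted above (illustrative only).
SERVERS = 2_000_000     # servers run by the five largest search companies (2007)
TOTAL_GW = 2.4          # approximate peak draw of those servers, in gigawatts
HOOVER_DAM_GW = 2.0     # approximate maximum output of the Hoover Dam

# Average electrical draw per server, in watts.
watts_per_server = TOTAL_GW * 1e9 / SERVERS
print(f"Average draw per server: {watts_per_server:.0f} W")            # 1200 W

# How many Hoover Dams would be needed to carry the load.
print(f"Hoover Dams needed: {TOTAL_GW / HOOVER_DAM_GW:.1f}")           # 1.2
```

Roughly 1.2 kW per server, before counting the cooling systems mentioned above, which is why a single large dam could not power the fleet.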

Companies are working in a number of ways to adopt more socially responsible approaches to energy consumption. In particular, they are replacing older systems with more energy-efficient ones, moving workloads based on energy efficiency, using the most power-inefficient servers only at times of peak usage, improving air flows in data centers, and turning to cloud computing, as well as virtualization. As introduced in Chapter 6, virtualization lets a computer run multiple operating systems, or several versions of the same operating system, at the same time. SAP improved its data center computing efficiency through continued investments in virtualization: energy consumption dropped, and the company was able to eliminate 1,400 servers. SAP increased the proportion of virtual servers from 37% in 2009 to 49% in 2010, and the virtualization rate of new servers grew from around 80% to 83%.17 On its Web site, SAP notes the value of green IT, which in terms of energy usage "presents some of the greatest opportunities to increase our efficiency, improve our operations and reach our sustainability goals. It is one of the best examples of how creating positive impact also benefits our business. By reducing our total energy consumption, we can be both sustainable and profitable."18
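The mechanics of how virtualization eliminates servers can be sketched with simple consolidation arithmetic. The numbers below are hypothetical, chosen only to illustrate the reasoning; they are not SAP's actual figures. The idea is that dedicated servers typically sit mostly idle, so several can run as virtual machines on one well-utilized physical host.

```python
# Illustrative server-consolidation arithmetic (hypothetical numbers,
# not SAP's actual figures).
physical_servers = 4200      # assumed pre-virtualization fleet size
avg_utilization = 0.15       # typical low utilization of a dedicated server
target_utilization = 0.60    # safe average utilization for a virtualization host

# Each host can absorb this many virtual machines' worth of work.
vms_per_host = target_utilization / avg_utilization
hosts_needed = physical_servers / vms_per_host

print(f"VMs per host: {vms_per_host:.0f}")
print(f"Hosts needed after consolidation: {hosts_needed:.0f}")
print(f"Servers eliminated: {physical_servers - hosts_needed:.0f}")
```

Every eliminated server saves both its direct power draw and the cooling load it generated, which is the "profit" side of the triple bottom line discussed below.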

An especially creative green approach is the one contemplated by Google to cool the computers that power its search engine. Google's management is considering placing the computers in a fleet of barges anchored approximately seven miles (11 km) offshore. This would allow Google to turn tidal power, a continuous, uninterruptible power source, into electricity. The sea could also power a cooling pump to carry away the considerable heat generated by the computers.19 Or seawater could cool servers in an abandoned paper mill built more than half a century ago. Google transformed the mill in Hamina, Finland, into a data center with massive computing facilities. Part of the appeal of the mill was its underground tunnel system, designed to pull water from the Gulf of Finland. Originally, that frigid Baltic water cooled a steam generation plant at the mill, but Google saw it as a way to cool its servers.20

Green programs can have a triple bottom line (TBL): economic, environmental, and social. That is, green programs create economic value while being socially responsible and sustaining the environment. This triple bottom line is also known as "3BL," or "People, Planet, Profit."

Green computing can be considered from the social contract theory perspective: managers benefit society by conserving global resources when they make green, energy-related decisions about their computer operations. These are the “people” and “planet” motivations. However, their actions may also be evaluated from the stockholder theory perspective. Energy-efficient computers reduce not only the direct costs of running the computing-related infrastructure, but also the costs of complementary utilities, such as cooling systems for the infrastructure components. This creates a huge “profit” motivation for companies to turn “green.” The companies can become more environmentally friendly and reduce their energy costs at the same time.

Ethical Tensions with Governments

Organizations are also facing a dilemma reconciling their corporate policies with regulations in countries where they want to operate. "Managers may need to adopt much different approaches across nationalities to counter the effects of what they perceive as unethical behaviors."21 For example, the United Arab Emirates threatened to shut off BlackBerry messaging, e-mail, and Web browsing services if the device's maker, Research in Motion (RIM), did not provide certain information deemed necessary for national security. RIM managers did not want to disclose confidential information, but they also did not want to endanger the UAE's national security. Even though a compromise was reached shortly before the shutdown was to go into effect, the case reflects the challenges of dealing with foreign governments.22

Censorship posed an ethical dilemma for Google. Enticed by the lure of a gigantic market, Google tried to set up business in China. The Chinese government, quite used to developing and enforcing regulations, wanted to limit the overseas Web sites that Google's search engine could retrieve when operating in China. The Chinese government also interfered with Google's e-mail services, making it difficult for users to gain access to Gmail. Google faces the dilemma of how to deliver the level of service it deems appropriate in the face of stiff government regulation. It is a dilemma likely to become very common with increased globalization. In this case, the balancing act is at a national level.

PAPA: PRIVACY, ACCURACY, PROPERTY, AND ACCESSIBILITY

In an economy that is rapidly becoming dominated by knowledge workers, the value of information is paramount. Those who possess the "best" information and know how to use it, win. The recent trend toward cloud computing permits high levels of computational power and storage to be purchased for relatively small amounts of money. Although this trend means that computer-generated or stored information now falls within the reach of a larger percentage of the populace, it also means that collecting and storing information is becoming easier and more cost effective. This circumstance certainly affects businesses and individuals for the better, but it also can affect them substantially for the worse.

Consider several areas of information ethics in which the control of information is crucial. Richard O. Mason23 identified four such areas, which can be summarized by the acronym PAPA: privacy, accuracy, property, and accessibility (see Figure 12.2). Mason's framework has limitations in accommodating the range and complexity of ethical issues encountered in today's information-intensive world. However, it remains a helpful lens on information ethics because it is both popular and simple.

Privacy

Many consider privacy to be the most important area in which their interests need to be safeguarded. Privacy has long been considered "the right to be left alone."24 Although it has been argued that so many different definitions exist that the term is hard to define satisfactorily,25 privacy is "fundamentally about protections from intrusion and information gathering by others."26 Typically, it has been defined in terms of individuals' ability to personally control information about themselves. But requiring individuals to control their own information would severely limit what is private, because in today's information-oriented world individuals really have very little control.

Though total control is difficult in today's digital world, individuals can manage their privacy through choice, consent, and correction. In particular, individuals can choose situations that offer the desired level of access to their information, ranging from "total privacy to unabashed publicity."27 Many are finding out that describing their latest bashes in detail on Facebook does not go over very well with potential employers who access their pages. A recent study reported that 70% of U.S. recruiters and human resource professionals have rejected candidates based on data found online.28 Yet fewer than 20% of Facebook's members had adjusted the default privacy settings before Facebook, when it came under fire, changed its policy to enhance customer privacy.29 Concern about privacy on Facebook (and other Internet sites) varies across the globe; for example, it is greater in Europe than in the United States.


FIGURE 12.2 Mason's areas of managerial control.

Source: Adapted from Richard O. Mason, “Four Ethical Issues of the Information Age,” MIS Quarterly (March 1986), 10(1), 5.

Individuals may also exert control when they manage their privacy through consent. When they give their consent, they are granting access to otherwise restricted information and they are specifying the purposes for which it may be used. In granting access, they should recognize that extensive amounts of data that can personally identify them are being collected and stored in databases and this data can be used in ways that they had not intended. When giving their consent, individuals should try to anticipate how their information might be reused as a result of data mining or aggregation. They should also try to anticipate unauthorized access through security breaches or internal browsing in companies whose security is lax. Finally, individuals should have control in managing their privacy by being able to access their personal information and correct it if it is wrong. To protect the integrity of information collected about them, federal regulators have recommended allowing consumers limited access to corporate information databases. Consumers thus could update their information and correct errors.

For organizations, the tension between the proper use of personal information and information privacy is considered to be one of the most serious ethical debates of the information age.30 One of the main organizational challenges to privacy is surveillance of employees.31 For example, to ensure that employees are productive, employers can monitor their employees' e-mail and computer utilization while they are at work, even though they have not historically monitored telephone calls.

Individuals are also facing privacy challenges from organizations providing them with services. Their actions are being traced not only with cookies but possibly also with "beacons," "Flash cookies," and even "supercookies" that can follow individuals' surfing behaviors without their knowledge. Every time someone logs onto one of the main search engines, a "cookie" is placed on his or her hard drive so that these companies can track surfing habits. A simple cookie, which is a text message given to a Web browser by a Web server, has been ruled legal by the U.S. courts. The browser stores the cookie's message with user identification codes in a tracking file that is sent back to the server each time the browser requests a page from that server.32 A recent examination of the 50 most popular American Web sites found that visiting them installed over 3,000 tracking files from a total of 131 companies, and that more than two-thirds of those files were used to create rich databases of consumer profiles that can be sold.33
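The cookie mechanism described above is simple enough to demonstrate concretely. The sketch below uses Python's standard http.cookies module to show the round trip: the server issues a Set-Cookie header carrying an identifier, the browser stores it, and the browser returns it on later requests. The "uid" name and "abc123" value are hypothetical placeholders for a real tracking identifier.

```python
from http.cookies import SimpleCookie

# Server side: issue a tracking identifier via a Set-Cookie response header.
server_cookie = SimpleCookie()
server_cookie["uid"] = "abc123"          # hypothetical user identification code
server_cookie["uid"]["path"] = "/"       # send it back for every page on the site
set_cookie_header = server_cookie["uid"].OutputString()
print("Set-Cookie:", set_cookie_header)  # Set-Cookie: uid=abc123; Path=/

# Browser side: the stored cookie is parsed and echoed back on each
# subsequent request to the same server, enabling tracking across visits.
browser_cookie = SimpleCookie()
browser_cookie.load(set_cookie_header)
request_header = f'uid={browser_cookie["uid"].value}'
print("Cookie:", request_header)         # Cookie: uid=abc123
```

Because the identifier comes back on every request, the server can link all of a visitor's page views into a single profile, which is precisely the tracking behavior the studies above describe.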

Apple and Google recently came under fire for collecting and storing unencrypted location information from both personal computers and mobile devices. The information was obtained after the computer or mobile device searched for nearby wireless networks. Typically the users gave the companies permission to determine the computer's approximate location, but many did not know that the information was being stored. Going against its previous policy of keeping information about Internet searches sacrosanct, Google now combines user information from its sister sites Gmail, Google+, and YouTube to direct user searches.34

Do customers have a right to privacy while searching the Internet? Courts have decided that the answer is no, but as society moves ahead, the right to monitor customer habits in terms of their phone usage, location, emailing behaviors, and a myriad of other behaviors will be affected by how managers decide to use the information that they have collected.

Why would people be willing to give up this privacy? First, by supplying the information to vendors, they can receive personalized services in return. For example, the location feature on their mobile device might alert them that the restaurant they are just walking by has a special on one of their favorite foods, sushi. Second, they might actually be paid for the information at a price that exceeds what they are giving up. Third, they might see providing information, such as that contained on many Facebook pages, as something that everybody is doing. Some individuals, especially younger ones, share information that would otherwise be considered private simply because they view it as a way to have their friends know them and to get to know their friends. "Digital natives," individuals who have grown up in the Internet age, do not know a society without the Web. They are comfortable building relationships, and consequently sharing information, on the Web that others might consider private. Unfortunately, what's posted on the Web is there forever, and although it may be fun to share now, there may be unintended consequences in the future.

Governments around the world are grappling with privacy legislation. Not surprisingly, they are using different approaches for ensuring the privacy of their citizens. The United States' sectoral approach relies on a mix of legislation, regulation, and self-regulation. It is based upon a legal tradition with a strong emphasis on free trade. In the United States, privacy laws are enacted in response to specific problems for specific groups of people or in specific industries. Examples of the United States' relatively limited privacy legislation include the 1974 Privacy Act that regulates the U.S. government's collection and use of personal information and the 1998 Children's Online Privacy Protection Act that regulates the online collection and use of children's personal information.

The Gramm–Leach–Bliley Act of 1999 applies to financial institutions. It followed in the wake of banks selling sensitive information, including account information, Social Security numbers, credit card purchase histories, and so forth to telemarketing companies. This U.S. law somewhat mitigates the sharing of sensitive financial and personal information by allowing customers of financial institutions the limited right to “opt-out” of the information sharing by these institutions with non-affiliated third parties. This means that the financial institution may use the information unless the customer specifically tells the institution that his or her personal information cannot be used or distributed.

The Health Insurance Portability and Accountability Act (HIPAA) of 1996 is designed to safeguard the privacy and security of electronically exchanged information in the health care industry. Its Privacy Rule ensures that patients' health information is properly protected while allowing the flow of information necessary to provide and promote health care. HIPAA's Security Rule specifies national standards for protecting electronic health information from unauthorized access, alteration, deletion, and transmission.

The Fair Credit Reporting Act limits the use of consumer reports provided by consumer reporting agencies to "permissible purposes" and grants individuals the right to access their reports and correct errors in them.

Social Business Lens: Personal Data

Social IT, especially Facebook, is redefining how people think about themselves and define themselves to others. Sherry Turkle, the author of Alone Together and a professor at the Massachusetts Institute of Technology, says about Facebook and the new marketplace for personal data: “I can't think of another piece of passive software that has gotten so embedded in the cultural conversation. . . It crystallized a set of issues that we will be defining for the next decade—self, privacy, how we connect and the price we are willing to pay for it.”

What many people who supply this data about themselves may not realize is that the data may exist indefinitely in the ether, and that details of their personal lives and wants may be mined indefinitely by technology companies. Lori Andrews, in her book I Know Who You Are and I Saw What You Did: Social Networks and the Death of Privacy, is concerned that Internet companies are in business for the money and hence would prefer to keep their customers in the dark about how their personal data is being used to generate profits.

And what is Andrews' solution? She proposes a social network constitution that can be used to judge the activities of social networks. Her constitution has ten articles and begins with: “We the people of Facebook nation.” Articles like “No person shall be discriminated against based on his or her social network activities or profile” or “Each individual shall have control over his or her image from a social network, including over the image created by data aggregation” point to the need for people who supply data to social networks to demand respect for the data. Her focus is on rights, but not individuals' responsibilities in keeping private information private.

Some suggest that reputation management is destined to be big business in the future, given the amount of personal data on the Web. BusinessWeek noted that online reputation management is booming. Companies such as Reputation.com and Elixir offer services to help individuals, and companies, clean up their online presence so that searches for their name produce mostly positive references.

It could be argued that individuals need to recognize that surrendering their privacy in exchange for coupons, free music and videos, or customized products and services may lead to the loss of something of value, and that the data may remain accessible far longer than they want it to be.

Sources: J. Wortham, “It's Not About You, Facebook. It's About Us,” The New York Times (February 12, 2012), BR3; E. Morozov, “Sharing it All,” New York Times Book Review (January 29, 2012), 18; and T. McNichol, “Fixing the Reputation of Reputation Managers,” BusinessWeek (February 2, 2012), http://www.businessweek.com/magazine/fixing-the-reputations-of-reputation-managers-02022012.html (accessed on April 5, 2012).

In contrast to the United States' sectoral approach, with its strong encouragement of self-regulation by industry, the European Union relies on omnibus legislation that requires the creation of government data protection agencies, registration of databases with those agencies, and in some cases prior approval before personal data can be processed. It is linked with the continental European legal tradition, where privacy is a well-established right.35 Because of pronounced differences in governmental approaches, many U.S. companies were concerned that they would be unable to meet the European “adequacy” standard for privacy protection specified in the European Commission's Directive 95/46/EC on Data Protection, which went into effect in 1998. This directive sets standards for the collection, storage, and processing of personal information and prohibits the transfer of personal data to non-European Union nations that do not meet the European privacy standards. Many U.S. companies believed that the directive would significantly hamper their ability to engage in trans-Atlantic transactions. However, the U.S. Department of Commerce (DOC), in consultation with the European Commission, developed a “safe harbor” framework in 2000 that allows U.S. companies to be placed on a list maintained by the DOC. These companies must demonstrate through a self-certification process that they enforce privacy at a level practiced in the European Union.36

Accuracy

The accuracy, or correctness, of information assumes real importance for society as computers come to dominate corporate record-keeping activities. When records are entered incorrectly, who is to blame? Recently, a couple was told by Bank of America, their mortgage holder, that they would have to vacate their house by Christmas Eve unless they put it up for forced sale. The couple was flabbergasted because they had never missed a house payment. They had, however, refinanced their home less than a year earlier. Although they used a conventional mortgage, they had checked out loan rates under the Make Home Affordable Program. Unbeknownst to them, the mere initiation of this type of loan application signals to the credit world that the applicant is in bad financial straits. A comedy of errors ensued: the limit on a credit card was reduced, their good accounts were cancelled, and their credit score was ruined. Another unit of Bank of America admitted to erroneously reporting to credit agencies that the couple was seeking a loan modification, ruining their credit rating and, as a result, putting their mortgage into default. This unit sent a letter of apology in September and turned the case over to a special unit at Bank of America charged with dealing with severe customer issues. The special unit was supposed to notify the credit reporting agencies that the couple was a good credit risk. Unfortunately, it didn't do so, causing the couple much anxiety and financial loss.37 Although this incident may highlight the need for better controls over the bank's internal processes, it also demonstrates the risks that can be attributed to inaccurate information retained in corporate systems. In this case, the bank was responsible for the error, but it paid little, compared to the family, for its mistake.
Although they cannot expect to eliminate all mistakes from the online environment, managers must establish controls to ensure that situations such as this one do not happen with any frequency.

Over time it becomes increasingly difficult to maintain the accuracy of some types of information. Although a person's birth date does not typically change (my grandmother's change of her birth year notwithstanding), addresses and phone numbers often change as people relocate, and even their names may change with marriage, divorce, and adoption. The European Union Directive on Data Protection requires accurate and up-to-date data and tries to make sure that data is kept no longer than necessary to fulfill its stated purpose. Keeping data only as long as it is necessary to fulfill its stated purpose is a challenge many companies don't even attempt to meet.

Property

The increase in monitoring leads to the question of property, or who owns the data. Now that organizations have the ability to collect vast amounts of data on their clients, do they have a right to share data with others to create a more accurate profile of an individual? Consider what happens when a consumer provides information for one use, say a car loan. This information is collected and stored in a data warehouse and then “mined” to create a profile for something completely different. And if some other company creates such consolidated profiles, who owns that information, which in many cases was not divulged willingly for that purpose? Who owns images that are posted in cyberspace? With ever more sophisticated methods of computer animation, can companies use newly “created” images or characters building on models in other media without paying royalties? Mason suggests that information, which is costly to produce in the first place, can be easily reproduced and sold without the individual who produced it even knowing what is happening—and certainly not being reimbursed for its use. In talking about this information that is produced Mason notes:

. . . information has the illusive quality of being easy to reproduce and to share with others. Moreover, this replication can take place without destroying the original. This makes information hard to safeguard since, unlike tangible property, it becomes communicable and hard to keep it to one's self.38

Accessibility

In the age of the information worker, accessibility, or the ability to obtain the data, becomes increasingly important. Would-be users of information must first gain the physical ability to access online information resources, which broadly means they must access computational systems. Second and more important, they then must gain access to information itself. In this sense, the issue of access is closely linked to that of property. Looking forward, the major issue facing managers is how to create and maintain access to information for society at large without harming individuals who have provided much, if not all, of the information.

Today's managers must ensure that information about their employees and customers is accessible only to those who have a right to see and use it. They should take active measures to see that adequate security and control measures are in place in their companies. It is becoming increasingly clear that they also must ensure that adequate safeguards are working in the companies of their key trading partners. The managers at TRICARE, a military health provider, were no doubt embarrassed when they had to report to 4.9 million active and retired military personnel and their families that their personal and medical records had been compromised. Back-up tapes containing records dating to 1992 had been left in the care of an employee of TRICARE's data contractor, Science Applications International Corp. The tapes were stolen from the employee's car in San Antonio, Texas, while they were being transferred from one federal facility to another.39 Accessibility clearly is an issue that extends beyond TRICARE's internal systems.

Accessibility is becoming increasingly important with the surge in identity theft, or “the taking of the victim's identity to obtain credit, credit cards from banks and retailers, steal money from the victim's existing accounts, apply for loans, establish accounts with utility companies, rent an apartment, file bankruptcy or obtain a job using the victim's name.”40 In short, identity theft is a crime in which the thief uses the victim's personal information (such as driver's license number or Social Security number) to impersonate the victim. In TJX's case, the security breach made its customers vulnerable to identity theft.

According to subject matter experts, identity theft is categorized in two ways: true name and account takeover. True name identity theft means that the thief uses personal information to open new accounts. The thief might open a new credit card account, establish cellular phone service, or open a new checking account to obtain blank checks. Account takeover identity theft means the imposter uses personal information to gain access to the person's existing accounts. Typically, the thief will change the mailing address on an account and run up a huge bill before the person whose identity has been stolen realizes there is a problem.

Identity theft is a problem for both individuals and businesses. The U.S. government keeps statistics on reported cases of identity theft.41 The incidence of identity theft grew at an alarming rate during the early part of this century. A total of 8.6 million households experienced identity theft in 2010,42 and American businesses and individuals suffered losses of $54 billion a year earlier because of identity theft.43 The most victimized tend to be college students and young adults who have not learned to use security software or shred documents.

Although some cases of individual identity theft can be traced to carelessness on the part of victims, others may be attributed to the failure of businesses to limit accessibility to their databases. Businesses are also subject to significant losses due to identity theft. Illegitimate e-mail messages that solicit personal information for the thief can ruin a business's hard-won reputation. Purchases made by the thief must be paid for, and often that loss is covered by the business. The U.S. Federal Trade Commission (FTC) maintains a Web site to help both individuals and businesses manage identity theft.44

Managers' Role in Ethical Information Control

Managers must work to implement controls over information highlighted by the PAPA principles. Not only should they deter identity theft by limiting inappropriate access to customer information, but they should also respect their customers' privacy. Three best practices can be adopted to help improve an organization's information control by incorporating moral responsibility:45

  • Create a Culture of Responsibility. CEOs and top-level executives should lead in promoting responsibility for protecting both personal information and the organization's information systems. Internet companies should post their policies about how they will use private information and make a good case as to why they need the personal data that they gather from customers and clients. Mary Culnan noted in CIO magazine about customers providing information: “If there are no benefits or if they aren't told why the information is being collected or how it's being used, a lot of people say ‘Forget it.’”46 The costs of meaningfully securing the information may outweigh the obvious benefits. . . unless there is a breach. Thus, it is unlikely that an organization can create a culture of integrity and responsibility unless there is a moral commitment from the CEO.
  • Implement Governance Processes for Information Control. In Chapter 8 we discuss the importance of mechanisms to identify the important decisions that need to be made and who should make them. These concepts of governance also apply to information control. Further, control frameworks such as COBIT and ITIL can help identify risks to information and promote behaviors that support information control. Organizations need governance to make sure that their information control behaviors comply with the law and reflect their risk environment.

    Geographic Lens: Should Subcultures be Taken into Account When Trying to Understand National Attitudes Toward Information Ethics?

    Ethics can naturally be expected to vary across countries. An interesting study of 1,100 Chinese managers showed that it can also vary depending upon subcultures resulting from major events within a country. Maris Martinsons and David Ma studied the responses to PAPA-based ethical situations made by three different Chinese generations: Republican—people born before the People's Republic of China was established in 1949; Revolution—people born between 1950 and 1970 under Communist rule, a generation that lived through Mao Zedong's Cultural Revolution (beginning in 1966) and the Great Leap Forward (1958–1961); and Reform—people born after 1970, when Deng Xiaoping's government introduced the Open Door Policy and the One Child Policy as part of economic and social reforms.

    Survey results indicate that there are significant differences in information ethics across generations. The Revolution Generation experienced a profound event that appears to have increased its ethical acceptance of both inaccurate information and intellectual property violations. The Reform generation is much less accepting of privacy violations than older generations of Chinese managers. They are more conscious of the right to privacy and less inclined to compromise the privacy of others.

    Source: M. G. Martinsons and D. Ma, “Subcultural Differences in Information Ethics across China: Focus on Chinese Management Generation Gaps,” Journal of AIS (2009), 10(Special Issue), 816–833.

  • Avoid Decoupling. Organizations often use complex institutional processes to handle deeply personal privacy issues. Should an apparent conflict appear, managers can decouple the impact on individuals from institutional processes and mechanisms, shifting responsibility away from themselves and onto the institution. It would be much better if managers were to act as if the customer's information were actually their own. This would mean that in delicate situations involving privacy or other issues of information control, managers would ask themselves, “How would I feel if my information were handled in this way?”47

images SECURITY AND CONTROLS

It should be clear from the earlier discussion that the PAPA principles work hand-in-hand with security. Unfortunately, organizations more often than not rely on luck rather than on proven information systems controls, at least according to an Ernst & Young survey.48 More than half of the high-level executives responding to the survey reported that hardware, telecommunications, and software failures, as well as major viruses, Trojan horses, or Internet worms, had resulted in unexpected or unscheduled outages of their critical business systems. The survey confirmed that companies turn to technical responses to deal with these and other threats. In particular, considerable emphasis is placed on using technology (i.e., antivirus countermeasures, spam-filtering software, intrusion detection systems) to protect organizational data from unauthorized hackers and undesirable viruses. Managers go to great lengths to make sure their computers are secure from outsider access, such as a hacker who seeks to enter a computer for sport or with malicious intent. They also try to safeguard against other external threats such as telecommunications failure, service provider failure, spamming, or distributed denial of service (DDoS) attacks.

Technologies have been devised to manage security and control problems. Figure 12.3 summarizes three types of tools (e.g., firewalls, passwords, and filtering tools) that restrict access to information on a computer by preventing access to the server on the network. Such tools provide early warning of security breaches, limit the losses suffered when breaches occur, analyze and react to breaches (and try to prevent them from recurring), and recover whatever has been lost to them.49
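To make the access-restriction category concrete, the following is a minimal sketch (illustrative only, not production security software; the allowlist, account names, and function names are hypothetical) of two layered controls: a firewall-style address check and password verification against a salted hash, so that the system never stores the password itself.

```python
import hashlib
import hmac
import os

# Hypothetical allowlist standing in for a network firewall rule.
ALLOWED_IPS = {"10.0.0.5", "10.0.0.8"}

def firewall_permits(ip: str) -> bool:
    """First layer: reject connections from addresses not on the allowlist."""
    return ip in ALLOWED_IPS

def hash_password(password: str, salt: bytes) -> bytes:
    """Derive a salted hash so the password itself is never stored."""
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def password_ok(password: str, salt: bytes, stored_hash: bytes) -> bool:
    """Second layer: constant-time comparison resists timing attacks."""
    return hmac.compare_digest(hash_password(password, salt), stored_hash)

# Enrollment: store only the salt and the derived hash.
salt = os.urandom(16)
stored = hash_password("s3cret", salt)

# A request must clear both layers before data is released.
print(firewall_permits("10.0.0.5"))         # address on the allowlist
print(firewall_permits("203.0.113.9"))      # address blocked at the "firewall"
print(password_ok("s3cret", salt, stored))  # correct password accepted
print(password_ok("guess", salt, stored))   # wrong password rejected
```

Layering the checks reflects the defense-in-depth idea behind the figure: a breach of any single control should not by itself expose the information.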

As the physical corporate walls are torn down and more workers work from remote locations, enterprises use technological advances to keep up with their business and network security needs. Some of these technologies include antivirus and antispyware, desktop firewalls, devices that can trace stolen laptops, devices that prevent USB mass-storage devices or iPods from accessing data on home-based computers, or data-leak prevention technology that keeps sensitive corporate data from being printed out, e-mailed, or saved to removable media without the proper authorization, even on remote endpoints.50

Additional technological approaches to security and privacy may combine software and hardware. For example, some of today's laptop computers have built-in fingerprint identification pads to prevent unauthorized use. Biometrics are also being considered for security purposes at the national level. For example, the United Kingdom passed the Identity Cards Act in 2006, which required nationals to obtain a compulsory national identity card containing 50 different types of information, including name, birth date and place, current and past addresses, a head-and-shoulders photograph, fingerprints, an iris scan and other biometric information, personal reference information, and registration and record histories. The British government argued that the card would give people a convenient way to prove their identity and would prevent identity theft by providing a unique individual identifier. It also would offer a secure way of identifying people for national security, detecting crime, enforcing immigration controls, preventing illegal working, and providing public services. Opponents feared the card would create a “Big Brother” world and that the unique identifier, ironically, would increase identity theft because all the necessary information was contained in one central location. After much public debate, the Identity Cards Act was repealed in 2010 and the card was scrapped.

images

FIGURE 12.3 Security and control tools.

Sources: Adapted from J. Berleur, P. Duquenoy, and D. Whitehouse, “Ethics and the Governance of the Internet,” IFIP-SIG9.2.2, White paper (September 1999); and Tavani and Moor, “Privacy Protection, Control of Information and Privacy-Enhancing Technologies,” Computers and Society (March 2001), 6–11.

Technological security controls extend beyond dealing with external threats; managers must also guard against potentially more lethal threats that originate from within the company. Internal threats include operational errors (i.e., loading the wrong software) and former or current employee misconduct involving information systems, as well as hardware or software failure. Managers from the highest echelons down must champion the human aspect of protecting information. This means that they must support efforts to develop employees into the company's strongest layer of defense, including training and awareness programs that alert employees to risks, make them aware of countermeasures that exist to mitigate those risks, and drill into them the importance of security. Buttressing the technological controls and training and awareness programs with a good governance structure, security procedures and policies, and an overall information security strategy can help round out a company's security efforts.

images SUMMARY

  • Due to the asymmetry of power relationships, managers tend to frame ethical concerns in terms of refraining from doing harm, mitigating injury, and paying attention to dependent and vulnerable parties. As a practical matter, ethics is about maintaining one's own, independent perspective about the propriety of business practices. Managers must make systematic, reasoned judgments about right and wrong and take responsibility for them. Ethics is about decisive action rooted in principles that express what is right and important, and about action that is publicly defensible and personally supportable.
  • Three important normative theories describing business ethics are (1) stockholder theory (maximizing stockholder wealth), (2) stakeholder theory (maximizing the benefits to all stakeholders while weighing costs to competing interests), and (3) social contract theory (creating value for society that is just and non-discriminatory).
  • Social contract theory offers the broad perspective to display corporate responsibility in such areas as green computing and dealing with ethical issues in tensions with foreign governments about IT and its use.
  • PAPA is an acronym for the four areas in which control of information is crucial: privacy, accuracy, property, and accessibility.
  • To enhance ethical control of information systems companies should create a culture of responsibility, implement governance processes, and avoid decoupling.
  • Security looms as a major threat to Internet growth. Businesses are bolstering security with hardware, software, and communication devices.

images KEY TERMS

accessibility (p. 365)

accuracy (p. 364)

cookie (p. 361)

green computing (p. 357)

identity theft (p. 366)

information ethics (p. 352)

privacy (p. 359)

property (p. 365)

social contract theory (p. 354)

stakeholder theory (p. 353)

stockholder theory (p. 352)

images DISCUSSION QUESTIONS

  1. Private corporate data is often encrypted using a key, which is needed to decrypt the information. Who within the corporation should be responsible for maintaining the “keys” to private information collected about consumers? Is that the same person who should have the “keys” to employee data?
  2. Check out how Google has profiled you. Using your own computer, go to Ad Preferences: www.google.com/ads/preferences. How accurate is the picture Google paints about you in your profile?
  3. Consider arrest records, which are mostly computerized and stored locally by law enforcement agencies. They have an accuracy rate of about 50%—about half of them are inaccurate, incomplete, or ambiguous. These records often are used by others than just law enforcement. Approximately 90% of all criminal histories in the United States are available to public and private employers. Use the three normative theories of business ethics to analyze the ethical issues surrounding this situation. How might hiring decisions be influenced inappropriately by this information?
  4. The European Community's Directive on Data Protection strictly limits how database information is used and who has access to it. Some of the restrictions include registering all databases containing personal information with the countries in which they are operating, collecting data only with the consent of the subjects, and telling subjects of the database the intended and actual use of the databases. What effect might these restrictions have on global companies? In your opinion, should these types of restrictions be made into law? Why or why not? Should the United States bring its laws into agreement with the EU directive?
  5. Should there be a global Internet privacy policy?
  6. Is sending targeted advertising information to a computer using cookies objectionable? Why or why not?
  7. What is your opinion of the British Identity Card discussed in this chapter?

CASE STUDY 12-1
ETHICAL DECISION MAKING

Situation 1

The help desk is part of the group assigned to Doug Smith, the manager of office automation. The help desk has produced very low quality work for the past several months. Smith has access to the passwords for each of the help desk members' computer accounts. He instructs the help desk supervisor to go into each hard drive after hours and obtain a sample document to check for quality control for each pool member.

Discussion Questions
  1. If you were the supervisor, what would you do?
  2. What, if any, ethical propositions have been violated by this situation?
  3. If poor quality was found, could the information be used for disciplinary purposes? For training purposes?
  4. Apply PAPA to this situation.

Situation 2

Kate Essex is the supervisor of the customer service representative group for Enovelty.com, a manufacturer of novelty items. This group spends its workday answering calls, and sometimes placing calls, to customers to assist in solving a variety of issues about orders previously placed with the company. The company has a rule that personal phone calls are only allowed during breaks. Essex is assigned to monitor each representative on the phone for 15 minutes a day, as part of her regular job tasks. The representatives are aware that Essex will be monitoring them, and customers are immediately informed when they begin their calls. Essex begins to monitor James Olsen, and finds that he is on a personal call regarding his sick child. Olsen is not on break.

Discussion Questions
  1. What should Essex do?
  2. What, if any, ethical principles help guide decision making in this situation?
  3. What management practices should be in place to ensure proper behavior without violating individual “rights”?
  4. Apply the normative theories of business ethics to this situation.

Situation 3

Jane Mark is the newest hire in the IS group at We_Sell_More.com, a business on the Internet. The company takes in $30 million in revenue quarterly from Web business. Jane reports to Sam Brady, the VP of IS. She is assigned to a project to build a new capability into the company Web page that links products ordered with future offerings of the company. After weeks of analysis, Jane concludes that the best way to incorporate that capability is to buy a software package from a small start-up company in Silicon Valley, California. She convinces Brady of her decision and is authorized to lease the software. The vendor e-mails Jane the software in a ZIP file and instructs her on how to install it. At the initial installation, Jane is asked to acknowledge and electronically sign the license agreement. The installer does not ask Jane if she wants to make a backup copy of the software, so as a precaution, Jane takes it on herself to copy the ZIP files sent to her onto a thumb drive, which she stores in her desk drawer.

A year later, the vendor is bought by another company, and the software is removed from the marketplace. The new owner believes this software will provide them with a competitive advantage they want to reserve for themselves. The new vendor terminates all lease agreements and revokes all licenses on their expiration. But Jane still has the thumb drive she made as backup.

Discussion Questions
  1. Is Jane obligated to stop using her backup copy? Why or why not?
  2. If We_Sell_More.com wants to continue to use the system, can they? Why or why not?
  3. Does it change your opinion if the software is a critical system for We_Sell_More.com? If it is a non-critical system? Explain.

Situation 4

Some of the Internet's biggest companies (i.e., Google, Microsoft, Yahoo, IBM, and Verisign) implemented a “single sign-on” system that is now available at more than 50,000 Web sites. As corporate members of the OpenID Foundation, they developed a system that is supposed to make it easier for users to sign on to a number of sites without having to remember multiple user IDs, passwords, and registration information. Theoretically, users also have a consistent identity across the Web. Under OpenID, the companies share the sign-on information for any Web user who agrees to participate. They also share personal information such as credit card data, billing addresses, and personal preferences.

Discussion Questions
  1. Discuss any threats to privacy in this situation.
  2. Who would own the data? Explain.
  3. Who do you think should have access to the data? How should that access be controlled?

Situation 5

SpectorSoft markets eBlaster as a way to keep track of what your spouse or children are doing online. Operating in stealth mode, eBlaster tracks every single keystroke entered into a computer, from instant messages to passwords. It also records every e-mail sent and received and every Web site visited by the unsuspecting computer user. The data is sent anonymously to an IP address of the person who installed eBlaster. eBlaster can also be installed on a business's computers.

Discussion Questions
  1. Do you think it would be ethical for a business to install eBlaster to ensure that its employees are engaged only in work-related activities? If so, under what conditions would it be appropriate? If not, why not?
  2. Apply the normative theories of business ethics to this situation.

Situation 6

Google, Inc. had a unique advantage as of March 2012. By combining information about user activity from its many popular applications (such as Gmail, Google+, and YouTube), Google's algorithms were able to alert users when things might be of interest to them. This vast amount of information, analyzed properly, gave Google a way to compete. By combining it with information from Internet searches, Google could better compete against applications such as Facebook.

But this was a departure from its earlier privacy policy. In June 2011, the Executive Chairman of Google had declared, “Google will remain a place where you can do anonymous searches [without logging in]. We're very committed to having you have control over the information we have about you.”

This may be possible for users who don't log in to a Google account, but for those with Gmail or other personal accounts or an Android mobile phone, it's more difficult to remain anonymous. Offering a counter viewpoint, Christopher Soghoian, an independent privacy and security researcher, said, “Google now watches consumers practically everywhere they go on the Web [and anytime they use an Android phone]. No single entity should be trusted with this much sensitive data.”

Discussion Questions
  1. Do you see any ethical issues involved in Google's new approach to combining information from a particular user? Why or why not?
  2. How might users change their behaviors if they were aware of this new approach?
  3. How is Google's combining of data about individuals in one central location any different ethically from the United Kingdom placing all individuals' necessary information on an identity card?
  4. Apply the normative theories of business ethics to Google's new policy of combining user information.

Situation 7

Spokeo is a company that gathers online data for employers, the public, or anybody who is willing to pay for its services. Clients include recruiters and women who want to find out whether their boyfriends are cheating on them. Spokeo recruits clients via ads that urge “HR-Recruiters—Click Here Now.”

Discussion Questions
  1. Do you think it would be ethical for a business to hire Spokeo to find out about potential employees? If so, under what conditions would it be appropriate? If not, why not?
  2. Do you think it is ethical for women to hire Spokeo to see if their boyfriends are cheating on them? Why or why not?

Sources: Situations 1 to 4 adapted from short cases suggested by Professor Kay Nelson, Southern Illinois University—Carbondale. The names of people, places, and companies have been made up for these stories. Any similarity to real people, places, or companies is purely coincidental. Situation 6 is from Julia Angwin, “Google Widens Its Tracks,” Wall Street Journal (July 30, 2010), http://online.wsj.com/article/SB10001424052970203806504577181371465957162.html?mod=djem_jiewr_IT_domainid (accessed on January 28, 2012). Situation 7 is from Lori Andrews, “Facebook is Using You,” The New York Times (February 5, 2012), SR7.

CASE STUDY 12-2
MIDWEST FAMILY MUTUAL GOES GREEN

Midwest Family Mutual Insurance Co., an insurance company with nearly $100 million in written premiums in 2011, considers itself to be “operationally green.” Through a variety of initiatives it has reduced its annual energy, natural gas, and paper consumption by 63%, 76%, and 65%, respectively. Ron Boyd, the carrier's CEO, attributes most of the improvements in energy usage to creating a virtual work-from-home office environment, made possible by implementing a series of electronic processes and applications. These include imaging and workflow technology, networking technology, and a VoIP network. In 2006, the year these savings were reported, all but two of Midwest Family Mutual's 65 employees worked from home. In addition to the energy savings that Midwest Family Mutual has directly experienced, Boyd estimates that the company's telecommuting policy has resulted in fuel savings of at least 25,000 gallons.

Though green computing was a commendable goal in itself, Midwest Family Mutual's bottom line also benefited from the company's socially responsible approach. Over a five-year period, Midwest Family Mutual was able to shave its expense ratio to 29.9% from 33%. Boyd states: “Being environmentally green can equate to financial green.”

Green computing grew out of Midwest Family Mutual's IT successes, according to Boyd. As the company started realizing savings from the electronic processes it implemented, it started thinking about telecommuting arrangements that would allow its employees to work from home. He adds, “It became obvious that many of our jobs could be done wherever a high-speed connection existed...VoIP completed the technology requirements for all [employees] to work from home.”

Boyd summarizes: “We became green as a side benefit of saving resources and cost.” The company continued its green policy with its decision to sell its 24,000-square-foot office building in Minnetonka, Minnesota. However, in order to provide more centralized regional service to agents in the new states in which it was recently licensed (i.e., Arizona, Nevada, Utah, Colorado, Idaho, Washington, and Oregon), the company built a new home domicile in Chariton, Iowa, in 2012.

Discussion Questions
  1. Do you think that the economic benefits that Midwest Family Mutual realized as a result of green computing are unusual? Do you think most companies can see similar types of economic gains? Explain.
  2. What are some possible disadvantages the employees of Midwest Family Mutual may be experiencing as a result of their new virtual “work from home” office environment?
  3. Apply the normative theories of business ethics to this situation.

Sources: Adapted from Anthony O'Donnell, “Plymouth, Minnesota-based Midwest Family Mutual's Move to a Paperless, Work-at-Home Operational Paradigm Has Yielded Both Environmental and Bottom-Line Benefits,” Insurance & Technology (February 24, 2008), http://www.insurancetech.com/resources/fss/showArticle.jhtml;jsessionid=AYMVWDKZBGIFIQSNDLOSKHSCJUNN2JVN?articleID=206801556 (accessed on April 23, 2008); and Midwest Family Mutual News Archive, MFM Announces 2011 Results and Plans for 2012, https://midwestfamily.com/news.php?detail=589 (accessed on April 14, 2012).

1 M. Culnan and C. Williams, “How Ethics Can Enhance Organizational Privacy: Lessons from the ChoicePoint and TJX Data Breaches,” MIS Quarterly (2009), 33(4), 673–687.

2 M. G. Martinsons and D. Ma, “Sub-cultural Differences in Information Ethics across China: Focus on Chinese Management Generation Gaps,” Journal of AIS (2009), 10(Special Issue).

3 Hasnas and Smith, “Ethics and Information Systems,” 5.

4 M. Friedman, Capitalism and Freedom (Chicago, IL: University of Chicago Press, 1962), 133.

5 There is an interesting presentation of a similar breach with commentaries from the CIOs of ChoicePoint, Motorola, Visa International, and the Identity Theft Resource Center in Eric McNulty's “Boss, I Think Someone Stole Our Customer Data,” Harvard Business Review (September 2007), 37–50.

6 M. Culnan and C. Williams, “How Ethics Can Enhance Organizational Privacy: Lessons from the ChoicePoint and TJX Data Breaches,” MIS Quarterly (2009), 33(4), 673–687.

7 Hasnas and Smith, “Ethics and Information Systems,” 8.

8 McNulty, “Boss, I Think Someone Stole Our Customer Data.”

9 Ibid.

10 Hasnas and Smith, “Ethics and Information Systems,” 10.

11 M. Culnan and C. Williams, “How Ethics Can Enhance Organizational Privacy: Lessons from the ChoicePoint and TJX Data Breaches,” MIS Quarterly (2009), 33(4), 673–687.

12 Kevin Murphy, “TJX Hack Is Biggest Ever,” Computer Business Review (March 30, 2007), http://www.cbronline.com/article_news.asp?guid=0EFDDC37-4EA7-4A78-9726-E6F63C86234D.

13 Martin Bosworth, “TJX to Pay Mastercard $24 Million for Data Breach,” ConsumerAffairs.com (April 6, 2008), http://www.consumeraffairs.com/news04/2008/04/tjx_mc.html (accessed on July 29, 2008).

14 J. Vijayan, “TJX Reaches $9.75 Million Breach Settlement with 41 States,” Computerworld, (June 24, 2009), http://www.computerworld.com/s/article/9134765/TJX_reaches_9.75_million_breach_settlement_with_41_states (accessed on January 28, 2012).

15 Hype Cycle for Sustainability and Green IT (2011), Gartner, Inc., http://www.gartner.com/DisplayDocument?doc_cd=214739&ref=g_noreg (accessed on February 28, 2012).

16 These two articles contrast energy use in 2007 and 2011: G. Lawton, “Powering Down the Computing Infrastructure,” Computer (February 2007), 16–19; and J. Markoff, “Data Centers' Power Use Less Than Was Expected,” The New York Times (July 31, 2011), http://www.nytimes.com/2011/08/01/technology/data-centers-using-less-power-than-forecast-report-says.html?_r=2 (accessed on February 28, 2012).

17 Data Center Energy, SAP Sustainability Report, http://www.sapsustainabilityreport.com/data-center-energy (accessed on January 30, 2012).

18 “Total Energy Consumed,” SAP Sustainability Report, http://www.sapsustainabilityreport.com/total-energy-consumed (accessed on January 30, 2012).

19 J. Mick, “Google Looks at Floating Data Centers for Energy,” Daily Tech (September 16, 2008), http://www.dailytech.com/Google+Looks+to+Floating+Data+Centers+for+Energy/article12966.htm (accessed on October 1, 2008).

20 Cade Metz, “Google Reincarnates Dead Paper Mill as Data Center of Future,” Wired Enterprise (January 26, 2012), http://www.wired.com/wiredenterprise/2012/01/google-finland/ (accessed on January 28, 2012).

21 Leidner and Kayworth (2006), 368.

22 “For Data, Tug Grows Over Privacy vs. Security,” The New York Times (August 3, 2010), http://query.nytimes.com/gst/fullpage.html?res=9504E4D6113CF930A3575BC0A9669D8B63 (accessed on January 28, 2012).

23 Richard O. Mason, “Four Ethical Issues of the Information Age,” MIS Quarterly (March 1986), 10(1).

24 Samuel D. Warren and Louis D. Brandeis, “The Right to Privacy,” Harvard Law Review (December 1890), 4(5), 193–200.

25 Paul Pavlou, “State of the Information Privacy Literature: Where Are We Now and Where Should We Go?” MIS Quarterly (2011), 35(4), 977–985.

26 E. F. Stone, D. G. Gardner, H. G. Gueutal, and S. McClure, “A Field Experiment Comparing Information-Privacy Values, Beliefs, and Attitudes Across Several Types of Organizations,” Journal of Applied Psychology (August 1983), 68(3), 459–468.

27 H. T. Tavani and James H. Moor, “Privacy Protection, Control of Information, and Privacy-Enhancing Technologies,” Computers and Society (March 2001), 6–11.

28 Andrew LaVallee, “Facebook Outlines Privacy Changes,” Wall Street Journal (December 9, 2009), http://blogs.wsj.com/digits/2009/12/09/facebook-outlines-privacy-changes/ (accessed on May 11, 2011).

29 Lori Andrews, “Facebook is Using You,” The New York Times (February 5, 2012), SR7.

30 Paul Pavlou, “State of the Information Privacy Literature: Where Are We Now and Where Should We Go?” MIS Quarterly (2011), 35(4), 977–985.

31 B. C. Stahl, “The Impact of UK Human Rights Act 1998 on Privacy Protection in the Workplace,” Computer Security, Privacy, and Politics: Current Issues, Challenges, and Solutions (Hershey, PA: Idea Group Publishing), 55–68.

32 Webopedia, http://www.webopedia.com/TERM/c/cookie.html (accessed on June 28, 2002).

33 Julia Angwin, “The Web's New Gold Mine: Your Secrets,” Wall Street Journal (July 30, 2010), http://online.wsj.com/article/SB10001424052748703940904575395073512989404.html (accessed on January 28, 2012).

34 Julia Angwin, “Google Widens Its Tracks,” Wall Street Journal (July 30, 2010), http://online.wsj.com/article/SB10001424052970203806504577181371465957162.html?mod=djem_jiewr_IT_domainid (accessed on January 28, 2012).

35 B. C. Stahl, “The Impact of UK Human Rights Act 1998 on Privacy Protection in the Workplace,” Computer Security, Privacy, and Politics: Current Issues, Challenges, and Solutions (Hershey, PA: Idea Group Publishing), 55–68.

36 U.S. Department of Commerce, “Safe Harbor Overview,” http://export.gov/safeharbor/eu/eg_main_018476.asp (accessed on January 28, 2012).

37 G. Gombossy, “Bank Of America's Christmas present: Foreclose Even Though Not A Payment Missed” (December 24, 2010), http://ctwatchdog.com/finance/bank-of-americas-christmas-present-foreclose-even-though-not-a-payment-missed (accessed on February 27, 2012).

38 Richard O. Mason, “Four Ethical Issues of the Information Age,” MIS Quarterly (March 1986), 10(1), 5.

39 Jim Forsyth, “Records of 4.9 mln stolen from car in Texas data breach,” Reuters (September 29, 2011), http://www.reuters.com/article/2011/09/29/us-data-breach-texas-idUSTRE78S5JG20110929 (accessed on February 28, 2012).

40 Identity Theft Organization, Frequently Asked Questions, http://www.identitytheft.org (accessed on April 5, 2012).

41 http://www.consumer.gov/sentinel/pubs/Top10Fraud2004.pdf (accessed on August 4, 2005).

42 L. Langton, “Identity Theft Reported by Households, 2005–2010,” Bureau of Justice Statistics (November 30, 2011), http://www.bjs.gov/index.cfm?ty=pbdetail&iid=2207 (accessed on February 28, 2012).

43 PR Web, “79% of U.S. Citizens Concerned About Identity Theft Yet Just 12% Enrolled in An Identity Theft Protection Program” (February, 28, 2012), http://www.prweb.com/releases/identity-theft/statistics2011/prweb4907404.htm (accessed on February 28, 2012).

44 Welcome to the FTC's Identity Theft Site, http://www.ftc.gov/bcp/edu/microsites/idtheft/ (accessed on April 5, 2012).

45 M. Culnan and C. Williams, “How Ethics Can Enhance Organizational Privacy: Lessons from the ChoicePoint and TJX Data Breaches,” MIS Quarterly (2009), 33(4), 673–687.

46 “Saving Private Data,” CIO Magazine (October 1, 1998).

47 M. Culnan and C. Williams, “How Ethics Can Enhance Organizational Privacy: Lessons from the ChoicePoint and TJX Data Breaches,” MIS Quarterly (2009), 33(4), 685.

48 Ernst & Young, Global Information Survey, 2004.

49 J. Berleur, P. Duquenoy, and D. Whitehouse, “Ethics and the Governance of the Internet,” IFIP SIG 9.2.2, White paper (September 1999).

50 Cara Garretson, “Heightened Awareness, Reinforced Products Advance Teleworker's Security,” Network World (February 20, 2007), http://www.networkworld.com/news/2007/022007-heightened-awareness.html?ap1=rcb (accessed on April 12, 2012).
