Students from Santa Clara University are engaged in a global dialogue on this and other business ethics issues with students at Ateneo de Manila in the Philippines and the Loyola Institute of Business Administration in India. You are invited to share your thoughts on this or the previous case, on who was responsible for a terrible factory fire in Bangladesh.
Ben Adida, director of identity for Mozilla, Brian Kennish, co-founder of Disconnect, and Arvind Narayanan, assistant professor of computer science at Princeton University, visited the Santa Clara campus Jan. 23 to discuss an engineering ethics perspective on privacy by design. Under this approach, concerns about customer privacy are part of the planning process for new products rather than an afterthought addressed only after problems arise.
The panel, co-sponsored by the Ethics Center and the High Tech Law Institute, was moderated by Center Internet Ethics Program Manager Irina Raicu.
When an app or piece of software comes on the market, its introduction may be followed by concern over whether users' privacy can be adequately protected. But a new approach to privacy protection is "privacy by design," where protections are "baked into" the product from its inception.
At a panel discussion this evening, 7 p.m., in Lucas Hall on the Santa Clara University campus, three outstanding computer scientists will offer their perspectives on privacy by design and the role that engineering ethics plays in those efforts. Join us for a presentation by Ben Adida, Director of Identity at Mozilla and technical advisor to Creative Commons; Brian Kennish, ex-Googler and co-founder of Disconnect; and Arvind Narayanan, Assistant Professor at Princeton and affiliate scholar at Stanford’s Center for Internet and Society.
A $500 prize will be awarded to the SCU student or team of students that creates the most creative, well-designed, accessible app allowing users to work through an ethical decision using the tools provided by the Ethics Center. The deadline for submissions is April 5, 2013.
Characterizing the Federal Trade Commission as "the nation's premier consumer protection privacy agency," FTC Commissioner Julie Brill focused on new developments in Internet privacy at a presentation Sept. 20 at Santa Clara University. Her talk was part of the “State of the Net West” townhall series, in which federal policymakers meet with members of the Silicon Valley community to discuss technology policy.
Brill began by detailing consumers’ increasing reliance on mobile tools—noting, for example, that 52 percent of college students report that they check their phones before getting out of bed. She also highlighted disparities among various segments of the population, citing consumer surveys which show that 40 percent of people in households earning less than $30,000 per year go online mostly through their phones, as compared to 17 percent of people in households earning more than $50,000 per year. In addition, half of African-Americans and 40 percent of Latinos who access the Internet report doing most of their browsing through their phones.
As Internet access through mobile devices becomes widespread, Commissioner Brill said that app developers and app service providers are increasingly realizing that they have to think about consumer privacy, too. In the “diffuse ecosystem” of the Web, she said, which involves so many players, it is easy for each of them to believe that “privacy is somebody else’s responsibility.” However, if nothing else, a variety of FTC enforcement actions have demonstrated that both large and smaller players involved in data collection and analysis would be best served by “baking” privacy into their products.
Commissioner Brill praised the agreements reached by California’s Attorney General Kamala Harris and various platform providers (including Amazon, Google, and Facebook), which have agreed to provide app developers with tools that will allow those developers to give consumers clear information about their privacy policies. In general, Commissioner Brill urged app developers to limit the information that they collect from users to data that is necessary for the app to work; she also spoke of the need to then provide adequate security for that data.
A significant portion of the Commissioner’s talk, as well as of the question-and-answer period that followed, focused on the ongoing efforts to develop a “Do Not Track” mechanism through which consumers would express their wishes if they didn’t want to be tracked by companies online. Commissioner Brill sounded optimistic about the efforts of the W3C consortium, which has been holding meetings with key stakeholders in an effort to develop standards for the implementation of a “Do Not Track” signal.
Commissioner Brill noted that some stakeholders have called for exceptions that would allow continued data collection for “market research” and “product improvement” (even from consumers who request that they not be tracked), and she called on advertising networks to provide more input in defining those exceptions. She also called on the W3C to address the issue of data retention limits. She described limits on the amount of data collected, and on the amount of time that data may be held, as ways to enhance data security.
Finally, she warned that consumers who are concerned about online privacy and loss of control over their personal information might turn to self-help tools that would be “more blunt” than the ones currently being discussed in the W3C.
In response to a question suggesting that the vast majority of consumers would choose to opt out of tracking if given that option, Commissioner Brill argued that companies need to convey more clearly to consumers the benefits of behavioral (or targeted) advertising, and need to give people more “granular” choices about the types of ads they want to see. She noted that companies also need to explain to consumers that, in the absence of advertising revenue, providers would find it difficult to continue to deliver the sites or services that consumers are now enjoying.
Brill has worked actively on issues most affecting today's consumers, including protecting consumers' privacy, encouraging appropriate advertising substantiation, guarding consumers from financial fraud, and maintaining competition in industries involving high tech and health care.
"It's inevitable that we will be publishing much more government data online," Hanson said. "In general, it's a good thing."
Online publication of sex offenders' names and whereabouts helps individuals protect themselves from convicted criminals, and such data have a social benefit, he says.
But posting online certain government data, such as arrest records, can open a can of worms, Hanson points out. If an arrest were posted, then at a minimum government would have the responsibility to publish the disposition of a case, he says.
"There are ethical issues because so much damage can be done by electronic publication of an arrest that doesn't lead to a conviction," Hanson said. "An electronic arrest record lives forever."
Government agencies have an ethical obligation to consider whether more harm than good results from putting certain data online, Hanson says.
Search services like Google, AOL and Yahoo! compile vast amounts of data on the searches of all their visitors. These seemingly innocent little bits of data, when taken together, can be very revealing. From a person's search queries, one could infer, rightly or wrongly, medical and psychological issues, legal problems, employment status, personal interests, sexual activities and preferences, relationships, fantasies, economic circumstances, geographical location and a host of other characteristics. Taken together they can suggest a fairly comprehensive portrait of a person, including that person's most intimate problems and vulnerabilities.
These are some of the ethical concerns raised by data aggregation, according to Michael McFarland, S.J., a visiting scholar at the Ethics Center this past year. McFarland, a computer scientist and the former president of College of the Holy Cross, addresses these and other issues in an extended case study on Internet privacy.
Focusing on the "why" of privacy protection, the module offers a case study, an explanation of what companies have to gain from violating a user's privacy, and links to sites that show visitors how to protect their privacy. Throughout, the materials place privacy in the context of ethics, exploring why privacy is crucial to rights that we value.
WeKnowWhatYoureDoing.com is a new Web site that copies potentially embarrassing status updates by people who have not protected their privacy on Facebook and posts them for all the world to see. The site is a partial answer to the question, Why do we care about privacy?
So is a new group of materials on privacy recently posted on the Ethics Center Web site. Created by computer scientist Michael McFarland, the materials look at privacy as an ethical matter. McFarland was a visiting scholar at the Ethics Center this spring, following a 12-year stint as president of the College of the Holy Cross. McFarland began his career at Bell Labs, and has taught computer science at Boston College and Gonzaga University. See McFarland's take on Why We Care About Privacy.