CSC1016S – SIPP block 2016

Annotated list of background readings for Social Issues and Professional Practice in Computing and IT

 

Dr. Maria Keet

Department of Computer Science

University of Cape Town

 

 

This document lists several background sources to consult and analyse. You are not expected to read all of them, but, at a minimum:

Most references are annotated, and for many of them questions are added to assist you in reflecting on the content; these questions are indicated with a different font type.

This is not meant to be a comprehensive, final-and-only list, and you are encouraged to look up more sources. The ones listed in this reader are a good starting point to get to grips with social issues and professional practice for computing and IT. Many of the sources listed contain further links to related material that you may wish to explore.

One note of caution: only English-language resources are listed, and these are rather biased, in that they are heavily USA/UK-framed. Most of these articles presuppose certain premises[1] that are not always made explicit, and they ignore others. Also, some SIPP-relevant topics that are not an issue in the Anglo-Saxon/English-language sphere are an issue in countries where other languages are spoken (such documents were thus not included[2]) or where certain societal issues are more or less prevalent[3].

 

 

Table of contents

1. (Computer) ethics, moral responsibility

2. Modelling, design

3. Big data

4. Some IT applications with issues

5. Privacy

6. Open Source Software, Free Software etc.

7. ICT for Development and ICT for Peace

8. Other

 

1. (Computer) ethics, moral responsibility

 

Gotterbarn, D. Computer Ethics: Responsibility Regained. National Forum: The Phi Beta Kappa Journal, 1991, 71: 26–31.

 

Moor, J.H. What is Computer Ethics? Metaphilosophy, 1985, 16(4):266-275.

 

Metz, T. Ubuntu as a moral theory and human rights in South Africa. African Human Rights Law Journal, 2011, 11:532-559.

 

Noorman, M. Computing and Moral Responsibility. The Stanford Encyclopedia of Philosophy (Summer 2014 Edition), Edward N. Zalta (ed.).

 

Bynum, T. Computer and Information Ethics. The Stanford Encyclopedia of Philosophy (Winter 2015 Edition), Edward N. Zalta (ed.).

 

Informal:

Slashdot: https://slashdot.org/story/99/09/02/2038236/review-code-of-ethics-for-programmers

Wikipedia: https://en.wikipedia.org/wiki/Computer_ethics

 

2. Modelling, design

 

Derman, E. Apologia Pro Vita Sua. The Journal of Derivatives, 2012, 20(1):35-37.

 

Jerven, M. Studying Africa by the numbers can be misleading. What can be done about it? The Conversation, 20 July 2016.

 

Keet, C.M. Dirty wars, databases, and indices. Peace & Conflict Review, 2009, 4(1):75-78.

 

Tufekci, Z. The real bias built in at Facebook. New York Times, 19 May, 2016.

 

3. Big data

 

Note: while this section is also split into scientific references and other sources, some of the other sources are actually gentle introductions to a scientific paper that is referenced at the end of that article.

 

Crișan, C., Zbuchea, A., Moraru, S. Big Data: The Beauty or the Beast. Strategica: Management, Finance, and Ethics, 2014, pp. 829-849.

 

Sax, M. Finders keepers, losers weepers. Ethics Inf Technol, 2016, 18: 25-31.

 

Zwitter, A. Big data ethics. Big Data & Society, 2014, 1-6.

 

Staff. Big data for development. African Seer, 25 April 2014.

 

Richards, N.M., King, J. Gigabytes gone wild. Al Jazeera, 2 March, 2014.

o   This is the 'lite' version of their (longish) journal article on Big Data Ethics in the Wake Forest Law Review journal. Some quotes: "Big data allows us to know more, to predict and to influence others. This is its power, but it's also its danger." "The values we build or fail to build into our new digital structures will define us." "Big data has allowed the impossible to become possible, and it has outpaced our legal system's ability to control it." "It's outrageous that while big data has allegedly eliminated privacy, many of the ways it's used are themselves shrouded in secrecy. This has things entirely the wrong way around." (cf. companies being transparent and the users keeping their privacy). And the need for "in-house panels that ensure that scientific tools are deployed ethically and for the benefit of human beings."

 

O'Neil, C. The Ethical Data Scientist. Slate.com, 4 February 2016.

o   Some quotes: "People have too much trust in numbers to be intrinsically objective, even though it is in fact only as good as the human processes that collected it." "And since an algorithm cannot see the difference between patterns that are based on injustice and patterns that are based on [network/usage] traffic, choosing race as a characteristic in our model would have been unethical." "But what about using neighborhoods or ZIP codes? Given the level of segregation we still see in New York City neighborhoods, that's almost tantamount to using race after all. In fact most data we collect has some proxy power, and we are often unaware of it." "What typically happens, especially in a 'big data' situation, is that there's no careful curating of inputs. Instead, the whole kit and caboodle is thrown into an algorithm and it's trusted to come up with an accurate, albeit inexplicable, prediction." The paper has some useful points for discussion, but then it ends with: "A data scientist doesn't have to be an expert on the social impact of algorithms; instead, she should see herself as a facilitator of ethical conversations and a translator of the resulting ethical decisions into formal code. In other words, she wouldn't make all the ethical choices herself, but rather raise the questions with a larger and hopefully receptive group." Is that really the right approach to the matter, relegating responsibilities to some amorphous, vague 'larger and hopefully receptive group'? Is it right to absolve the data scientist from any responsibility for her (in-)actions, never virtuous and never to blame for unethical behaviour, no matter how bad the consequences of some data crunching may be? Doesn't a data scientist have moral agency, so that s/he can be culpable or be exculpated?
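o   To make the 'proxy power' point concrete, below is a minimal, hypothetical Python sketch. The data, the ZIP codes, and the 90% figure are all made up for illustration; this is not code from O'Neil's article. It shows that even after the protected attribute is removed from the model's inputs, a trivial majority-vote predictor keyed on ZIP code can recover it almost perfectly in a segregated city.

import random
from collections import Counter, defaultdict

random.seed(0)

# Synthetic, illustrative data: in a (hypothetical) segregated city,
# ZIP code correlates strongly with protected group membership.
zip_majority = {"1001": "A", "1002": "A", "2001": "B", "2002": "B"}
records = []
for _ in range(1000):
    zip_code = random.choice(sorted(zip_majority))
    # Assume 90% of residents in each ZIP belong to that ZIP's majority group.
    if random.random() < 0.9:
        group = zip_majority[zip_code]
    else:
        group = "B" if zip_majority[zip_code] == "A" else "A"
    records.append({"zip": zip_code, "group": group})

# A 'fair' model input: the protected attribute has been removed ...
inputs = [{"zip": r["zip"]} for r in records]

# ... yet a trivial majority-vote predictor keyed on ZIP recovers it.
counts = defaultdict(Counter)
for r in records:
    counts[r["zip"]][r["group"]] += 1
predict = {z: c.most_common(1)[0][0] for z, c in counts.items()}

correct = sum(predict[x["zip"]] == r["group"] for x, r in zip(inputs, records))
print(f"Group recovered from ZIP code alone: {correct / len(records):.0%}")
# Prints roughly 90%: the proxy carries most of the protected signal.

Removing the race column thus achieves little by itself; the ethical question for the data scientist is what to do about such proxies.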

 

4. Some IT applications with issues

 

Note: this list is ordered alphabetically. I have tried to limit the short articles to those published in the past year, while still including a range of different current issues.

 

Bejoy, R. Apartheid victims lose 14-year legal battle against Ford and IBM. GroundUp, 23 June 2016.

 

Boninger, F., Molnar, A. How companies use school educational software to sell to children. TimesLive, 18 August 2016.

 

Epstein, R. How Google could rig the 2016 election. Politico, 20 August 2015.

 

Gershgorn, D. Police used bomb disposal robot to kill a Dallas shooting suspect. Popular Science, 8 July, 2016.

 

Kwet, M. The dangers of paperless classrooms. Mail & Guardian, 9 October 2015.

 

Murthy, M. Facebook is misleading Indians with its full-page ads about free basics. The Wire, 26 December 2015.

 

Pileggi, T. US terror victim seeks $1 billion from Facebook for Hamas posts. The Times of Israel, 11 July 2016.

o   Their argument: "'Facebook has knowingly provided material support and resources to Hamas in the form of Facebook's online social media network platform and communication services,' a press release issued by the plaintiffs said. 'Hamas has used and relied on Facebook's online social network platform and communications services as among its most important tools to facilitate and carry out its terrorist activity.'" Was this unacceptable practice from Facebook? Is it the [legal or moral] responsibility of the owner of the social network software to police what is, and is not, allowed to be communicated through its software? If not, who is responsible, if anyone? And if you deem that Facebook is complicit and culpable, could then not also, say, Egypt sue Facebook, given that it was used to organise demonstrations during the Arab Spring? Isn't their claim analogous to suing a telephone company for providing its services, had they communicated over that network instead, so that telephone companies could be sued for such matters as well? Telephone companies are not responsible for what their customers say during telephone conversations, so can one draw an analogy and conclude that Facebook is not to blame, or is it different because software is involved?

 

Anonymous. Do we need Asimov's laws? MIT Technology Review, 16 May 2014.

 

 

5. Privacy

 

This section lists only a few links that focus specifically on privacy in its own right. Some of the previous topics intersect with privacy issues in a particular context, notably Big Data.

 

DeCew, J. Privacy. The Stanford Encyclopedia of Philosophy (Spring 2015 Edition), Edward N. Zalta (ed.).

 

Isaacs, R., Deosaran, N., Crawford, K. Data protection in South Africa: overview. Practical Law.

 

Government Gazette of the RSA: Act no. 4 of 2013: Protection of Personal Information Act 2013.

 

The right to be forgotten. https://en.wikipedia.org/wiki/Right_to_be_forgotten

 

Spyware, or its euphemism 'non-invasive computing'. URL: https://en.wikipedia.org/wiki/Spyware

 

More To Be Added

 

6. Open Source Software, Free Software etc.

 

RT Spotlight interview with Richard Stallman, the free software 'evangelist': https://www.youtube.com/watch?v=uFMMXRoSxnA

 

Free Software Foundation: http://www.fsf.org/

 

GNU General Public License (GPL) and information, including the ideas behind GNU: https://www.gnu.org/philosophy/philosophy.html

Free Software, definition: https://www.gnu.org/philosophy/free-sw.html

 

See also the list of references in the SIPP05-Property slides.

 

 

7. ICT for Development and ICT for Peace

 

Anon. Kentaro Toyama: ten myths about technology and development. FSI News, 25 February, 2010.

 

Stauffacher, D., Weekes, B., Gasser, U., Maclay, C., Best, M. (Eds.). Peacebuilding in the Information Age – sifting hype from reality. ICT 4 Peace Foundation, January 2011.

 

More To Be Added

 

 

8. Other

 

Harvey, D. Technology, work and human disposability. In: Seventeen contradictions and the end of capitalism. London: Profile Books. pp. 111-121. (file: david harvey 17 contradictions)

 

Richardson, K. Sex robot matters – Slavery, the prostituted, and the rights of machines. IEEE Technology & Society magazine, June 2016.

 

Electronic Frontier Foundation – 'defending your rights in the digital world'.

 

Toyama, K. Bursting the 9 Myths of Computing Technology in Education. ICTworks, 28 January 2011.

 

IITPSA. Codes of behaviour.

 

The Moral Machine crowdsourcing app from MIT.

 

 

 

 



[1] E.g., favouring individual freedom over equality, rather than equality of the people over their individual freedoms, and assuming that a capitalist economic system is necessary.

[2] E.g., on electronic health records and privacy concerns, some paper-based vs. electronic voting arguments, and the intentional de-coupling of databases to prevent the discovery of information that, in that country, would be considered to have ethically undesirable consequences.

[3] E.g., whether it is ethical for companies to offer limited 'free' internet access to those who otherwise cannot afford it, such as a 'free Facebook+' or 'Wikipedia content only'. Such promotions run in various countries in Africa; a roll-out was attempted in India, but India's telecom regulatory authority decided to ban it.[4]

[4] A quick search will lead you to, e.g., "It's an easy question to emotionalize; one side can be accused of being uncaring about Ganesh and his poor family, the other can be accused of self-interestedly trying to retard economic and social progress in the developing world." http://www.extremetech.com/extreme/220106-free-basics-net-neutrality-and-the-problem-with-charity