Data for Humanity Initiative


The Data for Humanity Initiative is an open letter setting out five basic ethical principles for big data use, launched by Goethe University Frankfurt's Big Data Laboratory at the Database and Information Systems (DBIS) group and signed by more than a thousand people.

Five Principles[edit]

Goethe University

In a 2015 article published in Scientific American, a team of researchers stressed the importance of the Data for Humanity Initiative's role in "disseminating an ethical code of conduct for big data use", listing and describing the Initiative's "five fundamental ethical principles for big data users".[1] The first principle is "First, do no harm" (Primum non nocere). Second, data must be used "in such a way that the results will foster the peaceful coexistence of humanity". Third, data must be used "to help people in need". Fourth, data must be used "to protect nature and reduce pollution of the environment". Fifth, data must be used to "eliminate discrimination and intolerance and to create a fair system of social coexistence".[1][2]

Signatories[edit]

By April 2016, 1,028 people had signed the Data for Humanity Initiative, including ETH Zurich's Dirk Helbing.[3]

Context[edit]

In a 2014 article entitled "Big Data Ethics", Andrej Zwitter, professor of international relations and chair of the Department of Political Science at the University of Groningen, called for a "rethinking of ethical choices" as the speed of Big Data development increases, arguing that big data governance bears moral responsibility for its unintended consequences and societal impact.[4] Zwitter cautioned that the concept of moral agency defined in traditional ethics is inadequate when faced with 'the problem of many hands'[5][Notes 1] or 'many actors' inextricably linked in Big Data's interactions, creating a distributed morality.[4] He listed four "ethically relevant qualities" of Big Data, among them its sheer quantity: the amount of data generated had grown from 5 billion gigabytes for the entire period from the "beginning of recorded history until 2003" to 5 billion gigabytes every 10 seconds in 2015.[4]

See also[edit]

{{Portal|Information technology}}

Notes[edit]

  1. ^ In large organizations it is often not possible to tell who is actually responsible for the outcomes, which is the philosophical problem of many hands.

References[edit]

  1. ^ a b Helbing, Dirk; Frey, Bruno S.; Gigerenzer, Gerd; Hafen, Ernst; Hagner, Michael; Hofstetter, Yvonne; van den Hoven, Jeroen; Zicari, Roberto V.; Zwitter, Andrej (25 February 2017). "Will Democracy Survive Big Data and Artificial Intelligence?". Scientific American. Retrieved 15 March 2018. We are in the middle of a technological upheaval that will transform the way society is organized. We must make the right decisions now.
  2. ^ "Data for Humanity: An Open Letter". Frankfurt: Big Data Laboratory. 15 March 2018. Retrieved 15 March 2018.
  3. ^ Zicari, Roberto; Zwitter, Andrej (28 October 2015). "Data for Humanity: A Request for Support". Operational Database Management Systems (ODBMS). Retrieved 15 March 2018.
  4. ^ a b c Zwitter, Andrej (2014). "Big Data Ethics". Big Data & Society. 1 (2): 1–6. doi:10.1177/2053951714559253. Retrieved 15 March 2018.
  5. ^ Thompson, Dennis F. (2005). "The Problem of Many Hands". Restoring Responsibility: Ethics in Government, Business and Healthcare. Cambridge University Press. pp. 11–32. ISBN 9780521547222

External links[edit]

  • {{Commonsinline}}
  • {{Wiktionary-inline|big data}}



[[Category:Big data]]
[[Category:Data management]]
[[Category:Distributed computing problems]]
[[Category:Transaction processing]]
[[Category:Technology forecasting]]