Ethical considerations for Facebook's news feed algorithm
Saved in:
Main Author: Tan, Xenia Yin Rue
Other Authors: Melvin Chen
School: School of Humanities
Degree: Bachelor of Arts in Philosophy
Format: Final Year Project (FYP), 38 p., application/pdf
Language: English
Published: 2018
Subjects: DRNTU::Humanities::Ethics
Online Access: http://hdl.handle.net/10356/76172
Institution: Nanyang Technological University
Description:
Algorithms are slowly invading our lives through the constant development of new technology. Not only are algorithms capable of producing outputs; an increasing number of algorithms with learning capacities are also able to create rules and make decisions and judgements on behalf of humans. Facebook's news feed algorithm is one that has been silently structuring our lives, with the potential to cause ethical harm. The goal of this paper is to highlight the necessity of ethical consideration of Facebook's news feed algorithm, given the unintended moral consequences caused directly or indirectly by the algorithm. To do so, I elaborate on three key issues with Facebook's news feed and the values that these issues affect. The paper also identifies the degree of accountability required of Facebook, of Facebook's algorithm designers and data scientists, and of Facebook's users with respect to these moral consequences.