Gabriel's Personal Data Structures project
Example 1 (Netflix):
Biases can be programmed to reflect trends. This can assist in targeting by age, interests, gender, race, etc.
Certain in-game elements, types of advertisements, etc. can be targeted toward certain groups (e.g., Call of Duty is marketed largely to teenage boys).
Certain platforms aim for a different audience mix than others (ex: Instagram vs. Facebook).
Enhancing or excluding?
Intentionally harming or hurting?
The following questions are based on this video
Does the owner of the computer think this was intentional?:
No, the owner doesn’t think it was intentional
If yes or no, justify your conclusion.:
The owner seems to recognize that it was an oversight in the programming and seemed annoyed but understanding about it happening.
How do you think this happened?:
There was a lack of software testing of the camera function, and the testing that was done was likely conducted exclusively with white users.
Is this harmful? Was it intended to be harmful or exclude?:
It is harmful due to the accessibility problems it creates and the racial oversight it reflects; however, it was not intended to cause harm and wasn't intentional.
Should it be corrected?:
Yes, the developer should update the program through a software patch or offer returns to customers affected by the issue.
What would you or should you do to produce a better outcome?:
Allow for more extensive testing with a diverse group of testers, and use crowdsourcing to catch problems that in-house testing misses.
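One way to do that kind of testing is to break results down by group instead of only looking at overall accuracy. Below is a minimal, hypothetical Python sketch (the function name, groups, and results are all made up for illustration) showing how per-group detection rates would expose a gap like the one in the video:

```python
# Hypothetical sketch: compute a face-detection system's detection rate
# per demographic group, so a gap shows up before release.

def accuracy_by_group(results):
    """results: list of (group, detected) pairs -> {group: detection rate}."""
    totals, hits = {}, {}
    for group, detected in results:
        totals[group] = totals.get(group, 0) + 1
        hits[group] = hits.get(group, 0) + (1 if detected else 0)
    return {g: hits[g] / totals[g] for g in totals}

# Made-up test results: the camera works well for one group but not another.
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", False), ("group_b", False), ("group_b", True), ("group_b", False),
]
print(accuracy_by_group(results))  # group_a: 0.75, group_b: 0.25
```

A single overall accuracy number (here 50%) would hide the problem; disaggregating by group makes the 0.75 vs. 0.25 gap obvious.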
Public Data Sets (Kaggle)
Google Public Data Sets
Data.gov
Crowdsourcing (ex: Spotify)