This article discusses Technology Influences and Changes Culture.
Class: This week we looked at Gerhard Lenski's theory that TECHNOLOGY INFLUENCES AND CHANGES CULTURE.
The things we make, create, and use say a lot about us…particularly HOW we use them.
We saw two stories about the impact of technology on culture.
CAMBRIDGE ANALYTICA WHISTLEBLOWER
For the CAMBRIDGE ANALYTICA story: PLEASE ANSWER ALL PARTS IN BLOCK PARAGRAPH FORM, WITH ONE BLANK LINE BETWEEN PARAGRAPHS. 10 PTS
- What did Cambridge Analytica understand each person to be? What did this enable them to do? 3 pts
- Explain how the use of Facebook allowed Cambridge Analytica to gather information. 3 pts
- Who do you think was most responsible for this happening? PICK ONLY ONE PERSON/COMPANY. Defend your choice by explaining how that ONE person's actions allowed everything else to occur. IF YOU PICK MORE THAN ONE, YOU WILL LOSE ALL POINTS FOR THIS QUESTION…PICK ONE AND DEFEND THAT CHOICE. 4 pts
FOR THE STORY ON "MISINFORMATION": 10 PTS
- What are the TECHNICAL/Cultural aspects of this story that are impacting the communities? 3 pts
- What is the problem with policing Social Media in foreign languages? 3 pts
- What are the CONSEQUENCES of Misinformation in these communities? There are plenty in the story. 4 pts
BONUS ROUND UP TO 10 EXTRA POINTS****
Review BOTH STORIES. How might the story about Cambridge Analytica be connected to the story on Misinformation IN TERMS OF THE TECHNOLOGY and how it is used?
Give it your best shot. Make a good argument with the info from the stories.
Technology Influences and Changes Culture
Cambridge Analytica understood each person to be a unit of culture: to change the culture, they had to change the people. Rather than trying to change the whole culture at once, they focused on its units, the individual people. They sought to create a psychological weapon that would help manipulate people's psychology (The Guardian, 2018). The company understood that it had to fragment and remodel people to fit its vision. By collecting the Facebook profiles of millions of people, it could design an algorithm to manipulate them into thinking the way the company wanted.
Facebook had permitted Cambridge Analytica to run a special app on its platform that collected information about Facebook users. Through this app, the company could use the people who had installed it to reach their friends who had not (The Guardian, 2018). By recruiting people to the app, the company extended its access to those users' Facebook friends and their profile information, including personal data such as private messages. By using Facebook this way, the company was able to access the data of millions of people.
Cambridge Analytica is responsible for what happened. I would not single out any one individual; I blame the whole company, because every team member was aware that what was happening was unethical, yet nobody came forward to stop the unauthorized data access. The company was supposed to bring people together through connection and shared experience, but instead it sought to divide, fragment, and mold them into something it desired for money and a selfish political agenda.
The company sought funding to build a psychological tool to manipulate voters (The Guardian, 2018). It took advantage of its special app on Facebook to gather data about Facebook users without authorization. It then went ahead and built a system that profiled individual voters in order to target them with tailored political advertisements.
The language barrier is the primary cultural aspect affecting the communities in this story. Many immigrant diaspora communities cannot access information available in English, so they have to rely on channels and sources in their native languages.
The report includes the Vietnamese community, which relies on a YouTube source with no credibility (LastWeekTonight, 2021). When communities can only consume content in their native language, misinformation spreads more easily, because it is hard to debunk the channels and sources sharing it.
Technical aspects impacting the communities include Facebook and private messaging apps like WhatsApp, TALK, and WeChat, which are tailored to family, friends, and particular communities only. Most information on these platforms has no credible sources, and no hyperlinks are included, making the claims difficult to verify.
Facebook and YouTube have interventions to control the misuse of social media, especially the spread of misinformation. However, most policing interventions apply only to information shared in English (LastWeekTonight, 2021).
It is critical to take down misinformation in other languages, but so far this has not been practical because countries have different policies on information sharing and social networking sites. According to this story, most of Facebook's misinformation regulations apply primarily in America and Canada.
Misinformation leads to the spread of harmful conspiracy theories and hate speech. Several examples in this story reveal how misinformation has affected how these communities respond to the coronavirus. A fake doctor spread claims that the community should not take the vaccine and should treat the virus on their own. Others shared information about unvalidated treatments for the coronavirus.
Some urged the communities to stop wearing masks, claiming that masks kill (LastWeekTonight, 2021). This fake information can cause communities to respond to the virus inappropriately, increasing infection and mortality rates. Misinformation can also lead to violence, especially when it spreads in family and friend groups on WhatsApp. Misinformation also breeds mistrust of particular individuals or governments, for instance, Joe Biden and the Chinese government.
Digital technology, especially social networking sites and private messaging apps, can be used for the wrong reasons. Facebook in particular was at fault on both occasions: first for allowing access to users' information without consent, and second for having inadequate policies to control misinformation. Cambridge Analytica managed to retrieve the profile information of about 50 to 60 million Facebook users within three months, without their knowledge, to develop individual voter profiles and target them with tailored political advertisements (The Guardian, 2018).
Facebook and private messaging apps have impacted communities by spreading misinformation on various matters, including critical ones such as coronavirus prevention, treatment, and vaccination (LastWeekTonight, 2021). In both stories, technology was used unethically.
The Guardian. (2018, March 17). Cambridge Analytica whistleblower: 'We spent $1m harvesting millions of Facebook profiles' [Video]. YouTube. https://www.youtube.com/watch?v=FXdYSQ6nu-M&ab_channel=TheGuardian
LastWeekTonight. (2021, October 11). Misinformation: Last Week Tonight with John Oliver (HBO) [Video]. YouTube. https://www.youtube.com/watch?v=l5jtFqWq5iU&ab_channel=LastWeekTonight