Task 5: How Algorithms + Confirmation Bias = Misinformation
Using YouTube as an example, students learn how social media platforms’ algorithms, combined with our desire to accept information that aligns with what we already believe, can lead people to consume progressively more extreme and false content.
Materials: Student Handout | Example YouTube algorithm assignment
1. Students watch the next part of “Social Media” from the Crash Course series “Navigating Digital Information” (10:06–11:42) to learn about “Extreme Recommendation Algorithms,” then respond to this prompt on the Student Handout:
- Some social media apps show us more and more outrageous content the longer we’re on the app because…
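For classes with coding experience, the feedback loop behind this prompt can be shown with a toy simulation. Everything here is invented for illustration (the extremity scores, the +0.1 nudge, the click threshold); it is not how YouTube’s actual algorithm works, only a minimal sketch of the dynamic: an engagement-maximizing recommender nudges toward slightly more extreme content, and confirmation bias means the viewer keeps clicking because each recommendation is close to what they already believe.

```python
# Toy model: each video has an "extremity" score from 0.0 (neutral)
# to 1.0 (extreme). All rules and numbers below are hypothetical.

def recommend(last_watched: float) -> float:
    """Engagement-maximizing rule (assumed): suggest something
    slightly more extreme than the last video watched."""
    return min(1.0, last_watched + 0.1)

def viewer_clicks(recommendation: float, current_view: float) -> bool:
    """Confirmation bias (assumed): the viewer clicks anything
    close enough to their existing views."""
    return abs(recommendation - current_view) <= 0.15

view = 0.2  # viewer starts out fairly moderate
for step in range(10):
    rec = recommend(view)
    if viewer_clicks(rec, view):
        view = rec  # watching the video shifts the viewer's views
    print(f"step {step}: recommended {rec:.2f}, views now {view:.2f}")
```

Because each recommendation is only a small step from the viewer’s current views, every one gets clicked, and after ten rounds the viewer has drifted from 0.2 all the way to the most extreme content in the model. No single step felt like a leap, which is the point students should take into the discussion.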
2. Students learn more about how YouTube’s algorithm has pushed people to outrageous and false information by doing one of the following:
- Watch The Algorithm: How YouTube Search & Discovery Works from the YouTube channel “YouTube Creators” (2:01) and read As Algorithms Take Over, YouTube’s Recommendations Highlight a Human Problem from NBC News
- Watch How YouTube’s Algorithm Pushes Content onto Users from NBC News (2:17)
- Read the opinion piece YouTube, the Great Radicalizer from The New York Times
3. Students write individually or discuss in pairs / small groups about the following question: Why can the combination of recommendation algorithms and confirmation bias lead people toward false or misleading information? Cite evidence from the materials above.
4. Students review the YouTube algorithm assignment directions (included on Student Handout) and example assignment to learn how to experiment with the YouTube algorithm to see how many clicks it takes to get to extreme or false information.
5. Students present their findings to small groups or the whole class, or the teacher posts all student presentations to a learning management system, where students can view their peers’ work and reflect on what they found most and least surprising.