Blog Post #5

This week, I shifted my focus from why misinformation spreads to the bigger question of how it actually shapes public discourse, meaning the shared conversations and debates around an event or topic. I realized that when platforms amplify certain types of content, they indirectly shape what the "public conversation" is.
One of the sources I looked at this week was an article from Georgetown University in which the author explains how algorithms can actually set the agenda for public conversations. Social media platforms create feedback loops: when someone interacts with a video, whether that is a like, a share, or a repost, the algorithm registers this engagement, pushes similar content back to that person, and pushes the video itself to more people. People therefore keep seeing similar videos, which creates a loop and distorts how popular a topic really is. According to the author, this is part of the reason why content perceived as viral can be inaccurate, because the algorithm prefers whatever performs well emotionally (DiResta, 2025).
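To make the feedback loop concrete, here is a tiny toy simulation I put together (my own illustration, not a model from the article): two hypothetical posts compete for a feed slot, the "algorithm" simply shows whichever post has the higher engagement rate so far, and the emotionally charged post, which people react to slightly more often, ends up dominating almost everyone's feed regardless of accuracy.

```python
# Toy sketch of an engagement-driven feedback loop (hypothetical numbers).
import random

random.seed(1)

# Two made-up posts: the emotional rumor gets reactions more often,
# independent of whether it is accurate.
posts = {
    "emotional_rumor": {"engage_prob": 0.30, "engagements": 1, "impressions": 1},
    "accurate_report": {"engage_prob": 0.10, "engagements": 1, "impressions": 1},
}

def pick_post():
    # The core of the loop: rank purely by engagement rate so far.
    return max(posts, key=lambda name: posts[name]["engagements"] / posts[name]["impressions"])

for user in range(10_000):
    shown = pick_post()
    posts[shown]["impressions"] += 1
    if random.random() < posts[shown]["engage_prob"]:
        posts[shown]["engagements"] += 1

for name, stats in posts.items():
    share = stats["impressions"] / sum(p["impressions"] for p in posts.values())
    print(f"{name}: shown to {share:.0%} of users")
```

Running this, the emotional rumor is shown to nearly all simulated users, which is the distortion of "popularity" described above: the feed makes the most reacted-to post look like the most important one.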
This connects directly to what I mentioned in my last blog post. A lot of the misinformation we see goes viral for a specific reason: it triggers a strong reaction, not because it is accurate. Bringing this back to public discourse, the problem is that once something goes viral, it becomes part of the public conversation even if it is false.
Another algorithmic impact is what is referred to as "echo chambers." An echo chamber is when you are repeatedly exposed to a specific topic or opinion, and the more you see it, the more you believe it. The other key part of echo chambers is that you tend to see only that opinion and not others, because when you interact with content of that type, the algorithm is trained to bring more content like it back to you, creating a dangerous cycle. Echo chambers also tend to group people with similar views together, creating public discourse centered on one point of agreement. Once inside those bubbles, what the group thinks shapes what each person believes.
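Here is a second small sketch of that cycle, again my own assumption rather than anything from the sources: a simple bounded-confidence model where the "feed" only connects people whose opinions are already close, and each exposure pulls them a little further toward what they already believe, so the population splits into clusters that stop hearing each other.

```python
# Toy echo-chamber model (bounded-confidence style, hypothetical parameters).
import random

random.seed(2)

opinions = [random.uniform(-1.0, 1.0) for _ in range(50)]  # opinions on a -1 to +1 scale
FEED_RADIUS = 0.4   # the feed only surfaces views this similar to yours
PULL = 0.25         # how strongly each exposure shifts your opinion

for step in range(2_000):
    a, b = random.sample(range(len(opinions)), 2)
    # Only people with already-similar views ever see each other's content.
    if abs(opinions[a] - opinions[b]) < FEED_RADIUS:
        midpoint = (opinions[a] + opinions[b]) / 2
        opinions[a] += PULL * (midpoint - opinions[a])
        opinions[b] += PULL * (midpoint - opinions[b])

# Roughly count the surviving opinion clusters after the loop settles.
clusters = sorted({round(o, 1) for o in opinions})
print("surviving opinion clusters:", clusters)
```

Instead of everyone drifting toward one shared conversation, the simulation ends with a few separate clusters, which is the "bubble" effect: each group only ever reinforces itself.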
Overall, this week made me more aware that algorithms don't just show us content; they guide us toward specific things and then toward communicating with others who are seeing the same content. This has consequences, because public discourse is what shapes our beliefs, and if we only connect with others who share our views, we will never see the other side.
Resources
- Ask a Professor: Renée DiResta, "How Social Media Can Shape Public Opinion" (Georgetown University)
- "Echo Chamber Effect"
- https://insights.som.yale.edu/insights/how-social-media-rewards-misinformation
Photo: Unsplash – a-close-up-of-a-cell-phone-with-social-media-icons-on-it-a_lBsEZZjWc
