-Danny Begg
TikTok videos, YouTube Shorts, and Instagram Reels are all examples of the short-form video content that is now ubiquitous across social media. The term “doomscrolling” was added to the Merriam-Webster dictionary in 2023; it refers to excessive scrolling through online content and seems uniquely tied to these short-form videos. Meta and other social media companies use algorithms to tailor suggested videos to the individual user, which compounds time spent on these videos to a point that resembles addiction.
One day, during my own endless scrolling, I came across a video that stood out. It was an edit comparing the kindness of Queen Camilla to that of Princess Diana, just the kind of innocuous, pointless media I had come to expect from Instagram Reels. Suddenly, after 22 seconds of images of the two royals, there was a hard switch to a woman who identified herself as Palestinian. She said, “Don’t scroll past just because you saw me. Five seconds of your time is all I need. Like, comment, share—any little amount helps.” I felt like I was being spoken to, as if a direct plea to be seen was being made to me. It was jarring to have my doomscrolling interrupted by a real person. I was moved, and I liked the post, hoping that boosting its engagement might help it find its way to someone with the spare cash to donate.
Later, I encountered another video with the same structure, what I have started to call a “bait-and-switch” template. It was posted by a different account but followed the same pattern: a “popular” or eye-catching image, followed by a sharp cut to a heartfelt appeal. This one started with a video of an artist painting a landscape, then switched to a young girl walking through the rubble of destroyed buildings, saying, “everyone can donate. So can you donate to me? ‘Cause I’m tired in this place. Thank you.” She was no more than eleven years old, and seeing her in my Reels raised so many questions.
I built a small archive of the videos I came across, and as my engagement grew, my social media feeds served me more of this content. I noticed that it was usually women and children reaching out—speaking directly to global audiences, asking for relief or simply a boost in engagement. I kept wondering: Why do they rely on innocuous content to draw viewers in? Would their pleas get the same attention if they weren’t wrapped in something easy, light, or algorithmically “safe”? The reliance on this bait-and-switch technique, I have come to understand, is not accidental. It is a necessary response to algorithmic suppression, both by platforms like Meta and, in these cases, by the state of Israel (Cristiano 2022). Israel’s algorithmic suppression stems from its total control of Palestinian digital networks, which Tawil-Souri and Aouragh (2014) call “cyber-colonialism.” As a result of the Oslo Accords, “all international traffic, initially for landlines and later for cellular and internet lines, had to be routed through Israeli providers” (p. 112). This control allows all content posted by Palestinians to be routed through Israeli content moderation.
Figure 1. Instagram screenshot of the bait-and-switch template
Many advocate for free and open access to the Internet as a human right. While not officially recognized by the United Nations, this right is increasingly seen as essential in contemporary society, where a lack of digital connectivity can produce real disparities in information, education, opportunity, and safety. Today, the over-surveillance and moderation of Palestinian content may affect how the humanitarian crisis in Gaza is perceived and addressed. In recognition of the importance of social media visibility, the organization 7amleh fights against online suppression by using international law to lobby for the digital rights of Palestinians. After the May 2021 protests in Palestine, 7amleh pushed Meta to hire an external company, BSR, to review possible discrimination in the application of its content-moderation algorithm. BSR’s memo investigated Meta’s activities and identified “various instances of unintentional bias where Meta policy and practice, combined with broader external dynamics, does lead to different human rights impacts on Palestinian and Arabic speaking users” (