Google launches censorship measures to rig EU Parliamentary elections
By bellecarter // 2024-02-28
 
Recent reports indicate that tech giant Google is about to launch an anti-misinformation campaign, alongside tougher new rules on online content, in five European Union (EU) countries ahead of the European Parliament election in June 2024, hoping to "sway voters' stance" on certain issues. According to Reuters, France, Poland and Germany accused Russia last week of putting together an elaborate network of websites to spread pro-Russian propaganda. Against this backdrop, Europe's Digital Services Act, which took full effect the previous week, requires large online platforms and search engines to do more to "address illegal content and risks to public security."

Meanwhile, Jigsaw, a unit within the Big Tech firm that explores "threats to open societies," plans to run animated ads on TikTok, YouTube and other social media platforms in Belgium, France, Germany, Italy and Poland. The company said the project will reach citizens in countries with some of the largest numbers of voters in the bloc. The ads will feature "prebunking" techniques, an "anticipatory form of rumor control," developed in collaboration with researchers from the University of Cambridge and the University of Bristol. Experts believe the approach will help viewers identify manipulative content before they encounter it.

"We've spent so much time having these really polarized debates. Our democracy is at stake, and the temperature just keeps getting higher and higher," Jigsaw head of research Beth Goldberg said, adding that prebunking is the only technique that works equally effectively across the political spectrum.

For Bangkok-based American journalist Ben Bartee, the technocrats have so far been succeeding in their effort to weaponize technology. In a recent Substack newsletter, he wrote: "With each step, with each successive advancement in weaponized technology – which comes faster and faster as the rate of development skyrockets – the corporate state inches closer to achieving its ultimate objective of totalitarian control." He warned that the world is now in the second stage of a three-part descent into techno-hell. "In the end, of course, the social control will descend to the level of the individual mind itself, with neurological implants or other tools rendering the very act of wrongthink impossible," he said.

Bartee also cited an article in Harvard Kennedy School's Misinformation Review referring to studies that contradict Jigsaw's planned "prebunking" campaigns. The article noted that debunking and fact-checking can lack effectiveness because of the continued influence of misinformation. "Once people are exposed to falsehood, it is difficult to correct (De Keersmaecker & Roets, 2017; Lewandowsky et al., 2012). Overall, there is a lack of evidence-based educational materials to support citizens' attitudes and abilities to resist misinformation (European Union, 2018; Wardle & Derakshan, 2017). Importantly, most research-based educational interventions do not reach beyond the classroom (Lee, 2018)," the write-up noted. It also referenced a paper arguing that inoculation theory posits the possibility of pre-emptively conferring psychological resistance against persuasion attempts. "This is a fitting analogy, because 'fake news' can spread much like a virus," the authors wrote, citing analyses by Kucharski (2016) and Vosoughi and colleagues (2018).

Social psychologist warns about drawbacks of prebunking

Sander van der Linden, a social psychologist and author of "Foolproof: Why We Fall for Misinformation and How to Build Immunity," said that prebunking has its drawbacks.

Last year, Google ran a similar campaign in Poland, the Czech Republic and Slovakia targeting false claims that Ukrainian refugees were causing a housing crisis in those countries. The videos were admittedly viewed a whopping 38 million times, equivalent to about half the combined population of the three nations.

"Sometimes I worry social media companies see this as an excuse to not take down content or not to pursue regulation," van der Linden said. "It shouldn't be the case that it becomes a replacement for taking down harmful content or implementing measures that improve the algorithm or change the incentives on social media." (Related: Google project is running "prebunking experiment" on social media.)

Moreover, the videos were only effective for viewers who watched the whole segment; busy users with shorter attention spans could miss the "message." "And just like vaccines, the effects of prebunking wear off with time and the approach loses its effectiveness if social media users aren't 'boosted' regularly," van der Linden's research also pointed out.

However, Jon Roozenbeek, a postdoctoral fellow at Cambridge University's Social Decision-Making Lab, insisted that prebunking could educate social media users about manipulation tactics, "even if they won't solve the problem of misinformation entirely."

Check out BigTech.news to read more about how tech giants are being utilized by globalists to gain full world control.

Sources for this article include:

Reuters.com

ArmageddonProse.Substack.com

MisinfoReview.HKS.Harvard.edu

EuroNews.com