Launched in 2018, NewsGuard employs dozens of journalists who have rated over 30,000 news sources according to a series of criteria correlated with journalistic standards, including whether the outlets correct their errors or separate news from opinion. Over the last several years, the organization has shifted its focus toward halting the spread of misinformation by AI. It has announced partnerships with tech companies including Microsoft, which licensed NewsGuard’s products to help train the new Bing, a search engine powered by the same software as ChatGPT.
The organization said it was expanding its election misinformation “fingerprinting”: a continuously updated feed of false narratives designed to help AI models detect and avoid inadvertently sharing misinformation, and to help those models “detect prompts and responses that might convey the misinformation.”
As part of the push, NewsGuard said it is ramping up its risk-testing and “red-teaming” efforts, drawing on its knowledge of the tactics and motivations of malicious actors to ensure that AI text, image, video, and audio generators can’t be prompted to “circumvent guardrails and exploit AI systems” aimed at preventing misinformation. There are even more basic facts that NewsGuard wants to make sure machines get right: for example, the company is working to incorporate election dates and times from official government websites into AI models, so that voters receive correct information on specifics like polling locations.
Earlier this year, the organization announced the creation of its 2024 Elections Misinformation Tracking Center, which tracks myths and falsehoods spreading online.
Eugen Boglaru is an AI aficionado covering the fascinating and rapidly advancing field of Artificial Intelligence. From machine learning breakthroughs to ethical considerations, Eugen provides readers with a deep dive into the world of AI, demystifying complex concepts and exploring the transformative impact of intelligent technologies.