Perplexity, an AI search engine, claims to provide short summaries on various topics, with footnotes linking to real-time information from reliable sources. However, it has been criticized for citing AI-generated posts that contain inaccurate and contradictory information. A study by GPTZero, an AI-detection company, found that Perplexity frequently draws on AI-generated sources in areas such as travel, sports, food, technology, and politics, raising concerns about the reliability of its output.
According to the study, Perplexity users often encounter AI-generated sources after entering just three prompts. GPTZero CEO Edward Tian emphasized that reliable sources are essential to producing accurate outputs. Queries about cultural festivals, the impact of technology, food recommendations, and promising young tennis players consistently returned answers citing AI-generated material, casting doubt on the credibility of Perplexity's information.
Perplexity's Chief Business Officer, Dmitri Shevelenko, acknowledged that the system is not flawless and said the company continually refines its processes for identifying relevant, high-quality sources. Perplexity uses trust scores to assess the authority of domains and content, downranking spam-filled websites. Shevelenko said the company has also developed algorithms to detect AI-generated content, though these systems require ongoing refinement as such content grows more sophisticated.
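Perplexity has not published how its trust scores or detection algorithms work; the sketch below is purely illustrative, with invented field names and weights, and shows only the general idea of downranking sources by combining relevance with a domain trust score and a penalty from a hypothetical AI-content detector.

```python
# Illustrative sketch only: Perplexity's actual ranking algorithm is not
# public. All fields, weights, and scores here are invented for
# demonstration.
from dataclasses import dataclass


@dataclass
class Source:
    url: str
    relevance: float          # 0-1: how well the page matches the query
    domain_trust: float       # 0-1: assumed per-domain authority score
    ai_generated_prob: float  # 0-1: output of a hypothetical AI detector


def rank_score(s: Source) -> float:
    # Downweight sources a detector flags as likely AI-generated,
    # then scale by domain trust. The 0.8 penalty factor is arbitrary.
    penalty = 1.0 - 0.8 * s.ai_generated_prob
    return s.relevance * s.domain_trust * penalty


def rank(sources: list[Source]) -> list[Source]:
    # Highest combined score first.
    return sorted(sources, key=rank_score, reverse=True)


if __name__ == "__main__":
    candidates = [
        Source("https://example-news.com/story", 0.90, 0.90, 0.05),
        Source("https://spam-blog.example/post", 0.95, 0.20, 0.90),
    ]
    for s in rank(candidates):
        print(f"{s.url}: {rank_score(s):.3f}")
```

Under this toy scoring, a highly relevant but low-trust, likely AI-generated page falls below a slightly less relevant page from a trusted domain, which is the behavior the company describes.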
Perplexity has also faced criticism for allegedly plagiarizing content from news outlets and republishing it without proper attribution. Forbes sent the company a cease-and-desist letter alleging copyright infringement after Perplexity recreated and distributed an exclusive Forbes story without permission. Perplexity denied the allegations, arguing that facts cannot be plagiarized. Its citation of AI-generated sources on sensitive topics, including health advice, has likewise raised concerns about the accuracy and consistency of its output.
Despite its popularity and backing from tech giants like Amazon and Google, Perplexity's reliance on AI-generated sources and its questionable handling of authoritative information have drawn scrutiny. The company's revenue-sharing program with publishers aims to compensate them for their contributions to the platform. However, experts warn that relying on low-quality web sources can amplify disinformation and bias in AI-generated content, potentially producing misleading outputs and contributing to model collapse, the degradation that occurs when models are trained on AI-generated material.
The use of AI-generated content in search engines and other products highlights the challenges AI companies face in ensuring the accuracy and reliability of their outputs. Perplexity's reliance on AI-generated material and the alleged plagiarism of journalistic work underscore the need for better vetting and source attribution in AI-powered technologies. As AI-generated content proliferates, credible sourcing and editorial integrity become ever more important to maintaining trust in AI-based platforms.