
Chinese Researchers Introduce Tool to Correct AI Hallucinations

Chinese researchers have introduced a tool named Woodpecker, designed to correct AI hallucinations in multimodal large language models.

A groundbreaking solution to the problem of AI hallucination in Multimodal Large Language Models (MLLMs) has been designed by a group of scientists from the University of Science and Technology of China and Tencent’s YouTu Lab.


Solving AI Hallucination: Introducing Woodpecker

The solution was introduced through a published research paper titled “Woodpecker: Hallucination Correction for Multimodal Large Language Models.” This research was published on the pre-print server arXiv.

Woodpecker utilizes three AI models that are separate from the MLLM being corrected for hallucinations: GPT-3.5 Turbo, Grounding DINO, and BLIP-2-FlanT5. Together, they form a system that first evaluates the MLLM's output to identify hallucinations and then instructs the model under correction to regenerate its response in line with that evaluation.

This is not the first attempt to tackle the challenge of hallucination in AI models. Existing solutions have relied on an instruction-tuning approach that requires retraining the model on specific data. However, these methods are data- and computation-intensive, which also makes them expensive.

In line with the inspiration behind its name, the Woodpecker framework works in five stages: key concept extraction, question formulation, visual knowledge validation, visual claim generation, and hallucination correction.
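To make that flow concrete, here is a minimal Python sketch of how such a training-free, five-stage pipeline could be orchestrated. The function names, the `VisualEvidence` structure, and the stub bodies are illustrative assumptions rather than the authors' implementation; in a real system the stubs would wrap calls to GPT-3.5 Turbo, Grounding DINO, and BLIP-2-FlanT5.

```python
# Hypothetical sketch of a Woodpecker-style correction pipeline.
# Stage names follow the paper; the bodies are placeholders, not real model calls.

from dataclasses import dataclass


@dataclass
class VisualEvidence:
    """Facts about the image gathered by the detector and VQA models (assumed structure)."""
    objects: dict[str, int]       # e.g. {"dog": 1, "frisbee": 0} -> object counts
    attributes: dict[str, str]    # e.g. {"dog": "brown"} -> attribute answers


def extract_key_concepts(answer: str) -> list[str]:
    """Stage 1: pull the main objects mentioned in the MLLM's answer (LLM call)."""
    ...


def formulate_questions(concepts: list[str]) -> list[str]:
    """Stage 2: turn each concept into verification questions, e.g. 'Is there a dog?'."""
    ...


def validate_visual_knowledge(image, questions: list[str]) -> VisualEvidence:
    """Stage 3: answer the questions against the image with a detector and a VQA model."""
    ...


def generate_visual_claims(evidence: VisualEvidence) -> list[str]:
    """Stage 4: rewrite the collected evidence as explicit claims about the image."""
    ...


def correct_hallucinations(answer: str, claims: list[str]) -> str:
    """Stage 5: ask the LLM to rewrite the answer so it agrees with the claims."""
    ...


def woodpecker_correct(image, mllm_answer: str) -> str:
    """Chain the five stages; the MLLM under correction is never retrained."""
    concepts = extract_key_concepts(mllm_answer)
    questions = formulate_questions(concepts)
    evidence = validate_visual_knowledge(image, questions)
    claims = generate_visual_claims(evidence)
    return correct_hallucinations(mllm_answer, claims)
```

The point the sketch captures is that correction happens entirely at inference time: the MLLM's answer goes in, verified visual evidence comes back, and only the text is regenerated.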


Hallucination in AI Models

For context, AI hallucination is typically said to occur when an AI model generates outputs with a high level of confidence that are not aligned with the information embedded in its training data.

These scenarios have largely been observed in large language model (LLM) research. Examples of AI applications that use LLMs and are at risk of such hallucinations include OpenAI’s ChatGPT and Anthropic’s Claude.

According to a note in the research paper, “Hallucination is a big shadow hanging over the rapidly evolving Multimodal Large Language Models (MLLMs), referring to the phenomenon that the generated text is inconsistent with the image content.” 

With the release of new chatbot models like GPT-4, especially its visual variant GPT-4V, as well as other systems that process images and text in generative AI, such incidents of hallucination are increasingly likely, and Woodpecker is deemed a workable solution.

Godfrey Benjamin

Benjamin Godfrey is a blockchain enthusiast and journalist who relishes writing about the real-life applications of blockchain technology and innovations to drive general acceptance and worldwide integration of the emerging technology. His desire to educate people about cryptocurrencies inspires his contributions to renowned blockchain-based media and sites. Benjamin Godfrey is a lover of sports and agriculture. Follow him on X and LinkedIn.

