Faked data? Met Office's data gaps raise questions over climate crisis claims
By willowt // 2025-05-19
 
  • The Met Office cannot identify the sources of temperature data for 103 non-existent or defunct weather stations, undermining the reliability of its climate records. Independent investigations revealed cases like Scole (operational 1971-1980) reporting data from 1959-2020.
  • Critics argue the Met Office’s reliance on unverified "regression analysis" to fill gaps breaches scientific standards. Without transparent sourcing, the data’s credibility is questioned, impacting climate models and policies.
  • The issue mirrors past climate data scandals, such as the 2009 "Climategate" emails and NOAA’s alleged fabrication of U.S. station data. Skeptics compare "ghost data" to using unsupported evidence in policymaking.
  • Leaked discussions reveal Met Office scientists prioritizing climate models over raw data, with experts calling models "convenient fictions." If foundational data is unreliable, trillion-dollar policies like Net Zero may lack scientific rigor.
  • Despite pressure, the Met Office avoids disclosing data sources, fueling skepticism. Investigators demand accountability, warning that climate action without verifiable evidence risks being "wishful thinking."
The U.K. Met Office, a cornerstone of global climate science, faces mounting scrutiny after failing to identify the sources of temperature data linked to 103 non-existent weather stations. Investigations by independent researchers, along with a series of largely stonewalled Freedom of Information (FOI) requests, have exposed gaps in the agency's methodology, casting doubt on the reliability of its climate data. The findings, emerging against a backdrop of urgent global climate policies, raise critical questions about the validity of temperature records used to justify the U.K.'s Net Zero agenda and international climate models.

103 stations with no trace, no evidence

In May 2025, citizen investigator Ray Sanders revealed that the Met Office cannot name the specific weather stations providing temperature data for 103 so-called "zombie sites", locations that ceased operation decades ago or never existed. After submitting FOI requests, Sanders discovered the agency admitted it lacks records detailing the "well-correlated neighboring stations" supposedly used to estimate data for these non-functional sites. Sanders highlighted two cases. The first was Scole, a Norfolk station operational only from 1971 to 1980, which now lists temperature averages spanning 1959-2020, including a full decade before its establishment. The Met Office defended this by citing "regression analysis" used to maintain long-term climate averages, but refused to name the stations that supplied data for Scole's 1990-2020 records, stating such information is "not held." A similar issue arose with Manby, a Lincolnshire site closed in 1974 yet still listed with 60 years of temperature data. "If the data isn't real and relies on unverified estimates from unknown sources, it's no better than fiction," Sanders said. "Any scientist would tell you this breaches fundamental standards. Without verifiable sources, credibility vanishes."
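For illustration only, the sketch below shows one way neighbor-based regression infilling can work in principle: a target station's temperatures are regressed against a correlated neighbor over the period when both reported, and the fitted relationship is then used to estimate values for years the target never recorded. The station roles, synthetic data and procedure here are assumptions made for the example; this is not the Met Office's actual code, data or method.

```python
# Hypothetical sketch of regression-based infilling from a "well-correlated
# neighboring station". All data below is synthetic and the stations are
# illustrative stand-ins, not real Met Office records.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic annual mean temperatures for a neighbor station covering 1959-2020,
# and a target station that only reported 1971-1980 (a Scole-like case).
years = np.arange(1959, 2021)
neighbor = 9.5 + 0.02 * (years - 1959) + rng.normal(0, 0.4, years.size)
overlap = (years >= 1971) & (years <= 1980)
target = neighbor[overlap] + 0.3 + rng.normal(0, 0.2, overlap.sum())

# Fit a simple linear regression on the overlap period: target ~ a*neighbor + b.
a, b = np.polyfit(neighbor[overlap], target, deg=1)

# Estimate the full 1959-2020 series from the neighbor, then keep the real
# observations where they exist.
estimated_series = a * neighbor + b
estimated_series[overlap] = target

print(f"regression coefficients: slope={a:.3f}, offset={b:.3f}")
print("estimated 1959 value (before the station existed):",
      round(float(estimated_series[0]), 2))
```

The arithmetic itself is trivial; the objection raised by critics is not the method but the sourcing. Without knowing which neighboring stations supplied the predictor series, an estimate like this cannot be independently audited.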

A pattern of data controversies in climate science

The Met Office’s transparency struggles echo decades of disputes in climatology. In 2009, emails from the University of East Anglia’s Climatic Research Unit exposed efforts to manipulate temperature data and suppress skeptical research. A decade later, NOAA faced accusations of fabricating data for 30% of U.S. reporting stations, as documented by meteorologist Anthony Watts. “Ghost data has a track record,” Watts noted. “It’s the equivalent of admitting in court that you have no evidence—and yet we’re supposed to trust it for trillion-dollar policies?” The Met Office’s current practices draw particular criticism for overlapping with Net Zero advocacy. Chief Scientist Stephen Belcher has linked rising global temperatures to extreme weather events, citing a doubling of hot days since 2014. However, the Intergovernmental Panel on Climate Change (IPCC) has found no such trend, raising questions about the quality of U.K. data feeding global models.

Climate models under fire: "Convenient fictions" drive policy?

The Met Office’s alleged data gaps intersect with broader debates over climate models. In a leaked BBC discussion, Chris Folland of the U.K.’s Met Office itself admitted, “We’re not basing recommendations on the data; we’re basing them upon the climate models.” David Frame, a climate modeler at Oxford, clarified the models’ limitations: “They’re convenient fictions, not literal truth.” Critics argue this dependency creates a paradox: if foundational temperature data is fabricated or untraceable, as alleged in the Met Office case, then models built on such “fictions” lose scientific grounding. The implications extend beyond academia. In the U.K., tax reforms and energy policies are directly tied to climate assessments, leading skeptics like Sanders to demand transparency: “How can we follow a climate emergency narrative if the data isn’t real?”

Call for transparency amid policy pressure

Despite Sanders’ relentless advocacy and mounting public interest, the Met Office, its governing minister and the mainstream media have yet to fully engage. The agency continues attributing its records to “regression analysis,” though it cannot verify the stations involved. As debates over Net Zero intensify, the Met Office’s opacity risks amplifying skepticism. “I’m not anti-climate science,” Sanders said, “but we need honesty. Without proof, every policy claiming ‘climate emergency’ is just wishful thinking.” The unresolved questions underscore a growing rift between the imperatives of climate action and the rigor required to justify them, a tension that defines this critical moment in climate science.

Sources for this article include:

ClimateDepot.com

BrightonJournal.co.uk

WattsUpWithThat.com