You do realize it isn't the specific temperatures that are changing, but how those temperatures represent the climate at that time, right? And you do understand why it's scientifically inaccurate to rely on those specific temperatures when assessing the overall climate at that time, right? Or are you ignoring your own source?
However, due to changes in observing sites, instruments, observing schedule, observing habits and micro-environment around the observational grounds, discontinuous points in the observational records can be created, especially for surface air temperature (SAT) records. The inhomogeneous data may bring certain deviation for estimating climatic trends, leading to inaccurate analyses for regional climate change detection in some circumstances (Jones et al. 1986; Easterling and Peterson 1995a; Yan et al. 2001; Ren et al. 2005; Menne et al. 2010).
And that illustrates some of the mathematical sloppiness that Jones, Mann, and other members of the team introduced, leading climatology to embrace data homogenization without looking at the consequences.
For example, suppose you have three monitoring stations (A, B, and C) with slightly different baseline temperatures. None of them has any underlying trend, but each has a consistent, creeping, positive bias that gets periodically corrected as trees get cut back or as the station gets moved a hundred feet to stay clear of encroaching structures. Each site then produces a sawtooth waveform, and the waveforms are asynchronous because the cutbacks and moves aren't coordinated between sites. So you have the equivalent of three saw blades lying edge-up on a flat table, and you want to know the slope of the table.
So take it as given that the actual climate at all three sites is unchanging, and the delta T should be zero across the whole time series, but due to the encroachment (UHI, etc.) each site shows a sawtooth warming bias of 0.5 degrees per decade, and then the error at a site gets corrected. So over fifty years, each site will go through five cycles of a gradual up and then an almost instantaneous down, with the maximum error at each site limited to 0.5 degrees. If you go back through the paper records, you would see that the past temperatures match the current temperatures to within plus or minus 0.25 degrees.
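If you'd rather see it than take my word for it, here's a minimal sketch of that setup in Python. Everything in it is made up to match the thought experiment (the helper names, the baselines, the ten-year correction cycle); it's not real station data or any agency's code. Each site is a flat true climate plus a 0.5 C/decade creeping bias that gets zeroed whenever the site is fixed, with the fixes staggered so the sawtooths are asynchronous. Fit a straight line to each raw 50-year record and the trend comes out near zero, as it should.

```python
import numpy as np

def make_sawtooth(n_years=50, phase=0, baseline=15.0, drift=0.05, period=10):
    """Flat true climate plus a creeping warm bias (0.05 C/yr = 0.5 C/decade by default)
    that is reset to zero every `period` years when the site gets fixed."""
    return baseline + drift * ((np.arange(n_years) + phase) % period)

# Three stations, different baselines, corrections staggered so the sawtooths are asynchronous.
sites = {
    "A": make_sawtooth(phase=0, baseline=15.0),
    "B": make_sawtooth(phase=3, baseline=14.2),
    "C": make_sawtooth(phase=7, baseline=15.8),
}

years = np.arange(50)
for name, series in sites.items():
    trend = np.polyfit(years, series, 1)[0] * 10    # degrees C per decade
    print(f"site {name}: raw 50-year trend = {trend:+.2f} C/decade")
```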
But then a climatologist comes along nattering about the importance of homogenization, and what he sees is three sites that each show a trend of 0.5 C per decade, but with abrupt discontinuities where the sites were somehow altered or screwed up. By comparing the three sites (A, B, and C), he can see where one site abruptly quit tracking its two neighbors, both of which were still showing the positive trend, and he will correct the "error" by removing the breakpoint so that the differing site's record continues along with the positive linear trend of its neighbors. First he corrects A, from fifty years ago, then B, from 48 years ago, then C, from 43 years ago, and round and round and round, because the three sites are asynchronous sawtooths.
What he's built mathematically is like a ratchet mechanism. Instead of treating the waveforms as periodic and taking the site observers at their word, he's taken each site's five sawtooth cycles and spliced them into giant triangles, treating the gentle slope as the underlying climate signal and the discontinuities as the errors. So from a real climate change of zero, he's created a "signal" out of five decades of 0.5 C per decade drift: an imaginary warming of 2.5 degrees C at each site.
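Here's what that ratchet looks like as code. This is a deliberately naive stand-in for a pairwise homogenization scheme, not the actual GHCN or PHA algorithm: compare each site to the mean of the other two, flag any abrupt step in the difference series, and shift everything before the break so the record lines up with the present. Run it on the sawtooth data above and each "homogenized" site picks up roughly the 0.5 C per decade drift as its trend, about 2.5 degrees of imaginary warming over the fifty years.

```python
import numpy as np

def make_sawtooth(n_years=50, phase=0, baseline=15.0, drift=0.05, period=10):
    # Same toy helper as the sketch above, repeated so this block runs on its own.
    return baseline + drift * ((np.arange(n_years) + phase) % period)

def naive_homogenize(series, reference, step_threshold=0.3):
    """Anchor to the present: wherever site-minus-reference takes an abrupt step,
    shift everything before the break so the record lines up with later data."""
    adjusted = series.astype(float)
    steps = np.diff(adjusted - reference)              # year-to-year steps in the difference series
    for i in np.where(np.abs(steps) > step_threshold)[0]:
        adjusted[: i + 1] += steps[i]                  # the past gets moved, the present stays put
    return adjusted

years = np.arange(50)
sites = {"A": make_sawtooth(phase=0, baseline=15.0),
         "B": make_sawtooth(phase=3, baseline=14.2),
         "C": make_sawtooth(phase=7, baseline=15.8)}

for name, series in sites.items():
    reference = np.mean([v for k, v in sites.items() if k != name], axis=0)
    adjusted = naive_homogenize(series, reference)
    raw = np.polyfit(years, series, 1)[0] * 10
    adj = np.polyfit(years, adjusted, 1)[0] * 10
    print(f"site {name}: raw {raw:+.2f} C/decade -> homogenized {adj:+.2f} C/decade")
```

All the adjustment has done is splice the observers' periodic bias corrections out of the record, turning five sawtooth cycles into one long ramp.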
Since the current temperatures are still being measured, he adjusts the prior temperatures, making each site's past colder and colder and colder. Since the system is a ratchet, this process will go on forever, leading to the current state of the GHCN dataset, where the past is claimed to be much colder than the present, even where the paper records of past temperatures are the same numbers we're seeing now.
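The "goes on forever" part is just as easy to check with the same toy setup: lengthen the record and the correction applied to the earliest years keeps growing, because every additional decade hands the algorithm another reset to "fix." (Again, this is the toy version, re-created compactly so it runs on its own, not a claim about any particular production code.)

```python
import numpy as np

def make_sawtooth(n_years=50, phase=0, baseline=15.0, drift=0.05, period=10):
    return baseline + drift * ((np.arange(n_years) + phase) % period)

def naive_homogenize(series, reference, step_threshold=0.3):
    adjusted = series.astype(float)
    steps = np.diff(adjusted - reference)
    for i in np.where(np.abs(steps) > step_threshold)[0]:
        adjusted[: i + 1] += steps[i]
    return adjusted

# The longer the record runs, the colder the earliest year gets pushed.
for n_years in (50, 70, 100):
    site_a = make_sawtooth(n_years, 0, 15.0)
    reference = (make_sawtooth(n_years, 3, 14.2) + make_sawtooth(n_years, 7, 15.8)) / 2
    adjusted = naive_homogenize(site_a, reference)
    print(f"{n_years}-year record: site A's first year gets adjusted by {adjusted[0] - site_a[0]:+.2f} C")
```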
Michael Crichton actually performed a check that could have caught this while researching his global warming novel, which contained dozens and dozens of raw-data temperature plots from North America showing temperatures stable or declining throughout the 20th century. It was an inconvenient truth, so it got homogenized out of existence.
It's not opinion, it's math. You can grab the code for a homogenization algorithm, feed it some sample data, and see how it can screw up. What Mann, Jones, and the Team have done is figure out how to torture raw data until it will confess to anything.