When a group of researchers tried to email the authors of 516 biological studies published between 1991 and 2011 to ask for the raw data, they were dismayed to find that more than 90 percent of the oldest data (from papers written more than 20 years ago) were inaccessible. In total, even including papers published as recently as 2011, they were only able to track down the data for 23 percent.
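Taking the article's figures at face value, the headline percentages work out to roughly the following counts (a back-of-the-envelope sketch; the article gives percentages, not raw counts, so these are rounded estimates):

```python
# Rough arithmetic from the figures quoted above (estimates only).
total_studies = 516
overall_recovery_rate = 0.23  # data tracked down for 23 percent overall

recovered = round(total_studies * overall_recovery_rate)
lost_or_unreachable = total_studies - recovered
print(recovered, lost_or_unreachable)  # about 119 recovered, 397 not
```

So roughly four out of five datasets were simply gone or unreachable.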

Was there ever any raw data in most of those studies?
No raw data; only the predetermined outcome was available to the study participants!
To be fair, losing decades-old data is understandable (if unprofessional), and some data was stored in obsolete formats (floppy discs). These are administrative and technical problems that can be resolved fairly easily.
The human nature factor and the unmistakable drift into bad science is another matter altogether.
The first, human nature, will never be completely overcome because, well, humans are flawed, and the temptation to be less than totally honest sometimes outweighs the risk of getting caught. For instance: the POS auto-sampler jams up, skewing your instrument's calibration run, which then gives you incorrect sample results, and you only realize it after you've asked other staff to make operational adjustments… you may just keep quiet for a few hours, disappear the bad results, then rerun the test and make new recommendations. Or so I've heard. This type of issue cannot be eradicated, but the higher the stakes in getting accurate results, the more rigorous the checks, reproducibility and confirmation process should be. Wasting some steam water is not a big deal; science that is used to justify a radical new worldwide economic and energy system is.
The drift into the acceptance of bad science and politically motivated science, OTOH, must be stopped. Science needs to return to the practice of revealing all data, skepticism and vigorous debate, and to resist the temptation towards dogma, authoritarianism and claims of infallibility.
Having no raw data is a good indicator that there never was any raw data. Given that the conclusions (‘Global Warming’) are shown to be false, either the raw data was not analyzed correctly, or the research grant whores were just spreading their legs for the approval of the biased green pimps. Having some raw data sure would put the whore/pimp theory to rest.
Until the raw data shows up, whore/pimp looks more reasonable.
Boy us redneck conservative rubes shure does need to larn allot. Why I hads no idear that including the word ‘science’ or ‘study’ was proof positive of whatever the science and study was all about. Dem libs is so smrt they don’st even need to provide evidence!
That’s right. All science is just smoke and mirrors. That computer you’re working on… a fantasy. The medicine you take… nothing more than snake oil. Evolution… a conspiracy.
Shut off your computer, take off your clothes and go swing from the trees. It’s where you belong.
This issue is a problem but it doesn’t mean the science that has been done isn’t good science. It just means it’s been poorly archived. That could be fixed with… funding. Sorry guys, but that’s how something like this is fixed. A permanent, properly managed archive.
Your first point is plausible. It could very well be that data was collected but was in an old format and never archived properly. Until one can see that data, one cannot tell whether it is good, bad, indifferent, or whether it even exists.
Merry Christmas.
(Santa does exist! 🙂)
Simple way to fix that.
No data, no use in policy documents, no citation as reference paper.
Any laws or regulations imposed using data-free “science” must be struck from the books.
Any policy that implies scientific support must fully document said science.
This would produce a keen interest in archiving the data.
Of course it would implode CAGW and the astounding level of worldwide kleptocracy built upon invisible data.
It wouldn’t hurt to replicate those studies.
A bigger problem is lurking. Paper is an enduring medium; computer disk surface is not. One leading hard drive manufacturer designs for a lifetime of three years. The idea of a paper repository for scientific journals came to nought. I have even had difficulty getting hold of my own papers from the 70s (Can J Phys has had its difficulties), though so far I have ultimately succeeded. But it is predictable and even likely that at some point there will be massive losses of information – say 50 years of Nature or whatever. Professional librarians mostly couldn’t care less – they work for speed of access; the archival function, which many naive people think to be their primary function, they disdain.
The modern professional librarian hates books.
The palaeolithic era is a lot closer than most people realize.
No it’s not. You just migrate the data to new media. That’s it. Paper worked great for centuries, but we now produce as much data in a year as pretty much every living scientist from 1600 to 1950 did put together. We need computers, and we need the data storage and archiving abilities they provide us.
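The "just migrate it" step really is that mechanical. A minimal sketch, assuming a generic directory-to-directory copy (the paths and layout here are hypothetical): walk the old medium, copy each file, and verify the copy with a checksum before trusting it.

```python
# Checksum-verified migration from old media to a new archive location.
# Hypothetical sketch: src_root might be a mounted floppy image,
# dst_root the new archive directory.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def migrate(src_root: Path, dst_root: Path) -> None:
    """Copy every file under src_root to dst_root, then verify each copy."""
    for src in src_root.rglob("*"):
        if not src.is_file():
            continue
        dst = dst_root / src.relative_to(src_root)
        dst.parent.mkdir(parents=True, exist_ok=True)
        shutil.copy2(src, dst)  # preserves timestamps where possible
        if sha256_of(src) != sha256_of(dst):
            raise IOError(f"checksum mismatch: {src}")
```

Rerun something like this each time the storage generation changes and the data never gets stranded on a dead format; the checksum step is what turns a blind copy into an archive you can trust.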
John is correct. This is a minor technical problem in most cases. I have 30 year old data on floppy disks which could be read if needed. It appears that they were confounded when a floppy disk didn’t fit in a CD drive and looked no further.
Merry Christmas!
Do you know Santa’s billing address? There’s an itemized list I’d like to send him. 😉
Funny that … seems to indicate that there is a growing problem. The issue is that digging up old info is getting harder because of technology that was supposed to make it easier.
One of the things I’ve experienced myself is the loss of stored information.
I’ve lost a lot of old records that were stored on now obsolete media or using formats for which there is no longer supporting software tools.
In most cases the information could have been saved by updating everything as the technical transitions moved ahead. But, to tell the truth, I found that was just too much trouble, so off to the scrap heap it went. Even if I had the material now, it would be more work than it’s worth to recover.
Other issues like obsolete contact information just exacerbate the problem for anyone who needs to track down old data.
This is why businesses spend $millions on stuff like SAP implementations.
At the end of the day, researchers and the institutions they work for are just responsible for the original data and the format approved for saving that information. If the format is cumbersome or becomes obsolete, that is not their responsibility.
Oh BTW Merry Christmas
John,
Every professional society has been busy for >30 years collecting and archiving data. (Obviously you don’t belong to any, and those scientists lacked qualifications.) The Library of Congress has funded the recovery of NASA data, including televised broadcasts. They actually re-built compatible (secret at the time) data recorders.
When you pull data from butt holes, the butt hole is lacking in up-to-date core memory. M. Mann comes to mind.
Somewhere in the North Pole. One hopes that in the new year this will have been sorted out by the scientists!
Santa’s address is:
Santa Claus
North Pole, Canada
H0H 0H0
Not even kidding about the postal code.
John, nice straw man. Twenty years ago was 1993. You are arguing that it is reasonable for ~75% of these researchers to not have raw data available from 1993.
Why would you even try to make that argument? It’s unreasonable for a proper scientist who’s published in major journals not to provide data on request by colleagues.
Why do you continually post unreasonable, ridiculous nonsense here John?
The way you solve obsolete data medium issues is by never throwing away a working computer. I’ve got ancient boxen from the Paleolithic Era kicking around, there’s even a couple of SGI Octanes in the barn someplace. Worse comes to worst you can always print it out.
Also, never throw away a working hard drive. Because you never know.
see that, john baby? Those so-called scientists could just come over to my house and I’d have them back in business in no time. You think I’m the only computer pack-rat in Canada?
You don’t know what you are talking about. I have enough connection with professional librarians to be aware of the proposal for a paper archive (obviously a truly massive paper archive) that never got off the ground (it was multinational, BTW). You also don’t understand the massive nature of the problem. Elsevier, for example, publishes 1700 or more journals. Did you ever frequent a research library just before electronification, and see the tremendous volume of new journal literature come in each month? Library space couldn’t be built fast enough to keep up. While I don’t hold much truck with run-of-the-mill librarians, those who head research libraries are a little more conscientious, and they do identify disk volatility as a major issue.
Oh yeah, duhhh – funding helps. It really helps. Ever try to get funding out of a civil servant for an archive?????????
The way you solve obsolete data medium issues is by never throwing away a working computer. I’ve got ancient boxen from the Paleolithic Era kicking around, there’s even a couple of SGI Octanes in the barn someplace. Worse comes to worst you can always print it out. Also, never throw away a working hard drive. Because you never know. Posted by: The Phantom
Exactly what I’ve done too. No need to throw away a working tool just because I got a faster, brighter, better one. I’ve got an original tan-case Osborne tucked away somewhere.
“Because you never know.”