FISH-SCI Archives
Subject: Re: #perchgate - Would you have spotted the fraud?
From: Trevor Kenchington <[log in to unmask]>
Reply-To: Scientific forum on fish and fisheries <[log in to unmask]>
Date: Sat, 29 Apr 2017 13:30:01 -0300


><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>
<><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><


Thank you for bringing this affair to the list's attention! I had not  
heard anything of it before but I have now run through the various  
reports and documents available on-line.

I must correct you on one point: The raw data for Lönnstedt & Eklöv's  
study no longer exist. As a routine step in its publication process,  
"Science" required those data to be posted somewhere. However, when  
the authors were reminded to complete that step, it emerged that the  
data had only existed on a laptop that had just been stolen. Or such  
was the claim made.

Unless the two authors can rebut the recent review decision (and  
there is still room for some doubt there), this would be the worst  
case of outright data fabrication that I have encountered in the  
fisheries literature -- at least if one excludes cases where the  
fabrication was obvious from the published papers themselves. (I  
remember, for example, one case of a paper that took catch rate data  
from "surveys" which never happened but I don't think the data  
themselves were fabricated. Instead, I suspect, they were taken from  
tagging trips which did happen. That still invalidated the paper,  
since tagging work seeks to maximize catches and the resulting CPUE  
isn't comparable with data from later stratified-random surveys. But  
all of that could readily be found by any reader of the paper who  
bothered to check. Clearly, the reviewers had failed to do so.)
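[Editor's note: a toy numerical sketch, with entirely invented numbers, of the point above — catch rates from trips that deliberately seek out the best fishing grounds cannot be spliced together with catch rates from randomized surveys, even over the same population.]

```python
# Toy illustration (invented numbers, not data from any real case):
# why CPUE from catch-maximizing tagging trips is not comparable
# with CPUE from a stratified-random survey.
import random

random.seed(42)

# Invented "true" density (fish per tow) at 100 sites:
# most sites are poor, a few are rich patches.
sites = [random.expovariate(1 / 10) for _ in range(100)]

# Stratified-random survey: tow at 20 sites chosen at random,
# so mean catch rate estimates the overall mean density.
survey_sites = random.sample(sites, 20)
survey_cpue = sum(survey_sites) / len(survey_sites)

# Tagging trip: the skipper fishes the 20 best-known spots to
# maximize the number of fish caught (and hence tagged).
tagging_sites = sorted(sites, reverse=True)[:20]
tagging_cpue = sum(tagging_sites) / len(tagging_sites)

print(f"survey CPUE : {survey_cpue:.1f}")
print(f"tagging CPUE: {tagging_cpue:.1f}")
# The tagging CPUE comes out far higher than the survey CPUE even
# though the underlying fish population is identical, so the two
# series cannot be treated as one abundance index.
```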

To respond to your original question, though the response can only be  
one of personal opinion:

I don't think that normal peer reviewers can be expected to detect  
outright fraud. That would require a level of investigation that  
would demand vastly more time and expense than any of us could  
volunteer. We should check that, if the methods were applied as they  
are described and if the resulting data were analyzed as the authors  
claim, then their conclusions would be supported. But if someone  
worked backwards from a "conclusion", fabricated data, added some  
uncertainty and then worked forwards again to generate "support" for  
their pre-conceived position, I'd not expect a reviewer to detect it,  
nor a journal editor either.

Nor do I think that the parent institute, such as a university, can  
be expected to confirm that work was done as described. How could  
that ever be arranged, other than in some very particular situations?  
What institutes could be called upon to do is to swiftly and properly  
investigate reported cases of fraud, after the event. It looks like  
Lönnstedt & Eklöv's university badly failed on that one, preferring  
to deny any problem and blame the whistleblowers. It was a  
subsequent, national-level inquiry which produced the recent decision  
that fraud had occurred.

Ultimately, it must be an author's responsibility to proceed  
ethically. The burdens on the wider scientific community lie in such  
things as maintaining the highest ethical standards ourselves (so  
that each of us stands as an example), inculcating ethics in  
students, weeding out those who reject that message and demanding  
draconian penalties for those scientists who do commit academic  
fraud. Sadly, I fear that (as a community, though obviously not for  
every individual) we have been too lax in all of those areas.

It would further help if rags like "Science" stopped putting out  
press releases announcing new papers and stopped accepting for  
publication bits of rinky-dink science that would be better suited to  
an undergrad thesis. (If Lönnstedt & Eklöv had done what they  
claimed, it would have involved 30 one-litre beakers and three weeks  
of lab work. OK for a pilot study but hardly publishable --  
especially since their "results" were so exceptional and should thus  
have been checked by a repeat experiment.) The notion of extra-rapid  
publication of exciting new results, which has crept in during the  
last decade or so, should also be scrapped. (In too many cases,  
results are only exciting because they are wrong. Extraordinary  
claims should require extraordinary evidence, not a swift rush  
through an abbreviated review process.) More broadly, as a community  
we need incentive systems that encourage quality work, not a pursuit  
of newspaper headlines. Hence, universities and grant-funding  
organizations should go back to valuing bodies of primary  
publications, rather than media attention.

We might even get so far as trying to educate the public to ignore  
single scientific publications. The overall body of scientific  
knowledge is a robust and valuable edifice but there is a lot of  
"noise" along its fringes as individual studies lead to unique ideas.  
Some will pan out with further work. Others won't. In recent decades,  
we have made progress towards creating a scientifically literate  
public but there's been far too much emphasis on the excitement of  
the latest thing, which has led to public-policy decision-makers  
responding to the "noise" rather than waiting to see what gets  
supported over time.

All a bit depressing, to me.

Trevor Kenchington

On 28-Apr-17, at 8:55 PM, Irene Zweimüller wrote:

> Actually, the raw data are given in the supplement. I'll run the  
> stuff through SPSS and see what that tells me.
> @Chris: I could not agree more with respect to "science correcting  
> itself" and "co-authors beware".
> Irene
> On 29.04.2017 01:24, Chris Harrod wrote:
>> I think that the minimum requirement of providing the data would  
>> lower the risk of this kind of thing, but whether I would have  
>> spotted it being fake is another thing.
>> Good to see science correcting itself. Also, the big take-home  
>> message is that if you are a co-author, you need to ensure that the  
>> data are what your student/colleagues say they are.
>> Chris
>> -------- Original message --------
>> From: Irene Zweimüller <[log in to unmask]>
>> Date: 28/04/2017 17:06 (GMT-04:00)
>> To: [log in to unmask]
>> Subject: #perchgate - Would you have spotted the fraud?
>> Several scientists tweeted that a study published in Science,
>> Lönnstedt & Eklöv (2016): Environmentally relevant concentrations of
>> microplastic particles influence larval fish ecology,
>> Science 352: 1213-1216,
>> was more or less "thin air", i.e. not all of the experiments  
>> described were carried out. The University started an investigation  
>> and concluded misconduct.
>> Now my question: as a reviewer, would you have detected the fraud?
>> I found some mistakes in the statistics of the materials and methods
>> sections (Supplement), but I'm not sure I would have raised hell  
>> about the data.
>> Is it the responsibility of the reviewer to check whether there was  
>> in fact an experiment performed? Or the responsibility of the  
>> University / field station etc. to make sure people do in fact work?
>> How suspicious do we have to be?
>> I always thought that fish are too unpredictable to fake fish  
>> studies.
>> kind regards
>> Irene
>> --
>> ----------------------------------------------------
>> Dr. Irene Zweimüller
>> Fakultät für Lebenswissenschaften
>> Dept. für Integrative Zoologie
>> Althanstr. 14
>> A-1090 Wien
>> Österreich
>> Faculty of Lifesciences
>> Dept. for Integrative Zoology
>> Althanstr. 14
>> A-1090 Vienna
>> Austria
>> --------------------------------------------------
> -- 
> ----------------------------------------------------
> Dr. Irene Zweimüller
> Fakultät für Lebenswissenschaften
> Dept. für Evolutionsbiologie
> Althanstr. 14
> A-1090 Wien
> Österreich
> Faculty of Lifesciences
> Dept. for Evolutionary Biology
> Althanstr. 14
> A-1090 Vienna
> Austria
> --------------------------------------------------

><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>  ><>
For information, send INFO FISH-SCI to [log in to unmask]

                   The FISH-SCI List Archive

     To cancel your subscription, send a blank message to:
           [log in to unmask]
<><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><  <><
