
/sci/ - Science & Math



File: 121 KB, 659x729, 1569575142106.jpg
No.11011659

Isn't it a bit scary?

https://www.bbc.com/news/science-environment-39054778

>> No.11011664
File: 173 KB, 680x960, IMG_20181120_201634_810.jpg

Finally

>> No.11011668

>>11011659
This may happen because:
1. The experiment is so complex and equipment-specific that it's impossible to replicate consistently.
2. Scientists are lying through their teeth for job security because of the publish-or-perish tempo.

You can guess which one is more likely.

>> No.11011670

>>11011659
One issue is that the literature has become very terse in its description of methodology. Experimental setups are passed around in almost an oral tradition. In biochemistry, procedures like cell culture are a refined, personal art. So it is difficult to impose such a standard of rigor here, and the difficulty of reproduction does not invalidate (all) the results.

>> No.11011671

Now if only these threads would have a reproducibility crisis.

>> No.11011672

>>11011671
oh snap

>> No.11011675

>>11011668
>Scientists are lying through their teeth for job security because of the publish-or-perish tempo.
I do not think this is as prevalent as people here believe. Fraud happens, but I think it is more often the case that people publish marginal or statistically insignificant results.

>> No.11011697

>>11011675
It's not about fraud, it's about being involuntarily complicit in a broken system.
I believe Veritasium on YouTube had a video about a recently published paper that researched the origin of statistical significance and the mess it became.

>> No.11012023

>>11011659
Nutritional science in a nutshell.

>> No.11012051

Unsurprising. The longer you have a complex establishment, like science, higher education, government, capitalism, or journalism, the weaker it becomes as humans get sloppy over time. More people ignore the original intent and realize it's easy to game the system.

All of these establishments are useful and serve important purposes, but you gotta hit the reset button once in a while. The difficult part is figuring out how to do that without throwing everything people depend upon into turmoil.

>> No.11012056

>>11011675
Right, not fraud per se but rather people finding shortcuts to the results they want but which don't do much to further the science.

>> No.11012435

>>11011675
Deliberately publishing shaky work or work that's fundamentally insufficient isn't fraudulent? The publisher knows it's barely workable or shit, or completely insufficient to prove its claim, but they publish it anyway. How can they be anything other than fraudulent?

>> No.11012652
File: 59 KB, 426x574, IMG-188616197387~01.jpg

>>11011659
I never knew how incompetent most researchers are until I did my PhD. Holy shit, a solid 60% of them are brainlet retards whose work is all bureaucracy shit instead of researching and studying.

>> No.11013661

>>11012435
weak data is different from fraudulent data, and it's often not a conscious or malicious decision either. in many fields of science the underlying principles are "known", so when the experimenter sees something that disagrees with this knowledge, he believes he made a mistake. so he throws out all the data that disagrees with his model and restricts his report to the data where no "experimental error" occurred. this behavior is routine today, and what's more, it's often justified, since for example in microbiology a lot of stuff is simply out of your control. but it delays recognition of phenomena like minute amounts of copper contamination acting as a catalyst.
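
to make that concrete, here is a rough simulation sketch in python. all the numbers (the "expected" value, the true value, the exclusion window) are made up for illustration, not from any real experiment:

# Sketch: discard runs that disagree with the "known" value and the
# reported result drifts toward expectation. Hypothetical numbers only.
import random
import statistics

random.seed(2)
EXPECTED = 10.0      # what the field "knows" the value should be
TRUE_VALUE = 10.6    # what nature actually does (e.g. a contaminant acting as catalyst)

runs = [random.gauss(TRUE_VALUE, 0.5) for _ in range(30)]

# Runs far from the expected value get written off as "experimental error"
kept = [r for r in runs if abs(r - EXPECTED) < 0.7]

print(f"all runs:  mean = {statistics.mean(runs):.2f}")
print(f"kept runs: mean = {statistics.mean(kept):.2f}  ({len(kept)}/{len(runs)} kept)")
# The filtered mean sits closer to the expected 10.0 than to the true 10.6.

the kept-runs mean lands near what the field expected, not near what nature did, which is exactly how a real effect stays buried.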

>> No.11013699

>>11011659
Why would anyone replicate anything tho? Even if you have the lab to do so, it's worthless.
Nobody cares if you do something that was already done. Best case you can't replicate it, and either the original guy was wrong or you are.
Or everything works fine and it's status quo as usual.
Either way, replicating studies won't get you grants, citations, publications in respected journals, or tenure.

>> No.11013733
File: 869 KB, 1280x720, 1540863247583.png

>>11011668
3. the paper is very bad at explaining the methodology and/or setup
4. le 5% alpha level (see the sketch below)
5. ...
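
on point 4, a rough sketch of how a blanket 5% alpha level manufactures "findings" out of pure noise. hypothetical python simulation; the crude t statistic and the 2.02 cutoff are my own assumptions, not anything from the article:

# Sketch: 100 independent tests at alpha = 0.05 when NO true effects exist.
import random
import statistics

random.seed(0)
N_TESTS = 100        # independent hypotheses, all truly null
SAMPLE_SIZE = 20

def two_sample_t(a, b):
    """Very rough Welch-style t statistic; fine for a simulation sketch."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / ((va / len(a) + vb / len(b)) ** 0.5)

false_positives = 0
for _ in range(N_TESTS):
    group_a = [random.gauss(0, 1) for _ in range(SAMPLE_SIZE)]
    group_b = [random.gauss(0, 1) for _ in range(SAMPLE_SIZE)]
    # |t| > ~2.02 roughly corresponds to p < 0.05 at ~38 degrees of freedom
    if abs(two_sample_t(group_a, group_b)) > 2.02:
        false_positives += 1

print(f"{false_positives} 'significant' results out of {N_TESTS} true nulls")

expect around 5 hits, and every one of them looks like a publishable effect that will never replicate.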

>> No.11013776

This is mostly due to publication bias, i.e. the results of the experiment influencing the decision on whether to publish or not.

Still, we are talking about preclinical data. These are not even included in the large meta-analyses and the RCTs which confer the highest level of evidence on scientific data.

Anyone who works in the field of preclinical research knows that inferring conclusions from small cohorts, as in the typical basic research experiment, carries big problems of reproducibility.
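
As a minimal sketch of that small-cohort problem combined with publication bias (hypothetical Python simulation; the effect size, cohort size, and crude significance filter are assumptions chosen for illustration, not data from any study):

# Sketch: only "significant" small-cohort experiments get published,
# so the published effect size is inflated well above the true effect.
import random
import statistics

random.seed(1)
TRUE_EFFECT = 0.2    # true effect in standard deviations (small)
N_PER_GROUP = 8      # typical tiny preclinical cohort
N_LABS = 500

published_effects = []
for _ in range(N_LABS):
    control = [random.gauss(0.0, 1.0) for _ in range(N_PER_GROUP)]
    treated = [random.gauss(TRUE_EFFECT, 1.0) for _ in range(N_PER_GROUP)]
    diff = statistics.mean(treated) - statistics.mean(control)
    se = ((statistics.variance(control) + statistics.variance(treated)) / N_PER_GROUP) ** 0.5
    if diff / se > 2.0:              # crude "p < 0.05" filter: only these get written up
        published_effects.append(diff)

print(f"true effect: {TRUE_EFFECT}")
print(f"mean published effect: {statistics.mean(published_effects):.2f} "
      f"({len(published_effects)} of {N_LABS} experiments published)")

Because only the lucky overestimates clear the significance bar, the published average comes out several times the true effect, and an honest replication at the true effect size then looks like a failure.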

>> No.11015631

>>11011659
>>11011664
kek get the fuck out of here christfag brainlets

>> No.11015689

>>11011659
Finding surprising discoveries is economically incentivized in pretty much every university and there is no way to check if the scientists are truthfully reporting their findings. What the fuck do you expect?

>> No.11015693

>>11011659
I smell sampling bias. How many have actually attempted, with significant effort, to replicate someone else's shit? People utilize published methods all the fucking time; is that weighed into the results? Sure, this article focuses on 5 (!) cancer-related studies, so this might just all be medicine. Whatever.

>> No.11016610

>>11013699
It should though. Without replication, science is worth shit.

>> No.11017060

>>11013699
This is the truth. You don't even get any special award for deepening your own prior findings and thereby increasing their likely validity. The system is a game for points, which need to be grabbed where they can be gotten most cheaply. A million retrospective observational studies produce questionable results through data dredging and call for randomized trials to confirm them; those trials of course never happen, because they are too risky, too costly, and have a really bad ratio of work to cumulative impact factor. It's a race to the bottom.

>> No.11017348

>>11016610
If society (both civil society and the scientific community) actually wants replication (instead of holding it as an empty value/virtue signalling), then they need to start behaving as if replication is important. Stop treating replication as second-class (if not third- or fourth-class) work.

>> No.11018199

>>11013699
Peer review is in a similar situation, really.
It's a sacred cow, but it's an incredibly thankless job.

>> No.11018209

It's making me glad I didn't go into it. I was on the cusp. But watching it from the outside, it looks like the walls are caving in.