Scientists can get really petty in peer review. They won’t be able to catch whether the data was manipulated or faked, but they’ll catch everything else: inconclusive or unconvincing data, faulty assumptions, missing data that would complement and further support the conclusion, or even trivial things like an unclear sentence.
It generally works as long as you can trust that the author isn’t dishonest
So it’s like a CrowdStrike code review
A LOT of things work without safety nets if people engage honestly.
The problem, with FAR more than just science, is that many, many people are distinctly NOT honest.
I do trust scientists with peer review more than I trust programmers with code reviews. This is how I imagine the CrowdStrike reviewer.