It is nothing new that scientific peer review is a process under suspicion. The Guardian illustrated this three years ago, and the WSJ has recently repeated the point. One potential solution is to put into practice the falsifiability process that Popper emphasized long ago. The main difficulty is data access; however, some journals have started to supply such data so that researchers can confirm the results. My position is clear: as a referee, I will refuse to review papers unless this option is available for any submitted article. The potential harm is huge in certain fields and circumstances, as the Vioxx case illustrates.
The WSJ op-ed says:
Fixing peer review won't be easy, although exposing its weaknesses is a good place to start. Michael Eisen, a biologist at UC Berkeley, is a co-founder of the Public Library of Science, one of the world's largest nonprofit science publishers. He told me in an email that, "We need to get away from the notion, proven wrong on a daily basis, that peer review of any kind at any journal means that a work of science is correct. What it means is that a few (1-4) people read it over and didn't see any major problems. That's a very low bar in even the best of circumstances."
But even the most rigorous peer review can be effective only if authors provide the data they used to reach their results, something that many still won't do and that few journals require for publication. Some publishers have begun to mandate open data. In March the Public Library of Science began requiring that study data be publicly available. That means anyone with the ability to check should be able to reproduce, validate and understand the findings in a published paper.