Tools of the Trade: Resources for Psychology Research

As part of our “Tools of the Trade” blog series this summer, we’re here to help you further your research by providing the resources you need to focus on your scholarship, write (or rewrite) your work, and prepare it for publication.

Regardless of the discipline, research is only as sound as the manner in which it was conducted. That’s why our open access journal, Collabra: Psychology, has an entire section dedicated to the study of Methodology and Research Practice in Psychology. For those conducting research this summer, especially those in psychological fields, we’ve rounded up the following articles to help inform your own methodological approaches, data transparency, and replicability practices.

Making Your Research Transparent (Unlike a Car Salesperson!)

Quality Uncertainty Erodes Trust in Science by Simine Vazire

When consumers of science (readers and reviewers) lack relevant details about a study’s design, data, and analyses, they cannot adequately evaluate its strength. A car whose carburetor is duct-taped in place might work perfectly well, but the buyer has a right to know about the duct tape. Without high levels of transparency in scientific publications, consumers of scientific manuscripts are in a similar position to buyers of used cars: they cannot reliably tell the difference between lemons and high-quality findings. The solution is to increase transparency and give consumers of scientific research the information they need to evaluate it accurately. Transparency also encourages researchers to be more careful in how they conduct their studies and write up their results.

A New Standard for Replicating Your Research

A New Replication Norm for Psychology by Etienne P LeBel

In recent years, there has been growing concern about the replicability of findings in psychology, including a mounting number of prominent findings that have failed to replicate in high-powered independent replication attempts. In the face of this replicability “crisis of confidence”, several initiatives have been implemented to increase the reliability of empirical findings. In this article, LeBel proposes a new replication norm that aims to further boost the dependability of findings in psychology. Paralleling the existing social norm that researchers should peer review about three times as many articles as they themselves publish per year, the new replication norm states that researchers should aim to independently replicate important findings in their own research areas in proportion to the number of original studies they publish per year (e.g., a 4:1 original-to-replication ratio, under which a researcher publishing four original studies would conduct one independent replication).

Giving Due Attention to the Pitfalls of False Negatives

Too Good to be False: Nonsignificant Results Revisited by Chris H. J. Hartgerink, et al.

In recent debates in psychology, concern for false positives has overshadowed concern for false negatives. This imbalance may be unwarranted, since reported statistically nonsignificant findings may just be “too good to be false.” The article examines evidence for false negatives in nonsignificant results in three different ways, arguing that the failure to address false negatives can waste research resources and stifle the scientific discovery process.


Collabra: Psychology Call for Papers: Methodology & Research Practice in Psychology

This post was originally published on the Collabra: Psychology blog. For Collabra: Psychology news and updates, please follow @CollabraOA, read the Collabra blog, or sign up for the Collabra e-newsletter.


We invite you to submit your work in methodology & research practice in psychology to Collabra: Psychology, the mission-centric, value-sharing open access (OA) journal from University of California Press.


If you’ve already heard of us, you will know that Collabra: Psychology is different. It is not just another OA journal, but a journal that actually gives back to the research community through a novel mechanism that recognizes and shares the value contributed by editors and peer reviewers. This mechanism shares earnings with editors and reviewers for all journal work (not just work leading to acceptance) and lets them decide what happens to that value, with options to “pay it forward” to institutional OA budgets or to an author waiver fund that subsidizes APCs for other researchers. The Collabra website explains the model in full.

Additionally, Collabra: Psychology is focused on scientific, methodological, and ethical rigor. Editors and reviewers do not attempt to predict a submission’s impact on the field, nor do they apply any topic bias in accepting articles; instead, they check for rigorously and transparently conducted, statistically sound, adequately powered, and fairly analyzed research worthy of inclusion in the scholarly record. The bar is set high.

But, most importantly for this call for papers, Collabra: Psychology has a great team of editors who specialize in methodology & research practice, led by Simine Vazire, Senior Editor, University of California, Davis.

We encourage you to submit your work to us, knowing that you will be supporting one of the first journals that shares actual value with all of the people who do the work and help create a journal’s brand. With our first papers now published and receiving over 25,000 views collectively, we look forward to continued publishing success. If you have any questions, please contact Dan Morgan.

There are many more innovative features at Collabra: Psychology, including optional open peer review, article-level metrics, article annotation and commentary from hypothes.is, and an article-sharing partnership with Kudos, to name just a few. Please do check out the website for the full story: www.collabra.org.

We hope to hear from you soon!

(On behalf of the Editors)

— Dan Morgan, Publisher, Collabra: Psychology