Tools of the Trade: Resources for Social Scientists

As part of our “Tools of the Trade” blog series, we’re highlighting resources for social science scholars and educators to aid in your research, writing, and prep work this summer. Look no further for a refresher on methods you can use in your own work or share with your students.

How to Think Critically

Critical Thinking: Tools for Evaluating Research by Peter Nardi

This book prepares readers to thoughtfully interpret information and develop a sophisticated understanding of our increasingly complex and multi-mediated world. Peter M. Nardi’s approach helps students sharpen critical thinking skills and improve analytical reasoning, enabling them to ward off gullibility, develop insightful skepticism, and ask the right questions about material online, in the mass media, or in scholarly publications. Students will learn to understand common errors in thinking; create reliable and valid research methodologies; understand social science concepts needed to make sense of popular and academic claims; and communicate, apply, and integrate the methods learned in both research and daily life.

Stat-Spotting: A Field Guide to Identifying Dubious Data, Updated and Expanded by Joel Best

Are four million women really battered to death by their husbands or boyfriends each year? Is methamphetamine our number one drug problem today? Alarming statistics bombard our daily lives. But all too often, even the most respected publications present numbers that are miscalculated, misinterpreted, hyped, or simply misleading. This new edition contains revised benchmark statistics, updated resources, and a new section on the rhetorical uses of statistics, complete with new problems to be spotted and new examples illustrating those problems. Joel Best’s bestseller exposes questionable uses of statistics and guides the reader toward becoming a more critical, savvy consumer of news, information, and data. See also Damned Lies and Statistics: Untangling Numbers from the Media, Politicians, and Activists, Updated Edition.

Methodology

Data Mining for the Social Sciences: An Introduction by Paul Attewell and David Monaghan

We live in a world of big data: the amount of information collected on human behavior is staggering, and exponentially greater than at any time in the past. Powerful algorithms can churn through seas of data to uncover patterns. This book discusses how data mining substantially differs from conventional statistical modeling. The authors empower social scientists to tap into these new resources and incorporate data mining methodologies in their analytical toolkits. This book demystifies the process by describing the diverse set of techniques available, discussing the strengths and weaknesses of various approaches, and giving practical demonstrations of how to carry out analyses using tools in various statistical software packages.
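
As a rough, hypothetical illustration of the kind of pattern-finding the authors describe (not an example from the book), the sketch below fits a small decision tree to simulated survey-style data using scikit-learn; every variable and relationship here is invented.

```python
# Hypothetical sketch of a data-mining step: let a decision tree discover
# patterns in simulated survey-style data (nothing here comes from the book).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
n = 2000

# Invented predictors: years of education, age, and an income score.
education = rng.integers(8, 21, size=n)
age = rng.integers(18, 80, size=n)
income = 1.5 * education + 0.1 * age + rng.normal(0, 5, size=n)

# Invented binary outcome loosely related to the predictors.
voted = (0.2 * education + 0.05 * age + rng.normal(0, 2, size=n) > 5).astype(int)

X = np.column_stack([education, age, income])
X_train, X_test, y_train, y_test = train_test_split(X, voted, random_state=0)

tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_train, y_train)
print(export_text(tree, feature_names=["education", "age", "income"]))
print("held-out accuracy:", tree.score(X_test, y_test))
```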

The Comparative Method: Moving Beyond Qualitative and Quantitative Strategies, With a New Introduction by Charles C. Ragin

The Comparative Method proposes a synthetic strategy, based on an application of Boolean algebra, that combines the strengths of both qualitative and quantitative sociology. Elegantly accessible and germane to the work of all the social sciences, and now updated with a new introduction, this book will continue to garner interest, debate, and praise.

“While not everyone will agree, all will learn from this book. The result will be to intensify the dialogue between theory and evidence in comparative research, furthering a fruitful symbiosis of ‘quantitative’ and ‘qualitative’ methods.”—Theda Skocpol, Harvard University
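
For readers who have not encountered the approach, here is a toy Python sketch (not Ragin’s method or any QCA software, just the underlying idea) of the Boolean, truth-table logic: cases are coded as the presence (1) or absence (0) of conditions, and distinct configurations are tabulated against the outcome. All case data are invented.

```python
# Toy illustration of truth-table logic with invented cases: three binary
# conditions and one binary outcome, grouped by configuration.
import pandas as pd

cases = pd.DataFrame({
    "strong_union":   [1, 1, 0, 0, 1, 0],
    "left_party":     [1, 0, 1, 0, 1, 0],
    "open_economy":   [0, 1, 1, 0, 1, 1],
    "welfare_reform": [1, 1, 1, 0, 1, 0],  # outcome
})

# Each row of the truth table is a distinct configuration of conditions,
# with the number of cases showing it and the share in which the outcome occurs.
truth_table = (
    cases.groupby(["strong_union", "left_party", "open_economy"])["welfare_reform"]
         .agg(n_cases="size", outcome_share="mean")
         .reset_index()
)
print(truth_table)
```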

Time Series Analysis in the Social Sciences: The Fundamentals by Youseop Shin

This book is practical and highly readable, focusing on the fundamental elements of time series analysis that social scientists need to understand so they can employ time series analysis in their research and practice. Through step-by-step explanations and using monthly violent crime rates as case studies, it explains univariate time series from the preliminary visual analysis through the modeling of seasonality, trends, and residuals, to the evaluation and prediction of estimated models. It also explains smoothing, multiple time series analysis, and interrupted time series analysis. With a wealth of practical advice and supplemental data sets, this flexible and friendly text is suitable for all students and scholars in the social sciences.
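
As a minimal sketch of that univariate workflow (assuming statsmodels is installed, and using invented data rather than the crime rates from the book), the example below decomposes a simulated monthly series into trend, seasonal, and residual components.

```python
# Minimal, hypothetical example: decompose a simulated monthly series into
# trend, seasonal, and residual components with statsmodels.
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

rng = np.random.default_rng(42)
months = pd.date_range("2010-01", periods=120, freq="MS")

# Simulated monthly rate: upward trend + annual seasonality + noise.
trend = np.linspace(50, 80, 120)
seasonal = 5 * np.sin(2 * np.pi * np.arange(120) / 12)
series = pd.Series(trend + seasonal + rng.normal(0, 2, 120), index=months)

decomposition = seasonal_decompose(series, model="additive", period=12)
print(decomposition.trend.dropna().head())
print(decomposition.seasonal.head(12))
print(decomposition.resid.dropna().head())
```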

Regression Models for Categorical, Count, and Related Variables: An Applied Approach by John P. Hoffmann

Sociologists examining the likelihood of interracial marriage, political scientists studying voting behavior, and criminologists counting the number of offenses people commit are all interested in outcomes that are not continuous; they must measure and analyze these events and phenomena in a discrete manner.

The book addresses logistic and probit models, including those designed for ordinal and nominal variables, regular and zero-inflated Poisson and negative binomial models, event history models, models for longitudinal data, multilevel models, and data reduction techniques.

A companion website includes downloadable versions of all the data sets used in the book.
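
As a brief, hypothetical sketch of two of the model families the book covers (this is not code from the book or its companion website), the example below fits a logistic regression to a simulated binary outcome and a Poisson regression to a simulated count outcome using statsmodels.

```python
# Hypothetical example: a logit model for a binary outcome and a Poisson
# model for a count outcome, both fit with statsmodels on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
age = rng.uniform(18, 70, n)
education = rng.integers(8, 21, n)
X = sm.add_constant(np.column_stack([age, education]))

# Binary outcome (e.g., whether an event occurred) -> logistic regression.
p = 1 / (1 + np.exp(-(-4 + 0.03 * age + 0.15 * education)))
y_binary = rng.binomial(1, p)
print(sm.Logit(y_binary, X).fit(disp=False).summary())

# Count outcome (e.g., number of offenses) -> Poisson regression.
mu = np.exp(0.5 + 0.01 * age - 0.05 * education)
y_count = rng.poisson(mu)
print(sm.Poisson(y_count, X).fit(disp=False).summary())
```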

Presenting Your Data

Principles of Data Management and Presentation by John P. Hoffmann

The world is saturated with data in words, tables, and graphics. Assuming only that students have some familiarity with basic statistics and research methods, this book provides a comprehensive set of principles for understanding and using data as part of a research project, including:
• how to narrow a research topic to a specific research question
• how to access and organize data that are useful for answering a research question
• how to use software such as Stata, SPSS, and SAS to manage data
• how to present data so that they convey a clear and effective message

A companion website includes material to enhance the learning experience—specifically statistical software code and the datasets used in the examples, in text format as well as Stata, SPSS, and SAS formats.
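
The book’s own examples use Stata, SPSS, and SAS; purely as a hypothetical analogue in Python, the sketch below runs a small management-and-presentation workflow, reshaping simulated data with pandas and charting it with matplotlib. Every file name and variable here is invented.

```python
# Hypothetical sketch of a data management and presentation workflow in
# Python (the book's own examples use Stata, SPSS, and SAS).
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(7)

# Simulated raw data: yearly observations for three hypothetical regions.
raw = pd.DataFrame({
    "year": np.repeat(np.arange(2010, 2020), 3),
    "region": ["North", "South", "West"] * 10,
    "rate": rng.normal(50, 8, 30).round(1),
})

# Management step: reshape from long to wide so each region is a column.
wide = raw.pivot(index="year", columns="region", values="rate")

# Presentation step: a simple labeled line chart.
wide.plot(marker="o")
plt.xlabel("Year")
plt.ylabel("Rate per 100,000 (simulated)")
plt.title("Simulated rates by region, 2010-2019")
plt.tight_layout()
plt.savefig("rates_by_region.png")
```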

 


Tools of the Trade: Resources for Psychology Research

As part of our “Tools of the Trade” blog series this summer, we’re here to help you further your own research by providing the resources you need to focus on your scholarship, write or rewrite your work, and prepare it for publication.

Regardless of the discipline, the quality of one’s research is only as sound as the manner in which it was conducted. That’s why our Open Access journal, Collabra: Psychology, has an entire section dedicated to the study of Methodology and Research Practice in Psychology. For those conducting research this summer—and especially those in psychological fields—we’ve rounded up the following articles to help inform your own methodological approaches, data transparency, and replicability practices.

Making Your Research Transparent (Unlike a Car Salesperson!)

Quality Uncertainty Erodes Trust in Science by Simine Vazire

When consumers of science (readers and reviewers) lack relevant details about the study design, data, and analyses, they cannot adequately evaluate the strength of a scientific study. A car whose carburetor is duct-taped to the rest of the car might work perfectly fine, but the buyer has a right to know about the duct-taping. Without high levels of transparency in scientific publications, consumers of scientific manuscripts are in a position similar to that of used-car buyers: they cannot reliably tell the difference between lemons and high-quality findings. The solution is to increase transparency and give consumers of scientific research the information they need to accurately evaluate research. Transparency also encourages researchers to be more careful in how they conduct their studies and write up their results.

A New Standard for Replicating Your Research

A New Replication Norm for Psychology by Etienne P LeBel

In recent years, there has been growing concern about the replicability of findings in psychology, including a mounting number of prominent findings that have failed to replicate in high-powered independent replication attempts. In the face of this replicability “crisis of confidence”, several initiatives have been implemented to increase the reliability of empirical findings. In this article, LeBel proposes a new replication norm that aims to further boost the dependability of findings in psychology. Paralleling the extant social norm that researchers should peer review about three times as many articles as they themselves publish per year, the new replication norm states that researchers should aim to independently replicate important findings in their own research areas in proportion to the number of original studies they themselves publish per year (e.g., a 4:1 ratio of original studies to replications).

Giving Due Attention to the Pitfalls of False Negatives

Too Good to be False: Nonsignificant Results Revisited by Chris H. J. Hartgerink et al.

The concern for false positives has overshadowed the concern for false negatives in the recent debates in psychology. This might be unwarranted, since reported statistically nonsignificant findings may just be “too good to be false.” This article examines evidence for false negatives in nonsignificant results in three different ways, arguing that the failure to address false negatives can lead to a waste of research resources and stifle the scientific discovery process.


Collabra: Psychology Call for Papers: Methodology & Research Practice in Psychology

This post was originally published on the Collabra: Psychology blog. For Collabra: Psychology news and updates, please follow @CollabraOA, the Collabra blog, or sign up for the Collabra e-newsletter.


We invite you to submit your work in methodology & research practice in psychology to Collabra: Psychology, the mission-centric, value-sharing open access (OA) journal from University of California Press.


If you’ve already heard of us, you will know that Collabra: Psychology is different. It is not just another OA journal, but a journal that actually gives back to the research community through a novel mechanism that recognizes and shares the value contributed by editors and peer reviewers. This mechanism shares earnings with editors and reviewers for any journal work (not just work leading to acceptance), and allows them to make decisions as to what happens with this value, with options to “pay forward” that value to institutional OA budgets, or to an author waiver fund subsidizing APCs for other researchers. This page explains it in full.

Additionally, Collabra: Psychology is focused on scientific, methodological, and ethical rigor. Editors and reviewers do not attempt to predict a submission’s impact on the field, nor employ any topic bias in accepting articles; they will check for rigorously and transparently conducted, statistically sound, adequately powered, and fairly analyzed research worthy of inclusion in the scholarly record. The bar is set high.

But, most importantly for this call for papers, Collabra: Psychology has a great team of editors who specialize in methodology & research practice, led by Simine Vazire, Senior Editor, University of California, Davis.

We encourage you to submit your work to us, and to know that you will be supporting one of the first journals that shares actual value with all of the people who do the work and help create a journal’s brand. With our first papers now published and receiving over 25,000 views collectively, we look forward to continued publishing success. If you have any questions, please contact Dan Morgan.

There are many more innovative features at Collabra: Psychology, including optional open peer review, article-level metrics, article annotation and commentary from hypothes.is, and an article-sharing partnership with Kudos, to name just a few. Please do check out the website for the full story: www.collabra.org.

We hope to hear from you soon!

(On behalf of the Editors)

— Dan Morgan, Publisher, Collabra: Psychology