If you are a social psychologist, it’s probably old news to you that the field is in the midst of a revolution. As a fifth-year grad student, I have never known the field any other way—news of Hauser’s questionable coding broke my first week of graduate school, and Bem’s parapsychology paper and Diederik Stapel followed shortly after. Since then, nearly every conference, Twitter feed, and paper-writing meeting I’ve experienced has included discussion of QRPs (questionable research practices), replication, the newest retracted article, “bullying”, and the “crisis of confidence.” It’s easy for a neophyte to feel frustrated and disappointed.
However, in part because of all the recent soul searching and outcry, it is also an incredibly exciting time to be a junior social psychology researcher. There is something about discord that opens up the possibility of changing and improving practices that might otherwise remain quietly mediocre. And it is encouraging that, unlike some more senior researchers, nearly every graduate student I have ever spoken with recognizes that as a science we can do better and, importantly, is open to change.
Of course the question remains: how can junior researchers, with minimal power and prestige (yet!), contribute to improving our science? Certainly we lack the ability to change things in any substantial top-down manner; hell if I’m going to tell senior research advisors (and potential letter writers) that they have to fundamentally change how they think about methods, and I’m pretty sure JPSP would completely ignore my input on its publication practices. But I propose that one good way for junior researchers to contribute to positively changing the norms of the field—and to reap some positive individual benefits in the meantime—is to practice more open science.
Why open science?
The need for open science has been explicitly acknowledged since at least the 1600s, when the first journals and scientific societies came about, encouraging researchers to publish their findings rather than hide them for personal gain. When data, procedures, materials, and findings are made open, other researchers are able to stand on the shoulders of the sharing giants to more efficiently and effectively pursue new knowledge. Indeed, openness has been credited with pushing the Scientific Renaissance, and before that with classical Greece’s development and dissemination of logic through public debates. (Paul David provides a very readable history here.) Today governments and leading psychological organizations continue to recognize that materials and data should be made publicly accessible.
Yet the current norm in psychology is to practice a somewhat closed science. This is not to say psychologists are against being open—most likely think it is “good” to share papers and materials (even data) and will do so in response to a kindly worded email. However, as a field we do not generally share materials and data in a truly open way. Instead, only those with the courage to inconvenience a stranger (often someone higher in prestige than the requester) and the patience to wait get access to them. Moreover, even the best-intentioned researchers often fail to adequately share because data are indecipherable or materials have been lost. Researchers thus, intentionally or not, construct small (and sometimes large) barriers to the dissemination and progress of science.
But as junior scientists we are particularly well situated to change this norm. Most of us are still in the process of developing our personal practices and habits, and we already use computers and the internet (which greatly help open practices) in our everyday work. Thus, integrating open science practices can be a relatively seamless transition. Additionally, since most of us are not well-known or well-cited, we may be particularly likely to gain from open science’s benefits.
Integrating open practices – what a junior researcher can do.
Online sharing. Even for someone with virtually no technological aptitude, free online sharing platforms like ResearchGate.net, Academia.edu, and the Social Science Research Network make sharing papers quick and foolproof. Posting the journal-published version of your paper may run up against your publisher’s copyright, but most allow authors to share pre-print author-created versions of their work on non-commercial websites and depositories. (As always, this is not legal advice. Some resources you may find helpful are here and here.)
Sharing data and materials can be nearly as easy. Some researchers have begun to simply link to them on personal websites, and others use free online repositories designed specifically for scientists. I, for instance, have used the Open Science Framework (OSF) to store hypotheses, research plans, stimuli, and data. The OSF basically serves as a time-stamped online file storing and sharing system, not unlike Figshare or Dropbox. But, importantly, with a click of the button it also allows you to share any of your stored materials with specific individuals, such as collaborators, or the general public.
Making it usable. Of course, making your work truly open is not the same as merely making it available; to be helpful to others it must also be organized and easily decipherable. Frankly, this takes upfront time and effort. But most of the work and drag that open science demands—for instance compiling materials, clearly labeling data and code, documenting hypotheses and analyses—is the kind of rigorous work that makes science better. Junior scientists are in the relatively well-off position of being able to integrate such organizational habits from the start, making sharing down the line much easier.
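To make the labeling habit concrete, here is a minimal sketch of what “decipherable” sharing can look like in practice: saving a dataset with meaningful variable names alongside a plain-text codebook that explains each one. The filenames, variables, and descriptions are all hypothetical, invented purely for illustration.

```python
import csv

# Hypothetical study data with self-explanatory variable names
# (rather than opaque labels like "v1" or "q23r").
data = [
    {"participant_id": 1, "condition": "control", "mood_score": 4},
    {"participant_id": 2, "condition": "treatment", "mood_score": 6},
]

# A codebook describing every variable, including units and scale anchors,
# so a stranger can understand the file without emailing the author.
codebook = {
    "participant_id": "Unique anonymous identifier for each participant",
    "condition": "Experimental condition: 'control' or 'treatment'",
    "mood_score": "Self-reported mood, 1 (very bad) to 7 (very good)",
}

# Write the data file.
with open("study1_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(data[0]))
    writer.writeheader()
    writer.writerows(data)

# Write the companion codebook next to it.
with open("study1_codebook.txt", "w") as f:
    for variable, description in codebook.items():
        f.write(f"{variable}: {description}\n")
```

The specific tooling matters far less than the habit: whatever format you use, a data file shared with a codebook like this is usable by others (and by your future self) without a single follow-up email.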
Expecting openness from others. Sometimes junior researchers can also create expectations of openness from others. For instance, Barbara Spellman recently proposed that researchers could demand access to data files before agreeing to serve as journal reviewers for empirical papers. We can also train the next generation of psychologists to integrate open practices into their everyday workflow. I, for example, may not be able to change the practices or beliefs of my senior collaborators, but I have been pleasantly surprised by how excited undergraduate research assistants are to learn about and begin using open practices. If anything, they are surprised to learn we don’t do so already.
How openness can benefit the junior researcher.
Perhaps the most obvious immediate personal benefit of practicing open science is that it gets your name and findings out in public so people can more quickly become familiar with your work. Not only does it make published papers more accessible to would-be readers and citers, but it provides a place for unpublished work to be disseminated. This is likely particularly helpful to junior researchers, who may wait years to see their names in print.
Despite this obvious benefit, some researchers still fear the risks of being open outweigh the benefits. Common fears, however, are likely overblown and may actually instead be better characterized as benefits. I should acknowledge that others have made these arguments for years. I repeat them because I believe they are important and right.
First, researchers are afraid that others will literally steal their ideas or data, claiming priority or failing to give proper citation. If anything, however, having your work publicly published online should protect you from this fear. If someone copies your work and passes it off as their own without attributing it to you, then that is plagiarism and fraud. Having your data or ideas published online prior to hard copy provides you with time-stamped evidence that you had the idea or data first. It may even stop unintentional scooping that would otherwise occur, since new researchers can check whether others are working on similar projects, and possibly even collaborate with them, before starting on duplicative work.
Second, researchers are afraid that others will misappropriate their hard work. This is an understandable concern: researchers who put in lots of effort (and sometimes money) want to be sufficiently rewarded.
Of course this fear should not stop researchers from making relevant procedures, data, and other materials available after publishing their findings. The more challenging question is whether, or perhaps for how long, researchers should be able to sit on materials and data. Currently, my personal practice is to withhold any potentially publishable data until I publish it—regardless of how long that takes or how likely I am to actually do so—and to store any seemingly unpublishable data or “failed projects” in a private folder until I forget about them. Of course there is an ethical question here (something involving withholding huge amounts of information from the public, likely slowing down the progress of science). But even from a purely selfish perspective I think we should ask ourselves: am I really going to publish this any time soon? If the answer is “no,” then chances are we can only gain from making materials and data publicly available (through, e.g., collaborations, citations, or simply becoming known for doing lots of work in our area).
The downside of sharing – opening yourself up to scrutiny.
Increased openness has many benefits, but it also has at least one drawback: opening one’s research to enhanced scrutiny. Current research culture often incentivizes presenting “clean” designs and data, so transparency about design and data “ugliness” can put one at a disadvantage in peer review, academic hiring, and tenure reviews. This fear is particularly acute for junior researchers, who do not yet have solid reputations and whose career trajectories are founded on a tiny number of projects. But it is precisely because others might find mistakes or come to different conclusions that it is so important to share. Science requires that evidence-based conclusions be subject to scrutiny.
Because open practices likely will lead to more criticism, as a field social psychology should be careful about how and why we criticize. We must acknowledge that there are often multiple ways to analyze data, and mistakes are to be expected. Thus instead of condemning individuals or practicing “gotcha” science, we should focus on developing processes to improve data collection and analysis to reduce errors. One way to constructively address perceived errors is simply to provide the original authors notice of any mistakes or alternative interpretations, allowing them to respond before their work is publicly criticized. Another option could be to engage in collaborative re-analyses or re-interpretation akin to collaborative replications. As Betsy Levy Paluck noted, if economists can be respectful, so can we.
Some final thoughts for all, including the less junior reader.
Finally, as a field we need to acknowledge that open science improves our research and reward those who practice it. Top journals like PSPB are starting to require that data from published articles be made available, and Psych Science has begun to incentivize open practices. I hope that hiring, tenure, and grant committees will similarly expect transparency and credit those who practice it.
This of course does not mean that all researchers need to spend the weekend searching their attics so they can publicly archive all their research materials since 1978. I empathize with those who are overwhelmed by the idea of practicing openly or even skeptical about open science more generally. To these readers I suggest you start small. Perhaps begin by organizing your new research materials and data files in an electronic form that could be easily shared if (and when) you decide to—clearly label the variables in your data files, try out an online repository like the Open Science Framework, encourage your next undergraduate honors student to record their hypothesis and post their materials online. I think that you will find it is not so hard to begin integrating practices that will help make your science more open and our science better.