Jonathan Pruitt, a former UCSB professor currently working at McMaster University, finds himself engrossed in a whirlwind of controversy following the retraction of four academic research papers and an ongoing investigation by numerous peers into his extensive body of work.
Anomalies identified in data collected by Pruitt have put the status of his research publications in jeopardy. The findings of these investigations are being compiled in a Google Sheets document created by Daniel Bolnick, the editor in chief of The American Naturalist and a professor of ecology and evolutionary biology at the University of Connecticut — as well as a one-time co-author, collaborator and friend of Pruitt’s.
Pruitt worked in UCSB’s Ecology, Evolution, and Marine Biology (EEMB) department until October 2018, when he left the United States for McMaster University in Hamilton, Ontario after the Canadian government offered him a seven-year research grant worth $350,000 per year.
At McMaster, Pruitt has been involved in animal personality research, managing his own lab and studying animal behavior in various species. As a holder of one of Canada’s prestigious Canada 150 Research Chairs, Pruitt had been lauded as a standout in the field.
Issues with Pruitt’s data were first raised by Niels Dingemanse, a professor of behavioral ecology at the Ludwig Maximilian University of Munich, regarding the paper “Individual and Group Performance Suffers from Social Niche Disruption,” published in The American Naturalist. The paper has since been retracted.
“[Dingemanse’s] critique (from an analysis by Erik Postma) identified biologically implausible patterns in the data file used in the paper (Laskowski et al 2016). Specifically, certain numbers appeared far more often than one would expect from any plausible probability distribution,” Bolnick wrote in a post on his blog, Eco-Evo Evo-Eco.
Kate Laskowski, a professor in the department of evolution and ecology at UC Davis, collaborated with Pruitt on four papers on social spiders — two of which have since been retracted, with a third in the process of being retracted. She was made aware of the issues with Pruitt’s data by Dingemanse and contacted Pruitt.
The data in question measured the “boldness” behavior of specific social spiders, or the “latency of the spiders to resume movement after being scared,” Laskowski wrote in a blog post.
According to Laskowski, the prediction was that the “repeatability of behavior” would increase in mixed colonies with “increasing tenure with their groups,” supporting the notion that interactions with the same group of individuals repeatedly will “canalize” individuals into behaving in predictable ways.
In other words, spiders over time will fall into particular social niches.
“Jonathan had [made measurements] to two decimal places, to the hundredth of a second. He said he did this with a stopwatch,” Laskowski said in an interview with the Nexus.
“However, there were many values that were duplicated, like exactly down to two decimal places. And so this seems really weird, right? How can two spiders get the exact same time down to a hundredth of a second?”
Pruitt clarified to Laskowski that the reason for this was that the spiders were actually measured in a block design, with multiple spiders simultaneously being observed and followed using the same stopwatch, and “then if two spiders did something at the same time, they recorded the same time,” according to Laskowski.
“To be very honest, I was a little bit frustrated because he hadn’t previously told me this is how he collected the data. This is a paper that I had my name on, and the fact that I didn’t understand this part about the data collection was frustrating, but these to me seemed like honest mistakes,” Laskowski said.
According to Laskowski, Pruitt resolved to publish a correction to the paper. However, Laskowski found more and more anomalies in the data as her examination of the duplicate values proceeded further.
“I thought, if I could figure out what block the spiders were measured in, I could account for this statistically and that would recover some of the power of our analysis,” Laskowski explained.
“So I started looking at the data, trying to find out what blocks spiders were measured in, with the assumption that all of these duplicate values should be occurring in the same observation, like we had multiple observations per animal.”
However, to Laskowski’s dismay, these duplicate values were not occurring in the same observations, but instead in measurements of the same animal that were supposedly taken five weeks apart from one another.
“These duplicate values, that pattern of duplication — it cannot be explained by block design at all,” Laskowski stated.
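The kind of check Laskowski describes — exact latency values recurring across observations of the same animal taken weeks apart — can be illustrated with a short sketch. The data, spider IDs and function name below are invented for illustration only and are not drawn from the actual dataset.

```python
from collections import Counter

# Hypothetical latency values (seconds, to two decimal places), keyed by
# (spider_id, week). These numbers are invented for illustration; the
# real audit examined the published data files.
observations = {
    ("s1", 1): 12.47, ("s1", 6): 12.47,  # same spider, five weeks apart
    ("s2", 1): 8.31,  ("s2", 6): 20.05,
    ("s3", 1): 12.47, ("s3", 6): 9.88,
}

# How often does each exact value appear anywhere in the dataset? With a
# stopwatch read to the hundredth of a second, heavy repetition of any
# single value is improbable.
value_counts = Counter(observations.values())
print(value_counts.most_common(1))  # -> [(12.47, 3)]

def repeats_across_weeks(obs):
    """Spiders whose week-1 and week-6 latencies match to the hundredth."""
    spiders = {sid for sid, _ in obs}
    return sorted(
        sid for sid in spiders
        if obs.get((sid, 1)) == obs.get((sid, 6))
    )

print(repeats_across_weeks(observations))  # -> ['s1']
```

A block design could explain identical values within one observation session, since one stopwatch timed several spiders at once; it cannot explain identical values across sessions held weeks apart, which is why the pattern could not be corrected statistically.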
“The other co-author on the paper, [Pierre-Olivier] Montiglio, and I talked extensively about what I had found, and he and I both agreed, since we couldn’t figure out what was causing the problems in the data, we couldn’t correct it, and this meant that any of the results from our paper were invalid,” Laskowski wrote in her post.
Laskowski contacted Bolnick, requesting a retraction.
As further investigations continue to unveil issues with Pruitt’s data, Pruitt himself is still engaged in ongoing work, conducting field research in the South Pacific and Australia.
While the ultimate consequences are as yet unknown, the investigations into Pruitt’s data will no doubt have an enormous impact on Pruitt, his collaborators and the broader field of animal personality research.
“Whether the problem is data handling error or intentional manipulation, the outcome will be both a series of retractions (the two public ones are just the beginning I fear), and mistrust of unretracted papers,” Bolnick wrote.
“This is harmful to the field, and harmful especially to the authors and co-authors on those papers. Many of them (myself included) were involved in Pruitt-authored papers on the basis of lively conversations generating ideas that he turned into exciting articles,” Bolnick continued.
While the ordeal is no doubt difficult, the handling of the retractions has proven far less controversial than the retractions themselves.
“I think the right course of action going forward is happening. And that we’re now doing it collectively as a community. We have lots of data forensics people and other researchers who are completely uninvolved, and so hopefully have no conflicts of interest,” Laskowski said.
“All of us together are now trying to go through all of the data that Jonathan has provided in his papers to see which datasets appear like they might have some problems that need to be investigated.”
When asked for comment, Director of News and Media Relations at UCSB Andrea Estrada wrote in an email to the Nexus that the university is aware of the allegations, but “cannot discuss specific cases.” Furthermore, “maintaining the highest degree of integrity in all research endeavors is essential to our mission.”
“We have robust procedures on our campus to address instances of research misconduct, and we would cooperate with any other institution conducting an investigation.”
Meanwhile, those hoping to read about Pruitt’s UCSB research on The Current will find themselves redirected to a blank page.