Lawrence Sanna departed the University of Michigan amid questions about his work raised by ‘data detective’ Uri Simonsohn.
Uri Simonsohn, the researcher who flagged up questionable data in studies by social psychologist Dirk Smeesters, has revealed the name of a second social psychologist whose data he believes to be suspiciously perfect.
That researcher is Lawrence Sanna, whose former employer, the University of Michigan in Ann Arbor, tells Simonsohn that he resigned his professorship there at the end of May. The reasons for Sanna’s resignation are not known, but it followed questions from Simonsohn and a review by Sanna’s previous institution, the University of North Carolina in Chapel Hill (UNC). According to the editor of the Journal of Experimental Social Psychology, Sanna has also asked that three of his papers be retracted from the journal.
In both Smeesters’ and Sanna’s work, odd statistical patterns in the data caught the attention of Simonsohn, who is based at the University of Pennsylvania in Philadelphia. But the similarity between the cases ends there. Smeesters’ resignation was announced on 25 June by his institution, Erasmus University Rotterdam in the Netherlands, which undertook a review and concluded that two of his papers should be retracted. Sanna’s resignation, by contrast, remains mysterious: UNC did not release the results of its review, and the University of Michigan will not explain why Sanna resigned.
“Unlikely” data
Sanna’s research covers areas of psychology including judgement, decision-making and morality. Last year, his work attracted media coverage (and so far, four citations) for showing that people behave more altruistically if they are physically elevated, for example by riding an ascending escalator [1]. This link between physical height and moral virtue is an example of embodied cognition, a growing area of psychology that looks at how the body and environment influence the mind.
Simonsohn was looking through the literature on embodied cognition last July when he noticed Sanna’s elevation study. “The evidence was very strong compared to the other papers and it puzzled me,” he says. “Every result was super-significant, and there were very large effects.”
In one experiment, Sanna measured volunteers’ willingness to make other people eat hot sauce in order to cause them pain. Volunteers were tested after walking up a staircase, walking down one, or staying on level ground. Simonsohn noticed that although the results for the three conditions had different means, they had uncannily similar standard deviations. “I ran simulations and the similarity was extremely unlikely for proper random samples,” he says.
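The sketch below is not Simonsohn’s actual procedure, only a rough illustration of the general logic he describes, written in Python with entirely made-up numbers: the sample size, the ‘reported’ standard deviations and the assumption of normally distributed scores are all placeholders, not values from Sanna’s studies. It estimates how often three independent random samples would produce standard deviations as tightly clustered as a given set.

```python
# Illustrative simulation: how often do independent random samples yield
# standard deviations as similar as a set of reported ones?
# All numbers below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)

n = 30                               # hypothetical participants per condition
reported_sds = [25.1, 25.3, 24.9]    # hypothetical, suspiciously similar SDs
observed_spread = max(reported_sds) - min(reported_sds)

pop_sd = np.mean(reported_sds)       # assume a common population SD
n_sims = 100_000
hits = 0
for _ in range(n_sims):
    # draw three independent samples and compare the spread of their SDs
    sds = [rng.normal(0, pop_sd, n).std(ddof=1) for _ in range(3)]
    if max(sds) - min(sds) <= observed_spread:
        hits += 1

print(f"P(SD spread <= {observed_spread:.2f}) ≈ {hits / n_sims:.4f}")
```

If only a tiny fraction of the simulated datasets show standard deviations as similar as the reported ones, then the reported pattern is, in this loose sense, “extremely unlikely for proper random samples”.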
Simonsohn found three other papers by Sanna [2–4] and several from other researchers that used one of the methods from the elevation paper: a cooperation game that involved fishing. “When other authors ran this paradigm, they got healthy-looking standard deviations. But when Sanna did, he got very similar ones,” says Simonsohn.
In September, Simonsohn sent an eight-page report detailing his concerns to Sanna and two of his senior co-authors. In return, he received raw data that revealed, for example, almost identical ranges (the gap between the maximum and minimum data points) across different conditions. “That’s extremely rare,” Simonsohn says.
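The same simulation logic can be applied to the range. Again as a hedged illustration rather than a reconstruction of the actual analysis, the short sketch below uses assumed values for the population spread and the tolerance within which ranges are counted as “almost identical”.

```python
# Illustrative simulation: how often do independent random samples share an
# almost identical range (maximum minus minimum)? Parameters are assumptions.
import numpy as np

rng = np.random.default_rng(1)

n, n_conditions, n_sims = 30, 3, 100_000
pop_sd = 25.0        # hypothetical population standard deviation
tolerance = 0.5      # hypothetical: ranges agree to within half a scale point
matches = 0
for _ in range(n_sims):
    ranges = [np.ptp(rng.normal(0, pop_sd, n)) for _ in range(n_conditions)]
    if max(ranges) - min(ranges) <= tolerance:
        matches += 1

print(f"P(all ranges agree within {tolerance}) ≈ {matches / n_sims:.4f}")
```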
Simonsohn exchanged e-mails with Sanna and his co-authors throughout October, offering to discuss his concerns. Eventually, the replies stopped. When Simonsohn contacted three graduate students (who each appeared as co-authors on at least one of the four papers), all said that they were not involved in collecting or analysing data. Simonsohn adds that he has no evidence or suggestion of any data manipulation by the co-authors.
University investigation
UNC, where Sanna was working when those papers were published, contacted Simonsohn on 5 December 2011. The university had already heard about his concerns and had begun an enquiry, and Simonsohn provided it with the information he had collated.
The results of that investigation are not public. UNC sent Nature a statement on 6 July saying that it could not comment further “because the review involved confidential personnel information”. Robert Lowman, associate vice-chancellor for research at UNC, told Nature that North Carolina law prevents the university from speaking about employee matters.
Judith Nowack, associate vice-president for research at the University of Michigan, sent Simonsohn an e-mail on 28 June saying that Sanna had resigned. But spokesperson Kelly Cunningham, who was copied on that e-mail, would not confirm or deny this statement to Nature. “The University does not typically discuss personnel or employment records beyond certain public information,” she wrote.
The editor of the Journal of Experimental Social Psychology, Joel Cooper of Princeton University in New Jersey, says that Sanna wrote to him asking that three of his articles, published between 2009 and 2011, be retracted from the journal. Cooper says that he will comply, although he has not heard from the University of Michigan or UNC, and knows nothing about the circumstances leading to the request.
Sanna has not responded to Nature’s requests for interviews, and neither have the co-authors on his papers, apart from psychologist Craig Parks of Washington State University in Pullman. Parks says that he only helped to design the research protocol for the projects he was connected to, and was not involved with the data. He knows nothing about the investigation’s outcome. “I am very surprised,” he says. “I’m having a hard time wrapping my head around it.”
Next week, Simonsohn plans to submit a paper to Psychological Science documenting the full statistical techniques that he used to investigate Smeesters and Sanna, along with his code and data. In the meantime, he is aware that his work could be seen as a blow for a field already reeling from the case of Diederik Stapel, a prominent social psychologist who last year was found guilty of large-scale research fraud (see Nature 479, 15; 2011).
“Some people are concerned that this will damage psychology as a whole and the public will perceive an epidemic of fraud,” says Simonsohn. “I think that’s unfounded.” He notes that retractions are common in many fields, and cites the case of anaesthesiologist Yoshitaka Fujii, who was recently found to have fabricated data in at least 172 papers.
“We in psychology are actually trying to fix things,” he says. “It would be ironic if that led to the perception that we are less credible than other sciences are. My hope is that five years from now, other sciences will look to psychology as an example of proper reporting of scientific research.”
Nature doi:10.1038/nature.2012.10968
References
1. Sanna, L. J., Chang, E. C., Miceli, P. M. & Lundberg, K. B. J. Exp. Soc. Psychol. 47, 472–476 (2011).
2. Sanna, L. J., Chang, E. C., Parks, C. D. & Kennedy, L. A. Psychol. Sci. 20, 1319–1321 (2009).
3. Sanna, L. J., Parks, C. D. & Chang, E. C. Group Dyn. Theor. Res. 7, 26–40 (2003).
4. Sanna, L. J., Lundberg, K. B., Parks, C. D. & Chang, E. C. J. Exp. Soc. Psychol. 46, 1126–1129 (2010).