
Predicting cooperation: One good deed often leads to another

Two Vanderbilt engineers have built a computational model of human behavior based on game theory and techniques from machine learning that reliably predicts a player’s most likely behavior in game after game in a setting where there is a significant tension between cooperative and selfish behavior.

“In essence, we can use our approach to identify a combination of factors that makes selfless behavior, cooperation, significantly more frequent,” said Assistant Professor of Computer Science and Computer Engineering Yevgeniy Vorobeychik.

Vorobeychik and graduate student John Nay, MS’13, studied the Prisoner’s Dilemma, a game widely used to understand the tension between social and individual interests. Game theory is a branch of applied mathematics primarily used to model how people behave in strategic situations.

In the Prisoner’s Dilemma, players must decide whether to cheat or cooperate with a partner. In a single round of the Prisoner’s Dilemma, the best strategy is to cheat—squeal on your partner and you’ll get less jail time. But if the game repeats over and over, cooperation becomes achievable. The model created by Nay and Vorobeychik relies on data they integrated from thousands of experimental observations: 168,386 real decisions in 30 different game scenarios. That’s a big difference from similar studies that use only a few scenarios, according to the researchers.
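
To make the single-round logic concrete, here is a minimal Python sketch of the Prisoner's Dilemma payoff structure. The specific payoff numbers and function names are illustrative assumptions, not the parameters used in the study; the point is only that defecting is the better single-round reply no matter what the partner does.

```python
# Illustrative Prisoner's Dilemma payoffs (higher is better for the player).
# These particular numbers are an assumption for demonstration, not values from the study.
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # both stay quiet
    ("cooperate", "defect"):    (0, 5),  # the lone defector does best
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),  # mutual defection hurts both
}

def best_single_round_reply(opponent_action):
    """In one isolated round, defecting beats cooperating against either partner move."""
    return max(("cooperate", "defect"),
               key=lambda my_action: PAYOFFS[(my_action, opponent_action)][0])

for opp in ("cooperate", "defect"):
    print(f"If the partner plays {opp!r}, the best single-round reply is "
          f"{best_single_round_reply(opp)!r}")
```

Run repeatedly with an uncertain end point, however, the same payoffs leave room for reciprocity, which is why cooperation can emerge in the repeated game.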

“An important number to remember here is 86 percent. Our relatively simple model predicts the next action a player will take at the individual level with 86 percent accuracy. This is a remarkably good prediction,” said Nay, an interdisciplinary doctoral candidate in the School of Engineering and a research fellow in Vanderbilt Law School’s Program on Law and Innovation.
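
For readers who want a feel for what "predicting the next action at the individual level" means in practice, the sketch below trains a generic classifier on synthetic per-decision features and reports held-out accuracy. The feature names, the synthetic data, and the use of a random forest are assumptions made for illustration only; the published model was fit to the 168,386 real decisions described above, and the 86 percent figure comes from that work, not from this toy example.

```python
# A minimal sketch, assuming hypothetical per-decision features, of a next-action
# prediction task like the one described in the article.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000

# Hypothetical features: the player's and partner's previous moves
# (1 = cooperate, 0 = defect), the round number, and the game's continuation probability.
X = np.column_stack([
    rng.integers(0, 2, n),      # my previous action
    rng.integers(0, 2, n),      # partner's previous action
    rng.integers(1, 11, n),     # round number
    rng.uniform(0.5, 0.95, n),  # probability the game continues another round
])

# Synthetic target: players mostly reciprocate the partner's last move, with noise.
y = np.where(rng.random(n) < 0.85, X[:, 1], 1 - X[:, 1]).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print(f"Held-out next-action accuracy: {accuracy_score(y_test, model.predict(X_test)):.2f}")
```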

From experiments in published papers and publicly available data sets that used real people and real financial incentives, the pair built a comprehensive collection of game structures and individual decisions—what Nay, the first author, calls a “meta study.” Their paper, “Predicting Human Cooperation,” was published in May in PLOS ONE.

“Our model is successful, not merely at fitting the data, but in predicting behavior at multiple scales,” Nay said. “For example, in one of the game designs, the model predicted the initial (high) level of cooperation almost exactly and then perfectly matched the observed cooperation level throughout the next seven periods of play. If we can build models that accurately predict human behavior, then we can run simulation experiments to find institutional and policy designs that increase cooperation and other desirable social outcomes, and our work is a step in that exciting direction.”
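
The period-by-period comparison Nay describes can be pictured with a simple simulation: seed a population of simulated players, let each player's chance of cooperating depend on how much cooperation they saw last period, and track the aggregate cooperation rate over successive rounds. The decision rule and parameters below are assumptions for illustration, not the fitted model from the paper.

```python
# A hedged sketch of a period-by-period cooperation simulation.
# The reciprocity rule and all parameter values are illustrative assumptions.
import random

random.seed(1)

def simulate(n_players=100, n_periods=8, initial_coop=0.7, reciprocity=0.8):
    """Return the fraction of cooperators observed in each period."""
    coop_rate = initial_coop
    history = []
    for _ in range(n_periods):
        # Each player cooperates with a probability that blends a 50/50 baseline
        # with the cooperation rate observed in the previous period.
        choices = [random.random() < (1 - reciprocity) * 0.5 + reciprocity * coop_rate
                   for _ in range(n_players)]
        coop_rate = sum(choices) / n_players
        history.append(coop_rate)
    return history

print([round(rate, 2) for rate in simulate()])
```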

Funding

The research is partially supported by U.S. National Science Foundation grants EAR-1416964, EAR-1204685, and IIS-1526860; the U.S. Department of Energy Solar Energy Evolution and Diffusion Studies grant; and the Office of Naval Research (N00014-15-1-2621).

Top Photo: Graduate student John Nay and Assistant Professor of Computer Science and Computer Engineering Yevgeniy Vorobeychik used more than 168,000 real-life decisions made in 30 game scenarios to develop a highly reliable model for predicting how well people will cooperate with each other.