Social computing is a broad research area at the intersection of computer science, economics, and other social sciences. It concerns both harnessing human intelligence for computational tasks and designing computational systems that support social behavior and interaction. Examples that showcase the power of social computing include Wikipedia, a crowd-generated online encyclopedia; Zooniverse, a crowdsourcing platform for scientific problems; the Iowa Electronic Markets, which elicit and aggregate information about political events; the Amazon Mechanical Turk platform, a market for “human intelligence tasks”; TopCoder, for competitive code development; and the DARPA “red balloon challenge,” in which teams competed to locate ten large weather balloons placed at undisclosed locations across the United States.
Despite this potential for broad application, surprisingly little is known about how to design effective and efficient social computing systems, or about the fundamental limitations of such systems.
At Harvard, we conduct both theoretical and experimental research aimed at understanding the design levers of social computing systems, how multiple levers interact, and behavioral models of the participants in these systems, while pushing the frontier of novel social computing applications.
Some examples of recent research efforts include:
- Algorithms for information dissemination: We study fundamental structural properties of social networks and develop algorithmic frameworks that exploit these properties to spread information more effectively.
- Online labor market design: We study how to match tasks with workers and how to price tasks to improve the efficiency of online labor markets. We also investigate the design of reputation systems for these markets.
- Experimental studies of incentives in crowdsourcing: We run human-subject experiments to understand how financial, social, and psychological incentives affect participants' behavior in social computing systems, and we apply these insights to improve the design of crowdsourcing workflows.
- Information elicitation: We study the fundamental problem of eliciting private information from individuals. When the information can later be verified or objectively evaluated, market mechanisms can be designed to elicit it accurately. In the more challenging setting where the elicited information cannot be verified, rewarding an answer based on how it compares with other people's answers can still induce accurate reporting. We study these mechanisms both theoretically and experimentally.
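To make the dissemination theme above concrete, the sketch below simulates the independent-cascade model, a standard abstraction of how information spreads through a network. The toy graph, the uniform transmission probability, and all names are illustrative, not a description of our actual algorithmic frameworks.

```python
import random

def independent_cascade(graph, seeds, p=0.1, rng=None):
    """Simulate one run of the independent-cascade model.

    graph: dict mapping each node to a list of its neighbors
    seeds: the initially informed nodes
    p:     probability that an informed node informs each neighbor (illustrative)
    Returns the set of nodes that end up informed.
    """
    rng = rng or random.Random()
    informed = set(seeds)
    frontier = list(seeds)
    while frontier:
        newly_informed = []
        for node in frontier:
            for neighbor in graph.get(node, []):
                # Each newly informed node gets one chance per neighbor.
                if neighbor not in informed and rng.random() < p:
                    informed.add(neighbor)
                    newly_informed.append(neighbor)
        frontier = newly_informed
    return informed

# Toy network: two triangles joined by a single bridge edge (2, 3).
g = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2, 4, 5], 4: [3, 5], 5: [3, 4]}
reached = independent_cascade(g, seeds={0}, p=0.5, rng=random.Random(7))
```

Structural properties of the network matter here: the bridge edge between the two communities is the bottleneck that determines whether a cascade starting in one community ever reaches the other.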
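The elicitation bullet above distinguishes verifiable from unverifiable information. The following is a minimal sketch of both mechanism styles, assuming a quadratic (Brier) proper scoring rule for the verifiable case and an output-agreement rule in the spirit of peer prediction for the unverifiable case; the function names and toy data are illustrative, not mechanisms from our papers.

```python
def quadratic_score(report, outcome):
    """Quadratic (Brier) proper scoring rule.

    report:  dict mapping each possible outcome to a reported probability
    outcome: the outcome that was actually realized
    Reporting one's true beliefs maximizes the expected score.
    """
    return 2 * report.get(outcome, 0.0) - sum(p * p for p in report.values())

def output_agreement_rewards(answers, reward=1.0):
    """Output-agreement rule for unverifiable answers: each respondent's
    reward is proportional to the fraction of peers giving the same answer."""
    rewards = {}
    for i, ans in answers.items():
        peers = [a for j, a in answers.items() if j != i]
        agree = sum(1 for a in peers if a == ans)
        rewards[i] = reward * agree / len(peers) if peers else 0.0
    return rewards

# Verifiable case: a confident, correct forecast outscores a hedged one.
high = quadratic_score({"rain": 0.9, "dry": 0.1}, "rain")
flat = quadratic_score({"rain": 0.5, "dry": 0.5}, "rain")

# Unverifiable case: agreement with peers determines the reward.
rewards = output_agreement_rewards({"alice": "yes", "bob": "yes", "carol": "no"})
```

The scoring rule rewards calibrated honesty when ground truth eventually arrives; the agreement rule substitutes peers' reports for ground truth, which is exactly why its incentive properties are subtler and worth studying both theoretically and experimentally.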