
Selection of past projects

My past work falls under one central theme: the effect of group identity on belief, that is, how partisan identity shapes (whether in a biased or a rational manner) belief formation, belief updating, and information seeking.


Partisan Belief Updating


Do people update rationally (believing less after seeing a correction), fail to update (believing to the same extent after a correction), or backfire (believing even more after seeing the correction)? In three experiments (N = 1,207), we explored whether and how Americans update their beliefs after seeing a piece of misinformation on Twitter followed by a tweeted correction. We manipulated the party identity of both the initial tweeter and the correcting tweeter to test for partisan group bias in initial belief and in belief updating: do participants believe less when the initial information comes from an outgroup rather than an ingroup member, and do they update less when the correction comes from an outgroup member? We found that individuals were much more likely to believe information that came from ingroup members, and we found some evidence that they updated their beliefs more when corrections came from ingroup members than from outgroup members.


Older Partisans' Sharing of Misinformation


In the first of two studies, we examined the sharing of misinformation on Facebook in the lead-up to a presidential election (N = 1,191 Americans, 485 of whom shared fake news). We found evidence of an effect of age on sharing fake news (older individuals shared more fake news than younger individuals), but this effect was robust (i.e., significant with and without outliers) only for Republicans, although it trended in the same direction for Democrats and Independents. In the second study, we experimentally manipulated the news content evaluated by a sample of elderly Americans (aged 60-87; N = 244). We found a partisan bias among both elderly Democrats and elderly Republicans in their reported willingness to share fake news. This second study suggests that elderly members of both parties are more willing to share negative news about their outgroup than about their ingroup.


Partisan Bias in Belief in Misinformation


We tested three competing theoretical accounts invoked to explain the rise and spread of political (mis)information: the ideological values hypothesis (people prefer news that bolsters their values and worldviews), the confirmation bias hypothesis (people prefer news that fits their preexisting stereotypical knowledge), and the political identity hypothesis (people prefer news that allows them to believe positive things about political ingroup members and negative things about political outgroup members). In three experiments (N = 1,420), participants from the United States read news describing actions perpetrated by their political ingroup or outgroup. Consistent with the political identity hypothesis, both Democrats and Republicans were more likely to believe news about the value-upholding behavior of their ingroup or the value-undermining behavior of their outgroup. Belief was positively correlated with willingness to share on social media in all conditions, but Republicans were more likely to believe and want to share apolitical fake news.
