The spread of false and misleading news on social media is of great societal concern. Why do people share such content, and what can be done about it? To address these questions, we use the lens of cognitive science and examine sharing on Twitter. In a first study, we investigated the relationship between individual differences in cognitive reflection and behavior in a sample of almost 2,000 Twitter users in a hybrid lab-field setup. The results show that people who rely on intuitive gut responses over analytical thinking follow more questionable accounts and share more low-quality content, in particular political misinformation and get-rich-quick scams. We also found evidence of “cognitive echo chambers,” in which users who rely more heavily on their intuition tend to follow similar accounts. In a second study, we developed a subtle intervention that nudges people to think about accuracy while on social media. We messaged over 5,000 Twitter users who had previously shared links to misinformation sites and asked them to rate the accuracy of a single non-political headline, thereby making the concept of accuracy more top of mind so that they would be more likely to think about accuracy when they returned to their newsfeed. And indeed, we find that the message significantly improved the quality of the news content these users subsequently shared. Our experimental design translates directly into an intervention that social media companies could deploy to fight misinformation online.