Recently, crowdsourcing has been proposed as a tool for fighting misinformation online. Will internet users listen to crowdsourced fact-checking, and how? In this experiment, we test how participants follow others’ opinions when evaluating the validity of a science-themed Facebook post, and we examine which factors mediate the use of this information. Participants observed a post presenting either scientific information or misinformation, along with a graphical summary of previous participants’ judgements. Even though most participants reported not having used information from previous raters, their responses were influenced by those prior assessments. This happened regardless of whether the prior judgements were accurate or misleading. Presenting crowdsourced fact-checking, however, did not translate into blind copying of the majority response. Rather, participants tended to use this social information as a cue to guide their responses, while also relying on their own evaluation and on searching for additional information. These results highlight the role of individual reasoning in evaluating online information, while pointing to the potential benefit of crowdsourcing-based solutions for making online users more resilient to misinformation.