We present a study of user voting on three websites: IMDb, Amazon, and BookCrossing. We report on an expert evaluation of each website's voting mechanism and a quantitative analysis of users' aggregate voting behavior. Our results suggest that websites with a higher barrier to voting see a relatively high number of one-off voters and appear to attract mostly experts. We also find that one-off voters tend to vote on popular items, while experts mostly vote on obscure, low-rated items. We conclude with design suggestions to address the "wisdom of the crowd" bias.

Keywords: voting, rating, quantitative analysis, expert evaluation.