Scholastica · Aug 25, 2019
I don't think there is any "official" litmus test for polling accuracy, but generally the most accurate polls follow certain practices to ensure they get it right.
Firstly, numbers. The more people you poll, the larger your sample of the population, and short of polling everyone in the country, a larger sample gives you the best indication of the national mood. Good polls will usually survey at least 1,000 people.
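The reason roughly 1,000 respondents is the common floor can be seen from the standard margin-of-error formula for a simple random sample. This is a minimal sketch of that calculation (assuming a 95% confidence level and the worst-case 50/50 split):

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error for a proportion p from a simple random sample
    of size n, at the confidence level implied by z (1.96 ~= 95%)."""
    return z * math.sqrt(p * (1 - p) / n)

# A 1,000-person poll is accurate to about +/- 3 points at 95% confidence.
print(round(margin_of_error(1000) * 100, 1))  # -> 3.1
```

Note that the error shrinks with the square root of the sample size, so quadrupling the sample only halves the margin of error. That diminishing return is why pollsters rarely go far beyond a few thousand respondents.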
Then you have randomization. To avoid introducing bias, pollsters aren't supposed to focus on any particular group to the exclusion of others; to the best of their ability, they should give everyone in the target population an equal chance of being polled. The obvious caveat is when you're polling a specific group: you can exclude Ohioans when you're polling voters in Michigan.
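In code, "equal chance of being polled" is just uniform sampling without replacement from the target population. A minimal sketch (the population list here is hypothetical, standing in for whatever contact list a real pollster draws from):

```python
import random

# Hypothetical contact list for the target population (e.g. registered
# voters in one state) -- anyone outside it is already excluded.
population = [f"voter_{i}" for i in range(100_000)]

random.seed(42)  # seeded only so the sketch is reproducible
# random.sample picks 1,000 distinct people, each equally likely.
sample = random.sample(population, k=1000)
```

Real polls layer practical methods (random-digit dialing, address-based sampling) on top of this idea, but the goal is the same: no one in the target population is systematically more likely to be asked than anyone else.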
The questions on the poll, and the answers allowed, also matter. Leading or loaded questions, refusing to accept certain answers, and other subtle ways of pushing toward a desired result can all skew the poll, because respondents are being railroaded down a particular path.
Finally, there's weighting the results. This part is a bit tricky because it involves counting some respondents more heavily in the final result based on certain demographic categories. Even with randomization, not everyone actually responds to polls, and some demographics are more likely to respond than others, which skews the raw numbers even if you did your best to ask everyone fairly. So pollsters give more weight to underrepresented groups so the results better reflect the population being polled.
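The weighting step above can be sketched as a simple post-stratification calculation. The group names and shares here are made up for illustration: each respondent's weight is their group's known population share divided by that group's share of the actual respondents.

```python
# Known population shares (e.g. from census data) -- hypothetical values.
population_share = {"group_a": 0.5, "group_b": 0.5}
# Who actually answered the poll: group_a under-responded.
sample_counts = {"group_a": 300, "group_b": 700}

n = sum(sample_counts.values())
weights = {
    group: population_share[group] / (sample_counts[group] / n)
    for group in sample_counts
}
# group_a respondents count extra (0.5 / 0.3 ~= 1.67),
# group_b respondents count less (0.5 / 0.7 ~= 0.71),
# so the weighted sample matches the population's 50/50 split.
```

A sanity check on the design: the weighted respondent counts sum back to the original sample size, with each group's weighted total matching its population share.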