Where differences occurred, they were especially large on three broad types of questions: Items that asked the respondent to assess the quality of their family and social life produced differences of 18 and 14 percentage points, respectively, with those interviewed on the phone reporting higher levels of satisfaction than those who completed the survey on the Web.
Questions about societal discrimination against several different groups also produced large differences, with telephone respondents more apt than Web respondents to say that gays and lesbians, Hispanics and blacks face a lot of discrimination. However, there was no significant mode difference in responses to the question of whether women face a lot of discrimination.
Web respondents were far more likely than those interviewed on the phone to give various political figures a “very unfavorable” rating, a tendency that was concentrated among members of the opposite party of each figure rated.
Statistically significant mode effects also were observed on several other questions. Telephone respondents were more likely than those interviewed on the Web to say they often talked with their neighbors, to rate their communities as an “excellent” place to live and to rate their own health as “excellent.” Web respondents were more likely than phone respondents to report being unable to afford food or needed medical care at some point in the past twelve months.
Way down at the bottom here: opt-in polling from a web-based “interview.” So probably nobody even remotely tech savvy (aka anyone with an ad blocker) ever participated. But anyhow, here’s what Pew Research has to say about the effects of interview mode (excerpted above):
Ultimately, the legitimacy of the poll would depend on where they solicited their subjects. You’re likely to get a far different answer with advertisements on Truth Social than you would with advertisements on, let’s say, a Palestinian-American subreddit. But that wasn’t addressed in the report, so we’ll never really know.
Opt-in polling is so bad. It means you only get answers from people with strong opinions. These are polls where the results are shaped like a U instead of a bell curve, so they rarely represent the middle 95 percent of the population.
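As a rough illustration of that claim, here is a minimal simulation sketch (every number in it is invented for the example) of how self-selection on opinion strength can hollow out the middle of an otherwise bell-shaped population and push the observed responses toward a U shape:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: opinion scores from roughly -5 (strongly against)
# to +5 (strongly in favor), bell-shaped around neutral.
population = rng.normal(loc=0.0, scale=2.0, size=100_000)

# Self-selection sketch: assume the chance of bothering to answer an opt-in
# poll grows with how strongly someone feels, i.e. with |opinion|.
respond_prob = np.clip(np.abs(population) / 5.0, 0.05, 1.0)
respondents = population[rng.random(population.size) < respond_prob]

# The population is bell-shaped; the self-selected sample is thinner in the
# middle and heavier at the extremes (closer to a U).
for label, x in [("population", population), ("respondents", respondents)]:
    near_neutral = np.mean(np.abs(x) < 1.0)
    at_extremes = np.mean(np.abs(x) > 3.0)
    print(f"{label}: near neutral {near_neutral:.2f}, at extremes {at_extremes:.2f}")
```

The response-probability rule here is pure invention; the point is only that who bothers to answer can reshape the curve regardless of what the population actually thinks.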
I don’t think it’s opt-in in that way, though. They have a pre-existing list of millions of pre-screened people and they’re selecting a representative sample from that list. FiveThirtyEight ranks them fairly highly among pollsters – certainly high enough for this opinion poll to be considered accurate.
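For what it’s worth, here is a toy sketch of what “selecting a representative sample from a large panel” can look like in its simplest quota-style form. This is not YouGov’s actual procedure, and the panel, demographics, and targets below are all made up for illustration:

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)

# Hypothetical opt-in panel: many members, but with skewed demographics.
panel = pd.DataFrame({
    "age_group": rng.choice(["18-34", "35-54", "55+"], size=50_000, p=[0.5, 0.3, 0.2]),
})

# Hypothetical population targets (e.g. taken from census figures).
targets = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}

# Quota-style draw: sample each group in proportion to its population share,
# not its share of the panel.
sample_size = 1_500
parts = []
for group, share in targets.items():
    members = panel[panel["age_group"] == group]
    n = int(round(sample_size * share))
    parts.append(members.sample(n=n, random_state=2))
sample = pd.concat(parts)

# The drawn sample now mirrors the target shares rather than the panel's skew.
print(sample["age_group"].value_counts(normalize=True).round(2))
```

In practice pollsters typically layer demographic weighting on top of a draw like this, but even the toy version shows why a large pre-screened panel is not the same thing as a raw opt-in poll.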
FiveThirtyEight only grades them on their ability to predict American election results. I don’t think that’s the same as vouching for their efficacy at producing first-rate public opinion polls.
You’re right, kinda. Issue polling is generally better than horse-race polling, and YouGov is no exception.
It’s a serious issue when there’s a clear political bias in the founders. They put more effort into steering the narrative than into reporting it objectively.
Sure, and were there a clear bias, your comment would have value.
If you haven’t seen any clear bias, then your outlook is probably close enough to theirs that you wouldn’t notice. And that’s fine; I don’t require everyone to have the same opinion as me for their comments to have value.
My impression of bias is probably born of the leading polls that right-wing media and think tanks in the UK commission them to do. You can fairly argue that these polls are externally commissioned, so their tenor is a product of the issuer, not YouGov. But the overall impression I got was that they could be readily depended on to produce misleading propaganda against Labour whenever the party wasn’t being run by corporate technocrats.
It’s still selecting from a list of people who have something to say, though.
As far as how accurately they represent broad swaths of America… well, that’s a different matter. I would expect your average American to be far more lukewarm about any given subject than the respondents to a poll.
That’s probably just a problem with polls in general, though – people who won’t answer aren’t included. But they’re saying that 26–32% of Americans are “unsure,” and that sounds pretty lukewarm. Their methodology does sound odd to me too, but if it were flawed it would show up in the election data, right? Elections are a brutal testing ground, and hundreds of their surveys have been predictive and high quality on average.
Agreed on all of that.
I would have guessed a third are “wtf, stop it,” a third are “bomb them harder!”, and then there’s everyone else just doing their thing: going to work, going to school.