Nov 27, 1997
Yes, my post was written before the survey data was posted. Some of my conclusions need to be changed in light of the new data. Some quick thoughts for now, possibly more later...
- On bias in the questions: the survey questions seem to be less biased than I had originally thought. It seems that whoever phrased the graph captions alongside the article felt a need to play with the verbs a bit.
- On public schools - I found it interesting that while few of the superconnected had confidence in public schools, even fewer had confidence in American schools generally. Yes, there appears to be a general disenchantment with the school system, but not toward public schools specifically, as the article seems to suggest.
- On the sample size: I remain unconvinced that 1,400 people is a sufficiently large sample, but I am ready to be corrected on this point; I know that the margin of error can be calculated mathematically. That said,
- as noted above, 28 people is still too small a sample from which to infer any conclusions, and
- the new information in the survey data, that only 400 people were selected, is highly suspicious. I would like to know the methodology here.
- The division of the data into the four groups is still problematic. I see no good reason for the pollsters to have done that, and it obscures the data.
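The margin-of-error point above can be made concrete. A quick sketch of the standard normal-approximation formula for a sampled proportion, assuming a 95% confidence level (z = 1.96) and the worst-case proportion p = 0.5 - the sample sizes are the ones under discussion:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion
    estimated from a simple random sample of size n."""
    return z * math.sqrt(p * (1 - p) / n)

# The three sample sizes in question
for n in (1400, 400, 28):
    print(f"n = {n:4d}: +/- {margin_of_error(n) * 100:.1f} points")
# n = 1400: +/- 2.6 points
# n =  400: +/- 4.9 points
# n =   28: +/- 18.5 points
```

On this rough arithmetic, 1,400 respondents is actually a respectable sample; it is the 28-person subgroup, with a swing of nearly 19 points either way, that supports no conclusions at all - and the formula assumes a truly random sample, which is exactly the methodological question raised above.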
Some thoughts on pagers: perhaps there's some aspect of American culture that I'm just not getting here - in Canada, at least, pager service is more expensive than phone service, and anyway, almost everyone has a phone. So a pager becomes a device only self-employed people and professionals use. In fact, I have never seen a pager used for personal reasons. Until I see some evidence to the contrary, I will stand by my belief that pagers are used mostly by businesspeople. And please don't cite demographics unless you have a source you can name - citing 'confidential' data isn't useful. Surely there is census information on this, isn't there?
As a real aside: it would be interesting to see some of the demographics gathered in the survey - e.g., salary, location, employment type - compared with U.S. census statistics to see how representative the sample was.
Finally (for now): Jon - yes, you expressed doubts about some of the information in the article. Fair enough. But before setting pen to paper, you ought to have investigated the survey more thoroughly. You say you didn't see the raw data. To publish an article in an international magazine - a title article yet - without having examined the data for yourself was irresponsible.
In some of your other articles, Jon, including one this week, you complain about the media sensationalizing and failing to verify facts and information. That's exactly what you did here. And while doubts about the survey can be (and are) raised in forums such as this, many readers approach such an article with the attitude that the data must be representative - that the author at least checked it.
I know you want to focus on how this picture compares with your earlier article. I want to look at this too. But before we start comparing findings, we have to establish the validity of the findings. It is a very common fallacy in newspapers and magazines to offer explanations for putative 'facts' which turn out, on analysis, to be pure fiction.