this week i

heard a tune while in line at starbucks.  thought i could do better.

spent the evening with three of my favorite people.  new rule: only eat ben's chili bowl when at least 66% of the friends i'm dining with are doctors.  unbroken streak so far.

read about the inherent bias in published research, and the public library of science's admirable role in the crusade against it.

figured out parallel processing in the windows 64-bit version of r.
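
here's the rough shape of it, sketched with the snow package and a two-worker socket cluster.  the package choice, cluster size, and toy squaring job are stand-ins, not the actual code.

library( snow )

# socket clusters sidestep the lack of forking on windows
cl <- makeCluster( 2 , type = "SOCK" )

# square a few numbers across the workers, just to confirm they respond
parSapply( cl , 1:10 , function( i ) i^2 )

stopCluster( cl )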


listened to the magnificent eric lewis at the final night of hr-57 on 14th st..

..turn it up.



took an excellent short course on survey sampling, designed not to confer mastery, but to 'bring you to the point where you can interrogate people who do survey sampling.'  critical skills, since surveys are the origin of most of the data sets i work with.  the decennial census is the only survey conducted by the u.s. government that lacks a formal sampling strategy.

the chances of being hit by lightning are very low, but once you've been sampled, it's too late.

a statistic derived by sampling requires a precision statement to make it useful.
 
sample size does not depend on population size, but on the variability within the population and the desired level of precision.
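
for instance, the textbook sample size for estimating a proportion needs only an assumed proportion, a confidence level, and a margin of error.  population size never appears.  the 50% assumption and three-point margin below are purely illustrative.

# n = z^2 * p * ( 1 - p ) / e^2  ..population size shows up nowhere
z <- qnorm( 0.975 )    # 95% confidence
p <- 0.5               # worst-case variability
e <- 0.03              # plus or minus three percentage points

ceiling( z^2 * p * ( 1 - p ) / e^2 )
# about 1,068 completed interviews, whether the population is a city or the whole country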
 
sampling each state to predict the winner of the presidency under america's electoral college system costs about fifty times as much as simply predicting the winner of the popular vote.

if you know something about the population, build that into the design.

if the population is homogeneous, you can get the correct answer from sampling one person.
 
you're looking for a miniature of the population.

if we have a design effect and a simple random sample variance, the product of the two gives us the variance with the clustering taken into account.
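
with toy numbers, assuming a design effect of 1.5 and a 50% proportion measured on 1,000 respondents:

# complex-design variance = design effect x simple random sample variance
deff <- 1.5                                # hypothetical design effect
var.srs <- ( 0.5 * ( 1 - 0.5 ) ) / 1000    # srs variance of a 50% proportion, n = 1,000

var.complex <- deff * var.srs
sqrt( var.complex )                        # standard error once the clustering is accounted for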
 
stratification is always at least as good as, and often better than, simple random sampling.
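
a toy check, with a made-up population split into two very different strata and proportional allocation:

set.seed( 1 )

# fake population: two equal-size strata with different means
pop <- data.frame(
    stratum = rep( c( "a" , "b" ) , each = 5000 ) ,
    y = c( rnorm( 5000 , mean = 10 ) , rnorm( 5000 , mean = 20 ) )
)

# draw a thousand samples of 100 under each design and compare the spread of the estimates
srs.means <- replicate( 1000 , mean( sample( pop$y , 100 ) ) )

strat.means <- replicate( 1000 ,
    mean( c(
        sample( pop$y[ pop$stratum == "a" ] , 50 ) ,
        sample( pop$y[ pop$stratum == "b" ] , 50 )
    ) )
)

var( srs.means )     # larger
var( strat.means )   # smaller, because the between-stratum variation is designed away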


12/16/10