Process Analysis Sampling Plans - Computational Approach
In principle, the computational approach to the question of how large a sample to take is straightforward. Elementary Concepts discusses the concept of the sampling distribution. Briefly, if we were to take repeated samples of a particular size from a population of, for example, piston rings and compute their average diameters, then the distribution of those averages (means) would approach the normal distribution with a particular mean and standard deviation (or standard error; for sampling distributions the term standard error is preferred, to distinguish the variability of the means from the variability of the individual items in the population).

Fortunately, we do not need to take repeated samples from the population to estimate the location (mean) and variability (standard error) of the sampling distribution. If we have a good estimate of the variability (standard deviation, or sigma) in the population, then we can infer the sampling distribution of the mean. In principle, this information is sufficient to estimate the sample size needed to detect a specified change in quality (a departure from target specifications).
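The reasoning above can be sketched numerically. The following is a minimal illustration (not the exact formula used by the Process Analysis module) of the standard normal-theory calculation: given the population sigma, the smallest shift in the mean we want to detect, a significance level, and a desired power, solve for the fixed sample size n. The function name, defaults, and the piston-ring numbers in the usage note are illustrative assumptions.

```python
from math import ceil
from statistics import NormalDist  # standard-library normal quantiles

def sample_size_for_mean_shift(sigma, delta, alpha=0.05, power=0.80):
    """Fixed sample size needed to detect a shift of `delta` in the
    process mean, given population standard deviation `sigma`, a
    two-sided significance level `alpha`, and the desired `power`.

    Uses the normal sampling distribution of the mean, whose standard
    error is sigma / sqrt(n), giving
        n = ((z_{1 - alpha/2} + z_{power}) * sigma / delta) ** 2
    """
    z = NormalDist()                    # standard normal distribution
    z_alpha = z.inv_cdf(1 - alpha / 2)  # critical value for the test
    z_beta = z.inv_cdf(power)           # quantile for the desired power
    return ceil(((z_alpha + z_beta) * sigma / delta) ** 2)

# Example: detect a 0.005 mm shift in mean piston-ring diameter when
# sigma is 0.01 mm, at alpha = 0.05 with 80% power.
n = sample_size_for_mean_shift(sigma=0.01, delta=0.005)
print(n)  # → 32
```

Note that n grows with the square of sigma/delta: halving the shift to be detected quadruples the required sample size.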
Sample size estimation for binomial fractions and Poisson frequencies. The sampling plans options in the Process Analysis module use the normal approximation to the binomial and Poisson distributions to estimate the required fixed sample sizes. These approximations are described, for example, in Duncan (1986), and are consistent with the approach used in quality control charting (see also Quality Control for additional details). Note that power analysis applications in the biomedical sciences usually rely on the exact formulas rather than the normal approximations, so those analyses may yield slightly different results. If your version of Statistica does not include the Statistica Power Analysis module, contact Statistica for information about the availability of this module.