### Reducing Gage RR can be problematic, especially when the Gage employed and the process used to collect the data are the best available under the current circumstances.

During the Measure phase of Six Sigma, a Black Belt will perform a Gage R&R (GRR) study. This type of Measurement Systems Analysis (MSA) quantifies the error due to repeatability and reproducibility.

Repeatability refers to the inability of a Gage to return the same value when measuring the same feature of a part, in the same place, repeatedly.

Reproducibility refers to the error between operators using the same Gage to measure the same feature on the same parts.

Knowing the Gage repeatability and the operator reproducibility, we can combine these two standard deviations (as the square root of the sum of their variances) and compute a GRR value.
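As a quick sketch of that combination (the standard deviations below are hypothetical, not figures from this article), the two components combine as a root sum of squares:

```python
import math

# Hypothetical standard deviations, for illustration only:
repeatability = 0.030    # equipment variation (EV)
reproducibility = 0.020  # appraiser variation (AV)

# The combined GRR standard deviation is the root sum of squares
grr = math.sqrt(repeatability**2 + reproducibility**2)
print(round(grr, 4))  # 0.0361
```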

Depending on the intended use of the Gage, we compare the Gage RR to the process spread (process variation) or to the customer’s tolerance spread. We can then assess whether the measurement system is suitable to detect process changes or to verify a critical customer characteristic. If the measurement system is suitable, there is no need to improve Gage repeatability or operator reproducibility.
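For illustration, %GRR against a customer tolerance is the 6-sigma GRR spread divided by the tolerance width (the tolerance value here is an assumed number, chosen only to show the arithmetic):

```python
# Assumed numbers for illustration:
grr_spread = 0.24   # 6 * s_GRR, the 6-sigma spread of the measurement error
tolerance = 0.94    # customer tolerance width (USL - LSL), hypothetical

pct_grr = 100 * grr_spread / tolerance
print(round(pct_grr, 1))  # 25.5
```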

So if reducing Gage RR becomes a priority, what can a Six Sigma Black Belt or Quality Engineer do? The first step is to identify which component of the GRR contributes the most. Suppose the largest contributor is operator reproducibility. In this case, the solution resides in ensuring all operators follow the same procedure.

But what if Gage repeatability is the problem? Often a Six Sigma Black Belt or Quality Engineer will search for an alternative Gage. In some cases a better Gage may not exist. Now what is a Six Sigma Black Belt or Quality Engineer to do? Depending on the circumstances, there is a statistical approach to reducing Gage RR. This approach uses the distribution of averages to reduce the error in measurement. Recall the distribution of averages has the following standard deviation:

s_{x-bar} = s_{x} / sqrt(n)

In this expression, the distribution of averages has a smaller standard deviation (s_{x-bar}) than the standard deviation of individual values (s_{x}), because it is scaled by 1 over the square root of n. So how can we use this to our advantage in reducing Gage RR? Let’s consider the following example from page 175 of AIAG’s Measurement System Analysis Manual.
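A quick simulation can illustrate this relationship (the sample sizes and standard deviation here are arbitrary): averaging groups of n individual readings produces a distribution whose spread shrinks by roughly 1/sqrt(n).

```python
import math
import random
import statistics

random.seed(1)
n = 3        # readings averaged per reported value
s_x = 1.0    # standard deviation of individual readings

# Simulate many groups of n readings and take each group's average
averages = [statistics.mean(random.gauss(0, s_x) for _ in range(n))
            for _ in range(100_000)]

s_xbar = statistics.stdev(averages)
print(round(s_xbar, 2))  # close to s_x / sqrt(3), about 0.58
```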

The XYZ Company has a GRR as a percentage of tolerance equal to 25.5%. The 6-sigma spread of the GRR is 0.24. The customer wants to reduce this figure to 15% or less, which would correspond to a 6-sigma spread of 0.14. To determine the number of readings, we can rearrange the expression above to solve for n:

n = (s_{x} / s_{x-bar})^2

We now enter the appropriate values and solve for n:

n = (0.24 / 0.14)^2 = 2.94

which, rounded to the nearest integer, gives n = 3.

So we take 3 readings of the part feature and compute their average. We then report this average as a single reading for that part feature, and we continue this procedure until the GRR study is complete. This reduces the GRR spread from 0.24 to about 0.14 and the %GRR from 25.5% to about 15%.
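We can check the expected improvement with the example’s numbers: dividing the original spread and %GRR by sqrt(3) gives values at or just under the customer’s target.

```python
import math

grr_spread = 0.24  # original 6-sigma GRR spread
pct_grr = 25.5     # original %GRR of tolerance
n = 3              # readings averaged per reported value

new_spread = grr_spread / math.sqrt(n)
new_pct = pct_grr / math.sqrt(n)
print(round(new_spread, 2), round(new_pct, 1))  # 0.14 and 14.7
```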

Now I’d like to hear from you. Have you ever had a GRR problem? How did you solve it or are you still living with the problem? Did you ever use this approach? Please share your comments below.

Bob McNeely says

Thank you for explaining how effective gauge R&R can be reduced by taking the average of several measurements. How effective is this technique when gauge precision is low relative to the range of values being measured? For instance, if the true values of the population range over, say, 200 microns, but the gauge precision is only 50 microns, the individual gauge readings will fall into just five buckets, and it starts to look like attribute, not variable, data. Is there a way to judge whether the precision of the gauge relative to the spread of the data will make this technique valid? Thanks again.

Support User says

Hello Bob. I think this problem could be investigated via bootstrapping. I haven’t had the time to run a simulation to see the outcome. Hopefully one of these days I’ll get to it and post a reply.