2 min read

By: Adebayo B. Olanrewaju 

Selecting the right sampling plan is key to ensuring data accuracy in quality management. There is no single standard dictating which plan must be used for data collection and analysis, so the analyst decides based on experience and the specific needs of the study. Many other sampling techniques have been developed for specific needs, including this one: https://www.olanabconsults.com/shop/snap-sampling-plans-aql-inspection-software

Bad data is not only costly to capture; it also corrupts the decision-making process. The following are some points to consider for ensuring data accuracy and integrity during measurement:

  1. Avoid Emotional Bias: Remain objective and avoid bias relative to targets or tolerances when counting, measuring, or reading digital or analog displays.
  2. Avoid Unnecessary Rounding: Rounding often reduces measurement sensitivity. Averages should be calculated to at least one more decimal position than the individual readings (the first sketch after this list illustrates this).
  3. Record the Order of Capture: If data occur in time sequence, a best practice is to record the order in which they were captured.
  4. Record Measurement Classifications: If an item's characteristic changes over time, record the measurement or classification as soon as possible after manufacture, as well as after a stabilization period. Record important classifications along with the data, including time, machine, auditor, operator, gage, lab, material, target, process change and conditions, etc. (the second sketch after this list shows one way to do this).
  5. Keep an Eye on Specification Criteria: To apply statistics that assume a normal population, the expected dispersion of the data must be representable by at least 8 to 10 resolution increments. If it is not, the default statistic may be the count of observations that do or do not meet the specification criteria (see the third sketch after this list).
  6. Filter Data Entry: Screen incoming data so that entry errors, such as digit transposition and magnitude shifts from a misplaced decimal point, can be detected and removed (the final sketch after this list shows one approach).
  7. Use Objective Statistical Tests: Avoid removing suspected outliers by guesswork; identify them with objective statistical tests (also covered in the final sketch).
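
A minimal Python sketch (not from the original article) illustrating point 2: individual readings recorded to one decimal place, with the average reported to one additional decimal. The readings are hypothetical.

```python
# Hypothetical gauge readings, each recorded to 1 decimal place.
readings = [10.2, 10.3, 10.1, 10.2, 10.4]

mean = sum(readings) / len(readings)

# Reporting the average to one extra decimal preserves sensitivity...
print(f"Average to 2 decimals: {mean:.2f}")   # 10.24
# ...that rounding back to the readings' own resolution would hide.
print(f"Average to 1 decimal:  {mean:.1f}")   # 10.2
```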
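
A sketch of point 4, assuming a simple Python dataclass as the record; the field names and values are illustrative only, not a prescribed record layout.

```python
# Store each reading together with the classifications that describe
# how and when it was obtained (illustrative fields only).
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Measurement:
    value: float
    recorded_at: datetime
    machine: str
    operator: str
    gage: str

# Capture the classifications at the moment the reading is taken,
# rather than reconstructing them later from memory.
reading = Measurement(
    value=10.24,
    recorded_at=datetime.now(),
    machine="press-03",
    operator="operator-A",
    gage="micrometer-7",
)
print(reading)
```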
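
A sketch of point 5, assuming a hypothetical gauge resolution of 0.1 and hypothetical specification limits: if the data spread covers fewer than 8 resolution increments, fall back to counting conforming items.

```python
# Hypothetical readings and gauge resolution.
resolution = 0.1
data = [4.2, 4.3, 4.3, 4.4, 4.2, 4.5, 4.3, 4.4]

spread = max(data) - min(data)
increments = spread / resolution          # resolution steps spanned by the data

if increments >= 8:
    print(f"{increments:.0f} increments: normal-based statistics are reasonable")
else:
    # Too coarse for variables statistics: count items inside the specification.
    lsl, usl = 4.0, 5.0                   # hypothetical specification limits
    conforming = sum(lsl <= x <= usl for x in data)
    print(f"Only {increments:.0f} increments: report {conforming}/{len(data)} conforming")
```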
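
A final sketch covering points 6 and 7 together: a simple screen for misplaced-decimal entries followed by an objective outlier rule. Tukey's 1.5 × IQR fences are used here as one possible test; the article does not prescribe a specific one, and all values are hypothetical.

```python
import statistics

# Hypothetical entered values; 52.0 and 0.49 simulate misplaced decimals.
entries = [5.1, 5.3, 52.0, 5.2, 5.0, 0.49, 5.4, 5.2]
median = statistics.median(entries)

# Point 6: flag values more than about 5x above or below the median as likely
# entry errors to be verified against the source record.
suspect_entries = [x for x in entries if x > 5 * median or x < median / 5]
screened = [x for x in entries if x not in suspect_entries]

# Point 7: apply an objective rule (1.5 * IQR fences) instead of guessing.
q1, _, q3 = statistics.quantiles(screened, n=4)
low, high = q1 - 1.5 * (q3 - q1), q3 + 1.5 * (q3 - q1)
outliers = [x for x in screened if x < low or x > high]

print("Suspected entry errors:", suspect_entries)   # [52.0, 0.49]
print("Statistical outliers:  ", outliers)          # none remain in this example
```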

About the Author

Adebayo is a thought leader in continuous process improvement and manufacturing excellence. He is a Certified Six Sigma Master Black Belt (CSSMBB) Professional and Management Systems Lead Auditor (ISO 9001, 45001, ISO 22000/FSSC 22000 etc.) with strong experience leading various continuous improvement initiatives in top manufacturing organizations.

You can reach him here.
