Dictionary of common research, survey, quality, and Six Sigma terminology.
Products or processes that deviate from what is regarded as “standard” for that product or process. When measured by the tools used to define the standard, a unit with an abnormality is of unacceptable quality.
A method of measuring product cost that assigns all direct and indirect production costs to the product itself. Absorption costing is used to provide a per-unit average product cost.
Also known as “assured quality level” (see below), an acceptable quality level represents the maximum proportion of defective or “nonconforming” units that a given sample lot of a product or process may contain and still be considered of acceptable quality. The ideal acceptable quality level is one with zero defects or nonconforming units.
The maximum number of units in a sample lot that may be defective or nonconforming, while still permitting the sample lot to be considered of acceptable quality.
A quality-control plan that measures the quality of individual “sample” lots of units of a product or service, rather than inspecting each individual lot or unit. Two types of acceptance sampling plans exist: attribute sampling plans and variables sampling plans.
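The accept/reject logic of a simple attribute sampling plan can be sketched as follows; the acceptance number used here is an illustrative assumption, not a value drawn from any published sampling table.

```python
# Minimal sketch of a single attribute acceptance sampling plan; the
# acceptance number (c = 2) is an illustrative assumption, not a value
# taken from any published sampling standard.

def accept_lot(sample_defects: int, acceptance_number: int = 2) -> bool:
    """Accept the lot if the defective units found in the sample do not
    exceed the acceptance number; otherwise reject it."""
    return sample_defects <= acceptance_number
```

A sample yielding one defect is accepted; three defects exceed the acceptance number and the lot is rejected.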
A business strategy that plans for the use of “remnant materials” (the residual or scrap materials left over after a product or process is complete) to provide added value to existing products or processes.
The condition of being responsible for the products or processes (or the segments of a product or process) over which one has personal power or authority. Accountability for a product or process also assumes that one will accept the consequences if the product or process does not meet required standards of quality.
A strategy that uses a specific process or procedure to reach a particular objective or achieve particular results; often documented in writing.
A particular task or procedure designed to obtain specific results. In the production context, an “activity” usually occurs over a period of time rather than in a single instance, and is usually a core part of business operations.
A chart, diagram, graph, or other visual representation of a business and its processes. The visual depiction includes the business and its products; activities involved in creating the product(s) or process(es) and any relationships between them; costs; controls or other features; and resources allocated to activities. Two forms of activity models exist: “as-is” models and “to-be” models.
A method of budget planning that:
- Defines the activities necessary to each function of a business.
- Determines what, if any, relationships exist between each activity and its function in fulfilling business objectives.
- Allocates resources, including financial resources, based on the needs of each activity, rather than on the projected costs of each product or process.
A method of determining costs that:
- Defines the actual costs of each product or process.
- Allocates resources, including financial resources, based on those actual costs rather than on the traditional business structures used to allocate resources in other methods of budgeting.
A strategic planning process that originated with the QS-9000 business management philosophy. APQP comprises five phases of product planning:
- Plan and define the program
- Product design and development
- Process design and development
- Product and process validation
- Launch, feedback, assessment, and corrective action
The purpose of APQP is to provide a business with needs assessments in advance of each stage of the production process to help neutralize potential problems before they arise and fulfill production objectives efficiently and successfully.
A visual depiction used as a tool to organize large amounts of data logically, especially by relationships between types of data. An affinity diagram is often used in a “brainstorming” session: participants contribute data on post-it notes, which are collected and organized by the relationships in the information they contain, and the data are then entered into the diagram in related categories, often in column format.
A procedure or set of rules defining the steps needed to solve a particular problem. Most often used as a label for a set of instructions given to a computer to perform a particular function, it may represent instructions used to fulfill any task or solve any problem.
A method of evaluating an algorithm, usually by comparing it to another algorithm, to determine whether it is the best choice to solve a particular problem or perform a particular function.
A concept by which organizational data based on employee feedback are collected, sorted and categorized, and then analyzed statistically to identify the business’s strengths, organizational weaknesses, and gaps in product or process quality control.
The probability of erroneously accepting an “alternate hypothesis” that a real difference exists between two samples being compared, and rejecting the “null hypothesis” that no difference exists, when the null hypothesis is in fact true. The “alpha risk”, or probability of making such an error, is generally expressed either as a probability (normally 5% or less) or as a “confidence level” (normally 95% or greater).
Sometimes expressed as “Ha” or “H1”. A hypothesis that an observable difference between two samples is a real difference, and not due merely to chance or to a sampling error.
Step three in either of two Six Sigma quality-control processes, “DMAIC” and “DMADV” (see below). The process of scrutinizing and measuring a particular aspect of a business’s operations to identify gaps in quality and minimize errors or defects. Six Sigma uses a variety of analytical processes, including “cause and effect analysis”, “reliability analysis”, “risk analysis”, and “systems analysis”, to name only a few.
A statistical method of analyzing data, particularly experimental data, among two or more groups by comparing the variances within each group and between the groups, to identify any effects such variances may have on the outcome of the process being analyzed.
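A minimal one-way ANOVA F-statistic can be computed with the standard library alone; the three sample groups below are invented illustration data, and a real analysis would also compare F against a critical value for the relevant degrees of freedom.

```python
# One-way ANOVA F-statistic with the standard library only; the three
# sample groups are invented illustration data.
from statistics import mean

def f_statistic(groups):
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = mean(x for g in groups for x in g)
    # Variation of group means around the grand mean (between groups)
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    # Variation of observations around their own group mean (within groups)
    ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

groups = [[5.1, 4.9, 5.0], [5.6, 5.8, 5.7], [4.4, 4.5, 4.3]]
F = f_statistic(groups)   # a large F suggests real differences between groups
```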
A variation of the Kolmogorov-Smirnov (K-S) Test that uses a sample’s “P-value” to measure whether it follows the normal distribution. P-value is the probability that the sample being tested was drawn from a population with a specific distribution; if the P-value is less than the generally accepted standard of 0.05, the null hypothesis is likely to be false and differences between the samples are likely to exist.
From the Japanese word for “lamp”, representing the classic Japanese paper lantern used as a light, a sign, or a signal. Here, an andon is a management tool that indicates operational status on production lines or equipment: a green light means that the equipment is operating normally; yellow indicates a transitional stage, such as a scheduled change or maintenance; and red indicates abnormal function or lack of function.
A result that deviates from the result that previous testing or documentation would lead one to expect. Sometimes known as a “bug”, “error”, “exception”, or “fault”.
One of a collective “family” of four types of processes: 1) artisan process; 2) automated process; 3) operational process; and 4) project process. The artisan process is usually temporary, and generally refers to a “pioneering” process – i.e. to create something wholly new.
The statistic used in the Anderson-Darling normality test to determine whether a given data set follows the normal distribution. A larger A-squared value corresponds to a smaller P-value, which in turn indicates whether the null hypothesis should be accepted or rejected.
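As a rough, standard-library-only sketch, the A-squared statistic can be computed from the sorted, standardized sample; a statistics package would normally supply this along with the critical-value or P-value lookup, and the data values below are illustrative.

```python
# Rough sketch of the Anderson-Darling A-squared statistic, computed with
# the standard library only; a real analysis would use a statistics package
# that also supplies the critical values / P-value lookup. Sample data are
# illustrative.
import math
from statistics import mean, stdev

def a_squared(data):
    xs = sorted(data)
    n = len(xs)
    m, s = mean(xs), stdev(xs)
    # Standard normal CDF at each standardized observation
    cdf = [0.5 * (1 + math.erf((x - m) / (s * math.sqrt(2)))) for x in xs]
    total = sum((2 * i + 1) * (math.log(cdf[i]) + math.log(1 - cdf[n - 1 - i]))
                for i in range(n))
    return -n - total / n

sample = [4.8, 5.0, 5.1, 4.9, 5.2, 5.0, 4.7, 5.3]
A2 = a_squared(sample)  # small values are consistent with normality
```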
An evaluative process used to collect and interpret data to measure performance. An assessment system may include a variety of components, such as internal and external audits, document reviews, analyses, and final reports.
Also known as “special cause”, an assignable cause is an identifiable, specific cause of variation in a given process or measurement. A cause of variation that is not random and does not occur by chance is “assignable”.
The act of assuring customers that a business will maintain a particular level of quality in its products, processes, organization, function, or operations. “Assurance” also refers to the commitment to maintain a particular quality level.
Also known as “acceptable quality level”, AQL represents the maximum proportion of defective or “nonconforming” units that a given sample lot of a product or process may contain and still be considered of acceptable quality. The ideal AQL is one with zero defects or nonconforming units.
Binary data, the simplest form of data, which by itself cannot be analyzed statistically. Attribute data is used for “counting” purposes (e.g. recording numbers or providing totals for analysis), and is frequently used to divide units into “conforming” and “nonconforming” lots. To analyze attribute data, it must first be converted into variable, or “continuous”, data.
A method of examining costs by integrating “activity-based costs” (ABC) with “quality function deployment” (QFD). It:
- Identifies and defines the actual costs of each product or process
- Allocates resources according to those costs
- Documents customer data via QFD methods
- Integrates each to ensure that activities, costs, and resources fulfill customer wants, needs, and expectations
A periodic, systematic process of inspection and evaluation, often performed by an independent entity. An audit is designed to ensure that a business, its employees, and its processes, systems, operations, services, and/or products conform to pre-established standards of quality; to identify weaknesses, errors, or defects; and to identify methods of correction and/or improvement.
The condition of having power over, and responsibility for, an enterprise, system, process, product, and/or personnel.
The degree of closeness between values in a series of observations, or of dependency between those observations.
The state or capacity of being ready and able to fulfill or perform an intended role or function. Availability may describe this quality in products or components, services, processes, systems, individuals, teams, or entire organizations or enterprises.
The average level of quality of the units in a lot at the time the lot arrives at a point of inspection (i.e. before it has been inspected).
The average level of quality of the units in a lot when it leaves a point of inspection. AOQ is based on the average quality level of the lot upon first reaching the inspection point (i.e. the “average incoming quality”, or AIQ). If the inspection does not result in the exchange of defective units for units of acceptable quality, the AOQ will be identical to the AIQ.
The length of time between production and the point at which 10% of a given product population will fail.
In manufacturing or production processes, a method of calculating the “start date” and “due date” of each stage of the operation by working backward from the “ship date”.
A form of experiment in which each factor level or treatment group contains the same number of units (i.e. quantitatively “balancing” each factor or group). A balanced experiment is simpler than an “unbalanced” one, in which varying quantities add complexity, and is especially useful where each factor or group is to be accorded identical weight, significance, or importance.
A business strategy of performance evaluation via use of a “scorecard” based on four to six strategic “indicators.” The scorecard comprises each of the four to six components, represented individually, and measures them in terms of how well they are meeting the strategic objectives of the business.
A visual depiction that compares or contrasts groups of data by means of images of bars that vary in length or other relevant qualities. “Simple” bar charts compare sets of data that are uniform; “complex” bar charts may group or stack various types of data internally within each data set being compared.
A fundamental point in a system, process, or quality, against which all other stages or phases are measured. Frequently, a baseline is:
- A beginning point in a process, or
- A minimum acceptable level of quality.
The comparison of conditions against the baseline point. Frequently, this process of measurement will occur between:
- Current conditions and a previous condition that serves as the baseline
- Projected future conditions measured against a baseline of current conditions
Used as a noun, a given quantity or lot of a particular product or component, with all units in the lot produced under “uniform conditions”, using identical processes. Used as a verb, “to batch” is to use a uniform method to collect identical data concurrently and forward it in a group to the next processing stage (i.e. instead of collecting and forwarding each individual data unit separately as it becomes available).
Also known as “batch-and-push”; a production process by which all the units in a given lot complete a particular stage of production before moving to the next stage. The process creates a “batch” of units, which then must wait in a “queue” at each production stage as each unit is processed in turn.
A system of measurement by which a business compares its products, services, processes, or systems to those that are generally recognized as industry standards or leaders, to evaluate quality, identify gaps or defects and suggest improvements.
A practice, such as a method, system, or process, that is recognized in the industry as the best for a particular purpose (e.g. most effective, most efficient, etc.). The label “best practice” is usually conferred, either formally or informally, by general agreement among a business’s peers in a given industry.
An error that permeates a system or process and causes inaccurate results, such as a difference between the value of specific test results and the accepted or expected value against which they are measured.
The distribution of a data set, in which two values occur more frequently than the remaining values in that data set’s distribution.
A “Black Belt” is a team leader in a business or division, trained in and responsible for implementing Six Sigma principles to increase quality, productivity, and customer satisfaction according to DMAIC and DFSS methods and standards. A Black Belt has usually completed a four-week training program, one or more Six Sigma projects, and an examination on Six Sigma practices.
A label sometimes used in place of “assignable cause” or “special cause.” Black noise is a cause or source (or set of causes or sources) of variation that does not occur randomly or by chance.
A visual depiction of a system, including the system’s components and the relationships between them. Each component is illustrated by an image in the shape of a “block” (e.g. a square or rectangle) and related blocks are connected via lines.
A method of identifying and segregating known “background” variables or sources of variation in a process or system so that they are isolated from the primary variables being measured. This segregation permits comparison of primary variables, while preventing background variables from influencing that comparison and/or distorting the results.
Named after the statisticians Box and Cox, who identified the equation and the process it represents. This equation is applied when two groups of data, each measured in a different format, must be synthesized (i.e. “transformed”) – usually through conversion to a normal distribution of the values in the data set(s) – to create a single uniform method of measurement to provide new information about the data as a whole.
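Under the usual formulation, the transformation applies y = (x^λ − 1)/λ for a chosen λ, with the natural logarithm as the λ = 0 limiting case; a minimal sketch (in practice λ is estimated from the data):

```python
# Sketch of the Box-Cox power transformation for a single value:
# y = (x**lam - 1) / lam, with the natural log as the lam = 0 limiting case.
# In practice, the value of lambda is estimated from the data.
import math

def box_cox(x: float, lam: float) -> float:
    if x <= 0:
        raise ValueError("Box-Cox requires positive data")
    return math.log(x) if lam == 0 else (x ** lam - 1) / lam
```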
Also called a “box and whisker diagram”; a visual diagram of a continuous data set’s center, distribution, and spread in five-point summary form:
- The middle 50% (i.e. the box itself)
- The median (i.e. 50% of the data appears above it; 50% appears below it)
- The 25th percentile, or first quartile (i.e. no more than 25% of the data appears below it)
- The 75th percentile, or third quartile (i.e. no more than 25% of the data appears above it)
- Identification of the limits of the data set and any “outliers” (i.e. “whiskers” of the plot that exceed 1.5 times the length of the inner quartiles)
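The five-point summary and the 1.5 × IQR outlier fences described above can be computed directly with the standard library; the data set here is an illustrative assumption:

```python
# Five-point summary and 1.5 * IQR outlier fences for a box plot,
# using statistics.quantiles; the data set is illustrative.
from statistics import median, quantiles

data = [2, 4, 4, 5, 6, 7, 8, 9, 11, 30]
q1, q2, q3 = quantiles(data, n=4)     # quartiles; q2 equals the median
iqr = q3 - q1                         # the "box": middle 50% of the data
low_fence = q1 - 1.5 * iqr            # points beyond the fences are
high_fence = q3 + 1.5 * iqr           # plotted individually as outliers
outliers = [x for x in data if x < low_fence or x > high_fence]
```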
A group process used to generate creative and effective ideas and strategies. In the brainstorming process, each member of the group writes down each relevant idea that occurs to him or her, without editing its content physically or even mentally. The group leader or facilitator then collects the ideas and leads the group in discussing and dissecting them to distill the most effective concepts and strategies from them.
Business goals that require new and/or radical strategies and approaches to ensure that the goals are attained successfully. Breakthrough objectives are designed to encourage dynamic effort and better performance throughout an organization, enterprise, team, system, or process.
A change or addition that a business makes to a product or process before it reaches the customer or point of purchase. A “business value added” change increases the quality and value of the product or process, but the fact of the change itself is not apparent to the customer.
The practice of conducting business according to the status quo, with no attempt to assess additional or new needs, identify possibilities for improvement, or implement change.
Those activities that collectively combine to create products or services for a business. A true business process functions to fulfill the mission and objectives of the business.
A nine-step business model designed to manage and ensure reliability and replicability of “mission-critical” processes across departments or enterprises. The nine steps are:
- Identify the process’s mission
- Document the process
- Document both process and customer requirements
- Identify all process and output measures
- Create an effective process management system
- Create a comprehensive data collection plan
- Monitor the process’s performance
- Create “dashboards”, including targets and limits
- Identify opportunities for improvement
A management philosophy that emphasizes questioning the status quo in all aspects of the business, in order to identify methods of meeting and exceeding the business’s mission and objectives.
A management method designed to identify and evaluate a business’s actual or potential risk by engaging in a particular action or making a particular change from existing processes.
The process of gauging a system’s or instrument’s accuracy by comparing it to an identical system or instrument whose accuracy level is generally accepted as the standard for that business, industry, or enterprise.
The period of time between an instrument’s initial calibration and a subsequent calibration. The difference in performance during the period between these two calibrations indicates the instrument’s “drift”, or deviation from its standard of performance.
The entire range of a product, process, or system’s inherent variation in ability to function. Capability is usually regarded in “evolutionary” terms – i.e. from the unit’s minimum to maximum abilities and any capacities for growth – and is usually evaluated according to specified standards by results that can be measured.
A tool of statistical measurement used to determine capability by comparing a process’s actual performance with customer expectations.
The measurement of a product, process, or system’s maximum capability with regard to a particular factor or characteristic (e.g. volume, speed, quality, effectiveness, etc.).
The relationship between an action or condition (i.e. a “factor”) and an effect that results from it (i.e. a “response variable”).
A factor (i.e. an action or condition) that creates an effect (i.e. a “response variable”) on another factor or process.
A component of “continuous flow” processing; a group of systems or operations (including necessary staff, equipment, etc.) that is arranged in a particular sequence, adjacent to the next step in the processing sequence. The arrangement and use of cells in continuous flow processing ensures that each step of the process occurs in order, permitting speed and efficiency.
A measure of central tendency; sometimes called the “mean”. When measuring data, particularly data related to a process, the center is the data’s average value.
The halfway point between highest and lowest levels of a process. Center points are used as a performance measurement tool, by launching a process with each of its factors set at its individual halfway, or center point.
A proposition used in probability theory to explain why data distributions frequently tend toward the “normal distribution”: when calculating the average of a number of independent random variables that have identical distributions, if their variance (i.e. statistical dispersion) is finite, the distribution of that average will tend toward the normal distribution as the number of variables increases.
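A quick simulation illustrates the theorem: averages of many uniform random values cluster far more tightly than the individual values, shrinking in spread by roughly the square root of the sample size (the seed and sample counts below are arbitrary choices):

```python
# Quick illustration of the central limit theorem with uniform random values.
# The seed and sample counts are arbitrary choices made for reproducibility.
import random
from statistics import stdev

random.seed(42)
singles = [random.random() for _ in range(2000)]
averages = [sum(random.random() for _ in range(30)) / 30 for _ in range(2000)]

# Theory predicts the spread of 30-value averages is ~ 1/sqrt(30) of the
# spread of the individual values (about 0.18 of it).
ratio = stdev(averages) / stdev(singles)
```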
The numerical average, or mean, of data related to a particular product, process, or system, and the tendency of that data to gather around a single central point between the highest- and lowest-valued data points.
A Japanese phrase that translates as “load-load”. It refers to a production method in which all equipment needed to produce one part or component is arranged in proximity, so that the equipment operator may simply “load” the part and continue with the next process or operation.
A type of production or assembly line containing the equipment necessary for all stages of production of a particular part or component. The chaku-chaku line requires human participation only to load the initial part, freeing staff to move on to other processes.
A person, process, or condition that causes change within an organization, team, system, process, or product, either deliberately or inadvertently.
The period of “down-time” during which a system or piece of equipment is taken out of production in order to refit it for producing a different unit (e.g. switching ink or toner in a printer).
A trait or feature of a product, process, or system, especially one that defines the unit and/or differentiates it from other units.
Written documentation, created at the outset of a project, process, or enterprise, that defines its scope, authority, responsibilities, mission, and objectives, as well as those of the team engaged in that enterprise.
A statistical tool used to test three separate analytical functions:
- Goodness of fit, which determines whether the lot from which the sample was taken conforms to a particular distribution.
- Homogeneity, which determines whether various samples or lots are homogeneous (i.e. whether a specified characteristic is the same in each sample).
- Independence, which determines whether, when applying two or more criteria of classification to a sample or group, the criteria are independent (i.e. a test of the null hypothesis; if the null hypothesis is false, the criteria are not independent, but “associated”).
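The goodness-of-fit statistic itself is the sum, over all categories, of the squared difference between observed and expected counts scaled by the expected count; the category counts below are invented for illustration:

```python
# Chi-square goodness-of-fit statistic: sum over all categories of
# (observed - expected)^2 / expected. The counts are invented illustration data.

def chi_square(observed, expected):
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

observed = [48, 35, 17]   # counts actually recorded in three categories
expected = [50, 30, 20]   # counts the hypothesized distribution predicts
stat = chi_square(observed, expected)
# stat is then compared against a chi-square critical value for the
# appropriate degrees of freedom to accept or reject the null hypothesis.
```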
The set of facts and/or conditions that surround or may be expected to surround a particular event, often affecting, modifying, or controlling the event or the likelihood of its occurrence.
C-Level, usually abbreviating the word “chief”, refers to a senior executive’s status (e.g. CEO or chief executive officer; CIO or chief information officer; CFO or chief financial officer; etc.).
The result obtained when a distribution’s standard deviation is divided by its mean. The calculation is used as a “relative measure” – i.e. comparing a data set’s dispersion to its mean by expressing the standard deviation as a percentage of the mean, independent of the original units of measurement.
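As a minimal worked example (the data values are illustrative):

```python
# Coefficient of variation: standard deviation as a percentage of the mean.
# The data values are illustrative.
from statistics import mean, stdev

data = [9.8, 10.1, 10.0, 9.9, 10.2]
cv = stdev(data) / mean(data) * 100   # dispersion relative to the mean, in percent
```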
A factor inherent in a system or process that results in variation. Such a factor is a “common” cause when it is a factor that reasonably can be expected to occur. It generally affects every aspect and/or outcome of the process.
The condition of having achieved “market dominance”, i.e. greater market share than one’s competitors, by offering a particular product or service (or an aspect thereof), or by operating at superior levels of efficiency and/or quality.
Also known as a “secondary”, “subordinate”, or “incidental” variable. A concomitant variable is observed, but is not measured or otherwise used in analyzing the data set in which it appears.
A manufacturing approach in which all elements of the design and production processes occur simultaneously, rather than sequentially. Under a CE approach, staff from all aspects of the business may collaborate from beginning to end to ensure full integration, and thus efficiency, of all processes.
A factor that limits or controls some aspect (i.e. cause) of an operation or function, and upon which the successful fulfillment of that operation or function (i.e. effect) depends.
A measurement of the likelihood that the estimated value of a specific parameter will occur in an expected range between a defined upper and lower limit (e.g. polling results that indicate that a result is “plus or minus X percent” are demonstrating a confidence interval of “X percent”).
A measurement of the likelihood that the estimated value of a specific parameter will fall within the relevant confidence interval (e.g. a confidence level of 95% for the accuracy of a specific factor indicates that the factor will be accurate in 95 out of 100 units in a sample).
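For a sample mean, a 95% confidence interval is conventionally the mean plus or minus 1.96 standard errors; the data and sample size below are illustrative assumptions:

```python
# 95% confidence interval for a sample mean, using the conventional
# z value of 1.96; the data and sample size are illustrative assumptions.
import math
from statistics import mean, stdev

data = [10.2, 9.8, 10.1, 10.0, 9.9, 10.3, 9.7, 10.0]
m = mean(data)
half_width = 1.96 * stdev(data) / math.sqrt(len(data))   # the "plus or minus X"
interval = (m - half_width, m + half_width)
```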
The condition of adhering to or complying with particular requirements or standards.
The condition of being uniform, regularized, or standardized, according to specific requirements or guidelines.
The probability that a lot of a product or process, when delivered to the consumer or point of purchase, will be nonconforming, defective, or of insufficient quality by virtue of the fact that too great a number of nonconforming units were allowed to remain in the lot when it was accepted after inspection.
A production method using batch sizes of single units. Each stage of the production process completes all of its tasks in contributing to the production of one unit before that single unit is sent to the next stage.
A production process in which a unit undergoes each stage of production sequentially. The unit remains at the first stage until that particular segment is complete; it is then sent to the next stage, where the process is repeated. This method continues until the unit has completed all sequences of the production process.
Sometimes called “continued improvement”; an approach to a process, system, or enterprise by which the team constantly strives to improve that enterprise’s individual elements, the interrelationships between them, and the whole.
A condition in which a process or operation is free of assignable causes of (i.e. nonrandom) variation; the process is stable, with variation that is normal and predictable. “Control” is also the fifth and final phase of the Six Sigma DMAIC process, in which a product, process, or system’s stability and predictability are tracked and verified.
A visual depiction of the upper and lower statistical limits of a product, process, or system, with an average or central line or indicator, used to measure data relative to that unit or lot. When all data sets fall between the limits, the process is “in control”.
The upper and lower points of demarcation on a control chart, measuring data relevant to a product, process, or system; when the data all fall between those limits, the process is “in control”.
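A common convention, sketched below with invented measurements, places the limits three standard deviations above and below a center line computed from stable baseline data:

```python
# Three-sigma control limits computed from stable baseline measurements;
# new data points falling outside the limits signal an out-of-control
# process. All measurement values are invented for illustration.
from statistics import mean, stdev

baseline = [5.02, 4.98, 5.01, 4.99, 5.00, 5.03, 4.97, 5.01]
center = mean(baseline)
sigma = stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma   # upper/lower control limits

new_points = [5.00, 4.99, 5.30]
out_of_control = [x for x in new_points if x > ucl or x < lcl]
```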
Written documentation that defines the average point of a product, process, or system, and provides guidelines for ensuring a quality level that will keep the data within those limits (and thus, “in control”).
A method of measurement, using a specific unit as an example of acceptable quality levels, that includes a defined target for results.
An acronym for “Customer-Output-Process-Input-Supplier”; the inverse of the “SIPOC” process. A five-step method of designing a process or system to ensure quality:
- Obtain the customer’s perspective (e.g. wants, needs, expectations)
- Determine the necessary output qualities and characteristics to satisfy those customer requirements
- Identify the processes necessary to achieve such output
- Identify the materials and/or other “input” necessary to such processes
- Choose suppliers based on the previous four requirements
Steps taken to ameliorate, repair, and/or prevent nonconformities or defects and their causes.
A positive relationship, especially one with “cause and effect” qualities, between two or more variables or sets of data.
One element of calculating “cost of quality” (COQ), encompassing costs of both “quality assurance” and “quality control”. COC comprises the costs of ensuring that a product or service conforms to or exceeds specified standards of quality.
Note: A minority of professionals use the acronym COC to refer to “certification of conformity,” which is an assurance that a particular product or service conforms to specified standards.
One element of calculating cost of quality (COQ), comprising all of the costs resulting from the failure of a product or service to conform to specified minimum quality levels. Such costs may include, e.g. “rework”, loss of customers, etc.
The costs of maintaining processes or systems, or of producing products or services, of inferior quality. Such costs fall into four categories:
- Appraisal costs, or costs of assessing requirements for ensuring sufficient quality levels
- External failure costs, or costs of delivering units of inferior quality to the customer (i.e. customer dissatisfaction)
- Internal failure costs, or costs related to finding, addressing, and/or correcting defects before delivery to the customer or point of purchase, and
- Prevention costs, or costs related to avoiding costs in the other three categories, as well as costs of general quality failures
The costs of ensuring delivery of a product or service of acceptable levels of quality, and of preventing delivery of a product or service of inferior quality. COQ encompasses the four categories identified in the definition of COPQ.
The goal for costs associated with a particular product, process, project, or enterprise. A cost target represents the maximum collective costs that are to be allowed for all aspects of a project.
Sudden and comprehensive failure of a product, system, or process, or a component thereof; often used to describe an information technology failure.
A customer expectation regarding an aspect of a product or service (e.g. quality, speed, etc.). Such an expectation is a CCR when the customer may be expected to refuse to purchase, or to purchase from a competitor, if the expectation is not met.
The recipient or purchaser of a product or service. A “customer” may be an individual, a group or team, an enterprise, or an organizational entity.
A business strategy that evaluates the wants, needs, and preferences of the business’s customers, and implements controls and other methods to ensure quality levels that comply with those wants, needs, and preferences.
The length of time that elapses, or can be expected to elapse, between the beginning and the end of a particular process.
A management tool used to provide summary, at-a-glance evaluations of performance, usually with regard to:
- Customer wants, needs, preferences, and/or expectations, or
- Quality and/or performance of a product, process, system, or enterprise.
A set of facts or information. There are two types of data: attribute (discrete) data and variable (continuous) data.
Evaluation of the collection of information in a data set for a particular purpose – e.g. to determine quality and/or performance and identify flaws, or to project new hypotheses, conclusions, steps in a process, or results.
Also known as “data contamination”; the disruption or violation of the information in a data set, or the introduction of error or extraneous information into it.
The level of accuracy and completeness of the information in a particular data set – i.e. the degree to which it is free from corruption or the introduction of error.
A characteristic, factor, or aspect of a product, process, service, or system that renders it nonconforming – e.g. inaccurate, inoperable, or of less than acceptable quality. Defects can be categorized into four classes:
- “Very serious”, i.e. causing “severe” physical and/or economic harm
- “Serious”, i.e. causing “significant” physical or economic harm
- “Major”, i.e. causing substantial problems or difficulties with the unit’s intended use
- “Minor”, i.e. causing relatively inconsequential problems or difficulties with the unit’s intended use
A measurement used to estimate the probable number of defects in an average production run. DPMO is calculated by dividing the number of defects observed by the total number of opportunities for a defect to occur during that run, then multiplying the result by one million.
A measurement representing the average number of defects observed in a population sample. DPU is calculated by dividing the total number of observed defects by the total number of units in the sample.
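The two defect metrics above reduce to simple arithmetic. A minimal sketch in Python (the run figures are hypothetical):

```python
def dpu(defects: int, units: int) -> float:
    """Defects per unit: total observed defects / total units in the sample."""
    return defects / units

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities: defects divided by total
    opportunities (units x opportunities per unit), scaled to one million."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

# Hypothetical run: 500 units, 6 defect opportunities each, 12 defects found.
print(dpu(12, 500))      # → 0.024 defects per unit
print(dpmo(12, 500, 6))  # → 4000.0 DPMO
```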
A business management approach and a fundamental component of the Six Sigma business philosophy. DMAIC is a five-step method for ensuring that products or processes adhere to Six Sigma quality levels (i.e. no more than 3.4 defects per million opportunities), generally by improving processes:
- Define (objectives and deliverables, both internal and external)
- Measure (current performance)
- Analyze (defects and their root causes)
- Improve (elimination of existing defects)
- Control (future performance of the process)
A business management approach and a fundamental component of the Six Sigma business philosophy. DMADV is a five-step method for ensuring that products or processes adhere to Six Sigma quality levels (i.e. no more than 3.4 defects per million opportunities), generally by improving methodologies:
- Define (objectives and deliverables, both internal and external)
- Measure (customer needs and expectations)
- Analyze (options available to fulfill customer expectations)
- Design (the process to fulfill customer expectations)
- Verify (design performance and ability to fulfill customer expectations)
Usually expressed in lower-case letters as the acronym “df”; a statistical measure of the number of values in a calculation that are free to vary independently.
Also known as the PDCA (Plan, Do, Check, Act) cycle or the PDSA (Plan, Do, Study, Act) cycle; a methodological model designed to ensure “continuous improvement” in quality levels. The Deming Cycle consists of the four steps represented by the acronyms PDCA or PDSA, performed repeatedly at all levels of a process, to ensure quality and lead to standardization.
Also known as a “response variable”; expressed as Y in the equation Y = f(X1 . . . XN), where Y is the dependent variable. In an equation or system, a dependent variable is one whose value is a function of one or more independent variables – i.e. changes in those variables affect or change the result.
A formal, comprehensive, documented strategy designed to guide the “deployment”, or implementation, of a product, service, process, strategy, or system through the transition from a testing phase to actual use.
A quality assurance process that:
- Identifies and defines a design’s objectives, variables, and limitations
- Processes the design by accounting for those objectives, variables, and limitations, thus ensuring optimal performance
A business philosophy and methodology used to ensure that products, processes, services, or systems are designed so that they can be produced and will perform at Six Sigma quality levels (i.e. no more than 3.4 defects per million opportunities).
The probability that a product or process will be nonconforming or defective because the unit’s design does not provide the required support necessary for the unit to fulfill its intended use upon deployment.
The process of identifying existing nonconformities or defects in a product, service, system, or process.
The difference between the actual value of a variable in a data set and either the desired value or a standard or normative value (i.e. the mean).
A cost that can be linked directly to specific product or activity, or component thereof. Direct costs include materials, labor, certain “overhead” costs, etc.
A statement of strategy or policy, usually in written form, that serves as a guide to staff in fulfilling organizational objectives. Examples include mission or vision statements, statements of purpose, and policy guides for the execution and implementation of discrete projects, systems, or enterprises.
Information that has been converted from “counted” or “attribute data” into a form that can be subjected to limited analysis. “Discrete data” fall into certain defined categories whose potential values are limited in number and generally are not reducible further.
A subsequent point in a process – i.e. a process stage that occurs after completion of the present stage.
Groups comprising two units or individuals each. The two units in each dyad are usually qualitatively either:
- Similar in some way, or
- Directly opposite in some way.
A device used to convert a voltage signal into a linear current signal.
The result produced by a particular cause – e.g. the result achieved by a specific action. Sometimes expressed as the “response variable,” represented by “Y,” with the cause, or independent variable, represented by “X.”
A measurable quality representing the ability (i.e. of a particular product, process, service, system, operation, enterprise, etc.) to achieve a particular objective or obtain a particular desired result.
A condition measured by comparing the quality and/or value of the output with the collection of costs, resources, and effort that constitute the input. The higher the output and/or quality levels and the lower the costs, resources, and effort, the greater the level of efficiency.
A method of “process control” consisting of a collection of predefined conditions or standards, against which a process or output is compared to ensure that it conforms to acceptable standards of quality.
Customers who are not a part of or affiliated with a particular business, and who are also the purchasers or recipients of that business’s “output” (i.e. product or service deliverables).
A nonconformity, defect, or deficiency in quality that affects, is identified by, or is apparent to the customer.
Also known as the “F ratio”; a test that measures whether two samples drawn from different data populations have the same variance (and thus the same standard deviation) within a particular confidence level.
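The F statistic itself is just the ratio of the two sample variances. A minimal sketch in Python using only the standard library (the measurement data are hypothetical; a complete test would also compare the ratio against the F distribution’s critical value at the chosen confidence level):

```python
from statistics import variance

def f_ratio(sample_a, sample_b):
    """F statistic: ratio of the larger sample variance to the smaller.
    A ratio near 1 suggests the two populations share the same variance
    (and hence the same standard deviation)."""
    va, vb = variance(sample_a), variance(sample_b)
    return max(va, vb) / min(va, vb)

a = [4.1, 3.9, 4.0, 4.2, 3.8]  # hypothetical measurements, process A
b = [4.4, 3.5, 4.6, 3.4, 4.1]  # hypothetical measurements, process B
print(round(f_ratio(a, b), 2))  # → 11.4
```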
The act of making a process, system, or event easier and/or more efficient.
A variable, whether controlled or uncontrolled, that may affect or influence another factor or a result.
An analytical methodology used to:
- Identify all possible forms of failure in a product, process, system, or enterprise
- Determine the potential frequency of such failures
- Identify the potential effects of such failures on all aspects, components, and functions of the product or process
- Identify and prioritize such failures and their prevention, and
- Design and implement strategies to prevent such failures
An expansion of FMEA to include identification and assessment of the potential “criticality” or severity of the effects or consequences of possible failures.
The condition or state of being able, by means that are practical given the circumstances, to function as intended or to fulfill a given role. A process or function is feasible if it can be accomplished with reasonable efficiency and efficacy – e.g. without costs that are disproportionate to the value of the likely or expected result.
A measurement of production performance, often used to determine COQ (cost of quality). First-time yield is calculated by dividing the number of conforming or good units produced in a process’s initial run by the total number of units input at the beginning of the process.
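First-time yield is a single division, as a short Python sketch shows (the run figures are hypothetical):

```python
def first_time_yield(good_units: int, units_in: int) -> float:
    """FTY: conforming units from the process's initial run divided by
    the total units input at the beginning of the process."""
    return good_units / units_in

# Hypothetical run: 1000 units enter the process, 950 conform on the first pass.
print(first_time_yield(950, 1000))  # → 0.95
```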
Also known as “Ishikawa diagram“, “cause and effect diagram“, or “cause and effect technique”, the Fishbone diagram is used to analyze cause and effect. Its visual representation resembles the shape of a fish with a central “spine” representing the issue or effect being analyzed, and with smaller branches connected to the spine that represent discrete causes.
A management methodology deriving from “lean manufacturing” philosophy; so named because, in its original Japanese, each term representing a stage in the five-step process begins with the letter “s.” The “Five S” steps include:
- Seiri (segregation of necessary and unnecessary resources and removal of those that are unnecessary)
- Seiton (organization and identification of resources for easy access and use)
- Seiso (organized “cleanup”)
- Seiketsu (maintaining an efficient workspace by following the previous three steps daily)
- Shitsuke (the habitual performance of all four of the previous steps)
A business management technique used to identify and explore causal relationships involved in a particular problem. Often visually represented by a “tree” diagram, the technique involves addressing the problem first by asking, “Why?”; then answering the question; then addressing that answer by again asking, “Why?”; and repeating the entire process a minimum of four additional times in an effort to identify and distill root causes.
A business cost that does not vary, regardless of changes in process or production.
In the business production or manufacturing context, the movement of material or units in a given direction.
Sometimes designated one of the “Seven Tools of Quality,” a flowchart is a visual representation of the steps, stages, or a series of steps or stages in a process or in the function of a system, in the order in which each step or stage does or should occur.
A business analysis technique, usually accompanied by a visual representation in the form of a central arrow pointing toward an object, with information listed on either side. “Driving forces”, or those that propel the business, system, process, or product toward improvement are listed on one side (usually the left); the “restraining forces”, or those that keep the process rooted in the status quo and “restrain” it from moving toward improvement, are listed on the other side (usually the right).
A structured template containing specific, predefined information; often used to collect, categorize, or analyze data, and/or to provide instruction or guidance.
A form of experiment design that tests only a subset or sample of all possible combinations of factors in a data set. For data sets that are too large or contain too many factors to make it feasible to test all possible combinations accurately, use of a fractional factorial may be appropriate.
A visual representation of the frequency with which the values of the variables in a data set appear in the distribution. Often demonstrated via use of a “histogram” or a “frequency polygon”.
A form of experiment design that tests all possible combinations of factors in a data set. Often used to test relationships between factors and/or the operation of factors at multiple levels.
The specific task or role of a product, process, system, operation, enterprise, individual, or team, requiring specific skill sets and capabilities, and intended to produce specific effects or results.
A project and production management tool named after its inventor, engineer Henry L. Gantt. A Gantt chart is a visual representation, usually in the form of a bar chart, depicting the work planned for a project or enterprise. The bars demonstrate work completed to date as well as that planned for each project component or process, including existing and projected timelines.
A measurement of the difference between customer expectations and a business’s existing products or services.
With regard to a product, service, process, system, project, enterprise, or operation, an intended result that conforms to specific and measurable targets (e.g. quality level, speed, efficiency, etc.).
A four-step project planning tool used in Six Sigma business methodology to help Green Belt and Black Belt team leaders ensure productivity, efficiency, and quality. The four steps function as follows:
- Goals – clearly defines the team’s mission and establishes objectives that conform to the “SMART” approach (i.e. goals that are specific, measurable, attainable, relevant, and timely);
- Roles – uses a “roles statement” to define clearly each team member’s function and the interrelationships between individual and team roles, objectives, and processes;
- Processes – identifies and defines processes inherent in and essential to the project (e.g. problem-solving, decision-making, etc.); and
- Interpersonal – ensures open communication between team members, encourages creative and diverse contributions from all members, and discourages “groupthink.”
An employee, usually a team or project leader, who has been formally trained (and may be certified) in Six Sigma methodology. A GB implements Six Sigma principles as a part of his or her duties, but does not concentrate on them exclusively; s/he is responsible for Six Sigma implementation at the level of project organization and management.
A method of cost accounting popular in Germany for the last half-century; some sources translate the phrase literally as “flexible analytic cost planning and accounting”. Sometimes called “marginal costing”, GPK resembles “attribute-based costing“, but takes a “cost-pull” approach (i.e. customer demand drives output, and therefore, costs). It organizes a business’s operations into “cost centers” and evaluates their efficiency based on the difference between each center’s standard costs and actual costs; the resulting data are used to improve productivity and efficiency.
A phenomenon that may appear in businesses, enterprises, or teams, in which the entire membership of the body adheres to the same principles, beliefs, strategies, and approaches. This uniformity usually results from the members’ self-censorship of their own ideas, largely either because their contributions are disregarded or because they fear reprisals for contributing potentially accurate but unwelcome information.
A management model whereby employees are motivated primarily by so-called “primal urges”, rather than by loyalty to the employer or other considerations. The Gwilliam Motivational Model evokes components of other motivational theories, including the “carrot and stick” approach of McGregor’s “Theory X”; the most fundamental needs, physiological needs, in Maslow’s “Hierarchy of Needs”; and the “hygiene needs” or “animal needs” of Herzberg’s “Hygiene/Motivation Theory”.
A device installed on production equipment that automatically ejects the product unit once its operation on that piece of equipment is complete, readying the unit for transfer to the next production stage. Used in chaku-chaku line production, it ensures that the only action required of employees is the loading of each product unit into the equipment at the beginning of its process.
Savings achieved through use of Six Sigma methodology in one of two ways:
- Cost savings, by enabling the enterprise to do the same amount of business with fewer employees, or
- Cost avoidance, by enabling the enterprise to take on additional business without adding additional staff
Named after studies done in the 1920s and 1930s at the Western Electric Hawthorne Works facility in Chicago, the Hawthorne Effect refers to the tendency of employees to perform at higher levels when they know they are being observed or evaluated.
A method of risk assessment that attempts to identify in advance hazards or failure in design and production processes, or sources thereof, in order to prevent their occurrence; generally refers to hazards that constitute risk of serious economic or physical harm.
The theory that a business that does not focus on ensuring quality will engage in wasteful processes and production, leading to inflated costs and decreased profits, largely through production of defective or nonconforming units and resultant customer dissatisfaction.
A visual representation, in the form of a bar graph, of the frequency distribution of a data set; used to identify patterns in variation, including amount, frequency, and types of variation. Each bar represents a particular “class” or category of the data, the width of each bar is identical, and the height of each bar is proportional to the frequency with which the class it represents appears.
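The binning behind a histogram can be shown in a text-mode Python sketch, grouping hypothetical measurements into equal-width classes (a charting library would normally draw the bars):

```python
from collections import Counter

data = [2.1, 2.3, 2.2, 3.4, 3.1, 3.3, 3.2, 4.0, 4.1, 2.8]  # hypothetical
bin_width = 1.0

# Map each value to the index of the equal-width class it falls in.
counts = Counter(int(x // bin_width) for x in data)

for bin_index in sorted(counts):
    low = bin_index * bin_width
    # Bar length is proportional to the frequency of the class.
    print(f"[{low:.1f}, {low + bin_width:.1f}): {'#' * counts[bin_index]}")
```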
The image of a frequency distribution displayed in a histogram (i.e. the “line” of the distribution).
The phenomenon that occurs in statistical analysis when the variances of two separate populations of data, upon comparison or contrast, are found to be equal.
A business management approach that is “process-oriented”. A business becomes “horizontalized” when all of its processes are aligned, particularly with regard to production functions and services.
Translations of this Japanese phrase vary widely; its most accurate expression appears to be “hoshin kanri”, which may be translated approximately as “control of direction” (“ho” – method; “shin” – compass; and “kanri” – control). “Hoshin kanri” is a method of strategic planning that focuses on “vertical alignment” – i.e. ensuring that the business’s vision, objectives, performance standards, and review processes are communicated thoroughly to and understood by all members of an organization, from top to bottom, via the constant repetition of a four-stage process:
- Defining and establishing a specific, limited number of policy and strategic objectives that will further the business’s mission
- Deployment of (i.e. communication and setting into motion of) these objectives throughout all levels of the corporate hierarchy
- Implementation of the objectives and the changes they require, by making them an integral part of daily activities, and
- Review of performance and results
Each of these steps occurs both up and down the corporate hierarchy, in a continuous process of input/refinement/feedback traded among all employees at all levels, known as “catchball”. In the West, the method is often called “hoshin planning” or “QPD” (“quality policy deployment”), although some experts regard “hoshin kanri” as referring to the combination of both long-range hoshin planning and its daily counterpart, known as “nichijo kanri”.
Often used interchangeably with “hoshin kanri” or “QPD”; a process-oriented strategic planning methodology aimed at improving quality and performance. “Hoshin planning” refers to a long-term approach to use of the “hoshin” method – i.e. the identification of a limited number of specific policy objectives and their communication to and implementation by employees at all levels of the organization, each of whom refines and improves the processes used to achieve the objectives through the “hoshin” process of open and continuous feedback among staff at all levels of the organization.
The House of Quality is the first of four matrices in the Quality Function Deployment (QFD) process; it converts customer expectations into “critical to quality” features, compares those features with existing performance, and identifies changes needed to bring performance into alignment with customer expectations. The House of Quality comprises six components:
- Customer requirements, obtained from actual customer feedback
- Technical requirements of the product or service, defined in specific and measurable ways
- A “planning matrix”, usually derived from market research, that includes measuring the relationship between customer preferences and both the company’s performance and competitors’ performance
- An “interrelationship matrix”, which measures perceived relationships between customer requirements and technical requirements
- A “technical correlation matrix”, which identifies correlations or conflicts in technical requirements and highlights opportunities for improvement (also known as the “roof” of the House of Quality)
- A matrix illustrating and measuring technical priorities, benchmarks, and targets or objectives; the results in this final matrix should fulfill the customer requirements outlined in the first component
Considerations relating to the role of humans in the work environment, including capabilities, preferences, limitations, motivations, etc. Human factors are analyzed both for purposes of identifying and preventing errors and for designing a work environment that encourages productivity and performance.
The minimum rate of expected return on investment that a business must achieve on a new product, process, or enterprise in order to be able to afford to undertake that activity; usually equal to the incremental cost of capital.
A controlled variable, sometimes called an “input” or “process” variable, usually expressed as “X”; its value does not depend on any other variable in the data set.
A measure of performance; generally quantitative rather than qualitative, an indicator is usually expressed as a ratio of the frequency with which a given phenomenon actually occurred to the frequency of its opportunities to occur.
A cost that is not directly allocable in full to a specific product or service (e.g. so-called “overhead costs”, a portion of which are allocable to every product or service).
A form of statistical analysis that uses statistics to draw inferences (i.e. logical conclusions based on available data) about a population by evaluating a sample from that data set. Two primary analytical methods are used in inferential statistics: 1) hypothesis testing, in which the sample data are measured to determine whether they support rejecting the null hypothesis; and 2) estimation, used to measure a particular parameter in the sample data and project its confidence interval.
Also known as “soft benefits”; gains that are non-monetary or that cannot be sufficiently quantified for purposes of accounting or other financial reporting, but that contribute to increases in quality, performance, and profit. Examples of intangible benefits may include improved employee morale, heightened customer satisfaction, better vendor relationships, etc.
The act of two or more variables functioning in ways that affect each other, producing a combined result that neither variable could achieve independently of the other.
Recipients of a product, service, etc. from within an organization; may include individual employees and/or entire teams or departments.
A visual depiction of two or more variables that possess cause-and-effect relationships, used to identify the “drivers” of those relationships and their outcomes. Also known as a “relations diagram”.
Also known as “fishbone diagram”, “cause and effect diagram“, or “cause and effect technique”, the Ishikawa diagram is used to analyze cause and effect; named after its inventor Kaoru Ishikawa. Its visual representation resembles the shape of a fish with a central “spine” representing the issue or effect being analyzed, and with smaller branches connected to the spine that represent discrete causes.
A series of “quality management” standards promulgated by the International Organization for Standardization; upon meeting these process standards and passing an audit by an accrediting organization, a business may obtain ISO 9000 “certification.” ISO 9000 standards provide guidelines that help businesses identify target objectives and measure performance in four primary, integrated ways:
- Defining and meeting customer expectations with regard to quality
- Identifying and complying with regulatory requirements
- Enhancing customer satisfaction
- Simultaneously striving for continuous improvement in attempts to meet the previous three standards
A method of choosing a data sample drawn from a larger population based on one’s own judgment, grounded in relevant experience. The sample and the variables included in it are (or should be) selected based on judgments in three primary areas:
- Their value
- Their relative risk
- The extent to which they are representative of the larger population
A “lean manufacturing” process that uses a “continuous improvement” (CI) methodology; it attempts to improve quality and performance by eliminating waste in seven primary areas of the manufacturing process:
- Product defects
- Overproduction
- Waiting
- Transportation
- Excess inventory
- Unnecessary motion
- Overprocessing
In practice, JIT manufacturing produces only those units in only those quantities needed by the next stage of the production process; at the stage of delivery to the customer, it controls inventory by delivering the necessary quantities of specific units only at the point at which they are needed.
Japanese for “incremental and continuous improvement”. In the manufacturing context, “Kaizen” refers to an overall management philosophy by which the business as a whole and each of its individual employees commit to seeking continuous improvement in all aspects of the organization’s enterprise, not merely those related to quality, performance, or profit.
Japanese for “sign” or “signboard”; a term originally used in the production context to refer to a printed card containing an order to complete the next stage of the manufacturing process. A component of “JIT manufacturing”, a “Kanban” is a signaling device used at each stage of production to “pull” each unit to the next stage at the proper time.
A model used to measure how well a product or service meets customer requirements; named after its inventor, Dr. Noriaki Kano. “Kano analysis” is usually expressed visually via a “Kano diagram” (also known as a “3-arrow diagram”), which plots three forms of quality (“must-be quality”, “one-dimensional quality”, and “attractive quality”) across a relation diagram. Kano analysis is based on four core principles:
- Understanding “unspoken” customer expectations is as important to performance as understanding those that are expressed
- With regard to some customer requirements, better product performance leads to greater customer satisfaction (i.e. “one-dimensional quality”)
- With regard to some customer requirements, customer satisfaction does not increase with better performance, but it does decrease if the customer notices defects in performance (i.e. “must-be quality”), while with other requirements, because the additional performance is not expected, customer satisfaction does not decrease with its absence, but does increase with its presence (i.e. “attractive quality”)
- A properly designed survey (i.e. a “Kano method survey”) of customers will identify customer expectations with regard to each form of quality, allowing the business to ensure performance that meets those expectations
One of a set of specific measures of performance, determined in advance, that demonstrates in quantifiable terms how well a business is meeting its performance standards; usually expressed as a ratio of the frequency with which a given phenomenon actually occurred to the frequency of its opportunities to occur.
The period of time required to fulfill a customer’s order completely – i.e. the length of time that elapses between the moment the customer places the order and the moment the completed order is delivered to the customer.
An analytical approach used to identify and eliminate waste (i.e. any aspect of the business that does not add value in the customer’s eyes, and for which the customer would not want to pay) by formulating a strategic plan to eliminate such waste throughout all levels, systems, and processes of the organization.
A management philosophy that evaluates the entire business, or enterprise, as a whole, identifying and eliminating waste at all levels and in all systems and processes in order to enhance quality, performance, and profit. “Lean enterprise” is based on three core principles:
- Satisfy customer expectations by engaging only in “value-added” activities
- Define the “value stream” – i.e. the flow of information and material, and any obstacles thereto, in the entire production process, from order placement to delivery to the customer
- Eliminate waste in all levels and activities of the business
Lean enterprise implements these principles through five core operational processes:
- Identifying “value”
- Transforming the “value stream” in accordance with those principles
- Ensuring linked, consistent, level flow within the entire value stream
- Ensuring response throughout the entire value stream to customer “pull,” or expectations
- Striving for perfection in fulfillment of all principles throughout the entire business
In the manufacturing context, these principles and processes manifest through four functions:
- Production flow
A manufacturing philosophy that conforms to the principles of “lean enterprise“, with the primary focus on elimination of waste. “Lean manufacturing” implements these principles via six primary processes:
- Ensuring zero wait time
- Maintaining zero inventory
- Using a scheduling system based on customer “pull”
- Reducing production batch sizes to enable “batch to flow” processing
- Balancing production lines to ensure consistent flow
- Reducing actual processing time
A component of lean manufacturing; a method of production scheduling designed to reduce excess inventory and waiting periods by “leveling” each stage of the process (i.e. balancing each stage of the production process so that each has an equivalent amount of work, thus reducing bottlenecks and ensuring continuous flow).
Different values (or “settings”) of a factor that must be considered in measuring its performance.
Sometimes used interchangeably with “life cycle management” (LCM) or “enterprise life cycle management” (ELCM), life cycle methodology is a business approach used to create a product, process, system, or enterprise by assessing the processes involved at two simultaneous levels:
- As a series of consecutive stages
- As the whole comprising those stages
This approach is used to identify potential or actual gaps in performance, and to improve performance, quality, and profitability at every stage of the product’s or process’s “life cycle”.
The set of consecutive stages in an entire process. In the manufacturing context, they manifest in two ways:
- In project management (e.g. identification of a need, conception and design of a product or process to fill it, production, and implementation)
- In production (e.g. order placement, all production stages in chronological order, purchase, delivery, and use)
A type of semiconductor device; when electric current passes through it, it emits either visible or infrared light. LEDs are frequently used as “indicator” lights on electronic equipment and products (e.g. computers, stereo components, watches, wireless electronics, etc.).
A setting in the controller of a component or piece of equipment that establishes the outside parameters (i.e. the high and low limits) of any signal produced.
A visual representation used to depict performance by connecting data points with a line. A line chart measures data in simple quantitative terms – i.e. without incorporating the effects of either process capabilities or control limits; frequently used to determine changes in quantity or rate over time and to identify trends.
Adapted from military terminology; a business role, filled by an individual or team who has “front-line” contact with customers, such as sales staff, used to enhance performance. An employee in the position of “listening post” gathers data on customer wants, needs, preferences, and expectations, and conveys that data in useable format to a person or unit in the business responsible for disseminating the data appropriately throughout all levels of the organization.
A research method used to determine the reasons an individual customer or a class of customers has withdrawn its business, whether by switching to a competing firm or by ceasing to use a given product or service entirely. Such analysis usually includes surveys of “lost” customers, and is used to improve performance by identifying and neutralizing sources of potential or actual customer dissatisfaction.
A collection of a specific quantity of like units derived from a common source; usually expressed as “(lot size = N)”. The units in a lot contain similar or identical features, and are submitted for a stage of production, inspection, acceptance, or delivery in a group.
The probability that a product, process, or system can be maintained (i.e. serviced or repaired) within a defined period of time under conditions of ordinary use.
Named after former Secretary of Commerce Malcolm Baldrige, this award was created by an act of Congress in 1987 and is administered by the National Institute of Standards and Technology. It is presented annually to American companies (two companies per year in each of five business categories) for the following purposes:
- To recognize their quality and productivity improvements
- To establish best practices for quality improvement
- To encourage other companies to commit to quality improvement
- To aid and educate companies about the importance of quality improvement and assist them in designing and implementing their own programs
An expert in Six Sigma philosophy and methodology who has generally undergone testing and certification. An MBB serves as a leader of a company’s Six Sigma program and is responsible for strategic implementation of programs and for training and mentoring Black Belts and Green Belts in Six Sigma methodologies.
A visual representation that depicts two or more data sets and illustrates the strength and/or the direction of any relationships between them. Five types of matrix diagrams exist: C, L, T, X, and Y matrix diagrams. The name of each is denoted by a letter resembling the shape of the matrix the diagram produces; most common are L and T matrices. Matrix diagrams are used to explore the relationships between data sets to identify trends, isolate potential problems, and improve performance.
The average value of the data points in a data set. The mean is calculated by adding the values of each individual data point and dividing the sum by the total number of individual data points.
The middle value (i.e. the exact middle point) of a data set’s distribution; 50% of the values fall above the median, and 50% fall below it.
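The two measures above can be sketched with Python’s standard `statistics` module (the data set is purely illustrative):

```python
import statistics

data = [2, 3, 5, 7, 100]  # hypothetical data set with one extreme value

mean = statistics.mean(data)      # (2 + 3 + 5 + 7 + 100) / 5 = 23.4
median = statistics.median(data)  # middle value of the sorted data = 5

# The single extreme value pulls the mean well above the median.
print(mean, median)  # 23.4 5
```

The comparison illustrates why the median is often preferred when a distribution contains outliers.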
Standards of measurement applied to specific processes, systems, operations, or enterprises to assess performance.
A vocation or reason for existence; a collection of objectives that, if met, fulfill an organization’s purpose. Usually expressed in a company’s “mission statement”, which generally outlines its objectives, business philosophy, quality and performance standards, and ethics policies.
Also known as a “mixed effects model”; an analytical model that contains a “mix” of both fixed and random effects.
The value that occurs most frequently in the distribution of a data set. In some distributions, more than one value may occur most frequently (i.e. with the same number of occurrences); such a distribution is known as a “bimodal distribution“.
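The mode (and the bimodal case) can be sketched with Python’s standard `statistics` module, using illustrative data sets:

```python
import statistics

mode = statistics.mode([1, 2, 2, 3, 4])        # 2 occurs most often
modes = statistics.multimode([1, 1, 2, 2, 3])  # bimodal: both 1 and 2 occur twice

print(mode, modes)  # 2 [1, 2]
```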
Japanese for “futility” or “uselessness”; in the business or production context, generally translated as “waste“, and applied to any activity, process, or system that consumes resources but adds no value from the customer’s perspective.
A visual representation of patterns or “families” of variation within and between points in a data set and/or over a period of time.
Forms of accounting that use “new”, or nontraditional costing methods capable of tracking and analyzing more complex financial data than the older standard profit-and-loss methods (e.g. “budgeted hourly rate”, or “BHR” cost systems). Examples of “new cost systems” include activity-based cost systems, attribute-based cost systems, resource consumption accounting, and Grenzplankostenrechnung costing; among others.
Usually arises in the design process; an estimated value assigned to the process that reflects or approximates its target value. Six Sigma methodology calculates deviation from the nominal in determining quality levels.
A structured method, similar to brainstorming, used to promote creativity and diversity in problem-solving and to reach rapid consensus on solutions. Group participants silently generate and record ideas, which are then collected by the group’s facilitator and presented for open collective discussion without identifying each idea’s author.
A traditional system of accounting that is a form of “indirect” costing. Where actual cost figures are unavailable at the moment the information is needed (e.g. as with overhead costs), costs are estimated based on average costs and historical and other relevant data.
Also known as a “Gaussian distribution”; a statistical representation of the frequency with which values occur in a data set, and in which the mean and median tend to be close or identical. The visual depiction of a normal distribution is the classic “bell curve”: The value that occurs most frequently falls at the middle point of the chart (the apex of the bell); the remaining values appear on either side of the middle point in order of decreasing frequency, creating the “tails” of the bell shape.
The hypothesis that no difference or variation exists between two or more data populations – i.e. that any observed difference is either random or a result of sampling error. Sometimes expressed as “H0”.
The degree of sustained deviation of the controlled or process variable from the set point.
A standard or accepted meaning of a given value that includes a specific, precise description of the value, how it is calculated or derived, and how it is measured.
The cost incurred by choosing one option instead of another – i.e. the unobtained value of the alternative not chosen.
The process of modifying or adjusting a process, system, or operation to create the best possible average performance, while simultaneously minimizing variation, nonconformity, or defects.
To arrange in planned, strategic, systematic format, frequently in related groups or categories, to facilitate access, use, or performance.
A measurable and quantifiable consequence, effect, or result of a process, operation, or enterprise; also used to refer to production quality or performance targets or objectives.
Tools (“measures”) used to assess and quantify the results of a given activity (e.g. number of units produced).
An equipment module or device that conveys a signal to the final segment or “control element” of a process, thus initiating (or preventing) any change in order to ensure a product (i.e. “output”) that conforms to expected or intended standards.
A measure of manufacturing equipment’s productivity and efficiency, based on three primary parameters:
- Availability – the proportion of planned production time during which the equipment actually runs
- Performance – the equipment’s actual output rate compared to its ideal output rate
- Quality – the proportion of output that is conforming
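Assuming the standard formulation in which the three parameters (availability, performance, and quality) multiply together, a minimal sketch with hypothetical figures:

```python
# Overall effectiveness as commonly computed:
# OEE = availability * performance * quality (all figures below are illustrative).
availability = 0.90  # run time / planned production time
performance  = 0.95  # actual output rate / ideal output rate
quality      = 0.99  # conforming units / total units produced

oee = availability * performance * quality
print(round(oee, 3))  # 0.846, i.e. roughly 84.6% effectiveness
```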
Also known as a “percent chart”; a visual representation of the percentage of defective units in a population sample. A “P chart” is frequently used as a tool to track and analyze quality and performance in production processes.
Sometimes called “paired samples”; two sets of related (or corresponding) observations, with each set deriving from a separate population or sample.
A model or archetype for a given concept that is generally accepted as defining the concept’s parameters, content, processes, or other elements, and as providing a way of thinking about, analyzing, or applying the concept.
A significant change in what is generally accepted as a model or archetype for a concept, and/or a significant change in generally accepted methods of thinking about, analyzing, or applying such a model; usually arises because relevant new data are discovered or existing data become obsolete, altering fundamental assumptions, theories, and conclusions.
Sometimes called “period expense”; a cost or expense incurred in a given period that is not traceable directly to production processes or to the products created.
Term coined in 1961 by Japanese manufacturing expert Shigeo Shingo from the Japanese words for “inadvertent mistake” (“poka”) and “prevention” (“yoke”); generally translated as “mistake-proofing” or “error-proofing”. In the manufacturing context, “poka-yoke” is the initial step in “error-proofing” design or production processes by means of a signaling device that indicates whether the process is in its operable state. Its purpose is to make errors either readily identifiable and easily correctable, or to prevent them entirely.
A specific collection or set of data, or a collection of units (i.e. a “lot“), from which samples may be drawn and analyzed.
Sometimes used as a synonym for “consistency” or “repeatability”; when measured repeatedly, the degree of lack of variation in:
- A unit or process, or
- Among identical units in a lot (or samples in a population).
A step or series of steps taken to alter conditions in a way that will avert process error or prevent production of defective or nonconforming units.
The likelihood that a particular event will occur; expressed quantitatively between “0” (no chance of occurrence) and “1” (absolute certainty of occurrence). Sometimes known as “long-term (or long-run) relative frequency”.
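The “long-run relative frequency” interpretation can be illustrated with a short simulation (a sketch only; the fair-coin model and sample size are arbitrary choices):

```python
import random

random.seed(0)  # fixed seed so the illustration is reproducible

# The fraction of "heads" in many fair coin flips converges toward
# the theoretical probability of 0.5 as the number of flips grows.
flips = [random.random() < 0.5 for _ in range(100_000)]
frequency = sum(flips) / len(flips)

# Any probability, and any relative frequency, lies between 0 and 1.
print(0 <= frequency <= 1)  # True
```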
A series of defined, sequential steps or tasks that, when performed in order using specific, defined inputs, will produce a specific output or result.
Sometimes expressed as “Cp”; the entire range of a process’s ability to function as intended, within defined limits of variation (i.e. the rate at which it can produce units that are free of defects or nonconformities). Process capability is used to measure whether a process’s performance can meet customer requirements.
Sometimes expressed as “Cpk”; a tool used to measure how closely a process performs to specifications, within defined limits of variation; used to determine how close a process is to performance targets and the level of consistency of its average performance.
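The two indices above are commonly computed as Cp = (USL − LSL) / 6σ and Cpk = min(USL − μ, μ − LSL) / 3σ; a sketch using hypothetical specification limits and process statistics:

```python
def cp(usl, lsl, sigma):
    """Process capability: specification width relative to the 6-sigma process spread."""
    return (usl - lsl) / (6 * sigma)

def cpk(usl, lsl, mean, sigma):
    """Capability index: like Cp, but penalizes a process mean that is off-center."""
    return min(usl - mean, mean - lsl) / (3 * sigma)

# Hypothetical process: spec limits 4 to 16, process mean 11, sigma 2
print(cp(16, 4, 2))              # 1.0  (spec width exactly matches 6 sigma)
print(round(cpk(16, 4, 11, 2), 3))  # 0.833 (lowered because the mean is off-center)
```

A Cpk below Cp signals that centering the process, not just reducing its spread, would improve conformance.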
A method of objective comparison of two separate processes or sets of process conditions to evaluate which performs better – e.g. comparing process conditions existing in production of nonconforming output with the process conditions involved in production of conforming output.
A method or tool used to ensure quality and/or performance by keeping the steps of the process within defined limits, thus minimizing variation in output.
Sometimes called a “process measurable”; a quantitative measure of performance that assesses how a specific process affects customer expectations, thus permitting modification of the process to enhance performance.
A method of analyzing and classifying processes, usually by means of a flowchart (i.e. the “process map”) as either value-producing or waste-producing; used to establish cycle times and identify waste and loss, thus permitting modification of the process to eliminate waste and enhance value and performance.
The practice of administering and modifying (perhaps continuously) a process or collection of processes in defined, measurable, documentable ways to improve quality and performance and to ensure that customer expectations are met. Sometimes called “re-engineering” or “business process quality management.”
The entire collection of all costs, both direct and indirect, incurred in producing a particular product or service.
A group of products that derive from the same “product platform” (i.e. are related in a defined way, such as having in common specific components), and/or that undergo similar or identical production processes using the same equipment.
A measure of efficiency, expressed as a ratio; the rate of a production process’s output compared to the rate of its input over a defined period of time.
An intended outcome or effect to be achieved through a specific action or series of actions; a function or a reason for existence.
A guarantee that quality levels will conform to specific, defined standards that meet customer expectations; administered via systematic compliance with a formal set of guidelines and procedures that are designed to define quality levels, establish appropriate systems and processes for conformance, and measure results.
A formal, in-depth review of quality control and quality assurance processes, as well as other processes and systems, designed to measure whether such processes and/or their outputs meet established minimum standards of quality. Usually conducted by an independent entity.
Small teams or groups within a company, project, or enterprise that meet regularly and work together to study quality control and quality assurance issues and apply solutions that will improve their individual performance, as well as that of their project and/or the business as a whole. In Japan, known as “Quality control circles.”
A process or set of processes designed to:
- Define established standards of quality that will meet customer expectations
- Assess whether products, services, processes, or systems conform to those standards, and
- Identify any gaps, nonconformities, or failures to meet those standards
One of the three “prongs” in the “Quality Trilogy”.
Quality function deployment (QFD) is a structured method that focuses on identifying customers’ needs and expectations. These are then translated into specific characteristics and specifications used to build and deliver products that meet those needs.
An organized, comprehensive, systematic approach to enhancing quality levels throughout all levels and divisions of an organization. One of the three “prongs” in the “Quality Trilogy”.
A framework incorporating generally accepted standards of quality and performance (e.g. ISO 9000), used to ensure that a business’s performance and quality levels meet or exceed those standards and consistently strive for improvement. May incorporate a three-pronged system known as the “Quality Trilogy,” comprising:
- Quality control
- Quality improvement
- Quality planning
The systematic design and implementation of processes and systems that will produce products or services that adhere to established quality standards and customer expectations. One of the three “prongs” in the “Quality Trilogy”, quality planning focuses on preventing nonconformities and defects while striving for continuous improvement.
A form of control chart that illustrates the upper and lower statistical limits of a process’s stability. “Events” in the process are organized into weighted classifications and counted; the sum of the weighted data constitutes the process’s “quality score”.
A method, technique, or device used to aid in “process management” efforts to improve quality and performance.
A three-pronged management method used to ensure systematic improvement in quality levels, comprising:
- Quality planning
- Quality control
- Quality improvement
A minimum number of officers (e.g. executive-level staff or board members) that must be present at a meeting in order to transact official business.
A sample (or set of samples) chosen from a population of data via a technique that ensures that each sample (or combination thereof) has an equal chance of being selected (i.e. “Random Sampling”).
The difference between the highest and lowest values in the dispersion or “spread” of a given data set. “Range” is the simplest statistical measure of dispersion; usually used as a supplement to other statistical measures, such as standard deviation.
A method of statistical analysis used to predict a relationship between a dependent variable and one or more independent variables; regression is used to predict the future value of the dependent variable based on the significance of the historical relationship(s) between it and the independent variable(s).
The degree of probability that a product or process will perform its intended function or fulfill its intended purpose, under specified conditions and for a designated length of time, without defects, failures, or nonconformities.
Systematic, scientific investigation, marked by critical analysis, intended to test a theory, reach a conclusion, or evaluate and apply new data.
A residual occurs in regression analysis, and represents the difference between the actual output value and the output value predicted by the regression model. Also known as “errors,” residuals can help identify weaknesses or defects in a regression model.
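A minimal illustration of residuals from an ordinary least-squares line fit, written in plain Python (the data points are hypothetical):

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a + b*x, via the closed-form solution."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = my - b * mx
    return a, b

xs = [1, 2, 3, 4]
ys = [2.1, 3.9, 6.2, 7.8]  # hypothetical observations, roughly y = 2x
a, b = fit_line(xs, ys)

# Residual = actual value minus the value the fitted model predicts.
residuals = [y - (a + b * x) for x, y in zip(xs, ys)]
print(round(b, 2))  # 1.94
```

For an OLS fit with an intercept, the residuals sum to (approximately) zero; large individual residuals flag points the model explains poorly.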
A measurement of the degree to which an experimental design evaluates interactions, or combinations, that are free of confounding effects. The greater the number of interactions or combinations, the lower the rate of confounding effects. For example, because a full factorial design evaluates all possible combinations, the rate of confounding effects is zero, giving such a design a “complete” resolution.
A method of accounting that analyzes and tracks resources instead of activities; used to measure and manage capacity, to map specific resources to specific processes and activities, and to define “pools” or “families” of resources.
A reaction, result, or effect of a cause, process, or set of factors. To assess whether a given outcome is a response to a particular factor or set of factors, that outcome must be measurable.
The condition of being accountable for the products or processes (or the segments of a product or process) over which one has personal power or authority, including liability for ensuring proper function, performance, and results. Responsibility assumes that one possesses a required minimum level of expertise and capability; it also assumes that one will accept the consequences if the product or process does not meet required standards.
A method of analysis that seeks to identify and study the fundamental reason(s) – i.e. the “root causes” – of a product’s or process’s nonconformance, defect, or failure, thus permitting modification or elimination of that cause.
A visual representation used to measure process performance over a defined period of time, and to identify performance patterns or trends; depicted by plotting data points drawn from the process population across the chart and connecting them with a line.
A subset drawn from a larger population of data; for purposes of analysis, the units in a sample are assumed to be representative of those in the entire population with regard to the characteristics to be analyzed.
Applies to computerized processes and functions; the length of time required for the system’s processor to accept and read all inputs, execute its “control program” and any other necessary functions, and update all outputs accordingly.
Also known as a “scatter plot” or “scattergram”; one of the “Seven Tools of Quality.” A scatter plot is a visual representation of the relationship between two variables, depicted by plotting the data points across the chart in a “scatter” effect; the closer the data points, the stronger the relationship.
A tool used to evaluate the degree to which a business’s performance meets customer expectations; usually takes the form of a customer survey or questionnaire, or is compiled from the results of such a survey or questionnaire.
A formal agreement (i.e. a contract) between a service provider and a customer, defining the parameters of the service(s) to be provided (including such elements as quality, speed, etc.) and guaranteeing that those parameters will be met; may be used between a business and an external customer or between one division of that business and an internal customer (i.e. another division of the same business). An SLA focuses on ensuring the requirements are thoroughly defined in three specific areas:
The name of the Greek letter corresponding to the Roman “S”; in statistical analysis, a symbol, written as a lower-case sigma (σ), representing a population’s standard deviation. When used in the business management context, it functions as a measure of process capability; a correlation exists between the standard deviation from a process’s mean and the probable number of defects per million opportunities.
A fundamental tool of “lean production”, used to increase the “up-time” of manufacturing equipment. SMED is a method of reducing “changeover” time – i.e. the time spent switching a piece of equipment from one process to another or from one unit (or lot) to another. Its name derives from the method’s target changeover time: The amount of time that elapses between production of the last conforming unit of Lot A and the first conforming unit of Lot B must be less than ten minutes – i.e. a single-digit number of minutes (hence “single-minute”).
A data-driven business management philosophy designed to increase profitability by reducing process variation, eliminating waste, and increasing quality to levels that consistently meet or exceed customer expectations. The name “Six Sigma” derives from the statistical concept of a process whose specification limits lie six standard deviations (“sigma”) from the mean – a level of quality corresponding to no more than 3.4 defects per million opportunities (DPMO); a quality level of no more than 3.4 DPMO is the objective of businesses that use Six Sigma methodology.
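The DPMO figure is computed as defects divided by total defect opportunities, scaled to one million; a sketch with hypothetical counts:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities (DPMO)."""
    return defects * 1_000_000 / (units * opportunities_per_unit)

# Hypothetical inspection: 17 defects found across 5,000 units,
# each unit offering 10 distinct opportunities for a defect.
print(dpmo(17, 5000, 10))  # 340.0
```

A process at the Six Sigma objective would score no more than 3.4 on this measure.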
Terms used to describe the level of expertise attained by a Six Sigma-trained professional. There are four “belt” levels:
- Yellow Belt (YB) – the lowest level of Six Sigma expertise; applies to a professional who has a basic working knowledge and who may manage smaller process improvement projects, but who does not function as a project or team leader;
- Green Belt (GB) – in many organizations, Six Sigma’s “entry level”; a Six Sigma-trained (and sometimes certified) professional who does not work on Six Sigma projects exclusively, but whose duties include leading projects and teams and implementing Six Sigma methodology at the project level;
- Black Belt (BB) – a Six Sigma-trained professional who has usually completed an examination and been certified in its methods, and whose job duties include implementation of Six Sigma methodology throughout all levels of the business, leading teams and projects, and providing Six Sigma training and mentoring to Green and Yellow Belts; and
- Master Black Belt (MBB) – the highest level of Six Sigma expertise; an MBB’s duties involve all aspects of implementing Six Sigma, including statistical analysis, strategic and policy planning and implementation, and training and mentoring of Black Belts.
A guarantee that software quality levels will conform to specific, defined standards that meet customer expectations; administered via systematic compliance with a formal set of guidelines and procedures that are designed to define quality levels, establish appropriate systems and processes for conformance, and measure results. In the context of SQA, these guidelines and procedures are followed throughout the processes of choosing, acquiring, installing, and using software.
A problem-solving tool, usually documented in writing, used to evaluate potential solutions to a particular problem and to select the best option. A comprehensive “solution statement” might include a description of the problem; identify potential areas for resolution; list detailed descriptions of potential solutions; evaluate the relative effectiveness of each potential solution; provide a detailed explanation of the solution chosen and the reasons it was chosen; and offer possible ways to prevent the problem’s recurrence.
A statistical measurement of a data set’s dispersion (or variation in distribution); it calculates the data spread in relation to the mean. Standard deviation is the most common method of measuring variation; often represented by the Greek letter for “sigma“. It is also one of the primary components of the measurement system that forms the basis of Six Sigma methodology.
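A quick sketch of population standard deviation using Python’s standard `statistics` module (the data set is illustrative):

```python
import statistics

data = [4, 6, 8, 10, 12]  # hypothetical data set with mean 8

# Population standard deviation (sigma): the spread of the data around the mean.
# Here the squared deviations are 16, 4, 0, 4, 16, so the variance is 40 / 5 = 8
# and sigma is the square root of 8.
sigma = statistics.pstdev(data)
print(round(sigma, 4))  # 2.8284
```

`statistics.stdev` would give the sample standard deviation instead, dividing by n − 1 rather than n.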
A mode of thinking that includes both logical and analytical reasoning: It evaluates the “whole” of a problem, as well as its component parts, and attempts to assess the effects on the whole of changing one or more variables; and it attempts to understand not only problems and solutions but the processes involved. Some experts define “statistical thinking” to include skills of problem identification; process reasoning; questioning of premises, conclusions, and data; problem-solving; and explanation of the problem, the solution(s), and the processes involved.
A tool used to evaluate, monitor, and control process capability and performance through the application of statistical methods of data analysis; sometimes called “statistical quality control.” SPC is administered through the use of process control charts, in which each data point is compared in statistical terms both with previous data points and with the entire distribution; this helps plot existing or potential patterns, trends, and shifts in processes, and helps to identify whether such changes derive from special causes or common causes.
A formal, systematic, documented approach to developing an organization’s basic policies, mission, and core processes; usually includes such components as a mission statement (or statement of purpose), core objectives, and a “call to action” that itemizes the resources, processes, and systems needed to fulfill its mission and objectives.
A collection of variables in a data set whose spread forms a pattern that demonstrates one or more predictable tendencies (i.e. is “systematic”).
Named after the method’s pioneer, Genichi Taguchi; sometimes called “robust design.” An approach to quality control derived from engineering methods, the Taguchi method is based on the fundamental principle that quality should be ensured at the design stage of a product or process, rather than added subsequently as a result of nonconformities caught at inspection or other stages of production (i.e. “inspected into” it). The method attempts:
- To identify all of the variables that will materially affect the outcome of a process
- To select relevant combinations of those variables and examine their potential effects
- To design the process to neutralize or eliminate adverse effects (e.g. nonconformities, defects, or failures) of those variables before the process is launched
Takt time is derived from the German word “Takt,” meaning “beat” or “measure” (i.e. the rhythm a conductor sets to control the music’s speed and tempo); a fundamental component of lean manufacturing. Takt time is the rate at which units must be completed in order to meet, but not exceed, customer demand.
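Under the common formulation takt time = available production time ÷ customer demand, a sketch with hypothetical shift figures:

```python
# Hypothetical shift: 8 hours minus two 15-minute breaks, and a customer
# demand of 450 units per shift.
available_minutes = 8 * 60 - 2 * 15  # 450 minutes of production time
demand = 450                         # units required per shift

takt_time = available_minutes / demand
print(takt_time)  # 1.0 minute per unit
```

Producing faster than one unit per minute would overproduce; slower would miss demand.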
Two or more individuals, formally or informally organized into a group for the purpose of working together to fulfill a particular mission or purpose and/or to complete a project, operation, or task.
An individual who works with one or more individuals on a specific task, project, process, or enterprise for the purpose of collaboratively completing that project successfully. A team member is generally selected because s/he brings specific experience and/or expertise to the project. In a business that uses Six Sigma methodologies, members of a team are led by a Green Belt, Black Belt, or Master Black Belt, who is responsible for ensuring that the team implements Six Sigma principles in its work.
A business management approach grounded in assumptions that every business’s ability to fulfill its objectives is limited by one or more constraints, and that such constraints can be identified and neutralized or eliminated, enhancing profitability; uses five tools and five steps to improve business processes.
The five tools are:
- The “current reality tree”, a logic diagram that attempts to identify root causes by tapping the knowledge and experience of others
- The “evaporating cloud”, a conflict-resolution logic diagram that attempts to identify the conflict that is the source of the constraint
- The “future reality tree”, a logic diagram that evaluates a potential solution by attempting to identify any missing elements prior to implementation
- The “prerequisite tree”, a logic diagram that identifies steps needed to lay the groundwork for implementing a successful solution
- The “transition tree”, a logic diagram that determines the steps, actions, and/or resources needed to move from the current environment to one that will permit implementation of the solution
The five steps are:
- “Identify” the constraint
- “Exploit” the constraint, generally by attempting to neutralize, modify, or use the constraint in a productive way that avoids the need for expensive systemic changes
- “Subordinate” unrelated parts or processes of the system to the process of correcting the problem of the constraint – e.g. by adjusting or modifying those parts or processes to function in ways that will neutralize the limiting qualities of the constraint and help it to function productively
- “Elevate” the constraint – i.e. if steps 2 and 3 are not successful, take any necessary steps, including systemic changes, to eliminate the constraint
- Return to step 1 and begin the process again as needed, but beware “inertia” (i.e. complacency that permits other constraints to arise and/or to continue)
A method of accounting that counts as product costs only the unit-level costs of a product or service; it classifies the costs of all other resources (i.e. “committed” costs) as operating expenses, allocated as “period costs”. The “throughput” is calculated by subtracting all unit-level costs from sales revenue, which helps to identify both bottlenecks and products that are profitable, thus permitting the business to redirect its resources accordingly.
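A minimal arithmetic sketch of the throughput calculation described above (all figures are hypothetical):

```python
# Throughput = sales revenue minus unit-level costs; all other ("committed")
# costs are treated as operating expense for the period.
revenue_per_unit = 50.0
unit_level_cost = 30.0        # e.g. materials consumed per unit
units_sold = 1_000
operating_expense = 12_000.0  # committed costs allocated to the period

throughput = (revenue_per_unit - unit_level_cost) * units_sold
net_profit = throughput - operating_expense
print(throughput, net_profit)  # 20000.0 8000.0
```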
A performance management tool used to track a “work item” (e.g. a product, component, operation, system, process, or employee) throughout an entire work process to determine its performance at each stage of the process. A “time value map” sorts the work item’s progress into three categories:
The range of values between highest and lowest limits of deviation under which a product or process will still fulfill its proper function and conform to customer expectations.
A manufacturing approach that seeks to ensure that the production process is not interrupted by maintenance stoppages. TPM engages in “preemptive maintenance” of production equipment – i.e. where “common causes” of problems and outages have been identified, performing regularly scheduled service on the equipment, rather than waiting until such a problem occurs – to attempt to ensure that all production equipment functions at a 100% performance rate at all times.
A management approach that seeks to improve quality and performance, and thus meet or exceed customer expectations, by integrating all quality-related functions and processes throughout the organization; takes a holistic approach to managing quality design and development, quality control and maintenance, quality improvement, and quality assurance, at all levels and involving all employees.
A business management philosophy that includes a commitment to continuous improvement in quality and performance, strives to enhance profitability by focusing on meeting and exceeding customer expectations, and works to involve all employees, departments, and levels of an organization in the process of continuous improvement.
So named because its shape resembles a tree; a visual representation headed by a problem, project, or task, which is subsequently broken into its constituent elements; as it is subdivided further, each component or segment usually decreases in size and increases in specificity. A tree chart is generally used to make a complex problem or project more manageable and more easily performed or solved, and/or to help generate more creative, diverse options, approaches, and solutions.
A score assigned to a variable in the context of a given measurement; for purposes of quality management, “value” is also the worth of the sum total of all aspects of a product or service purchased by the customer.
The entire series, or “stream,” of activities, operations, and processes that constitute the production process, from order placement to delivery to the customer; includes both those activities that add value and those that do not.
Typically labeled a “paper and pencil” tool, although it may be constructed digitally; designed to create two separate visual representations (i.e. “maps”) of a value stream. The first map illustrates how data and resources move through the value stream during the production process, and is used to identify waste, defects, and failures; the second, built from data contained in the first, illustrates a “future state” of the same value stream with any waste, defects, and failures eliminated. Together, the two maps are used to create detailed strategic and implementation plans to enhance the value stream’s performance.
Those aspects of a product, service, or process that contribute to its collective worth from the customer’s perspective (especially worth above and beyond the customer’s general expectations). In the management context, anything “value-added” is any element of a process, system, operation, or enterprise that contributes to the quality and/or performance (as defined from the customer’s perspective) of its output.
A business management philosophy based on maximizing value consistently in all aspects of the business (with an emphasis on maximizing value as defined by shareholders). VBM comprises three elements:
- “Value creation,” or design, planning, and implementation of methods and processes to maximize value
- “Value management,” or implementation of leadership, structure, policies, strategies, and processes that maximize value
- “Value measurement,” or assessment of value levels and whether performance reaches value objectives
Among other methods of maximizing value, VBM strives to ensure consistency in company mission and philosophy, including vision, strategies, and culture; structure and leadership, including organization and governance; and processes and functions, including day-to-day management, decision-making, performance, and personnel issues.
A method of accounting that allocates only variable manufacturing costs (e.g. product materials, labor, and variable overhead) as per-unit product costs; fixed overhead costs are allocated as a period expense.
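The per-unit distinction can be sketched in a few lines; all figures below are hypothetical, chosen only to contrast variable costing with the absorption approach defined earlier:

```python
# Illustrative sketch of variable costing vs. absorption costing.
# All names and numbers are hypothetical assumptions.

units_produced = 1_000
direct_materials = 5.00       # per unit
direct_labor = 3.00           # per unit
variable_overhead = 2.00      # per unit
fixed_overhead_total = 4_000  # total for the period

# Variable costing: only variable manufacturing costs become per-unit costs.
variable_cost_per_unit = direct_materials + direct_labor + variable_overhead
# Fixed overhead is expensed for the period, not attached to units.
period_expense = fixed_overhead_total

# Absorption costing (for contrast): fixed overhead is averaged into each unit.
absorption_cost_per_unit = variable_cost_per_unit + fixed_overhead_total / units_produced

print(variable_cost_per_unit)    # 10.0
print(absorption_cost_per_unit)  # 14.0
```

The difference matters for income measurement: under variable costing the $4,000 of fixed overhead hits the income statement immediately, whereas under absorption costing it travels with the units into inventory.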
The scope of an organization’s ideals, objectives, and purposes, designed to express its reason for existence and intended approach.
A documented expression of an organization’s ideals, objectives, and purposes, sometimes incorporated into its “mission statement”.
A basic component of lean manufacturing; a production environment in which all employees are able to see “at a glance” the enterprise, operation, system, or project’s current status and function. Such “visuals” include signs, charts, graphs, diagrams, and other images and tools that convey information necessary to ensure that quality and performance meet established standards.
The needs, wants, expectations, and preferences, both spoken and unspoken, of the people who constitute the business itself (e.g. shareholders, officers, or others involved in corporate governance).
The needs, wants, expectations, and preferences, both spoken and unspoken, of a business’s customers, whether internal or external. VOC data are obtained through market research, customer surveys, etc., and are used to modify products and processes to meet or exceed customer expectations.
The needs, wants, expectations, and preferences, both spoken and unspoken, of the business’s personnel (i.e. employees).
Anything generated by any activity or process that consumes resources but does not add value to the product, service, or process being produced.
A planning method that prioritizes tasks, issues, concepts, etc. by assigning to each item a point value that reflects its relative importance.
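The mechanics reduce to summing each item’s points across all participants and ranking by total; a minimal sketch, with hypothetical items and point values:

```python
# Minimal sketch of weighted voting: each participant distributes points
# across candidate items; items are ranked by total points received.
# The ballots and point values below are hypothetical.

from collections import Counter

def weighted_vote(ballots):
    """Sum each item's points across all ballots; rank highest first."""
    totals = Counter()
    for ballot in ballots:       # ballot: {item: points}
        totals.update(ballot)
    return totals.most_common()  # [(item, total_points), ...]

ballots = [
    {"reduce defects": 5, "cut lead time": 3, "train staff": 2},
    {"reduce defects": 4, "cut lead time": 4, "train staff": 2},
    {"cut lead time": 6, "train staff": 4},
]
print(weighted_vote(ballots))
# [('cut lead time', 13), ('reduce defects', 9), ('train staff', 8)]
```

In practice each participant is usually given a fixed pool of points (e.g. 10) to force genuine trade-offs among items.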
A type of “detection system” used to measure differential pressure (“d/p”) in a steam drum by means of a condensing tank. The high-pressure side of the d/p cell connects to the tank’s vapor space, where the steam condenses and fills the “wet leg” with water, while the low-pressure side detects the liquid inside the drum; the output of the d/p cell should reflect the amount of water in the drum.
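The relationship can be sketched with basic hydrostatics: the wet leg holds a constant column of condensate on the high-pressure side, while the drum-side head varies with water level, so the measured differential falls as level rises. The densities and heights below are hypothetical, and the steam-space density is ignored for simplicity:

```python
# Hedged sketch of wet-leg level measurement: d/p = (wet-leg head) - (drum head).
# Values are hypothetical; steam density above the water is neglected.

RHO_WATER = 1000.0  # kg/m^3, condensate density (simplified)
G = 9.81            # m/s^2

def dp_cell_output(level_m, wet_leg_height_m=2.0):
    """Differential pressure (Pa) seen by the d/p cell for a given drum level."""
    hp = RHO_WATER * G * wet_leg_height_m  # high-pressure side: constant wet-leg column
    lp = RHO_WATER * G * level_m           # low-pressure side: drum water column
    return hp - lp

# d/p is largest with an empty drum and reaches zero when the drum level
# equals the wet-leg height, so the signal varies inversely with level.
print(dp_cell_output(0.0))
print(dp_cell_output(2.0))
```

Note the inverse relationship: instrumentation is typically ranged so that the transmitter output is rescaled into an increasing level indication.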
A collection of resources (e.g. materials, equipment, personnel, etc.), usually arranged in proximity and sequence, grouped together for the purpose of producing a “family” of related products; each cell is headed by a team leader who coordinates resources and work flow to ensure that productivity, quality, and performance standards are met.
Sometimes called a “natural team”; individuals from the same “work unit” who are grouped together into a team for the purpose of collaborating on a specific project, task, or enterprise. Work team members are likely to share skill sets and expertise, and assignment is likely to be permanent, at least for the duration of the project in question.
The sequential structure of related, interdependent actions or events by which all of the elements in a given process occur. May be distilled to visual format by means of a chart, diagram, or graph.
A term often used in the information technology context; the metaphorical label for the flow of information – i.e. the fundamental element in any process that ensures proper function and performance.
First promulgated by the United Nations in 1990 to raise international awareness of the important role quality plays in ensuring nations’ prosperity; now celebrated annually on the second Thursday of November.
A type of “combined” control chart that integrates data for both means and ranges of subgroups into one image; used to determine whether a process’s center and level of variability are “in control” (i.e. remain constant over a designated period of time).
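The control limits for such a chart are computed from the grand mean and the average subgroup range using published constants; a minimal sketch for subgroups of size 5 (A2 = 0.577, D3 = 0, D4 = 2.114), with hypothetical sample data:

```python
# Minimal sketch of X-bar and R control limits for subgroups of size 5.
# A2, D3, D4 are the standard published constants for n = 5;
# the subgroup measurements below are hypothetical.

A2, D3, D4 = 0.577, 0.0, 2.114

def xbar_r_limits(subgroups):
    """Return ((LCL, CL, UCL) for the X-bar chart, (LCL, CL, UCL) for the R chart)."""
    means = [sum(s) / len(s) for s in subgroups]
    ranges = [max(s) - min(s) for s in subgroups]
    xbb = sum(means) / len(means)     # grand mean: center line of the X-bar chart
    rbar = sum(ranges) / len(ranges)  # average range: center line of the R chart
    return ((xbb - A2 * rbar, xbb, xbb + A2 * rbar),
            (D3 * rbar, rbar, D4 * rbar))

subgroups = [[10, 12, 11, 13, 12], [11, 11, 12, 12, 13], [9, 12, 11, 10, 12]]
x_limits, r_limits = xbar_r_limits(subgroups)
```

A subgroup mean outside the X-bar limits signals a shift in the process center; a subgroup range outside the R limits signals a change in variability.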