Sample records for statistical process control

  1. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled "Statistical Process Controls for Blood Establishments." The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  2. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    PubMed

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. Time-series measurement is sensitive to statistical change arising from process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process, according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time measurements approach.

  3. A case study: application of statistical process control tool for determining process capability and sigma level.

    PubMed

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory documents, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Sciences Authority, Singapore, Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessments, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to areas where the application of quality improvement and quality risk assessment principles can achieve six sigma-capable processes. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical process control study of the process. Interpretation of such a study provides information about stability, process variability, changing trends, and quantification of process ability against defective production. Comparative evaluation of critical quality attributes by Pareto charts identifies the least capable and most variable process as the one liable for improvement. Statistical process control thus proves to be an important tool for six sigma-capable process development and continuous quality improvement.
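
    For context on the "sigma level" language in this record, here is a minimal sketch of how process capability indices and an approximate sigma level are commonly computed from sample data. The function name, specification limits, and tablet-weight figures are hypothetical illustrations, not values from the study.

```python
import numpy as np

def capability_indices(samples, lsl, usl):
    """Estimate Cp, Cpk, and an approximate sigma level from sample data."""
    mu = np.mean(samples)
    sigma = np.std(samples, ddof=1)
    cp = (usl - lsl) / (6 * sigma)               # potential capability (spread only)
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)  # actual capability, penalizes off-center processes
    sigma_level = 3 * cpk                        # short-term sigma level; the conventional
                                                 # long-term figure adds a 1.5-sigma shift
    return cp, cpk, sigma_level

# Hypothetical tablet weights (mg) against a 295-305 mg specification
weights = np.random.default_rng(1).normal(300.4, 1.2, 200)
print(capability_indices(weights, lsl=295, usl=305))
```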

  4. Applying Statistical Process Control to Clinical Data: An Illustration.

    ERIC Educational Resources Information Center

    Pfadt, Al; And Others

    1992-01-01

    Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…

  5. Using Statistical Process Control to Enhance Student Progression

    ERIC Educational Resources Information Center

    Hanna, Mark D.; Raichura, Nilesh; Bernardes, Ednilson

    2012-01-01

    Public interest in educational outcomes has markedly increased in the most recent decade; however, quality management and statistical process control have not deeply penetrated the management of academic institutions. This paper presents results of an attempt to use Statistical Process Control (SPC) to identify a key impediment to continuous…

  6. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  7. [Statistical process control applied to intensity modulated radiotherapy pretreatment controls with portal dosimetry].

    PubMed

    Villani, N; Gérard, K; Marchesi, V; Huger, S; François, P; Noël, A

    2010-06-01

    The first purpose of this study was to illustrate the contribution of statistical process control to better security in intensity modulated radiotherapy (IMRT) treatments. This improvement is possible by controlling the dose delivery process, characterized by pretreatment quality control results. It is therefore necessary to bring portal dosimetry measurements under control (the ionisation chamber measurements were already monitored with statistical process control tools). The second objective was to determine whether portal dosimetry could be substituted for the ionisation chamber in order to optimize the time devoted to pretreatment quality control. At the Alexis-Vautrin center, pretreatment quality controls in IMRT for prostate and head and neck treatments were performed for each beam of each patient. These controls were made with an ionisation chamber, which is the reference detector for absolute dose measurement, and with portal dosimetry for the verification of dose distribution. Statistical process control is a statistical analysis method, originating in industry, used to control and improve the quality of the process under study. It uses graphic tools, such as control charts, to follow up the process and warn the operator in case of failure, and quantitative tools to evaluate the ability of the process to respect guidelines: the capability study. The study was performed on 450 head and neck beams and on 100 prostate beams. Control charts of the mean and standard deviation, showing both slow, weak drifts and strong, fast ones, were established and revealed an introduced special cause (a manual shift of the leaf gap of the multileaf collimator). The correlation between the dose measured at one point by the EPID and by the ionisation chamber was evaluated at more than 97%, and cases of disagreement between the two measurements were identified. The study demonstrated the feasibility of reducing the time devoted to pretreatment controls by substituting portal dosimetry (EPID) measurements for those of the ionisation chamber, and showed that statistical process control monitoring of the data provided a guarantee of security. 2010 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.

  8. Quality Space and Launch Requirements, Addendum to AS9100C

    DTIC Science & Technology

    2015-05-08

    8.9.1 Statistical Process Control (SPC) ... SMC: Space and Missile Systems Center; SME: Subject Matter Expert; SOW: Statement of Work; SPC: Statistical Process Control; SPO: System Program Office; SRP: ... occur without any individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved

  9. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…

  10. Improving the Document Development Process: Integrating Relational Data and Statistical Process Control.

    ERIC Educational Resources Information Center

    Miller, John

    1994-01-01

    Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)

  11. Statistical process control methods allow the analysis and improvement of anesthesia care.

    PubMed

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels, according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics we recorded adverse events in 18.3%; mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems with the following results: Inadequate plexus anesthesia: stable process, but unacceptably high failure rate; Difficult emergence: unstable process, because of quality improvement efforts; Intubation difficulties: stable process, rate acceptable; Medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine if a process is stable, whether an intervention is required, and if quality improvement efforts have the desired effect.
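
    Since this record rests on p-charts of adverse-event rates, a minimal sketch of the standard 3-sigma p-chart limit calculation follows. The function name and the monthly counts are invented for illustration; they are not data from the study.

```python
import numpy as np

def p_chart_limits(event_counts, sample_sizes):
    """Center line and 3-sigma control limits for a p-chart of event proportions."""
    counts = np.asarray(event_counts, dtype=float)
    sizes = np.asarray(sample_sizes, dtype=float)
    p_bar = counts.sum() / sizes.sum()           # overall event proportion (center line)
    se = np.sqrt(p_bar * (1 - p_bar) / sizes)    # per-subgroup standard error
    ucl = p_bar + 3 * se
    lcl = np.clip(p_bar - 3 * se, 0, None)       # a proportion cannot fall below zero
    return p_bar, lcl, ucl

# Hypothetical monthly adverse events and anesthetic case counts
events = [110, 95, 130, 102, 88, 140]
cases = [600, 550, 700, 580, 520, 710]
print(p_chart_limits(events, cases))
```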

  12. Quality Space and Launch Requirements Addendum to AS9100C

    DTIC Science & Technology

    2015-03-05

    8.9.1 Statistical Process Control (SPC) ... 8.9.1.1 Out of Control ... Systems Center; SME: Subject Matter Expert; SOW: Statement of Work; SPC: Statistical Process Control; SPO: System Program Office; SRP: Standard Repair ... individual data exceeding the control limits. Control limits are developed using standard statistical methods or other approved techniques and are based on

  13. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  14. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  15. Improving Instruction Using Statistical Process Control.

    ERIC Educational Resources Information Center

    Higgins, Ronald C.; Messer, George H.

    1990-01-01

    Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)

  16. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    ERIC Educational Resources Information Center

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  17. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    ERIC Educational Resources Information Center

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  18. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, a small frequency of failures, and cost reduction in a production process. However, some points about their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies of the design of control charts consider just the economic aspect, while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with general failure distributions. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, along with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
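
    For reference, the conventional Shewhart X-bar-S limits that an economic statistical design of this kind tunes are, in standard notation:

```latex
\bar{X}\ \text{chart:}\quad UCL = \bar{\bar{X}} + A_3\,\bar{S}, \qquad LCL = \bar{\bar{X}} - A_3\,\bar{S}
\\
S\ \text{chart:}\quad UCL = B_4\,\bar{S}, \qquad LCL = B_3\,\bar{S}
```

    where A3, B3, and B4 are standard tabulated constants that depend on the subgroup size n; an economic statistical design then chooses design parameters such as the subgroup size and sampling interval to minimize expected cost subject to statistical constraints.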

  19. Using Paper Helicopters to Teach Statistical Process Control

    ERIC Educational Resources Information Center

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  20. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.

  1. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort in which the Software Engineering Institute and the Space Shuttle Onboard Software Project experimented with applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  2. Statistical Process Control: Going to the Limit for Quality.

    ERIC Educational Resources Information Center

    Training, 1987

    1987-01-01

    Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)

  3. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
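
    As a small illustration of the kind of chart interpretation such a system automates, here is a sketch of two classic out-of-control tests: a point beyond 3 sigma, and a run of 8 consecutive points on one side of the center line. The rule selection and function name are illustrative assumptions, not the AISC prototype's actual rule set.

```python
import numpy as np

def flag_points(values, center, sigma):
    """Flag two classic Shewhart out-of-control patterns."""
    z = (np.asarray(values, dtype=float) - center) / sigma
    beyond_3_sigma = np.abs(z) > 3               # rule: single point outside 3-sigma limits
    run_of_8 = np.zeros_like(beyond_3_sigma)     # rule: 8 in a row on one side of center
    side = np.sign(z)
    for i in range(7, len(z)):
        window = side[i - 7:i + 1]
        if window[0] != 0 and np.all(window == window[0]):
            run_of_8[i] = True
    return beyond_3_sigma, run_of_8

rule1, rule2 = flag_points(
    [9.8, 10.4, 13.6, 10.1, 10.2, 10.3, 10.5, 10.2, 10.1, 10.4],
    center=10.0, sigma=1.0)
print(rule1.any(), rule2.any())  # both patterns appear in this toy series
```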

  4. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  5. Statistical process control for residential treated wood

    Treesearch

    Patricia K. Lebow; Timothy M. Young; Stan Lebow

    2017-01-01

    This paper is the first stage of a study that attempts to improve the process of manufacturing treated lumber through the use of statistical process control (SPC). Analysis of industrial and auditing agency data sets revealed there are differences between the industry and agency probability density functions (pdf) for normalized retention data. Resampling of batches of...

  6. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  7. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery. This assessment was performed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region. Quarterly clinical audit meetings were held to analyse all cases of severe postpartum haemorrhage after vaginal delivery and provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limits, that is, it has remained in statistical control. The proportion of cases managed consistently with the guidelines increased for all of the main components of care. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery was reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  8. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures, dividing observed by expected values. Two charts were defined: acceptable and tolerable daily nurse workload intensity. Appropriate staffing indicators were defined as those exceeding predefined rates within acceptable and tolerable limits (50 percent and 80 percent, respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take patient severity into account, since crude bed occupancy was used. Two statistical process control charts and particular staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance of, and the need for, process evaluations and monitoring. The methods presented for monitoring daily staffing appropriateness are simple to implement, either for intra-ward day-to-day variation, using nurse workload capacity statistical process control charts, or for inter-ward evaluation, using a standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for evaluating daily staffing appropriateness, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. The method is not limited to crude measures; rather, it can use weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).

  9. Statistical process control: separating signal from noise in emergency department operations.

    PubMed

    Pimentel, Laura; Barrueto, Fermin

    2015-05-01

    Statistical process control (SPC) is a visually appealing and statistically rigorous methodology very suitable to the analysis of emergency department (ED) operations. We demonstrate that the control chart is the primary tool of SPC; it is constructed by plotting data measuring the key quality indicators of operational processes in rationally ordered subgroups such as units of time. Control limits are calculated using formulas reflecting the variation in the data points from one another and from the mean. SPC allows managers to determine whether operational processes are controlled and predictable. We review why the moving range chart is most appropriate for use in the complex ED milieu, how to apply SPC to ED operations, and how to determine when performance improvement is needed. SPC is an excellent tool for operational analysis and quality improvement for these reasons: 1) control charts make large data sets intuitively coherent by integrating statistical and visual descriptions; 2) SPC provides analysis of process stability and capability rather than simple comparison with a benchmark; 3) SPC allows distinction between special cause variation (signal), indicating an unstable process requiring action, and common cause variation (noise), reflecting a stable process; and 4) SPC keeps the focus of quality improvement on process rather than individual performance. Because data have no meaning apart from their context, and every process generates information that can be used to improve it, we contend that SPC should be seriously considered for driving quality improvement in emergency medicine. Copyright © 2015 Elsevier Inc. All rights reserved.
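
    A minimal sketch of the individuals/moving-range (XmR) computation behind the chart the authors recommend: the constants 2.66 (3/d2 for n = 2) and 3.267 (D4 for n = 2) are the standard SPC factors, while the function name and the sample times are hypothetical.

```python
import numpy as np

def xmr_limits(x):
    """Limits for individuals (X) and moving-range (mR) charts from one data stream."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x))      # moving ranges between successive points
    mr_bar = mr.mean()
    x_bar = x.mean()
    x_limits = (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar)   # natural process limits
    mr_limits = (0.0, 3.267 * mr_bar)                           # range chart limits
    return x_limits, mr_limits

# Hypothetical daily median door-to-provider times (minutes)
print(xmr_limits([34, 29, 41, 38, 30, 44, 36, 33, 40, 37]))
```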

  10. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  11. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    ERIC Educational Resources Information Center

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  12. Reducing lumber thickness variation using real-time statistical process control

    Treesearch

    Thomas M. Young; Brian H. Bond; Jan Wiedenbeck

    2002-01-01

    A technology feasibility study for reducing lumber thickness variation was conducted from April 2001 until March 2002 at two sawmills located in the southern U.S. A real-time statistical process control (SPC) system was developed that featured Wonderware human machine interface technology (HMI) with distributed real-time control charts for all sawing centers and...

  13. Statistical process control in Deep Space Network operation

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).

  14. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
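
    For readers unfamiliar with the CUSUM machinery this report adapts, here is a generic tabular CUSUM sketch, not the authors' person-fit statistics themselves. The allowance k and decision interval h shown are common textbook defaults, assumed purely for illustration.

```python
import numpy as np

def tabular_cusum(x, target, k=0.5, h=5.0):
    """Return indices where the upper or lower tabular CUSUM crosses the decision interval.

    k (allowance) and h (decision interval) are in the same units as x;
    with standardized data they are in sigma units."""
    c_plus = c_minus = 0.0
    alarms = []
    for i, xi in enumerate(np.asarray(x, dtype=float)):
        c_plus = max(0.0, c_plus + (xi - target) - k)    # accumulates upward drift
        c_minus = max(0.0, c_minus + (target - xi) - k)  # accumulates downward drift
        if c_plus > h or c_minus > h:
            alarms.append(i)
    return alarms

# A simulated 1.5-sigma upward shift after observation 30 triggers alarms
rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.5, 1, 20)])
print(tabular_cusum(x, target=0.0))
```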

  15. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  16. Multivariate statistical process control of a continuous pharmaceutical twin-screw granulation and fluid bed drying process.

    PubMed

    Silva, A F; Sarraguça, M C; Fonteyne, M; Vercruysse, J; De Leersnyder, F; Vanhoorne, V; Bostijn, N; Verstraeten, M; Vervaet, C; Remon, J P; De Beer, T; Lopes, J A

    2017-08-07

    A multivariate statistical process control (MSPC) strategy was developed for the monitoring of the ConsiGma™-25 continuous tablet manufacturing line. Thirty-five logged variables encompassing three major units, a twin-screw high-shear granulator, a fluid bed dryer and a product control unit, were used to monitor the process. The MSPC strategy was based on principal component analysis of data acquired under normal operating conditions using a series of four process runs. Runs with imposed disturbances in the dryer air flow and temperature, in the granulator barrel temperature, speed and liquid mass flow, and in the powder dosing unit mass flow were utilized to evaluate the model's monitoring performance. The impact of the imposed deviations on process continuity was also evaluated using Hotelling's T² and Q residuals statistics control charts. The influence of the individual process variables was assessed by analyzing contribution plots at specific time points. Results show that the imposed disturbances were all detected in both control charts. Overall, the MSPC strategy was successfully developed and applied. Additionally, deviations not associated with the imposed changes were detected, mainly in the granulator barrel temperature control. Copyright © 2017 Elsevier B.V. All rights reserved.
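
    A compact sketch of the PCA-based monitoring statistics named in this record, Hotelling's T² and the Q (squared prediction error) residual, using scikit-learn. The data shapes, component count, and random placeholder data are assumptions for illustration, not the ConsiGma-25 variables.

```python
import numpy as np
from sklearn.decomposition import PCA

# Fit PCA on data collected under normal operating conditions
# (rows = time points, columns = logged process variables).
rng = np.random.default_rng(0)
X_noc = rng.normal(size=(200, 35))   # placeholder for the NOC training runs
pca = PCA(n_components=5).fit(X_noc)

def t2_and_q(pca, x_new):
    """Hotelling's T2 (variation within the model plane) and Q/SPE
    (distance from the model plane) for new observations."""
    scores = pca.transform(x_new)
    t2 = np.sum(scores**2 / pca.explained_variance_, axis=1)
    residual = x_new - pca.mean_ - scores @ pca.components_
    q = np.sum(residual**2, axis=1)
    return t2, q

print(t2_and_q(pca, rng.normal(size=(3, 35))))
```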

  17. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  18. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  19. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as the efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  20. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeon, clinical area or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The application of Statistical Process Control methods offers the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.

  1. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool termed Statistical Process Control in Proteomics (SProCoP) has been developed which implements aspects of SPC (e.g., control charts and Pareto analysis) in the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real-time evaluation of chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution) and mass spectrometric performance (targeted peptide ion intensity, and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise from systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts for evaluating proteomic experiments is illustrated in two case studies.
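
    To illustrate the Pareto-analysis step this record describes (guiding the user toward the most variable metrics), here is a small sketch that orders QC metrics by violation count with cumulative shares, the usual Pareto-chart quantities. The metric names and tallies are invented and are not SProCoP output.

```python
import numpy as np

def pareto_of_violations(violation_counts):
    """Order metrics by control-limit violations, largest first, with cumulative shares."""
    items = sorted(violation_counts.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(count for _, count in items)
    cum_share = np.cumsum([count for _, count in items]) / total
    return [(name, count, round(float(share), 3))
            for (name, count), share in zip(items, cum_share)]

# Hypothetical violation tallies per QC metric
print(pareto_of_violations({"retention time": 12, "peak asymmetry": 5,
                            "mass accuracy": 2, "ion intensity": 9}))
```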

  2. Improving Service Delivery in a County Health Department WIC Clinic: An Application of Statistical Process Control Techniques

    PubMed Central

    Boe, Debra Thingstad; Parsons, Helen

    2009-01-01

    Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964

  3. Implementing Lean Six Sigma to achieve inventory control in supply chain management

    NASA Astrophysics Data System (ADS)

    Hong, Chen

    2017-11-01

    The inventory cost has an important impact on the production cost. In order to maximize the circulation of enterprise funds with minimum inventory cost, inventory control with Lean Six Sigma is presented in the context of supply chain management. The inventory includes both raw material and semi-finished parts in the manufacturing process. Although inventory in general is often studied, inventory control within the manufacturing process is seldom addressed. This paper reports on inventory control from the perspective of the manufacturing process, using statistical techniques including DMAIC, control charts, and statistical process control. Process stability is evaluated and process capability is verified with the Lean Six Sigma philosophy. A demonstration in power meter production shows that inventory decreased from 25% to 0.4%, which indicates that inventory control can be achieved with the Lean Six Sigma philosophy and that inventory cost in production can be reduced for sustainable development of the supply chain.

  4. Langley Wind Tunnel Data Quality Assurance-Check Standard Results

    NASA Technical Reports Server (NTRS)

    Hemsch, Michael J.; Grubb, John P.; Krieger, William B.; Cler, Daniel L.

    2000-01-01

    A framework for statistical evaluation, control, and improvement of wind tunnel measurement processes is presented. The methodology is adapted from elements of the Measurement Assurance Plans developed by the National Bureau of Standards (now the National Institute of Standards and Technology) for standards and calibration laboratories. The present methodology is based on the notions of statistical quality control (SQC) together with check standard testing and a small number of customer repeat-run sets. The results of check standard and customer repeat-run sets are analyzed using the statistical control chart methods of Walter A. Shewhart, long familiar to the SQC community. Control chart results are presented for various measurement processes in five facilities at Langley Research Center. The processes include test section calibration, force and moment measurements with a balance, and instrument calibration.

  5. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    ERIC Educational Resources Information Center

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  6. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification, and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with the current SPC practices, one can efficiently and more robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients, and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic that this research can be applicable to any wind tunnel check standard testing program.

  7. Using Statistical Process Control to Drive Improvement in Neonatal Care: A Practical Introduction to Control Charts.

    PubMed

    Gupta, Munish; Kaplan, Heather C

    2017-09-01

    Quality improvement (QI) is based on measuring performance over time, and variation in data measured over time must be understood to guide change and make optimal improvements. Common cause variation is natural variation owing to factors inherent to any process; special cause variation is unnatural variation owing to external factors. Statistical process control methods, and particularly control charts, are robust tools for understanding data over time and identifying common and special cause variation. This review provides a practical introduction to the use of control charts in health care QI, with a focus on neonatology. Copyright © 2017 Elsevier Inc. All rights reserved.
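
    The common/special cause distinction drawn in this review rests on the usual Shewhart 3-sigma limits, in standard notation:

```latex
UCL = \bar{x} + 3\hat{\sigma}, \qquad CL = \bar{x}, \qquad LCL = \bar{x} - 3\hat{\sigma}
```

    Points that stay inside the limits without systematic patterns are treated as common cause variation; points beyond the limits, or patterned runs, signal special causes worth investigating.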

  8. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).

  9. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the security of treatments. They also showed that the dose delivery processes in the cancer center were in control for prostate and head-and-neck treatments. In parallel, long-term process performance indices (P(p), P(pk), and P(pm)) have been analyzed. Their analysis helped defining which actions should be undertaken in order to improve the performance of the process. The prostate dose delivery process has been shown statistically capable (0.08% of the results is expected to be outside the clinical tolerances) contrary to the head-and-neck dose delivery process (5.76% of the results are expected to be outside the clinical tolerances).
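
    For reference, the long-term performance indices analyzed in both versions of this study are conventionally defined as follows, with s the overall (long-term) standard deviation and T the target value:

```latex
P_p = \frac{USL - LSL}{6s}, \qquad
P_{pk} = \frac{\min(USL - \bar{x},\ \bar{x} - LSL)}{3s}, \qquad
P_{pm} = \frac{USL - LSL}{6\sqrt{s^2 + (\bar{x} - T)^2}}
```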

  10. Pre-Statistical Process Control: Making Numbers Count! JobLink Winning at Work Instructor's Manual, Module 3.

    ERIC Educational Resources Information Center

    Coast Community Coll. District, Costa Mesa, CA.

    This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…

  11. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  12. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

    ERIC Educational Resources Information Center

    Hill, Stephen E.; Schvaneveldt, Shane J.

    2011-01-01

    This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…

  13. Statistical Process Control. Impact and Opportunities for Ohio.

    ERIC Educational Resources Information Center

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  14. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    ERIC Educational Resources Information Center

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  15. Root Cause Analysis of Quality Defects Using HPLC-MS Fingerprint Knowledgebase for Batch-to-batch Quality Control of Herbal Drugs.

    PubMed

    Yan, Binjun; Fang, Zhonghua; Shen, Lijuan; Qu, Haibin

    2015-01-01

    The batch-to-batch quality consistency of herbal drugs has always been an important issue. The aim was to propose a methodology for batch-to-batch quality control based on HPLC-MS fingerprints and a process knowledgebase. The extraction process of Compound E-jiao Oral Liquid was taken as a case study. After establishing the HPLC-MS fingerprint analysis method, the fingerprints of the extract solutions produced under normal and abnormal operation conditions were obtained. Multivariate statistical models were built for fault detection, and a discriminant analysis model was built using the probabilistic discriminant partial-least-squares method for fault diagnosis. Based on multivariate statistical analysis, process knowledge was acquired and the cause-effect relationship between process deviations and quality defects was revealed. The quality defects were detected successfully by multivariate statistical control charts, and the types of process deviations were diagnosed correctly by discriminant analysis. This work has demonstrated the benefits of combining HPLC-MS fingerprints, process knowledge and multivariate analysis for the quality control of herbal drugs. Copyright © 2015 John Wiley & Sons, Ltd.

  16. Statistical process management: An essential element of quality improvement

    NASA Astrophysics Data System (ADS)

    Buckner, M. R.

    Successful quality improvement requires a balanced program involving the three elements that control quality: organization, people, and technology. The focus of the SPC/SPM User's Group is to advance the technology component of Total Quality by networking within the Group and by providing outreach within Westinghouse to foster the appropriate use of statistical techniques to achieve Total Quality. SPM encompasses the disciplines by which a process is measured against its intrinsic design capability, in the face of measurement noise and other obscuring variability. SPM tools facilitate decisions about the process that generated the data. SPM deals typically with manufacturing processes, but with some flexibility of definition and technique it accommodates many administrative processes as well. The techniques of SPM are those of Statistical Process Control, Statistical Quality Control, Measurement Control, and Experimental Design. In addition, techniques such as job and task analysis and concurrent engineering are important elements of systematic planning and analysis that are needed early in the design process to ensure success. The SPC/SPM User's Group is endeavoring to achieve its objectives by sharing successes that have occurred within members' own Westinghouse departments as well as within other US and foreign industry. In addition, failures are reviewed to establish lessons learned in order to improve future applications. In broader terms, the Group is interested in making SPM the accepted way of doing business within Westinghouse.

  17. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. Copyright © 2011 Wiley Periodicals, Inc.

  18. Statistical Process Control: A Quality Tool for a Venous Thromboembolic Disease Registry.

    PubMed

    Posadas-Martinez, Maria Lourdes; Rojas, Liliana Paloma; Vazquez, Fernando Javier; De Quiros, Fernan Bernaldo; Waisman, Gabriel Dario; Giunta, Diego Hernan

    2016-01-01

We aim to describe statistical process control as a quality tool for the Institutional Registry of Venous Thromboembolic Disease (IRTD), a registry developed in a community-care tertiary hospital in Buenos Aires, Argentina. The IRTD is a prospective cohort. The process of data acquisition began with the creation of a computerized alert generated whenever physicians requested an imaging or laboratory study to diagnose venous thromboembolism, which defined eligible patients. The process then followed a structured methodology for patient inclusion, evaluation, and posterior data entry. To control this process, process performance indicators were designed to be measured monthly. These included the number of eligible patients, the number of included patients, the median time to patient evaluation, and the percentage of patients lost to evaluation. Control charts were graphed for each indicator. The registry was evaluated over 93 months, during which 25,757 patients were reported and 6,798 patients met inclusion criteria. The median time to evaluation was 20 hours (SD, 12) and 7.7% of the total was lost to evaluation. Each indicator presented trends over time, caused by structural changes and improvement cycles, and the center line therefore showed inflections. Statistical process control through process performance indicators allowed us to control the performance of the registry over time and to detect systematic problems. We postulate that this approach could be reproduced for other clinical registries.

  19. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC), including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor degreasing process.

  20. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    NASA Astrophysics Data System (ADS)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  1. Detecting Anomalies in Process Control Networks

    NASA Astrophysics Data System (ADS)

    Rrushi, Julian; Kang, Kyoung-Don

    This paper presents the estimation-inspection algorithm, a statistical algorithm for anomaly detection in process control networks. The algorithm determines if the payload of a network packet that is about to be processed by a control system is normal or abnormal based on the effect that the packet will have on a variable stored in control system memory. The estimation part of the algorithm uses logistic regression integrated with maximum likelihood estimation in an inductive machine learning process to estimate a series of statistical parameters; these parameters are used in conjunction with logistic regression formulas to form a probability mass function for each variable stored in control system memory. The inspection part of the algorithm uses the probability mass functions to estimate the normalcy probability of a specific value that a network packet writes to a variable. Experimental results demonstrate that the algorithm is very effective at detecting anomalies in process control networks.
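
For illustration, a minimal sketch of the estimation side of such an approach — not the authors' exact estimation-inspection algorithm — is a logistic regression fitted by maximum likelihood that assigns a normalcy probability to a value written to a single control-system variable. The data, features, and labels below are all hypothetical.

```python
# Minimal sketch (not the paper's exact algorithm): score how "normal"
# a value written to a control-system variable is, using a logistic
# regression fitted by maximum likelihood on labelled historical writes.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training data: observed writes to one memory variable,
# labelled normal (1) or abnormal (0) from historical traffic.
values = np.concatenate([rng.normal(50, 3, 500),    # normal setpoints
                         rng.uniform(0, 120, 50)])  # anomalous writes
labels = np.concatenate([np.ones(500), np.zeros(50)])

# A quadratic feature lets the model carve out a "normal band" of values.
X = np.column_stack([values, (values - values.mean()) ** 2])
model = LogisticRegression().fit(X, labels)  # fitted via maximum likelihood

def normalcy_probability(v: float) -> float:
    x = np.array([[v, (v - values.mean()) ** 2]])
    return model.predict_proba(x)[0, 1]

for v in (51.0, 95.0):
    print(f"value {v}: P(normal) = {normalcy_probability(v):.3f}")
```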

  2. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control.

    PubMed

    Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello

    2016-01-01

The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks, using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results showed that, although the process was in control, an out-of-control situation occurred around the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was nevertheless acceptable according to AAPM TG-148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system.
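
As a hedged illustration of the individual value X-chart mentioned above, with entirely made-up daily output readings, the control limits can be derived from the average moving range:

```python
# A minimal individuals (X) chart sketch with limits from the average
# moving range, as commonly used for daily output constancy data.
# The numbers below are illustrative, not the study's measurements.
import numpy as np

outputs = np.array([1.002, 0.998, 1.001, 0.999, 1.003,
                    1.000, 0.997, 1.004, 1.001, 0.996])  # daily output ratios

center = outputs.mean()
mr_bar = np.abs(np.diff(outputs)).mean()   # average moving range
sigma_hat = mr_bar / 1.128                 # d2 = 1.128 for subgroups of 2
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for day, x in enumerate(outputs, 1):
    flag = "out of control" if not (lcl <= x <= ucl) else "in control"
    print(f"day {day:2d}: {x:.3f}  ({flag})")
print(f"CL={center:.4f}, LCL={lcl:.4f}, UCL={ucl:.4f}")
```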

  3. Evaluation of extreme temperature events in northern Spain based on process control charts

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.

  4. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  5. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  6. 21 CFR 820.250 - Statistical techniques.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Statistical techniques. 820.250 Section 820.250...) MEDICAL DEVICES QUALITY SYSTEM REGULATION Statistical Techniques § 820.250 Statistical techniques. (a... statistical techniques required for establishing, controlling, and verifying the acceptability of process...

  7. Indicator organisms in meat and poultry slaughter operations: their potential use in process control and the role of emerging technologies.

    PubMed

    Saini, Parmesh K; Marks, Harry M; Dreyfuss, Moshe S; Evans, Peter; Cook, L Victor; Dessai, Uday

    2011-08-01

    Measuring commonly occurring, nonpathogenic organisms on poultry products may be used for designing statistical process control systems that could result in reductions of pathogen levels. The extent of pathogen level reduction that could be obtained from actions resulting from monitoring these measurements over time depends upon the degree of understanding cause-effect relationships between processing variables, selected output variables, and pathogens. For such measurements to be effective for controlling or improving processing to some capability level within the statistical process control context, sufficiently frequent measurements would be needed to help identify processing deficiencies. Ultimately the correct balance of sampling and resources is determined by those characteristics of deficient processing that are important to identify. We recommend strategies that emphasize flexibility, depending upon sampling objectives. Coupling the measurement of levels of indicator organisms with practical emerging technologies and suitable on-site platforms that decrease the time between sample collections and interpreting results would enhance monitoring process control.

  8. Performance analysis of different tuning rules for an isothermal CSTR using integrated EPC and SPC

    NASA Astrophysics Data System (ADS)

    Roslan, A. H.; Karim, S. F. Abd; Hamzah, N.

    2018-03-01

This paper demonstrates the integration of Engineering Process Control (EPC) and Statistical Process Control (SPC) for the control of product concentration in an isothermal CSTR. The objectives of this study are to evaluate the performance of the Ziegler-Nichols (Z-N), Direct Synthesis (DS), and Internal Model Control (IMC) tuning methods and to determine the most effective method for this process. The simulation model was obtained from past literature and reconstructed in MATLAB/Simulink to evaluate the process response. Additionally, the process stability, capability, and normality were analyzed using Process Capability Sixpack reports in Minitab. Based on the results, DS displays the best response, having the smallest rise time, settling time, overshoot, undershoot, Integral Time Absolute Error (ITAE), and Integral Square Error (ISE). The statistical analysis also identifies DS as the best tuning method, as it exhibits the highest process stability and capability.
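
The two error integrals used in the comparison are straightforward to compute from a simulated response; the sketch below uses a synthetic closed-loop trajectory, not the paper's Simulink model:

```python
# Hedged sketch: computing ISE and ITAE from a simulated step response.
# The response here is synthetic; in the study it comes from the
# Simulink model of the isothermal CSTR.
import numpy as np
from scipy.integrate import trapezoid

t = np.linspace(0, 50, 501)                 # time grid (s)
setpoint = 1.0
# Hypothetical closed-loop response of product concentration:
y = 1 - np.exp(-0.3 * t) * np.cos(0.5 * t)
e = setpoint - y                            # control error

ise = trapezoid(e**2, t)                    # Integral Square Error
itae = trapezoid(t * np.abs(e), t)          # Integral Time Absolute Error
print(f"ISE = {ise:.4f}, ITAE = {itae:.4f}")
```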

  9. [Monitoring method of extraction process for Schisandrae Chinensis Fructus based on near infrared spectroscopy and multivariate statistical process control].

    PubMed

    Xu, Min; Zhang, Lei; Yue, Hong-Shui; Pang, Hong-Wei; Ye, Zheng-Liang; Ding, Li

    2017-10-01

To establish an on-line monitoring method for the extraction process of Schisandrae Chinensis Fructus, a constituent medicinal material of Yiqi Fumai lyophilized injection, near infrared spectroscopy was combined with multivariate data analysis. The multivariate statistical process control (MSPC) model was established from 5 normal production batches, and 2 test batches were monitored using PC-score, DModX, and Hotelling T2 control charts. The results showed that the MSPC model had a good monitoring ability for the extraction process. Applying the MSPC model to the actual production process can effectively achieve on-line monitoring of the extraction of Schisandrae Chinensis Fructus and reflect changes in material properties during production in real time. The established process monitoring method could serve as a reference for the application of process analytical technology in the process quality control of traditional Chinese medicine injections. Copyright© by the Chinese Pharmaceutical Association.
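
A minimal sketch of the Hotelling T2 part of such MSPC monitoring, using simulated spectra in place of the NIR data and a chi-square approximation for the control limit:

```python
# Minimal Hotelling T^2 chart on PCA scores, in the spirit of the MSPC
# monitoring described above; the "spectra" here are simulated.
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import chi2

rng = np.random.default_rng(1)
normal_spectra = rng.normal(0, 1, (50, 200))            # 50 reference spectra
test_spectra = np.vstack([rng.normal(0, 1, (3, 200)),
                          rng.normal(0.8, 1, (2, 200))])  # last 2 deviate

pca = PCA(n_components=3).fit(normal_spectra)
scores = pca.transform(test_spectra)
lam = pca.explained_variance_               # variance of each PC score

t2 = np.sum(scores**2 / lam, axis=1)        # Hotelling T^2 statistic
limit = chi2.ppf(0.99, df=3)                # approximate 99% control limit
for i, v in enumerate(t2, 1):
    print(f"test sample {i}: T2={v:.2f} {'ALARM' if v > limit else 'ok'}")
```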

  10. Batch Statistical Process Monitoring Approach to a Cocrystallization Process.

    PubMed

    Sarraguça, Mafalda C; Ribeiro, Paulo R S; Dos Santos, Adenilson O; Lopes, João A

    2015-12-01

Cocrystals are defined as crystalline structures composed of two or more compounds that are solid at room temperature, held together by noncovalent bonds. Their main advantages are increased solubility, bioavailability, permeability, and stability, while retaining the bioactivity of the active pharmaceutical ingredient. The cocrystallization of furosemide and nicotinamide by solvent evaporation was monitored on-line using near-infrared spectroscopy (NIRS) as a process analytical technology tool. The near-infrared spectra were analyzed using principal component analysis. Batch statistical process monitoring was used to create control charts to perceive the process trajectory and define control limits. Normal and non-normal operating condition batches were performed and monitored with NIRS. The use of NIRS associated with batch statistical process models allowed the detection of abnormal variations in critical process parameters, such as the amount of solvent or the amounts of initial components present in the cocrystallization. © 2015 Wiley Periodicals, Inc. and the American Pharmacists Association.

  11. Modified Exponential Weighted Moving Average (EWMA) Control Chart on Autocorrelation Data

    NASA Astrophysics Data System (ADS)

    Herdiani, Erna Tri; Fandrilla, Geysa; Sunusi, Nurtiti

    2018-03-01

In general, observations in statistical process control are assumed to be mutually independent. However, this assumption is often violated in practice. Consequently, control charts have been developed for autocorrelated processes, including Shewhart, cumulative sum (CUSUM), and exponentially weighted moving average (EWMA) charts. It has been noted that such charts are not suitable if the control limits derived for independent observations are used unchanged. For this reason, it is necessary to apply a time series model in building the control chart. A classical control chart for independent variables is usually applied to the process residuals, a procedure that is permitted provided the residuals are independent. In 1978, a Shewhart modification for the autoregressive process was introduced, using the distance between the sample mean and the target value relative to the standard deviation of the autocorrelated process. In this paper we examine the mean of the EWMA statistic for an autocorrelated process, as derived from Montgomery and Patel. Performance was investigated by examining the Average Run Length (ARL) computed with the Markov chain method.
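
One common remedy noted in this literature — fitting a time series model and charting the residuals — can be sketched as follows; this illustrates the general residual-EWMA idea rather than the paper's specific modified limits:

```python
# Hedged sketch: EWMA chart applied to residuals of a fitted AR(1)
# model, a standard remedy for autocorrelated SPC data (not the
# paper's exact modified-limit derivation).
import numpy as np

rng = np.random.default_rng(2)
phi, n = 0.6, 200
x = np.zeros(n)
for t in range(1, n):                        # simulate AR(1) data
    x[t] = phi * x[t - 1] + rng.normal()

phi_hat = np.corrcoef(x[:-1], x[1:])[0, 1]   # crude AR(1) estimate
resid = x[1:] - phi_hat * x[:-1]             # approximately independent

lam = 0.2
sigma = resid.std(ddof=1)
limit = 3 * sigma * np.sqrt(lam / (2 - lam))  # asymptotic 3-sigma limits
z, alarms = 0.0, []
for t, r in enumerate(resid):
    z = lam * r + (1 - lam) * z              # EWMA recursion
    if abs(z) > limit:
        alarms.append(t)
print("alarm points:", alarms)
```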

  12. SU-D-BRD-07: Evaluation of the Effectiveness of Statistical Process Control Methods to Detect Systematic Errors For Routine Electron Energy Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parker, S

    2015-06-15

Purpose: To evaluate the ability of statistical process control methods to detect systematic errors when using a two dimensional (2D) detector array for routine electron beam energy verification. Methods: Electron beam energy constancy was measured using an aluminum wedge and a 2D diode array on four linear accelerators. Process control limits were established. Measurements were recorded in control charts and compared with both calculated process control limits and TG-142 recommended specification limits. The data was tested for normality, process capability and process acceptability. Additional measurements were recorded while systematic errors were intentionally introduced. Systematic errors included shifts in the alignment of the wedge, incorrect orientation of the wedge, and incorrect array calibration. Results: Control limits calculated for each beam were smaller than the recommended specification limits. Process capability and process acceptability ratios were greater than one in all cases. All data was normally distributed. Shifts in the alignment of the wedge were most apparent for low energies. The smallest shift (0.5 mm) was detectable using process control limits in some cases, while the largest shift (2 mm) was detectable using specification limits in only one case. The wedge orientation tested did not affect the measurements as this did not affect the thickness of aluminum over the detectors of interest. Array calibration dependence varied with energy and selected array calibration. 6 MeV was the least sensitive to array calibration selection while 16 MeV was the most sensitive. Conclusion: Statistical process control methods demonstrated that the data distribution was normally distributed, the process was capable of meeting specifications, and that the process was centered within the specification limits. Though not all systematic errors were distinguishable from random errors, process control limits increased the ability to detect systematic errors using routine measurement of electron beam energy constancy.
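
The capability and acceptability ratios referred to above are the standard Cp and Cpk indices; a hedged sketch with invented readings:

```python
# Sketch of the process capability (Cp) and acceptability (Cpk)
# ratios referenced above, using made-up energy-check readings.
import numpy as np

readings = np.array([0.1, -0.2, 0.0, 0.3, -0.1, 0.2, 0.0, -0.3, 0.1, 0.0])
lsl, usl = -2.0, 2.0        # illustrative specification limits (%)

mu, sigma = readings.mean(), readings.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")   # > 1 means capable and centred
```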

  13. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

Satellite thermal control is the subsystem whose main task is keeping satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the scarcity of information provided by companies and designers, this subsystem still lacks a specific design process even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required; therefore, we first collected spacecraft data to create a database, then extracted statistical graphs using Microsoft Excel, from which we further extracted mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. The thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is surveyed. Different statistical models are then presented and briefly compared. Finally, the paper's particular statistical model is extracted from the collected statistical data. The accuracy of the method was tested and verified using a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results proved the methodology effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  14. Statistics to the Rescue!: Using Data to Evaluate a Manufacturing Process

    ERIC Educational Resources Information Center

    Keithley, Michael G.

    2009-01-01

    The use of statistics and process controls is too often overlooked in educating students. This article describes an activity appropriate for high school students who have a background in material processing. It gives them a chance to advance their knowledge by determining whether or not a manufacturing process works well. The activity follows a…

  15. 42 CFR 493.1256 - Standard: Control procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... for having control procedures that monitor the accuracy and precision of the complete analytic process..., include two control materials, including one that is capable of detecting errors in the extraction process... control materials having previously determined statistical parameters. (e) For reagent, media, and supply...

  16. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

The 1996 Summer Faculty Fellowship Program at Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation as needs arise.

  17. Control chart pattern recognition using RBF neural network with new training algorithm and practical features.

    PubMed

    Addeh, Abdoljalil; Khormali, Aminollah; Golilarz, Noorbakhsh Amiri

    2018-05-04

Control chart patterns are among the most commonly used statistical process control (SPC) tools to monitor process changes. When a control chart produces an out-of-control signal, the process has changed. In this study, a new method based on an optimized radial basis function neural network (RBFNN) is proposed for control chart pattern (CCP) recognition. The proposed method consists of four main modules: feature extraction, feature selection, classification, and learning. In the feature extraction module, shape and statistical features are used; various shape and statistical features have recently been presented for CCP recognition. In the feature selection module, the association rules (AR) method is employed to select the best set of shape and statistical features. In the classification module, an RBFNN is used; because the learning algorithm has a high impact on network performance, a new learning algorithm based on the bees algorithm is used in the learning module. Most studies have considered only six patterns: Normal, Cyclic, Increasing Trend, Decreasing Trend, Upward Shift, and Downward Shift. Because the Normal, Stratification, and Systematic patterns are very similar to each other and difficult to distinguish, most studies have not considered Stratification and Systematic. To support continuous monitoring and control of the production process and exact identification of the type of problem encountered, eight patterns have been investigated in this study. The proposed method was tested on a dataset containing 1600 samples (200 samples from each pattern), and the results showed that it has a very good performance. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
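
A simplified sketch of an RBF-network classifier for two synthetic chart patterns is shown below; k-means centres and least-squares output weights stand in for the paper's association-rule feature selection and bees-algorithm training:

```python
# Minimal RBF-network sketch for control chart pattern recognition.
# K-means centres + least-squares output weights replace the paper's
# bees-algorithm training; the pattern windows are synthetic.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n, w = 200, 20                                # samples per class, window length
t = np.arange(w)
normal = rng.normal(0, 1, (n, w))             # normal pattern
trend = 0.1 * t + rng.normal(0, 1, (n, w))    # increasing-trend pattern
X = np.vstack([normal, trend])
y = np.array([0] * n + [1] * n)               # 0 = normal, 1 = trend

centers = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X).cluster_centers_
width = np.mean([np.linalg.norm(c1 - c2) for c1 in centers for c2 in centers])

def rbf_features(X):
    # Gaussian activations of each window against each centre.
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2)

H = rbf_features(X)
W, *_ = np.linalg.lstsq(H, np.eye(2)[y], rcond=None)  # one-hot targets
pred = rbf_features(X) @ W
print("training accuracy:", (pred.argmax(1) == y).mean())
```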

  18. Teaching Quality Control with Chocolate Chip Cookies

    ERIC Educational Resources Information Center

    Baker, Ardith

    2014-01-01

    Chocolate chip cookies are used to illustrate the importance and effectiveness of control charts in Statistical Process Control. By counting the number of chocolate chips, creating the spreadsheet, calculating the control limits and graphing the control charts, the student becomes actively engaged in the learning process. In addition, examining…
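
Since chip counts per cookie are count data, the natural chart is a c-chart with Poisson-based limits; a hedged sketch with invented counts:

```python
# A c-chart sketch for counts of chocolate chips per cookie, matching
# the classroom exercise described above (the counts are made up).
import numpy as np

chips = np.array([12, 15, 11, 14, 13, 16, 10, 12, 14, 13, 18, 12])
c_bar = chips.mean()
ucl = c_bar + 3 * np.sqrt(c_bar)          # Poisson-based 3-sigma limits
lcl = max(0.0, c_bar - 3 * np.sqrt(c_bar))

for i, c in enumerate(chips, 1):
    status = "out" if not (lcl <= c <= ucl) else "in"
    print(f"cookie {i:2d}: {c} chips ({status} of control)")
print(f"CL={c_bar:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}")
```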

  19. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

Statistical process control is a well-established and respected method which provides a general-purpose, consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3], making it an ideal methodology to apply to this area of biological research. The aim here is to introduce statistical process control as an objective strategy for quality control and to show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multivariate. QC is critical for clinical chemistry measurements, and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single-analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
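
As an illustration of how easily such control rules can be implemented, the sketch below codes a few generic Western Electric-style rules; these are not the author's specific rule set, and the QC values are invented:

```python
# Hedged sketch of simple control rules (Western Electric-style) for a
# tracked QC metric, e.g. a spiked-standard peak area per run. The
# rules and data below are generic illustrations.
import numpy as np

def check_rules(x, mu, sigma):
    """Return (index, rule) violations for a series of QC measurements."""
    z = (np.asarray(x) - mu) / sigma
    violations = []
    for i in range(len(z)):
        if abs(z[i]) > 3:                                  # 1 beyond 3 sigma
            violations.append((i, "beyond 3 sigma"))
        if i >= 1 and abs(z[i]) > 2 and abs(z[i - 1]) > 2 \
                and np.sign(z[i]) == np.sign(z[i - 1]):    # 2 in a row beyond 2s
            violations.append((i, "2 consecutive beyond 2 sigma"))
        if i >= 8 and np.all(np.sign(z[i - 8:i + 1]) == np.sign(z[i])):
            violations.append((i, "9 on one side of centre"))
    return violations

qc = [0.1, -0.4, 2.3, 2.6, 0.2, 0.5, 0.7, 0.6, 0.9, 0.8, 0.4, 0.6, 0.5]
print(check_rules(qc, mu=0.0, sigma=1.0))
```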

  20. Statistical process control: a practical application for hospitals.

    PubMed

    VanderVeen, L M

    1992-01-01

    A six-step plan based on using statistics was designed to improve quality in the central processing and distribution department of a 223-bed hospital in Oakland, CA. This article describes how the plan was implemented sequentially, starting with the crucial first step of obtaining administrative support. The QI project succeeded in overcoming beginners' fear of statistics and in training both managers and staff to use inspection checklists, Pareto charts, cause-and-effect diagrams, and control charts. The best outcome of the program was the increased commitment to quality improvement by the members of the department.

  1. Sampling methods to the statistical control of the production of blood components.

    PubMed

    Pereira, Paulo; Seghatchian, Jerard; Caldeira, Beatriz; Santos, Paula; Castro, Rosa; Fernandes, Teresa; Xavier, Sandra; de Sousa, Gracinda; de Almeida E Sousa, João Paulo

    2017-12-01

Control of blood component specifications is a requirement generalized in Europe by the European Commission Directives and in the US by the AABB standards. The use of a statistical process control methodology is recommended in the related literature, including the EDQM guideline. The reliability of this control depends on the sampling. However, a correct sampling methodology does not seem to be systematically applied. Commonly, sampling is intended to comply only with the 1% specification for the produced blood components. Nevertheless, from a purely statistical viewpoint, this model is arguably unrelated to a consistent sampling technique. This could be a severe limitation in detecting abnormal patterns and in assuring that the production has a non-significant probability of producing nonconforming components. This article discusses what is happening in blood establishments. Three statistical methodologies are proposed: simple random sampling, sampling based on the proportion of a finite population, and sampling based on the inspection level. The empirical results demonstrate that these models are practicable in blood establishments, contributing to the robustness of sampling and of the related statistical process control decisions for the purpose for which they are suggested. Copyright © 2017 Elsevier Ltd. All rights reserved.
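
The second proposed methodology — sampling based on the proportion of a finite population — can be sketched with the standard sample-size formula plus finite population correction; the specification values below are illustrative:

```python
# Sketch of a finite-population sampling calculation: sample size
# needed to estimate a nonconforming proportion in a lot of N
# produced components (all numbers illustrative).
import math

def sample_size(N, p=0.01, margin=0.01, z=1.96):
    """n to estimate proportion p within +/-margin at ~95% confidence,
    with finite population correction for a lot of size N."""
    n0 = z**2 * p * (1 - p) / margin**2         # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / N))   # corrected for lot size

for N in (500, 5000, 50000):
    print(f"lot of {N:6d} components -> sample {sample_size(N)}")
```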

  2. Statistical transformation and the interpretation of inpatient glucose control data from the intensive care unit.

    PubMed

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-05-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box-Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. © 2014 Diabetes Technology Society.
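
The transformation step is a one-liner with SciPy; the sketch below applies a maximum-likelihood Box-Cox transformation to simulated skewed glucose values, not the study's data:

```python
# Sketch of the Box-Cox step: transform skewed glucose values toward
# normality before building control charts (values are simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
glucose = rng.lognormal(mean=4.9, sigma=0.3, size=500)  # skewed mg/dL data

transformed, lam = stats.boxcox(glucose)   # lambda chosen by max likelihood
print(f"estimated lambda = {lam:.3f}")
print("skewness before:", round(stats.skew(glucose), 3),
      "after:", round(stats.skew(transformed), 3))
```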

  3. Statistical Transformation and the Interpretation of Inpatient Glucose Control Data From the Intensive Care Unit

    PubMed Central

    Saulnier, George E.; Castro, Janna C.

    2014-01-01

    Glucose control can be problematic in critically ill patients. We evaluated the impact of statistical transformation on interpretation of intensive care unit inpatient glucose control data. Point-of-care blood glucose (POC-BG) data derived from patients in the intensive care unit for 2011 was obtained. Box–Cox transformation of POC-BG measurements was performed, and distribution of data was determined before and after transformation. Different data subsets were used to establish statistical upper and lower control limits. Exponentially weighted moving average (EWMA) control charts constructed from April, October, and November data determined whether out-of-control events could be identified differently in transformed versus nontransformed data. A total of 8679 POC-BG values were analyzed. POC-BG distributions in nontransformed data were skewed but approached normality after transformation. EWMA control charts revealed differences in projected detection of out-of-control events. In April, an out-of-control process resulting in the lower control limit being exceeded was identified at sample 116 in nontransformed data but not in transformed data. October transformed data detected an out-of-control process exceeding the upper control limit at sample 27 that was not detected in nontransformed data. Nontransformed November results remained in control, but transformation identified an out-of-control event less than 10 samples into the observation period. Using statistical methods to assess population-based glucose control in the intensive care unit could alter conclusions about the effectiveness of care processes for managing hyperglycemia. Further study is required to determine whether transformed versus nontransformed data change clinical decisions about the interpretation of care or intervention results. PMID:24876620

  4. Quality Control of the Print with the Application of Statistical Methods

    NASA Astrophysics Data System (ADS)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

The basis for standardizing the offset printing process is the control of print quality indicators. This problem admits various approaches, among which statistical methods are the most important. Their practical implementation for managing the quality of the printing process is highly relevant and is the subject of this paper. The use of control charts to identify the causes of deviations in optical density for a triad of inks in offset printing is demonstrated.

  5. Statistical process control charts for monitoring military injuries.

    PubMed

    Schuh, Anna; Canham-Chervak, Michelle; Jones, Bruce H

    2017-12-01

    An essential aspect of an injury prevention process is surveillance, which quantifies and documents injury rates in populations of interest and enables monitoring of injury frequencies, rates and trends. To drive progress towards injury reduction goals, additional tools are needed. Statistical process control charts, a methodology that has not been previously applied to Army injury monitoring, capitalise on existing medical surveillance data to provide information to leadership about injury trends necessary for prevention planning and evaluation. Statistical process control Shewhart u-charts were created for 49 US Army installations using quarterly injury medical encounter rates, 2007-2015, for active duty soldiers obtained from the Defense Medical Surveillance System. Injuries were defined according to established military injury surveillance recommendations. Charts display control limits three standard deviations (SDs) above and below an installation-specific historical average rate determined using 28 data points, 2007-2013. Charts are available in Army strategic management dashboards. From 2007 to 2015, Army injury rates ranged from 1254 to 1494 unique injuries per 1000 person-years. Installation injury rates ranged from 610 to 2312 injuries per 1000 person-years. Control charts identified four installations with injury rates exceeding the upper control limits at least once during 2014-2015, rates at three installations exceeded the lower control limit at least once and 42 installations had rates that fluctuated around the historical mean. Control charts can be used to drive progress towards injury reduction goals by indicating statistically significant increases and decreases in injury rates. Future applications to military subpopulations, other health outcome metrics and chart enhancements are suggested. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
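
A u-chart with exposure-dependent limits, of the kind described above, can be sketched as follows; the injury counts and person-years are invented:

```python
# u-chart sketch for injury rates with varying exposure (person-years),
# following the installation-level monitoring described above; the
# counts and exposures are invented for illustration.
import numpy as np

injuries = np.array([310, 295, 330, 360, 305, 290, 410, 300])   # per quarter
pyears = np.array([240, 238, 242, 251, 239, 240, 243, 241]) / 1000.0
u = injuries / pyears                        # rate per 1000 person-years

u_bar = injuries.sum() / pyears.sum()        # historical average rate
ucl = u_bar + 3 * np.sqrt(u_bar / pyears)    # limits vary with exposure
lcl = np.maximum(0, u_bar - 3 * np.sqrt(u_bar / pyears))

for q in range(len(u)):
    flag = "signal" if (u[q] > ucl[q] or u[q] < lcl[q]) else "ok"
    print(f"Q{q+1}: rate={u[q]:7.1f}  limits=({lcl[q]:.1f}, {ucl[q]:.1f})  {flag}")
```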

  6. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    PubMed

    Al-Hussein, Fahad A

    2009-01-01

To use statistical control charts in a series of audits to improve the acceptance and consistent use of guidelines, and to reduce variations in prescription processing in primary health care. A series of audits was done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions issued in the previous two weeks. Thirty-six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by failure to record the patient's weight (16.4%), the pharmacist's name (14.3%), the duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to care providers.
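
A hedged sketch of the p-chart design described above — samples of n = 30 charted against the 25% specification — with invented audit counts:

```python
# p-chart sketch for the audit design above: samples of n = 30
# prescriptions, charted against a 25% nonconformity specification.
import numpy as np

n, p0 = 30, 0.25                     # sample size and specified proportion
nonconforming = np.array([30, 24, 18, 15, 12, 13, 11, 9, 12, 10])  # per audit
p_hat = nonconforming / n

sigma_p = np.sqrt(p0 * (1 - p0) / n)
ucl = p0 + 3 * sigma_p
lcl = max(0.0, p0 - 3 * sigma_p)

for i, p in enumerate(p_hat, 1):
    flag = "above UCL" if p > ucl else ("below LCL" if p < lcl else "within")
    print(f"audit {i:2d}: p={p:.2f}  ({flag})")
print(f"p0={p0}, LCL={lcl:.3f}, UCL={ucl:.3f}")
```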

  7. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  8. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

This paper considers an automated information and control complex designed to prevent accidents related to the aerological situation in underground workings; it handles accounting of issued and returned individual devices, transmission and display of measurement data, and the formation of pre-emptive solutions. Examples are given for the automated workstation of an air-gas control operator using individual devices. Statistical characteristics of field data characterizing the aerological situation in the mine are obtained; these characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with adaptive placement of gas-control points. An adaptive (multivariant) algorithm for processing measurement information on continuous multidimensional quantities and influencing factors has been developed.

  9. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

Multi-sensor machining status monitoring acquires and analyzes machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. This paper compares the advantages and disadvantages of the two methods and introduces the necessity and feasibility of their integration and fusion. An approach that integrates multi-sensor status monitoring and statistical process control, based on artificial intelligence, internet, and database techniques, is then put forward. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing quality data and acoustic emission (AE) signal information from the wheel-dressing process, the reason for machining quality fluctuation was obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of machining processes.

  10. Moment-Based Physical Models of Broadband Clutter due to Aggregations of Fish

    DTIC Science & Technology

    2013-09-30

statistical models for signal-processing algorithm development. These in turn will help to develop a capability to statistically forecast the impact of...aggregations of fish based on higher-order statistical measures describable in terms of physical and system parameters. Environmentally, these models...processing. In this experiment, we had good ground truth on (1) and (2), and had control over (3) and (4) except for environmentally-imposed restrictions

  11. Probabilistic models in human sensorimotor control

    PubMed Central

    Wolpert, Daniel M.

    2009-01-01

    Sensory and motor uncertainty form a fundamental constraint on human sensorimotor control. Bayesian decision theory (BDT) has emerged as a unifying framework to understand how the central nervous system performs optimal estimation and control in the face of such uncertainty. BDT has two components: Bayesian statistics and decision theory. Here we review Bayesian statistics and show how it applies to estimating the state of the world and our own body. Recent results suggest that when learning novel tasks we are able to learn the statistical properties of both the world and our own sensory apparatus so as to perform estimation using Bayesian statistics. We review studies which suggest that humans can combine multiple sources of information to form maximum likelihood estimates, can incorporate prior beliefs about possible states of the world so as to generate maximum a posteriori estimates and can use Kalman filter-based processes to estimate time-varying states. Finally, we review Bayesian decision theory in motor control and how the central nervous system processes errors to determine loss functions and optimal actions. We review results that suggest we plan movements based on statistics of our actions that result from signal-dependent noise on our motor outputs. Taken together these studies provide a statistical framework for how the motor system performs in the presence of uncertainty. PMID:17628731
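
The maximum likelihood cue-combination result reviewed above reduces to inverse-variance weighting; a minimal numeric sketch with illustrative values:

```python
# Sketch of maximum likelihood cue combination: two noisy estimates of
# hand position fused by inverse-variance weighting (numbers invented).
x_vision, var_vision = 10.2, 0.5 ** 2   # visual estimate (cm) and variance
x_prop, var_prop = 9.5, 1.0 ** 2        # proprioceptive estimate (cm)

w_v = (1 / var_vision) / (1 / var_vision + 1 / var_prop)
x_mle = w_v * x_vision + (1 - w_v) * x_prop
var_mle = 1 / (1 / var_vision + 1 / var_prop)

# The fused variance is smaller than that of either cue alone.
print(f"fused estimate = {x_mle:.2f} cm, variance = {var_mle:.3f}")
```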

  12. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

Carnero, Mª Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

The present study introduces the concept of statistical quality control in automotive wheel bearing manufacturing processes. Defects in the products under analysis can directly influence passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, the quality results of the finished parts under different combinations of process variables are evaluated. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as the chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence of predictive variables (high-frequency vibration displacements) that are sensitive to the process setup and the quality of the products obtained. Based on the results of this overall vibration analysis, a second paper will analyse self-induced vibration spectra in order to define limit vibration bands, controllable every cycle or connected to permanent vibration-monitoring systems able to adjust sensitive process variables identified by ANOVA once the vibration readings exceed established quality limits.
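
The core ANOVA step is a standard one-way test; the sketch below uses synthetic vibration readings for three hypothetical process settings:

```python
# Sketch of the ANOVA step: testing whether overall vibration readings
# differ across grinding-process settings (readings are synthetic).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
setting_a = rng.normal(2.0, 0.3, 15)   # vibration displacement, setting A
setting_b = rng.normal(2.1, 0.3, 15)
setting_c = rng.normal(2.6, 0.3, 15)   # deliberately shifted group

f_stat, p_val = stats.f_oneway(setting_a, setting_b, setting_c)
# A small p-value suggests at least one setting changes the vibration level.
print(f"F = {f_stat:.2f}, p = {p_val:.4f}")
```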

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.

  14. Particle monitoring and control in vacuum processing equipment

    NASA Astrophysics Data System (ADS)

    Borden, Peter G., Dr.; Gregg, John

    1989-10-01

Particle contamination during vacuum processes has emerged as the largest single source of yield loss in VLSI manufacturing. While a number of tools have been available to help understand the sources and nature of this contamination, only recently has it been possible to monitor free particle levels within vacuum equipment in real-time. As a result, a better picture is available of how particle contamination can affect a variety of processes. This paper reviews some of the work that has been done to monitor particles in vacuum loadlocks and in processes such as etching, sputtering and ion implantation. The aim has been to make free particles in vacuum equipment a measurable process parameter. Achieving this allows particles to be controlled using statistical process control. It will be shown that free particle levels in load locks correlate to wafer surface counts, device yield and process conditions, but that these levels are considerably higher during production than when dummy wafers are run to qualify a system. It will also be shown how real-time free particle monitoring can be used to monitor and control cleaning cycles, how major episodic events can be detected, and how data can be gathered in a format suitable for statistical process control.

  15. Image-guided radiotherapy quality control: Statistical process control using image similarity metrics.

    PubMed

    Shiraishi, Satomi; Grams, Michael P; Fong de Los Santos, Luis E

    2018-05-01

The purpose of this study was to demonstrate an objective quality control framework for the image review process. A total of 927 cone-beam computed tomography (CBCT) registrations were retrospectively analyzed for 33 bilateral head and neck cancer patients who received definitive radiotherapy. Two registration tracking volumes (RTVs) - cervical spine (C-spine) and mandible - were defined, within which a similarity metric was calculated and used as a registration quality tracking metric over the course of treatment. First, sensitivity to large misregistrations was analyzed for normalized cross-correlation (NCC) and mutual information (MI) in the context of statistical analysis. The distribution of metrics was obtained for displacements that varied according to a normal distribution with standard deviation of σ = 2 mm, and the detectability of displacements greater than 5 mm was investigated. Then, similarity metric control charts were created using a statistical process control (SPC) framework to objectively monitor the image registration and review process. Patient-specific control charts were created using NCC values from the first five fractions to set a patient-specific process capability limit. Population control charts were created using the average of the first five NCC values for all patients in the study. For each patient, the similarity metrics were calculated as a function of unidirectional translation, referred to as the effective displacement. Patient-specific action limits corresponding to 5 mm effective displacements were defined. Furthermore, effective displacements of the ten registrations with the lowest similarity metrics were compared with the three-degrees-of-freedom (3DoF) couch displacement required to align the anatomical landmarks. Normalized cross-correlation identified suboptimal registrations more effectively than MI within the framework of SPC. Deviations greater than 5 mm were detected at 2.8σ and 2.1σ from the mean for NCC and MI, respectively. Patient-specific control charts using NCC evaluated daily variation and identified statistically significant deviations. This study also showed that subjective evaluations of the images were not always consistent. Population control charts identified a patient whose tracking metrics were significantly lower than those of other patients. The patient-specific action limits identified registrations that warranted immediate evaluation by an expert. When effective displacements in the anterior-posterior direction were compared to 3DoF couch displacements, the agreement was ±1 mm for seven of 10 patients for both C-spine and mandible RTVs. Qualitative review alone of IGRT images can result in inconsistent feedback to the IGRT process. Registration tracking using NCC objectively identifies statistically significant deviations. When used in conjunction with the current image review process, this tool can assist in improving the safety and consistency of the IGRT process. © 2018 American Association of Physicists in Medicine.
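
The NCC tracking metric itself is simple to compute; the sketch below applies it to random stand-in volumes rather than actual CBCT data:

```python
# Sketch of the normalized cross-correlation (NCC) tracking metric for
# two image volumes restricted to a registration tracking volume (RTV);
# the arrays here are random stand-ins for CBCT/reference sub-volumes.
import numpy as np

def ncc(a: np.ndarray, b: np.ndarray) -> float:
    """Normalized cross-correlation of two equally shaped arrays."""
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

rng = np.random.default_rng(6)
reference = rng.normal(size=(32, 32, 32))
daily = reference + rng.normal(scale=0.3, size=reference.shape)  # well aligned
shifted = np.roll(daily, 4, axis=0)                              # misregistered

print(f"NCC aligned: {ncc(reference, daily):.3f}")
print(f"NCC shifted: {ncc(reference, shifted):.3f}")   # lower value
```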

  16. Evaluation of statistical protocols for quality control of ecosystem carbon dioxide fluxes

    Treesearch

    Jorge F. Perez-Quezada; Nicanor Z. Saliendra; William E. Emmerich; Emilio A. Laca

    2007-01-01

    The process of quality control of micrometeorological and carbon dioxide (CO2) flux data can be subjective and may lack repeatability, which would undermine the results of many studies. Multivariate statistical methods and time series analysis were used together and independently to detect and replace outliers in CO2 flux...

  17. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C.; Edwards, T.

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  18. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

    The purpose of the expert system CRN5EXP is to assist in checking the quality of the coils at two very important mills in a steel plant: hot rolling and cold rolling. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward-chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of the quality control techniques. The expert system combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extracting data from the database, the rationale for combining certainty factors, and the architecture and use of the expert system. The interpretation of control chart patterns requires the human expert's knowledge and lends itself well to expert system rules.
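
    The abstract describes rule-based interpretation of control chart patterns in CLIPS. As a language-neutral illustration of the same idea, the sketch below applies two classic Western Electric run rules to subgroup means in Python; the rules and thresholds are generic textbook choices, not the CRN5EXP knowledge base.

```python
import numpy as np

def western_electric_flags(xbar, center, sigma):
    """Flag points violating two classic run rules on an X-bar chart."""
    z = (np.asarray(xbar, float) - center) / sigma
    flags = []
    for i in range(len(z)):
        rule1 = abs(z[i]) > 3                         # one point beyond 3-sigma
        recent = z[max(0, i - 2): i + 1]              # the last three points
        rule2 = (len(recent) == 3 and                 # 2 of 3 beyond 2-sigma,
                 (sum(recent > 2) >= 2 or sum(recent < -2) >= 2))  # same side
        flags.append((rule1, rule2))
    return flags

means = [0.1, -0.4, 2.3, 2.6, 0.2, 3.5]   # subgroup means of a coil property
for i, f in enumerate(western_electric_flags(means, center=0.0, sigma=1.0)):
    if any(f):
        print(f"sample {i}: rule violations {f}")
```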

  19. A Data-Driven Solution for Performance Improvement

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Marketed as the "Software of the Future," Optimal Engineering Systems' P.I. EXPERT(TM) technology offers statistical process control and optimization techniques that are critical to businesses looking to restructure or accelerate operations in order to gain a competitive edge. Kennedy Space Center granted Optimal Engineering Systems the funding and aid necessary to develop a prototype of the process monitoring and improvement software. Completion of this prototype demonstrated that it was possible to integrate traditional statistical quality assurance tools with robust optimization techniques in a user-friendly format that is visually compelling. Using an expert system knowledge base, the software allows the user to determine objectives, capture constraints and out-of-control processes, predict results, and compute optimal process settings.

  20. Statistical Methods in Assembly Quality Management of Multi-Element Products on Automatic Rotor Lines

    NASA Astrophysics Data System (ADS)

    Pries, V. V.; Proskuriakov, N. E.

    2018-04-01

    To control the assembly quality of multi-element mass-produced products on automatic rotor lines, control methods with operational feedback are required. However, due to possible failures in the operation of the devices and systems of an automatic rotor line, there is always a real probability of defective (incomplete) products entering the output process stream. Therefore, continuous sampling control of product completeness, based on the use of statistical methods, remains an important element in managing the assembly quality of multi-element mass-produced products on automatic rotor lines. A distinctive feature of continuous sampling control of product completeness during assembly is that the inspection is destructive: component parts cannot be returned to the process stream after sampling control, which reduces the actual productivity of the assembly equipment. Therefore, applying statistical procedures for continuous sampling control of product completeness in assembly on automatic rotor lines requires sampling plans that ensure a minimum control sample size. Comparison of the limiting average outgoing defect levels for the continuous sampling plan (CSP) and for the automated continuous sampling plan (ACSP) shows that the ACSP-1 can provide lower limiting values for the average outgoing defect level. The average sample size when using the ACSP-1 plan is also smaller than when using the CSP-1 plan. Thus, the application of statistical methods in the assembly quality management of multi-element products on automatic rotor lines, based on the proposed plans and methods for continuous sampling control, makes it possible to automate sampling control procedures and to ensure the required quality level of assembled products while minimizing the sample size.
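
    For readers unfamiliar with continuous sampling plans, the following Monte-Carlo sketch of Dodge's classic CSP-1 logic illustrates the two quantities the paper compares: the average fraction inspected and the average outgoing defect level. The clearance number i and sampling fraction f are hypothetical, and the authors' automated ACSP-1 variant is not reproduced here.

```python
import random

def simulate_csp1(p, i=40, f=0.1, n=200_000, seed=1):
    """Monte-Carlo sketch of Dodge's CSP-1 continuous sampling plan.

    p: incoming defective rate; i: clearance number; f: sampling fraction.
    Defectives found during inspection are removed (rectifying inspection).
    Returns (average fraction inspected, approx. outgoing defective rate).
    """
    rng = random.Random(seed)
    screening, run, inspected, out_def = True, 0, 0, 0
    for _ in range(n):
        defective = rng.random() < p
        inspect = screening or rng.random() < f
        if inspect:
            inspected += 1
            if defective:
                screening, run = True, 0   # defect found: back to 100%
            elif screening:
                run += 1
                if run >= i:
                    screening = False      # switch to sampling at rate f
        elif defective:
            out_def += 1                   # uninspected defective slips through
    return inspected / n, out_def / n

print(simulate_csp1(p=0.02))
```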

  1. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  2. TQM (Total Quality Management) SPARC (Special Process Action Review Committees) Handbook

    DTIC Science & Technology

    1989-08-01

    This document describes the techniques used to support and guide the Special Process Action Review Committees for accomplishing their goals for Total Quality Management (TQM). It includes concepts and definitions, checklists, sample formats, and assessment criteria. Keywords: Continuous process improvement; Logistics information; Process analysis; Quality control; Quality assurance; Total Quality Management ; Statistical processes; Management Planning and control; Management training; Management information systems.

  3. STATISTICS-BASED APPROACH TO WASTEWATER TREATMENT PLANT OPERATIONS

    EPA Science Inventory

    This paper describes work toward development of a convenient decision support system to improve everyday operation and control of the wastewater treatment process. The goal is to help the operator detect problems in the process and select appropriate control actions. The system...

  4. Risk-based Methodology for Validation of Pharmaceutical Batch Processes.

    PubMed

    Wiles, Frederick

    2013-01-01

    In January 2011, the U.S. Food and Drug Administration published new process validation guidance for pharmaceutical processes. The new guidance debunks the long-held industry notion that three consecutive validation batches or runs are all that are required to demonstrate that a process is operating in a validated state. Instead, the new guidance now emphasizes that the level of monitoring and testing performed during process performance qualification (PPQ) studies must be sufficient to demonstrate statistical confidence both within and between batches. In some cases, three qualification runs may not be enough. Nearly two years after the guidance was first published, little has been written defining a statistical methodology for determining the number of samples and qualification runs required to satisfy Stage 2 requirements of the new guidance. This article proposes using a combination of risk assessment, control charting, and capability statistics to define the monitoring and testing scheme required to show that a pharmaceutical batch process is operating in a validated state. In this methodology, an assessment of process risk is performed through application of a process failure mode, effects, and criticality analysis (PFMECA). The output of PFMECA is used to select appropriate levels of statistical confidence and coverage which, in turn, are used in capability calculations to determine when significant Stage 2 (PPQ) milestones have been met. The achievement of Stage 2 milestones signals the release of batches for commercial distribution and the reduction of monitoring and testing to commercial production levels. Individuals, moving range, and range/sigma charts are used in conjunction with capability statistics to demonstrate that the commercial process is operating in a state of statistical control. The new process validation guidance published by the U.S. Food and Drug Administration in January of 2011 indicates that the number of process validation batches or runs required to demonstrate that a pharmaceutical process is operating in a validated state should be based on sound statistical principles. The old rule of "three consecutive batches and you're done" is no longer sufficient. The guidance, however, does not provide any specific methodology for determining the number of runs required, and little has been published to augment this shortcoming. The paper titled "Risk-based Methodology for Validation of Pharmaceutical Batch Processes" describes a statistically sound methodology for determining when a statistically valid number of validation runs has been acquired based on risk assessment and calculation of process capability.
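
    As a rough illustration of how capability statistics with stated confidence can gate Stage 2 milestones, the sketch below estimates Cpk and an approximate one-sided lower confidence bound (Bissell's approximation). The data, specification limits, and the 1.33 target mentioned in the comment are hypothetical; the article itself derives its confidence and coverage levels from PFMECA.

```python
import math
from statistics import mean, stdev
from scipy.stats import norm

def cpk(data, lsl, usl):
    """Point estimate of the Cpk capability index."""
    m, s = mean(data), stdev(data)
    return min(usl - m, m - lsl) / (3 * s)

def cpk_lower_bound(cpk_hat, n, conf=0.95):
    """Approximate one-sided lower confidence bound on Cpk (Bissell, 1990)."""
    z = norm.ppf(conf)
    return cpk_hat - z * math.sqrt(1 / (9 * n) + cpk_hat**2 / (2 * (n - 1)))

# e.g., require the 95% lower bound to clear a (hypothetical) 1.33 target
data = [99.8, 100.1, 100.4, 99.9, 100.0, 100.2, 99.7, 100.3, 100.1, 99.9] * 3
est = cpk(data, lsl=98.0, usl=102.0)
print(round(est, 2), round(cpk_lower_bound(est, n=len(data)), 2))
```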

  5. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
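
    Where the exact calibration uncertainty must be computed, the GUM-style root-sum-of-squares combination of independent components is the usual starting point. The budget below is a hypothetical sketch; the component values are made up and do not come from the United Force Machine analysis.

```python
import math

def combined_standard_uncertainty(components):
    """GUM-style root-sum-of-squares of independent standard uncertainties."""
    return math.sqrt(sum(u * u for u in components))

# Illustrative force-calibration budget (values are assumptions):
# reference standard, readout resolution, repeatability, temperature effect
u_c = combined_standard_uncertainty([0.010, 0.003, 0.007, 0.002])  # % of load
U = 2 * u_c   # expanded uncertainty, coverage factor k = 2 (~95%)
print(f"u_c = {u_c:.4f} %, U (k=2) = {U:.4f} %")
```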

  6. [PASS neurocognitive dysfunction in attention deficit].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    Attention deficit disorder shows both cognitive and behavioral patterns. The aim was to determine a particular PASS (planning, attention, successive and simultaneous) pattern to support early diagnosis and remediation according to PASS theory. Eighty patients, aged 6 to 12 years (55 boys and 25 girls), were selected from neuropediatric practice. Inclusion criteria were inattention (80 cases) and inattention with hyperactive symptoms (40 cases) according to the Diagnostic and Statistical Manual (DSM-IV). Exclusion criteria were the previously reported phonological-awareness criteria considered useful for diagnosing dyslexia. A control group of 300 individuals, aged 5 to 12 years, was used, with the above criteria controlled. The DN:CAS (Das-Naglieri Cognitive Assessment System) battery, translated into the native language, was administered to assess PASS cognitive processes. Results were analyzed with cluster analysis and Student's t-test. Statistical factor analysis of the control group had previously identified the four PASS processes: planning, attention, successive and simultaneous. The dendrogram of the cluster analysis discriminated three categories of attention deficit disorder: 1) the most frequent, with planning deficit; 2) without planning deficit but with deficits in other processes; and 3) a few cases without any cognitive processing deficit. Cognitive deficiency in terms of mean scores was statistically significant compared with the control group (p = 0.001). According to the PASS pattern, planning deficiency is a relevant factor. Neurological planning is not exactly the same as neurological executive function. The behavioral pattern is mainly linked to planning deficiency, but also to other PASS processing deficits and, in some cases, to no processing deficit.

  7. Statistical process control of cocrystallization processes: A comparison between OPLS and PLS.

    PubMed

    Silva, Ana F T; Sarraguça, Mafalda Cruz; Ribeiro, Paulo R; Santos, Adenilson O; De Beer, Thomas; Lopes, João Almeida

    2017-03-30

    Orthogonal partial least squares regression (OPLS) is being increasingly adopted as an alternative to partial least squares (PLS) regression due to the better generalization that can be achieved. Particularly in multivariate batch statistical process control (BSPC), the use of OPLS for estimating nominal trajectories is advantageous. In OPLS, the nominal process trajectories are expected to be captured in a single predictive principal component, while uncorrelated variations are filtered out to orthogonal principal components. In theory, OPLS will yield a better estimation of the Hotelling's T2 statistic and the corresponding control limits, thus lowering the number of false positives and false negatives when assessing process disturbances. Although the advantages of OPLS have been demonstrated in the context of regression, its use in BSPC has seldom been reported. This study proposes an OPLS-based approach for BSPC of a cocrystallization process between hydrochlorothiazide and p-aminobenzoic acid, monitored on-line with near infrared spectroscopy, and compares the fault detection performance with the same approach based on PLS. A series of cocrystallization batches with imposed disturbances were used to test the ability of the OPLS- and PLS-based BSPC methods to detect abnormal situations. Results demonstrated that OPLS was generally superior in terms of sensitivity and specificity in most situations. In some abnormal batches, the imposed disturbances were only detected with OPLS. Copyright © 2017 Elsevier B.V. All rights reserved.
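
    OPLS is not available in mainstream Python libraries, so the sketch below illustrates the underlying BSPC mechanics with plain PLS from scikit-learn: nominal trajectories are modeled against batch time, and a Hotelling T2 on the latent scores with an F-based 95% limit flags a disturbed observation. The simulated data, component count, and limit formula are generic choices, not the authors'.

```python
import numpy as np
from scipy.stats import f as f_dist
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)

# Nominal batches: spectra-like X (n x p) regressed on batch time, as in BSPC.
n, p, A = 40, 50, 2
t = np.linspace(0, 1, n)
X = np.outer(t, rng.normal(size=p)) + 0.05 * rng.normal(size=(n, p))
pls = PLSRegression(n_components=A).fit(X, t)

scores = pls.transform(X)                 # nominal latent trajectories
lam = scores.var(axis=0, ddof=1)          # variance of each score

def t2(new_X):
    """Hotelling T2 of new observations in the latent score space."""
    s = pls.transform(np.atleast_2d(new_X))
    return ((s**2) / lam).sum(axis=1)

# conventional 95% limit for a new observation, estimated-covariance form
ucl = A * (n - 1) * (n + 1) / (n * (n - A)) * f_dist.ppf(0.95, A, n - A)

faulty = X[-1] + 0.5 * rng.normal(size=p)  # an imposed disturbance
print(t2(X[-1]), t2(faulty), "UCL =", round(ucl, 2))
```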

  8. Manufacturing Execution Systems: Examples of Performance Indicator and Operational Robustness Tools.

    PubMed

    Gendre, Yannick; Waridel, Gérard; Guyon, Myrtille; Demuth, Jean-François; Guelpa, Hervé; Humbert, Thierry

    Manufacturing Execution Systems (MES) are computerized systems used to measure production performance in terms of productivity, yield, and quality. The first part describes performance indicators, overall equipment effectiveness (OEE), process robustness tools, and statistical process control. The second part details some tools that help operators maintain process robustness and control by preventing deviations from target control charts. The MES was developed by Syngenta together with CIMO for automation.

  9. Employee empowerment through team building and use of process control methods.

    PubMed

    Willems, S

    1998-02-01

    The article examines the use of statistical process control and performance improvement techniques in employee empowerment. The focus is on how these techniques provide employees with information to improve their productivity and become involved in the decision-making process. Findings suggest that at one Mississippi hospital, employee empowerment has had a positive effect on employee productivity, morale, and quality of work.

  10. Experimental research on mathematical modelling and unconventional control of clinker kiln in cement plants

    NASA Astrophysics Data System (ADS)

    Rusu-Anghel, S.

    2017-01-01

    Analytical modeling of the cement manufacturing process is difficult because of its complexity and has not resulted in sufficiently precise mathematical models. In this paper, based on a statistical model of the process and using the knowledge of human experts, a fuzzy system for automatic control of the clinkering process was designed.

  11. Endpoint in plasma etch process using new modified w-multivariate charts and windowed regression

    NASA Astrophysics Data System (ADS)

    Zakour, Sihem Ben; Taleb, Hassen

    2017-09-01

    Endpoint detection is essential for understanding and determining whether a plasma etching process has completed correctly, especially when the etched area is very small (0.1%). It is a crucial part of delivering repeatable results on every single wafer. The endpoint is reached when the film being etched has been completely cleared. To ensure the desired device performance of the produced integrated circuit, an optical emission spectroscopy (OES) sensor is employed. The large number of gathered wavelengths (profiles) is then analyzed and pre-processed using a newly proposed simple algorithm, named spectra peak selection (SPS), to select the important wavelengths; wavelet analysis (WA) is then employed to enhance detection performance by suppressing noise and redundant information. The selected and treated OES wavelengths are then used in modified multivariate control charts (MEWMA and Hotelling) for three statistics (mean, SD and CV) and in windowed polynomial regression for the mean. The use of the three aforementioned statistics is motivated by the need to control mean shifts, variance shifts and their ratio (CV) when both the mean and SD are unstable. The control charts demonstrate their performance in detecting the endpoint, with the W-mean Hotelling chart performing best and the CV statistic worst. Since the best endpoint detection is given by the W-Hotelling mean statistic, this statistic is used to construct a windowed wavelet Hotelling polynomial regression, which can identify the window containing the endpoint phenomenon.
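
    To make the charting step concrete, here is a minimal EWMA chart sketch applied to a simulated OES-like intensity trace. The baseline window, λ = 0.2, L = 3, and the signal itself are illustrative assumptions; the paper's MEWMA/Hotelling variants and wavelet pre-processing are not reproduced.

```python
import numpy as np

def ewma_chart(x, lam=0.2, L=3.0, baseline=50):
    """EWMA statistic with control limits from a calibration prefix."""
    x = np.asarray(x, float)
    mu, sigma = x[:baseline].mean(), x[:baseline].std(ddof=1)
    z = np.empty_like(x)
    z[0] = mu
    for i in range(1, len(x)):
        z[i] = lam * x[i] + (1 - lam) * z[i - 1]
    k = np.arange(1, len(x) + 1)
    half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * k)))
    return z, mu - half, mu + half

# Toy OES intensity: stable during the etch, dropping when the film clears.
rng = np.random.default_rng(7)
signal = np.r_[rng.normal(10.0, 0.2, 120), rng.normal(9.0, 0.2, 30)]
z, lcl, ucl = ewma_chart(signal)
print("endpoint flagged at sample", int(np.argmax((z < lcl) | (z > ucl))))
```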

  12. Manufacturing Research: Self-Directed Control

    DTIC Science & Technology

    1991-01-01

    reduce this sensitivity. SDO is performing Taguchi's parameter design. ... Statistical Process Control: SPC techniques will be used to monitor the process ... Florida, R.E. Krieger Pub. Co., 1988. Dehnad, Khosrow, Quality Control, Robust Design, and the Taguchi Method, Pacific Grove, California, Wadsworth ... control system. This turns out to be a non-trivial exercise. A human operator can see an event occur (such as the vessel pressurizing above its setpoint

  13. A Statistical Method to Distinguish Functional Brain Networks

    PubMed Central

    Fujita, André; Vidal, Maciel C.; Takahashi, Daniel Y.

    2017-01-01

    One major problem in neuroscience is the comparison of functional brain networks of different populations, e.g., distinguishing the networks of controls and patients. Traditional algorithms are based on search for isomorphism between networks, assuming that they are deterministic. However, biological networks present randomness that cannot be well modeled by those algorithms. For instance, functional brain networks of distinct subjects of the same population can be different due to individual characteristics. Moreover, networks of subjects from different populations can be generated through the same stochastic process. Thus, a better hypothesis is that networks are generated by random processes. In this case, subjects from the same group are samples from the same random process, whereas subjects from different groups are generated by distinct processes. Using this idea, we developed a statistical test called ANOGVA to test whether two or more populations of graphs are generated by the same random graph model. Our simulations' results demonstrate that we can precisely control the rate of false positives and that the test is powerful to discriminate random graphs generated by different models and parameters. The method also showed to be robust for unbalanced data. As an example, we applied ANOGVA to an fMRI dataset composed of controls and patients diagnosed with autism or Asperger. ANOGVA identified the cerebellar functional sub-network as statistically different between controls and autism (p < 0.001). PMID:28261045

  15. Total Quality Management Implementation Strategy: Directorate of Quality Assurance

    DTIC Science & Technology

    1989-05-01

    Total Quality Control; Harrington, H. James, The Improvement Process; Imai, Masaaki, Kaizen; Ishikawa, Kaoru, What is Total Quality Control; Ishikawa, Kaoru, Statistical Quality Control; Juran, J. M., Managerial Breakthrough; Juran, J. M., Quality Control Handbook; Mizuno (Ed.), Managing for Quality Improvements

  16. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process set up well. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC (Advanced Process Control) systems are being developed in the wafer fab to automatically adjust and tune wafer processing based on a large amount of process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to bring the spatial variability under control, in real time, with our SPC (Statistical Process Control) system. This paper outlines the architecture of an integrated process control system for shape monitoring in 3D, implemented in the wafer fab.

  17. Observing fermionic statistics with photons in arbitrary processes

    PubMed Central

    Matthews, Jonathan C. F.; Poulios, Konstantinos; Meinecke, Jasmin D. A.; Politi, Alberto; Peruzzo, Alberto; Ismail, Nur; Wörhoff, Kerstin; Thompson, Mark G.; O'Brien, Jeremy L.

    2013-01-01

    Quantum mechanics defines two classes of particles, bosons and fermions, whose exchange statistics fundamentally dictate quantum dynamics. Here we develop a scheme that uses entanglement to directly observe the correlated detection statistics of any number of fermions in any physical process. This approach relies on sending each of the entangled particles through identical copies of the process; by controlling a single phase parameter in the entangled state, the correlated detection statistics can be continuously tuned between bosonic and fermionic statistics. We implement this scheme via two entangled photons shared across the polarisation modes of a single photonic chip to directly mimic the fermion, boson and intermediate behaviour of two particles undergoing a continuous-time quantum walk. The ability to simulate fermions with photons is likely to have applications for verifying boson scattering and for observing particle correlations in analogue simulation using any physical platform that can prepare the entangled state prescribed here. PMID:23531788

  18. The Brazilian Air Force Uniform Distribution Process: Using Lean Thinking, Statistical Process Control and Theory of Constraints to Address Improvement Opportunities

    DTIC Science & Technology

    2015-03-26

    universal definition” (Evans & Lindsay, 1996). Heizer and Render (2010) argue that several definitions of this term are user-based, meaning that quality ... “for example, really good ice cream has high butterfat levels” (Heizer & Render, 2010). Garvin, in his Competing in Eight Dimensions of Quality ... (Montgomery, 2005). For definition purposes, the concept adopted by this research was provided by Heizer and Render (2010), for whom Statistical Process

  19. Process control charts in infection prevention: Make it simple to make it happen.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A

    2017-03-01

    Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies such as process control charts, often due to the limited exposure of typical infection preventionists to these methods. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and to demonstrate its application using simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open-source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and the Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. By providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
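
    The authors' application is built in R/Shiny (linked above). Purely to illustrate the kind of chart such a tool generates, here is a language-neutral u-chart sketch in Python for events per unit of exposure (e.g., infections per device-days); the counts and denominators are simulated.

```python
import numpy as np

def u_chart(events, exposure):
    """u-chart: event rate per unit of exposure with 3-sigma limits."""
    events = np.asarray(events, float)
    exposure = np.asarray(exposure, float)
    u = events / exposure
    ubar = events.sum() / exposure.sum()          # pooled center line
    ucl = ubar + 3 * np.sqrt(ubar / exposure)     # limits vary with exposure
    lcl = np.maximum(ubar - 3 * np.sqrt(ubar / exposure), 0)
    return u, ubar, lcl, ucl

# Simulated monthly infection counts and central-line days (illustrative).
events   = [3, 2, 4, 1, 2, 9, 3, 2]
linedays = [700, 650, 720, 640, 700, 680, 690, 710]
u, ubar, lcl, ucl = u_chart(events, linedays)
print("months with special-cause signal:", np.where((u < lcl) | (u > ucl))[0])
```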

  20. Multivariate statistical process control in product quality review assessment - A case study.

    PubMed

    Kharbach, M; Cherrah, Y; Vander Heyden, Y; Bouklouze, A

    2017-11-01

    According to the Food and Drug Administration and the European Good Manufacturing Practices (GMP) guidelines, the Annual Product Review (APR) is a mandatory GMP requirement. It consists of evaluating a large collection of qualitative or quantitative data in order to verify the consistency of an existing process. According to the Code of Federal Regulations (21 CFR 211.180), all finished products should be reviewed annually against their quality standards to determine the need for any change in the specification or manufacturing of drug products. Conventional Statistical Process Control (SPC) evaluates the pharmaceutical production process by examining the effect of only a single factor at a time using a Shewhart chart, and thus neglects the interactions between the variables. To overcome this issue, Multivariate Statistical Process Control (MSPC) can be used. Our case study concerns an APR assessment, for which 164 historical batches containing six active ingredients, manufactured in Morocco, were collected during one year. Each batch was checked by assaying the six active ingredients by High Performance Liquid Chromatography according to European Pharmacopoeia monographs. The data matrix was evaluated both by SPC and MSPC. The SPC indicated that all batches were under control, while the MSPC, based on Principal Component Analysis (PCA) with the data either autoscaled or robustly scaled, showed four and seven batches, respectively, outside the Hotelling T2 95% ellipse. An improvement in process capability is also observed when the most extreme batches are excluded. MSPC can be used for monitoring subtle changes in the manufacturing process during an APR assessment. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.
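
    A minimal sketch of the MSPC step, assuming autoscaled data and a two-component PCA with a conventional F-based 95% Hotelling T2 limit, is shown below. The simulated assay matrix, the number of components, and the limit form are illustrative choices, not the study's.

```python
import numpy as np
from scipy.stats import f as f_dist
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(5)

# 164 batches x 6 assay values: a simulated stand-in for the HPLC results.
X = rng.normal(100, 1.0, size=(164, 6))
X[-1] += [2.5, -2.5, 2.0, -1.8, 1.5, -1.6]   # a batch with unusual balance

Z = StandardScaler().fit_transform(X)         # autoscaling, as in the study
A = 2
pca = PCA(n_components=A).fit(Z)
T = pca.transform(Z)
t2 = ((T**2) / pca.explained_variance_).sum(axis=1)

n = X.shape[0]
ucl = A * (n - 1) / (n - A) * f_dist.ppf(0.95, A, n - A)  # approximate limit
print("batches outside the T2 95% limit:", np.where(t2 > ucl)[0])
```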

  1. Feasibility study of using statistical process control to customized quality assurance in proton therapy.

    PubMed

    Rah, Jeong-Eun; Shin, Dongho; Oh, Do Hoon; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-01

    To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether they were within the tolerance levels of the daily QA process. The study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors' analysis of the process capability indices, the patient-specific range measurements were capable of meeting a specification limit of ±2% in clinical plans. SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  2. 40 CFR 85.2233 - Steady state test equipment calibrations, adjustments, and quality control-EPA 91.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensated for automatically and statistical process control demonstrates equal or better quality control... calibrations, adjustments, and quality control-EPA 91. 85.2233 Section 85.2233 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) CONTROL OF AIR POLLUTION FROM MOBILE...

  3. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  4. 75 FR 22729 - Defense Federal Acquisition Regulation Supplement; Reporting of Government Property Lost, Stolen...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-30

    ....g., statistical process controls, as a means of managing such variation. (3) Reporting requirements...-based processes to greater use of automation. This proposed rule revises requirements for all DoD... single repository of all LTDD data to improve accountability and control of DoD assets and contractor...

  5. Performance analysis of Integrated Communication and Control System networks

    NASA Technical Reports Server (NTRS)

    Halevi, Y.; Ray, A.

    1990-01-01

    This paper presents statistical analysis of delays in Integrated Communication and Control System (ICCS) networks that are based on asynchronous time-division multiplexing. The models are obtained in closed form for analyzing control systems with randomly varying delays. The results of this research are applicable to ICCS design for complex dynamical processes like advanced aircraft and spacecraft, autonomous manufacturing plants, and chemical and processing plants.

  6. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    NASA Technical Reports Server (NTRS)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but it must also perform within strict loading requirements while maintaining an acceptable leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance that this and future space seals will satisfy or exceed design specifications.

  7. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
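
    Twitter's S-H-ESD and breakout algorithms are not reimplemented here; purely to contrast the two flagging philosophies the article compares, the sketch below pairs a plain c-chart with a robust rolling-median/MAD detector, which tolerates seasonality in the way the abstract describes. The data, window, and threshold are illustrative assumptions.

```python
import numpy as np

def c_chart_flags(counts):
    """Shewhart c-chart: 3-sigma limits around the mean count."""
    c = np.asarray(counts, float)
    cbar = c.mean()
    ucl, lcl = cbar + 3 * np.sqrt(cbar), max(cbar - 3 * np.sqrt(cbar), 0)
    return (c > ucl) | (c < lcl)

def mad_anomaly_flags(counts, window=12, thresh=3.5):
    """Robust z-score of each point against a rolling window of history."""
    c = np.asarray(counts, float)
    flags = np.zeros(len(c), bool)
    for i in range(window, len(c)):
        ref = c[i - window:i]
        med = np.median(ref)
        mad = np.median(np.abs(ref - med)) or 1.0   # guard against MAD = 0
        flags[i] = abs(c[i] - med) / (1.4826 * mad) > thresh
    return flags

# Monthly HAI counts with mild seasonality and one spike (simulated).
months = np.arange(36)
counts = np.rint(4 + 2 * np.sin(2 * np.pi * months / 12))
counts[30] += 12
print(np.where(c_chart_flags(counts))[0], np.where(mad_anomaly_flags(counts))[0])
```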

  8. Review of the patient positioning reproducibility in head-and-neck radiotherapy using Statistical Process Control.

    PubMed

    Moore, Sarah J; Herst, Patries M; Louwe, Robert J W

    2018-05-01

    A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Statistical process control as a tool for controlling operating room performance: retrospective analysis and benchmarking.

    PubMed

    Chen, Tsung-Tai; Chang, Yun-Jau; Ku, Shei-Ling; Chung, Kuo-Piao

    2010-10-01

    There is much research using statistical process control (SPC) to monitor surgical performance, including comparisons among groups to detect small process shifts, but few of these studies have included a stabilization process. This study aimed to analyse the performance of surgeons in the operating room (OR) and to set a benchmark by SPC after a stabilized process. The OR profiles of 499 patients who underwent laparoscopic cholecystectomy performed by 16 surgeons at a tertiary hospital in Taiwan during 2005 and 2006 were recorded. SPC was applied to analyse operative and non-operative times using the following five steps: first, the times were divided into two segments; second, they were normalized; third, they were evaluated as individual processes; fourth, the ARL(0) was calculated; and fifth, the different groups (surgeons) were compared. Outliers were excluded to ensure stability for each group and to facilitate inter-group comparison. The results showed that in the stabilized process, only one surgeon exhibited a significantly shorter total process time (including operative and non-operative time). In this study, we use five steps to demonstrate how to control surgical and non-surgical time in phase I. Some measures can be taken to prevent skew and instability in the process. Also, using SPC, one surgeon can be shown to be a real benchmark. © 2010 Blackwell Publishing Ltd.
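
    For the fourth step, the in-control average run length of a standard 3-sigma individuals chart follows directly from the per-point false-alarm probability; the textbook calculation is sketched below (generic, not the paper's exact normalization).

```python
from scipy.stats import norm

# False-alarm probability per point on a 3-sigma Shewhart chart,
# and the resulting in-control average run length ARL(0) ~ 370.
alpha = 2 * (1 - norm.cdf(3))
arl0 = 1 / alpha
print(f"alpha = {alpha:.5f}, in-control ARL(0) = {arl0:.0f} points")
```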

  10. The Warning System in Disaster Situations: A Selective Analysis.

    DTIC Science & Technology

    (*DISASTERS, *WARNING SYSTEMS), CIVIL DEFENSE, SOCIAL PSYCHOLOGY, REACTION (PSYCHOLOGY), FACTOR ANALYSIS, CLASSIFICATION, STATISTICAL DATA, TIME, MANAGEMENT PLANNING AND CONTROL, DAMAGE, CONTROL SYSTEMS, THREAT EVALUATION, DECISION MAKING, DATA PROCESSING, COMMUNICATION SYSTEMS, NUCLEAR EXPLOSIONS

  11. 43 CFR 2.46 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... group of any records under the control of the Department or a bureau thereof from which information is... management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (i) Statistical records. As used in this subpart, “statistical records” means records in a system...

  12. Design and Analysis of A Multi-Backend Database System for Performance Improvement, Functionality Expansion and Capacity Growth. Part II.

    DTIC Science & Technology

    1981-08-01

    ... Execution of Transactions; 5.5.2 Attached Execution of Transactions; 5.5.3 The Choice of Transaction Execution for Access Control ... basic access control mechanism for statistical security and value-dependent security. In Section 5.5, we describe the process of execution of ... the process of request execution with access control for insert and non-insert requests in MDBS. We recall again (see Chapter 4) that the process

  13. Statistical Modeling of Single Target Cell Encapsulation

    PubMed Central

    Moon, SangJun; Ceyhan, Elvan; Gurkan, Umut Atakan; Demirci, Utkan

    2011-01-01

    High-throughput drop-on-demand systems for separation and encapsulation of individual target cells from heterogeneous mixtures of multiple cell types are an emerging method in biotechnology that has broad applications in tissue engineering and regenerative medicine, genomics, and cryobiology. However, cell encapsulation in droplets is a random process that is hard to control. Statistical models can provide an understanding of the underlying processes and estimation of the relevant parameters, and enable reliable and repeatable control over the encapsulation of cells in droplets during the isolation process with a high confidence level. We have modeled and experimentally verified a microdroplet-based cell encapsulation process for various combinations of cell loading and target cell concentrations. Here, we explain theoretically and validate experimentally a model to isolate and pattern single target cells from heterogeneous mixtures without using complex peripheral systems. PMID:21814548

  14. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  15. Results of a multicentre randomised controlled trial of statistical process control charts and structured diagnostic tools to reduce ward-acquired meticillin-resistant Staphylococcus aureus: the CHART Project.

    PubMed

    Curran, E; Harper, P; Loveday, H; Gilmour, H; Jones, S; Benneyan, J; Hood, J; Pratt, R

    2008-10-01

    Statistical process control (SPC) charts have previously been advocated for infection control quality improvement. To determine their effectiveness, a multicentre randomised controlled trial was undertaken to explore whether monthly SPC feedback from infection control nurses (ICNs) to healthcare workers of ward-acquired meticillin-resistant Staphylococcus aureus (WA-MRSA) colonisation or infection rates would produce any reductions in incidence. Seventy-five wards in 24 hospitals in the UK were randomised into three arms: (1) wards receiving SPC chart feedback; (2) wards receiving SPC chart feedback in conjunction with structured diagnostic tools; and (3) control wards receiving neither type of feedback. Twenty-five months of pre-intervention WA-MRSA data were compared with 24 months of post-intervention data. Statistically significant and sustained decreases in WA-MRSA rates were identified in all three arms (P<0.001; P=0.015; P<0.001). The mean percentage reduction was 32.3% for wards receiving SPC feedback, 19.6% for wards receiving SPC and diagnostic feedback, and 23.1% for control wards, but with no significant difference between the control and intervention arms (P=0.23). There were significantly more post-intervention 'out-of-control' episodes (P=0.021) in the control arm (averages of 0.60, 0.28, and 0.28 for Control, SPC and SPC+Tools wards, respectively). Participants identified SPC charts as an effective communication tool and valuable for disseminating WA-MRSA data.

  16. Batch statistical process control of a fluid bed granulation process using in-line spatial filter velocimetry and product temperature measurements.

    PubMed

    Burggraeve, A; Van den Kerkhof, T; Hellings, M; Remon, J P; Vervaet, C; De Beer, T

    2011-04-18

    Fluid bed granulation is a batch process, which is characterized by the processing of raw materials for a predefined period of time, consisting of a fixed spraying phase and a subsequent drying period. The present study shows the multivariate statistical modeling and control of a fluid bed granulation process based on in-line particle size distribution (PSD) measurements (using spatial filter velocimetry) combined with continuous product temperature registration, using a partial least squares (PLS) approach. Via the continuous in-line monitoring of the PSD and product temperature during granulation of various reference batches, a statistical batch model was developed allowing the real-time evaluation and acceptance or rejection of future batches. Continuously monitored PSD and product temperature process data of 10 reference batches (X-data) were used to develop a reference batch PLS model, regressing the X-data versus the batch process time (Y-data). Two PLS components captured 98.8% of the variation in the X-data block. Score control charts, in which the average batch trajectory and upper and lower control limits are displayed, were developed. Next, these control charts were used to monitor 4 new test batches in real time and to immediately detect any deviations from the expected batch trajectory. By real-time evaluation of new batches using the developed control charts and by computation of contribution plots of deviating process behavior at a certain time point, batch losses or reprocessing can be prevented. Immediately after batch completion, all PSD and product temperature information (i.e., a batch progress fingerprint) was used to estimate some granule properties (density and flowability) at an early stage, which can improve batch release time. Individual PLS models relating the computed scores (X) of the reference PLS model (based on the 10 reference batches) to the density and flowability, respectively, as Y-matrices were developed. The scores of the 4 test batches were used to examine the predictive ability of these models. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Defense Waste Processing Facility (DWPF) Liquidus Model: Revisions For Processing Higher TiO2 Containing Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C. M.; Edwards, T. B.; Trivelpiece, C. L.

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository. This report documents the development of revised TiO2, Na2O, Li2O and Fe2O3 coefficients in the DWPF liquidus model and revised coefficients (a, b, c, and d).

  18. 75 FR 64746 - Submission for OMB Review: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-20

    ... Pilot Testing for the BLS Green Practices and Processes Project. OMB Control Number: 1220-0NEW... DEPARTMENT OF LABOR Bureau of Labor Statistics Submission for OMB Review: Comment Request October... Officer for the Department of Labor--Bureau of Labor Statistics (BLS), Office of Management and Budget...

  19. Errors in patient specimen collection: application of statistical process control.

    PubMed

    Dzik, Walter Sunny; Beckman, Neil; Selleng, Kathleen; Heddle, Nancy; Szczepiorkowski, Zbigniew; Wendel, Silvano; Murphy, Michael

    2008-10-01

    Errors in the collection and labeling of blood samples for pretransfusion testing increase the risk of transfusion-associated patient morbidity and mortality. Statistical process control (SPC) is a recognized method to monitor the performance of a critical process. An easy-to-use SPC method was tested to determine its feasibility as a tool for monitoring quality in transfusion medicine. SPC control charts were adapted to a spreadsheet presentation. Data tabulating the frequency of mislabeled and miscollected blood samples from 10 hospitals in five countries from 2004 to 2006 were used to demonstrate the method. Control charts were produced to monitor process stability. The participating hospitals found the SPC spreadsheet very suitable to monitor the performance of the sample labeling and collection and applied SPC charts to suit their specific needs. One hospital monitored subcategories of sample error in detail. A large hospital monitored the number of wrong-blood-in-tube (WBIT) events. Four smaller-sized facilities, each following the same policy for sample collection, combined their data on WBIT samples into a single control chart. One hospital used the control chart to monitor the effect of an educational intervention. A simple SPC method is described that can monitor the process of sample collection and labeling in any hospital. SPC could be applied to other critical steps in the transfusion processes as a tool for biovigilance and could be used to develop regional or national performance standards for pretransfusion sample collection. A link is provided to download the spreadsheet for free.

  20. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days.
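
    Of the three techniques evaluated, the tabular CUSUM is the easiest to sketch; below is a minimal Python version applied to simulated transfer times, with the conventional k = 0.5σ reference value and h = 5σ decision interval. The parameters and data are illustrative, not those of the study.

```python
import numpy as np

def tabular_cusum(x, target, sigma, k=0.5, h=5.0):
    """One-sided tabular CUSUM pair; returns first signal index or -1."""
    x = np.asarray(x, float)
    kk, hh = k * sigma, h * sigma
    cp = cm = 0.0
    for i, xi in enumerate(x):
        cp = max(0.0, cp + (xi - target) - kk)   # accumulates upward shifts
        cm = max(0.0, cm + (target - xi) - kk)   # accumulates downward shifts
        if cp > hh or cm > hh:
            return i
    return -1

# Simulated daily transfer times (s): baseline 20 +/- 2, slowing from day 60.
rng = np.random.default_rng(11)
times = np.r_[rng.normal(20, 2, 60), rng.normal(22, 2, 30)]
print("change signalled on day", tabular_cusum(times, target=20, sigma=2))
```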

  1. Statistical process control based chart for information systems security

    NASA Astrophysics Data System (ADS)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions. We apply the concept of statistical process control (SPC) to intrusions of computer networks and information systems, and propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. The proposed scheme has only one parameter, which differentiates it from earlier versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with existing EWMA schemes and the p-chart, and finally provide some recommendations for future work.

  2. 77 FR 5614 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-03

    ... statistics and investigations; Harmonization of port State control activities; Port State Control (PSC...; Development of guidelines on port State control under the 2004 Ballast Water Management (BWM) Convention... of the room. To facilitate the building security process, and to request reasonable accommodation...

  3. 78 FR 2479 - Shipping Coordinating Committee; Notice of Committee Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-11

    ...); Casualty statistics and investigations; Harmonization of port State control activities; Port State Control... Convention, 2006; Development of guidelines on port State control under the 2004 Ballast Water Management... building security process, and to request reasonable accommodation, those who plan to attend should contact...

  4. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

    The principal objective of the research reported here is the redesign, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes, capable of learning fuzzy control rules from process data and improving its control through on-line adaptation. The learned improvement follows a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely defined processes for which standard models and controls are either inefficient, impractical or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control, permit on-line adaptation and, in the process, optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data followed by realistic industrial data, were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made in order to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study; some of these are in progress in our laboratory, while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.

  5. Functional differences between statistical learning with and without explicit training

    PubMed Central

    Reber, Paul J.; Paller, Ken A.

    2015-01-01

    Humans are capable of rapidly extracting regularities from environmental input, a process known as statistical learning. This type of learning typically occurs automatically, through passive exposure to environmental input. The presumed function of statistical learning is to optimize processing, allowing the brain to more accurately predict and prepare for incoming input. In this study, we ask whether the function of statistical learning may be enhanced through supplementary explicit training, in which underlying regularities are explicitly taught rather than simply abstracted through exposure. Learners were randomly assigned either to an explicit group or an implicit group. All learners were exposed to a continuous stream of repeating nonsense words. Prior to this implicit training, learners in the explicit group received supplementary explicit training on the nonsense words. Statistical learning was assessed through a speeded reaction-time (RT) task, which measured the extent to which learners used acquired statistical knowledge to optimize online processing. Both RTs and brain potentials revealed significant differences in online processing as a function of training condition. RTs showed a crossover interaction; responses in the explicit group were faster to predictable targets and marginally slower to less predictable targets relative to responses in the implicit group. P300 potentials to predictable targets were larger in the explicit group than in the implicit group, suggesting greater recruitment of controlled, effortful processes. Taken together, these results suggest that information abstracted through passive exposure during statistical learning may be processed more automatically and with less effort than information that is acquired explicitly. PMID:26472644

  6. 40 CFR 51.364 - Enforcement against contractors, stations and inspectors.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... suspend or revoke the station or inspector license within three station business days of the finding. (2..., revocations, and violations and shall compile statistics on violations and penalties on an annual basis. (d... approved by the Administrator. Statistical process control shall be used whenever possible to demonstrate...

  7. An Ontology for Identifying Cyber Intrusion Induced Faults in Process Control Systems

    NASA Astrophysics Data System (ADS)

    Hieb, Jeffrey; Graham, James; Guan, Jian

    This paper presents an ontological framework that permits formal representations of process control systems, including elements of the process being controlled and the control system itself. A fault diagnosis algorithm based on the ontological model is also presented. The algorithm can identify traditional process elements as well as control system elements (e.g., IP network and SCADA protocol) as fault sources. When these elements are identified as a likely fault source, the possibility exists that the process fault is induced by a cyber intrusion. A laboratory-scale distillation column is used to illustrate the model and the algorithm. Coupled with a well-defined statistical process model, this fault diagnosis approach provides cyber security enhanced fault diagnosis information to plant operators and can help identify that a cyber attack is underway before a major process failure is experienced.

  8. The estimation of the measurement results with using statistical methods

    NASA Astrophysics Data System (ADS)

    Velychko, O.; Gordiyenko, T.

    2015-02-01

    A range of international standards and guides describe statistical methods that can be applied to the management, control and improvement of processes, for the purpose of analysing technical measurement results. This paper describes an analysis of international standards and guides on statistical methods for the estimation of measurement results, with recommendations for their application in laboratories. To carry out this analysis, cause-and-effect Ishikawa diagrams concerning the application of statistical methods to the estimation of measurement results were constructed.

  9. “Using Statistical Comparisons between SPartICus Cirrus Microphysical Measurements, Detailed Cloud Models, and GCM Cloud Parameterizations to Understand Physical Processes Controlling Cirrus Properties and to Improve the Cloud Parameterizations”

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Sarah

    2015-12-01

    The dual objectives of this project were to improve our basic understanding of the processes that control cirrus microphysical properties and to improve the representation of these processes in cloud parameterizations. A major effort in the proposed research was to integrate, calibrate, and better understand the uncertainties in all of these measurements.

  10. Application of Statistical Quality Control Techniques to Detonator Fabrication: Feasibility Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, J. Frank

    1971-05-20

    A feasibility study was performed on the use of process control techniques that might reduce the need for duplicate inspection by production inspection and quality control inspection. Two active detonator fabrication programs were selected for the study. Inspection areas accounting for the greatest percentage of total inspection costs were selected by applying “Pareto's Principle of Maldistribution.” Data from these areas were then gathered and analyzed by a process capability study.

  11. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior, as well as the course of treatment, to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability, based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control, helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help it achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.

  12. Terms of Productivity, Including the Relationship Between Productivity, Effectiveness and Efficiency.

    DTIC Science & Technology

    1989-04-01

    for Awareness Juran on Planning for Quality, 1988, J.M. Juran What is Total Quality Control? The Japanese Way, 1985, Kaoru Ishikawa Guide to Quality...Control, 1982, Kaoru Ishikawa Andrews, M. (1985). Statistical Process Control: Mandatory Management Tool. Production April 1985. Bushe, G. (1988

  13. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
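
    The capability indices used in studies like this one can be computed directly from routine QA measurements. Below is a minimal sketch assuming hypothetical daily D/MU deviations checked against the ±2% tolerance; the data and the choice of indices are illustrative rather than taken from the study.

    import numpy as np

    def capability(x, lsl, usl, target=None):
        """Basic process capability indices for measurements x against
        lower/upper specification limits (lsl, usl)."""
        mu, sigma = np.mean(x), np.std(x, ddof=1)
        cp = (usl - lsl) / (6 * sigma)                  # potential capability
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # penalises off-centre mean
        target = (usl + lsl) / 2 if target is None else target
        # Taguchi-style index penalising deviation from the target value
        cpm = (usl - lsl) / (6 * np.sqrt(sigma**2 + (mu - target)**2))
        return cp, cpk, cpm

    # Hypothetical daily D/MU deviations (%) against a +/-2% tolerance
    rng = np.random.default_rng(1)
    dev = rng.normal(0.1, 0.4, 60)
    print("Cp=%.2f  Cpk=%.2f  Cpm=%.2f" % capability(dev, -2.0, 2.0))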

  14. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

    failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National

  15. A survey of statistics in three UK general practice journals

    PubMed Central

    Rigby, Alan S; Armstrong, Gillian K; Campbell, Michael J; Summerton, Nick

    2004-01-01

    Background Many medical specialities have reviewed the statistical content of their journals. To our knowledge this has not been done in general practice. Given the main role of a general practitioner as a diagnostician we thought it would be of interest to see whether the statistical methods reported reflect the diagnostic process. Methods Hand search of three UK journals of general practice namely the British Medical Journal (general practice section), British Journal of General Practice and Family Practice over a one-year period (1 January to 31 December 2000). Results A wide variety of statistical techniques were used. The most common methods included t-tests and Chi-squared tests. There were few articles reporting likelihood ratios and other useful diagnostic methods. There was evidence that the journals with the more thorough statistical review process reported a more complex and wider variety of statistical techniques. Conclusions The BMJ had a wider range and greater diversity of statistical methods than the other two journals. However, in all three journals there was a dearth of papers reflecting the diagnostic process. Across all three journals there were relatively few papers describing randomised controlled trials thus recognising the difficulty of implementing this design in general practice. PMID:15596014

  16. Statistical process control charts for attribute data involving very large sample sizes: a review of problems and solutions.

    PubMed

    Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard

    2013-04-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation which is attributable to the underlying process, and special-cause variation which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice.
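
    Laney's approach widens the usual p-chart limits by a between-subgroup variation factor, sigma_z, estimated from the moving range of the standardised proportions. A minimal sketch with invented counts and denominators follows; none of the figures come from the review.

    import numpy as np

    def laney_p_prime_limits(counts, sizes):
        """Laney p' chart: classical p-chart limits widened by the
        between-subgroup variation factor sigma_z."""
        counts, sizes = np.asarray(counts, float), np.asarray(sizes, float)
        p = counts / sizes
        pbar = counts.sum() / sizes.sum()
        sigma_pi = np.sqrt(pbar * (1 - pbar) / sizes)   # within-subgroup sigma
        z = (p - pbar) / sigma_pi                       # standardised proportions
        sigma_z = np.mean(np.abs(np.diff(z))) / 1.128   # from the moving range
        ucl = pbar + 3 * sigma_pi * sigma_z
        lcl = np.clip(pbar - 3 * sigma_pi * sigma_z, 0, None)
        return p, pbar, ucl, lcl

    # Hypothetical monthly emergency admissions over very large denominators
    admissions = [5120, 5340, 4980, 5600, 5410, 5230, 5760, 5050]
    patients = [61000, 64000, 60500, 66000, 65200, 63000, 67400, 61800]
    p, centre, ucl, lcl = laney_p_prime_limits(admissions, patients)
    print("points out of control:", np.flatnonzero((p > ucl) | (p < lcl)))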

  17. Statistical transformation and the interpretation of inpatient glucose control data.

    PubMed

    Saulnier, George E; Castro, Janna C; Cook, Curtiss B

    2014-03-01

    To introduce a statistical method of assessing hospital-based non-intensive care unit (non-ICU) inpatient glucose control. Point-of-care blood glucose (POC-BG) data from hospital non-ICUs were extracted for January 1 through December 31, 2011. Glucose data distribution was examined before and after Box-Cox transformations and compared to normality. Different subsets of data were used to establish upper and lower control limits, and exponentially weighted moving average (EWMA) control charts were constructed from June, July, and October data as examples to determine if out-of-control events were identified differently in nontransformed versus transformed data. A total of 36,381 POC-BG values were analyzed. In all 3 monthly test samples, glucose distributions in nontransformed data were skewed but approached a normal distribution once transformed. Interpretation of out-of-control events from EWMA control chart analyses also revealed differences. In the June test data, an out-of-control process was identified at sample 53 with nontransformed data, whereas the transformed data remained in control for the duration of the observed period. Analysis of July data demonstrated an out-of-control process sooner in the transformed (sample 55) than nontransformed (sample 111) data, whereas for October, transformed data remained in control longer than nontransformed data. Statistical transformations increase the normal behavior of inpatient non-ICU glycemic data sets. The decision to transform glucose data could influence the interpretation and conclusions about the status of inpatient glycemic control. Further study is required to determine whether transformed versus nontransformed data influence clinical decisions or evaluation of interventions.
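
    For readers unfamiliar with the transformation step, the following sketch applies a Box-Cox transformation to simulated right-skewed glucose-like values and compares normality before and after; the distribution parameters are invented and the data are not the study's POC-BG measurements.

    import numpy as np
    from scipy import stats

    # Hypothetical point-of-care glucose values (mg/dL): right-skewed,
    # as inpatient glycemic data typically are
    rng = np.random.default_rng(2)
    glucose = rng.lognormal(mean=np.log(140), sigma=0.3, size=500)

    # Box-Cox needs strictly positive data; lambda is fitted by maximum likelihood
    transformed, lam = stats.boxcox(glucose)
    print(f"estimated lambda: {lam:.2f}")

    # Departure from normality before and after the transformation
    for label, x in [("raw", glucose), ("transformed", transformed)]:
        print(f"{label:12s} skew={stats.skew(x):+.2f}  "
              f"normality p={stats.normaltest(x).pvalue:.3f}")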

  18. Design and analysis of multiple diseases genome-wide association studies without controls.

    PubMed

    Chen, Zhongxue; Huang, Hanwen; Ng, Hon Keung Tony

    2012-11-15

    In genome-wide association studies (GWAS), multiple diseases with shared controls is one of the case-control study designs. If data obtained from these studies are appropriately analyzed, this design can have several advantages, such as improving statistical power in detecting associations and reducing the time and cost of data collection. In this paper, we propose a study design for GWAS that involves multiple diseases but no controls, together with a corresponding statistical data analysis strategy. Through a simulation study, we show that the statistical association test with the proposed study design is more powerful than the test with a single disease sharing common controls, and that it has comparable power to the overall test based on the whole dataset including the controls. We also apply the proposed method to a real GWAS dataset to illustrate the methodology and the advantages of the proposed design. Some possible limitations of this study design and testing method, and their solutions, are also discussed. Our findings indicate that the proposed study design and statistical analysis strategy could be more efficient than the usual case-control GWAS as well as those with shared controls. Copyright © 2012 Elsevier B.V. All rights reserved.

  19. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    PubMed

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. Six megavolts with nine fields were used for the IMRT plan and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criteria were used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than VMAT QA because the VMAT plan has more continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value of IMRT QA was 1.60 and VMAT QA was 1.99, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are good quality because Cpml values are higher than 1.0.
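
    The practice of deriving limits from an initial run of data can be sketched with an individuals chart whose sigma is estimated from the average moving range. The % gamma pass values below are simulated around the reported IMRT mean, and the charting details are a generic sketch rather than the authors' exact procedure.

    import numpy as np

    def individuals_limits(baseline):
        """Centre line and 3-sigma limits for an individuals (X) chart,
        with sigma estimated from the average moving range (d2 = 1.128)."""
        x = np.asarray(baseline, float)
        sigma = np.mean(np.abs(np.diff(x))) / 1.128
        return x.mean(), x.mean() + 3 * sigma, x.mean() - 3 * sigma

    # Hypothetical % gamma pass rates; the first 50 plans form the baseline
    rng = np.random.default_rng(3)
    gamma_pass = np.clip(rng.normal(93.7, 3.7, 80), 0, 100)
    centre, ucl, lcl = individuals_limits(gamma_pass[:50])
    print(f"centre={centre:.1f}%  LCL={lcl:.1f}%  UCL={ucl:.1f}%")
    # Later plans falling below the LCL would trigger a QA investigation
    print("flagged plans:", np.flatnonzero(gamma_pass[50:] < lcl) + 51)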

  20. [Statistical Process Control (SPC) can help prevent treatment errors without increasing costs in radiotherapy].

    PubMed

    Govindarajan, R; Llueguera, E; Melero, A; Molero, J; Soler, N; Rueda, C; Paradinas, C

    2010-01-01

    Statistical Process Control (SPC) was applied to monitor patient set-up in radiotherapy and, when the measured set-up error values indicated a loss of process stability, its root cause was identified and eliminated to prevent set-up errors. Set-up errors were measured for the medial-lateral (ml), cranial-caudal (cc) and anterior-posterior (ap) dimensions and the upper control limits were then calculated. Once the control limits were known and the range variability was acceptable, treatment set-up errors were monitored using sub-groups of 3 patients, three times each shift. These values were plotted on a control chart in real time. Control limit values showed that the existing variation was acceptable. Set-up errors, measured and plotted on an X chart, helped monitor the set-up process stability; if and when stability was lost, treatment was interrupted, the particular cause responsible for the non-random pattern was identified, and corrective action was taken before proceeding with the treatment. The SPC protocol focuses on controlling the variability due to assignable causes instead of focusing on patient-to-patient variability, which normally does not exist. Compared with weekly sampling of set-up error in each and every patient, which may only ensure that just those sampled sessions were set up correctly, the SPC method enables set-up error prevention in all treatment sessions for all patients and, at the same time, reduces control costs. Copyright © 2009 SECA. Published by Elsevier España. All rights reserved.

  1. On the Stability of Jump-Linear Systems Driven by Finite-State Machines with Markovian Inputs

    NASA Technical Reports Server (NTRS)

    Patilkulkarni, Sudarshan; Herencia-Zapana, Heber; Gray, W. Steven; Gonzalez, Oscar R.

    2004-01-01

    This paper presents two mean-square stability tests for a jump-linear system driven by a finite-state machine with a first-order Markovian input process. The first test is based on conventional Markov jump-linear theory and avoids the use of any higher-order statistics. The second test is developed directly using the higher-order statistics of the machine's output process. The two approaches are illustrated with a simple model for a recoverable computer control system.

  2. Assessment of the beryllium lymphocyte proliferation test using statistical process control.

    PubMed

    Cher, Daniel J; Deubner, David C; Kelsh, Michael A; Chapman, Pamela S; Ray, Rose M

    2006-10-01

    Despite more than 20 years of surveillance and epidemiologic studies using the beryllium blood lymphocyte proliferation test (BeBLPT) as a measure of beryllium sensitization (BeS) and as an aid for diagnosing subclinical chronic beryllium disease (CBD), improvements in specific understanding of the inhalation toxicology of CBD have been limited. Although epidemiologic data suggest that BeS and CBD risks vary by process/work activity, it has proven difficult to reach specific conclusions regarding the dose-response relationship between workplace beryllium exposure and BeS or subclinical CBD. One possible reason for this uncertainty could be misclassification of BeS resulting from variation in BeBLPT testing performance. The reliability of the BeBLPT, a biological assay that measures beryllium sensitization, is unknown. To assess the performance of four laboratories that conducted this test, we used data from a medical surveillance program that offered testing for beryllium sensitization with the BeBLPT. The study population was workers exposed to beryllium at various facilities over a 10-year period (1992-2001). Workers with abnormal results were offered diagnostic workups for CBD. Our analyses used a standard statistical technique, statistical process control (SPC), to evaluate test reliability. The study design involved a repeated measures analysis of BeBLPT results generated from the company-wide, longitudinal testing. Analytical methods included use of (1) statistical process control charts that examined temporal patterns of variation for the stimulation index, a measure of cell reactivity to beryllium; (2) correlation analysis that compared prior perceptions of BeBLPT instability to the statistical measures of test variation; and (3) assessment of the variation in the proportion of missing test results and how time periods with more missing data influenced SPC findings. During the period of this study, all laboratories displayed variation in test results that were beyond what would be expected due to chance alone. Patterns of test results suggested that variations were systematic. We conclude that laboratories performing the BeBLPT or other similar biological assays of immunological response could benefit from a statistical approach such as SPC to improve quality management.

  3. Graphene growth process modeling: a physical-statistical approach

    NASA Astrophysics Data System (ADS)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of growth mechanisms, and methods of characterization and control of the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
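
    A confined exponential growth law of the form A(t) = A_max (1 - exp(-kt)) can be fitted to island-area measurements by nonlinear least squares. The sketch below uses simulated data; the functional form follows the description above, but the parameter values are invented.

    import numpy as np
    from scipy.optimize import curve_fit

    def confined_exponential(t, a_max, k):
        """Island area growth that saturates as coverage approaches a_max."""
        return a_max * (1.0 - np.exp(-k * t))

    # Hypothetical island-area measurements (arbitrary units) over growth time
    t = np.linspace(0, 30, 16)
    rng = np.random.default_rng(4)
    area = confined_exponential(t, 12.0, 0.15) + rng.normal(0, 0.3, t.size)

    popt, _ = curve_fit(confined_exponential, t, area, p0=[10.0, 0.1])
    print("fitted a_max=%.2f, k=%.3f" % tuple(popt))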

  4. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    NASA Astrophysics Data System (ADS)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to conduct a study that assesses the process improvement, quality management and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities, techniques that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks for improving business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and provides a detailed discussion of the findings of a gap analysis of the process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, of gaps that exist in the literature, and of a comparison analysis identifying the gaps between the SEI's “healthy ingredients” of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization model and dashboard that demonstrates the use of statistical methods, statistical process control, sensitivity analysis, and quantitative and optimization techniques to establish a baseline and predict future customer satisfaction index scores (outcomes). The American Customer Satisfaction Index (ACSI) model and industry benchmarks were used as a framework for the simulation model.
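
    As a loose illustration of the Monte Carlo prediction of customer satisfaction scores mentioned above, the sketch below propagates uncertainty in a few weighted drivers to a predicted index; the driver structure, weights and distributions are invented for the example and do not reproduce the ACSI model.

    import numpy as np

    # Hypothetical driver weights and score distributions for a satisfaction
    # index; the weighted-driver structure only loosely mirrors ACSI
    rng = np.random.default_rng(5)
    weights = {"quality": 0.5, "expectations": 0.2, "value": 0.3}
    drivers = {"quality": (82, 4),        # (mean, std dev) on a 0-100 scale
               "expectations": (75, 6),
               "value": (70, 5)}

    n = 100_000
    index = sum(w * rng.normal(*drivers[k], n) for k, w in weights.items())
    lo, hi = np.percentile(index, [5, 95])
    print(f"predicted index: mean={index.mean():.1f}, 90% interval=({lo:.1f}, {hi:.1f})")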

  5. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with a facility to monitor the performance of the process by obtaining and analyzing the data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.

  6. Ready-to-Use Simulation: Demystifying Statistical Process Control

    ERIC Educational Resources Information Center

    Sumukadas, Narendar; Fairfield-Sonn, James W.; Morgan, Sandra

    2005-01-01

    Business students are typically introduced to the concept of process management in their introductory course on operations management. A very important learning outcome here is an appreciation that the management of processes is a key to the management of quality. Some of the related concepts are qualitative, such as strategic and behavioral…

  7. Computer aided statistical process control for on-line instrumentation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meils, D.E.

    1995-01-01

    On-line chemical process instrumentation has historically been used for trending. Recent technological advances in on-line instrumentation have improved the accuracy and reliability of on-line instrumentation. However, little attention has been given to validating and verifying on-line instrumentation. This paper presents two practical approaches for validating instrument performance by comparison of on-line instrument response to either another portable instrument or another bench instrument. Because the comparison of two instruments' performance to each other requires somewhat complex statistical calculations, a computer code (Lab Stats Pack®) is used to simplify the calculations. Lab Stats Pack® also develops control charts that may be used for continuous verification of on-line instrument performance.

  8. Understanding medical group financial and operational performance: the synergistic effect of linking statistical process control and profit and loss.

    PubMed

    Smolko, J R; Greisler, D S

    2001-01-01

    There is ongoing pressure for medical groups owned by not-for-profit health care systems or for-profit entrepreneurs to generate profit. The fading promise of superior strategy through health care integration has boards of directors clamoring for bottom-line performance. While prudent, sole focus on the bottom line through the lens of the profit-and-loss (P&L) statement provides incomplete information upon which to base executive decisions. The purpose of this paper is to suggest that placing statistical process control (SPC) charts in tandem with the P&L statement provides a more complete picture of medical group performance thereby optimizing decision making as executives deal with the whitewater issues surrounding physician practice ownership.

  9. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    PubMed

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities was implemented at the home, clinic and institutional levels. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients. © The Author 2017. Published by Oxford University Press in association with the International Society for Quality in Health Care. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com

  10. A System-Oriented Approach for the Optimal Control of Process Chains under Stochastic Influences

    NASA Astrophysics Data System (ADS)

    Senn, Melanie; Schäfer, Julian; Pollak, Jürgen; Link, Norbert

    2011-09-01

    Process chains in manufacturing consist of multiple connected processes in terms of dynamic systems. The properties of a product passing through such a process chain are influenced by the transformation of each single process. There exist various methods for the control of individual processes, such as classical state controllers from cybernetics or function mapping approaches realized by statistical learning. These controllers ensure that a desired state is obtained at process end despite of variations in the input and disturbances. The interactions between the single processes are thereby neglected, but play an important role in the optimization of the entire process chain. We divide the overall optimization into two phases: (1) the solution of the optimization problem by Dynamic Programming to find the optimal control variable values for each process for any encountered end state of its predecessor and (2) the application of the optimal control variables at runtime for the detected initial process state. The optimization problem is solved by selecting adequate control variables for each process in the chain backwards based on predefined quality requirements for the final product. For the demonstration of the proposed concept, we have chosen a process chain from sheet metal manufacturing with simplified transformation functions.
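
    The backward dynamic-programming phase described as step (1) can be sketched for a toy chain with a discretised product state and two control settings per process; the states, transitions, costs and quality target below are all illustrative assumptions, not the authors' model.

    # Toy backward dynamic program over a chain of processes. The states,
    # controls, transition rule and costs are illustrative assumptions.
    states = [0, 1, 2]              # discretised product state after each process
    controls = ["low", "high"]      # control variable settings per process

    def step(state, control):
        """Hypothetical process transformation: (next_state, control cost)."""
        if control == "high":
            return min(state + 1, 2), 1.0   # improves the state at higher cost
        return state, 0.2

    def solve(n_processes, target_state=2):
        # value[s] = best cost-to-go from state s; a terminal penalty
        # enforces the quality requirement on the final product
        value = {s: 0.0 if s == target_state else 100.0 for s in states}
        policy = []
        for _ in range(n_processes):        # sweep backwards over the chain
            new_value, stage = {}, {}
            for s in states:
                new_value[s], stage[s] = min(
                    (step(s, u)[1] + value[step(s, u)[0]], u) for u in controls)
            value, policy = new_value, [stage] + policy
        return value, policy

    value, policy = solve(n_processes=3)
    s, plan = 0, []
    for stage in policy:                    # roll the optimal policy forward
        u = stage[s]
        plan.append(u)
        s, _ = step(s, u)
    print("controls along the chain:", plan, "| final state:", s,
          "| total cost from raw state:", value[0])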

  11. Licorice Production and Manufacturing: All-Sorts of Practical Applications for Statistics

    ERIC Educational Resources Information Center

    Watson, Jane; Skalicky, Jane; Fitzallen, Noleine; Wright, Suzie

    2009-01-01

    Among the practical applications of statistics is the collection of data from manufacturing processes. Often collected in the form of a time series, data collected from a series of measurements show the variation in those measurements, such as mass of a product manufactured. Limits are set for quality control and if these are exceeded then a…

  12. Estimating the Time to Benefit for Preventive Drugs with the Statistical Process Control Method: An Example with Alendronate.

    PubMed

    van de Glind, Esther M M; Willems, Hanna C; Eslami, Saeid; Abu-Hanna, Ameen; Lems, Willem F; Hooft, Lotty; de Rooij, Sophia E; Black, Dennis M; van Munster, Barbara C

    2016-05-01

    For physicians dealing with patients with a limited life expectancy, knowing the time to benefit (TTB) of preventive medication is essential to support treatment decisions. The aim of this study was to investigate the usefulness of statistical process control (SPC) for determining the TTB in relation to fracture risk with alendronate versus placebo in postmenopausal women. We performed a post hoc analysis of the Fracture Intervention Trial (FIT), a randomized, controlled trial that investigated the effect of alendronate versus placebo on fracture risk in postmenopausal women. We used SPC, a statistical method used for monitoring processes for quality control, to determine if and when the intervention group benefited significantly more than the control group. SPC discriminated between the normal variations over time in the numbers of fractures in both groups and the variations that were attributable to alendronate. The TTB was defined as the time point from which the cumulative difference in the number of clinical fractures remained greater than the upper control limit on the SPC chart. For the total group, the TTB was defined as 11 months. For patients aged ≥70 years, the TTB was 8 months [absolute risk reduction (ARR) = 1.4%]; for patients aged <70 years, it was 19 months (ARR = 0.7%). SPC is a clear and understandable graphical method to determine the TTB. Its main advantage is that there is no need to define a prespecified time point, as is the case in traditional survival analyses. Prescribing alendronate to patients who are aged ≥70 years is useful because the TTB shows that they will benefit after 8 months. Investigators should report the TTB to simplify clinical decision making.
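
    The core of the idea, accumulating the difference in event counts between the arms and reading off the first time the curve stays above an upper control limit, can be sketched as follows; the monthly fracture counts and the sigma estimate are invented for illustration and do not reproduce the FIT analysis.

    import numpy as np

    def time_to_benefit(events_treat, events_ctrl):
        """Month from which the cumulative event-count difference
        (control minus treatment) stays above an upper control limit."""
        diff = np.cumsum(np.asarray(events_ctrl) - np.asarray(events_treat))
        monthly = np.diff(np.concatenate([[0], diff]))
        sigma = monthly[:6].std(ddof=1)        # variability from early months
        t = np.arange(1, len(diff) + 1)
        ucl = 3 * sigma * np.sqrt(t)           # sigma of a cumsum grows as sqrt(t)
        above = diff > ucl
        for m in range(len(above)):            # first month it stays above
            if above[m:].all():
                return m + 1
        return None

    # Hypothetical monthly clinical fracture counts in each arm
    ctrl = [4, 5, 3, 6, 5, 4, 7, 6, 8, 7, 9, 8]
    treat = [4, 4, 3, 5, 4, 3, 4, 3, 4, 3, 4, 3]
    print("estimated time to benefit (months):", time_to_benefit(treat, ctrl))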

  13. Output statistics of laser anemometers in sparsely seeded flows

    NASA Technical Reports Server (NTRS)

    Edwards, R. V.; Jensen, A. S.

    1982-01-01

    It is noted that until very recently, research on this topic concentrated on the particle arrival statistics and the influence of the optical parameters on them. Little attention has been paid to the influence of subsequent processing on the measurement statistics. There is also controversy over whether the effects of the particle statistics can be measured. It is shown here that some of the confusion derives from a lack of understanding of the experimental parameters that are to be controlled or known. A rigorous framework is presented for examining the measurement statistics of such systems. To provide examples, two problems are then addressed: the first has to do with a sample-and-hold processor, the second with what is called a saturable processor. The sample-and-hold processor converts the output to a continuous signal by holding the last reading until a new one is obtained. The saturable system is one in which the maximum processable rate is set by the dead time of some unit in the system. At high particle rates, the processed rate is determined by the dead time.

  14. Plan delivery quality assurance for CyberKnife: Statistical process control analysis of 350 film-based patient-specific QAs.

    PubMed

    Bellec, J; Delaby, N; Jouyaux, F; Perdrieux, M; Bouvier, J; Sorel, S; Henry, O; Lafond, C

    2017-07-01

    Robotic radiosurgery requires plan delivery quality assurance (DQA) but there has never been a published comprehensive analysis of a patient-specific DQA process in a clinic. We proposed to evaluate 350 consecutive film-based patient-specific DQAs using statistical process control. We evaluated the performance of the process to propose achievable tolerance criteria for DQA validation and we sought to identify suboptimal DQA using control charts. DQAs were performed on a CyberKnife-M6 using Gafchromic-EBT3 films. The signal-to-dose conversion was performed using a multichannel-correction and a scanning protocol that combined measurement and calibration in a single scan. The DQA analysis comprised a gamma-index analysis at 3%/1.5mm and a separate evaluation of spatial and dosimetric accuracy of the plan delivery. Each parameter was plotted on a control chart and control limits were calculated. A capability index (Cpm) was calculated to evaluate the ability of the process to produce results within specifications. The analysis of capability showed that a gamma pass rate of 85% at 3%/1.5mm was highly achievable as acceptance criteria for DQA validation using a film-based protocol (Cpm>1.33). 3.4% of DQA were outside a control limit of 88% for gamma pass-rate. The analysis of the out-of-control DQA helped identify a dosimetric error in our institute for a specific treatment type. We have defined initial tolerance criteria for DQA validations. We have shown that the implementation of a film-based patient-specific DQA protocol with the use of control charts is an effective method to improve patient treatment safety on CyberKnife. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  15. ON CONTINUOUS-REVIEW (S-1,S) INVENTORY POLICIES WITH STATE-DEPENDENT LEADTIMES,

    DTIC Science & Technology

    INVENTORY CONTROL, *REPLACEMENT THEORY), MATHEMATICAL MODELS, LEAD TIME, MANAGEMENT ENGINEERING, DISTRIBUTION FUNCTIONS, PROBABILITY, QUEUEING THEORY, COSTS, OPTIMIZATION, STATISTICAL PROCESSES, DIFFERENCE EQUATIONS

  16. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    PubMed

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. Setting: King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts (p charts), displayed for all employees. Data extraction, archiving, entry, analysis, plotting and the design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. Audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year (P < 0.001). It is possible to improve providers' behaviour regarding implementation of given guidelines through periodic process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.

  17. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. Part 2: Theoretical development of a dynamic model and application to rain fade durations and tolerable control delays for fade countermeasures

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1987-01-01

    A dynamic rain attenuation prediction model is developed for use in obtaining the temporal characteristics, on time scales of minutes or hours, of satellite communication link availability. Analogous to the associated static rain attenuation model, which yields yearly attenuation predictions, this dynamic model is applicable at any location in the world that is characterized by the static rain attenuation statistics peculiar to the geometry of the satellite link and the rain statistics of the location. Such statistics are calculated by employing the formalism of Part I of this report. In fact, the dynamic model presented here is an extension of the static model and reduces to the static model in the appropriate limit. By assuming that rain attenuation is dynamically described by a first-order stochastic differential equation in time and that this random attenuation process is a Markov process, an expression for the associated transition probability is obtained by solving the related forward Kolmogorov equation. This transition probability is then used to obtain such temporal rain attenuation statistics as attenuation durations and allowable attenuation margins versus control system delay.

  18. Interactive Video: Meeting the Ford Challenge.

    ERIC Educational Resources Information Center

    Copeland, Peter

    Many companies using Statistical Process Control (SPC) in their manufacturing processes have found that, despite the training difficulties presented by the technique, the rewards of successful SPC include increased productivity, quality, and market leadership. The Ford Motor Company has developed its SPC training with interactive video, which…

  19. 14 CFR 21.303 - Replacement and modification parts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determination can be made. Statistical quality control procedures may be employed where it is shown that a... AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Approval of Materials, Parts, Processes, and... the configuration of the part; and (ii) Information on dimensions, materials, and processes necessary...

  20. Adaptive filtering in biological signal processing.

    PubMed

    Iyer, V K; Ploysongsang, Y; Ramamoorthy, P A

    1990-01-01

    The high dependence of conventional optimal filtering methods on a priori knowledge of the signal and noise statistics renders them ineffective in dealing with signals whose statistics cannot be predetermined accurately. Adaptive filtering methods offer a better alternative, since a priori knowledge of statistics is less critical, real-time processing is possible, and the computations are less expensive for this approach. Adaptive filtering methods compute the filter coefficients "on-line", converging to the optimal values in the least-mean-square (LMS) error sense. Adaptive filtering is therefore apt for dealing with the "unknown" statistics situation and has been applied extensively in areas like communication, speech, radar, sonar, seismology, and biological signal processing and analysis for channel equalization, interference and echo canceling, line enhancement, signal detection, system identification, spectral analysis, beamforming, modeling, control, etc. In this review article, adaptive filtering in the context of biological signals is reviewed. An intuitive approach to the underlying theory of adaptive filters and its applicability are presented. Applications of the principles in biological signal processing are discussed in a manner that brings out the key ideas involved. Current and potential future directions in adaptive biological signal processing are also discussed.
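
    A minimal sketch of the LMS algorithm at the heart of such adaptive filters follows, here cancelling mains-frequency interference from a simulated biological signal using a reference channel; the signals, filter length and step size are illustrative assumptions.

    import numpy as np

    def lms_filter(x, d, n_taps=16, mu=0.005):
        """Least-mean-square adaptive filter: adapts weights w so that the
        filtered reference x tracks d; returns the error e = d - y, i.e. the
        recording with the correlated interference removed."""
        w, e = np.zeros(n_taps), np.zeros(len(x))
        for n in range(n_taps, len(x)):
            u = x[n - n_taps:n][::-1]    # most recent reference samples
            y = w @ u                    # filter output
            e[n] = d[n] - y              # estimation error
            w += 2 * mu * e[n] * u       # LMS weight update
        return e

    # Hypothetical recording: a slow physiological wave plus 50 Hz mains
    # interference, with the mains reference available as a second channel
    fs = 500
    t = np.arange(0, 4, 1 / fs)
    signal = np.sin(2 * np.pi * 1.2 * t)
    reference = np.sin(2 * np.pi * 50 * t + 0.7)
    recording = signal + 0.8 * np.sin(2 * np.pi * 50 * t)
    cleaned = lms_filter(reference, recording)
    print("residual error power: %.4f"
          % np.mean((cleaned[1000:] - signal[1000:]) ** 2))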

  1. Statistical process control applied to mechanized peanut sowing as a function of soil texture.

    PubMed

    Zerbato, Cristiano; Furlani, Carlos Eduardo Angeli; Ormond, Antonio Tassio Santana; Gírio, Lucas Augusto da Silva; Carneiro, Franciele Morlin; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing.

  2. Statistical process control applied to mechanized peanut sowing as a function of soil texture

    PubMed Central

    Furlani, Carlos Eduardo Angeli; da Silva, Rouverson Pereira

    2017-01-01

    The successful establishment of agricultural crops depends on sowing quality, machinery performance, soil type and conditions, among other factors. This study evaluates the operational quality of mechanized peanut sowing in three soil types (sand, silt, and clay) with variable moisture contents. The experiment was conducted in three locations in the state of São Paulo, Brazil. The track-sampling scheme was used for 80 sampling locations of each soil type. Descriptive statistics and statistical process control (SPC) were used to evaluate the quality indicators of mechanized peanut sowing. The variables had normal distributions and were stable from the viewpoint of SPC. The best performance for peanut sowing density, normal spacing, and the initial seedling growing stand was found for clayey soil followed by sandy soil and then silty soil. Sandy or clayey soils displayed similar results regarding sowing depth, which was deeper than in the silty soil. Overall, the texture and the moisture of clayey soil provided the best operational performance for mechanized peanut sowing. PMID:28742095

  3. Mutual interference between statistical summary perception and statistical learning.

    PubMed

    Zhao, Jiaying; Ngo, Nhi; McKendrick, Ryan; Turk-Browne, Nicholas B

    2011-09-01

    The visual system is an efficient statistician, extracting statistical summaries over sets of objects (statistical summary perception) and statistical regularities among individual objects (statistical learning). Although these two kinds of statistical processing have been studied extensively in isolation, their relationship is not yet understood. We first examined how statistical summary perception influences statistical learning by manipulating the task that participants performed over sets of objects containing statistical regularities (Experiment 1). Participants who performed a summary task showed no statistical learning of the regularities, whereas those who performed control tasks showed robust learning. We then examined how statistical learning influences statistical summary perception by manipulating whether the sets being summarized contained regularities (Experiment 2) and whether such regularities had already been learned (Experiment 3). The accuracy of summary judgments improved when regularities were removed and when learning had occurred in advance. In sum, calculating summary statistics impeded statistical learning, and extracting statistical regularities impeded statistical summary perception. This mutual interference suggests that statistical summary perception and statistical learning are fundamentally related.

  4. Statistical process control for electron beam monitoring.

    PubMed

    López-Tarjuelo, Juan; Luquero-Llopis, Naika; García-Mollá, Rafael; Quirós-Higueras, Juan David; Bouché-Babiloni, Ana; Juan-Senabre, Xavier Jordi; de Marco-Blancas, Noelia; Ferrer-Albiach, Carlos; Santos-Serra, Agustín

    2015-07-01

    To assess the electron beam monitoring statistical process control (SPC) in linear accelerator (linac) daily quality control. We present a long-term record of our measurements and evaluate which SPC-led conditions are feasible for maintaining control. We retrieved our linac beam calibration, symmetry, and flatness daily records for all electron beam energies from January 2008 to December 2013, and retrospectively studied how SPC could have been applied and which of its features could be used in the future. A set of adjustment interventions designed to maintain these parameters under control was also simulated. All phase I data was under control. The dose plots were characterized by rising trends followed by steep drops caused by our attempts to re-center the linac beam calibration. Where flatness and symmetry trends were detected they were less-well defined. The process capability ratios ranged from 1.6 to 9.3 at a 2% specification level. Simulated interventions ranged from 2% to 34% of the total number of measurement sessions. We also noted that if prospective SPC had been applied it would have met quality control specifications. SPC can be used to assess the inherent variability of our electron beam monitoring system. It can also indicate whether a process is capable of maintaining electron parameters under control with respect to established specifications by using a daily checking device, but this is not practical unless a method to establish direct feedback from the device to the linac can be devised. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  5. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, as process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and the bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: a training set of 25 conforming coils and a second set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical; in most applications, the dependent variable is binary. The results show that logistic generalized linear models do provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
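
    A minimal sketch of a logistic model of coil conformity from process variables of the kind listed above (strip velocity, bath temperatures, bath level); the data are simulated, not the 48 coils of the study, and scikit-learn is assumed purely for convenience.

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Hypothetical process measurements: strip velocity, four bath
    # temperatures and bath level, with a binary conforming label
    rng = np.random.default_rng(7)
    n = 48
    X = np.column_stack([
        rng.normal(120, 10, n),          # strip velocity (m/min)
        rng.normal(460, 5, (n, 4)),      # four bath temperatures (deg C)
        rng.normal(50, 3, n),            # bath level (%)
    ])
    # The label depends (noisily) on velocity and the first bath temperature
    score = 0.08 * (X[:, 0] - 120) - 0.3 * (X[:, 1] - 460) + rng.normal(0, 1, n)
    y = (score > 0).astype(int)          # 1 = conforming coil

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print("cross-validated accuracy: %.2f" % cross_val_score(model, X, y, cv=5).mean())
    print("coefficients:", np.round(model.coef_.ravel(), 3))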

  6. Importance of good manufacturing practices in microbiological monitoring in processing human tissues for transplant.

    PubMed

    Pianigiani, Elisa; Ierardi, Francesca; Fimiani, Michele

    2013-12-01

    Skin allografts represent an important therapeutic resource in the treatment of severe skin loss. The risk associated with application of processed tissues in humans is very low, however, human material always carries the risk of disease transmission. To minimise the risk of contamination of grafts, processing is carried out in clean rooms where air quality is monitored. Procedures and quality control tests are performed to standardise the production process and to guarantee the final product for human use. Since we only validate and distribute aseptic tissues, we conducted a study to determine what type of quality controls for skin processing are the most suitable for detecting processing errors and intercurrent contamination, and for faithfully mapping the process without unduly increasing production costs. Two different methods for quality control were statistically compared using the Fisher exact test. On the basis of the current study we selected our quality control procedure based on pre- and post-processing tissue controls, operator and environmental controls. Evaluation of the predictability of our control methods showed that tissue control was the most reliable method of revealing microbial contamination of grafts. We obtained 100 % sensitivity by doubling tissue controls, while maintaining high specificity (77 %).

  7. A quality improvement project to improve the Medicare and Medicaid Services (CMS) sepsis bundle compliance rate in a large healthcare system.

    PubMed

    Raschke, Robert A; Groves, Robert H; Khurana, Hargobind S; Nikhanj, Nidhi; Utter, Ethel; Hartling, Didi; Stoffer, Brenda; Nunn, Kristina; Tryon, Shona; Bruner, Michelle; Calleja, Maria; Curry, Steven C

    2017-01-01

    Sepsis is a leading cause of mortality and morbidity in hospitalised patients. The Centers for Medicare and Medicaid Services (CMS) mandated that US hospitals report the sepsis bundle compliance rate as a quality process measure in October 2015. The specific aim of our study was to improve the CMS sepsis bundle compliance rate from 30% to 40% across 20 acute care hospitals in our healthcare system within 1 year. The study included all adult inpatients with sepsis sampled according to CMS specifications from October 2015 to September 2016. The CMS sepsis bundle compliance rate was tracked monthly using statistical process control charting, and a baseline rate of 28.5% with 99% control limits was established. We implemented multiple interventions, including computerised decision support systems (CDSSs), to increase compliance with the most commonly missing bundle elements. Compliance reached 42% (99% statistical process control limits 18.4%-38.6%) as CDSSs were implemented system-wide, but this improvement was not sustained after CMS changed the specifications of the outcome measure. The difficulties encountered elucidate shortcomings of our study methodology and of the CMS sepsis bundle compliance rate as a quality process measure.
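
    For context, 99% control limits for a monthly compliance proportion follow the usual p-chart formula p-bar +/- 2.576 * sqrt(p-bar * (1 - p-bar) / n). The baseline rate below is taken from the abstract; the monthly sample size is an assumption:

      import math

      p_bar = 0.285   # baseline compliance rate reported in the abstract
      n = 120         # assumed number of sampled sepsis cases per month
      z = 2.576       # two-sided 99% normal quantile

      half_width = z * math.sqrt(p_bar * (1 - p_bar) / n)
      lcl = max(0.0, p_bar - half_width)
      ucl = min(1.0, p_bar + half_width)
      print(f"LCL = {lcl:.3f}, UCL = {ucl:.3f}")  # points outside suggest special-cause variation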

  8. 25 CFR 542.42 - What are the minimum internal control standards for internal audit for Tier C gaming operations?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... reconciliation process; (ii) Pull tabs, including but not limited to, statistical records, winner verification... 25 Indians 2 2010-04-01 2010-04-01 false What are the minimum internal control standards for... COMMISSION, DEPARTMENT OF THE INTERIOR HUMAN SERVICES MINIMUM INTERNAL CONTROL STANDARDS § 542.42 What are...

  9. Faculty Return to Industry. Final Report.

    ERIC Educational Resources Information Center

    Hulse, David R.

    Lagging productivity and increased competition from abroad are forcing U.S. companies to adopt new management styles that promote efficiency and quality in production. High on the agenda is the incorporation of Statistical Process Control (SPC). SPC requires definition of process and product specifications to facilitate their…

  10. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products in order to maintain critical quality attributes and build quality into the product. PAT can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and to improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT instruments. Chemometrics is a chemical discipline that incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods are discussed, along with their advantages and working principles. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is represented diagrammatically.
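
    Among chemometric methods, partial least squares (PLS) regression is a common choice for relating PAT spectra to a quality attribute. A brief sketch under stated assumptions (simulated spectra and a synthetic analyte signal; no data accompany this record):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      n_samples, n_wavelengths = 60, 200
      spectra = rng.normal(size=(n_samples, n_wavelengths))   # stand-in NIR spectra
      # Assumed attribute: driven by one spectral region plus noise.
      attribute = 2.0 * spectra[:, 50] + rng.normal(0, 0.1, n_samples)

      pls = PLSRegression(n_components=3)
      r2 = cross_val_score(pls, spectra, attribute, cv=5, scoring="r2").mean()
      print(f"cross-validated R^2 = {r2:.2f}")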

  11. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    PubMed

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In multivariate statistical process control (MSPC), when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying the process variables that were affected by, or might be the cause of, the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This may produce false readings purely because of differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under normal operating conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide CLs for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build CLs for all contribution plots in online PCA-based MSPC. The new strategy for estimating CLs is compared to previously reported CLs for contribution plots, and an industrial batch process dataset is used to illustrate the concepts. Copyright © 2016 Elsevier B.V. All rights reserved.
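
    A hedged sketch of the general idea, not the authors' exact algorithm: resample the normal-operating-conditions data with replacement, recompute variable contributions on each resample, and take percentile limits. The contribution measure below (squared projection onto the first principal component) is a simplification chosen for brevity:

      import numpy as np

      rng = np.random.default_rng(3)
      noc = rng.normal(size=(100, 6))   # assumed normal-operating-conditions data

      def contributions(data):
          """Per-variable squared-projection contribution to the first PC."""
          x = data - data.mean(axis=0)
          _, _, vt = np.linalg.svd(x, full_matrices=False)
          scores = x @ vt[0]
          return (np.outer(scores, vt[0]) ** 2).mean(axis=0)

      boot = np.array([
          contributions(noc[rng.integers(0, len(noc), len(noc))])
          for _ in range(1000)
      ])
      upper_cl = np.percentile(boot, 95, axis=0)   # per-variable upper limit
      print(np.round(upper_cl, 3))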

  12. Multifractal Properties of Process Control Variables

    NASA Astrophysics Data System (ADS)

    Domański, Paweł D.

    2017-06-01

    A control system is an inevitable element of any industrial installation, and its quality significantly affects overall process performance. Assessing whether a control system needs improvement requires relevant and constructive measures. Various methods exist, such as time-domain measures, minimum-variance indexes, Gaussian and non-Gaussian statistical factors, and fractal and entropy indexes. The majority of approaches use time series of control variables and are able to cover many phenomena, but process complexities and human interventions cause effects that are hardly visible to standard measures. It is shown that signals originating from industrial installations have multifractal properties, and that such an analysis may extend the standard approach with further observations. The work is based on industrial and simulation data. The analysis delivers additional insight into the properties of the control system and the process, helping to discover internal dependencies and human factors that are otherwise hardly detectable.

  13. A Simple Graphical Method for Quantification of Disaster Management Surge Capacity Using Computer Simulation and Process-control Tools.

    PubMed

    Franc, Jeffrey Michael; Ingrassia, Pier Luigi; Verde, Manuela; Colombo, Davide; Della Corte, Francesco

    2015-02-01

    Surge capacity, or the ability to manage an extraordinary volume of patients, is fundamental for hospital management of mass-casualty incidents. However, quantification of surge capacity is difficult and no universal standard for its measurement has emerged, nor has a standardized statistical method been advocated. As mass-casualty incidents are rare, simulation may represent a viable alternative to measure surge capacity. Hypothesis/Problem: The objective of the current study was to develop a statistical method for the quantification of surge capacity using a combination of computer simulation and simple process-control statistical tools. Length-of-stay (LOS) and patient volume (PV) were used as metrics. The use of this method was then demonstrated on a subsequent computer simulation of an emergency department (ED) response to a mass-casualty incident. In the derivation phase, 357 participants in five countries performed 62 computer simulations of an ED response to a mass-casualty incident. Benchmarks for ED response were derived from these simulations, including LOS and PV metrics for triage, bed assignment, physician assessment, and disposition. In the application phase, 13 students of the European Master in Disaster Medicine (EMDM) program completed the same simulation scenario, and the results were compared to the standards obtained in the derivation phase. Patient-volume metrics included number of patients to be triaged, assigned to rooms, assessed by a physician, and disposed. Length-of-stay metrics included median time to triage, room assignment, physician assessment, and disposition. Simple graphical methods were used to compare the application phase group to the derived benchmarks using process-control statistical tools. The group in the application phase failed to meet the indicated standard for LOS from admission to disposition decision. This study demonstrates how simulation software can be used to derive values for objective benchmarks of ED surge capacity using PV and LOS metrics. These objective metrics can then be applied to other simulation groups using simple graphical process-control tools to provide a numeric measure of surge capacity. Repeated use in simulations of actual EDs may represent a potential means of objectively quantifying disaster management surge capacity. It is hoped that the described statistical method, which is simple and reusable, will be useful for investigators in this field to apply to their own research.

  14. Development of a Comprehensive Digital Avionics Curriculum for the Aeronautical Engineer

    DTIC Science & Technology

    2006-03-01

    able to analyze and design aircraft and missile guidance and control systems, including feedback stabilization schemes and stochastic processes, using … Uncertainty modeling for robust control; robust closed-loop stability and performance; robust H-infinity control; robustness check using mu-analysis … Controlled feedback (reduces noise); 3. Statistical group response (reduces pressure toward conformity). When used as a tool to study a complex problem

  15. Monitoring the healing process of rat bones using Raman spectroscopy

    NASA Astrophysics Data System (ADS)

    Gamulin, O.; Serec, K.; Bilić, V.; Balarin, M.; Kosović, M.; Drmić, D.; Brčić, L.; Seiwerth, S.; Sikirić, P.

    2013-07-01

    The healing effect of BPC 157 on rat femoral head osteonecrosis was monitored by Raman spectroscopy. Three groups of rats were defined: an injured group treated with BPC 157 (10 μg/kg/day ip), an injured control group (treated with saline, 5 ml/kg/day ip), and an uninjured healthy group. The spectra were recorded and the healing effect assessed on samples harvested from animals sacrificed 3 and 6 weeks after injury. Statistical analysis of the recorded spectra showed statistically significant differences between the BPC 157-treated, control, and healthy groups of animals. In particular, after 6 weeks the spectral resemblance between the healthy and BPC 157-treated samples indicated a positive influence of BPC 157 on the healing process of the rat femoral head.

  16. The psychophysiology of real-time financial risk processing.

    PubMed

    Lo, Andrew W; Repin, Dmitry V

    2002-04-01

    A longstanding controversy in economics and finance is whether financial markets are governed by rational forces or by emotional responses. We study the importance of emotion in the decision-making process of professional securities traders by measuring their physiological characteristics (e.g., skin conductance, blood volume pulse, etc.) during live trading sessions while simultaneously capturing real-time prices from which market events can be detected. In a sample of 10 traders, we find statistically significant differences in mean electrodermal responses during transient market events relative to no-event control periods, and statistically significant mean changes in cardiovascular variables during periods of heightened market volatility relative to normal-volatility control periods. We also observe significant differences in these physiological responses across the 10 traders that may be systematically related to the traders' levels of experience.

  17. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work focuses on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas; here, a purely statistical approach to filtering accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process, well approximated by a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting, electronic noise (from the γ-ray detection and discrimination units), and velocity-system imperfections characterized by velocity nonlinearities. A noise-reducing process using a newly designed statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of a given Mössbauer spectrum. The filter procedure is based on a periodogram method that identifies the statistically important components in the spectral domain; the significance level for these components is then feedback-controlled using the results of a correlation coefficient test. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated, and the test compares the theoretical and experimental correlation coefficients using the Spearman method. The correctness of this solution was verified by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effectiveness of the filter procedure depends on the signal-to-noise ratio, and the applicability of the method is subject to binding conditions.
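
    A minimal sketch of periodogram-based filtering in the same spirit (the significance threshold and the synthetic spectrum are assumptions; the paper's feedback-controlled correlation test is not reproduced):

      import numpy as np

      rng = np.random.default_rng(4)
      n = 512
      channel = np.arange(n)
      # Synthetic "spectrum": a Lorentzian absorption dip plus counting noise.
      clean = 1e4 * (1 - 0.1 / (1 + ((channel - 256) / 8.0) ** 2))
      noisy = rng.poisson(clean).astype(float)

      # Periodogram: power of each Fourier component of the mean-subtracted signal.
      coeffs = np.fft.rfft(noisy - noisy.mean())
      power = np.abs(coeffs) ** 2

      # Keep only components well above the median noise power (assumed threshold).
      threshold = 10 * np.median(power)
      coeffs[power < threshold] = 0.0
      filtered = np.fft.irfft(coeffs, n) + noisy.mean()

      print("rms residual before/after:",
            np.std(noisy - clean).round(1), np.std(filtered - clean).round(1))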

  18. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hours in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. © 2010 National Association for Healthcare Quality.
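
    An individuals/moving-range (XmR) chart is the usual tool for data like per-case wait times; a minimal sketch with invented values (the abstract reports no raw data):

      import numpy as np

      # Hypothetical surgeon wait times between consecutive cases (minutes).
      waits = np.array([41, 66, 53, 95, 38, 47, 120, 55, 49, 61, 44, 58], float)

      moving_range = np.abs(np.diff(waits))
      x_bar, mr_bar = waits.mean(), moving_range.mean()

      # Standard XmR constants: 2.66 = 3/1.128 (d2 for n=2), D4 = 3.267.
      ucl_x = x_bar + 2.66 * mr_bar
      lcl_x = max(0.0, x_bar - 2.66 * mr_bar)
      ucl_mr = 3.267 * mr_bar

      print(f"X chart:  mean={x_bar:.1f}, LCL={lcl_x:.1f}, UCL={ucl_x:.1f}")
      print(f"mR chart: mean={mr_bar:.1f}, UCL={ucl_mr:.1f}")
      print("out-of-control points:", waits[(waits > ucl_x) | (waits < lcl_x)])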

  19. A New Performance Improvement Model: Adding Benchmarking to the Analysis of Performance Indicator Data.

    PubMed

    Al-Kuwaiti, Ahmed; Homa, Karen; Maruthamuthu, Thennarasu

    2016-01-01

    A performance improvement model was developed that focuses on the analysis and interpretation of performance indicator (PI) data using statistical process control and benchmarking. PIs are suitable for comparison with benchmarks only if the data fall within the statistically accepted limits; that is, they show only random variation. Specifically, if there is no significant special-cause variation over a period of time, then the data are ready to be benchmarked. The proposed Define, Measure, Control, Internal Threshold, and Benchmark model is adapted from the Define, Measure, Analyze, Improve, Control (DMAIC) model. The model consists of the following five steps: Step 1. Define the process; Step 2. Monitor and measure the variation over the period of time; Step 3. Check the variation of the process; if stable (no significant variation), go to Step 4; otherwise, control variation with the help of an action plan; Step 4. Develop an internal threshold and compare the process with it; Step 5.1. Compare the process with an internal benchmark; and Step 5.2. Compare the process with an external benchmark. The steps are illustrated through the use of health care-associated infection (HAI) data collected for 2013 and 2014 from the Infection Control Unit, King Fahd Hospital, University of Dammam, Saudi Arabia. Monitoring variation is an important strategy in understanding and learning about a process. In the example, HAI was monitored for variation in 2013, and the need to have a more predictable process prompted the need to control variation by an action plan. The action plan was successful, as noted by the shift in the 2014 data compared to the historical average; in addition, the variation was reduced. The model is subject to limitations: for example, it cannot be used without benchmarks, which need to be calculated the same way with similar patient populations, and it focuses only on the "Analyze" part of the DMAIC model.

  20. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
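
    A hedged sketch of the core analysis under stated assumptions (simulated counts; proportionality judged by a zero-intercept fit and per-level coefficients of variation, which is one reasonable reading of the approach, not the authors' exact statistic):

      import numpy as np

      rng = np.random.default_rng(5)
      dilution_fractions = np.repeat([1.0, 0.75, 0.5, 0.25], 4)   # 4 replicates per level
      true_stock = 1.0e6                                          # assumed cells/mL
      counts = rng.poisson(true_stock * dilution_fractions
                           * rng.normal(1, 0.03, dilution_fractions.size))

      # Proportional model: count = slope * dilution fraction (zero intercept).
      slope = (dilution_fractions @ counts) / (dilution_fractions @ dilution_fractions)
      residual_ratio = counts / (slope * dilution_fractions)

      print(f"estimated stock concentration: {slope:.3e}")
      for f in np.unique(dilution_fractions):
          level = counts[dilution_fractions == f]
          print(f"fraction {f:4.2f}: CV = {100 * level.std(ddof=1) / level.mean():.1f}%")
      print(f"max deviation from proportionality: {100 * np.max(np.abs(residual_ratio - 1)):.1f}%")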

  1. Automated system for the on-line monitoring of powder blending processes using near-infrared spectroscopy. Part I. System development and control.

    PubMed

    Hailey, P A; Doherty, P; Tapsell, P; Oliver, T; Aldridge, P K

    1996-03-01

    An automated system for the on-line monitoring of powder blending processes is described. The system employs near-infrared (NIR) spectroscopy using fibre-optics and a graphical user interface (GUI) developed in the LabVIEW environment. The complete supervisory control and data analysis (SCADA) software controls blender and spectrophotometer operation and performs statistical spectral data analysis in real time. A data analysis routine using standard deviation is described to demonstrate an approach to the real-time determination of blend homogeneity.
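
    The standard-deviation routine mentioned here is commonly implemented as the spectral standard deviation across a moving block of successive scans, which falls toward an asymptote as the blend homogenizes. A sketch with simulated spectra (all names, values, and the homogeneity criterion are assumptions):

      import numpy as np

      rng = np.random.default_rng(6)
      n_scans, n_wavelengths = 40, 100
      # Simulated NIR scans: spectral variability decays as mixing proceeds.
      scans = np.array([
          rng.normal(0, np.exp(-t / 10.0), n_wavelengths) + 1.0
          for t in range(n_scans)
      ])

      def blend_sd(scans, block=5):
          """Mean wavelength-wise SD over a moving block of consecutive scans."""
          return np.array([
              scans[i:i + block].std(axis=0, ddof=1).mean()
              for i in range(len(scans) - block + 1)
          ])

      sd_profile = blend_sd(scans)
      homogeneous_at = np.argmax(sd_profile < 0.05)   # first block below assumed criterion
      print(f"blend deemed homogeneous at scan {homogeneous_at + 4}")  # last scan in that block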

  2. Task-Specific and General Cognitive Effects in Chiari Malformation Type I

    PubMed Central

    Allen, Philip A.; Houston, James R.; Pollock, Joshua W.; Buzzelli, Christopher; Li, Xuan; Harrington, A. Katherine; Martin, Bryn A.; Loth, Francis; Lien, Mei-Ching; Maleki, Jahangir; Luciano, Mark G.

    2014-01-01

    Objective: Our objective was to use episodic memory and executive function tests to determine whether or not Chiari Malformation Type I (CM) patients experience cognitive dysfunction. Background: CM is a neurological syndrome in which the cerebellum descends into the cervical spine, causing neural compression, severe headaches, neck pain, and a number of other physical symptoms. While CM is primarily a disorder of the cervico-medullary junction, both clinicians and researchers have suspected deficits in higher-level cognitive function. Design and Methods: We tested 24 CM patients who had undergone decompression neurosurgery and 24 age- and education-matched controls on measures of immediate and delayed episodic memory, as well as three measures of executive function. Results: The CM group showed performance decrements relative to the controls in response inhibition (Stroop interference), working memory computational speed (Ospan), and processing speed (automated digit symbol substitution task), but group differences in recall did not reach statistical significance. After statistical control for depression and anxiety scores, the group effects for working memory and processing speed were eliminated, but not for response inhibition. This response inhibition difference was not due to overall general slowing in the CM group, either, because the interaction with group remained statistically significant when the controls' data were transformed using the linear function fit to all of the reaction time tasks. Furthermore, there was a multivariate group effect for all of the response time measures and immediate and delayed recall after statistical control of depression and anxiety scores. Conclusion: These results suggest that CM patients with decompression surgery exhibit cognitive dysfunction compared to age- and education-matched controls. While some of these results may be related to anxiety and depression (likely proxies for chronic pain), the response inhibition effect in particular, as well as a general cognitive deficit, persisted even after control for anxiety and depression. PMID:24736676

  3. Asymptotic inference in system identification for the atom maser.

    PubMed

    Catana, Catalin; van Horssen, Merlijn; Guta, Madalin

    2012-11-28

    System identification is closely related to control theory and plays an increasing role in quantum engineering. In the quantum set-up, system identification is usually equated to process tomography, i.e. estimating a channel by probing it repeatedly with different input states. However, for quantum dynamical systems such as quantum Markov processes, it is more natural to consider the estimation based on continuous measurements of the output, with a given input that may be stationary. We address this problem using asymptotic statistics tools, for the specific example of estimating the Rabi frequency of an atom maser. We compute the Fisher information of different measurement processes as well as the quantum Fisher information of the atom maser, and establish the local asymptotic normality of these statistical models. The statistical notions can be expressed in terms of spectral properties of certain deformed Markov generators, and the connection to large deviations is briefly discussed.

  4. Nuclear Security: Quantifying Late Detection in MC&A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Surinder Paul; Gibbs, Philip W.; Bultz, Garl A.

    2014-03-01

    The objectives of this presentation are to explain the concept of late detection; review the statistics used in material control and accountability (MC&A); specify the probability of detection (Pd) and the timelines for Pd for MC&A elements for item inventories; review elements of process control as they relate to bulk processes; and specify the timelines and detection thresholds for Pd for MC&A elements for bulk or processing operations.

  5. Attributing Meanings to Representations of Data: The Case of Statistical Process Control

    ERIC Educational Resources Information Center

    Hoyles, Celia; Bakker, Arthur; Kent, Phillip; Noss, Richard

    2007-01-01

    This article is concerned with the meanings that employees in industry attribute to representations of data and the contingencies of these meanings in context. Our primary concern is to more precisely characterize how the context of the industrial process is constitutive of the meaning of graphs of data derived from this process. We draw on data…

  6. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during the diagnosis years 2001 and 2002, to exceed the lower control limit (LCL) in 2003, and to exceed the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded the UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart to cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data.

  7. Feasibility Study on the Use of On-line Multivariate Statistical Process Control for Safeguards Applications in Natural Uranium Conversion Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ladd-Lively, Jennifer L

    2014-01-01

    The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix, where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP; however, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case (normal operating conditions) of the PCA model was established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall, this study was successful in meeting the stated objective.
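
    A compact sketch of PCA-based fault detection of the kind described, using Hotelling's T^2 and the squared prediction error (SPE). The simulated plant data and the empirical 99% limits are assumptions, not the study's detection thresholds:

      import numpy as np

      rng = np.random.default_rng(7)
      normal_ops = rng.normal(size=(500, 8))   # assumed in-control process data

      # Fit PCA on normal operation via SVD of the scaled data.
      mu, sd = normal_ops.mean(0), normal_ops.std(0)
      z = (normal_ops - mu) / sd
      u, s, vt = np.linalg.svd(z, full_matrices=False)
      k = 3                                    # retained components (assumed)
      P, lam = vt[:k].T, (s[:k] ** 2) / (len(z) - 1)

      def t2_spe(x):
          """Hotelling T^2 and squared prediction error for new observations."""
          zx = (x - mu) / sd
          t = zx @ P
          t2 = np.sum(t ** 2 / lam, axis=1)
          spe = np.sum((zx - t @ P.T) ** 2, axis=1)
          return t2, spe

      t2_lim, spe_lim = (np.percentile(v, 99) for v in t2_spe(normal_ops))
      x_new = rng.normal(size=(10, 8))
      x_new[-1, 2] += 6.0                      # injected fault in one variable
      t2, spe = t2_spe(x_new)
      print("fault flagged:", (t2 > t2_lim) | (spe > spe_lim))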

  8. The effect of upper gastrointestinal system endoscopy process on serum oxidative stress levels.

    PubMed

    Turan, Mehmet Nuri; Aslan, Mehmet; Bolukbas, Filiz Fusun; Bolukbas, Cengiz; Selek, Sahbettin; Sabuncu, Tevfik

    2016-12-01

    Some authors have investigated the effects of oxidative stress in procedures such as laparoscopy; however, the effect of upper gastrointestinal system endoscopy on oxidative stress is unclear. We evaluated the short-term effect of upper gastrointestinal system endoscopy on oxidative stress. Thirty patients who underwent endoscopy and 20 healthy controls were enrolled in this prospective study. Serum total antioxidant capacity (TAC) and total oxidant status (TOS) were measured before and after endoscopy. The percentage ratio of TOS to TAC was regarded as the oxidative stress index. Before endoscopy, serum TAC levels were higher, while serum TOS levels and oxidative stress index values were lower, in patients than in controls, but these differences were not statistically significant (all p > 0.05). After endoscopy, serum TAC and TOS levels were significantly higher in patients than before endoscopy (both p < 0.05); however, oxidative stress index values were only slightly higher, and this difference was not statistically significant (p > 0.05). We observed that serum TAC and TOS levels increased after endoscopy, but that short-duration upper gastrointestinal system endoscopy did not cause an important change in the oxidative stress index. Further studies enrolling a larger number of patients are required to clarify these results.

  9. Cumulative sum control charts for monitoring geometrically inflated Poisson processes: An application to infectious disease counts data.

    PubMed

    Rakitzis, Athanasios C; Castagliola, Philippe; Maravelakis, Petros E

    2018-02-01

    In this work, we study upper-sided cumulative sum control charts that are suitable for monitoring geometrically inflated Poisson processes. We assume that a process is properly described by a two-parameter extension of the zero-inflated Poisson distribution, which can be used for modeling count data with an excessive number of zero and non-zero values. Two different upper-sided cumulative sum-type schemes are considered, both suitable for the detection of increasing shifts in the average of the process. Aspects of their statistical design are discussed and their performance is compared under various out-of-control situations. Changes in both parameters of the process are considered. Finally, the monitoring of the monthly cases of poliomyelitis in the USA is given as an illustrative example.
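
    For orientation, the generic upper-sided CUSUM recursion for counts is C_t = max(0, C_{t-1} + X_t - k), signalling when C_t >= h. The sketch below uses plain Poisson counts and invented reference and decision values, not the zero-inflated model or design parameters of the paper:

      import numpy as np

      def upper_cusum(x, k, h):
          """Upper-sided CUSUM; returns the statistic path and first signal index."""
          c, path = 0.0, []
          for xi in x:
              c = max(0.0, c + xi - k)
              path.append(c)
          signals = [i for i, v in enumerate(path) if v >= h]
          return np.array(path), (signals[0] if signals else None)

      rng = np.random.default_rng(8)
      counts = np.concatenate([rng.poisson(2.0, 30),    # in-control monthly counts
                               rng.poisson(4.0, 10)])   # upward shift in the mean
      path, first_signal = upper_cusum(counts, k=3.0, h=5.0)
      print("first signal at observation:", first_signal)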

  10. Poster - Thur Eve - 29: Detecting changes in IMRT QA using statistical process control.

    PubMed

    Drever, L; Salomons, G

    2012-07-01

    Statistical process control (SPC) methods were used to analyze 239 measurement-based individual IMRT QA events. The selected IMRT QA events were all head and neck (H&N) cases with 70 Gy in 35 fractions, and all prostate cases with 76 Gy in 38 fractions, planned between March 2009 and 2012. The results were used to determine whether the tolerance limits currently being used for IMRT QA were able to indicate if the process was under control. The SPC calculations were repeated for IMRT QA of the same types of cases planned after the treatment planning system was upgraded from Eclipse version 8.1.18 to version 10.0.39. The initial tolerance limits were found to be acceptable for two of the three metrics tested prior to the upgrade. After the upgrade to the treatment planning system, the SPC analysis found that the a priori limits were no longer capable of indicating control for 2 of the 3 metrics analyzed. The changes in the IMRT QA results were clearly identified using SPC, indicating that it is a useful tool for finding changes in the IMRT QA process. Routine application of SPC to IMRT QA results would help to distinguish unintentional trends and changes from the random variation in the IMRT QA results for individual plans. © 2012 American Association of Physicists in Medicine.

  11. The protocol and design of a randomised controlled study on training of attention within the first year after acquired brain injury.

    PubMed

    Bartfai, Aniko; Markovic, Gabriela; Sargenius Landahl, Kristina; Schult, Marie-Louise

    2014-05-08

    To describe the design of a study examining intensive targeted cognitive rehabilitation of attention in the acute (<4 months) and subacute (4-12 months) rehabilitation phases after acquired brain injury, and to evaluate the effects on function, activity and participation (return to work). Within a prospective, randomised, controlled study, 120 consecutive patients with stroke or traumatic brain injury were randomised to 20 hours of intensive attention training by Attention Process Training or by standard, activity-based training. Progress was evaluated by Statistical Process Control and by pre- and post-measurement of functional and activity levels. Return to work was also evaluated in the post-acute phase. Primary endpoints were changes in the attention measure, the Paced Auditory Serial Addition Test, and changes in work ability. Secondary endpoints included measurement of cognitive functions, activity and return to work. There were 3-, 6- and 12-month follow-ups focusing on health economics. The study will provide information on rehabilitation of attention in the early phases after acquired brain injury, including effects on function, activity and return to work. Further, the application of Statistical Process Control might enable closer investigation of the cognitive changes after acquired brain injury and demonstrate the usefulness of process measures in rehabilitation. The study was registered at ClinicalTrials.gov (NCT02091453) on 19 March 2014.

  12. Control Strategies for Drug Product Continuous Direct Compression-State of Control, Product Collection Strategies, and Startup/Shutdown Operations for the Production of Clinical Trial Materials and Commercial Products.

    PubMed

    Almaya, Ahmad; De Belder, Lawrence; Meyer, Robert; Nagapudi, Karthik; Lin, Hung-Ren Homer; Leavesley, Ian; Jayanth, Jayanthy; Bajwa, Gurjit; DiNunzio, James; Tantuccio, Anthony; Blackwood, Dan; Abebe, Admassu

    2017-04-01

    Continuous manufacturing (CM) has emerged in the pharmaceutical industry as a paradigm shift with significant advantages related to cost, efficiency, flexibility, and higher assurance of quality. The inherent differences from batch processes justify examining the CM control strategy more holistically. This article describes the current thinking for the control and implementation of CM, using the example of a direct compression process and taking into consideration the ICH Q10 definition of "state of control" and process validation requirements. Statistical process control using control charts, sources of variation, process capability, and process performance is explained as a useful concept that can help assess the impact of variation within a batch and indicate whether a process is in a state of control. The potentially time-variant nature of startup and shutdown in CM is discussed, with the aim of assuring product quality while minimizing waste, along with different options for detecting and isolating non-conforming material produced during process upsets. While different levels of control are possible with CM, an appropriate balance between process control and end-product testing is needed, depending on the level of process understanding at the different stages of development from the production of clinical supplies through commercialization. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
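
    As an illustration of the kind of chart that suits a continuous stream of in-process data, an exponentially weighted moving average (EWMA) chart responds quickly to small sustained drifts. The smoothing constant, limits, and simulated tablet-weight stream below are all assumptions, not values from the article:

      import numpy as np

      def ewma_signals(x, target, sigma, lam=0.2, L=3.0):
          """EWMA statistic with steady-state limits; returns indices of signals."""
          z, signals = target, []
          half_width = L * sigma * np.sqrt(lam / (2 - lam))
          for i, xi in enumerate(x):
              z = lam * xi + (1 - lam) * z
              if abs(z - target) > half_width:
                  signals.append(i)
          return signals

      rng = np.random.default_rng(9)
      weights = np.concatenate([rng.normal(100.0, 0.5, 200),   # in-control tablet weights
                                rng.normal(100.8, 0.5, 50)])   # sustained upward drift
      print("EWMA signals at samples:", ewma_signals(weights, target=100.0, sigma=0.5)[:3])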

  13. Automation of experimental research of waveguide paths induction soldering

    NASA Astrophysics Data System (ADS)

    Tynchenko, V. S.; Petrenko, V. E.; Kukartsev, V. V.; Tynchenko, V. V.; Antamoshkin, O. A.

    2018-05-01

    The article presents an automated system for experimental studies of the waveguide path induction soldering process. The system is an add-on software module for a complex that automates control of the technological process of induction soldering of thin-walled aluminum-alloy waveguide paths, extending the complex's capabilities. The structure of the software product, the general appearance of the controls, and the potential applications are presented. The utility of the developed application was demonstrated through a series of field experiments. The experimental research system makes it possible to improve the process under consideration by enabling fine-tuning of the control regulators and by keeping statistics of the soldering process in a form convenient for analysis.

  14. Surface defect detection in tiling Industries using digital image processing methods: analysis and evaluation.

    PubMed

    Karimi, Mohammad H; Asemani, Davud

    2014-05-01

    Ceramic and tile industries must include a grading stage to quantify the quality of products. In practice, human inspection is often used for grading, but an automatic grading system is essential to enhance the quality control and marketing of the products. Since there generally exist six different types of defects, with distinct textures and morphologies, originating from various stages of tile manufacturing lines, many image processing techniques have been proposed for defect detection. In this paper, a survey is made of the pattern recognition and image processing algorithms that have been used to detect surface defects. Each method appears to be limited to detecting some subgroup of defects. The detection techniques may be divided into three main groups: statistical pattern recognition, feature vector extraction, and texture/image classification. Methods such as the wavelet transform, filtering, morphology, and the contourlet transform are more effective for pre-processing tasks; others, including statistical methods, neural networks, and model-based algorithms, can be applied to extract the surface defects. Although statistical methods are often appropriate for identifying large defects such as spots, techniques such as wavelet processing provide an acceptable response for detecting small defects such as pinholes. A thorough survey is made in this paper of the existing algorithms in each subgroup, and the evaluation parameters are discussed, including supervised and unsupervised parameters. Using various performance parameters, different defect detection algorithms are compared and evaluated. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.

  15. High-throughput manufacturing of size-tuned liposomes by a new microfluidics method using enhanced statistical tools for characterization.

    PubMed

    Kastner, Elisabeth; Kaur, Randip; Lowry, Deborah; Moghaddam, Behfar; Wilkinson, Alexander; Perrie, Yvonne

    2014-12-30

    Microfluidics has recently emerged as a new method of manufacturing liposomes, allowing reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters of a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used to increase process understanding and to develop predictive and correlative models. A high FRR led to bottom-up synthesis of liposomes, with a strong correlation with vesicle size, demonstrating the ability to control liposome size in-process; liposomes of 50 nm were reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and captured by predictive modeling, which identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity, and transfection efficiency. This study demonstrates microfluidics as a robust, high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics. Copyright © 2014 The Authors. Published by Elsevier B.V. All rights reserved.

  16. Distinguishing synchronous and time-varying synergies using point process interval statistics: motor primitives in frog and rat

    PubMed Central

    Hart, Corey B.; Giszter, Simon F.

    2013-01-01

    We present and apply a method that uses point process statistics to discriminate the forms of synergies in motor pattern data, prior to explicit synergy extraction. The method uses electromyogram (EMG) pulse peak timing or onset timing; peak timing is preferable in complex patterns where pulse onsets may be overlapping. An interval statistic derived from the point processes of EMG peak timings distinguishes time-varying synergies from synchronous synergies (SS). Model data show that the statistic is robust for most conditions. Its application to both frog hindlimb EMG and rat locomotion hindlimb EMG shows that data from these preparations are clearly most consistent with synchronous synergy models (p < 0.001). Additional direct tests of pulse and interval relations in the frog data further bolster the support for synchronous synergy mechanisms. Our method and analyses support separated control of the rhythm and pattern of motor primitives, with the low-level execution primitives comprising pulsed SS in both frog and rat, in both episodic and rhythmic behaviors. PMID:23675341

  17. Modified SPC for short run test and measurement process in multi-stations

    NASA Astrophysics Data System (ADS)

    Koh, C. K.; Chin, J. F.; Kamaruddin, S.

    2018-03-01

    Due to short production runs and the measurement error inherent in electronic test and measurement (T&M) processes, continuous quality monitoring through real-time statistical process control (SPC) is challenging. Industry practice allows the installation of a guard band using measurement uncertainty to reduce the width of the acceptance limit, as an indirect way to compensate for measurement errors. This paper presents a new SPC model combining a modified guard band and control charts (Z-bar chart and W chart) for short runs in T&M processes in multiple stations. The proposed model standardizes the observed value with the measurement target (T) and the rationed measurement uncertainty (U). An S-factor (Sf) is introduced into the control limits to improve sensitivity in detecting small shifts. The model was embedded in an automated quality control system and verified with a case study in real industry.
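
    A hedged sketch of the short-run standardization idea underlying such charts: each reading is expressed in units of its own target and uncertainty, z = (x - T) / U, so that different stations and part numbers can share one chart. The station values are invented, and the s_factor below only approximates the paper's S-factor tightening:

      import numpy as np

      def standardize(x, target, uncertainty):
          """Short-run transform: express each reading in units of its own T and U."""
          return (np.asarray(x, dtype=float) - target) / uncertainty

      # Hypothetical readings from two stations with different targets/uncertainties.
      station_a = standardize([10.02, 9.97, 10.05], target=10.0, uncertainty=0.05)
      station_b = standardize([5.01, 5.08, 4.95], target=5.0, uncertainty=0.04)
      z = np.concatenate([station_a, station_b])   # one common chart for all stations

      s_factor = 0.8                               # assumed tightening of the limits
      ucl, lcl = 3.0 * s_factor, -3.0 * s_factor
      print("points outside limits:", z[(z > ucl) | (z < lcl)])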

  18. Making It Right the First Time.

    ERIC Educational Resources Information Center

    Wilcox, John

    1987-01-01

    The author discusses how using statistical process control can help manufacturers save money and produce a better product. He covers barriers to its implementation within an organization, focusing on training workers in the methods. (CH)

  19. RCT: Module 2.03, Counting Errors and Statistics, Course 8768

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hillmer, Kurt T.

    2017-04-01

    Radiological sample analysis involves the observation of a random process that may or may not occur, and an estimation of the amount of radioactive material present based on that observation. Across the country, radiological control personnel use activity measurements to make decisions that may affect the health and safety of workers at those facilities and their surrounding environments. This course presents an overview of measurement processes, a statistical evaluation of both measurements and equipment performance, and some actions to take to minimize the sources of error in count room operations. The course prepares the student with the skills necessary for radiological control technician (RCT) qualification by passing quizzes, tests, and the RCT Comprehensive Phase 1, Unit 2 Examination (TEST 27566) and by providing in-the-field skills.

  20. Direct Evidence for a Dual Process Model of Deductive Inference

    ERIC Educational Resources Information Center

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-01-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences…

  1. Training site statistics from Landsat and Seasat satellite imagery registered to a common map base

    NASA Technical Reports Server (NTRS)

    Clark, J.

    1981-01-01

    Landsat and Seasat satellite imagery and training site boundary coordinates were registered to a common Universal Transverse Mercator map base in the Newport Beach area of Orange County, California. The purpose was to establish a spatially-registered, multi-sensor data base which would test the use of Seasat synthetic aperture radar imagery to improve spectral separability of channels used for land use classification of an urban area. Digital image processing techniques originally developed for the digital mosaics of the California Desert and the State of Arizona were adapted to spatially register multispectral and radar data. Techniques included control point selection from imagery and USGS topographic quadrangle maps, control point cataloguing with the Image Based Information System, and spatial and spectral rectifications of the imagery. The radar imagery was pre-processed to reduce its tendency toward uniform data distributions, so that training site statistics for selected Landsat and pre-processed Seasat imagery indicated good spectral separation between channels.

  2. Application of spatial technology in malaria research & control: some new insights.

    PubMed

    Saxena, Rekha; Nagpal, B N; Srivastava, Aruna; Gupta, S K; Dash, A P

    2009-08-01

    The Geographical Information System (GIS) has emerged as the core of spatial technology, integrating a wide range of datasets available from different sources, including Remote Sensing (RS) and the Global Positioning System (GPS). Literature published during the decade 1998-2007 has been compiled and grouped into six categories according to the usage of the technology in malaria epidemiology. Different GIS modules, such as spatial data sources, mapping and geo-processing tools, distance calculation, digital elevation models (DEMs), buffer zones and geo-statistical analysis, have been investigated in detail and illustrated with examples based on the derived results. These GIS tools have contributed immensely to understanding the epidemiological processes of malaria, and the examples drawn show that GIS is now widely used for research and decision making in malaria control. Statistical data analysis is currently the most consistent and established set of tools for analyzing spatial datasets. The desired future development of GIS lies in the utilization of geo-statistical tools which, combined with high-quality data, can provide new insight into malaria epidemiology and the complexity of its transmission potential in endemic areas.

  3. Investigation of improving MEMS-type VOA reliability

    NASA Astrophysics Data System (ADS)

    Hong, Seok K.; Lee, Yeong G.; Park, Moo Y.

    2003-12-01

    MEMS technologies have been applied to many areas, such as optical communications, gyroscopes, and biomedical components. In the optical communication field in particular, MEMS technologies are essential, especially in multi-dimensional optical switches and variable optical attenuators (VOAs). This paper describes the process for the development of MEMS-type VOAs with good optical performance and improved reliability. Generally, MEMS VOAs are fabricated by a silicon micro-machining process, precise fibre alignment, and a sophisticated packaging process. Because such a device is composed of many structures with various materials, it is difficult to make it reliable. We developed MEMS-type VOAs with many failure modes considered (FMEA: Failure Mode and Effects Analysis) in the initial design step, predicted critical failure factors, revised the design accordingly, and confirmed the reliability by preliminary testing. The predicted failure factors were moisture, the bonding strength of the wire between the MEMS chip and the TO-CAN, and instability of the supplied signals. Statistical quality control tools (ANOVA, t-tests, and so on) were used to control these potential failure factors and derive optimum manufacturing conditions. In summary, we successfully developed reliable MEMS-type VOAs with good optical performance by controlling potential failure factors using statistical quality control tools. The developed VOAs passed international reliability standards (Telcordia GR-1221-CORE).

  4. Relating triggering processes in lab experiments with earthquakes.

    NASA Astrophysics Data System (ADS)

    Baro Urbea, J.; Davidsen, J.; Kwiatek, G.; Charalampidou, E. M.; Goebel, T.; Stanchits, S. A.; Vives, E.; Dresen, G.

    2016-12-01

    Statistical relations such as the Gutenberg-Richter law, the Omori-Utsu law and the productivity of aftershocks were first observed in seismology, but are also common to other physical phenomena exhibiting avalanche dynamics, such as solar flares, rock fracture, structural phase transitions and even stock market transactions. All these examples exhibit spatio-temporal correlations that can be explained as triggering processes: instead of being activated as a response to external driving or fluctuations, some events are a consequence of previous activity. Although different plausible explanations have been suggested for each system, the reason for the ubiquity of such statistical laws remains unknown. However, the case of rock fracture may have a physical connection with seismology: it has been suggested that some features of seismology have a microscopic origin and are reproducible over a vast range of scales. This hypothesis has motivated mechanical experiments to generate artificial catalogues of earthquakes, so-called labquakes, at a laboratory scale and under controlled conditions. Microscopic fractures in lab tests release elastic waves that are recorded as ultrasonic (kHz-MHz) acoustic emission (AE) events by means of piezoelectric transducers. Here, we analyse the statistics of labquakes recorded during the failure of small samples of natural rocks and artificial porous materials under different controlled compression regimes. Temporal and spatio-temporal correlations are identified in certain cases. Specifically, we distinguish between background and triggered events, revealing some differences in their statistical properties. We fit the data to statistical models of seismicity; as a particular case, we explore the branching-process approach simplified in the Epidemic Type Aftershock Sequence (ETAS) model, evaluate the empirical spatio-temporal kernel of the model, and investigate the physical origins of triggering. Our analysis of the focal mechanisms implies that the occurrence of the empirical laws extends well beyond purely frictional sliding events, in contrast to what is often assumed.

  5. Statistical modelling predicts almost complete loss of major periglacial processes in Northern Europe by 2100.

    PubMed

    Aalto, Juha; Harrison, Stephan; Luoto, Miska

    2017-09-11

    The periglacial realm is a major part of the cryosphere, covering a quarter of Earth's land surface. Cryogenic land surface processes (LSPs) control landscape development, ecosystem functioning and climate through biogeochemical feedbacks, but their response to contemporary climate change is unclear. Here, by statistically modelling the current and future distributions of four major LSPs unique to periglacial regions at fine scale, we show that fundamental changes in the periglacial climate realm are inevitable with future climate change. Even with the most optimistic CO2 emissions scenario (Representative Concentration Pathway (RCP) 2.6), we predict a 72% reduction in the current periglacial climate realm by 2050 in our climatically sensitive northern Europe study area. These impacts are projected to be especially severe in high-latitude continental interiors. We further predict that by the end of the twenty-first century, active periglacial LSPs will exist only at high elevations. These results forecast a future tipping point in the operation of cold-region LSPs, and predict fundamental landscape-level modifications in ground conditions and related atmospheric feedbacks. In short, the statistical modelling predicts a 72% reduction of the periglacial realm in Northern Europe by 2050, and its almost complete disappearance by 2100.

  6. Control charts for monitoring accumulating adverse event count frequencies from single and multiple blinded trials.

    PubMed

    Gould, A Lawrence

    2016-12-30

    Conventional practice monitors accumulating information about drug safety in terms of the numbers of adverse events reported from trials in a drug development program. Estimates of between-treatment adverse event risk differences can be obtained readily from unblinded trials with adjustment for differences among trials using conventional statistical methods. Recent regulatory guidelines require monitoring the cumulative frequency of adverse event reports to identify possible between-treatment adverse event risk differences without unblinding ongoing trials. Conventional statistical methods for assessing between-treatment adverse event risks cannot be applied when the trials are blinded. However, CUSUM charts can be used to monitor the accumulation of adverse event occurrences. CUSUM charts for monitoring adverse event occurrence in a Bayesian paradigm are based on assumptions about the process generating the adverse event counts in a trial, as expressed by informative prior distributions. This article describes the construction of control charts for monitoring adverse event occurrence based on statistical models for the processes, characterizes their statistical properties, and describes how to construct useful prior distributions. Application of the approach to two adverse events of interest in a real trial gave nearly identical results for binomial and Poisson observed event count likelihoods. Copyright © 2016 John Wiley & Sons, Ltd.
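
    The record's charts are built in a Bayesian paradigm with informative priors, which the abstract does not specify; as a rough sketch of the underlying CUSUM idea only, the following one-sided Poisson CUSUM accumulates evidence that a per-interval event rate has shifted from an acceptable level lam0 to an elevated level lam1 (all numbers hypothetical).

```python
import math

def poisson_cusum(counts, lam0, lam1, h):
    """One-sided CUSUM for per-interval event counts: accumulate the
    Poisson log-likelihood ratio of lam1 vs lam0 and signal when the
    statistic crosses the decision threshold h."""
    s, signals = 0.0, []
    slope = math.log(lam1 / lam0)
    for n, x in enumerate(counts):
        s = max(0.0, s + x * slope - (lam1 - lam0))
        if s >= h:
            signals.append(n)
            s = 0.0  # restart monitoring after a signal
    return signals

# Hypothetical adverse event counts per monitoring interval:
print(poisson_cusum([1, 0, 2, 1, 4, 5, 3], lam0=1.0, lam1=2.5, h=4.0))
```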

  8. Terrestrial photovoltaic cell process testing

    NASA Technical Reports Server (NTRS)

    Burger, D. R.

    1985-01-01

    The paper examines critical test parameters, criteria for selecting appropriate tests, and the use of statistical controls and test patterns to enhance PV-cell process test results. The coverage of critical test parameters is evaluated by examining available test methods and then screening these methods by considering the ability to measure those critical parameters which are most affected by the generic process, the cost of the test equipment and test performance, and the feasibility for process testing.

  9. Terrestrial photovoltaic cell process testing

    NASA Astrophysics Data System (ADS)

    Burger, D. R.

    The paper examines critical test parameters, criteria for selecting appropriate tests, and the use of statistical controls and test patterns to enhance PV-cell process test results. The coverage of critical test parameters is evaluated by examining available test methods and then screening these methods by considering the ability to measure those critical parameters which are most affected by the generic process, the cost of the test equipment and test performance, and the feasibility for process testing.

  10. 1H NMR-based metabolic profiling for evaluating poppy seed rancidity and brewing.

    PubMed

    Jawień, Ewa; Ząbek, Adam; Deja, Stanisław; Łukaszewicz, Marcin; Młynarz, Piotr

    2015-12-01

    Poppy seeds are widely used in household and commercial confectionery. The aim of this study was to demonstrate the application of metabolic profiling for industrial monitoring of the molecular changes which occur during minced poppy seed rancidity and during brewing processes performed on raw seeds. Both forms of poppy seeds were obtained from a confectionery company. Proton nuclear magnetic resonance (1H NMR) was applied as the analytical method of choice, together with multivariate statistical data analysis. Metabolic fingerprinting was applied as a bioprocess control tool to monitor rancidity, with its trajectory of change, and brewing progression. Low molecular weight compounds were found to be statistically significant biomarkers of these bioprocesses. Changes in concentrations of chemical compounds were explained relative to the biochemical processes and external conditions. The obtained results provide valuable and comprehensive information for a better understanding of the biology of rancidity and brewing processes, while demonstrating the potential for applying NMR spectroscopy combined with multivariate data analysis tools for quality control in food industries involved in the processing of oilseeds.

  11. Impact of hospital care on incidence of bloodstream infection: the evaluation of processes and indicators in infection control study.

    PubMed Central

    Kritchevsky, S. B.; Braun, B. I.; Wong, E. S.; Solomon, S. L.; Steele, L.; Richards, C.; Simmons, B. P.

    2001-01-01

    The Evaluation of Processes and Indicators in Infection Control (EPIC) study assesses the relationship between hospital care and rates of central venous catheter-associated primary bacteremia in 54 intensive-care units (ICUs) in the United States and 14 other countries. Using ICU rather than the patient as the primary unit of statistical analysis permits evaluation of factors that vary at the ICU level. The design of EPIC can serve as a template for studies investigating the relationship between process and event rates across health-care institutions. PMID:11294704

  12. Results Of Automating A Photolithography Cell In A Clean Tunnel

    NASA Astrophysics Data System (ADS)

    June, David H.

    1987-01-01

    A prototype automated photobay was installed in an existing fab area utilizing flexible material handling techniques within a clean tunnel. The project objective was to prove design concepts of automated cassette-to-cassette handling within a clean tunnel that isolated operators from the wafers being processed. Material handling was by monorail track transport system to feed cassettes to pick and place robots. The robots loaded and unloaded cassettes of wafers to each of the various pieces of process equipment. The material handling algorithms, recipe downloading and statistical process control functions were all performed by custom software on the photobay cell controller.

  13. The Role of Design-of-Experiments in Managing Flow in Compact Air Vehicle Inlets

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Miller, Daniel N.; Gridley, Marvin C.; Agrell, Johan

    2003-01-01

    It is the purpose of this study to demonstrate the viability and economy of Design-of-Experiments methodologies to arrive at microscale secondary flow control array designs that maintain optimal inlet performance over a wide range of the mission variables and to explore how these statistical methods provide a better understanding of the management of flow in compact air vehicle inlets. These statistical design concepts were used to investigate the robustness properties of low unit strength micro-effector arrays. Low unit strength micro-effectors are micro-vanes set at very low angles-of-incidence with very long chord lengths. They were designed to influence the near wall inlet flow over an extended streamwise distance, and their advantage lies in low total pressure loss and high effectiveness in managing engine face distortion. The term robustness is used in this paper in the same sense as it is used in the industrial problem solving community. It refers to minimizing the effects of the hard-to-control factors that influence the development of a product or process. In Robustness Engineering, the effects of the hard-to-control factors are often called noise, and the hard-to-control factors themselves are referred to as the environmental variables or sometimes as the Taguchi noise variables. Hence Robust Optimization refers to minimizing the effects of the environmental or noise variables on the development (design) of a product or process. In the management of flow in compact inlets, the environmental or noise variables can be identified with the mission variables. Therefore this paper formulates a statistical design methodology that minimizes the impact of variations in the mission variables on inlet performance and demonstrates that these statistical design concepts can lead to simpler inlet flow management systems.

  14. Performance points. The reform club.

    PubMed

    Edwards, Nick

    2004-03-18

    The Improvement Partnership for Hospitals (IPH) programme is the vanguard of Modernization Agency work. It is based on statistical process control to eliminate variations in performance, especially in elective services. All starred trusts will join IPH by next April.

  15. [Role of medical information processing for quality assurance in obstetrics].

    PubMed

    Selbmann, H K

    1983-06-01

    The paradigm of problem-orientated assurance of the professional quality of medical care is a kind of "control loop system" consisting of the following 5 steps: routine observation, identification of the problem, analysis of the problem, translation of problem solutions into daily practice, and control as to whether the problem has been solved or eliminated. Medical data processing, which involves documentation, electronic data processing and statistics, can make substantial contributions especially to the steps of observation, identification of the problem, and follow-up control. Perinatal data collection, which has already been introduced in 6 Länder of the Federal Republic of Germany, has supplied ample proof of this. These operations were conducted under the heading "internal clinical quality assurance with external aid". The clinics that participated in this programme were given the necessary aid in self-observation (questionnaires, clinical statistics), and they were also given comparative data to help them identify problems (clinical profiles, etc.). It is entirely the responsibility of the clinics themselves (voluntary cooperation and guaranteed anonymity being a matter of course) to draw their own conclusions from the collected data and to translate these into everyday clinical practice.

  16. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    PubMed

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited.
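
    The P charts referred to above plot per-survey proportions against three-sigma binomial limits around the pooled proportion; a minimal sketch with hypothetical counts follows.

```python
import math

def p_chart_limits(events, sample_sizes):
    """Three-sigma P chart limits with varying subgroup sizes:
    center line pbar = total events / total observations, and per-subgroup
    limits pbar +/- 3*sqrt(pbar*(1 - pbar)/n_i), lower limit floored at 0."""
    pbar = sum(events) / sum(sample_sizes)
    out = []
    for n in sample_sizes:
        w = 3.0 * math.sqrt(pbar * (1.0 - pbar) / n)
        out.append((max(0.0, pbar - w), pbar, pbar + w))
    return out

# Hypothetical yearly surveys: ulcer counts and patients at risk.
for lcl, cl, ucl in p_chart_limits([12, 9, 15, 7], [180, 160, 210, 150]):
    print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")
```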

  17. SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    LAH, J; SHIN, D; Kim, G

    Purpose: To evaluate and improve the reliability of the proton QA process, and to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. The aim is then to suggest suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to suggest suitable guidelines for patient-specific QA in proton beams by using process capability indices. In this study, patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations for the dose output and range of the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except spinal cord cases. In spinal cord cases, comparison of process capability indices (Cp, Cpm, Cpk ≥ 1, but Cpmk ≤ 1) indicated that the process is capable but not centered: the process mean deviates from its target value. The UCL (upper control limit), CL (center line) and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27% and −1.89%, respectively. In contrast, the range differences in prostate cases showed good agreement between calculated and measured values. The UCL, CL and LCL for prostate cases were 0.57%, −0.11% and −0.78%, respectively. Conclusion: SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
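
    The capability indices quoted in this record have standard definitions; the sketch below (not the authors' code, hypothetical data throughout) computes them for range differences against a ±2% specification.

```python
import numpy as np

def capability_indices(x, lsl, usl, target):
    """Cp   = (USL - LSL) / (6*sigma)
    Cpk  = min(USL - mean, mean - LSL) / (3*sigma)
    Cpm  = (USL - LSL) / (6*tau), tau = sqrt(sigma^2 + (mean - target)^2)
    Cpmk = min(USL - mean, mean - LSL) / (3*tau)"""
    mean, sigma = np.mean(x), np.std(x, ddof=1)
    tau = np.sqrt(sigma**2 + (mean - target) ** 2)
    return ((usl - lsl) / (6 * sigma),
            min(usl - mean, mean - lsl) / (3 * sigma),
            (usl - lsl) / (6 * tau),
            min(usl - mean, mean - lsl) / (3 * tau))

# Hypothetical range differences (%) with a +/-2% specification:
diffs = np.random.default_rng(1).normal(-0.3, 0.5, size=30)
print(capability_indices(diffs, lsl=-2.0, usl=2.0, target=0.0))
```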

  18. The integration of probabilistic information during sensorimotor estimation is unimpaired in children with Cerebral Palsy

    PubMed Central

    Sokhey, Taegh; Gaebler-Spira, Deborah; Kording, Konrad P.

    2017-01-01

    Background It is important to understand the motor deficits of children with Cerebral Palsy (CP). Our understanding of this motor disorder can be enriched by computational models of motor control. One crucial stage in generating movement involves combining uncertain information from different sources, and deficits in this process could contribute to reduced motor function in children with CP. Healthy adults can integrate previously learned information (prior) with incoming sensory information (likelihood) in a close-to-optimal way when estimating object location, consistent with the use of Bayesian statistics. However, there are few studies investigating how children with CP perform sensorimotor integration. We compare sensorimotor estimation in children with CP and age-matched controls using a model-based analysis to understand the process. Methods and findings We examined Bayesian sensorimotor integration in children with CP, aged between 5 and 12 years, with Gross Motor Function Classification System (GMFCS) levels 1–3, and compared their estimation behavior with that of age-matched typically-developing (TD) children. We used a simple sensorimotor estimation task which requires participants to combine probabilistic information from different sources: a likelihood distribution (current sensory information) with a prior distribution (learned target information). In order to examine sensorimotor integration, we quantified how participants weighted statistical information from the two sources (prior and likelihood) and compared this to the statistically optimal weighting. We found that the weighting of statistical information in children with CP was as statistically efficient as that of TD children. Conclusions We conclude that Bayesian sensorimotor integration is not impaired in children with CP and, therefore, does not contribute to their motor deficits. Future research has the potential to enrich our understanding of motor disorders by investigating the stages of motor processing set out by computational models. Therapeutic interventions should exploit the ability of children with CP to use statistical information. PMID:29186196

  19. Hydrochemical evolution and groundwater flow processes in the Galilee and Eromanga basins, Great Artesian Basin, Australia: a multivariate statistical approach.

    PubMed

    Moya, Claudio E; Raiber, Matthias; Taulis, Mauricio; Cox, Malcolm E

    2015-03-01

    The Galilee and Eromanga basins are sub-basins of the Great Artesian Basin (GAB). In this study, a multivariate statistical approach (hierarchical cluster analysis, principal component analysis and factor analysis) is carried out to identify hydrochemical patterns and assess the processes that control hydrochemical evolution within key aquifers of the GAB in these basins. The results of the hydrochemical assessment are integrated into a 3D geological model (previously developed) to support the analysis of spatial patterns of hydrochemistry, and to identify the hydrochemical and hydrological processes that control hydrochemical variability. In this area of the GAB, the hydrochemical evolution of groundwater is dominated by evapotranspiration near the recharge area resulting in a dominance of the Na-Cl water types. This is shown conceptually using two selected cross-sections which represent discrete groundwater flow paths from the recharge areas to the deeper parts of the basins. With increasing distance from the recharge area, a shift towards a dominance of carbonate (e.g. Na-HCO3 water type) has been observed. The assessment of hydrochemical changes along groundwater flow paths highlights how aquifers are separated in some areas, and how mixing between groundwater from different aquifers occurs elsewhere controlled by geological structures, including between GAB aquifers and coal bearing strata of the Galilee Basin. The results of this study suggest that distinct hydrochemical differences can be observed within the previously defined Early Cretaceous-Jurassic aquifer sequence of the GAB. A revision of the two previously recognised hydrochemical sequences is being proposed, resulting in three hydrochemical sequences based on systematic differences in hydrochemistry, salinity and dominant hydrochemical processes. The integrated approach presented in this study which combines different complementary multivariate statistical techniques with a detailed assessment of the geological framework of these sedimentary basins, can be adopted in other complex multi-aquifer systems to assess hydrochemical evolution and its geological controls. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Using Deming To Improve Quality in Colleges and Universities.

    ERIC Educational Resources Information Center

    Cornesky, Robert A.; And Others

    Of all the people known for stressing quality in industry, W. Edwards Deming is the pioneer. He stresses statistical process control (SPC) and a 14-point process for managers to improve quality and productivity. His approach is humanistic and treats people as intelligent human beings who want to do a good job. Twelve administrators in a university…

  1. Implementation of a real-time statistical process control system in hardwood sawmills

    Treesearch

    Timothy M. Young; Brian H. Bond; Jan Wiedenbeck

    2007-01-01

    Variation in sawmill processes reduces the financial benefit of converting fiber from a log into lumber. Lumber is intentionally oversized during manufacture to allow for sawing variation, shrinkage from drying, and final surfacing. This oversizing of lumber due to sawing variation requires higher operating targets and leads to suboptimal fiber recovery. For more than...

  2. Combining Natural Language Processing and Statistical Text Mining: A Study of Specialized versus Common Languages

    ERIC Educational Resources Information Center

    Jarman, Jay

    2011-01-01

    This dissertation focuses on developing and evaluating hybrid approaches for analyzing free-form text in the medical domain. This research draws on natural language processing (NLP) techniques that are used to parse and extract concepts based on a controlled vocabulary. Once important concepts are extracted, additional machine learning algorithms,…

  3. Impaired Statistical Learning in Developmental Dyslexia

    PubMed Central

    Thiessen, Erik D.; Holt, Lori L.

    2015-01-01

    Purpose Developmental dyslexia (DD) is commonly thought to arise from phonological impairments. However, an emerging perspective is that a more general procedural learning deficit, not specific to phonological processing, may underlie DD. The current study examined if individuals with DD are capable of extracting statistical regularities across sequences of passively experienced speech and nonspeech sounds. Such statistical learning is believed to be domain-general, to draw upon procedural learning systems, and to relate to language outcomes. Method DD and control groups were familiarized with a continuous stream of syllables or sine-wave tones, the ordering of which was defined by high or low transitional probabilities across adjacent stimulus pairs. Participants subsequently judged two 3-stimulus test items with either high or low statistical coherence as being the most similar to the sounds heard during familiarization. Results As with control participants, the DD group was sensitive to the transitional probability structure of the familiarization materials as evidenced by above-chance performance. However, the performance of participants with DD was significantly poorer than controls across linguistic and nonlinguistic stimuli. In addition, reading-related measures were significantly correlated with statistical learning performance of both speech and nonspeech material. Conclusion Results are discussed in light of procedural learning impairments among participants with DD. PMID:25860795
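
    The transitional probabilities manipulated in such familiarization streams are simply conditional pair frequencies; a toy sketch (hypothetical syllable stream) shows the high within-"word" versus low between-"word" structure.

```python
from collections import Counter

def transitional_probabilities(stream):
    """Forward transitional probabilities P(B|A) = count(A,B) / count(A)
    over adjacent pairs in a stimulus stream."""
    pairs = Counter(zip(stream, stream[1:]))
    firsts = Counter(stream[:-1])
    return {(a, b): n / firsts[a] for (a, b), n in pairs.items()}

# Hypothetical stream: "words" tu-pi-ro and go-la-bu recur intact,
# so within-word transitions approach 1.0 and between-word ones stay low.
stream = "tu-pi-ro-go-la-bu-tu-pi-ro-pa-do-ti-go-la-bu".split("-")
for pair, p in sorted(transitional_probabilities(stream).items()):
    print(pair, round(p, 2))
```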

  4. Optimizing construction quality management of pavements using mechanistic performance analysis.

    DOT National Transportation Integrated Search

    2004-08-01

    This report presents a statistical-based algorithm that was developed to reconcile the results from several pavement performance models used in the state of practice with systematic process control techniques. These algorithms identify project-specif...

  5. Low-level processing for real-time image analysis

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.

  6. Investigating output and energy variations and their relationship to delivery QA results using Statistical Process Control for helical tomotherapy.

    PubMed

    Binny, Diana; Mezzenga, Emilio; Lancaster, Craig M; Trapp, Jamie V; Kairn, Tanya; Crowe, Scott B

    2017-06-01

    The aims of this study were to investigate machine beam parameters using the TomoTherapy quality assurance (TQA) tool, to establish a correlation to patient delivery quality assurance results, and to evaluate the relationship between energy variations detected using different TQA modules. TQA daily measurement results from two treatment machines for periods of up to 4 years were acquired. Analyses of beam quality and of helical and static output variations were made. Variations from planned dose were also analysed using the Statistical Process Control (SPC) technique, and their relationship to output trends was studied. Energy variations appeared to be one of the contributing factors to the delivery output dose variations seen in the analysis. Ion chamber measurements were reliable indicators of energy and output variations and were linear with patient dose verifications. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  7. Integrated Assessment and Improvement of the Quality Assurance System for the Cosworth Casting Process

    NASA Astrophysics Data System (ADS)

    Yousif, Dilon

    The purpose of this study was to improve the Quality Assurance (QA) System at the Nemak Windsor Aluminum Plant (WAP). The project used the Six Sigma method based on Define, Measure, Analyze, Improve, and Control (DMAIC). Analysis of the in-process melt at WAP was based on chemical, thermal, and mechanical testing. The control limits for the W319 Al Alloy were statistically recalculated using the composition measured under stable conditions. The "Chemistry Viewer" software was developed for statistical analysis of alloy composition. This software features the Silicon Equivalency (SiBQ) developed by the IRC. The Melt Sampling Device (MSD) was designed and evaluated at WAP to overcome traditional sampling limitations. The Thermal Analysis "Filters" software was developed for cooling curve analysis of the 3XX Al Alloy(s) using IRC techniques. The impact of low melting point impurities on the start of melting was evaluated using the Universal Metallurgical Simulator and Analyzer (UMSA).

  8. Adaptive and Optimal Control of Stochastic Dynamical Systems

    DTIC Science & Technology

    2015-09-14

    T. E. Duncan and B. Pasik-Duncan, in Advances in Statistics, Probability and Actuarial Sciences (eds. S. N. Cohen, T. K. Siu and H. Yang), Vol. 1, World Scientific, 2012, 451-463.

  9. Data mining and statistical inference in selective laser melting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamath, Chandrika

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.

  10. Data mining and statistical inference in selective laser melting

    DOE PAGES

    Kamath, Chandrika

    2016-01-11

    Selective laser melting (SLM) is an additive manufacturing process that builds a complex three-dimensional part, layer-by-layer, using a laser beam to fuse fine metal powder together. The design freedom afforded by SLM comes associated with complexity. As the physical phenomena occur over a broad range of length and time scales, the computational cost of modeling the process is high. At the same time, the large number of parameters that control the quality of a part make experiments expensive. In this paper, we describe ways in which we can use data mining and statistical inference techniques to intelligently combine simulations and experiments to build parts with desired properties. We start with a brief summary of prior work in finding process parameters for high-density parts. We then expand on this work to show how we can improve the approach by using feature selection techniques to identify important variables, data-driven surrogate models to reduce computational costs, improved sampling techniques to cover the design space adequately, and uncertainty analysis for statistical inference. Here, our results indicate that techniques from data mining and statistics can complement those from physical modeling to provide greater insight into complex processes such as selective laser melting.
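
    As a loose illustration of the surrogate-modelling idea described in the two records above (not the authors' pipeline), one can fit a cheap regression surrogate to process-parameter data and screen variable importance from it; the sketch below uses scikit-learn on entirely synthetic data.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for SLM data: process parameters -> part density,
# where only the first two parameters actually matter.
rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 4))        # e.g. power, speed, spacing, thickness
y = 90 + 8 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 0.5, 200)

surrogate = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
print(surrogate.feature_importances_)  # feature screening
print(surrogate.predict(X[:3]))        # cheap stand-in for simulation runs
```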

  11. The interprocess NIR sampling as an alternative approach to multivariate statistical process control for identifying sources of product-quality variability.

    PubMed

    Marković, Snežana; Kerč, Janez; Horvat, Matej

    2017-03-01

    We present a new approach for identifying sources of variability within a manufacturing process by NIR measurements of samples of intermediate material after each consecutive unit operation (interprocess NIR sampling technique). In addition, we summarize the development of a multivariate statistical process control (MSPC) model for the production of an enteric-coated pellet product of the proton-pump inhibitor class. By developing provisional NIR calibration models, the identification of critical process points yields results comparable to the established MSPC modeling procedure. Both approaches are shown to lead to the same conclusion, identifying parameters of extrusion/spheronization and characteristics of lactose that have the greatest influence on the end-product's enteric coating performance. The proposed approach enables quicker and easier identification of variability sources during the manufacturing process, especially in cases where historical process data are not straightforwardly available. In the presented case, changes in lactose characteristics influenced the performance of the extrusion/spheronization process step. The pellet cores produced using one (less suitable) lactose source were on average larger and more fragile, leading to breakage of the cores during subsequent fluid bed operations. These results were confirmed by additional experimental analyses illuminating the underlying mechanism of fracture of oblong pellets during the pellet coating process that leads to compromised film coating.

  12. [Programme for improving emotional and cognitive changes in patients under renal dialysis in Egypt].

    PubMed

    Awadalla, Hala I; El-Ateek, Ahmed M; Elhammady, Mohamed M; Kamel, Magda A

    2008-01-01

    We investigated the effect of chronic renal failure on the emotional status, social and psychological adaptation and the cognitive status of patients and the effect of a programme to improve the psychosocial state of the patients; 40 renal dialysis patients and 40 healthy controls were included. We used the Emotional Status Scale, Psychosocial Adaptation Scale, the Primary Mental Abilities Test and the Memory Processes Scale for assessment of the participants. The controls had better emotional/cognitive status and psychosocial adaptation than the dialysis patients, a statistically significant difference. There were also statistically significant differences between the patients before and after the application of the programme.

  13. Diagnosis of abnormal patterns in multivariate microclimate monitoring: a case study of an open-air archaeological site in Pompeii (Italy).

    PubMed

    Merello, Paloma; García-Diego, Fernando-Juan; Zarzo, Manuel

    2014-08-01

    Chemometrics has been applied successfully since the 1990s for the multivariate statistical control of industrial processes. A new area of interest for these tools is the microclimatic monitoring of cultural heritage. Sensors record climatic parameters over time and statistical data analysis is performed to obtain valuable information for preventive conservation. A case study of an open-air archaeological site is presented here. A set of 26 temperature and relative humidity data-loggers was installed in four rooms of Ariadne's house (Pompeii). If climatic values are recorded versus time at different positions, the resulting data structure is equivalent to records of physical parameters registered at several points of a continuous chemical process. However, there is an important difference in this case: continuous processes are controlled to reach a steady state, whilst open-air sites undergo tremendous fluctuations. Although data from continuous processes are usually column-centred prior to applying principal components analysis, it turned out that another pre-treatment (row-centred data) was more convenient for the interpretation of components and to identify abnormal patterns. The detection of typical trajectories was more straightforward by dividing the whole monitored period into several sub-periods, because the marked climatic fluctuations throughout the year affect the correlation structures. The proposed statistical methodology is of interest for the microclimatic monitoring of cultural heritage, particularly in the case of open-air or semi-confined archaeological sites. Copyright © 2014 Elsevier B.V. All rights reserved.
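
    The pre-treatment choice discussed above changes only the centring step before the decomposition; a brief sketch (synthetic data standing in for the 26 data-loggers) shows both options.

```python
import numpy as np

def pca_scores(data, center="columns", n_components=2):
    """PCA via SVD with either column-centring (subtract each sensor's
    mean over time, the usual choice for process data) or row-centring
    (subtract each time point's mean across sensors, which emphasises
    contrasts between monitoring positions)."""
    x = data - (data.mean(axis=0) if center == "columns"
                else data.mean(axis=1, keepdims=True))
    u, s, _ = np.linalg.svd(x, full_matrices=False)
    return u[:, :n_components] * s[:n_components]

# Hypothetical matrix: rows = hourly records, columns = 26 sensors.
records = np.random.default_rng(2).normal(20, 3, size=(500, 26))
print(pca_scores(records, center="rows")[:3])
```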

  14. Bulk tank somatic cell counts analyzed by statistical process control tools to identify and monitor subclinical mastitis incidence.

    PubMed

    Lukas, J M; Hawkins, D M; Kinsel, M L; Reneau, J K

    2005-11-01

    The objective of this study was to examine the relationship between monthly Dairy Herd Improvement (DHI) subclinical mastitis and new infection rate estimates and daily bulk tank somatic cell count (SCC) summarized by statistical process control tools. Dairy Herd Improvement Association test-day subclinical mastitis and new infection rate estimates, along with daily or every-other-day bulk tank SCC data, were collected for 12 mo of 2003 from 275 Upper Midwest dairy herds. Herds were divided into 5 herd production categories. A linear score [LNS = ln(BTSCC/100,000)/0.693147 + 3] was calculated for each individual bulk tank SCC. For both the raw SCC and the transformed data, the mean and sigma were calculated using the statistical quality control individual measurement and moving range chart procedure of Statistical Analysis System. One hundred eighty-three of the 275 herds in the study data set were then randomly selected, and the raw (method 1) and transformed (method 2) bulk tank SCC mean and sigma were used to develop models for predicting subclinical mastitis and new infection rate estimates. Herd production category was also included in all models as 5 dummy variables. Models were validated by calculating estimates of subclinical mastitis and new infection rates for the remaining 92 herds and plotting them against observed values of each of the dependents. Only herd production category and bulk tank SCC mean were significant and remained in the final models. High R2 values (0.83 and 0.81 for methods 1 and 2, respectively) indicated a strong correlation between the bulk tank SCC and the herd's subclinical mastitis prevalence. The standard errors of the estimate were 4.02 and 4.28% for methods 1 and 2, respectively, and decreased with increasing herd production. As a case study, Shewhart Individual Measurement Charts were plotted from the bulk tank SCC to identify shifts in mastitis incidence. Four of 5 charts examined signaled a change in bulk tank SCC before the DHI test day identified the change in subclinical mastitis prevalence. It can be concluded that applying statistical process control tools to daily bulk tank SCC can be used to estimate subclinical mastitis prevalence in the herd and to observe changes in subclinical mastitis status. Single DHI test-day estimates of new infection rate were insufficient to accurately describe its dynamics.
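
    The linear score is given explicitly in the abstract (dividing ln(x) by 0.693147 is just log base 2), and individuals/moving-range limits follow the standard constant 2.66 = 3/d2; a short sketch with hypothetical SCC values follows.

```python
import numpy as np

def lns(bulk_tank_scc):
    """Linear score from the study: LNS = ln(SCC/100,000)/0.693147 + 3."""
    return np.log(np.asarray(bulk_tank_scc) / 100_000) / 0.693147 + 3

def imr_limits(x):
    """Shewhart individuals chart: mean +/- 2.66 * average moving range."""
    x = np.asarray(x, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(x)))
    center = x.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical daily bulk tank SCC values (cells/mL):
scc = [180_000, 210_000, 195_000, 240_000, 205_000, 260_000]
print(imr_limits(lns(scc)))
```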

  15. Statistical Learning of Two Artificial Languages Presented Successively: How Conscious?

    PubMed Central

    Franco, Ana; Cleeremans, Axel; Destrebecqz, Arnaud

    2011-01-01

    Statistical learning is assumed to occur automatically and implicitly, but little is known about the extent to which the representations acquired over training are available to conscious awareness. In this study, we focus on whether the knowledge acquired in a statistical learning situation is available to conscious control. Participants were first exposed to an artificial language presented auditorily. Immediately thereafter, they were exposed to a second artificial language. Both languages were composed of the same corpus of syllables and differed only in the transitional probabilities. We first determined that both languages were equally learnable (Experiment 1) and that participants could learn the two languages and differentiate between them (Experiment 2). Then, in Experiment 3, we used an adaptation of the Process-Dissociation Procedure (Jacoby, 1991) to explore whether participants could consciously manipulate the acquired knowledge. Results suggest that statistical information can be used to parse and differentiate between two different artificial languages, and that the resulting representations are available to conscious control. PMID:21960981

  16. A new approach to process control using Instability Index

    NASA Astrophysics Data System (ADS)

    Weintraub, Jeffrey; Warrick, Scott

    2016-03-01

    The merits of a robust Statistical Process Control (SPC) methodology have long been established. In response to the numerous SPC rule combinations, processes, and the high cost of containment, the Instability Index (ISTAB) is presented as a tool for managing these complexities. ISTAB focuses limited resources on key issues and provides a window into the stability of manufacturing operations. ISTAB takes advantage of the statistical nature of processes by comparing the observed average run length (OARL) to the expected run length (ARL), resulting in a gap value called the ISTAB index. The ISTAB index has three characteristic behaviors that are indicative of defects in an SPC instance. Case 1: the observed average run length is excessively long relative to expectation; ISTAB > 0 indicates the possibility that the limits are too wide. Case 2: the observed average run length is consistent with expectation; ISTAB near zero indicates that the process is stable. Case 3: the observed average run length is inordinately short relative to expectation; ISTAB < 0 indicates that the limits are too tight, the process is unstable, or both. The probability distribution of run length is the basis for establishing an ARL. We demonstrate that the geometric distribution is a good approximation to run length across a wide variety of rule sets. Excessively long run lengths are associated with one kind of defect in an SPC instance; inordinately short run lengths are associated with another. A sampling distribution is introduced as a way to quantify excessively long and inordinately short observed run lengths. This paper provides detailed guidance for action limits on these run lengths. ISTAB as a statistical method of review facilitates automated instability detection. This paper proposes a management system based on ISTAB as an enhancement to more traditional SPC approaches.
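
    The abstract does not give the exact index formula, so the sketch below assumes a simple relative gap between observed and expected average run length, with the geometric run-length approximation mentioned above supplying the ARL; the sign convention mirrors the three cases described.

```python
import numpy as np

def istab_index(run_lengths, arl):
    """Assumed gap definition: (OARL - ARL) / ARL, so that a positive
    value suggests limits that are too wide, a value near zero a stable
    process, and a negative value tight limits and/or instability."""
    return (np.mean(run_lengths) - arl) / arl

# If a rule signals with probability q per point, run length is roughly
# geometric with ARL = 1/q (e.g. q = 0.0027 gives ARL ~ 370).
arl = 1 / 0.0027
print(istab_index([120, 85, 150, 60], arl))  # hypothetical observed runs
```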

  17. Model Checking Techniques for Assessing Functional Form Specifications in Censored Linear Regression Models.

    PubMed

    León, Larry F; Cai, Tianxi

    2012-04-01

    In this paper we develop model checking techniques for assessing functional form specifications of covariates in censored linear regression models. These procedures are based on a censored data analog to taking cumulative sums of "robust" residuals over the space of the covariate under investigation. These cumulative sums are formed by integrating certain Kaplan-Meier estimators and may be viewed as "robust" censored data analogs to the processes considered by Lin, Wei & Ying (2002). The null distributions of these stochastic processes can be approximated by the distributions of certain zero-mean Gaussian processes whose realizations can be generated by computer simulation. Each observed process can then be graphically compared with a few realizations from the Gaussian process. We also develop formal test statistics for numerical comparison. Such comparisons enable one to assess objectively whether an apparent trend seen in a residual plot reflects model misspecification or natural variation. We illustrate the methods with a well-known dataset. In addition, we examine the finite sample performance of the proposed test statistics in simulation experiments. In our simulation experiments, the proposed test statistics have good power of detecting misspecification while at the same time controlling the size of the test.

  18. Chromosome instability in children with asthma.

    PubMed

    Lialiaris, Theodore; Polyzou, Aggeliki; Mpountoukas, Panagiotis; Tsiggene, Anthi; Kouskoukis, Alexandros; Pouliliou, Stamatia; Paraskakis, Emmanouil; Tentes, Ioannis; Trypsianis, Grigorios; Chatzimichail, Athanasios

    2009-10-01

    Asthma is a complex disease with multiple interactions between genetic and environmental factors. The aim of our study was to investigate possible genetic instability in cultured peripheral blood lymphocytes of asthmatic patients (AP). Furthermore, the presence of either cytostaticity or cytotoxicity was assessed. Human peripheral blood lymphocytes were cultured from 18 children admitted to the Pediatric Clinic of the University Hospital of Alexandroupolis (average age 7.2 years); 9 healthy blood donors were used as control subjects (average age 6.5 years), none of whom was receiving drugs for medical or other reasons. A significant (p < 0.05) increase in spontaneous sister chromatid exchange (SCE) frequency in asthmatic patients compared with control subjects was observed. No statistically significant modification in the spontaneous proliferation rate index (PRI) in AP compared with the controls was demonstrated. Finally, MMC induced a statistically significant increase in SCE frequency both in controls and in AP, with the MMC-induced SCE rates in AP being statistically (p < 0.01) higher than the MMC-induced SCE rates in controls. These results suggest a new diagnostic approach to possible genetic instability based on a combination of the genotoxic, cytostatic and cytotoxic effects of asthma on human peripheral lymphocytes.

  19. Biocompatibility evaluation of alendronate paste in rat's subcutaneous tissue.

    PubMed

    Mori, Graziela Garrido; de Moraes, Ivaldo Gomes; Nunes, Daniele Clapes; Castilho, Lithiene Ribeiro; Poi, Wilson Roberto; Capaldi, Maria Luciana P Manzoli

    2009-04-01

    Alendronate is a known inhibitor of root resorption, and the development of an alendronate paste would enhance its utilization as an intracanal medication. Therefore, this study aimed to investigate the biocompatibility of an experimental alendronate paste in the subcutaneous tissue of rats, for utilization in teeth susceptible to root resorption. The study was conducted on 15 male rats weighing approximately 180-200 grams. The rats' dorsal regions were subjected to one incision on the median region and, laterally to the incision, the subcutaneous tissue was raised and gently dissected for introduction of two tubes in each rat. Tubes sealed at one end with gutta-percha were taken as the control; the experimental tubes were filled with the alendronate paste. The animals were killed at 7, 15 and 45 days after surgery and the specimens were processed in the laboratory. The histological sections were stained with hematoxylin-eosin and analyzed by light microscopy. Scores were assigned to the inflammatory process and statistically compared by the Tukey test (P < 0.05). The alendronate paste promoted a severe inflammatory process at 7 days, with a statistically significant difference compared to the control (P < 0.05). However, at 15 days there was a regression of inflammation, and connective tissue with collagen fibers, fibroblasts and blood vessels was observed. After 45 days, well-organized connective tissue with collagen fibers, fibroblasts and few inflammatory cells was present. No statistical difference was observed between the control and the experimental paste at 15 and 45 days. The experimental alendronate paste was considered biocompatible with the subcutaneous tissue of rats.

  20. The effects of computer-assisted instruction and locus of control upon preservice elementary teachers' acquisition of the integrated science process skills

    NASA Astrophysics Data System (ADS)

    Wesley, Beth Eddinger; Krockover, Gerald H.; Devito, Alfred

    The purpose of this study was to determine the effects of computer-assisted instruction (CAI) versus a text mode of programmed instruction (PI), and the cognitive style of locus of control, on preservice elementary teachers' achievement of the integrated science process skills. Eighty-one preservice elementary teachers in six sections of a science methods class were classified as internally or externally controlled. The sections were randomly assigned to receive instruction in the integrated science process skills via a microcomputer or printed text. The study used a pretest-posttest control group design. Before assessing main and interaction effects, analysis of covariance was used to adjust posttest scores using the pretest scores. Statistical analysis revealed that main effects were not significant. Additionally, no interaction effects between treatments and loci of control were demonstrated. The results suggest that printed PI and tutorial CAI are equally effective modes of instruction for teaching internally and externally oriented preservice elementary teachers the integrated science process skills.

  1. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
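
    IPTW itself reduces to weighting each subject by the inverse of the estimated probability of the treatment actually received; a minimal sketch (synthetic data and stabilised weights, not the authors' simulation code) follows, with the weights then feeding a weighted outcome model such as a Cox regression.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic observational data with confounded treatment assignment.
rng = np.random.default_rng(3)
x = rng.normal(size=(1000, 3))
treat = rng.binomial(1, 1 / (1 + np.exp(-x[:, 0])))

# Propensity scores and stabilised inverse-probability weights.
ps = LogisticRegression().fit(x, treat).predict_proba(x)[:, 1]
p_treat = treat.mean()
weights = np.where(treat == 1, p_treat / ps, (1 - p_treat) / (1 - ps))
print(weights[:5])  # pass as case weights to the outcome model
```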

  2. Control by quality: proposition of a typology.

    PubMed

    Pujo, P; Pillet, M

    The application of Quality tools and methods in industrial management has always had a fundamental impact on the control of production. It influences the behavior of the actors concerned, while introducing the necessary notions and formalizations, especially for production systems with little or no automation, which constitute a large part of the industrial activity. Several quality approaches are applied in the workshop and are implemented at the level of the control. In this paper, the authors present a typology of the various approaches that have successively influenced control, such as statistical process control, quality assurance, and continuous improvement. First the authors present a parallel between production control and quality organizational structure. They note the duality between control, which is aimed at increasing productivity, and quality, which aims to satisfy the needs of the customer. They also note the hierarchical organizational structure of these two systems of management with, at each level, the notion of a feedback loop. This notion is fundamental to any kind of decision making. The paper is organized around the operational, tactical, and strategic levels, by describing for each level the main methods and tools for control by quality. The overview of these tools and methods starts at the operational level, with the Statistical Process Control, the Taguchi technique, and the "six sigma" approach. On the tactical level, we find a quality system approach, with a documented description of the procedures introduced in the firm. The management system can refer here to Quality Assurance, Total Productive Maintenance, or Management by Total Quality. The formalization through procedures of the rules of decision governing the process control enhances the validity of these rules. This leads to the enhancement of their reliability and to their consolidation. All this counterbalances the human, intrinsically fluctuating, behavior of the control operators. Strategic control by quality is then detailed, and the two main approaches, the continuous improvement approach and the proactive improvement approach, are introduced. Finally, the authors observe that at each of the three levels, the continuous process improvement, which is a component of Total Quality, becomes an essential preoccupation for the control. Ultimately, the recursive utilization of the Deming cycle remains the best practice for the control by quality.

  3. Development of rational pay factors based on concrete compressive strength data

    DOT National Transportation Integrated Search

    2008-06-01

    This research project addresses the opportunity to contain the escalating costs of concrete materials in construction projects. Both statistical process control and rational acceptance criteria show that quality improvement and cost savings can be ac...

  4. Quality control for quantitative PCR based on amplification compatibility test.

    PubMed

    Tichopad, Ales; Bar, Tzachi; Pecen, Ladislav; Kitchen, Robert R; Kubista, Mikael; Pfaffl, Michael W

    2010-04-01

    Quantitative PCR (qPCR) is a routinely used method for the accurate quantification of nucleic acids. Yet it may generate erroneous results if the amplification process is obscured by inhibition or generation of aberrant side-products such as primer dimers. Several methods have been established to control for pre-processing performance that rely on the introduction of a co-amplified reference sequence; however, there is currently no method to allow for reliable control of the amplification process without directly modifying the sample mix. Herein we present a statistical approach based on multivariate analysis of the amplification response data generated in real time. The amplification trajectory in its most resolved and dynamic phase is fitted with a suitable model. Two parameters of this model, related to amplification efficiency, are then used for calculation of the Z-score statistics. Each studied sample is compared to a predefined reference set of reactions, typically calibration reactions. A probabilistic decision for each individual Z-score is then used to identify the majority of inhibited reactions in our experiments. We compare this approach to univariate methods using only the sample-specific amplification efficiency as a reporter of compatibility. We demonstrate improved identification performance using the multivariate approach compared to the univariate approach. Finally, we stress that the performance of the amplification compatibility test as a quality control procedure depends on the quality of the reference set. Copyright 2010 Elsevier Inc. All rights reserved.
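
    The compatibility test reduces to Z-scores of efficiency-related curve-fit parameters against a reference set of reactions; the sketch below assumes such parameters have already been fitted and uses hypothetical values throughout.

```python
import numpy as np

def amplification_z_scores(sample_params, reference_params):
    """Per-parameter Z-scores against a reference (e.g. calibration) set:
    z = (x - mean_ref) / sd_ref, one column per fitted parameter."""
    ref = np.asarray(reference_params, dtype=float)
    mu, sd = ref.mean(axis=0), ref.std(axis=0, ddof=1)
    return (np.asarray(sample_params, dtype=float) - mu) / sd

# Hypothetical fitted (slope, curvature) pairs; flag |z| > 3.
reference = np.random.default_rng(4).normal([1.9, 0.1], [0.05, 0.02], (20, 2))
samples = [[1.88, 0.11], [1.60, 0.18]]
print(np.abs(amplification_z_scores(samples, reference)) > 3)
```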

  5. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    This document provides an overview of Total Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality of knowledge.

  6. An Assessment of Statistical Process Control-Based Approaches for Charting Student Evaluation Scores

    ERIC Educational Resources Information Center

    Ding, Xin; Wardell, Don; Verma, Rohit

    2006-01-01

    We compare three control charts for monitoring data from student evaluations of teaching (SET) with the goal of improving student satisfaction with teaching performance. The two charts that we propose are a modified "p" chart and a z-score chart. We show that these charts overcome some of the shortcomings of the more traditional charts…

  7. Soft Sensors: Chemoinformatic Model for Efficient Control and Operation in Chemical Plants.

    PubMed

    Funatsu, Kimito

    2016-12-01

    A soft sensor is a statistical model that serves as an essential tool for controlling pharmaceutical, chemical and industrial plants. I introduce soft sensors, their roles, applications and problems, and research examples such as adaptive soft sensors, database monitoring and efficient process control. The use of soft sensors enables chemical industrial plants to be operated more effectively and stably. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Real-time control data wrangling for development of mathematical control models of technological processes

    NASA Astrophysics Data System (ADS)

    Vasilyeva, N. V.; Koteleva, N. I.; Fedorova, E. R.

    2018-05-01

    The relevance of the research is due to the need to stabilize the composition of the melting products of copper-nickel sulfide raw materials in the Vanyukov furnace. The goal of this research is to identify the most suitable methods for aggregating real-time data for the development of a mathematical model for control of the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace. Statistical methods for analyzing the historical data of a real technological object and correlation analysis of process parameters are described. Factors that exert the greatest influence on the main output parameter (copper content in matte) and ensure the physical-chemical transformations are revealed. An approach to processing real-time data for the development of a mathematical model for control of the melting process is proposed, and the stages of processing real-time information are considered. The adopted methodology for aggregating data suitable for the development of a control model for the technological process of melting copper-nickel sulfide raw materials in the Vanyukov furnace allows the obtained results to be interpreted for further practical application.

  9. [Near infrared spectroscopy based process trajectory technology and its application in monitoring and controlling of traditional Chinese medicine manufacturing process].

    PubMed

    Li, Wen-Long; Qu, Hai-Bin

    2016-10-01

    In this paper, the principle of NIRS (near infrared spectroscopy)-based process trajectory technology is introduced. The main steps of the technique are: (1) in-line collection of process spectra from the different procedures; (2) unfolding of the 3-D process spectra; (3) determination of the process trajectories and their normal limits; and (4) monitoring of new batches with the established MSPC (multivariate statistical process control) models. Applications of the technology to chemical and biological medicines are reviewed briefly. Drawing on our feasibility research on monitoring traditional Chinese medicine manufacturing processes using NIRS-based multivariate process trajectories, several practical problems requiring urgent solutions are identified, and the application prospects of NIRS-based process trajectory technology are discussed. Copyright© by the Chinese Pharmaceutical Association.
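    Steps (2) to (4) are the core of batch MSPC and can be sketched compactly. The following is a schematic under assumed data shapes (batches as a 3-D array of time x wavelength spectra), using a PCA model with a Hotelling T^2 limit; the paper's exact chart construction may differ.

      import numpy as np
      from scipy.stats import f as f_dist
      from sklearn.decomposition import PCA

      def fit_mspc(batches, n_components=3, alpha=0.01):
          # batches: (n_batches, n_times, n_wavelengths) in-control spectra
          n = batches.shape[0]
          X = batches.reshape(n, -1)                   # step 2: unfold
          mean, std = X.mean(0), X.std(0, ddof=1) + 1e-12
          pca = PCA(n_components).fit((X - mean) / std)
          # step 3: F-distribution based control limit for Hotelling T^2
          a = n_components
          limit = (a * (n - 1) * (n + 1) / (n * (n - a))
                   * f_dist.ppf(1 - alpha, a, n - a))
          return pca, mean, std, limit

      def t_squared(pca, mean, std, new_batch):
          # step 4: Hotelling T^2 of a new batch against the model
          scores = pca.transform(((new_batch.ravel() - mean) / std)[None, :])
          return float(np.sum(scores**2 / pca.explained_variance_))

      # pca, mean, std, limit = fit_mspc(normal_batches)
      # if t_squared(pca, mean, std, batch) > limit: investigate the batch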

  10. Defense Waste Processing Facility (DWPF) Viscosity Model: Revisions for Processing High TiO 2 Containing Glasses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C. M.; Edwards, T. B.

    Radioactive high-level waste (HLW) at the Savannah River Site (SRS) has been successfully vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints, since the glass cannot be reworked once it is poured into ten-foot-tall by two-foot-diameter canisters. A unique “feed forward” statistical process control (SPC) approach was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification; in SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository. The DWPF SPC system is known as the Product Composition Control System (PCCS). The DWPF will soon be receiving wastes from the Salt Waste Processing Facility (SWPF) containing increased concentrations of TiO2, Na2O, and Cs2O. The SWPF is being built to pretreat the high-curie fraction of the salt waste to be removed from the HLW tanks in the F- and H-Area Tank Farms at the SRS. In order to process TiO2 concentrations >2.0 wt% in the DWPF, new viscosity data were developed over the range of 1.90 to 6.09 wt% TiO2 and evaluated against the 2005 viscosity model. An alternate viscosity model is also derived for potential future use, should the DWPF ever need to process other titanate-containing ion exchange materials. The ultimate limit on the amount of TiO2 that can be accommodated from SWPF will be determined by the three PCCS models, the waste composition of a given sludge batch, the waste loading of the sludge batch, and the frit used for vitrification.

  11. Zirconia toughened SiC whisker reinforced alumina composites small business innovation research

    NASA Technical Reports Server (NTRS)

    Loutfy, R. O.; Stuffle, K. L.; Withers, J. C.; Lee, C. T.

    1987-01-01

    The objective of this Phase 1 project was to develop a ceramic composite with superior fracture toughness and high strength, based on combining two toughness-inducing materials: zirconia for transformation toughening and SiC whiskers for reinforcement, in a controlled-microstructure alumina matrix. The controlled matrix microstructure is obtained by controlling the nucleation frequency of the alumina gel with seeds (submicron alpha-alumina). The results demonstrate the technical feasibility of producing superior binary composites (Al2O3-ZrO2) and tertiary composites (Al2O3-ZrO2-SiC). Thirty-two composites were prepared, consolidated, and fracture-toughness tested. Statistical analysis of the results showed that: (1) the SiC type is the key statistically significant factor for increased toughness; (2) sol-gel processing with alpha-alumina seed had a statistically significant effect on increasing toughness of the binary and tertiary composites compared to the corresponding mixed-powder processing; and (3) ZrO2 content within the range investigated had a minor effect. Binary composites with an average critical fracture toughness of 6.6 MPa·m^1/2 were obtained. Tertiary composites with critical fracture toughness in the range of 9.3 to 10.1 MPa·m^1/2 were obtained. Results indicate that these composites are superior to zirconia-toughened alumina and SiC-whisker-reinforced alumina ceramic composites produced by conventional techniques with similar composition, based on published data.

  12. 15 CFR 200.103 - Consulting and advisory services.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...

  13. 15 CFR 200.103 - Consulting and advisory services.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ...., details of design and construction, operational aspects, unusual or extreme conditions, methods of statistical control of the measurement process, automated acquisition of laboratory data, and data reduction... group seminars on the precision measurement of specific types of physical quantities, offering the...

  14. Information processing speed and attention in multiple sclerosis: Reconsidering the Attention Network Test (ANT).

    PubMed

    Roth, Alexandra K; Denney, Douglas R; Lynch, Sharon G

    2015-01-01

    The Attention Network Test (ANT) assesses attention in terms of discrepancies between response times to items that differ in the burden they place on some facet of attention. However, simple arithmetic difference scores commonly used to capture these discrepancies fail to provide adequate control for information processing speed, leading to distorted findings when patient and control groups differ markedly in the speed with which they process and respond to stimulus information. This study examined attention networks in patients with multiple sclerosis (MS) using simple difference scores, proportional scores, and residualized scores that control for processing speed through statistical regression. Patients with relapsing-remitting (N = 20) or secondary progressive (N = 20) MS and healthy controls (N = 40) of similar age, education, and gender completed the ANT. Substantial differences between patients and controls were found on all measures of processing speed. Patients exhibited difficulties in the executive control network, but only when difference scores were considered. When deficits in information processing speed were adequately controlled using proportional or residualized scores, deficits in the alerting network emerged. The effect sizes for these deficits were notably smaller than those for overall information processing speed and were also limited to patients with secondary progressive MS. Deficits in processing speed are more prominent in MS than those involving attention, and when the former are properly accounted for, differences in the latter are confined to the alerting network.
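    To make the three scoring approaches concrete, the sketch below computes them for hypothetical per-participant response times; the variable names are illustrative, not from the study.

      import numpy as np

      def attention_scores(rt_hard, rt_easy, speed):
          # rt_hard/rt_easy: mean response times per participant (ms)
          # speed: overall processing-speed measure per participant
          diff = rt_hard - rt_easy                 # simple difference score
          prop = diff / rt_easy                    # proportional score
          # residualized score: regress the difference on processing speed
          X = np.column_stack([np.ones_like(speed), speed])
          beta, *_ = np.linalg.lstsq(X, diff, rcond=None)
          resid = diff - X @ beta
          return diff, prop, resid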

  15. Space-time-modulated stochastic processes

    NASA Astrophysics Data System (ADS)

    Giona, Massimiliano

    2017-10-01

    Starting from the physical problem associated with the Lorentzian transformation of a Poisson-Kac process in inertial frames, the concept of space-time-modulated stochastic processes is introduced for processes possessing finite propagation velocity. This class of stochastic processes provides a two-way coupling between the stochastic perturbation acting on a physical observable and the evolution of the physical observable itself, which in turn influences the statistical properties of the stochastic perturbation during its evolution. The definition of space-time-modulated processes requires the introduction of two functions: a nonlinear amplitude modulation, controlling the intensity of the stochastic perturbation, and a time-horizon function, which modulates its statistical properties, providing irreducible feedback between the stochastic perturbation and the physical observable influenced by it. The latter property is the peculiar fingerprint of this class of models that makes them suitable for extension to generic curved-space times. Considering Poisson-Kac processes as prototypical examples of stochastic processes possessing finite propagation velocity, the balance equations for the probability density functions associated with their space-time modulations are derived. Several examples highlighting the peculiarities of space-time-modulated processes are thoroughly analyzed.

  16. Use of ventilator associated pneumonia bundle and statistical process control chart to decrease VAP rate in Syria.

    PubMed

    Alsadat, Reem; Al-Bardan, Hussam; Mazloum, Mona N; Shamah, Asem A; Eltayeb, Mohamed F E; Marie, Ali; Dakkak, Abdulrahman; Naes, Ola; Esber, Faten; Betelmal, Ibrahim; Kherallah, Mazen

    2012-10-01

    A ventilator-associated pneumonia (VAP) bundle was implemented as a performance improvement project in critical care units for all mechanically ventilated patients, with the aim of decreasing VAP rates. The bundle was implemented in 4 teaching hospitals after educational sessions, and compliance rates along with VAP rates were monitored using statistical process control charts. VAP bundle compliance rates increased steadily from 33 to 80% in hospital 1, from 33 to 86% in hospital 2, and from 83 to 100% in hospital 3 during the study period. The VAP bundle was not applied in hospital 4, so no data were available. A target level of 95% was reached only in hospital 3. This correlated with a decrease in VAP rates from 30 to 6.4 per 1000 ventilator days in hospital 1 and from 12 to 4.9 per 1000 ventilator days in hospital 3, whereas the VAP rate failed to decrease in hospital 2 (despite better compliance) and remained high, at around 33 per 1000 ventilator days, in hospital 4 where the bundle was not implemented. The VAP bundle performed differently in different hospitals in our study. Prevention of VAP requires a multidimensional strategy that includes strict infection control interventions, VAP bundle implementation, process and outcome surveillance, and education.

  17. Experience and Sentence Processing: Statistical Learning and Relative Clause Comprehension

    PubMed Central

    Wells, Justine B.; Christiansen, Morten H.; Race, David S.; Acheson, Daniel J.; MacDonald, Maryellen C.

    2009-01-01

    Many explanations of the difficulties associated with interpreting object relative clauses appeal to the demands that object relatives make on working memory. MacDonald and Christiansen (2002) pointed to variations in reading experience as a source of differences, arguing that the unique word order of object relatives makes their processing more difficult and more sensitive to the effects of previous experience than the processing of subject relatives. This hypothesis was tested in a large-scale study manipulating reading experiences of adults over several weeks. The group receiving relative clause experience increased reading speeds for object relatives more than for subject relatives, whereas a control experience group did not. The reading time data were compared to performance of a computational model given different amounts of experience. The results support claims for experience-based individual differences and an important role for statistical learning in sentence comprehension processes. PMID:18922516

  18. Structure Learning in Bayesian Sensorimotor Integration

    PubMed Central

    Genewein, Tim; Hez, Eduard; Razzaghpanah, Zeynab; Braun, Daniel A.

    2015-01-01

    Previous studies have shown that sensorimotor processing can often be described by Bayesian learning, in particular the integration of prior and feedback information depending on its degree of reliability. Here we test the hypothesis that the integration process itself can be tuned to the statistical structure of the environment. We exposed human participants to a reaching task in a three-dimensional virtual reality environment where we could displace the visual feedback of their hand position in a two-dimensional plane. When introducing statistical structure between the two dimensions of the displacement, we found that over the course of several days participants adapted their feedback integration process in order to exploit this structure for performance improvement. In control experiments we found that this adaptation process critically depended on performance feedback and could not be induced by verbal instructions. Our results suggest that structural learning is an important meta-learning component of Bayesian sensorimotor integration. PMID:26305797

  19. Criterion validity of the Wechsler Intelligence Scale for Children-Fourth Edition after pediatric traumatic brain injury.

    PubMed

    Donders, Jacobus; Janke, Kelly

    2008-07-01

    The performance of 40 children with complicated mild to severe traumatic brain injury on the Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV; Wechsler, 2003) was compared with that of 40 demographically matched healthy controls. Of the four WISC-IV factor index scores, only Processing Speed yielded a statistically significant group difference (p < .001) as well as a statistically significant negative correlation with length of coma (p < .01). Logistic regression, using Processing Speed to classify individual children, yielded a sensitivity of 72.50% and a specificity of 62.50%, with false positive and false negative rates both exceeding 30%. We conclude that Processing Speed has acceptable criterion validity in the evaluation of children with complicated mild to severe traumatic brain injury but that the WISC-IV should be supplemented with other measures to assure sufficient accuracy in the diagnostic process.

  20. Neural network approaches versus statistical methods in classification of multisource remote sensing data

    NASA Technical Reports Server (NTRS)

    Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.

    1990-01-01

    Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification by means of a method based on Bayesian classification theory is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources. The data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in experiments: Landsat MSS data and three forms of topographic data (elevation, slope, and aspect). Experimental results show that the two approaches have distinct advantages and disadvantages in this classification application.

  1. Quality Control of High-Dose-Rate Brachytherapy: Treatment Delivery Analysis Using Statistical Process Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Able, Charles M., E-mail: cable@wfubmc.edu; Bright, Megan; Frizzell, Bart

    Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.
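    Individual and moving range (I-MR) charts of the kind used here have a standard construction. The sketch below computes the usual three-sigma limits from a baseline series, using the span-2 constants d2 = 1.128 and D4 = 3.267; the variable names are illustrative.

      import numpy as np

      def imr_limits(x):
          # x: baseline measurements, e.g., MOSFET doses from the 16
          # accurately delivered treatments at one detector location
          x = np.asarray(x, dtype=float)
          mr = np.abs(np.diff(x))                  # span-2 moving ranges
          sigma_hat = mr.mean() / 1.128            # d2 for n = 2
          i_chart = (x.mean() - 3 * sigma_hat, x.mean(),
                     x.mean() + 3 * sigma_hat)     # (LCL, CL, UCL)
          mr_chart = (0.0, mr.mean(), 3.267 * mr.mean())   # D4 for n = 2
          return i_chart, mr_chart

      # a new measurement outside the I-chart limits, or a jump beyond the
      # MR-chart upper limit, signals a possible treatment delivery error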

  2. Statistical physics of human beings in games: Controlled experiments

    NASA Astrophysics Data System (ADS)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  3. Six Sigma Quality Management System and Design of Risk-based Statistical Quality Control.

    PubMed

    Westgard, James O; Westgard, Sten A

    2017-03-01

    Six sigma concepts provide a quality management system (QMS) with many useful tools for managing quality in medical laboratories. This Six Sigma QMS is driven by the quality required for the intended use of a test. The most useful form for this quality requirement is the allowable total error. Calculation of a sigma-metric provides the best predictor of risk for an analytical examination process, as well as a design parameter for selecting the statistical quality control (SQC) procedure necessary to detect medically important errors. Simple point estimates of sigma at medical decision concentrations are sufficient for laboratory applications. Copyright © 2016 Elsevier Inc. All rights reserved.
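    The sigma-metric referred to here is conventionally computed from the allowable total error, the observed bias, and the observed imprecision, all expressed as percentages at the medical decision concentration; a brief worked example follows.

      def sigma_metric(tea_pct, bias_pct, cv_pct):
          # sigma = (allowable total error - |bias|) / imprecision (CV)
          return (tea_pct - abs(bias_pct)) / cv_pct

      # example: TEa = 10%, bias = 1.5%, CV = 2.0%  ->  sigma = 4.25;
      # a higher sigma permits simpler, lower-cost SQC rules
      print(sigma_metric(10.0, 1.5, 2.0))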

  4. Common method biases in behavioral research: a critical review of the literature and recommended remedies.

    PubMed

    Podsakoff, Philip M; MacKenzie, Scott B; Lee, Jeong-Yeon; Podsakoff, Nathan P

    2003-10-01

    Interest in the problem of method biases has a long history in the behavioral sciences. Despite this, a comprehensive summary of the potential sources of method biases and how to control for them does not exist. Therefore, the purpose of this article is to examine the extent to which method biases influence behavioral research results, identify potential sources of method biases, discuss the cognitive processes through which method biases influence responses to measures, evaluate the many different procedural and statistical techniques that can be used to control method biases, and provide recommendations for how to select appropriate procedural and statistical remedies for different types of research settings.

  5. Statistical analysis plan for the Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART). A randomized controlled trial

    PubMed Central

    Damiani, Lucas Petri; Berwanger, Otavio; Paisani, Denise; Laranjeira, Ligia Nasi; Suzumura, Erica Aranha; Amato, Marcelo Britto Passos; Carvalho, Carlos Roberto Ribeiro; Cavalcanti, Alexandre Biasi

    2017-01-01

    Background The Alveolar Recruitment for Acute Respiratory Distress Syndrome Trial (ART) is an international multicenter randomized pragmatic controlled trial with allocation concealment involving 120 intensive care units in Brazil, Argentina, Colombia, Italy, Poland, Portugal, Malaysia, Spain, and Uruguay. The primary objective of ART is to determine whether maximum stepwise alveolar recruitment associated with PEEP titration, adjusted according to the static compliance of the respiratory system (ART strategy), is able to increase 28-day survival in patients with acute respiratory distress syndrome compared to conventional treatment (ARDSNet strategy). Objective To describe the data management process and statistical analysis plan. Methods The statistical analysis plan was designed by the trial executive committee and reviewed and approved by the trial steering committee. We provide an overview of the trial design with a special focus on describing the primary (28-day survival) and secondary outcomes. We describe our data management process, data monitoring committee, interim analyses, and sample size calculation. We describe our planned statistical analyses for primary and secondary outcomes as well as pre-specified subgroup analyses. We also provide details for presenting results, including mock tables for baseline characteristics, adherence to the protocol and effect on clinical outcomes. Conclusion According to best trial practice, we report our statistical analysis plan and data management plan prior to locking the database and beginning analyses. We anticipate that this document will prevent analysis bias and enhance the utility of the reported results. Trial registration ClinicalTrials.gov number, NCT01374022. PMID:28977255

  6. Non-Markovian quantum feedback networks II: Controlled flows

    NASA Astrophysics Data System (ADS)

    Gough, John E.

    2017-06-01

    The concept of a controlled flow of a dynamical system, especially when the controlling process feeds information back about the system, is of central importance in control engineering. In this paper, we build on the ideas presented by Bouten and van Handel [Quantum Stochastics and Information: Statistics, Filtering and Control (World Scientific, 2008)] and develop a general theory of quantum feedback. We elucidate the relationship between the controlling processes, Z, and the measured processes, Y, and to this end we make a distinction between what we call the input picture and the output picture. We should note that the input-output relations for the noise fields have additional terms not present in the standard theory but that the relationship between the control processes and measured processes themselves is internally consistent—we do this for the two main cases of quadrature measurement and photon-counting measurement. The theory is general enough to include a modulating filter which post-processes the measurement readout Y before returning to the system. This opens up the prospect of applying very general engineering feedback control techniques to open quantum systems in a systematic manner, and we consider a number of specific modulating filter problems. Finally, we give a brief argument as to why most of the rules for making instantaneous feedback connections [J. Gough and M. R. James, Commun. Math. Phys. 287, 1109 (2009)] ought to apply for controlled dynamical networks as well.

  7. IMPACTS OF ANTIFOAM ADDITIONS AND ARGON BUBBLING ON DEFENSE WASTE PROCESSING FACILITY REDUCTION/OXIDATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jantzen, C.; Johnson, F.

    2012-06-05

    During melting of HLW glass, the REDOX state of the melt pool cannot be measured. Therefore, the Fe2+/ΣFe ratio in the glass poured from the melter must be related to melter feed organic and oxidant concentrations to ensure production of a high quality glass without impacting production rate (e.g., foaming) or melter life (e.g., metal formation and accumulation). A production facility such as the Defense Waste Processing Facility (DWPF) cannot wait until the melt or waste glass has been made to assess its acceptability, since by then no further changes to the glass composition and acceptability are possible. Therefore, the acceptability decision is made on the upstream process rather than on the downstream melt or glass product; that is, it is based on 'feed forward' statistical process control (SPC) rather than statistical quality control (SQC). In SPC, the feed composition to the melter is controlled prior to vitrification. Use of the DWPF REDOX model has controlled the balance of feed reductants and oxidants in the Sludge Receipt and Adjustment Tank (SRAT). Once the alkali/alkaline earth salts (both reduced and oxidized) are formed during reflux in the SRAT, the REDOX can only change if (1) additional reductants or oxidants are added to the SRAT, the Slurry Mix Evaporator (SME), or the Melter Feed Tank (MFT), or (2) the melt pool is bubbled with an oxidizing or sparging gas that imposes a different REDOX target than the chemical balance set during reflux in the SRAT.

  8. Implementation of Lean System on Erbium Doped Fibre Amplifier Manufacturing Process to Reduce Production Time

    NASA Astrophysics Data System (ADS)

    Maneechote, T.; Luangpaiboon, P.

    2010-10-01

    The manufacturing process for erbium-doped fibre amplifiers is complicated. It must meet customers' requirements in a present economic climate where products need to be shipped as soon as possible after purchase orders are received. This research aims to study and improve the processes and production lines for erbium-doped fibre amplifiers using lean manufacturing systems, via an application of computer simulation. Three lean toolbox scenarios were selected via an expert system. In the first, a production schedule based on shipment date is combined with a first-in-first-out control system. The second scenario focuses on a redesigned flow-process plant layout. The third combines the flow-process plant layout with the shipment-date-based production schedule and the first-in-first-out control system. Computer simulation, with limited data summarized via expected values, is used to assess the performance of all scenarios. The best-performing lean scenarios from the simulation were then implemented in the real production process for erbium-doped fibre amplifiers. A comparison was carried out to determine actual performance via an analysis of variance of the response, the production time per unit, achieved in each scenario. The adequacy of the linear statistical model was checked through the experimental errors (residuals) for normality, constant variance, and independence. The results show that a hybrid scenario combining the lean manufacturing system with first-in-first-out control and the flow-process plant layout statistically leads to better performance in terms of the mean and variance of production times.

  9. Vehicular headways on signalized intersections: theory, models, and reality

    NASA Astrophysics Data System (ADS)

    Krbálek, Milan; Šleis, Jiří

    2015-01-01

    We discuss statistical properties of vehicular headways measured at signalized crossroads. On the basis of mathematical approaches, we formulate theoretical and empirically inspired criteria for the acceptability of theoretical headway distributions. The multifarious families of statistical distributions commonly used to fit real-road headway statistics are then confronted with these criteria, and with original empirical time clearances measured between neighboring vehicles leaving signal-controlled crossroads after a green signal appears. Using three different numerical schemes, we demonstrate that the arrangement of vehicles at an intersection is a consequence of the general stochastic nature of queueing systems, rather than a consequence of traffic rules, driver estimation processes, or decision-making procedures.

  10. Statistical Capability Study of a Helical Grinding Machine Producing Screw Rotors

    NASA Astrophysics Data System (ADS)

    Holmes, C. S.; Headley, M.; Hart, P. W.

    2017-08-01

    Screw compressors depend for their efficiency and reliability on the accuracy of the rotors, and therefore on the machinery used in their production. The machinery has evolved over more than half a century in response to customer demands for production accuracy, efficiency, and flexibility, and is now at a high level on all three criteria. Production equipment and processes must be capable of maintaining accuracy over a production run, and this must be assessed statistically under strictly controlled conditions. This paper gives numerical data from such a study of an innovative machine tool and shows that it is possible to meet the demanding statistical capability requirements.

  11. Automatic centring and bonding of lenses

    NASA Astrophysics Data System (ADS)

    Krey, Stefan; Heinisch, J.; Dumitrescu, E.

    2007-05-01

    We present an automatic bonding station that is able to center and bond individual lenses or doublets to a barrel with sub-micron centring accuracy. The complete manufacturing cycle includes glue dispensing and UV curing. During the process, the state of centring is continuously controlled by the vision software, and the final result is recorded to a file for process statistics. Simple pass or fail results are displayed to the operator at the end of the process.

  12. A DMAIC approach for process capability improvement an engine crankshaft manufacturing process

    NASA Astrophysics Data System (ADS)

    Sharma, G. V. S. S.; Rao, P. Srinivasa

    2014-05-01

    The define-measure-analyze-improve-control (DMAIC) approach is a five-phase, scientific approach for reducing deviations and improving the capability levels of manufacturing processes. The present work elaborates on the DMAIC approach applied to reducing process variation in the stub-end-hole boring operation of crankshaft manufacture. This statistical process control study starts with selection of the critical-to-quality (CTQ) characteristic in the define phase. The next phase constitutes the collection of dimensional measurement data for the identified CTQ characteristic. This is followed by the analysis and improvement phases, where various quality control tools such as the Ishikawa diagram, physical mechanism analysis, failure modes and effects analysis, and analysis of variance are applied. Finally, process monitoring charts are deployed at the workplace for regular monitoring and control of the concerned CTQ characteristic. By adopting the DMAIC approach, the standard deviation was reduced from 0.003 to 0.002. The process potential capability index (Cp) improved from 1.29 to 2.02 and the process performance capability index (Cpk) improved from 0.32 to 1.45.
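    For reference, the two indices quoted above follow directly from the process mean and standard deviation against the specification limits; a minimal sketch with illustrative inputs:

      import numpy as np

      def capability(x, lsl, usl):
          # Cp  = (USL - LSL) / (6 * sigma)             : process potential
          # Cpk = min(USL - mu, mu - LSL) / (3 * sigma) : accounts for centering
          mu, sigma = np.mean(x), np.std(x, ddof=1)
          cp = (usl - lsl) / (6.0 * sigma)
          cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
          return cp, cpk

      # with a fixed tolerance band, cutting sigma from 0.003 to 0.002
      # raises Cp by a factor of 1.5, consistent with 1.29 -> ~2.0 above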

  13. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine whether the study program of the industrial processes career at the Technological University of Chihuahua, one year after being certified by CACEI, continues to achieve the established indicators and ISO 9001:2008; quality tools are implemented, the essential indicators to monitor are determined, flow charts are…

  14. 36 CFR 1008.2 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... photograph. Related definitions include: (1) System of records means a group of any records under the control... means records used for personnel management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (4) Statistical records means records in a system of records...

  15. QUARTERLY PROGRESS REPORT NO. 83,

    DTIC Science & Technology

    Topics included are: microwave spectroscopy; radio astronomy; solid-state microwave electronics; optical and infrared spectroscopy; physical electronics and surface physics; physical acoustics; plasma physics; gaseous electronics; plasmas and controlled nuclear fusion; energy conversion research; statistical communication theory; linguistics; cognitive information processing; communications biophysics; neurophysiology; computation research.

  16. Statistics of the Vestibular Input Experienced during Natural Self-Motion: Implications for Neural Processing

    PubMed Central

    Carriot, Jérome; Jamali, Mohsen; Chacron, Maurice J.

    2014-01-01

    It is widely believed that sensory systems are optimized for processing stimuli occurring in the natural environment. However, it remains unknown whether this principle applies to the vestibular system, which contributes to essential brain functions ranging from the most automatic reflexes to spatial perception and motor coordination. Here we quantified, for the first time, the statistics of natural vestibular inputs experienced by freely moving human subjects during typical everyday activities. Although previous studies have found that the power spectra of natural signals across sensory modalities decay as a power law (i.e., as 1/f^α), we found that this did not apply to natural vestibular stimuli. Instead, power decreased slowly at lower and more rapidly at higher frequencies for all motion dimensions. We further establish that this unique stimulus structure is the result of active motion as well as passive biomechanical filtering occurring before any neural processing. Notably, the transition frequency (i.e., the frequency at which power starts to decrease rapidly) was lower when subjects passively experienced sensory stimulation than when they actively controlled stimulation through their own movement. In contrast to signals measured at the head, the spectral content of externally generated (i.e., passive) environmental motion did follow a power law. Thus, transformations caused by both motor control and biomechanics shape the statistics of natural vestibular stimuli before neural processing. We suggest that the unique structure of natural vestibular stimuli will have important consequences on the neural coding strategies used by this essential sensory system to represent self-motion in everyday life. PMID:24920638
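    The power-law comparison made here can be reproduced on any recorded motion signal. A minimal sketch, assuming an inertial-sensor time series and using Welch's method, fits the exponent alpha of a 1/f^alpha spectrum over a chosen frequency band:

      import numpy as np
      from scipy.signal import welch

      def spectral_exponent(signal, fs, fmin=0.5, fmax=10.0):
          # signal: e.g., head angular velocity; fs: sampling rate (Hz)
          f, pxx = welch(signal, fs=fs, nperseg=4096)
          band = (f >= fmin) & (f <= fmax)
          slope, _ = np.polyfit(np.log10(f[band]), np.log10(pxx[band]), 1)
          return -slope  # alpha: a pure power law is a line in log-log space

      # a single alpha describes passive environmental motion well, whereas
      # head-measured stimuli bend at a transition frequency, so one fit
      # over the whole band would be a poor description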

  17. Statistical Learning, Syllable Processing, and Speech Production in Healthy Hearing and Hearing-Impaired Preschool Children: A Mismatch Negativity Study.

    PubMed

    Studer-Eichenberger, Esther; Studer-Eichenberger, Felix; Koenig, Thomas

    2016-01-01

    The objectives of the present study were to investigate temporal/spectral sound-feature processing in preschool children (4 to 7 years old) with peripheral hearing loss compared with age-matched controls. The results verified the presence of statistical learning, which was diminished in children with hearing impairments (HIs), and elucidated possible perceptual mediators of speech production. Perception and production of the syllables /ba/, /da/, /ta/, and /na/ were recorded in 13 children with normal hearing and 13 children with HI. Perception was assessed physiologically through event-related potentials (ERPs) recorded by EEG in a multifeature mismatch negativity paradigm and behaviorally through a discrimination task. Temporal and spectral features of the ERPs during speech perception were analyzed, and speech production was quantitatively evaluated using speech motor maximum performance tasks. Proximal to stimulus onset, children with HI displayed a difference in map topography, indicating diminished statistical learning. In later ERP components, children with HI exhibited reduced amplitudes specifically in the N2 and the early parts of the late discriminative negativity components, which are associated with temporal and spectral control mechanisms. Abnormalities of speech perception were only subtly reflected in speech production, as the lone difference found in speech production studies was a mild delay in regulating speech intensity. In addition to previously reported deficits of sound-feature discrimination, the present study's results reflect diminished statistical learning in children with HI, which plays an early and important, but so far neglected, role in phonological processing. Furthermore, the lack of corresponding behavioral abnormalities in speech production implies that impaired perceptual capacities do not necessarily translate into productive deficits.

  18. Cued Memory Reactivation During SWS Abolishes the Beneficial Effect of Sleep on Abstraction.

    PubMed

    Hennies, Nora; Lambon Ralph, Matthew A; Durrant, Simon J; Cousins, James N; Lewis, Penelope A

    2017-08-01

    Extracting regularities from stimuli in our environment and generalizing these to new situations are fundamental processes in human cognition. Sleep has been shown to enhance these processes, possibly by facilitating reactivation-triggered memory reorganization. Here, we assessed whether cued reactivation during slow wave sleep (SWS) promotes the beneficial effect of sleep on abstraction of statistical regularities. We used an auditory statistical learning task, in which the benefit of sleep has been firmly established. Participants were exposed to a probabilistically determined sequence of tones and subsequently tested for recognition of novel short sequences adhering to this same statistical pattern in both immediate and delayed recall sessions. In different groups, the exposure stream was replayed during SWS in the night between the recall sessions (SWS-replay group), in wake just before sleep (presleep replay group), or not at all (control group). Surprisingly, participants who received replay in sleep performed worse in the delayed recall session than the control and the presleep replay group. They also failed to show the association between SWS and task performance that has been observed in previous studies and was present in the controls. Importantly, sleep structure and sleep quality did not differ between groups, suggesting that replay during SWS did not impair sleep but rather disrupted or interfered with sleep-dependent mechanisms that underlie the extraction of the statistical pattern. These findings raise important questions about the scope of cued memory reactivation and the mechanisms that underlie sleep-related generalization. © Sleep Research Society 2017. Published by Oxford University Press on behalf of the Sleep Research Society. All rights reserved. For permissions, please e-mail journals.permissions@oup.com.

  19. Hemispheric processing of vocal emblem sounds.

    PubMed

    Neumann-Werth, Yael; Levy, Erika S; Obler, Loraine K

    2013-01-01

    Vocal emblems, such as shh and brr, are speech sounds that have linguistic and nonlinguistic features; thus, it is unclear how they are processed in the brain. Five adult dextral individuals with left-brain damage and moderate-severe Wernicke's aphasia, five adult dextral individuals with right-brain damage, and five Controls participated in two tasks: (1) matching vocal emblems to photographs ('picture task') and (2) matching vocal emblems to verbal translations ('phrase task'). Cross-group statistical analyses on items on which the Controls performed at ceiling revealed lower accuracy by the group with left-brain damage (than by Controls) on both tasks, and lower accuracy by the group with right-brain damage (than by Controls) on the picture task. Additionally, the group with left-brain damage performed significantly less accurately than the group with right-brain damage on the phrase task only. Findings suggest that comprehension of vocal emblems recruits more left- than right-hemisphere processing.

  20. Control of spectral transmission enhancement properties of random anti-reflecting surface structures fabricated using gold masking

    NASA Astrophysics Data System (ADS)

    Peltier, Abigail; Sapkota, Gopal; Potter, Matthew; Busse, Lynda E.; Frantz, Jesse A.; Shaw, L. Brandon; Sanghera, Jasbinder S.; Aggarwal, Ishwar D.; Poutous, Menelaos K.

    2017-02-01

    Random anti-reflecting subwavelength surface structures (rARSS) have been shown to suppress Fresnel reflection and scatter from optical surfaces. The structures effectively function as a gradient-refractive-index layer at the substrate boundary, and the spectral transmission properties of the boundary have been shown to depend on the structures' statistical properties (diameter, height, and density). We fabricated rARSS on fused silica substrates using gold masking. A thin layer of gold was deposited on the surface of the substrate and then subjected to a rapid thermal annealing (RTA) process at various temperatures. This RTA process resulted in the formation of gold "islands" on the surface of the substrate, which then acted as a mask while the substrate was dry etched in a reactive ion etching (RIE) process. The plasma etch yielded a fused silica surface covered with randomly arranged "rods" that act as the anti-reflective layer. We present data relating the physical characteristics of the gold "island" statistical populations and the resulting rARSS "rod" population, as well as optical scattering losses and spectral transmission properties of the final surfaces. We focus on comparing results between samples processed at different RTA temperatures, as well as samples fabricated without undergoing RTA, to relate fabrication process statistics to transmission enhancement values.

  1. How weak values emerge in joint measurements on cloned quantum systems.

    PubMed

    Hofmann, Holger F

    2012-07-13

    A statistical analysis of optimal universal cloning shows that it is possible to identify an ideal (but nonpositive) copying process that faithfully maps all properties of the original Hilbert space onto two separate quantum systems, resulting in perfect correlations for all observables. The joint probabilities for noncommuting measurements on separate clones then correspond to the real parts of the complex joint probabilities observed in weak measurements on a single system, where the measurements on the two clones replace the corresponding sequence of weak measurement and postselection. The imaginary parts of weak measurement statistics can be obtained by replacing the cloning process with a partial swap operation. A controlled-swap operation combines both processes, making the complete weak measurement statistics accessible as a well-defined contribution to the joint probabilities of fully resolved projective measurements on the two output systems.

  2. 40 CFR 51.369 - Improving repair effectiveness.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... technical questions that arise in the repair process, and answer questions related to the legal requirements of State and Federal law with regard to emission control device tampering, engine switching, or... vehicles for retest. Performance monitoring shall include statistics on the number of vehicles submitted...

  3. 25 CFR 700.257 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... under the control of the Commission from which information is retrieved by the name of the individual or... Commission by the Commission and used for personnel management programs or processes such as staffing, employee development, retirement, and grievances and appeals. (h) Statistical records. As used in this...

  4. Small Steps, Big Reward: Quality Improvement through Pilot Groups.

    ERIC Educational Resources Information Center

    Bindl, Jim; Schuler, Jim

    1988-01-01

    Because of a need for quality improvement, Wisconsin Power and Light trained two six-person pilot groups in statistical process control, had them apply that knowledge to actual problems, and showed management the dollars-and-cents savings that come from quality improvement. (JOW)

  5. Application of statistical process control to qualitative molecular diagnostic assays.

    PubMed

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed reliable systems for statistical process control (SPC). Such systems are absent from the majority of molecular laboratories and, where present, are confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve the problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies, and detection of small deviations from an expected value, require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to illustrate how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data.
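    One simple way to realize the frequency-plus-confidence-interval idea is to test a window of recent qualitative results against binomial limits around the expected mutation frequency. The sketch below is an illustration of that general approach, not the authors' exact procedure.

      from scipy.stats import binom

      def out_of_control(positives, n, expected_freq, alpha=0.01):
          # positives: mutation-positive results among the last n samples
          # expected_freq: long-run positivity rate expected for the assay
          lo = binom.ppf(alpha / 2.0, n, expected_freq)
          hi = binom.ppf(1.0 - alpha / 2.0, n, expected_freq)
          return not (lo <= positives <= hi)

      # low expected frequencies and small target deviations need large n,
      # hence the protracted time to detection noted in the abstract
      print(out_of_control(positives=2, n=100, expected_freq=0.15))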

  6. Using statistical process control methods to trace small changes in perinatal mortality after a training program in a low-resource setting.

    PubMed

    Mduma, Estomih R; Ersdal, Hege; Kvaloy, Jan Terje; Svensen, Erling; Mdoe, Paschal; Perlman, Jeffrey; Kidanto, Hussein Lessio; Soreide, Eldar

    2018-05-01

    To trace and document smaller changes in perinatal survival over time. Prospective observational study, with retrospective analysis. Labor ward and operating theater at Haydom Lutheran Hospital in rural north-central Tanzania. All women giving birth and birth attendants. Helping Babies Breathe (HBB) simulation training on newborn care and resuscitation and some other efforts to improve perinatal outcome. Perinatal survival, including fresh stillbirths and early (24-h) newborn survival. The variable life-adjusted plot and cumulative sum chart revealed a steady improvement in survival over time, after the baseline period. There were some variations throughout the study period, and some of these could be linked to different interventions and events. To our knowledge, this is the first time statistical process control methods have been used to document changes in perinatal mortality over time in a rural Sub-Saharan hospital, showing a steady increase in survival. These methods can be utilized to continuously monitor and describe changes in patient outcomes.
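    A cumulative sum of observed-minus-expected deaths is one standard way to trace such small changes; the variable life-adjusted plot is essentially its inverted, risk-adjusted counterpart. A minimal sketch with illustrative names:

      import numpy as np

      def cusum_observed_minus_expected(outcomes, p0):
          # outcomes: sequence of 0/1 values (1 = perinatal death)
          # p0: baseline mortality proportion from the pre-training period
          return np.cumsum(np.asarray(outcomes, dtype=float) - p0)

      # curve = cusum_observed_minus_expected(deaths, p0=0.03)
      # plotted against consecutive births, a sustained downward slope
      # after an intervention indicates improved survival versus baseline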

  7. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has developed over several decades to ensure patient safety, expanding from a statistical quality control focus on the analytical phase to the total testing process. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million measure and the sigma-metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Our data show improvement in sigma-scale performance. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Improving Emergency Department Door to Doctor Time and Process Reliability

    PubMed Central

    El Sayed, Mazen J.; El-Eid, Ghada R.; Saliba, Miriam; Jabbour, Rima; Hitti, Eveline A.

    2015-01-01

    The aim of this study is to determine the effectiveness of using lean management methods on improving emergency department door to doctor times at a tertiary care hospital. We performed a before and after study at an academic urban emergency department with 49,000 annual visits after implementing a series of lean driven interventions over a 20 month period. The primary outcome was mean door to doctor time and the secondary outcome was length of stay of both admitted and discharged patients. A convenience sample from the preintervention phase (February 2012) was compared to another from the postintervention phase (mid-October to mid-November 2013). Individual control charts were used to assess process stability. Postintervention there was a statistically significant decrease in the mean door to doctor time measure (40.0 minutes ± 53.44 vs 25.3 minutes ± 15.93 P < 0.001). The postintervention process was more statistically in control with a drop in the upper control limits from 148.8 to 72.9 minutes. Length of stay of both admitted and discharged patients dropped from 2.6 to 2.0 hours and 9.0 to 5.5 hours, respectively. All other variables including emergency department visit daily volumes, hospital occupancy, and left without being seen rates were comparable. Using lean change management techniques can be effective in reducing door to doctor time in the Emergency Department and improving process reliability. PMID:26496278

  9. Symptom-specific amygdala hyperactivity modulates motor control network in conversion disorder.

    PubMed

    Hassa, Thomas; Sebastian, Alexandra; Liepert, Joachim; Weiller, Cornelius; Schmidt, Roger; Tüscher, Oliver

    2017-01-01

    Initial historical accounts as well as recent data suggest that emotion processing is dysfunctional in conversion disorder patients and that this alteration may be the pathomechanistic neurocognitive basis for symptoms in conversion disorder. However, to date evidence of direct interaction of altered negative emotion processing with motor control networks in conversion disorder is still lacking. To specifically study the neural correlates of emotion processing interacting with motor networks we used a task combining emotional and sensorimotor stimuli both separately as well as simultaneously during functional magnetic resonance imaging in a well characterized group of 13 conversion disorder patients with functional hemiparesis and 19 demographically matched healthy controls. We performed voxelwise statistical parametrical mapping for a priori regions of interest within emotion processing and motor control networks. Psychophysiological interaction (PPI) was used to test altered functional connectivity of emotion and motor control networks. Only during simultaneous emotional stimulation and passive movement of the affected hand patients displayed left amygdala hyperactivity. PPI revealed increased functional connectivity in patients between the left amygdala and the (pre-)supplemental motor area and the subthalamic nucleus, key regions within the motor control network. These findings suggest a novel mechanistic direct link between dysregulated emotion processing and motor control circuitry in conversion disorder.

  10. Network Controllability in the Inferior Frontal Gyrus Relates to Controlled Language Variability and Susceptibility to TMS.

    PubMed

    Medaglia, John D; Harvey, Denise Y; White, Nicole; Kelkar, Apoorva; Zimmerman, Jared; Bassett, Danielle S; Hamilton, Roy H

    2018-06-08

    In language production, humans are confronted with considerable word selection demands. Often, we must select a word from among similar, acceptable, and competing alternative words in order to construct a sentence that conveys an intended meaning. In recent years, the left inferior frontal gyrus (LIFG) has been identified as critical to this ability. Despite a recent emphasis on network approaches to understanding language, how the LIFG interacts with the brain's complex networks to facilitate controlled language performance remains unknown. Here, we take a novel approach to understand word selection as a network control process in the brain. Using an anatomical brain network derived from high-resolution diffusion spectrum imaging (DSI), we computed network controllability underlying the site of transcranial magnetic stimulation in the LIFG between administrations of language tasks that vary in response (cognitive control) demands: open-response (word generation) vs. closed-response (number naming) tasks. We find that a statistic that quantifies the LIFG's theoretically predicted control of communication across modules in the human connectome explains TMS-induced changes in open-response language task performance only. Moreover, we find that a statistic that quantifies the LIFG's theoretically predicted control of difficult-to-reach states explains vulnerability to TMS in the closed-ended (but not open-ended) response task. These findings establish a link between network controllability, cognitive function, and TMS effects. SIGNIFICANCE STATEMENT This work illustrates that network control statistics applied to anatomical connectivity data demonstrate relationships with cognitive variability during controlled language tasks and TMS effects. Copyright © 2018 the authors.

  11. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming's many instructional visits to Japan. He wrote the Guide to Quality Control, which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes how job data are essential for making a proper evaluation (Ishikawa, p. 14). The gathering of data and its subsequent analysis are the foundation of

  12. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process.

    PubMed

    Moran, John L; Solomon, Patricia J

    2013-05-24

    Statistical process control (SPC), an initiative originating in industry, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with increased false alarm frequency. Monthly mean raw mortality (at hospital discharge) time series, 1995-2009, at the individual intensive care unit (ICU) level, were generated from the Australian and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was sought using (partial) autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) "in-control" status was assessed using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Time-series methods were applied to an exemplar complete ICU series (1995 to end-2009) via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247), respectively. For the raw mortality series, 71 sites had continuous data for assessment up to or beyond lag 40, and 35% had autocorrelation through to lag 40; of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to the RA-EWMA control limits; a seasonal ARMA model with GARCH effects displayed white-noise residuals that were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to the RA-EWMA control limits; a time series approach using residual control charts resolved these issues.
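    The EWMA machinery underlying these charts is compact. The sketch below computes the EWMA statistic and its exact time-varying three-sigma limits for a monitored proportion series; in the risk-adjusted variant used in the study, the fixed target would be replaced by a per-month expected mortality from the regression model.

      import numpy as np

      def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
          # x: observed monthly mortality proportions; target: in-control
          # level; sigma: estimated standard deviation of the observations
          x = np.asarray(x, dtype=float)
          z = np.empty_like(x)
          z[0] = lam * x[0] + (1.0 - lam) * target
          for t in range(1, len(x)):
              z[t] = lam * x[t] + (1.0 - lam) * z[t - 1]
          n = np.arange(1, len(x) + 1)
          hw = L * sigma * np.sqrt(lam / (2.0 - lam)
                                   * (1.0 - (1.0 - lam) ** (2 * n)))
          return z, target - hw, target + hw

      # z, lcl, ucl = ewma_chart(monthly_mortality, target=0.14, sigma=0.02)
      # points outside (lcl, ucl) signal a shift; but, as the study shows,
      # autocorrelated and seasonal series can signal falsely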

  13. Determination of dominant biogeochemical processes in a contaminated aquifer-wetland system using multivariate statistical analysis

    USGS Publications Warehouse

    Baez-Cazull, S. E.; McGuire, J.T.; Cozzarelli, I.M.; Voytek, M.A.

    2008-01-01

    Determining the processes governing aqueous biogeochemistry in a wetland hydrologically linked to an underlying contaminated aquifer is challenging due to the complex exchange between the systems and their distinct responses to changes in precipitation, recharge, and biological activities. To evaluate temporal and spatial processes in the wetland-aquifer system, water samples were collected using cm-scale multichambered passive diffusion samplers (peepers) to span the wetland-aquifer interface over a period of 3 yr. Samples were analyzed for major cations and anions, methane, and a suite of organic acids resulting in a large dataset of over 8000 points, which was evaluated using multivariate statistics. Principal component analysis (PCA) was chosen with the purpose of exploring the sources of variation in the dataset to expose related variables and provide insight into the biogeochemical processes that control the water chemistry of the system. Factor scores computed from PCA were mapped by date and depth. Patterns observed suggest that (i) fermentation is the process controlling the greatest variability in the dataset and it peaks in May; (ii) iron and sulfate reduction were the dominant terminal electron-accepting processes in the system and were associated with fermentation but had more complex seasonal variability than fermentation; (iii) methanogenesis was also important and associated with bacterial utilization of minerals as a source of electron acceptors (e.g., barite BaSO4); and (iv) seasonal hydrological patterns (wet and dry periods) control the availability of electron acceptors through the reoxidation of reduced iron-sulfur species enhancing iron and sulfate reduction. Copyright © 2008 by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America. All rights reserved.
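
    A minimal sketch of the PCA step described above, on synthetic stand-in data (the sample and solute counts are assumptions): factor scores like those mapped by date and depth are the projections of the standardized data onto the principal axes.

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 8))       # stand-in for 8 solutes measured in 200 peeper chambers

# Standardize, then PCA via SVD of the standardized data matrix
Xs = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)
U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
explained = s**2 / np.sum(s**2)     # fraction of variance per component
scores = Xs @ Vt.T                  # factor scores; these could be mapped by date and depth
loadings = Vt.T * s / np.sqrt(len(X) - 1)   # variable loadings on each component

print("variance explained by PC1-PC3:", explained[:3].round(3))
```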

  14. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to the characterization of other production processes to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. © 2016 American Institute of Chemical Engineers.
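
    The Bayesian predictive idea above can be sketched as follows: given posterior draws of a response model linking CPPs to a CQA, estimate for each candidate operating point the predictive probability that the attribute meets specification, and define the PAR as the region where that probability is high. The model form, coefficients, and limits below are hypothetical stand-ins, not the study's fitted model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical posterior draws for a linear model of one CQA vs. two CPPs:
# y = b0 + b1*temp + b2*pH + noise. In practice these would come from an MCMC fit.
b = rng.multivariate_normal([95.0, -0.8, 1.5], np.diag([0.5, 0.05, 0.1]) ** 2, size=4000)
sigma = np.abs(rng.normal(1.0, 0.1, size=4000))

def prob_acceptance(temp, ph, lo=90.0, hi=110.0):
    """Posterior predictive probability that the CQA falls within (lo, hi)."""
    y = b[:, 0] + b[:, 1] * temp + b[:, 2] * ph + rng.normal(0, sigma)
    return np.mean((y > lo) & (y < hi))

# Scan a grid of operating conditions; the PAR is where the probability stays high
for temp in (28, 30, 32):
    for ph in (5.0, 5.5, 6.0):
        print(temp, ph, round(prob_acceptance(temp, ph), 3))
```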

  15. Statistical Analysis of the Processes Controlling Choline and Ethanolamine Glycerophospholipid Molecular Species Composition

    PubMed Central

    Kiebish, Michael A.; Yang, Kui; Han, Xianlin; Gross, Richard W.; Chuang, Jeffrey

    2012-01-01

    The regulation and maintenance of the cellular lipidome through biosynthetic, remodeling, and catabolic mechanisms are critical for biological homeostasis during development, health and disease. These complex mechanisms control the architectures of lipid molecular species, which have diverse yet highly regulated fatty acid chains at both the sn1 and sn2 positions. Phosphatidylcholine (PC) and phosphatidylethanolamine (PE) serve as the predominant biophysical scaffolds in membranes, acting as reservoirs for potent lipid signals and regulating numerous enzymatic processes. Here we report the first rigorous computational dissection of the mechanisms influencing PC and PE molecular architectures from high-throughput shotgun lipidomic data. Using novel statistical approaches, we have analyzed multidimensional mass spectrometry-based shotgun lipidomic data from developmental mouse heart and mature mouse heart, lung, brain, and liver tissues. We show that in PC and PE, sn1 and sn2 positions are largely independent, though for low abundance species regulatory processes may interact with both the sn1 and sn2 chain simultaneously, leading to cooperative effects. Chains with similar biochemical properties appear to be remodeled similarly. We also see that sn2 positions are more regulated than sn1, and that PC exhibits stronger cooperative effects than PE. A key aspect of our work is a novel statistically rigorous approach to determine cooperativity based on a modified Fisher's exact test using Markov Chain Monte Carlo sampling. This computational approach provides a novel tool for developing mechanistic insight into lipidomic regulation. PMID:22662143
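
    The paper's cooperativity analysis is a modified Fisher's exact test with Markov Chain Monte Carlo sampling; the sketch below substitutes a simpler Monte Carlo permutation test of sn1/sn2 independence on hypothetical species lists, which tests the same null hypothesis but is not the authors' exact procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical lipid species list: each molecule has an sn1 and an sn2 chain
sn1 = rng.choice(["16:0", "18:0", "18:1"], size=500, p=[0.5, 0.3, 0.2])
sn2 = rng.choice(["18:1", "20:4", "22:6"], size=500, p=[0.4, 0.4, 0.2])

def chi2_stat(a, b):
    """Pearson chi-square statistic of the contingency table of a vs. b."""
    cats_a, cats_b = np.unique(a), np.unique(b)
    table = np.zeros((len(cats_a), len(cats_b)))
    for i, ca in enumerate(cats_a):
        for j, cb in enumerate(cats_b):
            table[i, j] = np.sum((a == ca) & (b == cb))
    expected = np.outer(table.sum(1), table.sum(0)) / table.sum()
    return np.sum((table - expected) ** 2 / expected)

# Null distribution: shuffle the sn2 assignments, breaking any sn1/sn2 coupling
observed = chi2_stat(sn1, sn2)
null = np.array([chi2_stat(sn1, rng.permutation(sn2)) for _ in range(2000)])
print("Monte Carlo p-value for sn1/sn2 dependence:", np.mean(null >= observed))
```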

  16. A Simple Approach for Monitoring Business Service Time Variation

    PubMed Central

    2014-01-01

    Control charts are effective tools for signal detection in both manufacturing processes and service processes. Much of the data in service industries comes from processes having nonnormal or unknown distributions. The commonly used Shewhart variable control charts, which depend heavily on the normality assumption, are therefore not appropriate here. In this paper, we propose a new asymmetric EWMA variance chart (EWMA-AV chart) and an asymmetric EWMA mean chart (EWMA-AM chart) based on two simple statistics to monitor process variance and mean shifts simultaneously. Further, we explore the sampling properties of the new monitoring statistics and calculate the average run lengths when using both the EWMA-AV chart and the EWMA-AM chart. The performance of the EWMA-AV and EWMA-AM charts and that of some existing variance and mean charts are compared. A numerical example involving nonnormal service times from the service system of a bank branch in Taiwan is used to illustrate the applications of the EWMA-AV and EWMA-AM charts and to compare them with the existing variance (or standard deviation) and mean charts. The proposed EWMA-AV and EWMA-AM charts show superior detection performance compared to the existing variance and mean charts and are thus recommended. PMID:24895647

  17. A simple approach for monitoring business service time variation.

    PubMed

    Yang, Su-Fen; Arnold, Barry C

    2014-01-01

    Control charts are effective tools for signal detection in both manufacturing processes and service processes. Much of the data in service industries comes from processes having nonnormal or unknown distributions. The commonly used Shewhart variable control charts, which depend heavily on the normality assumption, are therefore not appropriate here. In this paper, we propose a new asymmetric EWMA variance chart (EWMA-AV chart) and an asymmetric EWMA mean chart (EWMA-AM chart) based on two simple statistics to monitor process variance and mean shifts simultaneously. Further, we explore the sampling properties of the new monitoring statistics and calculate the average run lengths when using both the EWMA-AV chart and the EWMA-AM chart. The performance of the EWMA-AV and EWMA-AM charts and that of some existing variance and mean charts are compared. A numerical example involving nonnormal service times from the service system of a bank branch in Taiwan is used to illustrate the applications of the EWMA-AV and EWMA-AM charts and to compare them with the existing variance (or standard deviation) and mean charts. The proposed EWMA-AV and EWMA-AM charts show superior detection performance compared to the existing variance and mean charts and are thus recommended.

  18. Long-term health experience of jet engine manufacturing workers: VIII. glioblastoma incidence in relation to workplace experiences with parts and processes.

    PubMed

    Marsh, Gary M; Youk, Ada O; Buchanich, Jeanine M; Downing, Sarah; Kennedy, Kathleen J; Esmen, Nurtan A; Hancock, Roger P; Lacey, Steven E; Pierce, Jennifer S; Fleissner, Mary Lou

    2013-06-01

    To determine whether glioblastoma (GB) incidence rates among jet engine manufacturing workers were associated with workplace experiences with specific parts produced and processes performed. Subjects were 210,784 workers employed between 1952 and 2001. We conducted nested case-control and cohort incidence studies with focus on 277 GB cases. We estimated time experienced with 16 part families, 4 process categories, and 32 concurrent part-process combinations with 20 or more GB cases. In both the cohort and case-control studies, none of the part families, process categories, or part-process combinations considered was associated with increased GB risk. If not due to chance alone, the statistically nonsignificant elevation of GB rates at the North Haven plant may reflect external occupational factors or nonoccupational factors unmeasured in the current evaluation.

  19. CHAM: weak signals detection through a new multivariate algorithm for process control

    NASA Astrophysics Data System (ADS)

    Bergeret, François; Soual, Carole; Le Gratiet, B.

    2016-10-01

    Derivative technologies based on core CMOS processes are aggressive in terms of design rules and process control requirements. The process control plan is derived from Process Assumption (PA) calculations, which yield design rules based on known process variability capabilities, with enough margin to be safe not only for yield but especially for reliability. Even though process assumptions are calculated with a 4-sigma margin on known process capability, efficient and competitive designs challenge the process, especially for derivative technologies at the 40 and 28 nm nodes. For wafer fab process control, PAs are translated into univariate control charts (layer1 CD, layer2 CD, layer2-to-layer1 overlay, layer3 CD, etc.) with appropriate specifications and control limits, which together secure the silicon. This has worked well so far, but such a system is not really sensitive to weak signals coming from interactions of multiple key parameters (e.g., high layer2 CD combined with high layer3 CD). CHAM is a software package using an advanced statistical algorithm specifically designed to detect small signals, especially when there are many parameters to control and when the parameters can interact to create yield issues. In this presentation we first present the CHAM algorithm, then the case study on critical dimensions with its results, and we conclude with future work. This partnership between Ippon and STM is part of E450LMDAP, a European project dedicated to metrology and lithography development for future technology nodes, especially 10 nm.

  20. 21 CFR 820.200 - Servicing.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... reports with appropriate statistical methodology in accordance with § 820.100. (c) Each manufacturer who... chapter shall automatically consider the report a complaint and shall process it in accordance with the... device serviced; (2) Any device identification(s) and control number(s) used; (3) The date of service; (4...

  1. 75 FR 2488 - Mid-Atlantic Fishery Management Council; Public Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-15

    ... Mid-Atlantic Fishery Management Council's (MAFMC) Scientific and Statistical Committee (SSC) will hold... include new member orientation (overview of Council process and role of the SSC), review and adoption of SSC Standard Operating Practices and Procedures, ABC Control Rule Framework and Council Risk Policy...

  2. The Role of Measurement Error in Familiar Statistics

    DTIC Science & Technology

    2006-06-01

    Manual of Mental Disorders (3rd ed., revised; American Psychiatric Association, 1987) for the clinical conditions of anorexia nervosa and borderline... borderline personality disorder, anorexia nervosa, and a control group. Family Process, 39, 345-358. Hunter, J. E., & Schmidt, F. L. (1976). A critical

  3. TU-FG-201-05: Varian MPC as a Statistical Process Control Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Carver, A; Rowbottom, C

    Purpose: Quality assurance in radiotherapy requires the measurement of various machine parameters to ensure they remain within permitted values over time. In Truebeam release 2.0 the Machine Performance Check (MPC) was released, allowing beam output and machine axis movements to be assessed in a single test. We aim to evaluate the Varian Machine Performance Check (MPC) as a tool for Statistical Process Control (SPC). Methods: Varian's MPC tool was used on three Truebeam and one EDGE linac for a period of approximately one year. MPC was commissioned against independent systems. After this period the data were reviewed to determine whether or not the MPC was useful as a process control tool. Individual tests were analysed using Shewhart control charts, with Matlab used for the analysis. Principal component analysis was used to determine if a multivariate model was of any benefit in analysing the data. Results: Control charts were found to be useful to detect beam output changes, worn T-nuts and jaw calibration issues. Upper and lower control limits were defined at the 95% level. Multivariate SPC was performed using principal component analysis. We found little evidence of clustering beyond that which might be naively expected, such as beam uniformity and beam output. Whilst this makes multivariate analysis of little use, it suggests that each test is giving independent information. Conclusion: The variety of independent parameters tested in MPC makes it a sensitive tool for routine machine QA. We have determined that using control charts in our QA programme would rapidly detect changes in machine performance. The use of control charts allows large quantities of tests to be performed on all linacs without visual inspection of all results. The use of control limits alerts users when data are inconsistent with previous measurements before they become out of specification. A. Carver has received a speaker's honorarium from Varian.
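
    A minimal individuals (Shewhart) chart of the kind used above to track MPC beam output, on synthetic data. The moving-range estimate of sigma and the 3-sigma limits are the textbook construction; all numbers are made up.

```python
import numpy as np

rng = np.random.default_rng(5)
output = rng.normal(100.0, 0.4, size=250)   # synthetic daily beam-output readings (%)

# Individuals chart: sigma estimated from the average moving range (d2 = 1.128 for n = 2)
mr = np.abs(np.diff(output))
sigma_hat = mr.mean() / 1.128
center = output.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

violations = np.where((output > ucl) | (output < lcl))[0]
print(f"center={center:.2f}, UCL={ucl:.2f}, LCL={lcl:.2f}, violations={violations}")
```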

  4. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls

    PubMed Central

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique LCT; Heeren, Ron MA; Sillevis Smitt, Peter A; Luider, Theo M

    2006-01-01

    Background Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integration of the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein IDs can be added to peptide masses of interest. Results A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The modular software provides central storage of mass spectra and fast analysis. The software architecture consists of 4 pillars: 1) a graphical user interface written in Java; 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes; 3) an FTP (File Transfer Protocol) server to store all raw mass spectrometry files and processed data; and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients in end-stage disease, can be distinguished from those of control groups. Conclusion The database application is able to distinguish patient matrix-assisted laser desorption ionization (MALDI-TOF) peptide profiles from control groups using large datasets. The modular architecture of the application makes it possible to adapt the application to also handle large-sized data from MS/MS and Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry experiments. It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry prevents the clustering of peaks of different peptides and allows the identification of differentially expressed proteins from the peptide profiles. PMID:16953879

  5. A database application for pre-processing, storage and comparison of mass spectra derived from patients and controls.

    PubMed

    Titulaer, Mark K; Siccama, Ivar; Dekker, Lennard J; van Rijswijk, Angelique L C T; Heeren, Ron M A; Sillevis Smitt, Peter A; Luider, Theo M

    2006-09-05

    Statistical comparison of peptide profiles in biomarker discovery requires fast, user-friendly software for high-throughput data analysis. Important features are flexibility in changing input variables and statistical analysis of peptides that are differentially expressed between patient and control groups. In addition, integration of the mass spectrometry data with the results of other experiments, such as microarray analysis, and with information from other databases requires central storage of the profile matrix, where protein IDs can be added to peptide masses of interest. A new database application is presented to detect and identify significantly differentially expressed peptides in peptide profiles obtained from body fluids of patient and control groups. The modular software provides central storage of mass spectra and fast analysis. The software architecture consists of 4 pillars: 1) a graphical user interface written in Java; 2) a MySQL database, which contains all metadata, such as experiment numbers and sample codes; 3) an FTP (File Transfer Protocol) server to store all raw mass spectrometry files and processed data; and 4) the software package R, which is used for modular statistical calculations, such as the Wilcoxon-Mann-Whitney rank sum test. Statistical analysis by the Wilcoxon-Mann-Whitney test in R demonstrates that peptide profiles of two patient groups, 1) breast cancer patients with leptomeningeal metastases and 2) prostate cancer patients in end-stage disease, can be distinguished from those of control groups. The database application is able to distinguish patient matrix-assisted laser desorption ionization (MALDI-TOF) peptide profiles from control groups using large datasets. The modular architecture of the application makes it possible to adapt the application to also handle large-sized data from MS/MS and Fourier transform ion cyclotron resonance (FT-ICR) mass spectrometry experiments. It is expected that the higher resolution and mass accuracy of FT-ICR mass spectrometry prevents the clustering of peaks of different peptides and allows the identification of differentially expressed proteins from the peptide profiles.
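
    The study runs the Wilcoxon-Mann-Whitney test in R; an equivalent sketch in Python on synthetic intensities (group sizes and distributions are assumptions) looks like this:

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(6)
# Hypothetical intensities of one peptide peak in patients vs. controls
patients = rng.lognormal(mean=2.0, sigma=0.5, size=30)
controls = rng.lognormal(mean=1.6, sigma=0.5, size=30)

stat, p = mannwhitneyu(patients, controls, alternative="two-sided")
print(f"U={stat:.1f}, p={p:.4g}")
# In a profiling pipeline this test runs once per peak, so the p-values need
# multiple-testing correction (e.g., Benjamini-Hochberg) across all peptides.
```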

  6. Systematic review of the use of Statistical Process Control methods to measure the success of pressure ulcer prevention.

    PubMed

    Clark, Michael; Young, Trudie; Fallon, Maureen

    2018-06-01

    Successful prevention of pressure ulcers is the end product of a complex series of care processes including, but not limited to, the assessment of vulnerability to pressure damage; skin assessment and care; nutritional support; repositioning; and the use of beds, mattresses, and cushions to manage mechanical loads on the skin and soft tissues. The purpose of this review was to examine where and how Statistical Process Control (SPC) measures have been used to assess the success of quality improvement initiatives intended to improve pressure ulcer prevention. A search of 7 electronic bibliographic databases was performed on May 17th, 2017, for studies that met the inclusion criteria. SPC methods have been reported in 9 publications since 2010 to interpret changes in the incidence of pressure ulcers over time. While these methods offer more rapid interpretation of changes in incidence than a comparison of 2 arbitrarily selected time points pre- and post-implementation of change, more work is required to ensure that the clinical and scientific communities adopt the most appropriate SPC methods. © 2018 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  7. Proton magnetic resonance spectroscopy in adult cancer patients with delirium

    PubMed Central

    Yager, Jeffrey R.; Magnotta, Vincent A.; Mills, James A.; Vik, Stacie M.; Weckmann, Michelle T.; Capizzano, Aristides A.; Gingrich, Roger; Beglinger, Leigh J.

    2010-01-01

    Background Delirium is associated with a host of negative outcomes, including increased risk of mortality, longer hospital stay, and poor long-term cognitive function. The pathophysiology of delirium is not well understood. Cancer patients undergoing a bone marrow transplant (BMT) are at high risk for developing delirium, and Proton Magnetic Resonance Spectroscopy (1H-MRS) could lead to better understanding of the delirium process. Methods Fourteen BMT patients and 10 controls completed 1H-MRS, positioned above the corpus callosum, shortly after delirium onset or at study end if no delirium occurred. Results The BMT-delirium group showed significantly elevated tCho/tCr compared with the BMT-no delirium group (p<0.05). The BMT-delirium group also showed significantly lower NAA/tCho compared to both controls (p=0.01) and the BMT-no delirium group (p=0.04). Conclusions Elevated choline and reduced NAA indicate inflammatory processes and white matter damage as well as neuronal metabolic impairment. Further research is needed to separate the choline peaks, as well as more detailed collection of medication regimens, to determine whether a higher choline concentration is a function of the delirium process or of cancer treatment effects. PMID:21227658

  8. Aging Affects Adaptation to Sound-Level Statistics in Human Auditory Cortex.

    PubMed

    Herrmann, Björn; Maess, Burkhard; Johnsrude, Ingrid S

    2018-02-21

    Optimal perception requires efficient and adaptive neural processing of sensory input. Neurons in nonhuman mammals adapt to the statistical properties of acoustic feature distributions such that they become sensitive to sounds that are most likely to occur in the environment. However, whether human auditory responses adapt to stimulus statistical distributions and how aging affects adaptation to stimulus statistics is unknown. We used MEG to study how exposure to different distributions of sound levels affects adaptation in auditory cortex of younger (mean: 25 years; n = 19) and older (mean: 64 years; n = 20) adults (male and female). Participants passively listened to two sound-level distributions with different modes (either 15 or 45 dB sensation level). In a control block with long interstimulus intervals, allowing neural populations to recover from adaptation, neural response magnitudes were similar between younger and older adults. Critically, both age groups demonstrated adaptation to sound-level stimulus statistics, but adaptation was altered for older compared with younger people: in the older group, neural responses continued to be sensitive to sound level under conditions in which responses were fully adapted in the younger group. The lack of full adaptation to the statistics of the sensory environment may be a physiological mechanism underlying the known difficulty that older adults have with filtering out irrelevant sensory information. SIGNIFICANCE STATEMENT Behavior requires efficient processing of acoustic stimulation. Animal work suggests that neurons accomplish efficient processing by adjusting their response sensitivity depending on statistical properties of the acoustic environment. Little is known about the extent to which this adaptation to stimulus statistics generalizes to humans, particularly to older humans. We used MEG to investigate how aging influences adaptation to sound-level statistics. Listeners were presented with sounds drawn from sound-level distributions with different modes (15 vs 45 dB). Auditory cortex neurons adapted to sound-level statistics in younger and older adults, but adaptation was incomplete in older people. The data suggest that the aging auditory system does not fully capitalize on the statistics available in sound environments to tune the perceptual system dynamically. Copyright © 2018 the authors 0270-6474/18/381989-11$15.00/0.

  9. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    PubMed

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
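
    For the normal-distribution case described above, the link between Cpk and acceptance probability can be made concrete; the sketch below uses hypothetical assay numbers. A centered process with Cpk = 1.33 indeed yields roughly 63 defective units per million.

```python
import numpy as np
from scipy.stats import norm

mu, sigma = 100.2, 1.0          # hypothetical process mean and SD for an assay (%)
lsl, usl = 95.0, 105.0          # hypothetical specification limits

# Cpk: distance from the mean to the nearest spec limit, in units of 3 sigma
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

# Acceptance probability: normal probability mass inside the spec limits
p_accept = norm.cdf(usl, mu, sigma) - norm.cdf(lsl, mu, sigma)
defects_ppm = (1 - p_accept) * 1e6

print(f"Cpk={cpk:.2f}, acceptance probability={p_accept:.6f}, defects={defects_ppm:.1f} ppm")
```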

  10. Outbreak of resistant Acinetobacter baumannii- measures and proposal for prevention and control.

    PubMed

    Romanelli, Roberta Maia de Castro; Jesus, Lenize Adriana de; Clemente, Wanessa Trindade; Lima, Stella Sala Soares; Rezende, Edna Maria; Coutinho, Rosane Luiza; Moreira, Ricardo Luiz Fontes; Neves, Francelli Aparecida Cordeiro; Brás, Nelma de Jesus

    2009-10-01

    Acinetobacter baumannii colonization and infection, frequent in Intensive Care Unit (ICU) patients, is commonly associated with high morbidity and mortality. Several outbreaks due to multidrug-resistant (MDR) A. baumannii have been reported, but few of them in Brazil. This study aimed to identify risk factors associated with colonization and infection by MDR and carbapenem-resistant A. baumannii strains isolated from patients admitted to the adult ICU at HC/UFMG. A case-control study was performed from January 2007 to June 2008. Cases were defined as patients colonized or infected by MDR/carbapenem-resistant A. baumannii, and controls were patients without MDR/carbapenem-resistant A. baumannii isolation, in a 1:2 proportion. For statistical analysis, due to changes in infection control guidelines, infection criteria and the notification process, this study was divided into two periods. During the first period analyzed, from January to December 2007, colonization or infection by MDR/carbapenem-resistant A. baumannii was associated with prior infection, invasive device utilization, prior carbapenem use and clinical severity. In the multivariate analysis, prior infection and mechanical ventilation proved to be statistically significant risk factors. Carbapenem use showed a tendency towards a statistical association. During the second study period, from January to June 2008, variables with a significant association with MDR/carbapenem-resistant A. baumannii colonization/infection were catheter utilization, carbapenem and third-generation cephalosporin use, hepatic transplantation, and clinical severity. In the multivariate analysis, only CVC use showed a statistical difference. Carbapenem and third-generation cephalosporin use displayed a tendency to be risk factors. Risk factors must be the focus of infection control and prevention measures, considering A. baumannii dissemination.

  11. [Effects of Self-directed Feedback Practice using Smartphone Videos on Basic Nursing Skills, Confidence in Performance and Learning Satisfaction].

    PubMed

    Lee, Seul Gi; Shin, Yun Hee

    2016-04-01

    This study was done to verify the effects of self-directed feedback practice using smartphone videos on nursing students' basic nursing skills, confidence in performance and learning satisfaction. An experimental post-test-only control group design was used. Twenty-nine students were assigned to the experimental group and 29 to the control group. The experimental treatment consisted of exchanging feedback on deficiencies through smartphone videos of the nursing practice process recorded by peers during self-directed practice. Basic nursing skills scores were higher for all items in the experimental group compared to the control group, and differences were statistically significant ["Measuring vital signs" (t=-2.10, p=.039); "Wearing protective equipment when entering and exiting the quarantine room and the management of waste materials" (t=-4.74, p<.001); "Gavage tube feeding" (t=-2.70, p=.009)]. Confidence in performance was higher in the experimental group compared to the control group, but the differences were not statistically significant. However, after the complete practice, there was a statistically significant difference in overall performance confidence (t=-3.07, p=.003). Learning satisfaction was higher in the experimental group compared to the control group, but the difference was not statistically significant (t=-1.67, p=.100). Results of this study indicate that self-directed feedback practice using smartphone videos can improve basic nursing skills. The significance is that it can help nursing students gain confidence in their nursing skills for the future through improvement of basic nursing skills and performance of quality care, thus providing patients with safer care.

  12. Comparison of Grand Median and Cumulative Sum Control Charts on Shuttlecock Weight Variable in CV Marjoko Kompas dan Domas

    NASA Astrophysics Data System (ADS)

    Musdalifah, N.; Handajani, S. S.; Zukhronah, E.

    2017-06-01

    Competition between homogeneous companies forces each company to maintain production quality. To address this, companies control production with statistical quality control using control charts. The Shewhart control chart is appropriate for normally distributed data; production data, however, are often non-normally distributed and subject to small process shifts. The grand median control chart is designed for non-normally distributed data, while the cumulative sum (cusum) control chart is sensitive to small process shifts. The purpose of this research is to compare grand median and cusum control charts on the shuttlecock weight variable in CV Marjoko Kompas dan Domas by generating data that follow the actual distribution. The generated data are used to simulate the standard deviation multiplier for the grand median and cusum control charts, calibrated to an in-control average run length (ARL) of 370. The grand median control chart detects ten out-of-control points, while the cusum control chart detects one. It can be concluded that the grand median control chart is better than the cusum control chart here.
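
    The abstract calibrates charts to an in-control ARL of 370 by simulation. The sketch below shows the same idea for a standard two-sided CUSUM on normal data (k and h are common textbook choices, not the study's values), estimating ARL as the mean run length to the first signal.

```python
import numpy as np

rng = np.random.default_rng(7)

def cusum_run_length(mu_shift, k=0.5, h=4.77, max_n=100_000):
    """Observations ~ N(mu_shift, 1); returns the sample number of the first signal."""
    cp = cm = 0.0
    for t in range(1, max_n + 1):
        x = rng.normal(mu_shift, 1.0)
        cp = max(0.0, cp + x - k)        # upper one-sided CUSUM
        cm = max(0.0, cm - x - k)        # lower one-sided CUSUM
        if cp > h or cm > h:
            return t
    return max_n

# h = 4.77 with k = 0.5 is often quoted as giving an in-control ARL near 370
arl0 = np.mean([cusum_run_length(0.0) for _ in range(500)])
arl1 = np.mean([cusum_run_length(0.5) for _ in range(500)])
print(f"in-control ARL ~ {arl0:.0f}, ARL under a 0.5-sigma shift ~ {arl1:.1f}")
```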

  13. When the Single Matters more than the Group (II): Addressing the Problem of High False Positive Rates in Single Case Voxel Based Morphometry Using Non-parametric Statistics.

    PubMed

    Scarpazza, Cristina; Nichols, Thomas E; Seramondi, Donato; Maumet, Camille; Sartori, Giuseppe; Mechelli, Andrea

    2016-01-01

    In recent years, an increasing number of studies have used Voxel Based Morphometry (VBM) to compare a single patient with a psychiatric or neurological condition of interest against a group of healthy controls. However, the validity of this approach critically relies on the assumption that the single patient is drawn from a hypothetical population with a normal distribution and variance equal to that of the control group. In a previous investigation, we demonstrated that family-wise false positive error rates (i.e., the proportion of statistical comparisons yielding at least one false positive) in single case VBM are much higher than expected (Scarpazza et al., 2013). Here, we examine whether the use of non-parametric statistics, which does not rely on the assumptions of normal distribution and equal variance, would enable the investigation of single subjects with good control of false positive risk. We empirically estimated false positive rates (FPRs) in single case non-parametric VBM, by performing 400 statistical comparisons between a single disease-free individual and a group of 100 disease-free controls. The impact of smoothing (4, 8, and 12 mm) and type of pre-processing (Modulated, Unmodulated) was also examined, as these factors have been found to influence FPRs in previous investigations using parametric statistics. The 400 statistical comparisons were repeated using two independent, freely available data sets in order to maximize the generalizability of the results. We found that the family-wise error rate was 5% for increases and 3.6% for decreases in one data set; and 5.6% for increases and 6.3% for decreases in the other data set (5% nominal). Further, these results were not dependent on the level of smoothing and modulation. Therefore, the present study provides empirical evidence that single case VBM studies with non-parametric statistics are not susceptible to high false positive rates. The critical implication of this finding is that VBM can be used to characterize neuroanatomical alterations in individual subjects as long as non-parametric statistics are employed.

  14. Impact of cleaning and other interventions on the reduction of hospital-acquired Clostridium difficile infections in two hospitals in England assessed using a breakpoint model.

    PubMed

    Hughes, G J; Nickerson, E; Enoch, D A; Ahluwalia, J; Wilkinson, C; Ayers, R; Brown, N M

    2013-07-01

    Clostridium difficile infection remains a major challenge for hospitals. Although targeted infection control initiatives have been shown to be effective in reducing the incidence of hospital-acquired C. difficile infection, there is little evidence available to assess the effectiveness of specific interventions. To use statistical modelling to detect substantial reductions in the incidence of C. difficile from time series data from two hospitals in England, and relate these time points to infection control interventions. A statistical breakpoints model was fitted to likely hospital-acquired C. difficile infection incidence data from a teaching hospital (2002-2009) and a district general hospital (2005-2009) in England. Models with increasing complexity (i.e. increasing the number of breakpoints) were tested for an improved fit to the data. Partitions estimated from breakpoint models were tested for individual stability using statistical process control charts. Major infection control interventions from both hospitals during this time were grouped according to their primary target (antibiotics, cleaning, isolation, other) and mapped to the model-suggested breakpoints. For both hospitals, breakpoints coincided with enhancements to cleaning protocols. Statistical models enabled formal assessment of the impact of different interventions, and showed that enhancements to deep cleaning programmes are the interventions that have most likely led to substantial reductions in hospital-acquired C. difficile infections at the two hospitals studied. Copyright © 2013 The Healthcare Infection Society. Published by Elsevier Ltd. All rights reserved.
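
    A toy version of the breakpoint idea, assuming Poisson monthly counts and a single candidate breakpoint (the real model and data differ): fit a one-break piecewise-constant rate by maximum likelihood and compare it with the no-break model by BIC.

```python
import numpy as np

rng = np.random.default_rng(8)
# Synthetic monthly infection counts with a drop after month 48
counts = np.concatenate([rng.poisson(12, 48), rng.poisson(6, 48)])

def poisson_loglik(x):
    """Poisson log-likelihood at the MLE rate, dropping the log(x!) constant
    (the constant cancels when comparing models on the same data)."""
    lam = x.mean()
    return np.sum(x * np.log(lam) - lam) if lam > 0 else -np.inf

# Single-breakpoint model: the best split maximizes the summed log-likelihood
splits = list(range(6, len(counts) - 6))
ll_split = [poisson_loglik(counts[:s]) + poisson_loglik(counts[s:]) for s in splits]
best = splits[int(np.argmax(ll_split))]

# BIC comparison against the no-breakpoint model
# (rate parameters only; a stricter count would also penalize the break location)
n = len(counts)
bic0 = -2 * poisson_loglik(counts) + 1 * np.log(n)
bic1 = -2 * max(ll_split) + 2 * np.log(n)
print(f"estimated breakpoint at month {best}; BIC favors the split model: {bic1 < bic0}")
```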

  15. [Dyslexia as a dysfunction in successive processing].

    PubMed

    Pérez-Alvarez, F; Timoneda-Gallart, C

    We present a study of patients with reading and writing difficulties after a year of normal instruction, verifying whether these patients showed a specific pattern of PASS (Planning, Attention, Successive and Simultaneous) cognitive processing; if so, this would allow rapid diagnosis and useful cognitive remediation according to the PASS theory of intelligence. Thirty patients were selected from neuropediatric patients because of learning disability. They were selected according to their performance on several tests of phonological awareness and a test of writing to discover errors in spelling. Patients with verbal language problems, as in dysphasia, and patients with learning difficulty not determined by reading or writing were ruled out. A control group of 300 schoolchildren was used. The translated DN:CAS battery was administered to the study group and the control group to assess the PASS cognitive processing. Statistical factorial analysis of the control group was performed as a validity confirmation to discriminate the four PASS cognitive processes. Cluster analysis of the study group was performed to check its homogeneity. Differences between means were tested with Student's t test. The four PASS cognitive processes were identified in the control group. The study group scored more than 1 SD below the mean in successive processing, with the remaining processes clearly above minus 1 SD, and the mean of the study group was lower than that of the control group (p = 0.001). A kind of dyslexia may be defined by dysfunction in PASS successive processing.

  16. Designing a risk-based surveillance program for Mycobacterium avium ssp. paratuberculosis in Norwegian dairy herds using multivariate statistical process control analysis.

    PubMed

    Whist, A C; Liland, K H; Jonsson, M E; Sæbø, S; Sviland, S; Østerås, O; Norström, M; Hopp, P

    2014-11-01

    Surveillance programs for animal diseases are critical to early disease detection and risk estimation and to documenting a population's disease status at a given time. The aim of this study was to describe a risk-based surveillance program for detecting Mycobacterium avium ssp. paratuberculosis (MAP) infection in Norwegian dairy cattle. The included risk factors for detecting MAP were purchase of cattle, combined cattle and goat farming, and location of the cattle farm in counties containing goats with MAP. The risk indicators included production data [culling of animals >3 yr of age, carcass conformation of animals >3 yr of age, milk production decrease in older lactating cows (lactations 3, 4, and 5)], and clinical data (diarrhea, enteritis, or both, in animals >3 yr of age). Except for combined cattle and goat farming and cattle farm location, all data were collected at the cow level and summarized at the herd level. Predefined risk factors and risk indicators were extracted from different national databases and combined in a multivariate statistical process control to obtain a risk assessment for each herd. The ordinary Hotelling's T² statistic was applied as a multivariate, standardized measure of difference between the current observed state and the average state of the risk factors for a given herd. To make the analysis more robust and adapt it to the slowly developing nature of MAP, monthly risk calculations were based on data accumulated during a 24-mo period. Monitoring of these variables was performed to identify outliers that may indicate deviance in one or more of the underlying processes. The highest-ranked herds were scattered all over Norway and clustered in high-density dairy cattle farm areas. The resulting rankings of herds are being used in the national surveillance program for MAP in 2014 to increase the sensitivity of the ongoing surveillance program in which 5 fecal samples for bacteriological examination are collected from 25 dairy herds. The use of multivariate statistical process control for selection of herds will be beneficial when a diagnostic test suitable for mass screening is available and validated on the Norwegian cattle population, thus making it possible to increase the number of sampled herds. Copyright © 2014 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
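
    A minimal sketch of the Hotelling's T² screening step on synthetic herd-level indicators (the dimensions, significance level, and F-based limit below are textbook choices, not the program's exact implementation):

```python
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(9)
X = rng.normal(size=(150, 4))          # 150 herds x 4 standardized risk indicators

# Hotelling's T^2 per herd: squared Mahalanobis distance from the mean vector
xbar = X.mean(axis=0)
S_inv = np.linalg.inv(np.cov(X, rowvar=False))
t2 = np.einsum("ij,jk,ik->i", X - xbar, S_inv, X - xbar)

# Common F-approximation UCL for individual observations (a beta-distribution
# form is more exact for Phase I; this is the usual textbook version)
n, p = X.shape
ucl = p * (n + 1) * (n - 1) / (n * (n - p)) * f_dist.ppf(0.9973, p, n - p)
print("herds ranked by T^2:", np.argsort(t2)[::-1][:5], " UCL:", round(ucl, 2))
```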

  17. Statistics of the vestibular input experienced during natural self-motion: implications for neural processing.

    PubMed

    Carriot, Jérome; Jamali, Mohsen; Chacron, Maurice J; Cullen, Kathleen E

    2014-06-11

    It is widely believed that sensory systems are optimized for processing stimuli occurring in the natural environment. However, it remains unknown whether this principle applies to the vestibular system, which contributes to essential brain functions ranging from the most automatic reflexes to spatial perception and motor coordination. Here we quantified, for the first time, the statistics of natural vestibular inputs experienced by freely moving human subjects during typical everyday activities. Although previous studies have found that the power spectra of natural signals across sensory modalities decay as a power law (i.e., as 1/f^α), we found that this did not apply to natural vestibular stimuli. Instead, power decreased slowly at lower and more rapidly at higher frequencies for all motion dimensions. We further establish that this unique stimulus structure is the result of active motion as well as passive biomechanical filtering occurring before any neural processing. Notably, the transition frequency (i.e., frequency at which power starts to decrease rapidly) was lower when subjects passively experienced sensory stimulation than when they actively controlled stimulation through their own movement. In contrast to signals measured at the head, the spectral content of externally generated (i.e., passive) environmental motion did follow a power law. Specifically, transformations caused by both motor control and biomechanics shape the statistics of natural vestibular stimuli before neural processing. We suggest that the unique structure of natural vestibular stimuli will have important consequences on the neural coding strategies used by this essential sensory system to represent self-motion in everyday life. Copyright © 2014 the authors 0270-6474/14/348347-11$15.00/0.

  18. Social cognition in patients with schizophrenia, their unaffected first degree relatives and healthy controls. Comparison between groups and analysis of associated clinical and sociodemographic variables.

    PubMed

    Rodríguez Sosa, Juana Teresa; Gil Santiago, Hiurma; Trujillo Cubas, Angel; Winter Navarro, Marta; León Pérez, Petra; Guerra Cazorla, Luz Marina; Martín Jiménez, José María

    2013-01-01

    To evaluate and compare social cognition in patients with schizophrenia, healthy first-degree relatives and controls, by studying the relationship between social cognition and nonsocial cognition, psychopathology, and other clinical and sociodemographic variables. The total sample comprised patients diagnosed with paranoid schizophrenia (N = 29), healthy first-degree relatives (N = 21) and controls (N = 28). All groups were assessed with an ad hoc questionnaire and a Social Cognition Scale, which assessed the domains: emotional processing, social perception and attributional style in a Spanish population. The patient group was also assessed with the Positive and Negative Syndrome Scale and the Mini-mental state examination. Statistical analyses were performed with SPSS version 15.0. Patients scored significantly worse in all domains of social cognition assessed, compared with controls, and in mastery attributional style, compared with relatives. The type of psychopathology correlated negatively and statistically significantly with different domains of social cognition: negative symptoms with emotional processing and attributional style, and positive symptoms with social perception. Basic cognition scores correlated positively and statistically significantly with the domains social perception and attributional style. Social cognition has become an interesting object of study, especially in how it relates to non-social cognition, psychopathology and global functioning of patients, bringing new elements to be considered in the early detection, comprehensive treatment and psychosocial rehabilitation of patients. Its conceptualization as a trait variable and the consideration of the existence of a continuum between patients and relatives are plausible hypotheses that require further research. Copyright © 2012 SEP y SEPB. Published by Elsevier Espana. All rights reserved.

  19. Version pressure feedback mechanisms for speculative versioning caches

    DOEpatents

    Eichenberger, Alexandre E.; Gara, Alan; O'Brien, Kathryn M.; Ohmacht, Martin; Zhuang, Xiaotong

    2013-03-12

    Mechanisms are provided for controlling version pressure on a speculative versioning cache. Raw version pressure data is collected based on one or more threads accessing cache lines of the speculative versioning cache. One or more statistical measures of version pressure are generated based on the collected raw version pressure data. A determination is made as to whether one or more modifications to an operation of a data processing system are to be performed based on the one or more statistical measures of version pressure, the one or more modifications affecting version pressure exerted on the speculative versioning cache. An operation of the data processing system is modified based on the one or more determined modifications, in response to a determination that one or more modifications to the operation of the data processing system are to be performed, to affect the version pressure exerted on the speculative versioning cache.

  20. Temporal scaling and spatial statistical analyses of groundwater level fluctuations

    NASA Astrophysics Data System (ADS)

    Sun, H.; Yuan, L., Sr.; Zhang, Y.

    2017-12-01

    Natural dynamics such as groundwater level fluctuations can exhibit multifractionality and/or multifractality, likely due to multi-scale aquifer heterogeneity and controlling factors, and their statistics require efficient quantification methods. This study explores multifractionality and non-Gaussian properties in groundwater dynamics expressed by time series of daily level fluctuations at three wells located in the lower Mississippi valley, after removing the seasonal cycle, using temporal scaling and spatial statistical analyses. First, using time-scale multifractional analysis, a systematic statistical method is developed to analyze groundwater level fluctuations quantified by the time-scale local Hurst exponent (TS-LHE). Results show that the TS-LHE does not remain constant, implying that the fractal-scaling behavior changes with time and location. Hence, we can distinguish the potentially location-dependent scaling feature, which may characterize the dynamics of the hydrologic system. Second, spatial statistical analysis shows that the increment of groundwater level fluctuations exhibits a heavy-tailed, non-Gaussian distribution, which can be better quantified by a Lévy stable distribution. Monte Carlo simulations of the fluctuation process also show that the linear fractional stable motion model can well depict the transient dynamics (i.e., the fractal non-Gaussian property) of groundwater level, while fractional Brownian motion is inadequate to describe natural processes with anomalous dynamics. Analysis of temporal scaling and spatial statistics therefore may provide useful information and quantification to further understand the nature of complex dynamics in hydrology.
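
    A simple estimator in the spirit of the scaling analysis above: the aggregated-variance method for the Hurst exponent, on a synthetic series. The estimator choice and block sizes are assumptions; the study itself uses a time-scale local Hurst exponent rather than this single global estimate.

```python
import numpy as np

rng = np.random.default_rng(10)
x = np.cumsum(rng.normal(size=4096))    # synthetic level series (Brownian-like, H ~ 0.5)
inc = np.diff(x)                        # analyze the increments, as in the study

# Aggregated-variance method: Var of non-overlapping block means scales as m^(2H - 2)
ms = np.array([2, 4, 8, 16, 32, 64])
v = [np.var([inc[i:i + m].mean() for i in range(0, len(inc) - m + 1, m)]) for m in ms]
slope = np.polyfit(np.log(ms), np.log(v), 1)[0]
H = 1 + slope / 2
print(f"estimated Hurst exponent H ~ {H:.2f}")
```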

  1. New S control chart using skewness correction method for monitoring process dispersion of skewed distributions

    NASA Astrophysics Data System (ADS)

    Atta, Abdu; Yahaya, Sharipah; Zain, Zakiyah; Ahmed, Zalikha

    2017-11-01

    The control chart is established as one of the most powerful tools in Statistical Process Control (SPC) and is widely used in industries. Conventional control charts rely on the normality assumption, which is not always the case for industrial data. This paper proposes a new S control chart for monitoring process dispersion using the skewness correction method for skewed distributions, named the SC-S control chart. Its performance in terms of false alarm rate is compared with various existing control charts for monitoring process dispersion, such as the scaled weighted variance S chart (SWV-S); the skewness correction R chart (SC-R); the weighted variance R chart (WV-R); the weighted variance S chart (WV-S); and the standard S chart (STD-S). Comparison with the exact S control chart with regard to the probability of out-of-control detection is also performed. The Weibull and gamma distributions adopted in this study are assessed along with the normal distribution. The simulation study shows that the proposed SC-S control chart provides good in-control performance (Type I error) at almost all skewness levels and sample sizes n. In terms of the probability of detecting a shift, the proposed SC-S chart is closer to the exact S control chart than the existing charts for skewed distributions, except for the SC-R control chart. In general, the performance of the proposed SC-S control chart is better than all the existing control charts for monitoring process dispersion, in terms of both Type I error and shift detection probability.
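
    The SC-S chart's skewness-corrected limit formulas are not reproduced here. As a hedged alternative illustrating the same goal (asymmetric dispersion limits for a skewed process), the sketch below bootstraps the sampling distribution of the subgroup standard deviation from Phase-I data and takes its 0.135%/99.865% percentiles as limits.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 5                                       # subgroup size
phase1 = rng.weibull(1.5, size=(50, n))     # Phase-I data from a skewed (Weibull) process

# Bootstrap the subgroup standard deviation and take asymmetric percentile limits
pooled = phase1.ravel()
boot_s = np.array([rng.choice(pooled, size=n, replace=True).std(ddof=1)
                   for _ in range(20000)])
lcl, center, ucl = np.percentile(boot_s, [0.135, 50, 99.865])
print(f"LCL={lcl:.3f}, median S={center:.3f}, UCL={ucl:.3f}")
```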

  2. Quality at a Glance

    DTIC Science & Technology

    1990-01-01

    This document contains summaries of fifteen of the well-known books which underlie the Total Quality Management philosophy. Members of the DCASR St Louis staff offer comments and opinions on how the authors have presented the quality concept in today's business environment. Keywords: TQM (Total Quality Management), Quality concepts, Statistical process control.

  3. 19 CFR 19.14 - Materials for use in manufacturing warehouse.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...; DEPARTMENT OF THE TREASURY CUSTOMS WAREHOUSES, CONTAINER STATIONS AND CONTROL OF MERCHANDISE THEREIN... statistical information as provided in § 141.61(e) of this chapter. If the merchandise has been imported or... report this information for each warehouse entry represented in the manufacturing process. [28 FR 14763...

  4. Improving Teaching Effectiveness through the Application of SPC Methodology

    ERIC Educational Resources Information Center

    Cadden, David; Driscoll, Vincent; Thompson, Mark

    2008-01-01

    One method used extensively to help determine instructional effectiveness is Student Evaluations of Instruction (SEI). This paper examines the use of Statistical Process Control charts as a way to measure teaching effectiveness correctly. The field studying SEIs has produced a significant literature. It is not surprising that there is…

  5. 10 CFR 1304.110 - Disclosure of records to third parties.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Board with adequate advance written assurance that the record will be used solely as a statistical... under the control of the United States for a civil or criminal law enforcement activity, if the activity... record is disclosed under such compulsory legal process, the Board shall make reasonable efforts to...

  6. Opportunities for Applied Behavior Analysis in the Total Quality Movement.

    ERIC Educational Resources Information Center

    Redmon, William K.

    1992-01-01

    This paper identifies critical components of recent organizational quality improvement programs and specifies how applied behavior analysis can contribute to quality technology. Statistical Process Control and Total Quality Management approaches are compared, and behavior analysts are urged to build their research base and market behavior change…

  7. FORMAT OF TROPICAL CYCLONE RECORDS ("TCVITALS")

    Science.gov Websites

    Format of tropical cyclone vital statistics records ("TCVITALS"), 8-16-2007. CHARACTER(S) - these appear only in records that have been processed by the NCEP tropical cyclone quality control program SYNDAT_QCTROPCY. BOLDFACE - these appear only in NHC records. 1 - Prior to 1999, report date was

  8. SPC-Prep. Instructor's Guide. Workplace Education. Project ALERT.

    ERIC Educational Resources Information Center

    Ruetz, Nancy

    This instructor's guide contains materials for a course designed to prepare employees for statistical process control (SPC) training given at their workplace by refreshing math skills and building the concepts and vocabulary necessary to understand SPC in manufacturing environments. SPC-Prep 1 addresses the math skills necessary to perform SPC…

  9. Control mechanisms for stochastic biochemical systems via computation of reachable sets.

    PubMed

    Lakatos, Eszter; Stumpf, Michael P H

    2017-08-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters.

  10. Control mechanisms for stochastic biochemical systems via computation of reachable sets

    PubMed Central

    Lakatos, Eszter

    2017-01-01

    Controlling the behaviour of cells by rationally guiding molecular processes is an overarching aim of much of synthetic biology. Molecular processes, however, are notoriously noisy and frequently nonlinear. We present an approach to studying the impact of control measures on motifs of molecular interactions that addresses the problems faced in many biological systems: stochasticity, parameter uncertainty and nonlinearity. We show that our reachability analysis formalism can describe the potential behaviour of biological (naturally evolved as well as engineered) systems, and provides a set of bounds on their dynamics at the level of population statistics: for example, we can obtain the possible ranges of means and variances of mRNA and protein expression levels, even in the presence of uncertainty about model parameters. PMID:28878957
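
    A toy version of the moment-bounds idea on the simplest motif, not the paper's reachability computation: for a linear birth-death model of mRNA with uncertain rates, the stationary copy number is Poisson, so the ranges of the mean and variance follow directly from the parameter ranges.

```python
import numpy as np
from itertools import product

# Birth-death model of mRNA copy number: production rate k, degradation rate g.
# At stationarity the copy number is Poisson(k/g), so mean = variance = k/g.
k_range = np.linspace(8.0, 12.0, 21)    # uncertain production rate (assumed interval)
g_range = np.linspace(0.8, 1.2, 21)     # uncertain degradation rate (assumed interval)

means = [k / g for k, g in product(k_range, g_range)]
print(f"reachable stationary mean (= variance): [{min(means):.1f}, {max(means):.1f}]")
```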

  11. Statistical Process Control Techniques for the Telecommunications Systems Manager

    DTIC Science & Technology

    1992-03-01

products that are out of tolerance and bad designs. The third type of defect, mistakes, are remedied by Poka-Yoke methods that are introduced later...based on total production costs plus quality costs. Once production is underway, interventions are determined by their impact on the QLF. F. POKA-YOKE...Mistakes require process improvements called Poka-Yoke or mistake proofing. Shigeo Shingo developed Poka-Yoke methods to incorporate 100% inspection at

  12. Statistical Model Selection for TID Hardness Assurance

    NASA Technical Reports Server (NTRS)

    Ladbury, R.; Gorelick, J. L.; McClure, S.

    2010-01-01

Radiation Hardness Assurance (RHA) methodologies against Total Ionizing Dose (TID) degradation impose rigorous statistical treatments for data from a part's Radiation Lot Acceptance Test (RLAT) and/or its historical performance. However, no similar methods exist for using "similarity" data - that is, data for similar parts fabricated in the same process as the part under qualification. This is despite the greater difficulty and potential risk in interpreting similarity data. In this work, we develop methods to disentangle part-to-part, lot-to-lot and part-type-to-part-type variation. The methods we develop apply not just to qualification decisions, but also to quality control and detection of process changes and other "out-of-family" behavior. We begin by discussing the data used in the study and the challenges of developing a statistic that provides a meaningful measure of degradation across multiple part types, each with its own performance specifications. We then develop analysis techniques and apply them to the different data sets.

  13. Eye-gaze determination of user intent at the computer interface

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldberg, J.H.; Schryver, J.C.

    1993-12-31

Determination of user intent at the computer interface through eye-gaze monitoring can significantly aid applications for the disabled, as well as telerobotics and process control interfaces. Whereas current eye-gaze control applications are limited to object selection and x/y gazepoint tracking, a methodology was developed here to discriminate a more abstract interface operation: zooming in or out. This methodology first collects samples of eye-gaze location looking at controlled stimuli, at 30 Hz, just prior to a user's decision to zoom. The sample is broken into data frames, or temporal snapshots. Within a data frame, all spatial samples are connected into a minimum spanning tree, then clustered, according to user-defined parameters. Each cluster is mapped to one in the prior data frame, and statistics are computed from each cluster. These characteristics include cluster size, position, and pupil size. A multiple discriminant analysis uses these statistics both within and between data frames to formulate optimal rules for assigning the observations into zoom-in, zoom-out, or no-zoom conditions. The statistical procedure effectively generates heuristics for future assignments, based upon these variables. Future work will enhance the accuracy and precision of the modeling technique, and will empirically test users in controlled experiments.

  14. Recursive least squares estimation and its application to shallow trench isolation

    NASA Astrophysics Data System (ADS)

    Wang, Jin; Qin, S. Joe; Bode, Christopher A.; Purdy, Matthew A.

    2003-06-01

In recent years, run-to-run (R2R) control technology has received tremendous interest in semiconductor manufacturing. One class of widely used run-to-run controllers is based on the exponentially weighted moving average (EWMA) statistic to estimate process deviations. Using an EWMA filter to smooth the control action on a linear process has been shown to provide good results in a number of applications. However, for a process with severe drifts, the EWMA controller is insufficient even when large weights are used. This problem becomes more severe when there is measurement delay, which is almost inevitable in the semiconductor industry. In order to control drifting processes, a predictor-corrector controller (PCC) and a double-EWMA controller have been developed. Chen and Guo (2001) show that both PCC and the double-EWMA controller are in effect integral-double-integral (I-II) controllers, which are able to control drifting processes. However, since the offset is often within the noise of the process, the second integrator can actually cause jittering. Besides, tuning the second filter is not as intuitive as tuning a single EWMA filter. In this work, we look at an alternative approach, recursive least squares (RLS), to estimate and control the drifting process. EWMA and double-EWMA are shown to be the least squares estimates for the locally constant mean model and the locally constant linear trend model, respectively. Recursive least squares with an exponential forgetting factor is then applied to a shallow trench isolation etch process to predict the future etch rate. The etch process, which is a critical process in flash memory manufacturing, is known to suffer from significant etch rate drift due to chamber seasoning. In order to handle the metrology delay, we propose a new time update scheme. RLS with the new time update method gives very good results: the estimation error variance is smaller than that from EWMA, and the mean square error decreases by more than 10% compared to that from EWMA.
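    As a concrete illustration of the single-EWMA estimator discussed above, the following minimal sketch (with hypothetical weight, drift and noise values; it is not the authors' RLS scheme) tracks a process offset and shows the structure that lags behind a severe drift:

        # Minimal single-EWMA run-to-run sketch: estimate the offset of a
        # linear process y_t = drift_t + u_t + noise and derive the next
        # control action from that estimate. All numbers are illustrative.
        import random

        lam = 0.3          # EWMA weight (tuning parameter)
        b_hat = 0.0        # current offset estimate
        target = 0.0       # desired process output

        for t in range(50):
            drift = 0.05 * t                           # severe drift the filter lags behind
            u = target - b_hat                         # control move from the estimate
            y = drift + u + random.gauss(0.0, 0.1)     # measured output of run t
            b_hat = lam * (y - u) + (1 - lam) * b_hat  # EWMA update of the offset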

  15. Real-time monitoring of a coffee roasting process with near infrared spectroscopy using multivariate statistical analysis: A feasibility study.

    PubMed

    Catelani, Tiago A; Santos, João Rodrigo; Páscoa, Ricardo N M J; Pezza, Leonardo; Pezza, Helena R; Lopes, João A

    2018-03-01

This work proposes the use of near infrared (NIR) spectroscopy in diffuse reflectance mode and multivariate statistical process control (MSPC) based on principal component analysis (PCA) for real-time monitoring of the coffee roasting process. The main objective was the development of a MSPC methodology able to detect disturbances to the roasting process early, through real-time acquisition of NIR spectra. A total of fifteen roasting batches were defined according to an experimental design to develop the MSPC models. The methodology was tested on a set of five batches in which disturbances of different natures were imposed to simulate real faulty situations. Some of these batches were used to optimize the model while the remainder were used to test the methodology. A modelling strategy based on a time sliding window provided the best results in terms of distinguishing batches with and without disturbances, using typical MSPC charts: Hotelling's T² and squared prediction error statistics. A PCA model encompassing a time window of four minutes with three principal components was able to efficiently detect all disturbances assayed. NIR spectroscopy combined with the MSPC approach proved to be an adequate auxiliary tool for coffee roasters to detect faults in a conventional roasting process in real-time. Copyright © 2017 Elsevier B.V. All rights reserved.
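    A minimal sketch of the two MSPC statistics named above, assuming a matrix of mean-centered calibration spectra and a three-principal-component PCA model; the data here are random placeholders, not roasting spectra:

        # Compute Hotelling's T^2 and the squared prediction error (SPE/Q)
        # from a PCA model fitted to normal-operation calibration data.
        # Control limits (F/chi-square approximations) are omitted.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(15, 200))            # stand-in for 15 calibration spectra
        X = X - X.mean(axis=0)                    # mean-center

        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        A = 3                                     # number of principal components
        P = Vt[:A].T                              # loadings (wavelengths x A)
        pc_var = (s[:A] ** 2) / (X.shape[0] - 1)  # variance of each score

        def mspc_stats(x_new):
            t = P.T @ x_new                       # scores of a new spectrum
            T2 = float(np.sum(t ** 2 / pc_var))   # Hotelling's T^2
            resid = x_new - P @ t                 # part not explained by the model
            return T2, float(resid @ resid)       # (T^2, SPE)

        T2, SPE = mspc_stats(rng.normal(size=200))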

  16. Citizen surveillance for environmental monitoring: combining the efforts of citizen science and crowdsourcing in a quantitative data framework.

    PubMed

    Welvaert, Marijke; Caley, Peter

    2016-01-01

Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term citizen surveillance. The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. When considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control in observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, and there are well-developed statistical methods for the resulting data. In contrast, methods for making useful inference from purely crowd-sourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward to solve the statistical challenges inherent to citizen-sourced data.

  17. Perceived Health Locus of Control, Self-Esteem, and Its Relations to Psychological Well-Being Status in Iranian Students

    PubMed Central

    Moshki, M; Ashtarian, H

    2010-01-01

Background: Health locus of control (HLC) has been associated with a variety of ailments and health outcomes and is designed to predict behaviors and cognitive processes relevant to mental and physical health. This study investigated the relationships between perceived health locus of control, self-esteem, and mental health status among Iranian students. Methods: In this analytical study the subjects were recruited from first-year students at Gonabad University of Medical Sciences, Iran (N=154). Students completed questionnaires assessing demographic, perceived health locus of control, self-esteem and psychological well-being data. Results: The statistical analysis revealed a negative relationship of perceived Internal HLC and self-esteem with psychological well-being. The positive correlation of the perceived Chance HLC with psychological well-being was statistically significant (r=0.21, P<0.01), and the positive correlation of the perceived Internal HLC with self-esteem was statistically significant (r=0.25, P<0.01). A significant direct relationship between low perceived Internal HLC, self-esteem and psychological problems was found among these students. Conclusion: The findings will be addressed in relation to their implications for effective mental health education based on health locus of control, especially internal and powerful-others beliefs associated with self-esteem for students. This will require additional monitoring and uninterrupted effort in order to be effective. PMID:23113040

18. Dimensional and material characteristics of direct deposited tool steel by CO2 laser

    NASA Astrophysics Data System (ADS)

    Choi, J.

    2006-01-01

The laser aided direct metal/material deposition (DMD) process builds metallic parts layer-by-layer directly from the CAD representation. In general, the process uses powdered metals/materials fed into a melt pool, creating fully dense parts. Success of this technology in the die and tool industry depends on the part quality that can be achieved. To obtain designed geometric dimensions and material properties, delicate control of parameters such as laser power, spot diameter, traverse speed and powder mass flow rate is critical. In this paper, the dimensional and material characteristics of direct deposited H13 tool steel by CO2 laser are investigated for the DMD process with a feedback height control system. The relationships between DMD process variables and the product characteristics are analyzed using statistical techniques. The performance of the DMD process is examined with the material characteristics of hardness, porosity, microstructure, and composition.

  19. EXPLORING ENGINEERING CONTROL THROUGH PROCESS MANIPULATION OF RADIOACTIVE LIQUID WASTE TANK CHEMICAL CLEANING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, A.

    2014-04-27

One method of remediating legacy liquid radioactive waste produced during the cold war is aggressive in-tank chemical cleaning. Chemical cleaning has successfully reduced the curie content of residual waste heels in large underground storage tanks; however, this process generates significant chemical hazards. Mercury is often the bounding hazard due to its extensive use in the separations process that produced the waste. This paper explores how variations in controllable process factors, tank level and temperature, may be manipulated to reduce the hazard potential related to mercury vapor generation. When compared using a multivariate regression analysis, findings indicated that there was a significant relationship between both tank level (p value of 1.65×10⁻²³) and temperature (p value of 6.39×10⁻⁶) and the mercury vapor concentration in the tank ventilation system. Tank temperature showed the most promise as a controllable parameter for future tank cleaning endeavors. Despite statistically significant relationships, there may not be confidence in the ability to control accident scenarios to below mercury's IDLH or PAC-III levels for future cleaning initiatives.

  20. Statistical metrology—measurement and modeling of variation for advanced process development and design rule generation

    NASA Astrophysics Data System (ADS)

    Boning, Duane S.; Chung, James E.

    1998-11-01

Advanced process technology will require more detailed understanding and tighter control of variation in devices and interconnects. The purpose of statistical metrology is to provide methods to measure and characterize variation, to model systematic and random components of that variation, and to understand the impact of variation on both yield and performance of advanced circuits. Of particular concern are spatial or pattern-dependencies within individual chips; such systematic variation within the chip can have a much larger impact on performance than wafer-level random variation. Statistical metrology methods will play an important role in the creation of design rules for advanced technologies. For example, a key issue in multilayer interconnect is the uniformity of interlevel dielectric (ILD) thickness within the chip. For the case of ILD thickness, we describe phases of statistical metrology development and application to understanding and modeling thickness variation arising from chemical-mechanical polishing (CMP). These phases include screening experiments (including the design of test structures and test masks to gather electrical or optical data), techniques for statistical decomposition and analysis of the data, and approaches to calibrating empirical and physical variation models. These models can be integrated with circuit CAD tools to evaluate different process integration or design rule strategies. One focus for the generation of interconnect design rules is guidelines for the use of "dummy fill" or "metal fill" to improve the uniformity of underlying metal density and thus improve the uniformity of oxide thickness within the die. Trade-offs that can be evaluated via statistical metrology include the improvements to uniformity possible versus the effect of increased capacitance due to additional metal.
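    A toy sketch of the kind of statistical decomposition described above: split thickness measurements into a wafer-level mean, a systematic within-wafer (die-position) component, and a random residual. The array shapes and numbers are hypothetical:

        # Decompose ILD thickness into wafer mean + systematic die-position
        # pattern + residual by averaging, then compare variance components.
        import numpy as np

        rng = np.random.default_rng(1)
        thk = 800 + rng.normal(0, 5, size=(6, 20))  # 6 wafers x 20 die positions (nm)
        thk += np.linspace(-10, 10, 20)             # planted systematic die pattern

        wafer_mean = thk.mean(axis=1, keepdims=True)   # wafer-level component
        die_pattern = (thk - wafer_mean).mean(axis=0)  # systematic within-wafer part
        residual = thk - wafer_mean - die_pattern      # random component

        print(np.var(die_pattern), np.var(residual))   # systematic vs random variance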

  1. Statistical analysis plan for evaluating low- vs. standard-dose alteplase in the ENhanced Control of Hypertension and Thrombolysis strokE stuDy (ENCHANTED).

    PubMed

    Anderson, Craig S; Woodward, Mark; Arima, Hisatomi; Chen, Xiaoying; Lindley, Richard I; Wang, Xia; Chalmers, John

    2015-12-01

    The ENhanced Control of Hypertension And Thrombolysis strokE stuDy trial is a 2 × 2 quasi-factorial active-comparison, prospective, randomized, open, blinded endpoint clinical trial that is evaluating in thrombolysis-eligible acute ischemic stroke patients whether: (1) low-dose (0·6 mg/kg body weight) intravenous alteplase has noninferior efficacy and lower risk of symptomatic intracerebral hemorrhage compared with standard-dose (0·9 mg/kg body weight) intravenous alteplase; and (2) early intensive blood pressure lowering (systolic target 130-140 mmHg) has superior efficacy and lower risk of any intracerebral hemorrhage compared with guideline-recommended blood pressure control (systolic target <180 mmHg). To outline in detail the predetermined statistical analysis plan for the 'alteplase dose arm' of the study. All data collected by participating researchers will be reviewed and formally assessed. Information pertaining to the baseline characteristics of patients, their process of care, and the delivery of treatments will be classified, and for each item, appropriate descriptive statistical analyses are planned with appropriate comparisons made between randomized groups. For the trial outcomes, the most appropriate statistical comparisons to be made between groups are planned and described. A statistical analysis plan was developed for the results of the alteplase dose arm of the study that is transparent, available to the public, verifiable, and predetermined before completion of data collection. We have developed a predetermined statistical analysis plan for the ENhanced Control of Hypertension And Thrombolysis strokE stuDy alteplase dose arm which is to be followed to avoid analysis bias arising from prior knowledge of the study findings. © 2015 The Authors. International Journal of Stroke published by John Wiley & Sons Ltd on behalf of World Stroke Organization.

  2. The critical role of NIR spectroscopy and statistical process control (SPC) strategy towards captopril tablets (25 mg) manufacturing process understanding: a case study.

    PubMed

    Curtivo, Cátia Panizzon Dal; Funghi, Nathália Bitencourt; Tavares, Guilherme Diniz; Barbosa, Sávio Fujita; Löbenberg, Raimar; Bou-Chacra, Nádia Araci

    2015-05-01

In this work, a near-infrared spectroscopy (NIRS) method was used to evaluate the uniformity of dosage units of three commercial batches of captopril 25 mg tablets. The performance of the calibration method was assessed by determination of the Q value (0.9986), the standard error of estimation (C-set SEE = 1.956), the standard error of prediction (V-set SEP = 2.076), and the consistency (106.1%). These results indicated the adequacy of the selected model. The method validation revealed agreement between the reference high-pressure liquid chromatography (HPLC) and NIRS methods. The process evaluation using the NIRS method showed that the variability was due to common causes and that the process delivered predictable results consistently. Cp and Cpk values were, respectively, 2.05 and 1.80. These results revealed a non-centered process in relation to the target average (100% w/w), within the specified range (85-115%). The probability of failure was 21 per 100 million captopril tablets. NIRS in combination with the multivariate calibration method, partial least squares (PLS) regression, allowed the development of a methodology for evaluating the uniformity of dosage units of captopril 25 mg tablets. The statistical process control strategy associated with the NIRS method as PAT played a critical role in understanding the sources and degree of variation and their impact on the process. This approach led towards better process understanding and provided a sound scientific basis for its continuous improvement.
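    The reported indices follow from the standard definitions Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mu, mu - LSL)/(3*sigma). The sketch below back-calculates an illustrative sigma and mean from the published Cp and Cpk; those two derived values are assumptions, not figures taken from the paper:

        # Back-calculate an illustrative process sigma and mean consistent
        # with Cp = 2.05 and Cpk = 1.80 for specification limits 85-115% of
        # label claim, then recompute the indices as a check.
        LSL, USL = 85.0, 115.0

        def cp(sigma):
            return (USL - LSL) / (6 * sigma)

        def cpk(mu, sigma):
            return min(USL - mu, mu - LSL) / (3 * sigma)

        sigma = (USL - LSL) / (6 * 2.05)   # ~2.44% w/w
        mu = USL - 3 * sigma * 1.80        # ~101.8% w/w: a non-centered process
        print(f"Cp = {cp(sigma):.2f}, Cpk = {cpk(mu, sigma):.2f}")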

  3. Can a combination of average of normals and "real time" External Quality Assurance replace Internal Quality Control?

    PubMed

    Badrick, Tony; Graham, Peter

    2018-03-28

    Internal Quality Control and External Quality Assurance are separate but related processes that have developed independently in laboratory medicine over many years. They have different sample frequencies, statistical interpretations and immediacy. Both processes have evolved absorbing new understandings of the concept of laboratory error, sample material matrix and assay capability. However, we do not believe at the coalface that either process has led to much improvement in patient outcomes recently. It is the increasing reliability and automation of analytical platforms along with improved stability of reagents that has reduced systematic and random error, which in turn has minimised the risk of running less frequent IQC. We suggest that it is time to rethink the role of both these processes and unite them into a single approach using an Average of Normals model supported by more frequent External Quality Assurance samples. This new paradigm may lead to less confusion for laboratory staff and quicker responses to and identification of out of control situations.
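    A minimal sketch of an Average of Normals monitor of the kind proposed; the truncation limits, control limits and block size are illustrative choices, not values from the article:

        # Average patient results inside truncation limits in blocks of N;
        # a block mean outside the control limits flags possible systematic
        # error. Analyte, limits and N are hypothetical.
        from statistics import mean

        TRUNC_LO, TRUNC_HI = 120.0, 160.0   # truncation limits for usable results
        CTRL_LO, CTRL_HI = 138.0, 142.0     # control limits on the block mean
        N = 20                              # patient results per block

        def aon_flags(results):
            usable = [r for r in results if TRUNC_LO <= r <= TRUNC_HI]
            blocks = [usable[i:i + N] for i in range(0, len(usable) - N + 1, N)]
            return [not (CTRL_LO <= mean(b) <= CTRL_HI) for b in blocks]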

  4. MAGMA: analysis of two-channel microarrays made easy.

    PubMed

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.

  5. Reproducibility of ZrO2-based freeze casting for biomaterials.

    PubMed

    Naleway, Steven E; Fickas, Kate C; Maker, Yajur N; Meyers, Marc A; McKittrick, Joanna

    2016-04-01

The processing technique of freeze casting has been intensely researched for its potential to create porous scaffold and infiltrated composite materials for biomedical implants and structural materials. However, in order for this technique to be employed medically or commercially, it must be able to reliably produce materials in great quantities with similar microstructures and properties. Here we investigate the reproducibility of the freeze casting process by independently fabricating three sets of eight ZrO2-epoxy composite scaffolds with the same processing conditions but varying solid loading (10, 15 and 20 vol.%). Statistical analyses (one-way ANOVA and Tukey's HSD tests) run upon measurements of the microstructural dimensions of these composite scaffold sets show that, while the majority of microstructures are similar, in all cases the composite scaffolds display statistically significant variability. In addition, composite scaffolds were mechanically compressed and statistically analyzed. Similar to the microstructures, almost all of their resultant properties displayed significant variability, though most composite scaffolds were similar. These results suggest that additional research to improve control of the freeze casting technique is required before scaffolds and composite scaffolds can reliably be reproduced for commercial or medical applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Differentiation of chocolates according to the cocoa's geographical origin using chemometrics.

    PubMed

    Cambrai, Amandine; Marcic, Christophe; Morville, Stéphane; Sae Houer, Pierre; Bindler, Françoise; Marchioni, Eric

    2010-02-10

The determination of the geographical origin of cocoa used to produce chocolate has been assessed through the analysis of the volatile compounds of chocolate samples. The analysis of the volatile content and its statistical processing by multivariate analyses tended to form independent groups for both Africa and Madagascar, even if some of the chocolate samples analyzed appeared in a mixed zone together with those from America. This analysis also allowed a clear separation between Caribbean chocolates and those from other origins. Eight compounds (such as linalool or (E,E)-2,4-decadienal) characteristic of chocolate's different geographical origins were also identified. The method described in this work (hydrodistillation, GC analysis, and statistical treatment) may improve the control of the geographical origin of chocolate during its long production process.

  7. Processes and subdivisions in diogenites, a multivariate statistical analysis

    NASA Technical Reports Server (NTRS)

    Harriott, T. A.; Hewins, R. H.

    1984-01-01

    Multivariate statistical techniques used on diogenite orthopyroxene analyses show the relationships that occur within diogenites and the two orthopyroxenite components (class I and II) in the polymict diogenite Garland. Cluster analysis shows that only Peckelsheim is similar to Garland class I (Fe-rich) and the other diogenites resemble Garland class II. The unique diogenite Y 75032 may be related to type I by fractionation. Factor analysis confirms the subdivision and shows that Fe does not correlate with the weakly incompatible elements across the entire pyroxene composition range, indicating that igneous fractionation is not the process controlling total diogenite composition variation. The occurrence of two groups of diogenites is interpreted as the result of sampling or mixing of two main sequences of orthopyroxene cumulates with slightly different compositions.

  8. Neural network approach in multichannel auditory event-related potential analysis.

    PubMed

    Wu, F Y; Slater, J D; Ramsay, R E

    1994-04-01

    Even though there are presently no clearly defined criteria for the assessment of P300 event-related potential (ERP) abnormality, it is strongly indicated through statistical analysis that such criteria exist for classifying control subjects and patients with diseases resulting in neuropsychological impairment such as multiple sclerosis (MS). We have demonstrated the feasibility of artificial neural network (ANN) methods in classifying ERP waveforms measured at a single channel (Cz) from control subjects and MS patients. In this paper, we report the results of multichannel ERP analysis and a modified network analysis methodology to enhance automation of the classification rule extraction process. The proposed methodology significantly reduces the work of statistical analysis. It also helps to standardize the criteria of P300 ERP assessment and facilitate the computer-aided analysis on neuropsychological functions.

  9. Photoresist thin-film effects on alignment process capability

    NASA Astrophysics Data System (ADS)

    Flores, Gary E.; Flack, Warren W.

    1993-08-01

Two photoresists were selected for alignment characterization based on their dissimilar coating properties and observed differences in alignment capability. The materials are Dynachem OFPR-800 and Shipley System 8. Both photoresists were examined on two challenging alignment levels in a submicron CMOS process: a nitride level and a planarized second-level metal. An Ultratech Stepper model 1500, which features a darkfield alignment system with broadband green light for alignment signal detection, was used for this project. Initially, statistically designed linear screening experiments were performed to examine six process factors for each photoresist: viscosity, spin acceleration, spin speed, spin time, softbake time, and softbake temperature. Using the results derived from the screening experiments, a more thorough examination of the statistically significant process factors was performed. A full quadratic experimental design was conducted to examine the effects of viscosity, spin speed, and spin time coating properties on alignment. This included a characterization of both intra- and inter-wafer alignment control and alignment process capability. Insight into the different alignment behaviors is provided by analysis in terms of photoresist material properties and the physical nature of the alignment detection system.

  10. PMMA/PS coaxial electrospinning: a statistical analysis on processing parameters

    NASA Astrophysics Data System (ADS)

    Rahmani, Shahrzad; Arefazar, Ahmad; Latifi, Masoud

    2017-08-01

Coaxial electrospinning, as a versatile method for producing core-shell fibers, is known to be very sensitive to two classes of influential factors: material and processing parameters. Although coaxial electrospinning has been the focus of many studies, the effects of processing parameters on the outcomes of this method have not yet been well investigated. A good knowledge of the impacts of processing parameters and their interactions on coaxial electrospinning can make it possible to better control and optimize this process. Hence, in this study, the statistical technique of response surface methodology (RSM) was applied, using a design of experiments on four processing factors: voltage, distance, and core and shell flow rates. Transmission electron microscopy (TEM), scanning electron microscopy (SEM), oil immersion and fluorescence microscopy were used to characterize fiber morphology. The core and shell diameters of the fibers were measured, and the effects of all factors and their interactions were discussed. Two polynomial models with acceptable R-squared values were proposed to describe the core and shell diameters as functions of the processing parameters. Voltage and distance were recognized as the most significant and influential factors on shell diameter, while core diameter was mainly under the influence of core and shell flow rates besides the voltage.
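    A sketch of the response-surface fit the abstract describes: a full quadratic model in the four coded processing factors, estimated by ordinary least squares. The design matrix and responses below are random placeholders for the experimental data:

        # Build intercept + linear + two-factor interaction + pure quadratic
        # terms and fit them by least squares.
        from itertools import combinations
        import numpy as np

        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(30, 4))  # coded voltage, distance, core/shell flows
        y = rng.normal(size=30)               # stand-in for measured shell diameter

        def quadratic_terms(X):
            cols = [np.ones(len(X))]
            cols += [X[:, j] for j in range(X.shape[1])]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
            cols += [X[:, j] ** 2 for j in range(X.shape[1])]
            return np.column_stack(cols)

        beta, *_ = np.linalg.lstsq(quadratic_terms(X), y, rcond=None)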

  11. Multivariate statistical monitoring as applied to clean-in-place (CIP) and steam-in-place (SIP) operations in biopharmaceutical manufacturing.

    PubMed

    Roy, Kevin; Undey, Cenk; Mistretta, Thomas; Naugle, Gregory; Sodhi, Manbir

    2014-01-01

    Multivariate statistical process monitoring (MSPM) is becoming increasingly utilized to further enhance process monitoring in the biopharmaceutical industry. MSPM can play a critical role when there are many measurements and these measurements are highly correlated, as is typical for many biopharmaceutical operations. Specifically, for processes such as cleaning-in-place (CIP) and steaming-in-place (SIP, also known as sterilization-in-place), control systems typically oversee the execution of the cycles, and verification of the outcome is based on offline assays. These offline assays add to delays and corrective actions may require additional setup times. Moreover, this conventional approach does not take interactive effects of process variables into account and cycle optimization opportunities as well as salient trends in the process may be missed. Therefore, more proactive and holistic online continued verification approaches are desirable. This article demonstrates the application of real-time MSPM to processes such as CIP and SIP with industrial examples. The proposed approach has significant potential for facilitating enhanced continuous verification, improved process understanding, abnormal situation detection, and predictive monitoring, as applied to CIP and SIP operations. © 2014 American Institute of Chemical Engineers.

  12. Selecting Statistical Procedures for Quality Control Planning Based on Risk Management.

    PubMed

    Yago, Martín; Alcover, Silvia

    2016-07-01

According to the traditional approach to statistical QC planning, the performance of QC procedures is assessed in terms of their probability of rejecting an analytical run that contains critical-size errors (PEDC). Recently, the maximum expected increase in the number of unacceptable patient results reported during the presence of an undetected out-of-control error condition [Max E(NUF)] has been proposed as an alternative QC performance measure because it is more related to the current introduction of risk management concepts for QC planning in the clinical laboratory. We used a statistical model to investigate the relationship between PEDC and Max E(NUF) for simple QC procedures widely used in clinical laboratories, and to construct charts relating Max E(NUF) to the capability of the analytical process that allow for QC planning based on the risk of harm to a patient due to the report of erroneous results. A QC procedure shows nearly the same Max E(NUF) value when used for controlling analytical processes with the same capability, and there is a close relationship between PEDC and Max E(NUF) for simple QC procedures; therefore, the value of PEDC can be estimated from the value of Max E(NUF) and vice versa. QC procedures selected for their high PEDC value are also characterized by a low value of Max E(NUF). The PEDC value can be used for estimating the probability of patient harm, allowing for the selection of appropriate QC procedures in QC planning based on risk management. © 2016 American Association for Clinical Chemistry.
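    For a concrete sense of PEDC, the sketch below computes the probability that a simple 1_3s rule (reject the run if a single control result exceeds +/-3 SD) rejects a run containing a systematic error of delta SDs; this is the standard textbook computation, not the authors' Max E(NUF) model:

        # P(rejection | systematic shift delta) for the 1_3s rule with one
        # control measurement per run, assuming Gaussian measurement error.
        from math import erf, sqrt

        def phi(z):
            # standard normal CDF
            return 0.5 * (1 + erf(z / sqrt(2)))

        def p_rejection_1_3s(delta):
            return (1 - phi(3 - delta)) + phi(-3 - delta)

        for delta in (0.0, 2.0, 4.0):   # in-control, moderate, large error
            print(delta, round(p_rejection_1_3s(delta), 4))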

  13. Six-sigma application in tire-manufacturing company: a case study

    NASA Astrophysics Data System (ADS)

    Gupta, Vikash; Jain, Rahul; Meena, M. L.; Dangayach, G. S.

    2017-09-01

Globalization, the advancement of technologies, and increasing customer demands have changed the way companies do business. To overcome these barriers, the six-sigma define-measure-analyze-improve-control (DMAIC) method is most popular and useful. This method helps to trim down waste and generate potential ways of improvement in process as well as service industries. In the current research, the DMAIC method was used for decreasing the process variations of bead splice that were causing wastage of material. This six-sigma DMAIC research was initiated by problem identification through the voice of the customer in the define step. The subsequent step consisted of gathering the specification data of the existing tire bead. This step was followed by the analysis and improvement steps, where six-sigma quality tools such as the cause-effect diagram, statistical process control, and substantial analysis of the existing system were implemented for root cause identification and reduction in process variation. Process control charts were used for systematic observation and control of the process. Utilizing the DMAIC methodology, the standard deviation was decreased from 2.17 to 1.69. The process capability index (Cp) value was enhanced from 1.65 to 2.95 and the process performance capability index (Cpk) value was enhanced from 0.94 to 2.66. A DMAIC methodology was established that can play a key role in reducing defects in the tire-manufacturing process in India.
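    A minimal sketch of the individuals (Shewhart) control chart used for such systematic observation, with limits set from the average moving range; the readings are hypothetical, not the bead-splice data:

        # Individuals chart: center line +/- 2.66 * mean moving range,
        # where 2.66 = 3/d2 with d2 = 1.128 for a moving range of two.
        data = [10.2, 9.8, 10.5, 10.1, 9.7, 10.3, 11.9, 10.0]

        mr = [abs(a - b) for a, b in zip(data[1:], data[:-1])]  # moving ranges
        mrbar = sum(mr) / len(mr)
        center = sum(data) / len(data)
        ucl, lcl = center + 2.66 * mrbar, center - 2.66 * mrbar

        out = [(i, x) for i, x in enumerate(data) if not (lcl <= x <= ucl)]
        print(f"limits: [{lcl:.2f}, {ucl:.2f}], out-of-control points: {out}")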

  14. An experimental study of factors affecting the selective inhibition of sintering process

    NASA Astrophysics Data System (ADS)

    Asiabanpour, Bahram

Selective Inhibition of Sintering (SIS) is a new rapid prototyping method that builds parts on a layer-by-layer fabrication basis. SIS works by joining powder particles through sintering in the part's body, and by inhibiting sintering in selected powder areas. The objective of this research has been to improve the new SIS process, which was invented at USC. The process improvement is based on statistical design of experiments. Conducting the needed experiments required a working machine and related path-generator software. The machine and its control software were made available prior to this research; the path-generator algorithms and software had to be created. This program obtains model geometry data from a CAD file and generates an appropriate path file for the printer nozzle. It also generates a simulation file for path-file inspection using virtual prototyping. The activities related to the path generator constitute the first part of this research, which resulted in an efficient path generator. In addition, to reach an acceptable level of accuracy, strength, and surface quality in the fabricated parts, all effective factors in the SIS process should be identified and controlled. Simultaneous analytical and experimental studies were conducted to recognize effective factors and to control the SIS process. Polystyrene was found to be the most appropriate polymer powder and saturated potassium iodide the most effective inhibitor among the available candidate materials. Statistical tools were then applied to improve the desirable properties of the parts fabricated by the SIS process. An investigation of part strength was conducted using Response Surface Methodology (RSM), and a region of acceptable operating conditions for part strength was found. Then, through analysis of the experimental results, the impact of the factors on final part surface quality and dimensional accuracy was modeled. After developing a desirability function model, process operating conditions for maximum desirability were identified. Finally, the desirability model was validated.
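    A small sketch of a Derringer-Suich-style desirability model like the one mentioned above: each response is mapped onto [0, 1] and the overall desirability is the geometric mean. The response names, limits and targets are hypothetical placeholders:

        # Larger-is-better desirability for each response, combined by a
        # geometric mean; maximizing D locates the best operating conditions.
        def d_larger_is_better(y, low, target, weight=1.0):
            if y <= low:
                return 0.0
            if y >= target:
                return 1.0
            return ((y - low) / (target - low)) ** weight

        def overall_desirability(ds):
            prod = 1.0
            for d in ds:
                prod *= d
            return prod ** (1.0 / len(ds))

        ds = [d_larger_is_better(22.0, 10.0, 30.0),   # hypothetical part strength (MPa)
              d_larger_is_better(0.8, 0.5, 1.0)]      # hypothetical surface-quality score
        print(round(overall_desirability(ds), 3))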

  15. Statistical characteristics of excess fiber length in loose tubes of optical cable

    NASA Astrophysics Data System (ADS)

    Andreev, Vladimir A.; Gavryushin, Sergey A.; Popov, Boris V.; Popov, Victor B.; Vazhdaev, Michael A.

    2017-04-01

This paper presents an analysis of measurement data on excess fiber length in the loose tubes of optical cable, gathered during post-process quality control of finished products. Estimates of the numerical characteristics of excess fiber length were determined using the method of processing results of direct multiple equally accurate measurements. The experimental results indicate that the excess fiber length remains constant as long as the loose-tube manufacturing technology is unchanged.

  16. United States Air Force Summer Faculty Research Program (1987). Program Technical Report. Volume 2.

    DTIC Science & Technology

    1987-12-01

the area of statistical inference, distribution theory and stochastic processes. I have taught courses in random processes and sample functions...controlled phase separation of isotropic, binary mixtures, the theory of spinodal decomposition has been developed by Cahn and Hilliard [5,6]. This theory is...peak and its initial rate of growth at a given temperature are predicted by the spinodal theory. The angle of maximum intensity is then determined by

  17. Evaluation of teeth loss among workers in the laminate and composite materials department of aircraft factory.

    PubMed

    Bachanek, Teresa; Samborski, Dariusz; Chałas, Renata; Wolańska, Ewa

    2005-01-01

Liquid epoxide resins, solvents and solvent-modified epoxide resins, as well as hardeners for epoxide resins, appear to be skin and mucosa irritants of differing intensity and possibly have allergenic properties. Therefore, it is required that employees are qualified and industrial safety rules are followed when these substances are used in the manufacturing process. Our study evaluated the state of dentition and analysed the loss of teeth in workers of the laminate and composite materials department of an aircraft factory. The research was carried out in a group of 114 workers, consisting of 88 men and 26 women 20 to 61 years old. The control group consisted of 41 workers of the administration department in the aircraft factory who did not have any contact with chemical compounds. The workers in the studied group are characterised by an unsatisfactory state of dentition, as shown by the high rate of lost teeth (74%). When the number of lost molar teeth in women in the studied group is compared with that in the control group, statistically significant differences appear for teeth 46 and 27. The same statistically significant correlation between men in the control and studied groups concerns tooth 16. The research data show that incisor teeth are the least frequently extracted teeth in the whole population studied. Statistically significant differences can be noted for teeth 21 and 23 between the women in the control group and those in the studied one. Future studies are necessary to assess the potential relationship between the loss of teeth among workers of the laminate and composite materials department of an aircraft factory and their workplace.

  18. Emotion, working memory, and cognitive control in patients with first-onset and previously untreated minor depressive disorders.

    PubMed

    Li, Mi; Lu, Shengfu; Wang, Gang; Feng, Lei; Fu, Bingbing; Zhong, Ning

    2016-06-01

To explore working memory and the ability to process different emotional stimuli in patients with first-onset and untreated minor (mild or moderate) depression. Patients with first-onset and previously untreated minor depression, and healthy controls, were enrolled. Using a modified Sternberg working memory paradigm to investigate the combined effects of emotional stimuli with working memory, participants were exposed to experimental stimuli comprising pictures that represented positive, neutral and negative emotions. Working memory ability was measured using reaction time and accuracy, and emotion-processing ability was measured using pupil diameter. Out of 36 participants (18 patients with minor depression and 18 controls), there were no statistically significant between-group differences in response time and accuracy. Positive stimuli evoked changes in pupil diameter that were significantly smaller in patients with minor depression versus controls, but changes in pupil diameter evoked by negative stimuli were not significantly different between the two groups. Healthy subjects showed a stronger emotional response to positive emotional stimuli than patients with first-onset and previously untreated minor depression, but there were no differences in response to negative emotions. There were no statistically significant between-group differences in terms of speed of cognitive response, but this may have been due to the relatively small sample sizes assessed. Studies with larger sample populations are required to further investigate these results. © The Author(s) 2016.

  19. Distinct contributions of attention and working memory to visual statistical learning and ensemble processing.

    PubMed

    Hall, Michelle G; Mattingley, Jason B; Dux, Paul E

    2015-08-01

The brain exploits redundancies in the environment to efficiently represent the complexity of the visual world. One example of this is ensemble processing, which provides a statistical summary of elements within a set (e.g., mean size). Another is statistical learning, which involves the encoding of stable spatial or temporal relationships between objects. It has been suggested that ensemble processing over arrays of oriented lines disrupts statistical learning of structure within the arrays (Zhao, Ngo, McKendrick, & Turk-Browne, 2011). Here we asked whether ensemble processing and statistical learning are mutually incompatible, or whether this disruption might occur because ensemble processing encourages participants to process the stimulus arrays in a way that impedes statistical learning. In Experiment 1, we replicated Zhao and colleagues' finding that ensemble processing disrupts statistical learning. In Experiments 2 and 3, we found that statistical learning was unimpaired by ensemble processing when task demands necessitated (a) focal attention to individual items within the stimulus arrays and (b) the retention of individual items in working memory. Together, these results are consistent with an account suggesting that ensemble processing and statistical learning can operate over the same stimuli given appropriate stimulus processing demands during exposure to regularities. (c) 2015 APA, all rights reserved.

  20. Assessing Language Dominance with Functional MRI: The Role of Control Tasks and Statistical Analysis

    ERIC Educational Resources Information Center

    Dodoo-Schittko, Frank; Rosengarth, Katharina; Doenitz, Christian; Greenlee, Mark W.

    2012-01-01

    There is a discrepancy between the brain regions revealed by functional neuroimaging techniques and those brain regions where a loss of function, either by lesion or by electrocortical stimulation, induces language disorders. To differentiate between essential and non-essential language-related processes, we investigated the effects of linguistic…

  1. SPC-Prep 1. Participant's Manual. Workplace Education. Project ALERT.

    ERIC Educational Resources Information Center

    Ruetz, Nancy

This is the participant's manual companion to the instructor's guide for a course designed to prepare employees for statistical process control (SPC) training given at their workplace by refreshing math skills and building the concepts and vocabulary necessary to understand SPC in manufacturing environments. SPC-Prep 1 addresses the math skills necessary to perform…

  2. Incorporating E-Mail into the Learning Process: Its Impact on Student Academic Achievement and Attitudes.

    ERIC Educational Resources Information Center

    Yu, Fu-Yun; Yu, Hsin-Jin Jessy

    2002-01-01

    Describes a study of Taiwan university students that investigated the impacts of incorporating email into the classroom on student achievement and attitudes using a posttest-only control-group design. Results showed a statistically significant difference in academic performance but not in student attitudes toward computers. (Author/LRW)

  3. Total Quality Management and Organizational Behavior Management: An Integration for Continual Improvement.

    ERIC Educational Resources Information Center

    Mawhinney, Thomas C.

    1992-01-01

    The history and main features of organizational behavior management (OBM) are compared and integrated with those of total quality management (TQM), with emphasis on W.E. Deming's 14 points and OBM's operant-based approach to performance management. Interventions combining OBM, TQM, and statistical process control are recommended. (DB)

  4. Outlier Detection in High-Stakes Certification Testing. Research Report.

    ERIC Educational Resources Information Center

    Meijer, Rob R.

    Recent developments of person-fit analysis in computerized adaptive testing (CAT) are discussed. Methods from statistical process control are presented that have been proposed to classify an item score pattern as fitting or misfitting the underlying item response theory (IRT) model in a CAT. Most person-fit research in CAT is restricted to…

  5. Outlier Detection in High-Stakes Certification Testing.

    ERIC Educational Resources Information Center

    Meijer, Rob R.

    2002-01-01

    Used empirical data from a certification test to study methods from statistical process control that have been proposed to classify an item score pattern as fitting or misfitting the underlying item response theory model in computerized adaptive testing. Results for 1,392 examinees show that different types of misfit can be distinguished. (SLD)

  6. 45 CFR 2508.4 - When can Corporation records be disclosed?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... record will be used solely as a statistical research or reporting record, and the record is to be... agency or to an instrumentality of any governmental jurisdiction within or under the control of the... 1201.3, and provided that if any such record is disclosed under such compulsory legal process and...

  7. 76 FR 57958 - Fisheries of the South Atlantic and Gulf of Mexico; South Atlantic Fishery Management Council...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-19

    ...) will hold meetings of its Scientific and Statistical Committee (SSC) and Socio-Economic Sub-Panel (SEP) to review fishery management plan (FMP) amendments under development, ABC control rule approaches... process. The SSC will discuss FMP amendments under development, assessments of black sea bass and tilefish...

  8. 4 CFR 200.10 - Disclosure of records to third parties.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... as a statistical research or reporting record and that the record is to be transferred in a form that... governmental jurisdiction within or under the control of the United States for a civil or criminal law... jurisdiction. In the event that any record is disclosed under such compulsory legal process, the Board shall...

  9. Training Factory Workers: Three Case Studies. Contractor Report.

    ERIC Educational Resources Information Center

    Hirschhorn, Larry D.

    Case studies examined the context and impact of training in three factories: a bakery, a circuit assembly plant, and a plant that produces microchips. Cookie-Foods, Inc. used Statistical Process Control (SPC) and a course on problem solving to increase the operators' productivity. Impact of the SPC program was limited, because workers who…

  10. Fieldcrest Cannon Workplace Literacy Modules for Yardage Binder Operators. Alabama Partnership for Training.

    ERIC Educational Resources Information Center

    Alabama State Dept. of Education, Montgomery.

    This packet contains eight learning modules developed for use in Fieldcrest Cannon workplace literacy classes for yardage binder operators. The modules cover the following topics: (1) communications 1 and 2; (2) safety; (3) fractions; (4) statistical process control; (5) measurement; (6) calculator; (7) benefits; and (8) computer. Modules consist…

  11. Optimal filtering and Bayesian detection for friction-based diagnostics in machines.

    PubMed

    Ray, L R; Townsend, J R; Ramasubramanian, A

    2001-01-01

Non-model-based diagnostic methods typically rely on measured signals that must be empirically related to process behavior or incipient faults. The difficulty in interpreting a signal that is indirectly related to the fundamental process behavior is significant. This paper presents an integrated non-model and model-based approach to detecting when process behavior varies from a proposed model. The method, which is based on nonlinear filtering combined with maximum likelihood hypothesis testing, is applicable to dynamic systems whose constitutive model is well known, and whose process inputs are poorly known. Here, the method is applied to friction estimation and diagnosis during motion control in a rotating machine. A nonlinear observer estimates friction torque in a machine from shaft angular position measurements and the known input voltage to the motor. The resulting friction torque estimate can be analyzed directly for statistical abnormalities, or it can be directly compared to friction torque outputs of an applicable friction process model in order to diagnose faults or model variations. Nonlinear estimation of friction torque provides a variable on which to apply diagnostic methods that is directly related to model variations or faults. The method is evaluated experimentally by its ability to detect normal load variations in a closed-loop-controlled, motor-driven inertia with bearing friction and an artificially induced external line contact. Results show an ability to detect statistically significant changes in friction characteristics induced by normal load variations over a wide range of underlying friction behaviors.
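    A minimal sketch of the maximum-likelihood hypothesis-testing layer, assuming Gaussian observer residuals with known variance and a known mean shift under the fault hypothesis (both simplifying assumptions for illustration):

        # Gaussian log-likelihood ratio between H1: residuals ~ N(m1, s^2)
        # (load variation) and H0: residuals ~ N(0, s^2) (nominal friction).
        from math import log

        def log_likelihood_ratio(residuals, m1, sigma):
            # sum_t [log N(r_t; m1, s^2) - log N(r_t; 0, s^2)]
            return sum((m1 * r - m1 * m1 / 2) / sigma ** 2 for r in residuals)

        residuals = [0.12, 0.18, 0.09, 0.21, 0.15]   # hypothetical torque residuals (N*m)
        llr = log_likelihood_ratio(residuals, m1=0.15, sigma=0.05)
        print("load variation detected" if llr > log(100) else "nominal friction")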

  12. Primary Health Care and tuberculosis: services evaluation.

    PubMed

    Wysocki, Anneliese Domingues; Ponce, Maria Amélia Zanon; Brunello, Maria Eugênia Firmino; Beraldo, Aline Ale; Vendramini, Silvia Helena Figueiredo; Scatena, Lúcia Marina; Ruffino, Antonio; Villa, Tereza Cristina Scatena

    2017-01-01

In order to control tuberculosis, the Brazilian Ministry of Health recommends the decentralization of control actions to Primary Health Care, yet there are few studies on the performance of the Tuberculosis Control Program in decentralized contexts. To evaluate the performance of Primary Health Care services in tuberculosis treatment. This is an evaluative study with a cross-sectional approach conducted in 2011. Two hundred and thirty-nine health professionals from Primary Health Care units were interviewed using a structured instrument based on the evaluation framework for health service quality (structure - process - results). The performance of these services was analyzed by applying techniques of descriptive statistics, validation, and construction of indicators, and by determining the reduced variable "Z". The indicators "participation of professionals in tuberculosis patients' care" (structure) and "reference and counter-reference" (process) had the best evaluations, whereas "professional training" (structure) and "external actions for tuberculosis control" (process) had the worst results. The decentralization of tuberculosis control actions has been taking place in a vertical manner in Primary Health Care. The challenge of controlling tuberculosis involves overcoming constraints related to engagement, training, and turnover rates among health professionals, as well as coordination between services and monitoring of control actions in Primary Health Care.

  13. Application of the statistical process control method for prospective patient safety monitoring during the learning phase: robotic kidney transplantation with regional hypothermia (IDEAL phase 2a-b).

    PubMed

    Sood, Akshay; Ghani, Khurshid R; Ahlawat, Rajesh; Modi, Pranjal; Abaza, Ronney; Jeong, Wooju; Sammon, Jesse D; Diaz, Mireya; Kher, Vijay; Menon, Mani; Bhandari, Mahendra

    2014-08-01

    Traditional evaluation of the learning curve (LC) of an operation has been retrospective. Furthermore, LC analysis does not permit patient safety monitoring. To prospectively monitor patient safety during the learning phase of robotic kidney transplantation (RKT) and determine when it could be considered learned using the techniques of statistical process control (SPC). From January through May 2013, 41 patients with end-stage renal disease underwent RKT with regional hypothermia at one of two tertiary referral centers adopting RKT. Transplant recipients were classified into three groups based on the robotic training and kidney transplant experience of the surgeons: group 1, robot trained with limited kidney transplant experience (n=7); group 2, robot trained and kidney transplant experienced (n=20); and group 3, kidney transplant experienced with limited robot training (n=14). We employed prospective monitoring using SPC techniques, including cumulative summation (CUSUM) and Shewhart control charts, to perform LC analysis and patient safety monitoring, respectively. Outcomes assessed included post-transplant graft function and measures of surgical process (anastomotic and ischemic times). CUSUM and Shewhart control charts are time trend analytic techniques that allow comparative assessment of outcomes following a new intervention (RKT) relative to those achieved with established techniques (open kidney transplant; target value) in a prospective fashion. CUSUM analysis revealed an initial learning phase for group 3, whereas groups 1 and 2 had no to minimal learning time. The learning phase for group 3 varied depending on the parameter assessed. Shewhart control charts demonstrated no compromise in functional outcomes for groups 1 and 2. Graft function was compromised in one patient in group 3 (p<0.05) secondary to reasons unrelated to RKT. In multivariable analysis, robot training was significantly associated with improved task-completion times (p<0.01). Graft function was not adversely affected by either the lack of robotic training (p=0.22) or kidney transplant experience (p=0.72). The LC and patient safety of a new surgical technique can be assessed prospectively using CUSUM and Shewhart control chart analytic techniques. These methods allow determination of the duration of mentorship and identification of adverse events in a timely manner. A new operation can be considered learned when outcomes achieved with the new intervention are at par with outcomes following established techniques. Statistical process control techniques allowed for robust, objective, and prospective monitoring of robotic kidney transplantation and can similarly be applied to other new interventions during the introduction and adoption phase. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
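    A minimal sketch of the tabular CUSUM used for this kind of prospective monitoring, applied to standardized per-case outcomes; the reference value k, decision interval h and data are illustrative:

        # Upper one-sided CUSUM: accumulate deviations beyond target + k and
        # signal when the sum crosses the decision interval h.
        def cusum_upper(xs, target, k=0.5, h=4.0):
            s, alarms = 0.0, []
            for i, x in enumerate(xs):
                s = max(0.0, s + (x - target) - k)
                if s > h:
                    alarms.append(i)   # case index at which the chart signals
                    s = 0.0            # restart after signalling
            return alarms

        case_scores = [1.2, 0.8, 1.5, 2.0, 1.7, 0.3, -0.2, 0.1]  # standardized times
        print(cusum_upper(case_scores, target=0.0))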

  14. Inferring Master Painters' Esthetic Biases from the Statistics of Portraits

    PubMed Central

    Aleem, Hassan; Correa-Herran, Ivan; Grzywacz, Norberto M.

    2017-01-01

    The Processing Fluency Theory posits that the ease of sensory information processing in the brain facilitates esthetic pleasure. Accordingly, the theory would predict that master painters should display biases toward visual properties such as symmetry, balance, and moderate complexity. Have these biases been occurring, and if so, have painters been optimizing these properties (fluency variables)? Here, we address these questions with statistics of portrait paintings from the Early Renaissance period. To do this, we first developed different computational measures for each of the aforementioned fluency variables. Then, we measured their statistics in 153 portraits from 26 master painters, in 27 photographs of people in three controlled poses, and in 38 quickly snapped photographs of individual persons. A statistical comparison between Early Renaissance portraits and quickly snapped photographs revealed that painters showed a bias toward balance, symmetry, and moderate complexity. However, a comparison between portraits and controlled-pose photographs showed that painters did not optimize each of these properties. Instead, different painters presented biases toward different, narrow ranges of fluency variables. Further analysis suggested that the painters' individuality stemmed in part from having to resolve the tension between complexity on the one hand and symmetry and balance on the other. We additionally found that constraints on the use of different painting materials by distinct painters modulated these fluency variables systematically. In conclusion, the Processing Fluency Theory of Esthetic Pleasure would need expansion if we were to apply it to the history of visual art, since it cannot explain the lack of optimization of each fluency variable. To expand the theory, we propose the existence of a Neuroesthetic Space, which encompasses the possible values that each of the fluency variables can reach in any given art period. We discuss the neural mechanisms of this Space and propose that it has a distributed representation in the human brain. We further propose that different artists reside in different, small sub-regions of the Space. This Neuroesthetic-Space hypothesis raises the question of how painters and their paintings evolve across art periods. PMID:28337133

  15. Influence of the processing route of porcelain/Ti-6Al-4V interfaces on shear bond strength.

    PubMed

    Toptan, Fatih; Alves, Alexandra C; Henriques, Bruno; Souza, Júlio C M; Coelho, Rui; Silva, Filipe S; Rocha, Luís A; Ariza, Edith

    2013-04-01

    This study aims at evaluating the two-fold effect of initial surface conditions and of the dental porcelain-to-Ti-6Al-4V alloy joining route on shear bond strength. Porcelain-to-Ti-6Al-4V samples were processed by conventional furnace firing (porcelain-fused-to-metal) and by hot pressing. Prior to processing, Ti-6Al-4V cylinders were prepared by three different surface treatments: polishing, alumina blasting, or silica blasting. Within the firing process, polished and alumina-blasted samples were subjected to two different cooling rates: air cooling and a slower cooling rate (65°C/min). Metal/porcelain bond strength was evaluated by a shear bond test. The data were analyzed using one-way ANOVA followed by Tukey's test (p<0.05). Before and after the shear bond tests, metallic surfaces and metal/ceramic interfaces were examined by Field Emission Gun Scanning Electron Microscopy (FEG-SEM) equipped with Energy Dispersive X-Ray Spectroscopy (EDS). Shear bond strength values of the porcelain-to-Ti-6Al-4V alloy interfaces ranged from 27.1±8.9MPa for porcelain fused to polished samples up to 134.0±43.4MPa for porcelain fused to alumina-blasted samples. According to the statistical analysis, no significant differences were found in the shear bond strength values for the different cooling rates. The processing method was statistically significant only for the polished samples, and airborne-particle abrasion was statistically significant only for the fired samples. The type of blasting material did not cause a statistically significant difference in the shear bond strength values. The shear bond strength of dental porcelain to Ti-6Al-4V alloys can be significantly improved through controlled surface treatments and processing methods. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Three-Dimensional Kinematic Analysis of Prehension Movements in Young Children with Autism Spectrum Disorder: New Insights on Motor Impairment.

    PubMed

    Campione, Giovanna Cristina; Piazza, Caterina; Villa, Laura; Molteni, Massimo

    2016-06-01

    The study was aimed at better clarifying whether action execution impairment in autism depends mainly on disruptions either in feedforward mechanisms or in feedback-based control processes supporting motor execution. To this purpose, we analyzed prehension movement kinematics in 4- and 5-year-old children with autism and in peers with typical development. Statistical analysis showed that the kinematics of the grasp component was spared in autism, whereas early kinematics of the reach component was atypical. We discussed this evidence as suggesting impairment in the feedforward processes involved in action execution, whereas impairment in feedback-based control processes remained unclear. We proposed that certain motor abilities are available in autism, and children may use them differently as a function of motor context complexity.

  17. Amyotrophic lateral sclerosis: increased solubility of skin collagen

    NASA Technical Reports Server (NTRS)

    Ono, S.; Yamauchi, M.

    1992-01-01

    We studied the solubility of skin collagen from six patients with amyotrophic lateral sclerosis (ALS) and six controls. The amount of collagen extracted with neutral salt solution was significantly greater in patients with ALS than in controls. In addition, there was a statistically significant increase in the proportion of collagen extracted from ALS patients with increased duration of illness. The collagen solubilized by pepsin and cyanogen bromide treatments was significantly higher in ALS patients than in controls, and its proportion was positively and significantly associated with duration of illness in ALS patients. These results indicate that the metabolism of skin collagen may be affected in the disease process of ALS, causing an increase in immature soluble collagen in the tissue, the opposite of what occurs in the normal aging process.

  18. Chemical Structure and Molecular Dimension As Controls on the Inherent Stability of Charcoal in Boreal Forest Soil

    NASA Astrophysics Data System (ADS)

    Hockaday, W. C.; Kane, E. S.; Ohlson, M.; Huang, R.; Von Bargen, J.; Davis, R.

    2014-12-01

  19. Microbial facies distribution and its geological and geochemical controls at the Hanford 300 area

    NASA Astrophysics Data System (ADS)

    Hou, Z.; Nelson, W.; Stegen, J.; Murray, C. J.; Arntzen, E.

    2015-12-01

    Efforts have been made by various scientific disciplines to study hyporheic zones and characterize their associated processes. One way to approach the study of the hyporheic zone is to define facies, which are elements of a (hydrobio)geologic classification scheme that groups components of a complex system with high variability into a manageable set of discrete classes. In this study, we try to classify the hyporheic zone based on its geology, geochemistry, and microbiology, and to understand their interactive influences on the integrated biogeochemical distributions and processes. A number of measurements have been taken for 21 freeze-core samples along the Columbia River bank in the Hanford 300 Area, and unique datasets have been obtained on biomass, pH, number of microbial taxa, percentage of N/C/H/S, microbial activity parameters, as well as microbial community attributes/modules. In order to gain a complete understanding of the geological control on these variables and processes, the explanatory variables are set to include quantitative gravel/sand/mud/silt/clay percentages, statistical moments of grain size distributions, as well as geological (e.g., Folk-Wentworth) and statistical (e.g., hierarchical) clusters. The dominant factors for major microbial and geochemical variables are identified and summarized using exploratory data analysis approaches (e.g., principal component analysis, hierarchical clustering, factor analysis, multivariate analysis of variance). The feasibility of extending the facies definition and its control of microbial and geochemical properties to larger scales is discussed.

  20. Direct evidence for a dual process model of deductive inference.

    PubMed

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-07-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the conditional probability of q given p (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a 2nd experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than in the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  1. Bayesian models based on test statistics for multiple hypothesis testing problems.

    PubMed

    Ji, Yuan; Lu, Yiling; Mills, Gordon B

    2008-04-01

    We propose a Bayesian method for the problem of multiple hypothesis testing that is routinely encountered in bioinformatics research, such as differential gene expression analysis. Our algorithm is based on modeling the distributions of test statistics under both null and alternative hypotheses. We substantially reduce the complexity of the process of defining posterior model probabilities by modeling the test statistics directly instead of modeling the full data. Computationally, we apply a Bayesian FDR approach to control the number of rejections of null hypotheses. To check whether our model assumptions for the test statistics are valid for various bioinformatics experiments, we also propose a simple graphical model-assessment tool. Using extensive simulations, we demonstrate the performance of our models and the utility of the model-assessment tool. Finally, we apply the proposed methodology to an siRNA screening experiment and a gene expression experiment.
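
    The core idea — modeling the observed test statistics as a mixture of a null and an alternative component, then thresholding the posterior null probabilities to control a Bayesian FDR — can be illustrated compactly. The sketch below fits a simple two-component Gaussian mixture by EM with a fixed N(0,1) null; it is a generic illustration of this class of method, not the authors' model:

    ```python
    import numpy as np

    def bayesian_fdr(z, alpha=0.10, n_iter=200):
        """Fit z ~ pi0*N(0,1) + (1-pi0)*N(mu1, s1^2) by EM, then choose the
        largest rejection set whose average posterior null probability
        (the Bayesian FDR) stays below alpha."""
        pi0, mu1, s1 = 0.9, z.mean(), max(z.std(), 1.0)
        for _ in range(n_iter):
            f0 = np.exp(-0.5 * z**2) / np.sqrt(2 * np.pi)
            f1 = np.exp(-0.5 * ((z - mu1) / s1)**2) / (s1 * np.sqrt(2 * np.pi))
            p0 = pi0 * f0 / (pi0 * f0 + (1 - pi0) * f1)  # posterior null prob.
            pi0 = p0.mean()
            w = 1.0 - p0
            mu1 = (w * z).sum() / w.sum()
            s1 = np.sqrt((w * (z - mu1)**2).sum() / w.sum()) + 1e-9
        order = np.argsort(p0)
        running_fdr = np.cumsum(p0[order]) / np.arange(1, z.size + 1)
        k = int((running_fdr <= alpha).sum())
        reject = np.zeros(z.size, dtype=bool)
        reject[order[:k]] = True
        return reject, p0

    # Synthetic statistics: 900 nulls and 100 shifted alternatives.
    rng = np.random.default_rng(0)
    z = np.concatenate([rng.normal(0, 1, 900), rng.normal(3, 1, 100)])
    reject, p0 = bayesian_fdr(z)
    print(reject.sum(), "null hypotheses rejected")
    ```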

  2. Quantum statistics and squeezing for a microwave-driven interacting magnon system.

    PubMed

    Haghshenasfard, Zahra; Cottam, Michael G

    2017-02-01

    Theoretical studies are reported for the statistical properties of a microwave-driven interacting magnon system. Both the magnetic dipole-dipole and the exchange interactions are included, and the theory is developed for the case of parallel pumping, allowing for the inclusion of the nonlinear processes due to the four-magnon interactions. The method of second quantization is used to transform the total Hamiltonian from spin operators to boson creation and annihilation operators. By using the coherent magnon state representation, we have studied the magnon occupation number and the statistical behavior of the system. In particular, it is shown that the nonlinearities introduced by the parallel pumping field and the four-magnon interactions lead to non-classical quantum statistical properties of the system, such as magnon squeezing. Control of the collapse-and-revival phenomena for the time evolution of the average magnon number is also demonstrated by varying the parallel pumping amplitude and the four-magnon coupling.

  3. Evaluation of control over the microbiological contamination of carcasses in a lamb carcass dressing process operated with or without pasteurizing treatment.

    PubMed

    Milios, K; Mataragas, M; Pantouvakis, A; Drosinos, E H; Zoiopoulos, P E

    2011-03-30

    The aim of this study was to quantify the hygienic status of a lamb slaughterhouse by means of multivariate statistical analysis, to demonstrate how the microbiological data could be exploited to improve the lamb slaughter process by constructing control charts, and to evaluate the potential effect of an intervention step such as steam application on the microbiological quality of lamb carcasses. Results showed that pelt removal and evisceration were hygienically uncontrolled. TVC and Enterobacteriaceae counts progressively increased from the stage 'after pelt removal of hind and forelegs/before final pulling' to the stage 'after evisceration/before pluck removal', indicating possible deposition of microorganisms during these operations. The processing stages of freshly produced carcasses were better distinguished by Enterobacteriaceae, with evisceration contributing most to the final Enterobacteriaceae counts. Application of steam during the lamb slaughter process reduced microbial counts without adverse effects on the organoleptic characteristics of the carcasses. Moreover, the control charts showed that decontamination with steam helped maintain an in-control process compared to that before the application of steam, suggesting the potential use of steam as an intervention step during the lamb slaughter process. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Cognitive effort and pupil dilation in controlled and automatic processes.

    PubMed

    Querino, Emanuel; Dos Santos, Lafaiete; Ginani, Giuliano; Nicolau, Eduardo; Miranda, Débora; Romano-Silva, Marco; Malloy-Diniz, Leandro

    2015-01-01

    The Five Digits Test (FDT) is a Stroop paradigm test that aims to evaluate executive functions. It is composed of four parts, two of which are related to automatic and two to controlled processes. It is known that pupillary diameter increases as a task's cognitive demand increases. In the present study, we evaluated whether pupillary diameter could distinguish the cognitive effort of automatic versus controlled cognitive processing during the FDT as the task progressed. As a control task, we used a simple reading paradigm with a visual aspect similar to the FDT. We then divided each of the four parts into two blocks in order to evaluate the differences between the first and second half of the task. Results indicated that, compared to the control task, the FDT required higher cognitive effort for each consecutive part. Moreover, the first half of every part of the FDT induced greater pupil dilation than the second. The differences in pupil dilation during the first half of the four FDT parts were statistically significant between parts 2 and 4 (p=0.023) and between parts 3 and 4 (p=0.006). These results provide further evidence that cognitive effort, as indexed by pupil diameter, can distinguish controlled from automatic processes.

  5. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate the performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse SSI data from separate control hospitals. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
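
    For readers unfamiliar with the EWMA chart used here: each new monthly rate is smoothed into a weighted running average and compared against limits that widen toward an asymptote. A minimal sketch in Python, with hypothetical monthly SSI rates and an assumed in-control mean and standard deviation (real SSI surveillance would typically add risk adjustment):

    ```python
    import numpy as np

    def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
        """EWMA chart: z_t = lam*x_t + (1-lam)*z_{t-1}, started at the target.
        Limits use the exact time-varying EWMA standard deviation."""
        z = np.empty(len(x))
        prev = target
        for t, xt in enumerate(x):
            prev = lam * xt + (1 - lam) * prev
            z[t] = prev
        t = np.arange(1, len(x) + 1)
        half = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        return z, target + half, target - half

    # Hypothetical monthly SSI rates per 100 procedures; assumed in-control
    # mean 3.0 and standard deviation 0.4 estimated from baseline years.
    rates = np.array([3.1, 2.8, 3.4, 2.9, 3.2, 3.0, 4.1, 4.6, 5.0, 5.3])
    z, ucl, lcl = ewma_chart(rates, target=3.0, sigma=0.4)
    print("signal months:", np.flatnonzero((z > ucl) | (z < lcl)))
    ```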

  6. Multivariate statistical analysis of a high rate biofilm process treating kraft mill bleach plant effluent.

    PubMed

    Goode, C; LeRoy, J; Allen, D G

    2007-01-01

    This study reports on a multivariate analysis of the moving bed biofilm reactor (MBBR) wastewater treatment system at a Canadian pulp mill. The modelling approach involved a data overview by principal component analysis (PCA) followed by partial least squares (PLS) modelling with the objective of explaining and predicting changes in the BOD output of the reactor. Over two years of data with 87 process measurements were used to build the models. Variables were collected from the MBBR control scheme as well as upstream in the bleach plant and in digestion. To account for process dynamics, a variable lagging approach was used for variables with significant temporal correlations. It was found that wood type pulped at the mill was a significant variable governing reactor performance. Other important variables included flow parameters, faults in the temperature or pH control of the reactor, and some potential indirect indicators of biomass activity (residual nitrogen and pH out). The most predictive model was found to have an RMSEP value of 606 kgBOD/d, representing a 14.5% average error. This was a good fit, given the measurement error of the BOD test. Overall, the statistical approach was effective in describing and predicting MBBR treatment performance.
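
    The PLS-with-lagged-variables approach described here can be sketched with standard tools. The snippet below uses scikit-learn's PLSRegression on synthetic data standing in for the process measurements and BOD output; the lag structure, component count and train/test split are illustrative assumptions, not the authors' model:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.metrics import mean_squared_error

    rng = np.random.default_rng(1)
    n = 400
    X = rng.normal(size=(n, 5))            # stand-in process measurements
    y = np.empty(n)                        # stand-in for BOD out of the reactor
    y[0] = 0.0
    y[1:] = 2 * X[:-1, 0] - X[1:, 2] + rng.normal(scale=0.5, size=n - 1)

    def add_lags(X, y, lags):
        """Append lagged copies of every column so the regression can capture
        process dynamics; the first max(lags) rows are dropped."""
        m = max(lags)
        parts = [X[m:]] + [X[m - k:len(X) - k] for k in lags]
        return np.hstack(parts), y[m:]

    Xl, yl = add_lags(X, y, lags=[1, 2])
    split = int(0.8 * len(yl))             # time-ordered split, no shuffling
    pls = PLSRegression(n_components=3).fit(Xl[:split], yl[:split])
    pred = pls.predict(Xl[split:]).ravel()
    rmsep = np.sqrt(mean_squared_error(yl[split:], pred))
    print(f"RMSEP: {rmsep:.3f}")
    ```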

  7. Listening to Sentences in Noise: Revealing Binaural Hearing Challenges in Patients with Schizophrenia.

    PubMed

    Abdul Wahab, Noor Alaudin; Zakaria, Mohd Normani; Abdul Rahman, Abdul Hamid; Sidek, Dinsuhaimi; Wahab, Suzaily

    2017-11-01

    The present case-control study investigates binaural hearing performance in schizophrenia patients with sentences presented in quiet and in noise. Participants were twenty-one healthy controls and sixteen schizophrenia patients with normal peripheral auditory functions. Binaural hearing was examined in four listening conditions using the Malay version of the hearing in noise test. Syntactically and semantically correct sentences were presented via headphones to the randomly selected subjects. In each condition, the adaptively obtained reception thresholds for speech (RTS) were used to determine the RTS noise composite and the spatial release from masking. Schizophrenia patients demonstrated a significantly higher mean RTS value relative to healthy controls (p=0.018). The large effect sizes found in three listening conditions, i.e., in quiet (d=1.07), noise right (d=0.88) and noise composite (d=0.90), indicate statistically significant differences between the groups. However, the noise front and noise left conditions showed medium (d=0.61) and small (d=0.50) effect sizes, respectively. No statistical difference between groups was noted with regard to spatial release from masking on the right (p=0.305) or left (p=0.970) ear. The present findings suggest abnormal unilateral auditory processing in the central auditory pathway in schizophrenia patients. Future studies exploring the role of binaural and spatial auditory processing are recommended.

  8. Optimization strategies based on sequential quadratic programming applied for a fermentation process for butanol production.

    PubMed

    Pinto Mariano, Adriano; Bastos Borba Costa, Caliane; de Franceschi de Angelis, Dejanira; Maugeri Filho, Francisco; Pires Atala, Daniel Ibraim; Wolf Maciel, Maria Regina; Maciel Filho, Rubens

    2009-11-01

    In this work, the mathematical optimization of a continuous flash fermentation process for the production of biobutanol was studied. The process consists of three interconnected units: fermentor, cell-retention system (tangential microfiltration), and vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The objective of the optimization was to maximize butanol productivity for a desired substrate conversion. Two strategies were compared for the optimization of the process. In one, the process was represented by a deterministic model with kinetic parameters determined experimentally; in the other, by a statistical model obtained using the factorial design technique combined with simulation. For both strategies, the problem was written as a nonlinear programming problem and was solved with the sequential quadratic programming technique. Both strategies produced very similar solutions; however, the strategy using the deterministic model suffered from problems such as lack of convergence and high computational time. The optimization strategy with the statistical model proved robust and fast, making it more suitable for the flash fermentation process and recommended for real-time applications coupling optimization and control.

  9. Argon-oxygen atmospheric pressure plasma treatment on carbon fiber reinforced polymer for improved bonding

    NASA Astrophysics Data System (ADS)

    Chartosias, Marios

    Acceptance of Carbon Fiber Reinforced Polymer (CFRP) structures requires a robust surface preparation method with improved process controls capable of ensuring high bond quality. Surface preparation in a production clean room environment prior to applying adhesive for bonding would minimize the risk of contamination and reduce cost. Plasma treatment is a robust surface preparation process capable of being applied in a production clean room environment, with process parameters that are easily controlled and documented. Repeatable and consistent processing is enabled through the development of a process parameter window using techniques such as Design of Experiments (DOE) tailored to specific adhesive and substrate bonding applications. Insight from the respective plasma treatment Original Equipment Manufacturers (OEMs) and screening tests distinguished critical process factors from non-factors and set the associated factor levels prior to execution of the DOE. Results from mode I Double Cantilever Beam (DCB) testing per the ASTM D 5528 [1] standard and DOE statistical analysis software are used to produce a regression model and determine appropriate optimum settings for each factor.

  10. Statistical relations of salt and selenium loads to geospatial characteristics of corresponding subbasins of the Colorado and Gunnison Rivers in Colorado

    USGS Publications Warehouse

    Leib, Kenneth J.; Linard, Joshua I.; Williams, Cory A.

    2012-01-01

    Elevated loads of salt and selenium can impair the quality of water for both anthropogenic and natural uses. Understanding the environmental processes controlling how salt and selenium are introduced to streams is critical to managing and mitigating the effects of elevated loads. Dominant relations between salt and selenium loads and environmental characteristics can be established by using geospatial data. The U.S. Geological Survey, in cooperation with the Bureau of Reclamation, investigated statistical relations between seasonal salt or selenium loads emanating from the Upper Colorado River Basin and geospatial data. Salt and selenium loads measured during the irrigation and nonirrigation seasons were related to geospatial variables for 168 subbasins within the Gunnison and Colorado River Basins. These geospatial variables represented subbasin characteristics of the physical environment, precipitation, geology, land use, and the irrigation network. All subbasin variables with units of area had statistically significant relations with load. The few variables that were not in units of area but were statistically significant helped to identify types of geospatial data that might influence salt and selenium loading. Following a stepwise approach, combinations of these statistically significant variables were used to develop multiple linear regression models. The models can be used to help prioritize areas where salt and selenium control projects might be most effective.

  11. Statistical process control of mortality series in the Australian and New Zealand Intensive Care Society (ANZICS) adult patient database: implications of the data generating process

    PubMed Central

    2013-01-01

    Background: Statistical process control (SPC), an initiative originating in industry, has recently been applied in health care and public health surveillance. SPC methods assume independent observations, and process autocorrelation has been associated with an increase in false-alarm frequency. Methods: Monthly mean raw mortality (at hospital discharge) time series, 1995–2009, at the individual Intensive Care Unit (ICU) level, were generated from the Australia and New Zealand Intensive Care Society adult patient database. Evidence for series (i) autocorrelation and seasonality was demonstrated using (partial)-autocorrelation ((P)ACF) function displays and classical series decomposition, and (ii) “in-control” status was sought using risk-adjusted (RA) exponentially weighted moving average (EWMA) control limits (3 sigma). Risk adjustment was achieved using a random coefficient (intercept as ICU site and slope as APACHE III score) logistic regression model, generating an expected mortality series. Application of time-series methods to an exemplar complete ICU series (1995-(end)2009) was via Box-Jenkins methodology: autoregressive moving average (ARMA) and (G)ARCH ((Generalised) Autoregressive Conditional Heteroscedasticity) models, the latter addressing volatility of the series variance. Results: The overall data set, 1995-2009, consisted of 491324 records from 137 ICU sites; average raw mortality was 14.07%; average (SD) raw and expected mortalities ranged from 0.012 (0.113) and 0.013 (0.045) to 0.296 (0.457) and 0.278 (0.247), respectively. For the raw mortality series: 71 sites had continuous data for assessment up to or beyond lag40, and 35% had autocorrelation through to lag40; of 36 sites with continuous data for ≥ 72 months, all demonstrated marked seasonality. Similar numbers and percentages were seen with the expected series. Out-of-control signalling was evident for the raw mortality series with respect to RA-EWMA control limits; a seasonal ARMA model, with GARCH effects, displayed white-noise residuals which were in-control with respect to EWMA control limits and one-step prediction error limits (3SE). The expected series was modelled with a multiplicative seasonal autoregressive model. Conclusions: The data generating process of monthly raw mortality series at the ICU level displayed autocorrelation, seasonality and volatility. False-positive signalling of the raw mortality series was evident with respect to RA-EWMA control limits. A time series approach using residual control charts resolved these issues. PMID:23705957
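
    The residual-control-chart remedy described in the conclusions can be illustrated simply: fit a time-series model to the autocorrelated series, then apply conventional limits to the approximately independent residuals. The sketch below uses a plain ARMA(1,1) fit from statsmodels on simulated autocorrelated mortality proportions; the seasonal and GARCH components of the authors' full model are omitted for brevity:

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    # Simulated autocorrelated monthly mortality proportions (AR(1) around 0.14).
    rng = np.random.default_rng(2)
    y = np.empty(120)
    y[0] = 0.14
    for t in range(1, 120):
        y[t] = 0.14 + 0.6 * (y[t - 1] - 0.14) + rng.normal(scale=0.01)

    fit = ARIMA(y, order=(1, 0, 1)).fit()   # seasonal and GARCH terms omitted
    resid = fit.resid                       # approximately independent residuals
    sigma = resid.std(ddof=1)
    out = np.flatnonzero(np.abs(resid - resid.mean()) > 3 * sigma)
    print("months signalling out-of-control:", out)
    ```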

  12. Ground facility for information reception, processing, dissemination and scientific instruments management setup in the CORONAS-PHOTON space project

    NASA Astrophysics Data System (ADS)

    Buslov, A. S.; Kotov, Yu. D.; Yurov, V. N.; Bessonov, M. V.; Kalmykov, P. A.; Oreshnikov, E. M.; Alimov, A. M.; Tumanov, A. V.; Zhuchkova, E. A.

    2011-06-01

    This paper describes the organizational structure for ground-based reception, processing, and dissemination of scientific information, created by the Astrophysics Institute of the Scientific Research Nuclear University, Moscow Engineering Physics Institute. The hardware structure and software features are described. The principles for forming sets of control commands for scientific equipment (SE) devices are given, and statistics are presented on the operation of the facility during flight tests of the spacecraft (SC) over the course of one year.

  13. Design and Production of Color Calibration Targets for Digital Input Devices

    DTIC Science & Technology

    2000-07-01

    ...gamuts. Fourth, the color transform from CIELCH to sRGB will be described. Fifth, the relevant target mockups will be created. Sixth, the quality will be... Implement statistical process controls. Print, process and measure; reject. Transfer the measured CIEXYZ of the target patches to sRGB. Generate... Kodak Royal VII paper and sRGB. This plot shows all points on the a*-b* plane without information about the L*. The sRGB color gamut is obtained from...

  14. No association of SORL1 SNPs with Alzheimer's disease.

    PubMed

    Minster, Ryan L; DeKosky, Steven T; Kamboh, M Ilyas

    2008-08-01

    SORL1 is an element of the amyloid precursor protein processing pathway and is therefore a good candidate for affecting Alzheimer's disease (AD) risk. Indeed, there have been reports of associations between variation in SORL1 and AD risk. We examined six statistically significant single-nucleotide polymorphisms from the initial observation in a large Caucasian American case-control cohort (1000 late-onset AD [LOAD] cases and 1000 older controls). Analysis of allele, genotype and haplotype frequencies revealed no association with LOAD risk in our cohort.

  15. An approach to quality and performance control in a computer-assisted clinical chemistry laboratory.

    PubMed Central

    Undrill, P E; Frazer, S C

    1979-01-01

    A locally developed, computer-based clinical chemistry laboratory system has been in operation since 1970. This utilises a Digital Equipment Co Ltd PDP 12 and an interconnected PDP 8/F computer. Details are presented of the performance and quality control techniques incorporated into the system. Laboratory performance is assessed through analysis of results from fixed-level control sera as well as from cumulative sum methods. At a simple level the presentation may be considered purely indicative, while at a more sophisticated level statistical concepts have been introduced to aid the laboratory controller in decision-making processes. PMID:438340

  16. A method for evaluating treatment quality using in vivo EPID dosimetry and statistical process control in radiation therapy.

    PubMed

    Fuangrod, Todsaporn; Greer, Peter B; Simpson, John; Zwan, Benjamin J; Middleton, Richard H

    2017-03-13

    Purpose: Due to increasing complexity, modern radiotherapy techniques require comprehensive quality assurance (QA) programmes that to date generally focus on the pre-treatment stage. The purpose of this paper is to provide a method for QA evaluation of individual patient treatments and identification of a "quality gap" for continuous quality improvement. Design/methodology/approach: Statistical process control (SPC) was applied to evaluate treatment delivery using in vivo electronic portal imaging device (EPID) dosimetry. A moving range control chart was constructed to monitor individual patient treatment performance based on a control limit generated from initial data of 90 intensity-modulated radiotherapy (IMRT) and ten volumetric-modulated arc therapy (VMAT) patient deliveries. A process capability index was used to evaluate continuing treatment quality based on three quality classes: treatment type-specific, treatment linac-specific, and body site-specific. Findings: The determined control limits were 62.5 and 70.0 per cent of the χ pass-rate for IMRT and VMAT deliveries, respectively. In total, 14 patients were selected for a pilot study, the results of which showed that about 1 per cent of all treatments contained errors relating to unexpected anatomical changes between treatment fractions. Both rectum and pelvis cancer treatments demonstrated process capability indices less than 1, indicating the potential for quality improvement, and hence may benefit from further assessment. Research limitations/implications: The study relied on the application of in vivo EPID dosimetry for patients treated at the specific centre, and the sample used to generate the control limits was limited to 100 patients. Whilst the quantitative results are specific to the clinical techniques and equipment used, the described method is generally applicable to IMRT and VMAT treatment QA. Whilst more work is required to determine the level of clinical significance, the authors have demonstrated the capability of the method for both treatment-specific QA and continuing quality improvement. Practical implications: The proposed method is a valuable tool for assessing the accuracy of treatment delivery whilst also improving treatment quality and patient safety. Originality/value: Assessing in vivo EPID dosimetry with SPC can be used to improve the quality of radiation treatment for cancer patients.
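
    The moving-range control limit and process capability index referred to above can be computed from per-fraction pass-rates in a few lines. A sketch under stated assumptions — hypothetical χ pass-rates, the standard d2 = 1.128 constant for moving ranges of two, and a one-sided capability index against a lower specification limit:

    ```python
    import numpy as np

    def xmr_limits(x):
        """Individuals chart limits from the average moving range;
        1.128 is the d2 constant for moving ranges of size two."""
        sigma_hat = np.abs(np.diff(x)).mean() / 1.128
        return x.mean() - 3 * sigma_hat, x.mean(), x.mean() + 3 * sigma_hat

    def cpl(x, lsl):
        """One-sided capability index against a lower specification limit;
        values below 1 flag a process that merits further assessment."""
        return (x.mean() - lsl) / (3 * x.std(ddof=1))

    # Hypothetical per-fraction chi pass-rates (%) for one patient's course.
    rates = np.array([91.2, 88.5, 93.0, 90.1, 87.4, 92.3, 89.8, 90.6])
    print("XmR limits:", xmr_limits(rates))
    print("Cpl vs a 70% lower limit:", round(cpl(rates, lsl=70.0), 2))
    ```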

  17. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via nonlinear Markov filtering theory

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.

  18. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.

  19. Electrophysiological Measures of Resting State Functional Connectivity and Their Relationship with Working Memory Capacity in Childhood

    ERIC Educational Resources Information Center

    Barnes, Jessica J.; Woolrich, Mark W.; Baker, Kate; Colclough, Giles L.; Astle, Duncan E.

    2016-01-01

    Functional connectivity is the statistical association of neuronal activity time courses across distinct brain regions, supporting specific cognitive processes. This coordination of activity is likely to be highly important for complex aspects of cognition, such as the communication of fluctuating task goals from higher-order control regions to…

  20. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). Safety Section: Modules 1-3. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    These three modules, which were developed for use by instructors in a manufacturing firm's advanced technical preparation program, contain the materials required to present the safety section of the plant's adult-oriented, job-specific competency-based training program. The 3 modules contain 12 lessons on the following topics: lockout/tagout…

  1. Reducing Check-in Errors at Brigham Young University through Statistical Process Control

    ERIC Educational Resources Information Center

    Spackman, N. Andrew

    2005-01-01

    The relationship between the library and its patrons is damaged and the library's reputation suffers when returned items are not checked in. An informal survey reveals librarians' concern for this problem and their efforts to combat it, although few libraries collect objective measurements of errors or the effects of improvement efforts. Brigham…

  2. Exploring the Use of Statistical Process Control Methods to Assess Course Changes

    ERIC Educational Resources Information Center

    Vollstedt, Ann-Marie

    2010-01-01

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to…

  3. The Department of Defense Very High Speed Integrated Circuit (VHSIC) Technology Availability Program Plan for the Committees on Armed Services United States Congress.

    DTIC Science & Technology

    1986-06-30

    features of computer-aided design systems and statistical quality control procedures that are generic to chip sets and processes. RADIATION HARDNESS - The... System; PSP: Programmable Signal Processor; SSI: Small Scale Integration; TOW: Tube-Launched, Optically Tracked, Wire-Guided; TTL: Transistor-Transistor Logic

  4. Models and Muddles in Human Ecology: An Examination of High School Crime Rates. Report No. 255.

    ERIC Educational Resources Information Center

    Gottfredson, Gary D.

    Recent research in the human ecological tradition has made increasing use of causal modeling in the search for understanding of aggregate-level social processes. This approach has great appeal because it helps make hypotheses explicit, provides a convenient way to structure the application of statistical controls, allows the representation of…

  5. Effective solutions for monitoring the electrostatic separation of metal and plastic granular waste from electric and electronic equipment.

    PubMed

    Senouci, Khouira; Medles, Karim; Dascalescu, Lucian

    2013-02-01

    The variability of the quantity and purity of the recovered materials is a serious drawback for the application of electrostatic separation technologies to the recycling of granular wastes. In a series of previous articles we have pointed out how capability and classic control chart concepts could be employed to better master the outcome of such processes. In the present work, the multivariate exponentially weighted moving average (MEWMA) control chart is introduced and shown to be more effective than Hotelling's T2 chart for monitoring slowly varying changes in the electrostatic separation of granular mixtures originating from electric and electronic equipment waste. The operation of the industrial process was simulated by using a laboratory roll-type electrostatic separator and granular samples resulting from shredded electric cable wastes. The 25 tests carried out during the observation phase enabled the calculation of the upper and lower control limits for the two control charts considered in the present study. The 11 additional tests that simulated the monitoring phase pointed out that the MEWMA chart is more effective than Hotelling's T2 chart in detecting slowly varying changes in the outcome of a process. As the reverse is true in the case of abrupt alterations of monitored process performance, simultaneous usage of the two control charts is strongly recommended. While this study focused on a specific electrostatic separation process, using the MEWMA chart together with the well-known Hotelling's T2 chart should be applicable to the statistical control of other complex processes in the field of waste processing.
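
    As an illustration of the MEWMA statistic referred to above: each new multivariate observation is folded into an exponentially weighted vector, and a T2-like distance of that vector from the target is charted. The sketch below is a generic MEWMA computation on simulated bivariate separation outcomes; the variables, parameters and the drift injected at batch 15 are hypothetical:

    ```python
    import numpy as np

    def mewma(X, mu, Sigma, lam=0.1):
        """MEWMA statistics: fold each observation into an exponentially
        weighted vector z_t and chart its T2 distance from the target.
        A signal fires when T2 exceeds a limit h chosen for a target ARL."""
        z = np.zeros(X.shape[1])
        t2 = np.empty(len(X))
        for t, x in enumerate(X, start=1):
            z = lam * (x - mu) + (1 - lam) * z
            cov_z = (lam * (1 - (1 - lam) ** (2 * t)) / (2 - lam)) * Sigma
            t2[t - 1] = z @ np.linalg.solve(cov_z, z)
        return t2

    # Hypothetical bivariate outcome per batch: recovered-metal purity (%)
    # and recovered mass (kg), with a small sustained drift from batch 15 on.
    rng = np.random.default_rng(3)
    mu = np.array([95.0, 10.0])
    Sigma = np.array([[1.0, 0.3], [0.3, 0.5]])
    X = rng.multivariate_normal(mu, Sigma, size=25)
    X[15:] += np.array([0.4, -0.2])
    print(mewma(X, mu, Sigma).round(2))
    ```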

  6. Auditory processing and phonological awareness skills of five-year-old children with and without musical experience.

    PubMed

    Escalda, Júlia; Lemos, Stela Maris Aguiar; França, Cecília Cavalieri

    2011-09-01

    To investigate the relations between musical experience, auditory processing and phonological awareness in groups of 5-year-old children with and without musical experience. Participants were 56 5-year-old subjects of both genders, 26 in the Study Group, consisting of children with musical experience, and 30 in the Control Group, consisting of children without musical experience. All participants were assessed with the Simplified Auditory Processing Assessment and the Phonological Awareness Test, and the data were statistically analyzed. There were statistically significant differences between groups in the results of the sequential memory test for verbal and non-verbal sounds with four stimuli, and in the phonological awareness tasks of rhyme recognition, phonemic synthesis and phonemic deletion. Analysis by multiple binary logistic regression showed that, with the exception of sequential verbal memory with four syllables, the observed differences in subjects' performance were associated with their musical experience. Musical experience improves auditory and metalinguistic abilities of 5-year-old children.

  7. Polonium-210 in the environment around a radioactive waste disposal area and phosphate ore processing plant.

    PubMed

    Arthur, W J; Markham, O D

    1984-04-01

    Polonium-210 concentrations were determined for soil, vegetation and small mammal tissues collected at a solid radioactive waste disposal area, near a phosphate ore processing plant, and at two rural areas in southeastern Idaho. Polonium concentrations in media sampled near the radioactive waste disposal facility were equal to or less than values from rural area samples, indicating that disposal of solid radioactive waste at the Idaho National Engineering Laboratory Site has not resulted in increased environmental levels of polonium. Concentrations of 210Po in soil, deer mouse hide, and carcass samples collected near the phosphate processing plant were statistically (P less than or equal to 0.05) greater than at the other sampling locations; however, the mean 210Po concentrations in soils and small mammal tissues from sampling areas near the phosphate plant were only four and three times greater, respectively, than control values. No statistical (P greater than 0.05) difference was observed for 210Po concentrations in vegetation among any of the sampling locations.

  8. An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.

    PubMed

    Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun

    2016-12-01

    The key performance indicator (KPI) has important practical value with respect to product quality and economic benefits in modern industry. To cope with KPI prognosis under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach takes advantage of the algorithmic overlap of locally weighted projection regression (LWPR) and partial least squares (PLS), implementing PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results, including KPI prediction and process monitoring, are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purposes, process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to long-term voltage prediction for potential reference in further fuel cell applications.

  9. Fracture resistance of Kevlar-reinforced poly(methyl methacrylate) resin: a preliminary study.

    PubMed

    Berrong, J M; Weed, R M; Young, J M

    1990-01-01

    The reinforcing effect of Kevlar fibers incorporated in processed poly(methyl methacrylate) resin samples was studied using 0% (controls), 0.5%, 1%, and 2% by weight of the added fibers. The samples were subjected to impact testing to determine fracture resistance, and sample groups were statistically compared using an ANOVA. Each reinforced sample had significantly greater fracture resistance (P less than 0.05) than the control, and no difference was found either within or between control groups. The use of reinforcing Kevlar fibers appears to enhance the fracture resistance of acrylic resin denture base materials.

  10. Effect of at-home bleaching with different thickeners and aging on physical properties of a nanocomposite.

    PubMed

    Gouveia, Thayla Hellen Nunes; Públio, Juliana do Carmo; Ambrosano, Glaucia Maria Bovi; Paulillo, Luís Alexandre Maffei Sartini; Aguiar, Flávio Henrique Baggio; Lima, Débora Alves Nunes Leite

    2016-01-01

    To evaluate the influence of 16% carbamide peroxide (CP) containing different thickeners on the physical characteristics of a nanocomposite resin subjected or not to accelerated artificial aging (AAA). One hundred samples were randomly distributed into two groups (n = 50) according to AAA. Each group was divided into 5 subgroups (n = 10) depending on the bleaching/thickener treatment: CP + carbopol, CP + natrosol, carbopol, natrosol, and no treatment (control). The physical properties tested were color (ΔE), gloss (GU), mean roughness (Ra), and Knoop microhardness (KHN). The resin surface was examined with atomic force microscopy (AFM). The color data (variable ΔE) were assessed with two-way analysis of variance (ANOVA) and additionally with Tukey's and Dunnett's tests; the roughness values were submitted to Kruskal-Wallis, Dunn's, and Mann-Whitney tests. Data on gloss and KHN were submitted to two-way ANOVA and Tukey's test (α = 0.05). Among the physical properties evaluated, CP + carbopol promoted a reduction in composite microhardness only, thus differing statistically from the controls. For CP + natrosol, no such change was observed. The aging process reduced all the physical properties, differing statistically from the non-aged group. CP + carbopol increased the roughness and decreased the gloss of aged resins, whereas natrosol reduced gloss only, differing statistically from the controls. AFM showed evidence of loss of organic matrix and exposure of filler particles in the aged samples. Therefore, replacing carbopol with natrosol maintained the composite microhardness following bleaching. The aging process reduced the physical properties evaluated, and some changes were enhanced by the application of bleaching.

  11. Temporal evolution of soil moisture statistical fractal and controls by soil texture and regional groundwater flow

    NASA Astrophysics Data System (ADS)

    Ji, Xinye; Shen, Chaopeng; Riley, William J.

    2015-12-01

    The soil moisture statistical fractal is an important tool for downscaling remotely sensed observations and has the potential to play a key role in multi-scale hydrologic modeling. The fractal was first introduced two decades ago, but relatively little is known about how its scaling exponents evolve in time in response to climatic forcings. Previous studies have neglected the process of moisture redistribution due to regional groundwater flow. In this study we used a physically based surface-subsurface processes model and numerical experiments to elucidate the patterns and controls of fractal temporal evolution in two U.S. Midwest basins. Groundwater flow was found to introduce large-scale spatial structure, thereby reducing the scaling exponents (τ), which has implications for the transferability of calibrated parameters used to predict τ. However, the groundwater effects depend on complex interactions with other physical controls such as soil texture and land use. The fractal scaling exponents, while in general showing a seasonal mode that correlates with mean moisture content, display hysteresis after storm events that can be divided into three phases, consistent with literature findings: (a) wetting, (b) re-organizing, and (c) dry-down. Modeling experiments clearly show that the hysteresis is attributable to soil texture, whose "patchiness" is the primary contributing factor. We generalized phenomenological rules for the impacts of rainfall, soil texture, groundwater flow, and land use on τ evolution. Grid resolution has a mild influence on the results, and there is a strong correlation between predictions of τ from different resolutions. Overall, our results suggest that groundwater flow should be given more consideration in studies of the soil moisture statistical fractal, especially in regions with a shallow water table.
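
    One common way to quantify such a statistical fractal is the power-law decay of the variance of block-averaged moisture with aggregation scale; the scaling exponent is then the slope of a log-log fit. The sketch below implements this generic estimator on a synthetic moisture grid; it is an illustration of the concept, not the authors' model code:

    ```python
    import numpy as np

    def scaling_exponent(field, scales):
        """Estimate tau from the power-law decay of the variance of
        block-averaged values: Var(scale) ~ scale**(-tau)."""
        variances = []
        for s in scales:
            n0 = (field.shape[0] // s) * s
            n1 = (field.shape[1] // s) * s
            blocks = field[:n0, :n1].reshape(n0 // s, s, n1 // s, s).mean(axis=(1, 3))
            variances.append(blocks.var())
        slope, _ = np.polyfit(np.log(scales), np.log(variances), 1)
        return -slope

    # Synthetic 256 x 256 "moisture" grid; spatially white noise gives tau ~ 2,
    # while long-range spatial structure pushes tau lower.
    rng = np.random.default_rng(4)
    field = rng.normal(0.25, 0.05, size=(256, 256))
    print(scaling_exponent(field, scales=[1, 2, 4, 8, 16, 32]))
    ```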

  12. Characterization of Noise Signatures of Involuntary Head Motion in the Autism Brain Imaging Data Exchange Repository

    PubMed Central

    Caballero, Carla; Mistry, Sejal; Vero, Joe; Torres, Elizabeth B

    2018-01-01

    The variability inherently present in biophysical data is partly contributed by disparate sampling resolutions across instrumentations. This poses a potential problem for statistical inference using pooled data in open access repositories. Such repositories combine data collected from multiple research sites using variable sampling resolutions. One example is the Autism Brain Imaging Data Exchange repository containing thousands of imaging and demographic records from participants in the spectrum of autism and age-matched neurotypical controls. Further, statistical analyses of groups from different diagnoses and demographics may be challenging, owing to the disparate number of participants across different clinical subgroups. In this paper, we examine the noise signatures of head motion data extracted from resting state fMRI data harnessed under different sampling resolutions. We characterize the quality of the noise in the variability of the raw linear and angular speeds for different clinical phenotypes in relation to age-matched controls. Further, we use bootstrapping methods to ensure compatible group sizes for statistical comparison and report the ranges of physical involuntary head excursions of these groups. We conclude that different sampling rates do affect the quality of noise in the variability of head motion data and, consequently, the type of random process appropriate to characterize the time series data. Further, given a qualitative range of noise, from pink to brown noise, it is possible to characterize different clinical subtypes and distinguish them in relation to ranges of neurotypical controls. These results may be of relevance to the pre-processing stages of the pipeline of analyses of resting state fMRI data, whereby head motion enters the criteria to clean imaging data from motion artifacts. PMID:29556179
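
    The bootstrapping step mentioned above can be sketched generically: resample each group with replacement many times and form a confidence interval for the statistic of interest, so that unequal subgroup sizes do not preclude comparison. The group labels, speeds and statistic below are hypothetical illustrations, not the repository's data:

    ```python
    import numpy as np

    def bootstrap_median_diff(a, b, n_boot=5000, seed=0):
        """95% bootstrap interval for the difference in group medians,
        resampling each group to its own size so unequal group sizes
        do not bias the comparison."""
        rng = np.random.default_rng(seed)
        diffs = np.empty(n_boot)
        for i in range(n_boot):
            diffs[i] = (np.median(rng.choice(a, size=len(a))) -
                        np.median(rng.choice(b, size=len(b))))
        return np.percentile(diffs, [2.5, 97.5])

    # Hypothetical median head-motion linear speeds (mm/s) per participant.
    group = np.array([0.21, 0.35, 0.18, 0.40, 0.27, 0.33])
    controls = np.array([0.15, 0.12, 0.19, 0.14, 0.17, 0.13, 0.16, 0.18, 0.11, 0.20])
    print(bootstrap_median_diff(group, controls))
    ```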

  13. Assessment of Fever Advisory Cards (FACs) as an Initiative to Improve Febrile Neutropenia Management in a Regional Cancer Center Emergency Department.

    PubMed

    Kapil, Priyanka; MacMillan, Meghan; Carvalho, Maritza; Lymburner, Patricia; Fung, Ron; Almeida, Bernadette; Van Dorn, Laurie; Enright, Katherine

    2016-09-01

    We aimed to improve the time to antibiotics (TTA) for patients treated with chemotherapy who present to the emergency department (ED) with febrile neutropenia (FN) by using standardized fever advisory cards (FACs). Patients treated with chemotherapy who visited the ED at the Peel Regional Cancer Center in Ontario, Canada, with suspected FN were identified before (April 2012 to March 2013) and after (October 2013 to March 2014) FAC implementation. The primary outcome of interest was TTA. Additional process measures included the Canadian Triage and Acuity Scale score, time to physician assessment, and FAC compliance. Outcomes were analyzed with descriptive statistics and control charts to determine whether changes in the primary measures were within statistical control over time. Between the pre-FAC cohort (n = 239) and post-FAC cohort (n = 69), TTA did not change significantly post-FACs (195 v 244 min, P = .09), with monthly averages demonstrating normal variation by statistical process control methodology. The introduction of FACs increased the percentage of patients with correctly assigned Canadian Triage and Acuity Scale scores (87% v 100%) but did not affect time to physician assessment. Compliance with FACs among patients was not ideal, with only 62.5% using them as intended. The distribution of FACs was associated with improved rates of correct FN triaging but did not demonstrate a meaningful improvement in the quality of FN management, which may be explained by the suboptimal use of FACs among patients. Next steps in the continued effort toward high-quality FN care include redesign of FACs, reinforcement of provider and patient education, and ED outreach. Copyright © 2016 by American Society of Clinical Oncology.

  14. [Data collection in anesthesia. Experiences with the inauguration of a new information system].

    PubMed

    Zbinden, A M; Rothenbühler, H; Häberli, B

    1997-06-01

    In many institutions, information systems are used to process off-line anaesthesia data for invoices, statistical purposes, and quality assurance. Information systems are also increasingly being used to improve process control in order to reduce costs. Most of today's systems were created when information technology and working processes in anaesthesia were very different from those in use today. Thus, many institutions must now replace their computer systems but are probably not aware of how complex this change will be. Modern information systems mostly use client-server architecture and relational databases. Substituting an old system with a new one is frequently a greater task than designing a system from scratch. This article gives the conclusions drawn from the experience obtained when a large departmental computer system was redesigned in a university hospital. The new system was based on a client-server architecture and was developed by an external company without a preceding conceptual analysis. Modules for patient, anaesthesia, surgical, and pain-service data were included. Data were analysed using a separate statistical package (RS/1 from Bolt Beranek), taking advantage of its powerful precompiled procedures. Development and introduction of the new system took much more time and effort than expected despite the use of modern software tools. Introduction of the new program required intensive user training despite the choice of modern graphic screen layouts. Automatic data-reading systems could not be used, as too many faults occurred and the effort for the user was too high. However, after the initial problems were solved, the system turned out to be a powerful tool for quality control (both process and outcome quality), billing, and scheduling. The statistical analysis of the data resulted in meaningful and relevant conclusions. Before creating a new information system, the working processes have to be analysed and, if possible, made more efficient; a detailed programme specification must then be made. A servicing and maintenance contract should be drawn up before the order is given to a company. Time periods of equal duration have to be scheduled for defining, writing, testing and introducing the program. Modern client-server systems with relational databases are by no means simpler to establish and maintain than previous mainframe systems with hierarchical databases, and thus experienced computer specialists need to be close at hand. We recommend collecting data only once for both statistics and quality control. To verify data quality, a system of random spot-sampling has to be established. Despite the large investment needed to build up such a system, we consider it a powerful tool for helping to solve the difficult daily problems of managing a surgical and anaesthesia unit.

  15. Statistical learning of music- and language-like sequences and tolerance for spectral shifts.

    PubMed

    Daikoku, Tatsuya; Yatomi, Yutaka; Yumoto, Masato

    2015-02-01

    In our previous study (Daikoku, Yatomi, & Yumoto, 2014), we demonstrated that the N1m response could be a marker for the statistical learning process of pitch sequences in which each tone was ordered by a Markov stochastic model. The aim of the present study was to investigate how the statistical learning of music- and language-like auditory sequences is reflected in the N1m responses, based on the assumption that language and music share domain generality. Using vowel sounds generated by a formant synthesizer, we devised music- and language-like auditory sequences in which higher-order transitional rules were embedded according to a Markov stochastic model by controlling fundamental (F0) and/or formant frequencies (F1-F2). In each sequence, F0 and/or F1-F2 were spectrally shifted in the last one-third of the tone sequence. Neuromagnetic responses to the tone sequences were recorded from 14 right-handed normal volunteers. In the music- and language-like sequences with pitch change, the N1m responses to tones that appeared with higher transitional probability were significantly decreased compared with the responses to tones that appeared with lower transitional probability within the first two-thirds of each sequence. Moreover, the amplitude difference was retained within the last one-third of the sequence, after the spectral shifts. However, in the language-like sequence without pitch change, no significant difference could be detected. Pitch change may facilitate statistical learning in language and music. Statistically acquired knowledge may be applied to the processing of altered auditory sequences with spectral shifts. The relative processing of spectral sequences may be a domain-general auditory mechanism that is innate to humans. Copyright © 2014 Elsevier Inc. All rights reserved.
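
    Markov-ordered tone sequences of the kind described above can be generated from a first-order transition matrix in which one successor per tone is much more probable than the rest; the matrix and tone labels below are hypothetical, not the study's stimuli:

        import numpy as np

        rng = np.random.default_rng(1)
        tones = ["A", "B", "C", "D", "E"]

        # hypothetical transition matrix: tone i is followed by tone (i+1) mod 5
        # with probability 0.6, and by each of the other tones with probability 0.1
        P = np.full((5, 5), 0.1)
        for i in range(5):
            P[i, (i + 1) % 5] = 0.6

        state, seq = 0, []
        for _ in range(500):
            state = rng.choice(5, p=P[state])
            seq.append(tones[state])
        print("".join(seq[:40]))  # a stream mixing high- and low-probability transitions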

  16. Statistical monitoring of the hand, foot and mouth disease in China.

    PubMed

    Zhang, Jingnan; Kang, Yicheng; Yang, Yang; Qiu, Peihua

    2015-09-01

    In a period starting around 2007, Hand, Foot, and Mouth Disease (HFMD) became widespread in China, seriously threatening public health. To prevent outbreaks of infectious diseases like HFMD, effective disease surveillance systems are especially helpful for giving signals of disease outbreaks as early as possible. Statistical process control (SPC) charts provide a major statistical tool in industrial quality control for detecting product defectives in a timely manner. In recent years, SPC charts have been used for disease surveillance. However, disease surveillance data often have much more complicated structures than the data collected from industrial production lines. Major challenges, including the lack of in-control data, complex seasonal effects, and spatio-temporal correlations, make the surveillance data difficult to handle. In this article, we propose a three-step procedure for analyzing disease surveillance data, demonstrated using the HFMD data collected during 2008-2009 in China. Our method uses nonparametric longitudinal data and time series analysis methods to eliminate the possible impact of seasonality and temporal correlation before the disease incidence data are sequentially monitored by an SPC chart. At both national and provincial levels, the proposed method can effectively detect an increasing trend in disease incidence rate before the disease becomes widespread. © 2015, The International Biometric Society.
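
    The three-step idea (remove the seasonal component, standardize against a baseline period, then monitor sequentially with an SPC chart) can be sketched generically; the counts are simulated, and a plain one-sided CUSUM stands in for the authors' nonparametric procedure:

        import numpy as np

        rng = np.random.default_rng(2)
        weeks = np.arange(156)  # three years of weekly counts
        season = 50 + 30 * np.sin(2 * np.pi * weeks / 52)
        counts = rng.poisson(season + np.where(weeks > 130, 25, 0))  # simulated outbreak after week 130

        # step 1: estimate the seasonal mean from the first two (outbreak-free) years
        seasonal_mean = np.array([counts[:104][weeks[:104] % 52 == w].mean() for w in range(52)])
        resid = counts - seasonal_mean[weeks % 52]

        # step 2: standardize residuals against the baseline period
        base = resid[:104]
        z = (resid - base.mean()) / base.std(ddof=1)

        # step 3: one-sided CUSUM chart on the standardized residuals
        k, h, s = 0.5, 5.0, 0.0
        for t, zt in enumerate(z):
            s = max(0.0, s + zt - k)
            if s > h:
                print(f"outbreak signal at week {t}")
                break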

  17. Experiments on Nucleation in Different Flow Regimes

    NASA Technical Reports Server (NTRS)

    Bayuzick, R. J.; Hofmeister, W. H.; Morton, C. M.; Robinson, M. B.

    1999-01-01

    The vast majority of metallic engineering materials are solidified from the liquid phase. Understanding the solidification process is essential to control microstructure, which in turn, determines the properties of materials. The genesis of solidification is nucleation, where the first stable solid forms from the liquid phase. Nucleation kinetics determine the degree of undercooling and phase selection. As such, it is important to understand nucleation phenomena in order to control solidification or glass formation in metals and alloys. Early experiments in nucleation kinetics were accomplished by droplet dispersion methods. Dilatometry was used by Turnbull and others, and more recently differential thermal analysis and differential scanning calorimetry have been used for kinetic studies. These techniques have enjoyed success; however, there are difficulties with these experiments. Since materials are dispersed in a medium, the character of the emulsion/metal interface affects the nucleation behavior. Statistics are derived from the large number of particles observed in a single experiment, but dispersions have a finite size distribution which adds to the uncertainty of the kinetic determinations. Even though temperature can be controlled quite well before the onset of nucleation, the release of the latent heat of fusion during nucleation of particles complicates the assumption of isothermality during these experiments. Containerless processing has enabled another approach to the study of nucleation kinetics. With levitation techniques it is possible to undercool one sample to nucleation repeatedly in a controlled manner, such that the statistics of the nucleation process can be derived from multiple experiments on a single sample. The authors have fully developed the analysis of nucleation experiments on single samples following the suggestions of Skripov. The advantage of these experiments is that the samples are directly observable. The nucleation temperature can be measured by noncontact optical pyrometry, the mass of the sample is known, and post processing analysis can be conducted on the sample. The disadvantages are that temperature measurement must have exceptionally high precision, and it is not possible to isolate specific heterogeneous sites as in droplet dispersions.
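
    The statistics-from-repeated-trials idea can be illustrated with invented numbers: from many nucleation temperatures measured on one levitated sample, the fraction of trials still liquid at each temperature traces out the cumulative nucleation statistics. This is a toy sketch, not the Skripov analysis itself:

        import numpy as np

        rng = np.random.default_rng(3)
        # invented nucleation temperatures (K) from 100 repeated undercooling cycles of one sample
        t_nuc = rng.normal(1450, 8, size=100)

        t_sorted = np.sort(t_nuc)[::-1]  # on cooling, the highest temperatures are reached first
        frac_nucleated = np.arange(1, len(t_sorted) + 1) / len(t_sorted)
        survival = 1 - frac_nucleated    # fraction of trials still liquid below each temperature

        for T, s in list(zip(t_sorted, survival))[::20]:
            print(f"T = {T:7.1f} K, fraction still liquid = {s:.2f}")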

  18. Modeling in the quality by design environment: Regulatory requirements and recommendations for design space and control strategy appointment.

    PubMed

    Djuris, Jelena; Djuric, Zorica

    2017-11-30

    Mathematical models can be used as an integral part of the quality by design (QbD) concept throughout the product lifecycle for a variety of purposes, including appointment of the design space and control strategy, continual improvement and risk assessment. Examples of different mathematical modeling techniques (mechanistic, empirical and hybrid) in pharmaceutical development and process monitoring or control are provided in the presented review. In the QbD context, mathematical models are predominantly used to support design space and/or control strategies. Considering their impact on final product quality, models can be divided into the following categories: high-, medium- and low-impact models. Although there are regulatory guidelines on the topic of modeling applications, review of QbD-based submissions containing modeling elements revealed concerns regarding the scale-dependency of design spaces and the verification of model predictions at the commercial scale of manufacturing, especially regarding real-time release (RTR) models. The authors provide a critical overview of good modeling practices and introduce the concepts of multiple-unit, adaptive and dynamic design space, multivariate specifications and methods for process uncertainty analysis. RTR specification with a mathematical model and different approaches to multivariate statistical process control supporting process analytical technologies are also presented. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-12-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring neither computations nor prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
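
    A minimal sketch of the described loop follows: raw measurements are hashed into discrete states, and a policy is learned online, with tabular Q-learning standing in for the paper's learner and a placeholder plant standing in for the experiment. All dynamics, rewards, and parameters are invented:

        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(4)
        n_actions = 3
        q = defaultdict(lambda: np.zeros(n_actions))  # Q-table over hashed sensor states
        alpha, gamma, eps = 0.1, 0.95, 0.1

        def embed(measurement, n_bins=8):
            """Hash a raw sensor measurement into a discrete state via coarse binning."""
            return hash(tuple(np.digitize(measurement, np.linspace(-3, 3, n_bins))))

        def plant(meas, action):
            """Placeholder system: drifts toward the origin when the right action is taken."""
            nxt = 0.9 * meas + 0.1 * (action - 1) + 0.05 * rng.standard_normal(2)
            return nxt, -np.linalg.norm(nxt)  # reward: stay near the origin

        meas = rng.standard_normal(2)
        for _ in range(5000):
            s = embed(meas)
            a = int(rng.integers(n_actions)) if rng.random() < eps else int(np.argmax(q[s]))
            meas, r = plant(meas, a)
            q[s][a] += alpha * (r + gamma * q[embed(meas)].max() - q[s][a])  # Q-learning update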

  20. Contextual analysis of fluid intelligence.

    PubMed

    Salthouse, Timothy A; Pink, Jeffrey E; Tucker-Drob, Elliot M

    2008-01-01

    The nature of fluid intelligence was investigated by identifying variables that were, and were not, significantly related to this construct. Relevant information was obtained from three sources: re-analyses of data from previous studies, a study in which 791 adults performed storage-plus-processing working memory tasks, and a study in which 236 adults performed a variety of working memory, updating, and cognitive control tasks. The results suggest that fluid intelligence represents a broad individual difference dimension contributing to diverse types of controlled or effortful processing. The analyses also revealed that very few of the age-related effects on the target variables were statistically independent of effects on established cognitive abilities, which suggests most of the age-related influences on a wide variety of cognitive control variables overlap with age-related influences on cognitive abilities such as fluid intelligence, episodic memory, and perceptual speed.

  1. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built by design, formed in the manufacturing process and improved during the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD are interpreted. Considering the complex nature of Chinese medicine, a "4H" model is proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space, from patient requirements, and the quality solution space, from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the comprehensive application of inspective, statistical, predictive and intelligent quality control strategies. Holistic process optimization improves product quality and process capability during product lifecycle management. The implementation of QbD helps to eliminate the contradictions in the pharmaceutical development and manufacturing process of Chinese medicine products and to guarantee cost-effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  2. Ratings of perceived exertion by women with internal or external locus of control.

    PubMed

    Hassmén, P; Koivula, N

    1996-10-01

    Ratings of perceived exertion are frequently used to estimate the strain and effort experienced subjectively by individuals during various forms of physical activity. A number of factors, both physiological and psychological in origin, have been suggested to act as modifiers of the exertion perceived by the individual. It has been reported in nonsport-related research that individuals with an internal locus of control seem to pay more attention to relevant information and to use the available information more adequately than individuals with an external locus of control. The reputedly inferior information-processing abilities of externals compared with internals could possibly also influence ratings of perceived exertion, with externals being less accurate in their ratings. Whether locus of control might be such a factor was investigated here. Fifty women exercised on a cycle ergometer at four different workloads. The results showed statistically significant differences in subjective ratings of perceived exertion between externals and internals, especially at heavier workloads. Such differences might be due to unequal information-processing abilities, as the observed discrepancies occurred at higher work intensities, when more cues are available for processing.

  3. Risk management for moisture related effects in dry manufacturing processes: a statistical approach.

    PubMed

    Quiroz, Jorge; Strong, John; Zhang, Lanju

    2016-03-01

    A risk- and science-based approach to controlling quality in pharmaceutical manufacturing includes a full understanding of how product attributes and process parameters relate to product performance through a proactive approach in formulation and process development. For dry manufacturing, where moisture content is not directly manipulated within the process, the variability in moisture of the incoming raw materials can impact both processability and drug product quality attributes. A statistical approach is developed using individual raw material historical lots as a basis for the calculation of tolerance intervals for drug product moisture content, so that risks associated with excursions in moisture content can be mitigated. The proposed method is based on a model-independent approach that uses available data to estimate parameters of interest describing the population of blend moisture content values, and it does not require knowledge of the individual blend moisture content values. Another advantage of the proposed tolerance intervals is that they do not require the use of tabulated values for tolerance factors. This facilitates implementation in any spreadsheet program such as Microsoft Excel. A computational example is used to demonstrate the proposed method.
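
    As a generic illustration of a tolerance interval that needs no tabulated tolerance factors (not the authors' formula), a distribution-free interval can be taken from the sample extremes, with its confidence computed directly from the binomial relation:

        import numpy as np

        def nonparametric_ti(x, p=0.95):
            """Distribution-free tolerance interval from the sample extremes.
            The confidence that [min, max] covers a proportion p of the
            population is 1 - n*p**(n-1) + (n-1)*p**n (Wilks)."""
            x = np.sort(np.asarray(x, dtype=float))
            n = len(x)
            conf = 1 - n * p ** (n - 1) + (n - 1) * p ** n
            return (x[0], x[-1]), conf

        rng = np.random.default_rng(5)
        moisture = rng.normal(2.1, 0.15, size=60)  # invented blend moisture contents (% w/w)
        (lo, hi), conf = nonparametric_ti(moisture)
        print(f"TI = [{lo:.2f}, {hi:.2f}] % w/w with confidence {conf:.2f}")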

  4. The MetabolomeExpress Project: enabling web-based processing, analysis and transparent dissemination of GC/MS metabolomics datasets.

    PubMed

    Carroll, Adam J; Badger, Murray R; Harvey Millar, A

    2010-07-14

    Standardization of analytical approaches and reporting methods via community-wide collaboration can work synergistically with web-tool development to result in rapid community-driven expansion of online data repositories suitable for data mining and meta-analysis. In metabolomics, the inter-laboratory reproducibility of gas chromatography/mass spectrometry (GC/MS) makes it an obvious target for such development. While a number of web-tools offer access to datasets and/or tools for raw data processing and statistical analysis, none of these systems are currently set up to act as a public repository by easily accepting, processing and presenting publicly submitted GC/MS metabolomics datasets for public re-analysis. Here, we present MetabolomeExpress, a new File Transfer Protocol (FTP) server and web-tool for the online storage, processing, visualisation and statistical re-analysis of publicly submitted GC/MS metabolomics datasets. Users may search a quality-controlled database of metabolite response statistics from publicly submitted datasets by a number of parameters (e.g., metabolite, species, organ/biofluid). Users may also perform meta-analysis comparisons of multiple independent experiments or re-analyse public primary datasets via user-friendly tools for t-test, principal components analysis, hierarchical cluster analysis and correlation analysis. They may interact with chromatograms, mass spectra and peak detection results via an integrated raw data viewer. Researchers who register for a free account may upload (via FTP) their own data to the server for online processing via a novel raw data processing pipeline. MetabolomeExpress https://www.metabolome-express.org provides a new opportunity for the general metabolomics community to transparently present online the raw and processed GC/MS data underlying their metabolomics publications. Transparent sharing of these data will allow researchers to assess data quality and draw their own insights from published metabolomics datasets.

  5. Permeation of iodide from iodine-enriched yeast through porcine intestine.

    PubMed

    Ryszka, Florian; Dolińska, Barbara; Zieliński, Michał; Chyra, Dagmara; Dobrzański, Zbigniew

    2013-01-01

    Iodine deficiency is a common phenomenon, threatening the whole global human population. The recommended daily intake of iodine is 150 μg for adults and 250 μg for pregnant and breastfeeding women. About 50% of the human population can be at risk of moderate iodine deficiency. Due to this fact, increased iodine supplementation is recommended, through the intake of iodized mineral water and salt iodization. The aim of this study was to investigate the permeation and absorption of iodide from an iodine bioplex (experimental group) in comparison with potassium iodide (controls). Permeation and absorption processes were investigated in vitro using a porcine intestine. The experimental model was based on a standard Franz diffusion cell (FD-Cell). The iodine bioplex was produced using Saccharomyces cerevisiae yeast and whey powder: iodine content - 388 μg/g, total protein - 28.5%, total fat - 0.9%, glutamic acid - 41.2%, aspartic acid - 29.4%, lysine - 24.8%; purchased from F.Z.N.P. Biochefa, Sosnowiec, Poland. Potassium iodide was used as the control, at a 388 μg iodine concentration, the same as in the iodine-enriched yeast bioplex. A statistically significant increase in iodide permeation was observed for the iodine-enriched yeast bioplex in comparison with the potassium iodide controls. After 5 h, the total amount of permeated iodide from the iodine-enriched yeast bioplex was 85%, roughly 2-fold higher than for the controls (37%). Iodide absorption was, by contrast, statistically significantly higher in the controls (7.3%) than in the experimental group with the iodine-enriched yeast bioplex (4.5%). The presented results show that the iodide permeation process dominates over absorption in the case of the iodine-enriched yeast bioplex.

  6. Specific cerebral activation due to visual erotic stimuli in male-to-female transsexuals compared with male and female controls: an fMRI study.

    PubMed

    Gizewski, Elke R; Krause, Eva; Schlamann, Marc; Happich, Friederike; Ladd, Mark E; Forsting, Michael; Senf, Wolfgang

    2009-02-01

    Transsexuals harbor the strong feeling of having been born to the wrong sex. There is a continuing controversial discussion of whether or not transsexualism has a biological representation. Differences between males and females in terms of functional imaging during erotic stimuli have been previously described, revealing gender-specific results. Therefore, we postulated that male-to-female (MTF) transsexuals may show specific cerebral activation differing from their biological gender. Cerebral activation patterns during viewing of erotic film excerpts in functional magnetic resonance imaging (fMRI). Twelve male and 12 female heterosexual volunteers and 12 MTF transsexuals before any treatment viewed erotic film excerpts during fMRI. Additionally, subjective rating of sexual arousal was assessed. Statistics were performed using the Statistical Parametric Mapping software. Significantly enhanced activation for men compared with women was revealed in brain areas involved in erotic processing, i.e., the thalamus, the amygdala, and the orbitofrontal and insular cortex, whereas no specific activation for women was found. When comparing MTF transsexuals with male volunteers, activation patterns similar to female volunteers being compared with male volunteers were revealed. Sexual arousal was assessed using standard rating scales and did not differ significantly for the three groups. We revealed a cerebral activation pattern in MTF transsexuals compared with male controls similar to female controls compared with male controls during viewing of erotic stimuli, indicating a tendency of female-like cerebral processing in transsexualism.

  7. Methods for processing microarray data.

    PubMed

    Ares, Manuel

    2014-02-01

    Quality control must be maintained at every step of a microarray experiment, from RNA isolation through statistical evaluation. Here we provide suggestions for analyzing microarray data. Because the utility of the results depends directly on the design of the experiment, the first critical step is to ensure that the experiment can be properly analyzed and interpreted. What is the biological question? What is the best way to perform the experiment? How many replicates will be required to obtain the desired statistical resolution? Next, the samples must be prepared, pass quality controls for integrity and representation, and be hybridized and scanned. Also, slides with defects, missing data, high background, or weak signal must be rejected. Data from individual slides must be normalized and combined so that the data are as free of systematic bias as possible. The third phase is to apply statistical filters and tests to the data to determine genes (1) expressed above background, (2) whose expression level changes in different samples, and (3) whose RNA-processing patterns or protein associations change. Next, a subset of the data should be validated by an alternative method, such as reverse transcription-polymerase chain reaction (RT-PCR). Provided that this endorses the general conclusions of the array analysis, gene sets whose expression, splicing, polyadenylation, protein binding, etc. change in different samples can be classified with respect to function, sequence motif properties, as well as other categories to extract hypotheses for their biological roles and regulatory logic.
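
    The normalization-then-filtering phase described above might be compressed into the following Python sketch, with quantile normalization, per-gene t-tests, and a Benjamini-Hochberg cutoff; the data, replicate counts, and 5% FDR level are invented:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        control = rng.lognormal(5, 1, size=(1000, 4))  # 1000 genes x 4 replicate arrays
        treated = rng.lognormal(5, 1, size=(1000, 4))
        treated[:50] *= 3                              # 50 genes with a true 3-fold change

        def quantile_normalize(m):
            """Force all arrays to share the same intensity distribution."""
            ranks = np.argsort(np.argsort(m, axis=0), axis=0)
            means = np.sort(m, axis=0).mean(axis=1)
            return means[ranks]

        log_c = np.log2(quantile_normalize(control))
        log_t = np.log2(quantile_normalize(treated))

        _, p = stats.ttest_ind(log_t, log_c, axis=1)
        # Benjamini-Hochberg FDR at 5%
        order = np.argsort(p)
        passed = p[order] <= 0.05 * np.arange(1, len(p) + 1) / len(p)
        k = passed.nonzero()[0].max() + 1 if passed.any() else 0
        print(f"{k} genes called significant at FDR 0.05")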

  8. Phase-I monitoring of standard deviations in multistage linear profiles

    NASA Astrophysics Data System (ADS)

    Kalaei, Mahdiyeh; Soleimani, Paria; Niaki, Seyed Taghi Akhavan; Atashgar, Karim

    2018-03-01

    In most modern manufacturing systems, products are often the output of some multistage processes. In these processes, the stages are dependent on each other, where the output quality of each stage depends also on the output quality of the previous stages. This property is called the cascade property. Although there are many studies in multistage process monitoring, there are fewer works on profile monitoring in multistage processes, especially on the variability monitoring of a multistage profile in Phase I, for which no research is found in the literature. In this paper, a new methodology is proposed to monitor the standard deviation involved in a simple linear profile designed in Phase I to monitor multistage processes with the cascade property. To this aim, an autoregressive correlation model between the stages is considered first. Then, the effect of the cascade property on the performances of three types of T² control charts in Phase I with shifts in standard deviation is investigated. As we show that this effect is significant, a U statistic is next used to remove the cascade effect, based on which the investigated control charts are modified. Simulation studies reveal good performances of the modified control charts.
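
    One ingredient of such Phase-I studies, a Hotelling T² chart on the estimated intercept and slope of each linear profile, can be sketched as below; this toy version ignores the cascade property and multistage structure that the paper actually addresses, and uses an empirical control limit instead of the exact beta-distribution limit:

        import numpy as np

        rng = np.random.default_rng(7)
        x = np.linspace(0, 1, 10)  # fixed design points of the linear profile
        profiles = 3 + 2 * x + rng.normal(0, 0.2, size=(40, x.size))  # 40 simulated in-control profiles

        # least-squares (intercept, slope) for every profile
        X = np.column_stack([np.ones_like(x), x])
        beta = np.linalg.lstsq(X, profiles.T, rcond=None)[0].T  # shape (40, 2)

        # Phase-I Hotelling T^2 on the coefficient vectors
        mean = beta.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(beta, rowvar=False))
        t2 = np.einsum('ij,jk,ik->i', beta - mean, cov_inv, beta - mean)

        ucl = np.percentile(t2, 99)  # empirical limit for the sketch
        print("out-of-control profiles:", np.nonzero(t2 > ucl)[0])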

  9. [Descending control of quiet standing and walking: a plausible neurophysiological basis of falls in elderly people].

    PubMed

    Nakajima, Masashi

    2011-03-01

    Quiet standing and walking are generally considered to be an automatic process regulated by sensory feedback. In our report "Astasia without abasia due to peripheral neuropathy," which was published in 1994, we proposed that forced stepping in patients lacking the ankle torque is a compensatory motor control in order to maintain an upright posture. A statistical-biomechanics approach to the human postural control system has revealed open-loop (descending) control as well as closed-loop (feedback) control in quiet standing, and fractal dynamics in stride-to-stride fluctuations of walking. The descending control system of bipedal upright posture and gait may have a functional link to cognitive domains. Increasing dependence on the descending control system with aging may play a role in falls in elderly people.

  10. Cargo/Logistics Airlift System Study (CLASS), Volume 1

    NASA Technical Reports Server (NTRS)

    Norman, J. M.; Henderson, R. D.; Macey, F. C.; Tuttle, R. P.

    1978-01-01

    Current and advanced air cargo systems are evaluated using industrial and consumer statistics. Market and commodity characteristics that influence the use of the air mode are discussed along with a comparison of air and surface mode on typical routes. Results of on-site surveys of cargo processing facilities at airports are presented, and institutional controls and influences on air cargo operations are considered.

  11. Federal Research Opportunities: DOE, DOD, and HHS Need Better Guidance for Participant Activities

    DTIC Science & Technology

    2016-01-01

    process controls of advanced power systems, gas sensors and high temperatures, improving extraction of earth elements, quantum computing, biofilms ... chronic diseases (e.g., heart, obesity, cancer), environmental health, toxic substances, health statistics, and public health preparedness. Food and ... Health: localization of proteins using molecular markers, gene regulatory effects in cancer, medical informatics, and central nervous system

  12. 37 CFR 102.30 - Disclosure of record to person other than the individual to whom it pertains.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... will be used solely as a statistical research or reporting record, and the record is to be transferred... instrumentality of any governmental jurisdiction within or under the control of the United States for a civil or... records is disclosed to any person under compulsory legal process, promptly upon being informed that such...

  13. 15 CFR 4.30 - Disclosure of record to person other than the individual to whom it pertains.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... that the record will be used solely as a statistical research or reporting record, and the record is to... instrumentality of any governmental jurisdiction within or under the control of the United States for a civil or... when any record in a system of records is disclosed to any person under compulsory legal process...

  14. A primer of statistical methods for correlating parameters and properties of electrospun poly(L-lactide) scaffolds for tissue engineering--PART 1: design of experiments.

    PubMed

    Seyedmahmoud, Rasoul; Rainer, Alberto; Mozetic, Pamela; Maria Giannitelli, Sara; Trombetta, Marcella; Traversa, Enrico; Licoccia, Silvia; Rinaldi, Antonio

    2015-01-01

    Tissue engineering scaffolds produced by electrospinning are of enormous interest, but still lack a true understanding of the fundamental connection between the outstanding functional properties, the architecture, the mechanical properties, and the process parameters. Fragmentary results from several parametric studies render only partial insights that are hard to compare and generally miss the role of parameter interactions. To bridge this gap, this article (Part 1 of 2) features a case study on poly-L-lactide scaffolds to demonstrate how statistical methods such as design of experiments can quantitatively identify the correlations existing between key scaffold properties and control parameters, in a systematic, consistent, and comprehensive manner, disentangling main effects from interactions. The morphological properties (i.e., fiber distribution and porosity) and mechanical properties (Young's modulus) are "charted" as a function of molecular weight (MW) and other electrospinning process parameters (the Xs), considering single effects as well as interactions between the Xs. For the first time, the major role of MW emerges clearly in controlling all scaffold properties. The correlation between mechanical and morphological properties is also addressed. © 2014 Wiley Periodicals, Inc.
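
    The core design-of-experiments computation, estimating main effects and two-factor interactions from a coded two-level factorial, can be sketched as follows; the factors, response model, and effect sizes are hypothetical stand-ins for the electrospinning study:

        import itertools
        import numpy as np

        rng = np.random.default_rng(8)

        # hypothetical coded factors: molecular weight (MW), voltage, flow rate
        design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

        def run_experiment(mw, volt, flow):
            """Stand-in response: fiber diameter with a main MW effect and a MW x voltage interaction."""
            return 500 + 120 * mw + 30 * volt + 45 * mw * volt + rng.normal(0, 10)

        y = np.array([run_experiment(*row) for row in design])

        labels = ["MW", "voltage", "flow", "MW:voltage", "MW:flow", "voltage:flow"]
        cols = [design[:, 0], design[:, 1], design[:, 2],
                design[:, 0] * design[:, 1], design[:, 0] * design[:, 2],
                design[:, 1] * design[:, 2]]
        for name, c in zip(labels, cols):  # effect = mean(y at +1) - mean(y at -1)
            print(f"{name:>12s}: {y[c > 0].mean() - y[c < 0].mean():7.1f}")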

  15. Health-care process improvement decisions: a systems perspective.

    PubMed

    Walley, Paul; Silvester, Kate; Mountford, Shaun

    2006-01-01

    The paper seeks to investigate decision-making processes within hospital improvement activity, to understand how performance measurement systems influence decisions and potentially lead to unsuccessful or unsustainable process changes. A longitudinal study over a 33-month period investigates key events, decisions and outcomes at one medium-sized hospital in the UK. Process improvement events are monitored using process control methods and by direct observation. The authors took a systems perspective of the health-care processes, ensuring that the impacts of decisions across the health-care supply chain were appropriately interpreted. The research uncovers the ways in which measurement systems disguise failed decisions and encourage managers to take a low-risk approach of "symptomatic relief" when trying to improve performance metrics. This prevents many managers from trying higher-risk, sustainable process improvement changes. The behaviour of the health-care system is not understood by many managers, and this leads to poor analysis of problem situations. Measurement using time-series methodologies, such as statistical process control, is vital for a better understanding of the systems impact of changes. Senior managers must also be aware of the behavioural influence of similar performance measurement systems that discourage sustainable improvement. There is a risk that such experiences will tarnish the reputation of performance management as a discipline. The paper recommends process control measures as a way of creating an organizational memory of how decisions affect performance, something that is currently lacking.

  16. Infective endocarditis detection through SPECT/CT images digital processing

    NASA Astrophysics Data System (ADS)

    Moreno, Albino; Valdés, Raquel; Jiménez, Luis; Vallejo, Enrique; Hernández, Salvador; Soto, Gabriel

    2014-03-01

    Infective endocarditis (IE) is a difficult-to-diagnose pathology, since its manifestation in patients is highly variable. In this work, a semiautomatic algorithm based on digital processing of SPECT images was proposed for the detection of IE, using a CT image volume as a spatial reference. The heart/lung uptake ratio was calculated using the SPECT image information. There were no statistically significant differences between the heart/lung ratios of a group of patients diagnosed with IE (2.62 ± 0.47) and a group of healthy control subjects (2.84 ± 0.68). However, it is necessary to increase the study sample of both the individuals diagnosed with IE and the control group subjects, as well as to improve image quality.

  17. Optical Fourier diffractometry applied to degraded bone structure recognition

    NASA Astrophysics Data System (ADS)

    Galas, Jacek; Godwod, Krzysztof; Szawdyn, Jacek; Sawicki, Andrzej

    1993-09-01

    Image processing and recognition methods are useful in many fields. This paper presents a hybrid optical and digital method applied to the recognition of pathological changes in bones caused by metabolic bone diseases. The trabecular bone structure, recorded by X-ray on photographic film, is analyzed in a new type of computer-controlled diffractometer. The set of image parameters extracted from the diffractogram is evaluated by statistical analysis. Synthetic image descriptors in discriminant space, constructed by discriminant analysis on the basis of three training groups of images (control, osteoporosis, and osteomalacia groups), allow us to recognize bone samples with degraded structure and to identify the disease. About 89% of the images were classified correctly. After optimization, this method will be verified in medical investigations.

  18. Black stains in the mixed dentition: a PCR microbiological study of the etiopathogenic bacteria.

    PubMed

    Saba, C; Solidani, M; Berlutti, F; Vestri, A; Ottolenghi, L; Polimeni, A

    2006-01-01

    The aim of this work is to characterize the particular stains found on the cervical third of the buccal and lingual surfaces in the mixed dentition, called "black stain." Previous research showed the microbiological etiology of this discoloration by chromogenic bacteria. Our study identifies the bacterial species involved in the stains by means of PCR and electrophoresis on agarose gel. The sample comprised 100 subjects with black stain and 100 stain-free control subjects. A statistical analysis (SPSS 10.0) using the χ² test was performed in this study. Porphyromonas gingivalis and Prevotella melaninogenica were not involved, either in black stain subjects or in the controls. On the contrary, Actinomyces could be involved in the pigmentation process.

  19. A Statistics-Based Cracking Criterion of Resin-Bonded Silica Sand for Casting Process Simulation

    NASA Astrophysics Data System (ADS)

    Wang, Huimin; Lu, Yan; Ripplinger, Keith; Detwiler, Duane; Luo, Alan A.

    2017-02-01

    Cracking of sand molds/cores can result in many casting defects such as veining. A robust cracking criterion is needed in casting process simulation for predicting/controlling such defects. A cracking probability map, relating fracture stress and effective volume, was proposed for resin-bonded silica sand based on Weibull statistics. Three-point bending test results of sand samples were used to generate the cracking map and set up a safety line as the cracking criterion. Tensile test results confirmed the accuracy of the safety line for cracking prediction. A laboratory casting experiment was designed and carried out to predict cracking of a cup mold during aluminum casting. The stress-strain behavior and the effective volume of the cup molds were calculated using the finite element analysis code ProCAST®. Furthermore, an energy dispersive spectroscopy fractographic examination of the sand samples confirmed binder cracking in the resin-bonded silica sand.
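
    The Weibull side of such an approach can be sketched as follows: fit the modulus and scale from ranked fracture stresses, then map stress and effective volume to a cracking probability with the weakest-link form. The stresses are invented, and the fit is a simple median-rank regression, not necessarily the authors' procedure:

        import numpy as np

        # invented three-point-bending fracture stresses (MPa) of resin-bonded sand samples
        stress = np.sort(np.array([2.1, 2.4, 2.6, 2.7, 2.9, 3.0, 3.2, 3.3, 3.6, 3.9]))
        n = len(stress)
        f = (np.arange(1, n + 1) - 0.3) / (n + 0.4)  # median-rank failure probabilities

        # linearized Weibull fit: ln(-ln(1-F)) = m*ln(sigma) - m*ln(sigma0)
        y = np.log(-np.log(1 - f))
        m, c = np.polyfit(np.log(stress), y, 1)
        sigma0 = np.exp(-c / m)
        print(f"Weibull modulus m = {m:.2f}, scale sigma0 = {sigma0:.2f} MPa")

        def crack_probability(sigma, volume, v_ref=1.0):
            """Weakest-link cracking probability with effective-volume scaling."""
            return 1 - np.exp(-(volume / v_ref) * (sigma / sigma0) ** m)

        print(f"P(crack) at 2.5 MPa and twice the reference volume: {crack_probability(2.5, 2.0):.2f}")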

  20. Implications of clinical trial design on sample size requirements.

    PubMed

    Leon, Andrew C

    2008-07-01

    The primary goal in designing a randomized controlled clinical trial (RCT) is to minimize bias in the estimate of treatment effect. Randomized group assignment, double-blinded assessments, and control or comparison groups reduce the risk of bias. The design must also provide sufficient statistical power to detect a clinically meaningful treatment effect and maintain a nominal level of type I error. An attempt to integrate neurocognitive science into an RCT poses additional challenges. Two particularly relevant aspects of such a design often receive insufficient attention in an RCT. Multiple outcomes inflate type I error, and an unreliable assessment process introduces bias and reduces statistical power. Here we describe how both unreliability and multiple outcomes can increase the study costs and duration and reduce the feasibility of the study. The objective of this article is to consider strategies that overcome the problems of unreliability and multiplicity.
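
    The two costs named above can be made concrete with a normal-approximation sample-size formula: unreliability attenuates the standardized effect by the square root of the outcome's reliability, and multiple outcomes shrink the per-test alpha (here via Bonferroni). This is an illustrative calculation, not the article's own:

        from math import ceil
        from scipy.stats import norm

        def n_per_group(d, alpha=0.05, power=0.8, n_outcomes=1, reliability=1.0):
            """Two-sample normal-approximation sample size per group."""
            d_obs = d * reliability ** 0.5               # attenuated effect size
            za = norm.ppf(1 - alpha / (2 * n_outcomes))  # Bonferroni-adjusted alpha
            zb = norm.ppf(power)
            return ceil(2 * ((za + zb) / d_obs) ** 2)

        print(n_per_group(0.5))                                 # one reliable outcome: 63/group
        print(n_per_group(0.5, n_outcomes=5, reliability=0.7))  # multiplicity + unreliability inflate n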

  1. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to a statistical process control (SPC) algorithm. The basic idea is to raise an alarm when the sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, such industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied in SPC systems and evaluate their capability to detect e.g. known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu/index.php/ and http://earthsystemdatacube.net/. Known anomalies such as the Russian heatwave are detected, as well as anomalies which are not detectable with univariate methods.

  2. Fault Detection and Diagnosis In Hall-Héroult Cells Based on Individual Anode Current Measurements Using Dynamic Kernel PCA

    NASA Astrophysics Data System (ADS)

    Yao, Yuchen; Bao, Jie; Skyllas-Kazacos, Maria; Welch, Barry J.; Akhmetov, Sergey

    2018-04-01

    Individual anode current signals in aluminum reduction cells provide localized cell conditions in the vicinity of each anode, which contain more information than the conventionally measured cell voltage and line current. One common use of this measurement is to identify process faults that can cause significant changes in the anode current signals. While this method is simple and direct, it ignores the interactions between anode currents and other important process variables. This paper presents an approach that applies multivariate statistical analysis techniques to individual anode currents and other process operating data, for the detection and diagnosis of local process abnormalities in aluminum reduction cells. Specifically, since the Hall-Héroult process is time-varying with its process variables dynamically and nonlinearly correlated, dynamic kernel principal component analysis with moving windows is used. The cell is discretized into a number of subsystems, with each subsystem representing one anode and cell conditions in its vicinity. The fault associated with each subsystem is identified based on multivariate statistical control charts. The results show that the proposed approach is able to not only effectively pinpoint the problematic areas in the cell, but also assess the effect of the fault on different parts of the cell.
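
    A static, simplified sketch of the kernel-PCA monitoring idea follows (the paper's method additionally uses time-lagged variables and moving windows): train on an in-control window, then flag samples whose input-space reconstruction error exceeds a baseline limit. Data, fault location, and parameters are all invented:

        import numpy as np
        from sklearn.decomposition import KernelPCA

        rng = np.random.default_rng(9)

        # invented anode-current-like data: 20 correlated channels, local fault after t = 400
        base = rng.standard_normal((500, 5)) @ rng.standard_normal((5, 20))
        base[400:, 3] += 4.0  # shift on one "anode" channel

        window = 200
        kpca = KernelPCA(n_components=4, kernel="rbf", gamma=0.05, fit_inverse_transform=True)
        kpca.fit(base[:window])  # train on an in-control window

        # monitoring statistic: squared reconstruction error in the input space
        recon = kpca.inverse_transform(kpca.transform(base))
        spe = ((base - recon) ** 2).sum(axis=1)
        limit = np.percentile(spe[:window], 99)
        print("first alarms:", np.nonzero(spe > limit)[0][:5])  # expected mostly after t = 400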

  3. SU-F-T-294: The Analysis of Gamma Criteria for Delta4 Dosimetry Using Statistical Process Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cho, S; Ahn, S; Kim, J

    Purpose: To evaluate the sensitivity of gamma criteria for patient-specific volumetric modulated arc therapy (VMAT) quality assurance with the Delta4 dosimetry program using the statistical process control (SPC) methodology. Methods: The authors selected 20 patient-specific VMAT QA cases that had undergone MapCHECK and ArcCHECK QA with gamma pass rates better than 97%. The QA data were collected with a Delta4 Phantom+ on an Elekta Agility six-megavolt beam without using an angle incrementer. The gamma index (GI) was calculated in 2D planes with deviation normalized to local dose (local gamma). The sensitivity of the GI methodology using criteria of 3%/3mm, 3%/2mm and 2%/3mm was analyzed using process acceptability indices. We used the local confidence (LC) level and the upper and lower control limits (UCL, LCL) of the I-MR chart to compute a process capability index (Cp) and a process acceptability index (Cpk). Results: The lower local confidence levels for 3%/3mm, 3%/2mm and 2%/3mm were 92.0%, 83.6% and 78.8%, respectively. All of the Cp and Cpk values calculated from the LC level were under 1.0 in this study. The calculated LCLs of the I-MR charts were 89.5%, 79.0% and 70.5%, respectively; the capability indices derived from these limits were higher than 1.0, which indicates good QA quality. For the generally used lower limit of 90%, the Cp value was above 1.3 for the 3%/3mm gamma criterion and below 1.0 for the other criteria. Conclusion: We applied the SPC methodology to evaluate the sensitivity of gamma criteria and to derive lower control limits of VMAT QA for Delta4 dosimetry; the Delta4 Phantom+ dosimetry was more affected by position error, and the I-MR chart-derived values are more suitable for establishing lower limits. Acknowledgement: This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (No. 2015R1D1A1A01060463).
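
    A minimal sketch of the capability computation used in studies like this one: with gamma pass rates, only a lower specification limit applies, so Cpk reduces to the one-sided index against that limit, and sigma can be taken either from the sample or from the moving ranges of an I-MR chart. The pass rates below are invented:

        import numpy as np

        rates = np.array([98.2, 97.5, 99.1, 96.8, 98.8, 97.9, 99.4, 98.1,
                          97.2, 98.6, 96.9, 99.0, 98.4, 97.7, 98.9, 98.0])  # invented pass rates (%)

        lsl = 90.0  # lower specification limit; higher pass rates are better
        mu = rates.mean()

        cpk_sample = (mu - lsl) / (3 * rates.std(ddof=1))  # one-sided index, sample sigma
        print(f"Cpk (sample sigma) = {cpk_sample:.2f}")

        mr = np.abs(np.diff(rates)).mean()                 # I-MR estimate of sigma
        cpk_imr = (mu - lsl) / (3 * mr / 1.128)
        print(f"Cpk (I-MR sigma)   = {cpk_imr:.2f}")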

  4. Improved communication in post-ICU care by improving writing of ICU discharge letters: a longitudinal before-after study.

    PubMed

    Medlock, Stephanie; Eslami, Saeid; Askari, Marjan; van Lieshout, Erik Jan; Dongelmans, Dave A; Abu-Hanna, Ameen

    2011-11-01

    The discharge letter is the primary means of communication at patient discharge, yet discharge letters are often not completed on time. A multifaceted intervention was performed to improve communication in patient hand-off from the intensive care unit (ICU) to the wards by improving the timeliness of discharge letters. A management directive was operationalised by a working group of ICU staff in a longitudinal before-after study. The intervention consisted of (a) changing policy to require a letter for use as a transfer note at the time of ICU discharge, (b) changing the assignment of responsibility to an automatic process, (c) leveraging positive peer pressure by making the list of patients in need of letters visible to colleagues and (d) provision of decision support, through automatic copying of important content from the patient record to the letter and email reminders if letters were not written on time. Statistical process control charts were used to monitor the longitudinal effect of the intervention. The intervention resulted in a 77.9% absolute improvement in the proportion of patients with a complete transfer note at the time of discharge, and an 85.2% absolute improvement in the number of discharge letters written. Statistical process control shows that the effect was sustained over time. A multifaceted intervention can be highly effective for improving discharge communication from the ICU.

  5. Geochemical processes controlling water salinization in an irrigated basin in Spain: identification of natural and anthropogenic influence.

    PubMed

    Merchán, D; Auqué, L F; Acero, P; Gimeno, M J; Causapé, J

    2015-01-01

    Salinization of water bodies represents a significant risk in water systems. The salinization of waters in a small irrigated hydrological basin is studied herein through an integrated hydrogeochemical study including multivariate statistical analyses and geochemical modeling. The study zone has two well-differentiated geologic materials: (i) Quaternary sediments of low salinity and high permeability and (ii) Tertiary sediments of high salinity and very low permeability. In this work, soil samples were collected and leaching experiments conducted on them in the laboratory. In addition, water samples were collected from precipitation, irrigation, groundwater, spring and surface waters. The waters show an increase in salinity from precipitation and irrigation water to ground- and, finally, surface water. The enrichment in salinity is related to the dissolution of soluble minerals present mainly in the Tertiary materials. Cation exchange, precipitation of calcite and, probably, incongruent dissolution of dolomite have been inferred from the hydrochemical data set. Multivariate statistical analysis provided information about the structure of the data, differentiating the group of surface waters from the groundwaters and the salinization from the nitrate pollution processes. The available information was included in geochemical models in which hypotheses of consistency and thermodynamic feasibility were checked. The assessment of the collected information pointed to a natural control on salinization processes in the Lerma Basin with minimal influence of anthropogenic factors. Copyright © 2014 Elsevier B.V. All rights reserved.

  6. Statistical process control analysis for patient quality assurance of intensity modulated radiation therapy

    NASA Astrophysics Data System (ADS)

    Lee, Rena; Kim, Kyubo; Cho, Samju; Lim, Sangwook; Lee, Suk; Shim, Jang Bo; Huh, Hyun Do; Lee, Sang Hoon; Ahn, Sohyun

    2017-11-01

    This study applied statistical process control to set and verify quality assurance (QA) tolerance standards tailored to our hospital's characteristics, with the same criteria applied to all treatment sites in the analysis. The gamma test criterion for delivery quality assurance (DQA) was 3%/3 mm. Head and neck, breast and prostate cases treated with intensity modulated radiation therapy (IMRT) or volumetric modulated arc therapy (VMAT) were selected for the analysis of the QA treatment sites. The numbers of data sets used in the analysis were 73 and 68 for head and neck patients, and 49 and 152 for prostate and breast, measured by MapCHECK and ArcCHECK, respectively. The Cp values of the head and neck and prostate QA were above 1.0, and the Cpml values were 1.53 and 1.71, respectively, close to the target value of 100%. The Cpml value of breast IMRT QA was 1.67, with data values close to the target value of 95%, but the Cp value was 0.90, which means that the data values are widely distributed. The Cp and Cpml of breast VMAT QA were 1.07 and 2.10, respectively. This suggests that the VMAT QA has better process capability than the IMRT QA. Consequently, we should pay more attention to planning and QA before treatment for breast radiotherapy.

  7. A quality control circle process to improve implementation effect of prevention measures for high-risk patients.

    PubMed

    Feng, Haixia; Li, Guohong; Xu, Cuirong; Ju, Changping; Suo, Peiheng

    2017-12-01

    The aim of the study was to analyse the influence of prevention measures on pressure injuries in high-risk patients and to establish the most appropriate methods of implementation. Nurses assessed patients using a checklist, and factors influencing the prevention of pressure injury were determined by brainstorming. A specific series of measures was drawn up, and estimates of the risk of pressure injury using the Braden Scale, analysis of nursing documents, implementation of prevention measures for pressure sores, and awareness of the system were obtained both before and after carrying out a quality control circle (QCC) process. The overall scores for implementation of prevention measures improved from 74.86 ± 14.24 to 87.06 ± 17.04, a result that was statistically significant (P < 0.0025). The Braden Scale scores improved from 8.53 ± 3.21 to 13.48 ± 3.57. The nursing document scores improved from 7.67 ± 3.98 to 10.12 ± 1.63; prevention measure scores improved from 11.48 ± 4.18 to 13.96 ± 3.92. The differences in all of the above results are statistically significant (P < 0.05). Implementation of a QCC can standardise and improve prevention measures for patients who are vulnerable to pressure sores and is of practical importance to their prevention and control. © 2017 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  8. Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes

    NASA Astrophysics Data System (ADS)

    Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana

    2015-04-01

    The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, which constitute a usual tool in Statistical Process Control (SPC) but an unusual one in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to conclude whether or not there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme observed rainfall days and normal observed rainfall days. The autocorrelation among maximum precipitations is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes in the remaining years under study can then be monitored by such attributes control charts. The results of the application of this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
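
    The attributes-chart core of the approach, a p chart on the annual fraction of extreme rainfall days, can be sketched as follows; this toy version assumes independent days and therefore omits the paper's Markov-extended binomial correction for autocorrelation:

        import numpy as np

        rng = np.random.default_rng(10)
        n_days = 365
        # invented yearly counts of extreme rainfall days: 30 baseline years, then a shift
        counts = np.concatenate([rng.binomial(n_days, 0.05, 30),
                                 rng.binomial(n_days, 0.08, 20)])

        p_bar = counts[:30].sum() / (30 * n_days)  # baseline fraction from the first 30 years
        ucl = p_bar + 3 * np.sqrt(p_bar * (1 - p_bar) / n_days)

        for year, c in enumerate(counts, start=1):
            if c / n_days > ucl:
                print(f"year {year}: fraction {c / n_days:.3f} exceeds UCL {ucl:.3f}")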

  9. Treatment of missing data in follow-up studies of randomised controlled trials: A systematic review of the literature.

    PubMed

    Sullivan, Thomas R; Yelland, Lisa N; Lee, Katherine J; Ryan, Philip; Salter, Amy B

    2017-08-01

    After completion of a randomised controlled trial, an extended follow-up period may be initiated to learn about longer term impacts of the intervention. Since extended follow-up studies often involve additional eligibility restrictions and consent processes for participation, and a longer duration of follow-up entails a greater risk of participant attrition, missing data can be a considerable threat in this setting. As a potential source of bias, it is critical that missing data are appropriately handled in the statistical analysis, yet little is known about the treatment of missing data in extended follow-up studies. The aims of this review were to summarise the extent of missing data in extended follow-up studies and the use of statistical approaches to address this potentially serious problem. We performed a systematic literature search in PubMed to identify extended follow-up studies published from January to June 2015. Studies were eligible for inclusion if the original randomised controlled trial results were also published and if the main objective of extended follow-up was to compare the original randomised groups. We recorded information on the extent of missing data and the approach used to treat missing data in the statistical analysis of the primary outcome of the extended follow-up study. Of the 81 studies included in the review, 36 (44%) reported additional eligibility restrictions and 24 (30%) consent processes for entry into extended follow-up. Data were collected at a median of 7 years after randomisation. Excluding 28 studies with a time to event primary outcome, 51/53 studies (96%) reported missing data on the primary outcome. The median percentage of randomised participants with complete data on the primary outcome was just 66% in these studies. The most common statistical approach to address missing data was complete case analysis (51% of studies), while likelihood-based analyses were also well represented (25%). Sensitivity analyses around the missing data mechanism were rarely performed (25% of studies), and when they were, they often involved unrealistic assumptions about the mechanism. Despite missing data being a serious problem in extended follow-up studies, statistical approaches to addressing missing data were often inadequate. We recommend researchers clearly specify all sources of missing data in follow-up studies and use statistical methods that are valid under a plausible assumption about the missing data mechanism. Sensitivity analyses should also be undertaken to assess the robustness of findings to assumptions about the missing data mechanism.

  10. No association of SORL1 SNPs with Alzheimer’s disease

    PubMed Central

    Minster, Ryan L.; DeKosky, Steven T.; Kamboh, M. Ilyas

    2008-01-01

    SORL1 is an element of the amyloid precursor protein processing pathway and is therefore a good candidate for affecting Alzheimer’s disease (AD) risk. Indeed, there have been reports of associations between variation in SORL1 and AD risk. We examined six statistically significant single-nucleotide polymorphisms from the initial observation in a large Caucasian American case–control cohort (1000 late-onset AD [LOAD] cases and 1000 older controls). Analysis of allele, genotype and haplotype frequencies revealed no association with LOAD risk in our cohort. PMID:18562096

  11. [Combined Plackett-Burman and Box-Behnken design to control formation of indirubin in the process of preparing indigo naturalis].

    PubMed

    Liu, Zeyu; Su, Zhetong; Yang, Ming; Zou, Wenquan

    2010-10-01

    To screen the factors that significantly affect indirubin formation in the process of preparing indigo naturalis, to optimize their level combination, and to determine the optimum technology for indirubin formation. Using the concentration of indirubin (mg x g(-1)) generated from fresh leaf as the index, and Plackett-Burman design with Box-Behnken response surface analysis as the statistical methods, we screened the significantly influencing factors and the optimal level combination. The optimal soaking process for forming indirubin in the preparation of indigo naturalis was identified as follows: wax not removed before immersion, immersion pH 7, solvent volume to leaf weight ratio (mL:g) of 15, soaking not protected from light, soaking for 48 h at 60 degrees C, ventilation time of 180 min, and ammonia water added to adjust the pH to 10.5. The soaking step that forms indirubin in the preparation of indigo naturalis is thus optimized systematically. The study clarifies the impact of the various factors on the active ingredient indirubin, makes controlled industrial production of indigo naturalis a realistic prospect, and lays a foundation for the processing principles of indigo naturalis.
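
    Plackett-Burman designs screen many factors in few runs by assigning each factor to a column of an orthogonal +1/-1 matrix built from cyclic shifts of a generator row. A minimal sketch of the standard 12-run construction and main-effect estimation follows; the responses and effect sizes are invented for illustration and do not reproduce the paper's data.

        import numpy as np

        # Standard generator row for the 12-run Plackett-Burman design (11 factors).
        gen = np.array([1, 1, -1, 1, 1, 1, -1, -1, -1, 1, -1])

        # Rows 1..11 are cyclic shifts of the generator; row 12 is all -1.
        design = np.vstack([np.roll(gen, i) for i in range(11)]
                           + [-np.ones(11, dtype=int)])

        # Hypothetical responses: indirubin yield (mg/g) for each of the 12 runs,
        # with three active factors planted for the demonstration.
        rng = np.random.default_rng(1)
        true_effects = np.array([0.8, 0, 0, 2.1, 0, 0, 0, -1.4, 0, 0, 0])
        y = 5.0 + design @ true_effects / 2 + rng.normal(0, 0.3, size=12)

        # Main effect of each factor: mean response at +1 minus mean at -1.
        effects = np.array([y[design[:, j] == 1].mean()
                            - y[design[:, j] == -1].mean() for j in range(11)])
        for j, e in enumerate(effects):
            print(f"factor {j + 1}: effect {e:+.2f}")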

  12. Statistical Control Paradigm for Aerospace Structures Under Impulsive Disturbances

    DTIC Science & Technology

    2006-08-03

    attitude control system with an innovative and robust statistical controller design shows significant promise for use in attitude hold mode operation...and three thrusters are for use in controlling the attitude of the satellite. Then the angular momentum of the satellite with three thrusters and a

  13. Metabolic Syndrome in Psoriasis among Urban South Indians: A Case Control Study Using SAM-NCEP Criteria.

    PubMed

    Girisha, Banavasi S; Thomas, Neetha

    2017-02-01

    Psoriasis is a chronic inflammatory disease of the skin associated with increased cardiovascular morbidity. Metabolic syndrome is a significant predictor of cardiovascular events. To assess the association of metabolic syndrome and its components in patients with psoriasis and to compare it with an age- and sex-matched control group. We conducted a hospital-based case-control study on 156 adult patients with chronic plaque psoriasis and 156 patients with skin diseases other than psoriasis. Height, weight, BMI, blood pressure and waist circumference were documented in all the subjects. Fasting levels of serum glucose, serum triglycerides and serum HDL were estimated by an automated clinical chemistry analyzer. The South Asian modified NCEP ATP (SAM-NCEP) criteria were used for the diagnosis of metabolic syndrome. Statistical analysis of the data was done using statistical processing software (SPSS-17). Metabolic syndrome was significantly more common in psoriatic patients than in controls (28.8% vs 16.7%, p=0.01). Hypertriglyceridemia was significantly more prevalent in cases than in controls (34% vs 20.5%, p=0.008). Reduced HDL levels also showed a significantly higher occurrence among cases (27.6% vs 13.5%, p=0.002). A moderate increase in blood pressure was seen among cases as compared to controls, but the difference was not statistically significant (p=0.1). Impaired blood glucose and abdominal obesity were similar in both groups. Smoking and alcoholism did not influence the association of metabolic syndrome with psoriasis. There was no correlation of metabolic syndrome with severity and duration of psoriasis. Our findings suggest that metabolic syndrome as well as dyslipidemia are common in psoriasis patients among urban South Indians. This study highlights the need for screening at diagnosis and regular follow up of the metabolic aspects of the disease along with the skin lesions.

  14. Impact of Performance Improvement Continuing Medical Education on Cardiometabolic Risk Factor Control: The COSEHC Initiative

    PubMed Central

    Joyner, JaNae; Moore, Michael A.; Simmons, Debra R.; Forrest, Brian; Yu-Isenberg, Kristina; Piccione, Ron; Caton, Kirt; Lackland, Daniel T.; Ferrario, Carlos M.

    2016-01-01

    Introduction The Consortium for Southeastern Hypertension Control (COSEHC) implemented a study to assess benefits of a performance improvement continuing medical education (PI CME) activity focused on cardiometabolic risk factor management in primary care patients. Methods Using the plan-do-study-act (PDSA) model as the foundation, this PI CME activity aimed to close practice gaps by integrating evidence-based clinical interventions, physician-patient education, processes of care, performance metrics, and patient outcomes. The PI CME intervention was implemented in a group of South Carolina physician practices, while a comparable physician practice group served as a control. Performance outcomes at 6 months included changes in patients’ cardiometabolic risk factor values and control rates from baseline. We also compared changes in diabetic, African American, elderly (>65 years), and female patient subpopulations and in patients with uncontrolled risk factors at baseline. Results Only women receiving health care from intervention physicians showed a statistically significant improvement in their cardiometabolic risk factors, as evidenced by decreases of 3.0 mg/dL and 3.5 mg/dL in mean LDL cholesterol and non-HDL cholesterol, respectively, and a 7.0 mg/dL decrease in LDL cholesterol among females with uncontrolled baseline LDL cholesterol values. No other statistically significant differences were found. Discussion These data demonstrate that our PI CME activity is a useful strategy in assisting physicians to improve their management of cardiometabolic control rates in female patients with abnormal cholesterol control. Other studies that extend across longer PI CME PDSA periods may be needed to demonstrate statistical improvements in overall cardiometabolic treatment goals in men, women, and various subpopulations. PMID:24648361

  15. Evaluation of two spike-and-recovery controls for assessment of extraction efficiency in microbial source tracking studies

    USGS Publications Warehouse

    Stoeckel, D.M.; Stelzer, E.A.; Dick, L.K.

    2009-01-01

    Quantitative PCR (qPCR), applied to complex environmental samples such as water, wastewater, and feces, is susceptible to methodological and sample-related biases. In this study, we evaluated two exogenous DNA spike-and-recovery controls as proxies for recovery efficiency of Bacteroidales 16S rDNA gene sequences (AllBac and qHF183) that are used for microbial source tracking (MST) in river water. Two controls, (1) the plant pathogen Pantoea stewartii, carrying the chromosomal target gene cpsD, and (2) Escherichia coli, carrying the plasmid-borne target gene DsRed2, were added to raw water samples immediately prior to concentration and DNA extraction for qPCR. When applied to samples processed in replicate, recovery of each control was positively correlated with the observed concentration of each MST marker. Adjustment of MST marker concentrations according to recovery efficiency reduced variability in replicate analyses when consistent processing and extraction methodologies were applied. Although the effects of this procedure on accuracy could not be tested due to uncertainties in control DNA concentrations, the observed reduction in variability should improve the strength of statistical comparisons. These findings suggest that either of the tested spike-and-recovery controls can be useful to measure efficiency of extraction and recovery in routine laboratory processing. © 2009 Elsevier Ltd.
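
    The adjustment the authors describe reduces, in essence, to dividing each observed marker concentration by the recovery efficiency measured for the spiked control in the same sample. A minimal sketch with invented numbers:

        # Hypothetical numbers purely for illustration.
        spiked_copies = 1.0e5       # control gene copies added before extraction
        recovered_copies = 6.2e4    # control gene copies measured by qPCR afterwards

        recovery = recovered_copies / spiked_copies   # extraction/recovery efficiency
        observed_marker = 3.4e3                       # MST marker copies as measured
        adjusted_marker = observed_marker / recovery  # recovery-adjusted estimate

        print(f"recovery efficiency: {recovery:.0%}")
        print(f"adjusted marker concentration: {adjusted_marker:.3g} copies")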

  16. Three-level sampler having automated thresholds

    NASA Technical Reports Server (NTRS)

    Jurgens, R. F.

    1976-01-01

    A three-level sampler is described that has its thresholds controlled automatically so as to track changes in the statistics of the random process being sampled. In particular, the mean value is removed and the ratio of the standard deviation of the random process to the threshold is maintained constant. The system is configured in such a manner that slow drifts in the level comparators and digital-to-analog converters are also removed. The ratio of the standard deviation to threshold level may be chosen within the constraints of the ratios of two integers N and M. These may be chosen to minimize the quantizing noise of the sampled process.
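
    A software analogue of this sampler can be sketched as follows: running estimates of the mean and standard deviation stand in for the analog tracking loops, the mean is removed, and the thresholds are held at (N/M) times the running standard deviation before quantizing to three levels. This is a hypothetical digital rendering of the idea, not the 1976 hardware design.

        import numpy as np

        def three_level_sample(x, n=2, m=3, alpha=0.01):
            """Quantize samples to {-1, 0, +1} with thresholds that track the
            running mean and standard deviation of the input, holding the
            threshold-to-sigma ratio at n/m (cf. the integers N and M above)."""
            mean, var = 0.0, 1.0
            out = np.empty(len(x), dtype=int)
            for i, xi in enumerate(x):
                # Remove the running mean, then compare against +/- (n/m)*sigma.
                thresh = (n / m) * np.sqrt(var)
                centred = xi - mean
                out[i] = int(centred > thresh) - int(centred < -thresh)
                # Slowly track drifting statistics of the random process.
                mean += alpha * centred
                var += alpha * (centred**2 - var)
            return out

        rng = np.random.default_rng(2)
        signal = 0.5 + 2.0 * rng.standard_normal(10_000)  # offset, rescaled noise
        levels = three_level_sample(signal)
        print(np.bincount(levels + 1) / len(levels))      # fractions at -1, 0, +1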

  17. TQM: the essential concepts.

    PubMed

    Chambers, D W

    1998-01-01

    This is an introduction to the major concepts in total quality management, a loose collection of management approaches that focus on continuous improvement of processes, guided by routine data collection and adjustment of the processes. Customer focus and involvement of all members of an organization are also characteristics commonly found in TQM. The seventy-five-year history of the movement is sketched from its beginning in statistical work on quality assurance through the many improvements and redefinitions added by American and Japanese thinkers. Essential concepts covered include: control cycles, focus on the process rather than the defects, the GEAR model, importance of the customer, upstream quality, just-in-time, kaizen, and service quality.

  18. Statistical tools and control of internal lubricant content of inhalation grade HPMC capsules during manufacture.

    PubMed

    Ayala, Guillermo; Díez, Fernando; Gassó, María T; Jones, Brian E; Martín-Portugués, Rafael; Ramiro-Aparicio, Juan

    2016-04-30

    The internal lubricant content (ILC) of inhalation grade HPMC capsules is a key factor in ensuring good powder release when the patient inhales a medicine from a dry powder inhaler (DPI). Powder release from capsules has been shown to be influenced by the ILC. The characteristics used to measure this are the emitted dose, fine particle fraction and mass median aerodynamic diameter. In addition, the ILC level is critical for capsule shell manufacture because it is an essential part of a process that cannot work without it. A designed experiment was applied to the manufacture of inhalation capsules to achieve the required ILC. A full factorial model was used to identify the controlling factors, and from this a linear model has been proposed to improve control of the process. Copyright © 2016 Elsevier B.V. All rights reserved.
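
    The route from a full factorial experiment to a linear control model can be illustrated with a toy two-level design in three coded factors: run all eight combinations and fit main effects by least squares. The factor names and responses below are invented, not the study's variables.

        import numpy as np
        from itertools import product

        # Hypothetical 2^3 full factorial in three coded process factors
        # (e.g. lubricant feed rate, drum temperature, line speed -- names assumed).
        design = np.array(list(product([-1, 1], repeat=3)), dtype=float)

        # Hypothetical measured internal lubricant content (ILC, %w/w) per run.
        rng = np.random.default_rng(3)
        ilc = (0.20 + 0.05 * design[:, 0] - 0.02 * design[:, 2]
               + rng.normal(0, 0.005, 8))

        # Fit a main-effects linear model: ILC ~ b0 + b1*x1 + b2*x2 + b3*x3.
        X = np.column_stack([np.ones(8), design])
        coef, *_ = np.linalg.lstsq(X, ilc, rcond=None)
        print(dict(zip(["intercept", "x1", "x2", "x3"], np.round(coef, 4))))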

  19. Corra: Computational framework and tools for LC-MS discovery and targeted mass spectrometry-based proteomics

    PubMed Central

    Brusniak, Mi-Youn; Bodenmiller, Bernd; Campbell, David; Cooke, Kelly; Eddes, James; Garbutt, Andrew; Lau, Hollis; Letarte, Simon; Mueller, Lukas N; Sharma, Vagisha; Vitek, Olga; Zhang, Ning; Aebersold, Ruedi; Watts, Julian D

    2008-01-01

    Background Quantitative proteomics holds great promise for identifying proteins that are differentially abundant between populations representing different physiological or disease states. A range of computational tools is now available for both isotopically labeled and label-free liquid chromatography mass spectrometry (LC-MS) based quantitative proteomics. However, they are generally not comparable to each other in terms of functionality, user interfaces, and information input/output, and they do not readily facilitate appropriate statistical data analysis. These limitations, along with the array of choices, present a daunting prospect for biologists, and other researchers not trained in bioinformatics, who wish to use LC-MS-based quantitative proteomics. Results We have developed Corra, a computational framework and tools for discovery-based LC-MS proteomics. Corra extends and adapts existing algorithms used for LC-MS-based proteomics, as well as statistical algorithms originally developed for microarray data analyses that are appropriate for LC-MS data analysis. Corra also adapts software engineering technologies (e.g. Google Web Toolkit, distributed processing) so that computationally intense data processing and statistical analyses can run on a remote server, while the user controls and manages the process from their own computer via a simple web interface. Corra also allows the user to output significantly differentially abundant LC-MS-detected peptide features in a form compatible with subsequent sequence identification via tandem mass spectrometry (MS/MS). We present two case studies to illustrate the application of Corra to commonly performed LC-MS-based biological workflows: a pilot biomarker discovery study of glycoproteins isolated from human plasma samples relevant to type 2 diabetes, and a study in yeast to identify in vivo targets of the protein kinase Ark1 via phosphopeptide profiling. Conclusion The Corra computational framework leverages computational innovation to enable biologists or other researchers to process, analyze and visualize LC-MS data with what would otherwise be a complex and not user-friendly suite of tools. Corra enables appropriate statistical analyses, with controlled false-discovery rates, ultimately to inform subsequent targeted identification of differentially abundant peptides by MS/MS. For the user not trained in bioinformatics, Corra represents a complete, customizable, free and open source computational platform enabling LC-MS-based proteomic workflows, and as such, addresses an unmet need in the LC-MS proteomics field. PMID:19087345

  20. Successful classification of cocaine dependence using brain imaging: a generalizable machine learning approach.

    PubMed

    Mete, Mutlu; Sakoglu, Unal; Spence, Jeffrey S; Devous, Michael D; Harris, Thomas S; Adinoff, Bryon

    2016-10-06

    Neuroimaging studies have yielded significant advances in the understanding of neural processes relevant to the development and persistence of addiction. However, these advances have not been explored extensively for diagnostic accuracy in human subjects. The aim of this study was to develop a statistical approach, using a machine learning framework, to correctly classify brain images of cocaine-dependent participants and healthy controls. In this study, a framework suitable for identifying potential brain regions that differed between the two groups was developed and implemented. Single Photon Emission Computerized Tomography (SPECT) images obtained during rest or a saline infusion in three cohorts of 2-4 week abstinent cocaine-dependent participants (n = 93) and healthy controls (n = 69) were used to develop a classification model. An information theoretic-based feature selection algorithm was first conducted to reduce the number of voxels. A density-based clustering algorithm was then used to form spatially connected voxel clouds in three-dimensional space. A statistical classifier, the Support Vector Machine (SVM), was then used for participant classification. Statistically insignificant voxels of spatially connected brain regions were removed iteratively and classification accuracy was reported through the iterations. The voxel-based analysis identified 1,500 spatially connected voxels in 30 distinct clusters after a grid search over SVM parameters. Participants were successfully classified with 0.88 and 0.89 F-measure accuracies in 10-fold cross validation (10xCV) and leave-one-out (LOO) approaches, respectively. Sensitivity and specificity were 0.90 and 0.89 for LOO; 0.83 and 0.83 for 10xCV. Many of the 30 selected clusters are highly relevant to the addictive process, including regions relevant to cognitive control, default mode network related self-referential thought, behavioral inhibition, and contextual memories. Relative hyperactivity and hypoactivity of regional cerebral blood flow in brain regions in cocaine-dependent participants are presented with corresponding levels of significance. The SVM-based approach successfully classified cocaine-dependent and healthy control participants using voxels selected with information theoretic-based and statistical methods from participants' SPECT data. The regions found in this study align with brain regions reported in the literature. These findings support the future use of brain imaging and SVM-based classifiers in the diagnosis of substance use disorders and furthering an understanding of their underlying pathology.
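
    The shape of this pipeline (information-theoretic feature selection, then an SVM, scored by cross-validated F-measure) can be sketched with scikit-learn on synthetic data. The density-based spatial clustering step between selection and classification is omitted, and the feature counts are invented; nothing here touches the study's SPECT data.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, mutual_info_classif
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        # Synthetic stand-in for per-voxel features, 162 "participants".
        X, y = make_classification(n_samples=162, n_features=2000,
                                   n_informative=50, random_state=0)

        # Information-theoretic selection followed by an RBF SVM, mirroring the
        # pipeline's shape (the paper's clustering step is skipped here).
        clf = make_pipeline(StandardScaler(),
                            SelectKBest(mutual_info_classif, k=200),
                            SVC(kernel="rbf", C=1.0, gamma="scale"))

        scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
        print(f"10-fold CV F-measure: {scores.mean():.2f} +/- {scores.std():.2f}")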

  1. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.

  2. Comparing and combining process-based crop models and statistical models with some implications for climate change

    NASA Astrophysics Data System (ADS)

    Roberts, Michael J.; Braun, Noah O.; Sinclair, Thomas R.; Lobell, David B.; Schlenker, Wolfram

    2017-09-01

    We compare predictions of a simple process-based crop model (Soltani and Sinclair 2012), a simple statistical model (Schlenker and Roberts 2009), and a combination of both models to actual maize yields on a large, representative sample of farmer-managed fields in the Corn Belt region of the United States. After statistical post-model calibration, the process model (Simple Simulation Model, or SSM) predicts actual outcomes slightly better than the statistical model, but the combined model performs significantly better than either model. The SSM, statistical model and combined model all show similar relationships with precipitation, while the SSM better accounts for temporal patterns of precipitation, vapor pressure deficit and solar radiation. The statistical and combined models show a more negative impact associated with extreme heat for which the process model does not account. Due to the extreme heat effect, predicted impacts under uniform climate change scenarios are considerably more severe for the statistical and combined models than for the process-based model.
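
    One simple way to realize a "combined model" is to regress observed yields on the two models' predictions, letting the data weight each source. The sketch below does this on synthetic data; it is a generic stacking illustration under invented numbers, not the authors' post-model calibration procedure, and a real study would evaluate the combination out of sample.

        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(4)
        n = 500
        truth = rng.normal(160, 25, n)               # hypothetical maize yields

        # Stand-ins for the two models' predictions, each noisy in its own way.
        process_pred = truth + rng.normal(0, 18, n)      # process-based (SSM-like)
        statistical_pred = truth + rng.normal(0, 16, n)  # statistical model

        # "Combined model": regress observed yields on both predictions, so each
        # model is weighted by what it independently explains.
        X = np.column_stack([process_pred, statistical_pred])
        combo = LinearRegression().fit(X, truth)

        for name, pred in [("process", process_pred),
                           ("statistical", statistical_pred),
                           ("combined", combo.predict(X))]:
            rmse = np.sqrt(np.mean((truth - pred) ** 2))
            print(f"{name:11s} RMSE: {rmse:5.1f}")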

  3. Processing speed enhances model-based over model-free reinforcement learning in the presence of high working memory functioning

    PubMed Central

    Schad, Daniel J.; Jünger, Elisabeth; Sebold, Miriam; Garbusow, Maria; Bernhardt, Nadine; Javadi, Amir-Homayoun; Zimmermann, Ulrich S.; Smolka, Michael N.; Heinz, Andreas; Rapp, Michael A.; Huys, Quentin J. M.

    2014-01-01

    Theories of decision-making and its neural substrates have long assumed the existence of two distinct and competing valuation systems, variously described as goal-directed vs. habitual, or, more recently and based on statistical arguments, as model-free vs. model-based reinforcement-learning. Though both have been shown to control choices, the cognitive abilities associated with these systems are under ongoing investigation. Here we examine the link to cognitive abilities, and find that individual differences in processing speed covary with a shift from model-free to model-based choice control in the presence of above-average working memory function. This suggests shared cognitive and neural processes; provides a bridge between literatures on intelligence and valuation; and may guide the development of process models of different valuation components. Furthermore, it provides a rationale for individual differences in the tendency to deploy valuation systems, which may be important for understanding the manifold neuropsychiatric diseases associated with malfunctions of valuation. PMID:25566131

  4. Fast machine-learning online optimization of ultra-cold-atom experiments.

    PubMed

    Wigley, P B; Everitt, P J; van den Hengel, A; Bastian, J W; Sooriyabandara, M A; McDonald, G D; Hardman, K S; Quinlivan, C D; Manju, P; Kuhn, C C N; Petersen, I R; Luiten, A N; Hope, J J; Robins, N P; Hush, M R

    2016-05-16

    We apply an online optimization process based on machine learning to the production of Bose-Einstein condensates (BEC). BEC is typically created with an exponential evaporation ramp that is optimal for ergodic dynamics with two-body s-wave interactions and no other loss rates, but likely sub-optimal for real experiments. Through repeated machine-controlled scientific experimentation and observations our 'learner' discovers an optimal evaporation ramp for BEC production. In contrast to previous work, our learner uses a Gaussian process to develop a statistical model of the relationship between the parameters it controls and the quality of the BEC produced. We demonstrate that the Gaussian process machine learner is able to discover a ramp that produces high quality BECs in 10 times fewer iterations than a previously used online optimization technique. Furthermore, we show the internal model developed can be used to determine which parameters are essential in BEC creation and which are unimportant, providing insight into the optimization process of the system.

  5. Fast machine-learning online optimization of ultra-cold-atom experiments

    PubMed Central

    Wigley, P. B.; Everitt, P. J.; van den Hengel, A.; Bastian, J. W.; Sooriyabandara, M. A.; McDonald, G. D.; Hardman, K. S.; Quinlivan, C. D.; Manju, P.; Kuhn, C. C. N.; Petersen, I. R.; Luiten, A. N.; Hope, J. J.; Robins, N. P.; Hush, M. R.

    2016-01-01

    We apply an online optimization process based on machine learning to the production of Bose-Einstein condensates (BEC). BEC is typically created with an exponential evaporation ramp that is optimal for ergodic dynamics with two-body s-wave interactions and no other loss rates, but likely sub-optimal for real experiments. Through repeated machine-controlled scientific experimentation and observations our ‘learner’ discovers an optimal evaporation ramp for BEC production. In contrast to previous work, our learner uses a Gaussian process to develop a statistical model of the relationship between the parameters it controls and the quality of the BEC produced. We demonstrate that the Gaussian process machine learner is able to discover a ramp that produces high quality BECs in 10 times fewer iterations than a previously used online optimization technique. Furthermore, we show the internal model developed can be used to determine which parameters are essential in BEC creation and which are unimportant, providing insight into the optimization process of the system. PMID:27180805
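
    The learner described above is, in essence, Bayesian optimization: a Gaussian process is fitted to (parameter, quality) pairs and the next experiment is run where the model is promising or uncertain. A minimal one-parameter sketch with an invented quality function follows; the original learner tunes many ramp parameters at once and its acquisition rule may differ from the upper-confidence-bound used here.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF, WhiteKernel

        rng = np.random.default_rng(5)

        def run_experiment(ramp_param):
            """Stand-in for one evaporation run returning a BEC quality score."""
            return -(ramp_param - 0.63) ** 2 + rng.normal(0, 0.01)

        candidates = np.linspace(0, 1, 201).reshape(-1, 1)
        X = list(rng.uniform(0, 1, 3).reshape(-1, 1))    # a few seed experiments
        y = [run_experiment(x[0]) for x in X]

        gp = GaussianProcessRegressor(kernel=RBF(0.2) + WhiteKernel(1e-4),
                                      normalize_y=True)

        for _ in range(15):
            gp.fit(np.vstack(X), y)
            mu, sd = gp.predict(candidates, return_std=True)
            x_next = candidates[np.argmax(mu + 2.0 * sd)]  # optimistic pick
            X.append(x_next)
            y.append(run_experiment(x_next[0]))

        best = X[int(np.argmax(y))][0]
        print(f"best ramp parameter after {len(y)} runs: {best:.3f}")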

  6. Effectivity of artrihpi irrigation for diabetic ulcer healing: A randomized controlled trial

    NASA Astrophysics Data System (ADS)

    Gayatri, Dewi; Asmorohadi, Aries; Dahlia, Debie

    2018-02-01

    The healing process of diabetic ulcers is often impeded by inflammation, infection, and a weakened immune state. High-pressure irrigation (10-15 psi) may be used to control the level of infection. This research was designed to identify the effectiveness of the artrihpi irrigation device on diabetic ulcers in public hospitals in Central Java. The study is a randomized controlled trial with a cross-over design. Sixty-four subjects were selected using a block randomization technique and were divided into control and intervention groups. The intervention was given over 6 days, with wound healing evaluated every 3 days. The results demonstrated a significant decrease in healing scores after treatment, although the difference in healing scores between the two groups was not statistically significant. A difference in means was nevertheless observed: wound healing was better with the artrihpi than with the syringe. These results suggest that the artrihpi may be a practical means of delivering high-pressure irrigation to support the healing of diabetic ulcers.

  7. Active controllers and the time duration to learn a task

    NASA Technical Reports Server (NTRS)

    Repperger, D. W.; Goodyear, C.

    1986-01-01

    An active controller was used to help train naive subjects involved in a compensatory tracking task. The controller is called active in this context because it moves the subject's hand in a direction to improve tracking. The question of interest here is whether the active controller helps the subject to learn a task more rapidly than the passive controller. Six subjects with no experience in compensatory tracking were run until their root mean square tracking error reached asymptotic levels, with an active controller or a passive controller. The time required to learn the task was defined several different ways. The results of the different measures of learning were examined across pools of subjects and across controllers using statistical tests. The comparison between the active controller and the passive controller as to their ability to accelerate the learning process as well as reduce levels of asymptotic tracking error is reported here.

  8. Evaluation of the Air Void Analyzer

    DTIC Science & Technology

    2013-07-01

    ...lack of measurement would help explain the difference in values shown. Brief descriptions of other unpublished testing (Wang et al. 2008)... CTL Group... structure measurements taken from the controlled laboratory mixtures. A three-phase approach was used to evaluate the machine. First, a global... method. Hypothesis testing using t-statistics was performed to increase understanding of the data collected globally in terms of the processes used for...

  9. Study of Personnel Attrition and Revocation within U.S. Marine Corps Air Traffic Control Specialties

    DTIC Science & Technology

    2012-03-01

    ...Entrance Processing Stations (MEPS) and recruit depots, to include non-cognitive testing, such as Navy Computer Adaptive Personality Scales (NCAPS)... Revocation, Selection, MOS, Regression, Probit, dProbit, STATA, Statistics, Marginal Effects, ASVAB, AFQT, Composite Scores, Screening, NCAPS... Navy Computer Adaptive Personality Scales (NCAPS), during recruitment. It is also recommended that an economic analysis be conducted comparing the...

  10. Designed experiment evaluation of key variables affecting the cutting performance of rotary instruments.

    PubMed

    Funkenbusch, Paul D; Rotella, Mario; Ercoli, Carlo

    2015-04-01

    Laboratory studies of tooth preparation are often performed under a limited range of conditions involving single values for all variables other than the one being tested. In contrast, in clinical settings not all variables can be tightly controlled. For example, a new dental rotary cutting instrument may be tested in the laboratory by making a specific cut with a fixed force, but in clinical practice, the instrument must make different cuts with individual dentists applying a range of different forces. Therefore, the broad applicability of laboratory results to diverse clinical conditions is uncertain and the comparison of effects across studies is difficult. The purpose of this study was to examine the effect of 9 process variables on dental cutting in a single experiment, allowing each variable to be robustly tested over a range of values for the other 8 and permitting a direct comparison of the relative importance of each on the cutting process. The effects of 9 key process variables on the efficiency of a simulated dental cutting operation were measured. A fractional factorial experiment was conducted by using a computer-controlled, dedicated testing apparatus to simulate dental cutting procedures and Macor blocks as the cutting substrate. Analysis of Variance (ANOVA) was used to judge statistical significance (α=.05). Five variables consistently produced large, statistically significant effects (target applied load, cut length, starting rpm, diamond grit size, and cut type), while 4 variables produced relatively small, statistically insignificant effects (number of cooling ports, rotary cutting instrument diameter, disposability, and water flow rate). The control exerted by the dentist, simulated in this study by targeting a specific level of applied force, was the single most important factor affecting cutting efficiency. Cutting efficiency was also significantly affected by factors simulating patient/clinical circumstances as well as hardware choices. These results highlight the importance of local clinical conditions (procedure, dentist) in understanding dental cutting procedures and in designing adequate experimental methodologies for future studies. Copyright © 2015 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  11. Sedimentary control of volcanic debris-avalanche structures and transformation into lahars

    NASA Astrophysics Data System (ADS)

    Bernard, Karine; van Wyk de Vries, Benjamin; Thouret, Jean-Claude; Roche, Olivier; Samaniego Eguiguren, Pablo

    2017-04-01

    Volcanic debris avalanche structures and related transformations into lahars have been extensively analysed in order to establish a sedimentary classification of the deposits. Textural and structural variations of eight debris-avalanche deposits (DADs) have been correlated with Shape Preferred Orientation of 30,000 clasts together with grain-size distributions and statistical parameters from 156 sieved matrix samples. Granular segregation patterns have been observed with structural fault controls: proximal granular-segregation structures of the Tutupaca DAD ridges in Peru, basal sheared bands along an overthrust lateral levee (Mt. Dore, France), mixing and cataclasis of fault-controlled deposits in a half-graben during lateral spreading of a distal thrust lobe (Pichu-Pichu, Peru), and neo-cataclasis at the frontal thrust lobe (Meager, Canada and Mt. Dore, France). A logarithmic regression characterises the % matrix vs. matrix/gravels, showing proximal and primary cataclasis, hybrid DADs with polymodal matrix and mixed facies, up to transformations into lahars (Misti, Mt Dore). The sequential fragmentation helps to distinguish the DADs belonging to the Andean and Cascade volcanic arcs (Tutupaca and Misti, Peru; Meager, Canada) from the hybrid DADs, before distal transformation into lahars (Pichu-Pichu); hydrovolcanic fragmentation characterises the transformed lahar deposits (Misti). The fractal values of 150 sieved samples range between 2.3 and 2.7, implying extensional fractures with granular disaggregation. Skewness vs. kurtosis values help to distinguish the proximal mass wasting deposits from the deposits transformed by dilution. The sorting vs. median values enable us to differentiate the hybrid DADs from the deposits transformed by dilution. The sedimentological statistical parameters, with Shape Preferred Orientation analysis correlated with textural and structural observations, show textural fabrics resulting from kinematic processes: cataclasis, hybrid matrix facies and transformations. Inherited fractures from tectono-volcanic structures contribute to the particle size distributions of DADs and associated deposits such as pyroclastic and lahar deposits (Misti, Mt Dore, Tutupaca). The statistical results highlight the granular structure and kinematic processes of DAD transformations into lahars and associated deposits, which would contribute to understanding the rheological processes behind the excess DAD run-out and to testing granular models for DAD transformations. Key words: volcanic debris-avalanche deposits, lahar transformation, structure, sedimentology, hazard

  12. Self-referential processing influences functional activation during cognitive control: an fMRI study

    PubMed Central

    Koch, Kathrin; Schachtzabel, Claudia; Peikert, Gregor; Schultz, Carl Christoph; Reichenbach, Jürgen R.; Sauer, Heinrich; Schlösser, Ralf G.

    2013-01-01

    Rostral anterior cingulate cortex (rACC) plays a central role in the pathophysiology of major depressive disorder (MDD). As we reported in our previous study (Wagner et al., 2006), patients with MDD were characterized by an inability to deactivate this region during cognitive processing, leading to a compensatory prefrontal hyperactivation. This hyperactivation in rACC may be related to a deficient inhibitory control of negative self-referential processes, which in turn may interfere with cognitive control task execution and the underlying fronto-cingulate network activation. To test this assumption, a functional magnetic resonance imaging study was conducted in 34 healthy subjects. Univariate and functional connectivity analyses in statistical parametric mapping software (SPM8) were used. Self-referential stimuli and the Stroop task were presented in an event-related design. As hypothesized, rACC was specifically engaged during negative self-referential processing (SRP) and was significantly related to the degree of depressive symptoms in participants. BOLD signal in rACC showed increased valence-dependent (negative vs neutral SRP) interaction with BOLD signal in prefrontal and dorsal anterior cingulate regions during Stroop task performance. This result provides strong support for the notion that enhanced rACC activation interacts with brain regions involved in cognitive control processes and substantiates our previous interpretation of increased rACC and prefrontal activation in patients during the Stroop task. PMID:22798398

  13. Initiating statistical process control to improve quality outcomes in colorectal surgery.

    PubMed

    Keller, Deborah S; Stulberg, Jonah J; Lawrence, Justin K; Samia, Hoda; Delaney, Conor P

    2015-12-01

    Unexpected variations in postoperative length of stay (LOS) negatively impact resources and patient outcomes. Statistical process control (SPC) measures performance, evaluates productivity, and modifies processes for optimal performance. The goal of this study was to initiate SPC to identify LOS outliers and evaluate its feasibility to improve outcomes in colorectal surgery. Review of a prospective database identified colorectal procedures performed by a single surgeon. Patients were grouped into elective and emergent categories and then stratified by laparoscopic and open approaches. All followed a standardized enhanced recovery protocol. SPC was applied to identify outliers and evaluate causes within each group. A total of 1294 cases were analyzed: 83% elective (n = 1074) and 17% emergent (n = 220). Emergent cases were 70.5% open and 29.5% laparoscopic; elective cases were 36.8% open and 63.2% laparoscopic. All groups had a wide range in LOS. LOS outliers ranged from 8.6% (elective laparoscopic) to 10.8% (emergent laparoscopic). Evaluation of outliers demonstrated patient characteristics of higher ASA scores, longer operating times, ICU requirement, and temporary nursing at discharge. Outliers had higher postoperative complication rates in elective open (57.1 vs. 20.0%) and elective lap groups (77.6 vs. 26.1%). Outliers also had higher readmission rates for emergent open (11.4 vs. 5.4%), emergent lap (14.3 vs. 9.2%), and elective lap (32.8 vs. 6.9%). Elective open outliers did not follow trends of longer LOS or higher reoperation rates. SPC is feasible and promising for improving colorectal surgery outcomes. SPC identified patient and process characteristics associated with increased LOS. SPC may allow real-time outlier identification, during quality improvement efforts, and reevaluation of outcomes after introducing process change. SPC has clinical implications for improving patient outcomes and resource utilization.
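
    SPC on length of stay is often run as an individuals (XmR) chart: the centre line is the mean LOS and the limits come from the average moving range. The sketch below flags outlier cases on synthetic data; real LOS distributions are skewed, so in practice a transformation or a different chart type may be preferred, and nothing here reproduces the study's data.

        import numpy as np

        rng = np.random.default_rng(6)
        los = rng.lognormal(mean=1.1, sigma=0.35, size=120)  # hypothetical LOS, days

        # Individuals (XmR) chart: centre line and limits from moving ranges.
        mr = np.abs(np.diff(los))
        centre = los.mean()
        sigma_hat = mr.mean() / 1.128      # d2 constant for subgroups of size 2
        ucl = centre + 3 * sigma_hat
        lcl = max(centre - 3 * sigma_hat, 0.0)

        outliers = np.flatnonzero((los > ucl) | (los < lcl))
        print(f"centre {centre:.1f} d, limits [{lcl:.1f}, {ucl:.1f}] d, "
              f"{outliers.size} outlier case(s): {outliers[:10]}")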

  14. "Congratulations, you have been randomized into the control group!(?)": issues to consider when recruiting schools for matched-pair randomized control trials of prevention programs.

    PubMed

    Ji, Peter; DuBois, David L; Flay, Brian R; Brechling, Vanessa

    2008-03-01

    Recruiting schools into a matched-pair randomized control trial (MP-RCT) to evaluate the efficacy of a school-level prevention program presents challenges for researchers. We considered which of 2 procedures would be most effective for recruiting schools into the study and assigning them to conditions. In 1 procedure (recruit and match/randomize), we would recruit schools and match them prior to randomization, and in the other (match/randomize and recruitment), we would match schools and randomize them prior to recruitment. We considered how each procedure impacted the randomization process and our ability to recruit schools into the study. After implementing the selected procedure, the equivalence of both treatment and control group schools and the participating and nonparticipating schools on school demographic variables was evaluated. We decided on the recruit and match/randomize procedure because we thought it would provide the opportunity to build rapport with the schools and prepare them for the randomization process, thereby increasing the likelihood that they would accept their randomly assigned conditions. Neither the treatment and control group schools nor the participating and nonparticipating schools exhibited statistically significant differences from each other on any of the school demographic variables. Recruitment of schools prior to matching and randomization in an MP-RCT may facilitate the recruitment of schools and thus enhance both the statistical power and the representativeness of study findings. Future research would benefit from the consideration of a broader range of variables (eg, readiness to implement a comprehensive prevention program) both in matching schools and in evaluating their representativeness to nonparticipating schools.
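
    The recruit-and-match/randomize procedure the authors chose can be stated compactly: once schools are recruited, sort them on a matching score built from demographic variables, pair adjacent schools, and randomize within each pair. The sketch below uses an invented composite score and school count purely for illustration.

        import numpy as np

        rng = np.random.default_rng(7)

        # Hypothetical school-level composite of demographic matching variables.
        n_schools = 20
        scores = rng.normal(50, 10, n_schools)

        # Recruit-and-match/randomize: order recruited schools by the matching
        # score, pair neighbours, then flip a coin within each pair.
        order = np.argsort(scores)
        assignment = {}
        for a, b in order.reshape(-1, 2):
            treat_first = rng.random() < 0.5
            assignment[a] = "treatment" if treat_first else "control"
            assignment[b] = "control" if treat_first else "treatment"

        for school in sorted(assignment):
            print(f"school {school:2d} (score {scores[school]:5.1f}): "
                  f"{assignment[school]}")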

  15. Probability of brittle failure

    NASA Technical Reports Server (NTRS)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol-extended diglycidyl ether, cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  16. Iterative LQG Controller Design Through Closed-Loop Identification

    NASA Technical Reports Server (NTRS)

    Hsiao, Min-Hung; Huang, Jen-Kuang; Cox, David E.

    1996-01-01

    This paper presents an iterative Linear Quadratic Gaussian (LQG) controller design approach for a linear stochastic system with an uncertain open-loop model and unknown noise statistics. This approach consists of closed-loop identification and controller redesign cycles. In each cycle, the closed-loop identification method is used to identify an open-loop model and a steady-state Kalman filter gain from closed-loop input/output test data obtained by using a feedback LQG controller designed from the previous cycle. Then the identified open-loop model is used to redesign the state feedback. The state feedback and the identified Kalman filter gain are used to form an updated LQG controller for the next cycle. This iterative process continues until the updated controller converges. The proposed controller design is demonstrated by numerical simulations and experiments on a highly unstable large-gap magnetic suspension system.
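
    One redesign step of such a cycle can be sketched with two discrete-time Riccati solves: the control Riccati equation yields the state-feedback gain and the dual (filtering) equation yields a steady-state Kalman gain. In the paper the Kalman gain comes out of the closed-loop identification itself; here, for a self-contained illustration, it is computed from assumed noise weights, and the model matrices are invented.

        import numpy as np
        from scipy.linalg import solve_discrete_are

        def lqg_redesign(A, B, C, Q, R, W, V):
            """One redesign cycle: given a (closed-loop-identified) model
            (A, B, C) and weights, compute the LQR state feedback and a
            Kalman gain that together form the updated LQG controller."""
            # State feedback u = -K x from the control Riccati equation.
            P = solve_discrete_are(A, B, Q, R)
            K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
            # Steady-state Kalman gain from the dual (filtering) equation.
            S = solve_discrete_are(A.T, C.T, W, V)
            L = S @ C.T @ np.linalg.inv(C @ S @ C.T + V)
            return K, L

        # Hypothetical identified second-order open-loop model.
        A = np.array([[1.0, 0.1], [0.0, 0.98]])
        B = np.array([[0.0], [0.1]])
        C = np.array([[1.0, 0.0]])
        K, L = lqg_redesign(A, B, C, Q=np.eye(2), R=0.1 * np.eye(1),
                            W=0.01 * np.eye(2), V=0.05 * np.eye(1))
        print("state feedback K:", K, "\nKalman gain L:", L.ravel())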

  17. A head movement image (HMI)-controlled computer mouse for people with disabilities.

    PubMed

    Chen, Yu-Luen; Chen, Weoi-Luen; Kuo, Te-Son; Lai, Jin-Shin

    2003-02-04

    This study proposes image processing and microprocessor technology for use in developing a head movement image (HMI)-controlled computer mouse system for people with spinal cord injuries (SCI). The system controls the movement and direction of the mouse cursor by capturing head movement images using a marker installed on the user's headset. In the clinical trial, this new mouse system was compared with an infrared-controlled mouse system on various tasks with nine subjects with SCI. The results were favourable to the new mouse system. The differences between the new mouse system and the infrared-controlled mouse reached statistical significance in each of the test situations (p<0.05). The HMI-controlled computer mouse improves the input speed. People with disabilities need only wear the headset and move their heads to freely control the movement of the mouse cursor.

  18. Novel peptide-based platform for the dual presentation of biologically active peptide motifs on biomaterials.

    PubMed

    Mas-Moruno, Carlos; Fraioli, Roberta; Albericio, Fernando; Manero, José María; Gil, F Javier

    2014-05-14

    Biofunctionalization of metallic materials with cell adhesive molecules derived from the extracellular matrix is a feasible approach to improve cell-material interactions and enhance the biointegration of implant materials (e.g., osseointegration of bone implants). However, classical biomimetic strategies may prove insufficient to elicit the complex and multiple biological signals required in the processes of tissue regeneration. Thus, newer strategies are focusing on installing multifunctionality on biomaterials. In this work, we introduce a novel peptide-based divalent platform with the capacity to simultaneously present distinct bioactive peptide motifs in a chemically controlled fashion. As a proof of concept, the integrin-binding sequences RGD and PHSRN were selected and introduced in the platform. The biofunctionalization of titanium with this platform showed a positive trend towards increased cell attachment, and statistically higher values of spreading and proliferation of osteoblast-like cells compared to control noncoated samples. Moreover, it displayed statistically comparable or improved cell responses compared to samples coated with the single peptides or with an equimolar mixture of the two motifs. Osteoblast-like cells produced higher levels of alkaline phosphatase on surfaces functionalized with the platform than on control titanium; however, these values were not statistically significant. This study demonstrates that these peptidic structures are versatile tools to convey multiple biofunctionality to biomaterials in a chemically defined manner.

  19. ASCS online fault detection and isolation based on an improved MPCA

    NASA Astrophysics Data System (ADS)

    Peng, Jianxin; Liu, Haiou; Hu, Yuhui; Xi, Junqiang; Chen, Huiyan

    2014-09-01

    Multi-way principal component analysis (MPCA) has received considerable attention and been widely used in process monitoring. A traditional MPCA algorithm unfolds multiple batches of historical data into a two-dimensional matrix and cuts the matrix along the time axis to form subspaces. However, low efficiency of the subspaces and difficult fault isolation are common disadvantages of the principal component model. This paper presents a new subspace construction method based on a kernel density estimation function that can effectively reduce the storage required for the subspace information. The MPCA model and the knowledge base are built on the new subspace. Fault detection and isolation with the squared prediction error (SPE) statistic and the Hotelling T² statistic are then realized in process monitoring. When a fault occurs, fault isolation based on the SPE statistic is achieved by residual contribution analysis of the different variables. For fault isolation of a subspace based on the T² statistic, the relationship between the statistic indicator and the state variables is constructed, and constraint conditions are presented to check the validity of fault isolation. Then, to improve the robustness of fault isolation against unexpected disturbances, a statistical method is adopted to relate single and multiple subspaces and thereby increase the rate of correct fault isolation. Finally, fault detection and isolation based on the improved MPCA are used to monitor the automatic shift control system (ASCS) to prove the correctness and effectiveness of the algorithm. The research proposes a new subspace construction method that reduces the required storage capacity and improves the robustness of the principal component model, and it establishes the relationship between the state variables and the fault detection indicators for fault isolation.
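
    The two monitoring statistics work on complementary subspaces: Hotelling's T² measures distance inside the principal-component plane, while SPE measures the residual outside it, and the variable with the largest residual contribution points towards the fault. The sketch below shows both on a plain (non-multi-way) PCA model with synthetic data; batch unfolding and the paper's kernel-density subspaces are omitted.

        import numpy as np

        rng = np.random.default_rng(8)

        # Training data: normal operating records (rows = samples, cols = vars).
        X = rng.normal(size=(500, 10)) @ rng.normal(size=(10, 10)) * 0.3
        mu, sd = X.mean(0), X.std(0)
        Xs = (X - mu) / sd

        # PCA via SVD; keep k principal components.
        U, s, Vt = np.linalg.svd(Xs, full_matrices=False)
        k = 3
        P = Vt[:k].T                        # loadings
        lam = (s[:k] ** 2) / (len(Xs) - 1)  # PC variances

        def t2_spe(x):
            """Hotelling's T^2 and SPE for one new sample."""
            xs = (x - mu) / sd
            t = P.T @ xs                    # scores in the PC subspace
            t2 = float(t @ (t / lam))       # distance within the model plane
            resid = xs - P @ t              # part of x the model cannot explain
            spe = float(resid @ resid)
            return t2, spe, resid

        x_new = X[0].copy()
        x_new[2] += 4.0                     # inject a shift on variable index 2
        t2, spe, resid = t2_spe(x_new)
        # SPE-based isolation: variable with the largest residual contribution.
        print(f"T2={t2:.1f}  SPE={spe:.1f}  "
              f"top contributor: var {int(np.argmax(resid**2))}")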

  20. Statistics-based material and process guidelines for the design of carbon nanotube field-effect transistors in gigascale integrated circuits.

    PubMed

    Ghavami, Behnam; Raji, Mohsen; Pedram, Hossein

    2011-08-26

    Carbon nanotube field-effect transistors (CNFETs) show great promise as building blocks of future integrated circuits. However, synthesizing single-walled carbon nanotubes (CNTs) with accurate chirality and exact positioning control has been widely acknowledged as an exceedingly complex task. Indeed, density and chirality variations in CNT growth can compromise the reliability of CNFET-based circuits. In this paper, we present a novel statistical compact model to estimate the failure probability of CNFETs to provide some material and process guidelines for the design of CNFETs in gigascale integrated circuits. We use measured CNT spacing distributions within the framework of detailed failure analysis to demonstrate that both the CNT density and the ratio of metallic to semiconducting CNTs play dominant roles in defining the failure probability of CNFETs. Moreover, it is argued that the large-scale integration of these devices within an integrated circuit will be feasible only if a specific range of CNT density with an acceptable ratio of semiconducting to metallic CNTs can be adjusted in a typical synthesis process.
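
    A back-of-envelope version of the density/chirality trade-off: if CNT crossings under a gate follow a Poisson law with mean lambda = density x width, and a fraction m of tubes is metallic, a device that fails whenever no semiconducting tube is present has P(fail) = sum over n of P(N=n) * m^n = exp(-lambda*(1-m)). This is a deliberately simplified stand-in for the paper's compact model, which works from measured spacing distributions rather than a Poisson assumption.

        import numpy as np

        def cnfet_failure_prob(density_per_um, gate_width_um, metallic_fraction):
            """Failure probability under a simple Poisson placement model: the
            CNFET fails if no CNT crosses the gate (open device) or if every
            crossing CNT is metallic. P(fail) = exp(-lambda * (1 - m))."""
            lam = density_per_um * gate_width_um  # expected CNT count under gate
            return np.exp(-lam * (1.0 - metallic_fraction))

        # 1/3 metallic is the commonly cited natural chirality ratio (assumed).
        for density in (5, 10, 20):               # CNTs per micron
            p = cnfet_failure_prob(density, gate_width_um=1.0,
                                   metallic_fraction=1 / 3)
            print(f"density {density:2d}/um: P(fail) = {p:.2e}")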
