Sample records for implementing statistical process

  1. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ...] Statistical Process Controls for Blood Establishments; Public Workshop AGENCY: Food and Drug Administration... workshop entitled: ``Statistical Process Controls for Blood Establishments.'' The purpose of this public workshop is to discuss the implementation of statistical process controls to validate and monitor...

  2. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
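    The charts Wheeler advocates are simple to compute. As a hedged illustration (not code from the paper), the sketch below builds an individuals (XmR) control chart in Python: the average moving range estimates process variation, and 3-sigma natural process limits flag unusual points in a hypothetical series of behavioral counts.

    ```python
    # Minimal sketch of an individuals (XmR) control chart of the kind used in SPC.
    # Illustrative only; the data and limits below are hypothetical, not from the paper.
    import numpy as np

    def xmr_limits(x):
        """Return center line and 3-sigma natural process limits for individual values."""
        x = np.asarray(x, dtype=float)
        mr = np.abs(np.diff(x))          # moving ranges between consecutive points
        mr_bar = mr.mean()               # average moving range
        center = x.mean()
        sigma_hat = mr_bar / 1.128       # d2 constant for subgroups of size 2
        return center, center - 3 * sigma_hat, center + 3 * sigma_hat

    data = [12, 15, 11, 14, 13, 30, 12, 14, 13, 12]   # e.g. weekly counts of a target behavior
    cl, lcl, ucl = xmr_limits(data)
    out_of_control = [i for i, v in enumerate(data) if v < lcl or v > ucl]
    print(f"CL={cl:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}, signals at points {out_of_control}")
    ```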

  3. Improving Service Delivery in a County Health Department WIC Clinic: An Application of Statistical Process Control Techniques

    PubMed Central

    Boe, Debra Thingstad; Parsons, Helen

    2009-01-01

    Local public health agencies are challenged to continually improve service delivery, yet they frequently operate with constrained resources. Quality improvement methods and techniques such as statistical process control are commonly used in other industries, and they have recently been proposed as a means of improving service delivery and performance in public health settings. We analyzed a quality improvement project undertaken at a local Special Supplemental Nutrition Program for Women, Infants, and Children (WIC) clinic to reduce waiting times and improve client satisfaction with a walk-in nutrition education service. We used statistical process control techniques to evaluate initial process performance, implement an intervention, and assess process improvements. We found that implementation of these techniques significantly reduced waiting time and improved clients' satisfaction with the WIC service. PMID:19608964

  4. Implementation of Statistics Textbook Support with ICT and Portfolio Assessment Approach to Improve Students Teacher Mathematical Connection Skills

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Dewi, N. R.

    2017-04-01

    Statistics is needed in the data analysis process and has broad application in daily life, so students must master the statistical material well. The use of a Statistics textbook supported with ICT and a portfolio assessment approach was expected to help students improve their mathematical connection skills. The subjects of this research were 30 student teachers taking Statistics courses. The results of this research show that the use of a Statistics textbook supported with ICT and a portfolio assessment approach can improve students' mathematical connection skills.

  5. Statistical auditing of toxicology reports.

    PubMed

    Deaton, R R; Obenchain, R L

    1994-06-01

    Statistical auditing is a new report review process used by the quality assurance unit at Eli Lilly and Co. Statistical auditing allows the auditor to review the process by which the report was generated, as opposed to the process by which the data were generated. We have the flexibility to use different sampling techniques and still obtain thorough coverage of the report data. By properly implementing our auditing process, we can work smarter rather than harder and continue to help our customers increase the quality of their products (reports). Statistical auditing is helping our quality assurance unit meet our customers' needs, while maintaining or increasing the quality of our regulatory obligations.

  6. Statistical Process Control. Impact and Opportunities for Ohio.

    ERIC Educational Resources Information Center

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  7. Implementing an Employee Assistance Program.

    ERIC Educational Resources Information Center

    Gam, John; And Others

    1983-01-01

    Describes in detail the implementation of an employee assistance program in a textile plant. Reviews the historical development, referral process, and termination guidelines of the program and contains descriptive statistics for six periods of the program's operation. (Author/JAC)

  8. Evaluation of hardware costs of implementing PSK signal detection circuit based on "system on chip"

    NASA Astrophysics Data System (ADS)

    Sokolovskiy, A. V.; Dmitriev, D. D.; Veisov, E. A.; Gladyshev, A. B.

    2018-05-01

    The article deals with the choice of architecture for digital signal processing units implementing the PSK signal detection scheme. The effectiveness of the architectures is assessed by the number of shift registers and computational processes required when implementing the detector as a "system on a chip". A statistical estimation of the normalized code sequence offset in the signal synchronization scheme is used for the various hardware block architectures.

  9. Does daily nurse staffing match ward workload variability? Three hospitals' experiences.

    PubMed

    Gabbay, Uri; Bukchin, Michael

    2009-01-01

    Nurse shortage and rising healthcare resource burdens mean that appropriate workforce use is imperative. This paper aims to evaluate whether daily nurse staffing meets ward workload needs. Nurse attendance and daily nurses' workload capacity in three hospitals were evaluated. Statistical process control was used to evaluate intra-ward nurse workload capacity and day-to-day variations. Statistical process control is a statistics-based method for process monitoring that uses charts with a predefined target measure and control limits. Standardization was performed for inter-ward analysis by converting ward-specific crude measures to ward-specific relative measures, dividing observed by expected values. Two charts, acceptable and tolerable daily nurse workload intensity, were defined. Appropriate staffing indicators were defined as those exceeding predefined rates within acceptable and tolerable limits (50 percent and 80 percent, respectively). A total of 42 percent of the overall days fell within acceptable control limits and 71 percent within tolerable control limits. Appropriate staffing indicators were met in only 33 percent of wards regarding acceptable nurse workload intensity and in only 45 percent of wards regarding tolerable workloads. The study did not differentiate crude nurse attendance, and it did not take into account patient severity since crude bed occupancy was used. Double statistical process control charts and certain staffing indicators were used, which is open to debate. Wards that met appropriate staffing indicators prove the method's feasibility. Wards that did not meet appropriate staffing indicators prove the importance of, and the need for, process evaluation and monitoring. The methods presented for monitoring daily staffing appropriateness are simple to implement, either for intra-ward day-to-day variation using nurse workload capacity statistical process control charts or for inter-ward evaluation using a standardized measure of nurse workload intensity. The real challenge will be to develop planning systems and implement corrective interventions such as dynamic and flexible daily staffing, which will face difficulties and barriers. The paper fulfils the need for workforce utilization evaluation. A simple method using available data for evaluating daily staffing appropriateness, which is easy to implement and operate, is presented. The statistical process control method enables intra-ward evaluation, while standardization by converting crude into relative measures enables inter-ward analysis. The staffing indicator definitions enable performance evaluation. This original study uses statistical process control to develop simple standardization methods and applies straightforward statistical tools. The method is not limited to crude measures; rather, it can use weighted workload measures such as nursing acuity or weighted nurse level (i.e. grade/band).
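    As a rough illustration of the standardization and indicator logic described in the abstract, the sketch below converts hypothetical observed and expected daily staffing into a relative measure and checks the 50 percent and 80 percent staffing indicators; the +/-10 and +/-20 percent limits are assumptions made for the example, not the study's actual control limits.

    ```python
    # Sketch of the inter-ward standardization and staffing-indicator logic described above.
    # The daily figures and control limits here are hypothetical illustrations.
    import numpy as np

    observed = np.array([9, 10, 8, 11, 12, 9, 7, 10, 11, 9])       # nurses attending, per day
    expected = np.array([10, 10, 10, 11, 11, 10, 10, 10, 11, 10])  # ward-specific expected staffing

    relative = observed / expected          # crude measure converted to a relative (standardized) one

    # Hypothetical limits: "acceptable" = within +/-10% of target, "tolerable" = within +/-20%.
    acceptable = np.mean(np.abs(relative - 1.0) <= 0.10)
    tolerable = np.mean(np.abs(relative - 1.0) <= 0.20)

    print(f"{acceptable:.0%} of days acceptable, {tolerable:.0%} tolerable")
    print("meets acceptable indicator (>=50%):", acceptable >= 0.50)
    print("meets tolerable indicator (>=80%):", tolerable >= 0.80)
    ```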

  10. Coupling Sensing Hardware with Data Interrogation Software for Structural Health Monitoring

    DOE PAGES

    Farrar, Charles R.; Allen, David W.; Park, Gyuhae; ...

    2006-01-01

    The process of implementing a damage detection strategy for aerospace, civil and mechanical engineering infrastructure is referred to as structural health monitoring (SHM). The authors' approach is to address the SHM problem in the context of a statistical pattern recognition paradigm. In this paradigm, the process can be broken down into four parts: (1) Operational Evaluation, (2) Data Acquisition and Cleansing, (3) Feature Extraction and Data Compression, and (4) Statistical Model Development for Feature Discrimination. These processes must be implemented through hardware or software and, in general, some combination of these two approaches will be used. This paper will discuss each portion of the SHM process with particular emphasis on the coupling of a general purpose data interrogation software package for structural health monitoring with a modular wireless sensing and processing platform. More specifically, this paper will address the need to take an integrated hardware/software approach to developing SHM solutions.

  11. Numerical Implementation of Indicators and Statistical Control Tools in Monitoring and Evaluating CACEI-ISO Indicators of Study Program in Industrial Process by Systematization

    ERIC Educational Resources Information Center

    Ayala, Gabriela Cota; Real, Francia Angélica Karlos; Ivan, Ramirez Alvarado Edqar

    2016-01-01

    The research was conducted to determine if the study program of the career of industrial processes Technological University of Chihuahua, 1 year after that it was certified by CACEI, continues achieving the established indicators and ISO 9001: 2008, implementing quality tools, monitoring of essential indicators are determined, flow charts are…

  12. Methods and apparatuses for information analysis on shared and distributed computing systems

    DOEpatents

    Bohn, Shawn J [Richland, WA; Krishnan, Manoj Kumar [Richland, WA; Cowley, Wendy E [Richland, WA; Nieplocha, Jarek [Richland, WA

    2011-02-22

    Apparatuses and computer-implemented methods for analyzing, on shared and distributed computing systems, information comprising one or more documents are disclosed according to some aspects. In one embodiment, information analysis can comprise distributing one or more distinct sets of documents among each of a plurality of processes, wherein each process performs operations on a distinct set of documents substantially in parallel with other processes. Operations by each process can further comprise computing term statistics for terms contained in each distinct set of documents, thereby generating a local set of term statistics for each distinct set of documents. Still further, operations by each process can comprise contributing the local sets of term statistics to a global set of term statistics, and participating in generating a major term set from an assigned portion of a global vocabulary.
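    The core idea, local term statistics computed per process and merged into a global set, can be sketched in a few lines. The example below is a hypothetical Python illustration of that pattern, not the patented apparatus.

    ```python
    # Sketch of the local-to-global term statistics idea: each worker counts terms in its
    # own partition of documents, and the local counts are merged into a global set.
    # Hypothetical documents; not the patented apparatus itself.
    from collections import Counter
    from multiprocessing import Pool

    def local_term_stats(docs):
        """Compute term frequencies for one distinct set of documents."""
        counts = Counter()
        for doc in docs:
            counts.update(doc.lower().split())
        return counts

    if __name__ == "__main__":
        partitions = [
            ["statistical process control", "control charts monitor processes"],
            ["term statistics for documents", "documents are distributed among processes"],
        ]
        with Pool(processes=2) as pool:
            local_sets = pool.map(local_term_stats, partitions)

        global_stats = Counter()
        for local in local_sets:          # contribute local sets to the global set
            global_stats.update(local)

        major_terms = [t for t, c in global_stats.most_common(5)]
        print(major_terms)
    ```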

  13. DATA QUALITY OBJECTIVES AND STATISTICAL DESIGN SUPPORT FOR DEVELOPMENT OF A MONITORING PROTOCOL FOR RECREATIONAL WATERS

    EPA Science Inventory

    The purpose of this report is to describe the outputs of the Data Quality Objectives (DQOs) Process and discussions about developing a statistical design that will be used to implement the research study of recreational beach waters.

  14. A comparison of performance of automatic cloud coverage assessment algorithm for Formosat-2 image using clustering-based and spatial thresholding methods

    NASA Astrophysics Data System (ADS)

    Hsu, Kuo-Hsien

    2012-11-01

    Formosat-2 imagery is a kind of high-spatial-resolution (2 meter GSD) remote sensing satellite data, which includes one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is to estimate the cloud statistic of an image using the Automatic Cloud Coverage Assessment (ACCA) algorithm. The cloud statistic of an image is subsequently recorded as an important metadata item for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, a thresholding method, non-cloudy pixel reexamination, and a cross-band filter method are implemented in sequence for cloud statistic determination. For post-processing analysis, the box-counting fractal method is implemented. In other words, the cloud statistic is first determined via pre-processing analysis, and the correctness of the cloud statistic for the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work, we first conducted a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both perform better than the others for Formosat-2 images. Additionally, our proposed ACCA method, with Otsu's method selected as the thresholding method, successfully extracted the cloudy pixels of Formosat-2 images for accurate cloud statistic estimation.
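    The thresholding step the authors found most effective is straightforward to reproduce in outline. The sketch below applies Otsu's threshold to a synthetic single band to estimate a cloud fraction; it is only an illustration of that one step, not the full multi-stage ACCA pipeline.

    ```python
    # Minimal sketch: estimate a cloud fraction from one band with Otsu's threshold.
    # The synthetic "band" below stands in for a Formosat-2 band.
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(0)
    ground = rng.normal(0.2, 0.05, size=(256, 256))     # darker, non-cloudy pixels
    cloud = rng.normal(0.7, 0.05, size=(256, 256))      # brighter, cloudy pixels
    mask = rng.random((256, 256)) < 0.3                  # ~30% of pixels covered by cloud
    band = np.where(mask, cloud, ground)

    t = threshold_otsu(band)                             # split the bimodal histogram
    cloud_pixels = band > t
    print(f"threshold={t:.3f}, cloud fraction={cloud_pixels.mean():.1%}")
    ```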

  15. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  16. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  17. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  18. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the... through the review and oversight process. ...

  19. Evaluation of Implementation, Adaptation and Use of the Recently Proposed Urea Cycle Disorders Guidelines.

    PubMed

    Häberle, Johannes; Huemer, Martina

    2015-01-01

    Implementation of guidelines and assessment of their adaptation is not an extensively investigated process in the field of rare diseases. However, whether targeted recipients are reached and willing and able to follow the recommendations has significant impact on the efficacy of guidelines. In 2012, a guideline for the management of urea cycle disorders (UCDs) has been published. We evaluate the efficacy of implementation, adaptation, and use of the UCD guidelines by applying different strategies. (i) Download statistics from online sources were recorded. (ii) Facilities relevant for the implementation of the guidelines were assessed in pediatric units in Germany and Austria. (iii) The guidelines were evaluated by targeted recipients using the AGREE instrument. (iv) A regional networking-based implementation process was evaluated. (i) Download statistics revealed high access with an increase in downloads over time. (ii) In 18% of hospitals ammonia testing was not available 24/7, and emergency drugs were often not available. (iii) Recipient criticism expressed in the AGREE instrument focused on incomplete inclusion of patients' perspectives. (iv) The implementation process improved the availability of ammonia measurements and access to emergency medication, patient care processes, and cooperation between nonspecialists and specialists. Interest in the UCD guidelines is high and sustained, but more precise targeting of the guidelines is advisable. Surprisingly, many hospitals do not possess all facilities necessary to apply the guidelines. Regional network and awareness campaigns result in the improvement of both facilities and knowledge.

  20. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
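    SProCoP itself is written in R and runs inside Skyline; as a language-neutral illustration of the Pareto idea it describes, the hedged sketch below ranks hypothetical QC metrics by how often new runs fall outside 3-sigma limits derived from user-defined QC standards.

    ```python
    # Hedged sketch of the Pareto idea described above: rank QC metrics by how often they
    # exceed control limits derived from user-defined QC standards. Metric names and data
    # are hypothetical; SProCoP itself is an R tool inside Skyline, not this code.
    import numpy as np

    qc_standards = {                     # baseline runs used to set empirical limits
        "retention_time": [10.1, 10.0, 10.2, 10.1, 10.0],
        "peak_asymmetry": [1.1, 1.2, 1.0, 1.1, 1.2],
        "ion_intensity":  [5.0e6, 4.8e6, 5.2e6, 5.1e6, 4.9e6],
    }
    new_runs = {
        "retention_time": [10.1, 10.6, 10.7, 10.2],
        "peak_asymmetry": [1.1, 1.2, 1.1, 1.3],
        "ion_intensity":  [4.9e6, 3.1e6, 5.0e6, 2.8e6],
    }

    violations = {}
    for metric, baseline in qc_standards.items():
        mu, sd = np.mean(baseline), np.std(baseline, ddof=1)
        lo, hi = mu - 3 * sd, mu + 3 * sd                 # empirical 3-sigma control limits
        violations[metric] = sum(1 for v in new_runs[metric] if v < lo or v > hi)

    # Pareto summary: metrics with the most out-of-control runs first
    for metric, n in sorted(violations.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{metric}: {n} out-of-control runs")
    ```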

  1. Design and implementation of fishery rescue data mart system

    NASA Astrophysics Data System (ADS)

    Pan, Jun; Huang, Haiguang; Liu, Yousong

    A novel data mart based system for the fishery rescue field was designed and implemented. The system runs an ETL process to handle original data from various databases and data warehouses, and then reorganizes the data into the fishery rescue data mart. Next, online analytical processing (OLAP) is carried out and statistical reports are generated automatically. In particular, quick configuration schemes are designed to configure query dimensions and OLAP data sets. The configuration file is transformed into statistical interfaces automatically through a wizard-style process. The system provides various forms of reporting files, including crystal reports, flash graphical reports, and two-dimensional data grids. In addition, a wizard-style interface was designed to guide users in customizing inquiry processes, making it possible for nontechnical staff to access customized reports. Characterized by quick configuration, safety and flexibility, the system has been successfully applied in a city fishery rescue department.

  2. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment: test construction, test implementation, results evaluation, storage and maintenance of the developed database, its statistical processing, analysis and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical processing, analysis and interpretation. A software project for computer administering of personality psychological tests is suggested.

  3. General purpose graphic processing unit implementation of adaptive pulse compression algorithms

    NASA Astrophysics Data System (ADS)

    Cai, Jingxiao; Zhang, Yan

    2017-07-01

    This study introduces a practical approach to implement real-time signal processing algorithms for general surveillance radar based on NVIDIA graphical processing units (GPUs). The pulse compression algorithms are implemented using compute unified device architecture (CUDA) libraries such as CUDA basic linear algebra subroutines and CUDA fast Fourier transform library, which are adopted from open source libraries and optimized for the NVIDIA GPUs. For more advanced, adaptive processing algorithms such as adaptive pulse compression, customized kernel optimization is needed and investigated. A statistical optimization approach is developed for this purpose without needing much knowledge of the physical configurations of the kernels. It was found that the kernel optimization approach can significantly improve the performance. Benchmark performance is compared with the CPU performance in terms of processing accelerations. The proposed implementation framework can be used in various radar systems including ground-based phased array radar, airborne sense and avoid radar, and aerospace surveillance radar.
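    The pulse compression step that the paper accelerates with cuFFT is, at its core, frequency-domain matched filtering. The NumPy sketch below illustrates that operation on a hypothetical linear-FM pulse; it is a CPU stand-in, not the adaptive GPU-optimized implementation.

    ```python
    # Illustration of conventional pulse compression as frequency-domain matched filtering.
    # NumPy stands in for the cuFFT-based GPU implementation described above; the LFM
    # chirp parameters and target delay are hypothetical.
    import numpy as np

    fs, T, B = 1.0e6, 1.0e-3, 100.0e3                 # sample rate, pulse width, bandwidth
    t = np.arange(0, T, 1 / fs)
    chirp = np.exp(1j * np.pi * (B / T) * t**2)       # linear FM reference pulse

    echo = np.zeros(4096, dtype=complex)
    echo[1200:1200 + chirp.size] += 0.5 * chirp       # target return at a known delay
    echo += 0.1 * (np.random.randn(echo.size) + 1j * np.random.randn(echo.size))

    n = echo.size + chirp.size - 1
    compressed = np.fft.ifft(np.fft.fft(echo, n) * np.conj(np.fft.fft(chirp, n)))
    peak = int(np.argmax(np.abs(compressed)))
    print("compressed peak at sample", peak)          # near the injected delay of 1200
    ```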

  4. Statistical process control: a practical application for hospitals.

    PubMed

    VanderVeen, L M

    1992-01-01

    A six-step plan based on using statistics was designed to improve quality in the central processing and distribution department of a 223-bed hospital in Oakland, CA. This article describes how the plan was implemented sequentially, starting with the crucial first step of obtaining administrative support. The QI project succeeded in overcoming beginners' fear of statistics and in training both managers and staff to use inspection checklists, Pareto charts, cause-and-effect diagrams, and control charts. The best outcome of the program was the increased commitment to quality improvement by the members of the department.

  5. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...

  6. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...

  7. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... of the NEPA process and policies of the agencies can be obtained from: Policy and Management Planning... funded efforts; training programs, court improvement projects, research, and gathering statistical data. (2) Minor renovation projects or remodeling. (c) Actions which normally require environmental...

  8. Observing fermionic statistics with photons in arbitrary processes

    PubMed Central

    Matthews, Jonathan C. F.; Poulios, Konstantinos; Meinecke, Jasmin D. A.; Politi, Alberto; Peruzzo, Alberto; Ismail, Nur; Wörhoff, Kerstin; Thompson, Mark G.; O'Brien, Jeremy L.

    2013-01-01

    Quantum mechanics defines two classes of particles, bosons and fermions, whose exchange statistics fundamentally dictate quantum dynamics. Here we develop a scheme that uses entanglement to directly observe the correlated detection statistics of any number of fermions in any physical process. This approach relies on sending each of the entangled particles through identical copies of the process; by controlling a single phase parameter in the entangled state, the correlated detection statistics can be continuously tuned between bosonic and fermionic statistics. We implement this scheme via two entangled photons shared across the polarisation modes of a single photonic chip to directly mimic the fermion, boson and intermediate behaviour of two particles undergoing a continuous-time quantum walk. The ability to simulate fermions with photons is likely to have applications for verifying boson scattering and for observing particle correlations in analogue simulation using any physical platform that can prepare the entangled state prescribed here. PMID:23531788

  9. Guideline implementation in clinical practice: use of statistical process control charts as visual feedback devices.

    PubMed

    Al-Hussein, Fahad A

    2009-01-01

    To use statistical control charts in a series of audits to improve the acceptance and consistant use of guidelines, and reduce the variations in prescription processing in primary health care. A series of audits were done at the main satellite of King Saud Housing Family and Community Medicine Center, National Guard Health Affairs, Riyadh, where three general practitioners and six pharmacists provide outpatient care to about 3000 residents. Audits were carried out every fortnight to calculate the proportion of prescriptions that did not conform to the given guidelines of prescribing and dispensing. Simple random samples of thirty were chosen from a sampling frame of all prescriptions given in the two previous weeks. Thirty six audits were carried out from September 2004 to February 2006. P-charts were constructed around a parametric specification of non-conformities not exceeding 25%. Of the 1081 prescriptions, the most frequent non-conformity was failure to write generic names (35.5%), followed by the failure to record patient's weight (16.4%), pharmacist's name (14.3%), duration of therapy (9.1%), and the use of inappropriate abbreviations (6.0%). Initially, 100% of prescriptions did not conform to the guidelines, but within a period of three months, this came down to 40%. A process of audits in the context of statistical process control is necessary for any improvement in the implementation of guidelines in primary care. Statistical process control charts are an effective means of visual feedback to the care providers.
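    A p-chart of this kind is simple to construct. The hedged sketch below uses samples of n = 30 prescriptions and the 25% specification as the center line, with standard 3-sigma binomial limits; the non-conformity counts are hypothetical, not the study's data.

    ```python
    # Hedged sketch of a p-chart like the one described: fortnightly samples of n = 30
    # prescriptions, charted against a 25% specification for non-conformities.
    # The counts below are hypothetical, not the study's data.
    import math

    n = 30
    p0 = 0.25                                   # parametric specification used as the center line
    sigma = math.sqrt(p0 * (1 - p0) / n)
    ucl = p0 + 3 * sigma
    lcl = max(0.0, p0 - 3 * sigma)

    nonconforming = [30, 24, 18, 15, 12, 9, 11, 8, 12, 10]   # per audit of 30 prescriptions
    for audit, x in enumerate(nonconforming, start=1):
        p = x / n
        flag = "out of control" if (p > ucl or p < lcl) else "in control"
        print(f"audit {audit:2d}: p={p:.2f} ({flag}, UCL={ucl:.2f}, LCL={lcl:.2f})")
    ```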

  10. An On-Demand Optical Quantum Random Number Generator with In-Future Action and Ultra-Fast Response

    PubMed Central

    Stipčević, Mario; Ursin, Rupert

    2015-01-01

    Random numbers are essential for our modern information-based society, e.g. in cryptography. Unlike frequently used pseudo-random generators, physical random number generators do not depend on complex algorithms but rather on a physical process to provide true randomness. Quantum random number generators (QRNG) rely on a process which, even in principle, can be described only by a probabilistic theory. Here we present a conceptually simple implementation, which offers 100% efficiency of producing a random bit upon request and simultaneously exhibits ultra low latency. A careful technical and statistical analysis demonstrates its robustness against imperfections of the actually implemented technology and enables quick estimation of the randomness of very long sequences. Generated random numbers pass standard statistical tests without any post-processing. The setup described, as well as the theory presented here, demonstrate the maturity and overall understanding of the technology. PMID:26057576

  11. The Krigifier: A Procedure for Generating Pseudorandom Nonlinear Objective Functions for Computational Experimentation

    NASA Technical Reports Server (NTRS)

    Trosset, Michael W.

    1999-01-01

    Comprehensive computational experiments to assess the performance of algorithms for numerical optimization require (among other things) a practical procedure for generating pseudorandom nonlinear objective functions. We propose a procedure that is based on the convenient fiction that objective functions are realizations of stochastic processes. This report details the calculations necessary to implement our procedure for the case of certain stationary Gaussian processes and presents a specific implementation in the statistical programming language S-PLUS.
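    As a hedged sketch of the underlying idea (not the report's S-PLUS implementation, which also adds a deterministic trend), the example below draws one realization of a stationary Gaussian process with a squared-exponential covariance on a 1-D grid and treats it as a test objective.

    ```python
    # Hedged sketch of generating a pseudorandom test objective as one realization of a
    # stationary Gaussian process (squared-exponential covariance) on a 1-D grid.
    # The kernel and its parameters are illustrative choices, not the report's.
    import numpy as np

    def se_cov(x, variance=1.0, length_scale=0.2):
        """Squared-exponential covariance matrix for points x."""
        d = x[:, None] - x[None, :]
        return variance * np.exp(-0.5 * (d / length_scale) ** 2)

    rng = np.random.default_rng(42)
    x = np.linspace(0.0, 1.0, 200)
    K = se_cov(x) + 1e-8 * np.eye(x.size)        # jitter for numerical stability
    f = rng.multivariate_normal(mean=np.zeros(x.size), cov=K)   # one realization

    print("objective value at x=0.5:", f[np.argmin(np.abs(x - 0.5))])
    ```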

  12. A Theory of Information Quality and a Framework for Its Implementation in the Requirements Engineering Process

    ERIC Educational Resources Information Center

    Grenn, Michael W.

    2013-01-01

    This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of…

  13. Use of ventilator associated pneumonia bundle and statistical process control chart to decrease VAP rate in Syria.

    PubMed

    Alsadat, Reem; Al-Bardan, Hussam; Mazloum, Mona N; Shamah, Asem A; Eltayeb, Mohamed F E; Marie, Ali; Dakkak, Abdulrahman; Naes, Ola; Esber, Faten; Betelmal, Ibrahim; Kherallah, Mazen

    2012-10-01

    A ventilator-associated pneumonia (VAP) bundle was implemented as a performance improvement project in the critical care units for all mechanically ventilated patients, aiming to decrease VAP rates. The VAP bundle was implemented in 4 teaching hospitals after educational sessions, and compliance rates along with VAP rates were monitored using statistical process control charts. VAP bundle compliance rates increased steadily from 33 to 80% in hospital 1, from 33 to 86% in hospital 2 and from 83 to 100% in hospital 3 during the study period. The VAP bundle was not applied in hospital 4, therefore no data were available. A target level of 95% was reached only in hospital 3. This correlated with a decrease in VAP rates from 30 to 6.4 per 1000 ventilator days in hospital 1 and from 12 to 4.9 per 1000 ventilator days in hospital 3, whereas the VAP rate failed to decrease in hospital 2 (despite better compliance) and remained high, around 33 per 1000 ventilator days, in hospital 4 where the VAP bundle was not implemented. The VAP bundle performed differently in different hospitals in our study. Prevention of VAP requires a multidimensional strategy that includes strict infection control interventions, VAP bundle implementation, process and outcome surveillance and education.

  14. A quantitative assessment of patient and nurse outcomes of bedside nursing report implementation.

    PubMed

    Sand-Jecklin, Kari; Sherman, Jay

    2014-10-01

    To quantify the outcomes of a practice change to a blended form of bedside nursing report. The literature identifies several benefits of bedside nursing shift report. However, published studies have not adequately quantified outcomes related to this process change, having either small or unreported sample sizes or not testing for statistical significance. Quasi-experimental pre- and postimplementation design. Seven medical-surgical units in a large university hospital implemented a blend of recorded and bedside nursing report. Outcomes monitored included patient and nursing satisfaction, patient falls, nursing overtime and medication errors. We found statistically significant improvements postimplementation in four patient survey items specifically impacted by the change to bedside report. Nursing perceptions of report were significantly improved in the areas of patient safety, involvement in care and nurse accountability postimplementation. However, there was a decline in nurse perception that report took a reasonable amount of time after bedside report implementation; contrary to these perceptions, there was no significant increase in nurse overtime. Patient falls at shift change decreased substantially after the implementation of bedside report. An intervening variable during the study period invalidated the comparison of medication errors pre- and postintervention. There was some indication from both patients and nurses that bedside report was not always consistently implemented. Several positive outcomes were documented in relation to the implementation of a blended bedside shift report, with few drawbacks. Nurse attitudes about report at the final data collection were more positive than at the initial postimplementation data collection. If properly implemented, nursing bedside report can result in improved patient and nursing satisfaction and patient safety outcomes. However, managers should involve staff nurses in the implementation process and continue to monitor consistency in report format as well as satisfaction with the process. © 2014 John Wiley & Sons Ltd.

  15. Changes in Library Technology and Reference Desk Statistics: Is There a Relationship?

    ERIC Educational Resources Information Center

    Thomsett-Scott, Beth; Reese, Patricia E.

    2006-01-01

    The incorporation of technology into library processes has tremendously impacted staff and users alike. The University of North Texas (UNT) Libraries is no exception. Sixteen years of reference statistics are analyzed to examine the relationships between the implementation of CD-ROMs and web-based resources and the number of reference questions.…

  16. 48 CFR 1401.7001-4 - Acquisition performance measurement systems.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-pronged approach that includes self assessment, statistical data for validation and flexible quality... regulations governing the acquisition process; and (3) Identify and implement changes necessary to improve the...

  17. A Complex Approach to UXO Discrimination: Combining Advanced EMI Forward Models and Statistical Signal Processing

    DTIC Science & Technology

    2012-01-01

    discrimination at live-UXO sites. Namely, under this project we first developed and implemented advanced, physically complete forward EMI models such as the... Shubitidze of Sky Research and Dartmouth College conceived, implemented, and tested most of the approaches presented in this report. He developed

  18. Design of experiments (DoE) in pharmaceutical development.

    PubMed

    N Politis, Stavros; Colombo, Paolo; Colombo, Gaia; M Rekkas, Dimitrios

    2017-06-01

    At the beginning of the twentieth century, Sir Ronald Fisher introduced the concept of applying statistical analysis during the planning stages of research rather than at the end of experimentation. When statistical thinking is applied from the design phase, it enables quality to be built into the product, by adopting Deming's profound knowledge approach, comprising system thinking, variation understanding, theory of knowledge, and psychology. The pharmaceutical industry was late in adopting these paradigms compared to other sectors. It heavily focused on blockbuster drugs, while formulation development was mainly performed by One Factor At a Time (OFAT) studies, rather than implementing Quality by Design (QbD) and modern engineering-based manufacturing methodologies. Among various mathematical modeling approaches, Design of Experiments (DoE) is extensively used for the implementation of QbD in both research and industrial settings. In QbD, product and process understanding is the key enabler of assuring quality in the final product. Knowledge is achieved by establishing models correlating the inputs with the outputs of the process. The mathematical relationships of the Critical Process Parameters (CPPs) and Critical Material Attributes (CMAs) with the Critical Quality Attributes (CQAs) define the design space. Consequently, process understanding is well assured and rationally leads to a final product meeting the Quality Target Product Profile (QTPP). This review illustrates the principles of quality theory through the work of major contributors, the evolution of the QbD approach and the statistical toolset for its implementation. As such, DoE is presented in detail since it represents the first choice for rational pharmaceutical development.
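    As a hedged illustration of the DoE machinery discussed here, the sketch below builds a two-level full factorial design in three coded factors (standing in for hypothetical CPPs) and estimates main effects on a made-up response (standing in for a CQA) by least squares.

    ```python
    # Hedged illustration of a two-level full factorial DoE: three coded factors
    # (e.g. hypothetical CPPs such as temperature, mixing time, binder level) and a
    # least-squares fit of main effects on a made-up response (a CQA such as dissolution).
    import itertools
    import numpy as np

    runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)  # 2^3 = 8 runs
    response = np.array([71, 74, 78, 82, 70, 75, 80, 86], dtype=float)        # hypothetical CQA

    X = np.column_stack([np.ones(len(runs)), runs])          # intercept + main effects
    coef, *_ = np.linalg.lstsq(X, response, rcond=None)

    for name, b in zip(["intercept", "factor A", "factor B", "factor C"], coef):
        print(f"{name}: {b:+.2f}")
    ```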

  19. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    PubMed

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain the critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without any destruction of the sample. However, to successfully adapt PAT tools to the pharmaceutical and biopharmaceutical environment, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in the biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.

  20. Compensation for the signal processing characteristics of ultrasound B-mode scanners in adaptive speckle reduction.

    PubMed

    Crawford, D C; Bell, D S; Bamber, J C

    1993-01-01

    A systematic method to compensate for nonlinear amplification of individual ultrasound B-scanners has been investigated in order to optimise performance of an adaptive speckle reduction (ASR) filter for a wide range of clinical ultrasonic imaging equipment. Three potential methods have been investigated: (1) a method involving an appropriate selection of the speckle recognition feature was successful when the scanner signal processing executes simple logarithmic compressions; (2) an inverse transform (decompression) of the B-mode image was effective in correcting for the measured characteristics of image data compression when the algorithm was implemented in full floating point arithmetic; (3) characterising the behaviour of the statistical speckle recognition feature under conditions of speckle noise was found to be the method of choice for implementation of the adaptive speckle reduction algorithm in limited precision integer arithmetic. In this example, the statistical features of variance and mean were investigated. The third method may be implemented on commercially available fast image processing hardware and is also better suited for transfer into dedicated hardware to facilitate real-time adaptive speckle reduction. A systematic method is described for obtaining ASR calibration data from B-mode images of a speckle producing phantom.
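    For readers unfamiliar with variance/mean-driven filtering, the sketch below shows a generic Lee-type adaptive filter that smooths where the local statistics look like speckle; it is an illustration of the statistical features involved, not the authors' ASR algorithm or their compression compensation.

    ```python
    # Hedged sketch of a Lee-type adaptive filter driven by local mean and variance,
    # illustrating the use of those statistical features for speckle reduction. This is
    # a generic filter, not the authors' ASR algorithm or their compression compensation.
    import numpy as np
    from scipy.ndimage import uniform_filter

    def lee_filter(img, size=7, noise_var=0.02):
        mean = uniform_filter(img, size)
        mean_sq = uniform_filter(img * img, size)
        var = np.maximum(mean_sq - mean**2, 0.0)          # local variance
        gain = var / (var + noise_var)                    # smooth where variance is speckle-like
        return mean + gain * (img - mean)

    rng = np.random.default_rng(1)
    clean = np.tile(np.linspace(0.2, 0.8, 128), (128, 1))
    speckled = clean * rng.gamma(shape=10.0, scale=0.1, size=clean.shape)  # multiplicative noise
    filtered = lee_filter(speckled)
    print("error std before/after:", np.std(speckled - clean).round(3), np.std(filtered - clean).round(3))
    ```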

  1. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    NASA Technical Reports Server (NTRS)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but is also to perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.

  2. Making It Right the First Time.

    ERIC Educational Resources Information Center

    Wilcox, John

    1987-01-01

    The author discusses how using statistical process control can help manufacturers save money and produce a better product. He covers barriers to its implementation within an organization, focusing on training workers in the methods. (CH)

  3. User’s guide for MapMark4GUI—A graphical user interface for the MapMark4 R package

    USGS Publications Warehouse

    Shapiro, Jason

    2018-05-29

    MapMark4GUI is an R graphical user interface (GUI) developed by the U.S. Geological Survey to support user implementation of the MapMark4 R statistical software package. MapMark4 was developed by the U.S. Geological Survey to implement probability calculations for simulating undiscovered mineral resources in quantitative mineral resource assessments. The GUI provides an easy-to-use tool to input data, run simulations, and format output results for the MapMark4 package. The GUI is written and accessed in the R statistical programming language. This user’s guide includes instructions on installing and running MapMark4GUI and descriptions of the statistical output processes, output files, and test data files.

  4. Industrial implementation of spatial variability control by real-time SPC

    NASA Astrophysics Data System (ADS)

    Roule, O.; Pasqualini, F.; Borde, M.

    2016-10-01

    Advanced technology nodes require more and more information to get the wafer process set up well. The critical dimension of components decreases following Moore's law. At the same time, the intra-wafer dispersion linked to the spatial non-uniformity of tool processes cannot decrease in the same proportion. APC systems (Advanced Process Control) are being developed in the wafer fab to automatically adjust and tune wafer processing, based on extensive process context information. They can generate and monitor complex intra-wafer process profile corrections between different process steps. This leads us to put the spatial variability under control, in real time, with our SPC system (Statistical Process Control). This paper will outline the architecture of an integrated process control system for shape monitoring in 3D, implemented in the wafer fab.

  5. Molecular Monte Carlo Simulations Using Graphics Processing Units: To Waste Recycle or Not?

    PubMed

    Kim, Jihan; Rodgers, Jocelyn M; Athènes, Manuel; Smit, Berend

    2011-10-11

    In the waste recycling Monte Carlo (WRMC) algorithm, (1) multiple trial states may be simultaneously generated and utilized during Monte Carlo moves to improve the statistical accuracy of the simulations, suggesting that such an algorithm may be well posed for implementation in parallel on graphics processing units (GPUs). In this paper, we implement two waste recycling Monte Carlo algorithms in CUDA (Compute Unified Device Architecture) using uniformly distributed random trial states and trial states based on displacement random-walk steps, and we test the methods on a methane-zeolite MFI framework system to evaluate their utility. We discuss the specific implementation details of the waste recycling GPU algorithm and compare the methods to other parallel algorithms optimized for the framework system. We analyze the relationship between the statistical accuracy of our simulations and the CUDA block size to determine the efficient allocation of the GPU hardware resources. We make comparisons between the GPU and the serial CPU Monte Carlo implementations to assess speedup over conventional microprocessors. Finally, we apply our optimized GPU algorithms to the important problem of determining free energy landscapes, in this case for molecular motion through the zeolite LTA.

  6. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular.Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, the ABCreg simplifies implementing ABC based on local-linear regression.
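    The regression adjustment ABCreg automates can be sketched compactly. The Python example below (not ABCreg's code or command line) runs rejection ABC on a toy normal-mean problem and then applies the local linear regression correction to the accepted draws.

    ```python
    # Hedged Python sketch of ABC with local linear regression adjustment (the approach
    # ABCreg implements); this is not ABCreg's code or command-line interface.
    # Toy model: estimate the mean of a normal from its sample mean as summary statistic.
    import numpy as np

    rng = np.random.default_rng(7)
    obs_summary = 1.3                                  # observed summary statistic (sample mean)

    # 1. Simulate: draw parameters from the prior and compute summaries of simulated data.
    theta = rng.uniform(-5, 5, size=100_000)
    summaries = rng.normal(loc=theta, scale=1.0 / np.sqrt(20))   # sample mean of n=20 draws

    # 2. Rejection step: keep draws whose summaries are nearest the observed value.
    dist = np.abs(summaries - obs_summary)
    keep = dist <= np.quantile(dist, 0.01)             # retain the closest 1%
    theta_acc, s_acc = theta[keep], summaries[keep]

    # 3. Local linear regression: regress theta on the summary and adjust to obs_summary.
    X = np.column_stack([np.ones(s_acc.size), s_acc])
    (alpha, beta), *_ = np.linalg.lstsq(X, theta_acc, rcond=None)
    theta_adj = theta_acc + beta * (obs_summary - s_acc)

    print("posterior mean (adjusted):", theta_adj.mean().round(3))
    ```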

  7. Quality Control of the Print with the Application of Statistical Methods

    NASA Astrophysics Data System (ADS)

    Simonenko, K. V.; Bulatova, G. S.; Antropova, L. B.; Varepo, L. G.

    2018-04-01

    The basis for standardizing the process of offset printing is the control of print quality indicators. This problem can be approached in various ways, among which statistical methods are the most important. Their practical implementation for managing the quality of the printing process is highly relevant and is reflected in this paper. The possibility of using the Control Card (control chart) method to identify the reasons for deviation of the optical density of a triad of inks in offset printing is shown.

  8. Real-Time Noise Removal for Line-Scanning Hyperspectral Devices Using a Minimum Noise Fraction-Based Approach

    PubMed Central

    Bjorgan, Asgeir; Randeberg, Lise Lyngsnes

    2015-01-01

    Processing line-by-line and in real-time can be convenient for some applications of line-scanning hyperspectral imaging technology. Some types of processing, like inverse modeling and spectral analysis, can be sensitive to noise. The MNF (minimum noise fraction) transform provides suitable denoising performance, but requires full image availability for the estimation of image and noise statistics. In this work, a modified algorithm is proposed. Incrementally-updated statistics enables the algorithm to denoise the image line-by-line. The denoising performance has been compared to conventional MNF and found to be equal. With a satisfying denoising performance and real-time implementation, the developed algorithm can denoise line-scanned hyperspectral images in real-time. The elimination of waiting time before denoised data are available is an important step towards real-time visualization of processed hyperspectral data. The source code can be found at http://www.github.com/ntnu-bioopt/mnf. This includes an implementation of conventional MNF denoising. PMID:25654717
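    The enabling trick, incrementally updated statistics, amounts to a streaming mean and covariance. The sketch below shows a Welford-style line-by-line update in Python; it is an illustration of the idea, not the authors' implementation at the linked repository.

    ```python
    # Hedged sketch of the line-by-line statistics update that enables streaming denoising:
    # a running mean and covariance are updated as each scan line arrives (Welford-style),
    # instead of requiring the full image. This is not the authors' implementation.
    import numpy as np

    class RunningCovariance:
        def __init__(self, n_bands):
            self.n = 0
            self.mean = np.zeros(n_bands)
            self.scatter = np.zeros((n_bands, n_bands))   # sum of outer products of deviations

        def update(self, line):
            """line: (n_pixels, n_bands) array for one scan line."""
            for x in line:
                self.n += 1
                delta = x - self.mean
                self.mean += delta / self.n
                self.scatter += np.outer(delta, x - self.mean)

        @property
        def cov(self):
            return self.scatter / max(self.n - 1, 1)

    rng = np.random.default_rng(3)
    stats = RunningCovariance(n_bands=5)
    for _ in range(10):                                   # ten incoming scan lines
        stats.update(rng.normal(size=(64, 5)))
    print(np.round(np.diag(stats.cov), 2))                # per-band variance estimates near 1
    ```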

  9. Low-cost and high-speed optical mark reader based on an intelligent line camera

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin

    2003-08-01

    Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR, that data integrity is checked before the data are processed, and that data are validated before they are processed. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.

  10. Guidelines for Assessment and Instruction in Statistics Education (GAISE): extending GAISE into nursing education.

    PubMed

    Hayat, Matthew J

    2014-04-01

    Statistics coursework is usually a core curriculum requirement for nursing students at all degree levels. The American Association of Colleges of Nursing (AACN) establishes curriculum standards for academic nursing programs. However, the AACN provides little guidance on statistics education and does not offer standardized competency guidelines or recommendations about course content or learning objectives. Published standards may be used in the course development process to clarify course content and learning objectives. This article includes suggestions for implementing and integrating recommendations given in the Guidelines for Assessment and Instruction in Statistics Education (GAISE) report into statistics education for nursing students. Copyright 2014, SLACK Incorporated.

  11. Effects of a proposed quality improvement process in the proportion of the reported ultrasound findings unsupported by stored images.

    PubMed

    Schenone, Mauro; Ziebarth, Sarah; Duncan, Jose; Stokes, Lea; Hernandez, Angela

    2018-02-05

    To investigate the proportion of documented ultrasound findings that were unsupported by stored ultrasound images in the obstetric ultrasound unit, before and after the implementation of a quality improvement process consisting of a checklist and feedback. A quality improvement process was created involving utilization of a checklist and feedback from physician to sonographer. The feedback was based on findings of the physician's review of the report and images using a check list. To assess the impact of this process, two groups were compared. Group 1 consisted of 58 ultrasound reports created prior to initiation of the process. Group 2 included 65 ultrasound reports created after process implementation. Each chart was reviewed by a physician and a sonographer. Findings considered unsupported by stored images by both reviewers were used for analysis, and the proportion of unsupported findings was compared between the two groups. Results are expressed as mean ± standard error. A p value of < .05 was used to determine statistical significance. Univariate analysis of baseline characteristics and potential confounders showed no statistically significant difference between the groups. The mean proportion of unsupported findings in Group 1 was 5.1 ± 0.87, with Group 2 having a significantly lower proportion (2.6 ± 0.62) (p value = .018). Results suggest a significant decrease in the proportion of unsupported findings in ultrasound reports after quality improvement process implementation. Thus, we present a simple yet effective quality improvement process to reduce unsupported ultrasound findings.

  12. Quality control troubleshooting tools for the mill floor

    Treesearch

    John Dramm

    2000-01-01

    Statistical Process Control (SPC) provides effective tools for improving process quality in the forest products industry resulting in reduced costs and improved productivity. Implementing SPC helps identify and locate problems that occur in wood products manufacturing. SPC tools achieve their real value when applied on the mill floor for monitoring and troubleshooting...

  13. Information integration and diagnosis analysis of equipment status and production quality for machining process

    NASA Astrophysics Data System (ADS)

    Zan, Tao; Wang, Min; Hu, Jianzhong

    2010-12-01

    Machining status monitoring with multiple sensors can acquire and analyze machining process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal fluctuations through statistical methods. In this paper, by comparing the advantages and disadvantages of the two methods, the necessity and feasibility of their integration and fusion are introduced. An approach is then put forward that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet, and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE signal information of the wheel dressing process, the reason for machining quality fluctuation was obtained. The experimental results indicate that the approach is suitable for status monitoring and analysis of the machining process.

  14. Implementation of GPU accelerated SPECT reconstruction with Monte Carlo-based scatter correction.

    PubMed

    Bexelius, Tobias; Sohlberg, Antti

    2018-06-01

    Statistical SPECT reconstruction can be very time-consuming, especially when compensations for collimator and detector response, attenuation, and scatter are included in the reconstruction. This work proposes an accelerated SPECT reconstruction algorithm based on graphics processing unit (GPU) processing. An ordered subset expectation maximization (OSEM) algorithm with CT-based attenuation modelling, depth-dependent Gaussian convolution-based collimator-detector response modelling, and Monte Carlo-based scatter compensation was implemented using OpenCL. The OpenCL implementation was compared against the existing multi-threaded OSEM implementation running on a central processing unit (CPU) in terms of scatter-to-primary ratios, standardized uptake values (SUVs), and processing speed using mathematical phantoms and clinical multi-bed bone SPECT/CT studies. The difference in scatter-to-primary ratios, visual appearance, and SUVs between the GPU and CPU implementations was minor. On the other hand, at its best, the GPU implementation was 24 times faster than the multi-threaded CPU version on a typical 128 × 128 matrix, 3-bed bone SPECT/CT data set when compensations for collimator and detector response, attenuation, and scatter were included. GPU SPECT reconstruction shows great promise as an everyday clinical reconstruction tool.
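
    The core of the reconstruction described above is the ordered subset expectation maximization (OSEM) update. The sketch below shows a generic OSEM iteration in plain NumPy, assuming a dense system matrix and ignoring the attenuation, collimator-detector response, and Monte Carlo scatter terms the paper folds into its GPU implementation; the function name and toy data are illustrative, not taken from the paper.

    ```python
    import numpy as np

    def osem_iteration(x, A, y, subsets):
        """One OSEM iteration: an MLEM-style multiplicative update per ordered subset.

        x       -- current image estimate, shape (n_voxels,)
        A       -- system matrix, shape (n_bins, n_voxels)
        y       -- measured projections, shape (n_bins,)
        subsets -- list of index arrays partitioning the projection bins
        """
        eps = 1e-12                            # guard against division by zero
        for s in subsets:
            A_s = A[s, :]                      # rows belonging to this subset
            fwd = A_s @ x + eps                # forward projection of current estimate
            ratio = y[s] / fwd                 # measured / estimated counts
            backproj = A_s.T @ ratio           # backproject the ratio
            sens = A_s.sum(axis=0) + eps       # subset sensitivity image
            x = x * backproj / sens            # multiplicative update
        return x

    # Tiny toy example with random data (illustration only).
    rng = np.random.default_rng(0)
    A = rng.random((64, 16))
    y = A @ rng.random(16)
    x = np.ones(16)
    subsets = np.array_split(np.arange(64), 4)
    for _ in range(20):
        x = osem_iteration(x, A, y, subsets)
    ```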

  15. A survey of statistics in three UK general practice journals

    PubMed Central

    Rigby, Alan S; Armstrong, Gillian K; Campbell, Michael J; Summerton, Nick

    2004-01-01

    Background Many medical specialities have reviewed the statistical content of their journals. To our knowledge this has not been done in general practice. Given the main role of a general practitioner as a diagnostician, we thought it would be of interest to see whether the statistical methods reported reflect the diagnostic process. Methods Hand search of three UK journals of general practice, namely the British Medical Journal (general practice section), the British Journal of General Practice and Family Practice, over a one-year period (1 January to 31 December 2000). Results A wide variety of statistical techniques were used. The most common methods included t-tests and Chi-squared tests. There were few articles reporting likelihood ratios and other useful diagnostic methods. There was evidence that the journals with the more thorough statistical review process reported a more complex and wider variety of statistical techniques. Conclusions The BMJ had a wider range and greater diversity of statistical methods than the other two journals. However, in all three journals there was a dearth of papers reflecting the diagnostic process. Across all three journals there were relatively few papers describing randomised controlled trials, thus recognising the difficulty of implementing this design in general practice. PMID:15596014

  16. Low-level processing for real-time image analysis

    NASA Technical Reports Server (NTRS)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.

  17. Process-based organization design and hospital efficiency.

    PubMed

    Vera, Antonio; Kuntz, Ludwig

    2007-01-01

    The central idea of process-based organization design is that organizing a firm around core business processes leads to cost reductions and quality improvements. We investigated theoretically and empirically whether the implementation of a process-based organization design is advisable in hospitals. The data came from a database compiled by the Statistical Office of the German federal state of Rheinland-Pfalz and from a written questionnaire, which was sent to the chief executive officers (CEOs) of all 92 hospitals in this federal state. We used data envelopment analysis (DEA) to measure hospital efficiency, and factor analysis and regression analysis to test our hypothesis. Our principal finding is that a high degree of process-based organization has a moderate but significant positive effect on the efficiency of hospitals. The main implication is that hospitals should implement a process-based organization to improve their efficiency. However, to actually achieve positive effects on efficiency, it is of paramount importance to observe some implementation rules, in particular to mobilize physician participation and to create an adequate organizational culture.

  18. Design of an FPGA-Based Algorithm for Real-Time Solutions of Statistics-Based Positioning

    PubMed Central

    DeWitt, Don; Johnson-Williams, Nathan G.; Miyaoka, Robert S.; Li, Xiaoli; Lockhart, Cate; Lewellen, Tom K.; Hauck, Scott

    2010-01-01

    We report on the implementation of an algorithm and hardware platform to allow real-time processing of the statistics-based positioning (SBP) method for continuous miniature crystal element (cMiCE) detectors. The SBP method allows an intrinsic spatial resolution of ~1.6 mm FWHM to be achieved using our cMiCE design. Previous SBP solutions have required a postprocessing procedure due to the computation and memory intensive nature of SBP. This new implementation takes advantage of a combination of algebraic simplifications, conversion to fixed-point math, and a hierarchical search technique to greatly accelerate the algorithm. For the presented seven stage, 127 × 127 bin LUT implementation, these algorithm improvements result in a reduction from >7 × 10^6 floating-point operations per event for an exhaustive search to <5 × 10^3 integer operations per event. Simulations show nearly identical FWHM positioning resolution for this accelerated SBP solution, and positioning differences of <0.1 mm from the exhaustive search solution. A pipelined field programmable gate array (FPGA) implementation of this optimized algorithm is able to process events in excess of 250 K events per second, which is greater than the maximum expected coincidence rate for an individual detector. In contrast with all detectors being processed at a centralized host, as in the current system, a separate FPGA is available at each detector, thus dividing the computational load. These methods allow SBP results to be calculated in real-time and to be presented to the image generation components in real-time. A hardware implementation has been developed using a commercially available prototype board. PMID:21197135
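
    As a rough illustration of the coarse-to-fine idea behind a hierarchical search, the sketch below narrows a 127 × 127 bin grid search around the current best bin at successively finer step sizes. It is a generic sketch, not the paper's FPGA logic: the cost function, the `span` parameter, and the quadratic toy example are all assumptions.

    ```python
    import numpy as np

    def hierarchical_argmin(cost, grid_size=127, levels=7, span=2):
        """Coarse-to-fine search for the (ix, iy) bin minimizing cost(ix, iy).

        Rather than evaluating all grid_size**2 bins, each level evaluates a
        small neighbourhood around the current best guess, roughly halving the
        step size each time.
        """
        best = (grid_size // 2, grid_size // 2)      # start at the centre bin
        step = grid_size // 2
        for _ in range(levels):
            candidates = []
            for dx in range(-span, span + 1):
                for dy in range(-span, span + 1):
                    ix = int(np.clip(best[0] + dx * step, 0, grid_size - 1))
                    iy = int(np.clip(best[1] + dy * step, 0, grid_size - 1))
                    candidates.append((cost(ix, iy), (ix, iy)))
            best = min(candidates)[1]                # keep the cheapest candidate
            step = max(1, step // 2)
        return best

    # Hypothetical cost: quadratic bowl centred on the "true" bin (40, 90).
    best_bin = hierarchical_argmin(lambda ix, iy: (ix - 40) ** 2 + (iy - 90) ** 2)
    ```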

  19. A quality control circle process to improve implementation effect of prevention measures for high-risk patients.

    PubMed

    Feng, Haixia; Li, Guohong; Xu, Cuirong; Ju, Changping; Suo, Peiheng

    2017-12-01

    The aim of the study was to analyse the influence of prevention measures on pressure injuries for high-risk patients and to establish the most appropriate methods of implementation. Nurses assessed patients using a checklist, and factors influencing the prevention of a pressure injury were determined by brainstorming. A specific series of measures was drawn up, and an estimate of the risk of pressure injury was determined using the Braden Scale, analysis of nursing documents, implementation of prevention measures for pressure sores and awareness of the system, both before and after carrying out a quality control circle (QCC) process. The overall scores of implementation of prevention measures ranged from 74.86 ± 14.24 to 87.06 ± 17.04, a result that was statistically significant (P < 0.0025). The Braden Scale scores ranged from 8.53 ± 3.21 to 13.48 ± 3.57. The nursing document scores ranged from 7.67 ± 3.98 to 10.12 ± 1.63; prevention measure scores ranged from 11.48 ± 4.18 to 13.96 ± 3.92. Differences in all of the above results are statistically significant (P < 0.05). Implementation of a QCC can standardise and improve the prevention measures for patients who are vulnerable to pressure sores and is of practical importance to their prevention and control. © 2017 Medicalhelplines.com Inc and John Wiley & Sons Ltd.

  20. Implementation and evaluation of an efficient secure computation system using ‘R’ for healthcare statistics

    PubMed Central

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-01-01

    Background and objective While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Materials and methods Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software ‘R’ by effectively combining secret-sharing-based secure computation with original computation. Results Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50 000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. Discussion If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using ‘R’ that works interactively while secure computation protocols generally require a significant amount of processing time. Conclusions We propose a secure statistical analysis system using ‘R’ for medical data that effectively integrates secret-sharing-based secure computation and original computation. PMID:24763677
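
    For intuition about the secret-sharing-based computation described above, the following minimal sketch splits integer records into additive shares modulo a prime so that each party can sum its local shares and only the total is reconstructed; the mean then follows from the total. It is not the authors' 'R'-integrated protocol: the modulus, party count, and dummy values are assumptions, and operations such as unbiased variance, comparisons, and medians would require additional multiplication/comparison protocols omitted here.

    ```python
    import random

    PRIME = 2 ** 61 - 1          # field modulus, assumed larger than any sum we compute

    def share(value, n_parties=3):
        """Split a non-negative integer into n additive shares modulo PRIME."""
        shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % PRIME)
        return shares

    def reconstruct(shares):
        """Recombine additive shares into the original value modulo PRIME."""
        return sum(shares) % PRIME

    # Each record is shared; parties add their local shares without seeing the data.
    records = [123, 456, 789, 1011]                       # dummy "claim" values
    per_record_shares = [share(v) for v in records]
    party_sums = [sum(col) % PRIME for col in zip(*per_record_shares)]

    total = reconstruct(party_sums)
    mean = total / len(records)
    assert total == sum(records)
    ```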

  1. Implementation and evaluation of an efficient secure computation system using 'R' for healthcare statistics.

    PubMed

    Chida, Koji; Morohashi, Gembu; Fuji, Hitoshi; Magata, Fumihiko; Fujimura, Akiko; Hamada, Koki; Ikarashi, Dai; Yamamoto, Ryuichi

    2014-10-01

    While the secondary use of medical data has gained attention, its adoption has been constrained due to protection of patient privacy. Making medical data secure by de-identification can be problematic, especially when the data concerns rare diseases. We require rigorous security management measures. Using secure computation, an approach from cryptography, our system can compute various statistics over encrypted medical records without decrypting them. An issue of secure computation is that the amount of processing time required is immense. We implemented a system that securely computes healthcare statistics from the statistical computing software 'R' by effectively combining secret-sharing-based secure computation with original computation. Testing confirmed that our system could correctly complete computation of average and unbiased variance of approximately 50,000 records of dummy insurance claim data in a little over a second. Computation including conditional expressions and/or comparison of values, for example, t test and median, could also be correctly completed in several tens of seconds to a few minutes. If medical records are simply encrypted, the risk of leaks exists because decryption is usually required during statistical analysis. Our system possesses high-level security because medical records remain in encrypted state even during statistical analysis. Also, our system can securely compute some basic statistics with conditional expressions using 'R' that works interactively while secure computation protocols generally require a significant amount of processing time. We propose a secure statistical analysis system using 'R' for medical data that effectively integrates secret-sharing-based secure computation and original computation. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  2. Signal-Processing Algorithm Development for the ACLAIM Sensor

    NASA Technical Reports Server (NTRS)

    vonLaven, Scott

    1995-01-01

    Methods for further minimizing the risk by making use of previous lidar observations were investigated. Empirical orthogonal functions (EOFs) are likely to play an important role in these methods, and a procedure for extracting EOFs from data has been implemented. The new processing methods involving EOFs could range from extrapolation, as discussed, to more complicated statistical procedures for maintaining low unstart risk.
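
    EOF extraction is equivalent to a principal component analysis of the anomaly data matrix. The sketch below shows one common way to compute EOFs via the singular value decomposition, using dummy profile data; it is a generic illustration under those assumptions, not the ACLAIM processing chain.

    ```python
    import numpy as np

    def extract_eofs(data):
        """Return EOF patterns, principal components, and explained variance fractions.

        data -- array of shape (n_samples, n_points), e.g. lidar profiles stacked in time.
        """
        anomalies = data - data.mean(axis=0)             # remove the time mean
        u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
        eofs = vt                                        # rows are the EOF spatial patterns
        pcs = u * s                                      # projection of each sample onto the EOFs
        explained = s ** 2 / np.sum(s ** 2)              # fraction of variance per EOF
        return eofs, pcs, explained

    # Dummy example: 200 "observations" of a 50-point profile (illustration only).
    rng = np.random.default_rng(1)
    eofs, pcs, explained = extract_eofs(rng.normal(size=(200, 50)))
    ```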

  3. Implementation of a real-time statistical process control system in hardwood sawmills

    Treesearch

    Timothy M. Young; Brian H. Bond; Jan Wiedenbeck

    2007-01-01

    Variation in sawmill processes reduces the financial benefit of converting fiber from a log into lumber. Lumber is intentionally oversized during manufacture to allow for sawing variation, shrinkage from drying, and final surfacing. This oversizing of lumber due to sawing variation requires higher operating targets and leads to suboptimal fiber recovery. For more than...

  4. Role of University-Industry-Government Linkages in the Innovation Processes of a Small Catching-Up Economy

    ERIC Educational Resources Information Center

    Varblane, Urmas; Mets, Tonis; Ukrainski, Kadri

    2008-01-01

    During the transformation process from a command economy, the extraordinary statist university-industry-government (UIG) linkages model was replaced by an extreme version of laissez-faire relationships. A more modern interaction-based UIG model could be implemented only by changing the whole national innovation system of catching-up economies. The…

  5. Stakeholders' views of the introduction of assistive technology in the classroom: How family-centred is Australian practice for students with cerebral palsy?

    PubMed

    Karlsson, P; Johnston, C; Barker, K

    2017-07-01

    With family-centred care widely recognized as a cornerstone for effective assistive technology service provision, the current study was undertaken to investigate to what extent such approaches were used by schools when assistive technology assessments and implementation occurred in the classroom. In this cross-sectional study, we compare survey results from parents (n = 76), school staff (n = 33) and allied health professionals (n = 65) with experience in the use of high-tech assistive technology. Demographic characteristics and the stakeholders' perceived helpfulness and frequency of attending assessment and set-up sessions were captured. To evaluate how family-centred the assistive technology services were perceived to be, the parents filled out the Measure of Processes of Care for Caregivers, and the professionals completed the Measure of Processes of Care for Service Providers. Descriptive statistics and one-way analysis of variance were used to conduct the data analysis. Findings show that parents are more involved during the assessment stage than during the implementation and that classroom teachers are often not involved in the initial stage. Speech pathologists in particular were seen as highly helpful when implementing assistive technology in the classroom. This study found that family-centred service is not yet fully achieved in schools despite being endorsed in early intervention and disability services for over 20 years. No statistically significant differences were found with respect to school staff and allied health professionals' roles, their years of experience working with students with cerebral palsy and the scales in the Measure of Processes of Care for Service Providers. To enhance the way technology is matched to the student and successfully implemented, classroom teachers need to be fully involved in the whole assistive technology process. The findings also point to the significance of parents' involvement, with the support of allied health professionals, in the process of selecting and implementing assistive technology in the classroom. © 2017 John Wiley & Sons Ltd.

  6. In-network processing of joins in wireless sensor networks.

    PubMed

    Kang, Hyunchul

    2013-03-11

    The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified.

  7. In-Network Processing of Joins in Wireless Sensor Networks

    PubMed Central

    Kang, Hyunchul

    2013-01-01

    The join or correlated filtering of sensor readings is one of the fundamental query operations in wireless sensor networks (WSNs). Although the join in centralized or distributed databases is a well-researched problem, join processing in WSNs has quite different characteristics and is much more difficult to perform due to the lack of statistics on sensor readings and the resource constraints of sensor nodes. Since data transmission is orders of magnitude more costly than processing at a sensor node, in-network processing of joins is essential. In this paper, the state-of-the-art techniques for join implementation in WSNs are surveyed. The requirements and challenges, join types, and components of join implementation are described. The open issues for further research are identified. PMID:23478603

  8. Application of pedagogy reflective in statistical methods course and practicum statistical methods

    NASA Astrophysics Data System (ADS)

    Julie, Hongki

    2017-08-01

    The subjects Elementary Statistics, Statistical Methods and Statistical Methods Practicum aim to equip students of Mathematics Education with descriptive statistics and inferential statistics. Students' understanding of descriptive and inferential statistics is important in the Mathematics Education Department, especially for those whose final project involves quantitative research. In quantitative research, students are required to present and describe quantitative data in an appropriate manner, to draw conclusions from their quantitative data, and to describe the relationships between the independent and dependent variables defined in their research. In fact, when students carried out final projects involving quantitative research, it was still not rare to find them making mistakes in the steps of drawing conclusions and errors in choosing the hypothesis-testing procedure. As a result, they reached incorrect conclusions, which is a very serious mistake for those doing quantitative research. Several outcomes were gained from the implementation of reflective pedagogy in the teaching and learning process of the Statistical Methods and Statistical Methods Practicum courses, namely: 1. Twenty-two students passed the course and one student did not. 2. The highest grade achieved was A, attained by 18 students. 3. According to all students, they were able to develop a critical stance and to build care for each other through the learning process in this course. 4. All students agreed that, through the learning process they underwent in the course, they could build care for each other.

  9. Using a Systematic Conceptual Model for a Process Evaluation of a Middle School Obesity Risk-Reduction Nutrition Curriculum Intervention: Choice, Control & Change

    PubMed Central

    Lee, Heewon; Contento, Isobel R.; Koch, Pamela

    2012-01-01

    Objective To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. Design A process evaluation study based on a systematic conceptual model. Setting Five middle schools in New York City. Participants 562 students in 20 classes and their science teachers (n=8). Main Outcome Measures Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers’ curriculum evaluation, and satisfaction with teaching the curriculum. Analysis Descriptive statistics and Spearman’s Rho Correlation for quantitative analysis and content analysis for qualitative data were used. Results Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teachers’ satisfaction with teaching the curriculum was highly correlated with students’ satisfaction (p <.05). Teachers’ perception of amount of student work was negatively correlated with implementation and with student satisfaction (p<.05). Conclusions and implications Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. PMID:23321021

  10. Using a systematic conceptual model for a process evaluation of a middle school obesity risk-reduction nutrition curriculum intervention: choice, control & change.

    PubMed

    Lee, Heewon; Contento, Isobel R; Koch, Pamela

    2013-03-01

    To use and review a conceptual model of process evaluation and to examine the implementation of a nutrition education curriculum, Choice, Control & Change, designed to promote dietary and physical activity behaviors that reduce obesity risk. A process evaluation study based on a systematic conceptual model. Five middle schools in New York City. Five hundred sixty-two students in 20 classes and their science teachers (n = 8). Based on the model, teacher professional development, teacher implementation, and student reception were evaluated. Also measured were teacher characteristics, teachers' curriculum evaluation, and satisfaction with teaching the curriculum. Descriptive statistics and Spearman ρ correlation for quantitative analysis and content analysis for qualitative data were used. Mean score of the teacher professional development evaluation was 4.75 on a 5-point scale. Average teacher implementation rate was 73%, and the student reception rate was 69%. Ongoing teacher support was highly valued by teachers. Teacher satisfaction with teaching the curriculum was highly correlated with student satisfaction (P < .05). Teacher perception of amount of student work was negatively correlated with implementation and with student satisfaction (P < .05). Use of a systematic conceptual model and comprehensive process measures improves understanding of the implementation process and helps educators to better implement interventions as designed. Copyright © 2013 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  11. Total Quality Management (TQM), an Overview

    DTIC Science & Technology

    1991-09-01

    Quality Management (TQM). It discusses the reasons TQM is a current growth industry, what it is, and how one implements it. It describes the basic analytical tools, statistical process control, some advanced analytical tools, tools used by process improvement teams to enhance their own operations, and action plans for making improvements. The final sections discuss assessing quality efforts and measuring the quality to knowledge

  12. A quality improvement project to improve the Medicare and Medicaid Services (CMS) sepsis bundle compliance rate in a large healthcare system.

    PubMed

    Raschke, Robert A; Groves, Robert H; Khurana, Hargobind S; Nikhanj, Nidhi; Utter, Ethel; Hartling, Didi; Stoffer, Brenda; Nunn, Kristina; Tryon, Shona; Bruner, Michelle; Calleja, Maria; Curry, Steven C

    2017-01-01

    Sepsis is a leading cause of mortality and morbidity in hospitalised patients. The Centers for Medicare and Medicaid Services (CMS) mandated that US hospitals report sepsis bundle compliance rate as a quality process measure in October 2015. The specific aim of our study was to improve the CMS sepsis bundle compliance rate from 30% to 40% across 20 acute care hospitals in our healthcare system within 1 year. The study included all adult inpatients with sepsis sampled according to CMS specifications from October 2015 to September 2016. The CMS sepsis bundle compliance rate was tracked monthly using statistical process control charting. A baseline rate of 28.5% with 99% control limits was established. We implemented multiple interventions including computerised decision support systems (CDSSs) to increase compliance with the most commonly missing bundle elements. Compliance reached 42% (99% statistical process control limits 18.4%-38.6%) as CDSS was implemented system-wide, but this improvement was not sustained after CMS changed specifications of the outcome measure. Difficulties encountered elucidate shortcomings of our study methodology and of the CMS sepsis bundle compliance rate as a quality process measure.
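
    A compliance rate tracked on a statistical process control chart is typically handled with a p-chart. The sketch below computes a centre line and approximate 99% control limits (z = 2.576) for monthly compliance proportions; the function name and the dummy monthly counts are illustrative, not the study's data.

    ```python
    import numpy as np

    def p_chart_limits(compliant, sampled, z=2.576):
        """Centre line and per-month control limits for a compliance proportion.

        compliant -- compliant cases per month
        sampled   -- sampled cases per month
        z         -- normal quantile for the limits (2.576 gives ~99% limits)
        """
        compliant = np.asarray(compliant, dtype=float)
        sampled = np.asarray(sampled, dtype=float)
        p_bar = compliant.sum() / sampled.sum()                  # overall rate
        sigma = np.sqrt(p_bar * (1 - p_bar) / sampled)           # varies with sample size
        lcl = np.clip(p_bar - z * sigma, 0.0, 1.0)
        ucl = np.clip(p_bar + z * sigma, 0.0, 1.0)
        return p_bar, lcl, ucl

    # Dummy monthly samples (illustration only).
    compliant = np.array([28, 30, 35, 42])
    sampled = np.array([100, 100, 100, 100])
    p_bar, lcl, ucl = p_chart_limits(compliant, sampled)
    rates = compliant / sampled
    special_cause = (rates > ucl) | (rates < lcl)   # months signalling special-cause variation
    ```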

  13. Total Quality Management Implementation Strategy: Directorate of Quality Assurance

    DTIC Science & Technology

    1989-05-01

    Total Quality Control; Harrington, H. James: The Improvement Process; Imai, Masaaki: Kaizen; Ishikawa, Kaoru: What is Total Quality Control; Ishikawa, Kaoru: Statistical Quality Control; Juran, J. M.: Managerial Breakthrough; Juran, J. M.: Quality Control Handbook; Mizuno (Ed.): Managing for Quality Improvements

  14. Using process data to understand outcomes in sexual health promotion: an example from a review of school-based programmes to prevent sexually transmitted infections.

    PubMed

    Shepherd, J; Harden, A; Barnett-Page, E; Kavanagh, J; Picot, J; Frampton, G K; Cooper, K; Hartwell, D; Clegg, A

    2014-08-01

    This article discusses how process indicators can complement outcomes as part of a comprehensive explanatory evaluation framework, using the example of skills-based behavioural interventions to prevent sexually transmitted infections and promote sexual health among young people in schools. A systematic review was conducted, yielding 12 eligible outcome evaluations, 9 of which included a process evaluation. There were few statistically significant effects in terms of changes in sexual behaviour outcomes, but statistically significant effects were more common for knowledge and self-efficacy. Synthesis of the findings of the process evaluations identified a range of factors that might explain outcomes, and these were organized into two overarching categories: the implementation of interventions, and student engagement and intervention acceptability. Factors which supported implementation and engagement and acceptability included good quality teacher training, involvement and motivation of key school stakeholders and relevance and appeal to young people. Factors which had a negative impact included teachers' failure to comprehend the theoretical basis for behaviour change, school logistical problems and omission of topics that young people considered important. It is recommended that process indicators such as these be assessed in future evaluations of school-based sexual health behavioural interventions, as part of a logic model. © Crown copyright 2014.

  15. Effect of promoting self-esteem by participatory learning process on emotional intelligence among early adolescents.

    PubMed

    Munsawaengsub, Chokchai; Yimklib, Somkid; Nanthamongkolchai, Sutham; Apinanthavech, Suporn

    2009-12-01

    To study the effect of a program promoting self-esteem through participatory learning on emotional intelligence among early adolescents. The quasi-experimental study was conducted in grade 9 students from two schools in Bangbuathong district, Nonthaburi province. The experimental and comparative groups each consisted of 34 students with the lowest scores of emotional intelligence. The instruments were questionnaires, the Program to Develop Emotional Intelligence and the Handbook of Emotional Intelligence Development. The experimental group attended 8 participatory learning activities over 4 weeks to develop emotional intelligence, while the comparative group received the handbook for self-study. The effectiveness of the program was assessed by pre-test and post-test of emotional intelligence immediately and 4 weeks after the intervention. Implementation and evaluation were done during May 24-August 12, 2005. Data were analyzed by frequency, percentage, mean, standard deviation, Chi-square, independent sample t-test and paired sample t-test. Before program implementation, both groups had no statistical difference in mean score of emotional intelligence. After the intervention, the experimental group had a higher mean score of emotional intelligence both immediately and 4 weeks later, with statistical significance (p = 0.001 and p < 0.001). At 4 weeks after the experiment, the mean score in the experimental group was higher than the mean score immediately after the experiment, with statistical significance (p < 0.001). The program promoting self-esteem through a participatory learning process could enhance emotional intelligence in early adolescents. This program could be modified and implemented for early adolescents in the community.

  16. Radiation from quantum weakly dynamical horizons in loop quantum gravity.

    PubMed

    Pranzetti, Daniele

    2012-07-06

    We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.

  17. Description and texts for the auxiliary programs for processing video information on the YeS computer. Part 3: Test program 2

    NASA Technical Reports Server (NTRS)

    Borisenko, V. I., G.g.; Stetsenko, Z. A.

    1980-01-01

    The functions are described, and the operating instructions, block diagram, and proposed versions are given for modifying the program in order to obtain the statistical characteristics of multi-channel video information. The program implements certain man-machine methods for investigating video information. It permits representation of the material and its statistical characteristics in a form which is convenient for the user.

  18. Using statistical process control to make data-based clinical decisions.

    PubMed

    Pfadt, A; Wheeler, D J

    1995-01-01

    Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior as well as for the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered.
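
    A common Shewhart-style tool for the kind of behavioral count data discussed above is the individuals and moving range (XmR) chart. The sketch below computes its control limits using the standard constants for a moving range of two consecutive points; the dummy daily counts are illustrative only, not data from the paper.

    ```python
    import numpy as np

    def xmr_limits(values):
        """Individuals (X) and moving-range (mR) chart limits, Shewhart-style.

        Uses the standard constants for a moving range of two points:
        3/d2 = 2.66 and D4 = 3.268.
        """
        x = np.asarray(values, dtype=float)
        mr = np.abs(np.diff(x))                  # moving ranges between consecutive points
        x_bar, mr_bar = x.mean(), mr.mean()
        x_limits = (x_bar - 2.66 * mr_bar, x_bar + 2.66 * mr_bar)
        mr_limits = (0.0, 3.268 * mr_bar)
        return x_bar, x_limits, mr_bar, mr_limits

    # Dummy daily frequency counts of a target behavior (illustration only).
    counts = [4, 6, 5, 7, 5, 4, 6, 12, 5, 6]
    x_bar, (x_lcl, x_ucl), mr_bar, (mr_lcl, mr_ucl) = xmr_limits(counts)
    signals = [i for i, v in enumerate(counts) if v > x_ucl or v < x_lcl]
    ```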

  19. Implementing Lean Six Sigma to achieve inventory control in supply chain management

    NASA Astrophysics Data System (ADS)

    Hong, Chen

    2017-11-01

    Inventory cost has an important impact on production cost. To maximize the circulation of enterprise funds with minimum inventory cost, inventory control with Lean Six Sigma is presented in supply chain management. The inventory includes both the raw material and the semi-finished parts in the manufacturing process. Although inventory is often studied, inventory control within the manufacturing process is seldom addressed. This paper reports inventory control from the perspective of the manufacturing process by using statistical techniques including DMAIC, control charts, and statistical process control. The process stability is evaluated and the process capability is verified with the Lean Six Sigma philosophy. A demonstration in power meter production shows that inventory decreased from 25% to 0.4%, which indicates that inventory control can be achieved with the Lean Six Sigma philosophy and that inventory cost in production can be reduced for sustainable development in supply chain management.
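
    Process capability in a Lean Six Sigma study is usually summarized with the Cp and Cpk indices once the process has been shown to be in statistical control. The sketch below computes both against specification limits; the sample data and specification limits are hypothetical, not taken from the power meter case.

    ```python
    import numpy as np

    def process_capability(samples, lsl, usl):
        """Cp and Cpk for a measured characteristic against specification limits.

        samples -- measurements taken while the process is in statistical control
        lsl/usl -- lower and upper specification limits
        """
        x = np.asarray(samples, dtype=float)
        mu, sigma = x.mean(), x.std(ddof=1)
        cp = (usl - lsl) / (6 * sigma)                   # potential capability (spread only)
        cpk = min(usl - mu, mu - lsl) / (3 * sigma)      # actual capability, accounts for centring
        return cp, cpk

    # Dummy component dimension data with spec limits 9.8-10.2 (illustration only).
    rng = np.random.default_rng(2)
    cp, cpk = process_capability(rng.normal(10.02, 0.05, size=100), 9.8, 10.2)
    ```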

  20. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  1. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  2. An architecture for a brain-image database

    NASA Technical Reports Server (NTRS)

    Herskovits, E. H.

    2000-01-01

    The widespread availability of methods for noninvasive assessment of brain structure has enabled researchers to investigate neuroimaging correlates of normal aging, cerebrovascular disease, and other processes; we designate such studies as image-based clinical trials (IBCTs). We propose an architecture for a brain-image database, which integrates image processing and statistical operators, and thus supports the implementation and analysis of IBCTs. The implementation of this architecture is described and results from the analysis of image and clinical data from two IBCTs are presented. We expect that systems such as this will play a central role in the management and analysis of complex research data sets.

  3. Reproducible Computing: a new Technology for Statistics Education and Educational Research

    NASA Astrophysics Data System (ADS)

    Wessa, Patrick

    2009-05-01

    This paper explains how the R Framework (http://www.wessa.net) and a newly developed Compendium Platform (http://www.freestatistics.org) allow us to create, use, and maintain documents that contain empirical research results which can be recomputed and reused in derived work. It is illustrated that this technological innovation can be used to create educational applications that can be shown to support effective learning of statistics and associated analytical skills. It is explained how a Compendium can be created by anyone, without the need to understand the technicalities of scientific word processing (LaTeX) or statistical computing (R code). The proposed Reproducible Computing system allows educational researchers to objectively measure key aspects of the actual learning process based on individual and constructivist activities such as: peer review, collaboration in research, computational experimentation, etc. The system was implemented and tested in three statistics courses in which Compendia were used to create an interactive e-learning environment that simulated the real-world process of empirical scientific research.

  4. Texture analysis with statistical methods for wheat ear extraction

    NASA Astrophysics Data System (ADS)

    Bakhouche, M.; Cointault, F.; Gouton, P.

    2007-01-01

    In the agronomic domain, simplifying crop counting, which is necessary for yield prediction and agronomic studies, is an important project for technical institutes such as Arvalis. Although the main objective of our global project is to design a mobile robot for natural image acquisition directly in the field, Arvalis first proposed that we detect the number of wheat ears in images by image processing before counting them, which will provide the first component of the yield. In this paper we compare different texture image segmentation techniques based on feature extraction by first- and higher-order statistical methods applied to our images. The extracted features are used for unsupervised pixel classification to obtain the different classes in the image. The K-means algorithm is implemented before choosing a threshold to highlight the ears. Three methods have been tested in this feasibility study, with an average error of about 6%. Although the quality of the detection is currently evaluated visually, automatic evaluation algorithms are being implemented. Moreover, other higher-order statistical methods will be implemented in the future, jointly with methods based on spatio-frequential transforms and specific filtering.
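
    As a rough sketch of segmentation from first-order statistical texture features followed by K-means, the code below clusters per-pixel local mean and variance computed over a sliding window. The window size, number of classes, and synthetic test image are assumptions, and the paper's higher-order features and final thresholding step are omitted.

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter
    from sklearn.cluster import KMeans

    def first_order_texture_segmentation(gray, window=9, n_classes=3):
        """Segment a grayscale image by clustering first-order texture statistics.

        Features per pixel: local mean and local variance over a square window.
        """
        img = gray.astype(float)
        local_mean = uniform_filter(img, size=window)
        local_sq_mean = uniform_filter(img ** 2, size=window)
        local_var = np.maximum(local_sq_mean - local_mean ** 2, 0.0)

        features = np.stack([local_mean.ravel(), local_var.ravel()], axis=1)
        labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(features)
        return labels.reshape(img.shape)

    # Dummy image: smooth background with a noisier textured patch (illustration only).
    rng = np.random.default_rng(3)
    image = rng.normal(100, 2, size=(128, 128))
    image[40:80, 40:80] += rng.normal(0, 20, size=(40, 40))
    label_map = first_order_texture_segmentation(image)
    ```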

  5. Implementing and sustaining a hand hygiene culture change programme at Auckland District Health Board.

    PubMed

    Roberts, Sally A; Sieczkowski, Christine; Campbell, Taima; Balla, Greg; Keenan, Andrew

    2012-05-11

    In January 2009 Auckland District Health Board commenced implementation of the Hand Hygiene New Zealand (HHNZ) programme to bring about a culture change and to improve hand hygiene compliance by healthcare workers. We describe the implementation process and assess the effectiveness of this programme 36 months after implementation. In keeping with the HHNZ guideline the implementation was divided into five steps: roll-out and facility preparation, baseline evaluation, implementation, follow-up evaluation and sustainability. The process measure was improvement in hand hygiene compliance and the outcome measure was Staphylococcus aureus clinical infection and bacteraemia rates. The mean (95% CI; range) baseline compliance rates for the national reporting wards was 35% (95% CI 24-46%, 25-61%). The overall compliance by the 7th audit period was 60% (95% CI 46-74; range 47-91). All healthcare worker groups had improvement in compliance. The reduction in healthcare-associated S. aureus bacteraemia rates following the implementation was statistically significant (p=0.027). Compliance with hand hygiene improved following implementation of a culture change programme. Sustaining this improvement requires commitment and strong leadership at a senior level both nationally and within each District Health Board.

  6. A cross sectional study on nursing process implementation and associated factors among nurses working in selected hospitals of Central and Northwest zones, Tigray Region, Ethiopia.

    PubMed

    Baraki, Zeray; Girmay, Fiseha; Kidanu, Kalayou; Gerensea, Hadgu; Gezehgne, Dejen; Teklay, Hafte

    2017-01-01

    The nursing process is a systematic method of planning, delivering, and evaluating individualized care for clients in any state of health or illness. Many countries have adopted the nursing process as the standard of care to guide nursing practice; however, the problem is its implementation. If nurses fail to carry out the necessary nursing care through the nursing process, the effectiveness of patient progress may be compromised, which can lead to preventable adverse events. This study aimed to assess the implementation of the nursing process and associated factors among nurses working in selected hospitals of the central and northwest zones of Tigray, Ethiopia, 2015. A cross-sectional observational study design was utilized. Data were collected from 200 participants using a structured self-administered questionnaire which was contextually adapted from standardized, reliable and validated measures. The data were entered using Epi Info version 7 and analyzed using SPSS version 20 software. Data were summarized and described using descriptive statistics, and multivariate logistic regression was used to determine the relationship between the independent and dependent variables. Finally, data were presented in tables and graphs as frequencies and percentages of the different variables. Seventy (35%) of the participants implemented the nursing process. Several factors showed significant associations. Nurses who worked in a stressful workplace atmosphere were 99% less likely to implement the nursing process than nurses who worked in a very good atmosphere. Nurses with a BSc degree were 6.972 times more likely to implement the nursing process than those with a diploma qualification. Nurses without a consistent material supply for using the nursing process were 95.1% less likely to implement it than nurses with a consistent material supply. The majority of the participants were not implementing the nursing process properly. Many factors hinder them from applying the nursing process, of which level of education, knowledge of nurses, skill of nurses, workplace atmosphere, shortage of material supply to use the nursing process and high patient load were statistically significant in the association test.

  7. Novel physical constraints on implementation of computational processes

    NASA Astrophysics Data System (ADS)

    Wolpert, David; Kolchinsky, Artemy

    Non-equilibrium statistical physics permits us to analyze computational processes, i.e., ways to drive a physical system such that its coarse-grained dynamics implements some desired map. It is now known how to implement any such desired computation without dissipating work, and what the minimal (dissipationless) work is that such a computation will require (the so-called generalized Landauer bound). We consider how these analyses change if we impose realistic constraints on the computational process. First, we analyze how many degrees of freedom of the system must be controlled, in addition to the ones specifying the information-bearing degrees of freedom, in order to avoid dissipating work during a given computation, when local detailed balance holds. We analyze this issue for deterministic computations, deriving a state-space vs. speed trade-off, and use our results to motivate a measure of the complexity of a computation. Second, we consider computations that are implemented with logic circuits, in which only a small number of degrees of freedom are coupled at a time. We show that the way a computation is implemented using circuits affects its minimal work requirements, and relate these minimal work requirements to information-theoretic measures of complexity.

  8. Quantification and Statistical Analysis Methods for Vessel Wall Components from Stained Images with Masson's Trichrome

    PubMed Central

    Hernández-Morera, Pablo; Castaño-González, Irene; Travieso-González, Carlos M.; Mompeó-Corredera, Blanca; Ortega-Santana, Francisco

    2016-01-01

    Purpose To develop a digital image processing method to quantify structural components (smooth muscle fibers and extracellular matrix) in the vessel wall stained with Masson’s trichrome, and a statistical method suitable for small sample sizes to analyze the results previously obtained. Methods The quantification method comprises two stages. The pre-processing stage improves tissue image appearance and the vessel wall area is delimited. In the feature extraction stage, the vessel wall components are segmented by grouping pixels with a similar color. The area of each component is calculated by normalizing the number of pixels of each group by the vessel wall area. Statistical analyses are implemented by permutation tests, based on resampling without replacement from the set of the observed data to obtain a sampling distribution of an estimator. The implementation can be parallelized on a multicore machine to reduce execution time. Results The methods have been tested on 48 vessel wall samples of the internal saphenous vein stained with Masson’s trichrome. The results show that the segmented areas are consistent with the perception of a team of doctors and demonstrate good correlation between the expert judgments and the measured parameters for evaluating vessel wall changes. Conclusion The proposed methodology offers a powerful tool to quantify some components of the vessel wall. It is more objective, sensitive and accurate than the biochemical and qualitative methods traditionally used. The permutation tests are suitable statistical techniques to analyze the numerical measurements obtained when the underlying assumptions of the other statistical techniques are not met. PMID:26761643
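
    The permutation testing described above can be illustrated with a minimal two-group test on a difference in means, reshuffling group labels without replacement. The dummy area-fraction values are hypothetical, and the paper's parallelized multicore implementation is not reproduced here.

    ```python
    import numpy as np

    def permutation_test(group_a, group_b, n_perm=10000, seed=0):
        """Two-sided permutation test for a difference in group means.

        Labels are reshuffled without replacement; the p-value is the fraction of
        permutations whose absolute mean difference is at least the observed one.
        """
        rng = np.random.default_rng(seed)
        a, b = np.asarray(group_a, float), np.asarray(group_b, float)
        pooled = np.concatenate([a, b])
        observed = abs(a.mean() - b.mean())

        count = 0
        for _ in range(n_perm):
            perm = rng.permutation(pooled)
            diff = abs(perm[:len(a)].mean() - perm[len(a):].mean())
            count += diff >= observed
        return (count + 1) / (n_perm + 1)        # add-one correction keeps the test valid

    # Dummy smooth-muscle area fractions (%) in two vessel groups (illustration only).
    p_value = permutation_test([42.1, 39.5, 45.0, 41.2], [35.8, 37.2, 33.9, 36.5])
    ```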

  9. Network Meta-Analysis Using R: A Review of Currently Available Automated Packages

    PubMed Central

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA) – a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously – has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA. PMID:25541687

  10. Network meta-analysis using R: a review of currently available automated packages.

    PubMed

    Neupane, Binod; Richer, Danielle; Bonner, Ashley Joel; Kibret, Taddele; Beyene, Joseph

    2014-01-01

    Network meta-analysis (NMA)--a statistical technique that allows comparison of multiple treatments in the same meta-analysis simultaneously--has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and software tools for implementing the methods are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations using NMA with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three R packages, namely, gemtc, pcnetmeta, and netmeta, which are freely available software tools implemented in R. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R packages to implement depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools, are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools, and combined provide users with nearly all functionality that might be desired when conducting a NMA.

  11. A Comparative Analysis of Taguchi Methodology and Shainin System DoE in the Optimization of Injection Molding Process Parameters

    NASA Astrophysics Data System (ADS)

    Khavekar, Rajendra; Vasudevan, Hari, Dr.; Modi, Bhavik

    2017-08-01

    Two well-known Design of Experiments (DoE) methodologies, Taguchi Methods (TM) and the Shainin System (SS), are compared and analyzed in this study through their implementation in a plastic injection molding unit. Experiments were performed at a company manufacturing perfume bottle caps (made of acrylic material) using TM and SS to find the root causes of defects and to optimize the process parameters for minimum rejection. The experiments reduced the rejection rate from approximately 40% to 8.57% during trial runs, representing successful implementation of these DoE methods. The comparison showed that both methodologies identified the same set of variables as critical for defect reduction, but with a change in their order of significance. Taguchi Methods require more experiments and consume more time than the Shainin System. The Shainin System is less complicated and easier to implement, whereas Taguchi Methods are statistically more reliable for optimization of process parameters. Finally, the experimentation implied that DoE methods are robust and reliable in implementation as organizations attempt to improve quality through optimization.

  12. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida.

    DOT National Transportation Integrated Search

    2014-03-01

    Recent research in highway safety has focused on the more advanced and statistically proven techniques of highway : safety analysis. This project focuses on the two most recent safety analysis tools, the Highway Safety Manual (HSM) : and SafetyAnalys...

  13. TQL, A Case Study of Implementation into the Operational Fleet

    DTIC Science & Technology

    1992-06-18

    Methods, Poka-Yoke (mistake proofing of a process), Total Preventive Maintenance, and Group Technology and Quality Circles. All of these methods can be... Thomas, What Every Manager Should Know About Quality, 1991, Marcel Dekker, Inc. 9. Poka-Yoke, 1987, Productivity Press. B. STATISTICAL METHODS: 1

  14. Power through Struggle in Introductory Statistics

    ERIC Educational Resources Information Center

    Autin, Melanie; Bateiha, Summer; Marchionda, Hope

    2013-01-01

    Traditional classroom instruction consists of teacher-centered learning in which the instructor presents course material through lectures. A recent trend in higher education is the implementation of student-centered learning in which students take a more active role in the learning process. The purpose of this article is to describe the discomfort…

  15. The development of algorithms for the deployment of new version of GEM-detector-based acquisition system

    NASA Astrophysics Data System (ADS)

    Krawczyk, Rafał D.; Czarski, Tomasz; Kolasiński, Piotr; Linczuk, Paweł; Poźniak, Krzysztof T.; Chernyshova, Maryna; Kasprowicz, Grzegorz; Wojeński, Andrzej; Zabolotny, Wojciech; Zienkiewicz, Paweł

    2016-09-01

    This article is an overview of what has been implemented, in terms of post-processing algorithms, during the development and testing of the GEM-detector-based acquisition system. Information is given on mex functions for extended statistics collection, the unified hex topology, and an optimized S-DAQ algorithm for splitting overlapped signals. An additional discussion of bottlenecks and the major factors concerning optimization is presented.

  16. Statistically Modeling I-V Characteristics of CNT-FET with LASSO

    NASA Astrophysics Data System (ADS)

    Ma, Dongsheng; Ye, Zuochang; Wang, Yan

    2017-08-01

    With the advent of the internet of things (IoT), the need to study new materials and devices for various applications is increasing. Traditionally, we build compact models for transistors on the basis of physics, but physical models are expensive and need a very long time to adjust for non-ideal effects. When the vision for the application of a novel device is not certain or the manufacturing process is not mature, deriving generalized, accurate physical models for such devices is very strenuous, whereas statistical modeling is becoming a potential method because of its data-oriented nature and fast implementation. In this paper, one classical statistical regression method, LASSO, is used to model the I-V characteristics of a CNT-FET, and a pseudo-PMOS inverter simulation based on the trained model is implemented in Cadence. The normalized relative mean square prediction error of the trained model versus experimental sample data, together with the simulation results, shows that the model is acceptable for digital circuit static simulation. Such a modeling methodology can be extended to general devices.
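
    A minimal sketch of the general approach, LASSO regression on polynomial features of the terminal voltages to approximate drain current, is shown below using scikit-learn. The synthetic I-V data, feature degree, and penalty strength are assumptions and do not reproduce the paper's CNT-FET model.

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import PolynomialFeatures

    # Dummy I-V "measurements": drain current as a function of (Vgs, Vds).
    rng = np.random.default_rng(4)
    vgs = rng.uniform(0.0, 1.0, size=500)
    vds = rng.uniform(0.0, 1.0, size=500)
    ids = 1e-6 * (vgs ** 2) * np.tanh(3 * vds) + rng.normal(0, 1e-8, size=500)

    X = np.column_stack([vgs, vds])
    y = ids / 1e-6                                  # rescale so the target is O(1)

    # LASSO on polynomial features: the L1 penalty drives most coefficients to zero,
    # leaving a sparse, easily evaluated statistical I-V model.
    model = make_pipeline(PolynomialFeatures(degree=4), Lasso(alpha=1e-3, max_iter=50000))
    model.fit(X, y)
    rel_mse = np.mean((model.predict(X) - y) ** 2) / np.mean(y ** 2)
    ```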

  17. Random noise effects in pulse-mode digital multilayer neural networks.

    PubMed

    Kim, Y C; Shanblatt, M A

    1995-01-01

    A pulse-mode digital multilayer neural network (DMNN) based on stochastic computing techniques is implemented with simple logic gates as basic computing elements. The pulse-mode signal representation and the use of simple logic gates for neural operations lead to a massively parallel yet compact and flexible network architecture, well suited for VLSI implementation. Algebraic neural operations are replaced by stochastic processes using pseudorandom pulse sequences. The distributions of the results from the stochastic processes are approximated using the hypergeometric distribution. Synaptic weights and neuron states are represented as probabilities and estimated as average pulse occurrence rates in corresponding pulse sequences. A statistical model of the noise (error) is developed to estimate the relative accuracy associated with stochastic computing in terms of mean and variance. Computational differences are then explained by comparison to deterministic neural computations. DMNN feedforward architectures are modeled in VHDL using character recognition problems as testbeds. Computational accuracy is analyzed, and the results of the statistical model are compared with the actual simulation results. Experiments show that the calculations performed in the DMNN are more accurate than those anticipated when Bernoulli sequences are assumed, as is common in the literature. Furthermore, the statistical model successfully predicts the accuracy of the operations performed in the DMNN.
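
    In unipolar pulse-mode stochastic computing, a value is encoded as the probability of a pulse, so multiplying two values carried by independent streams reduces to a bitwise AND. The sketch below estimates such a product and its Bernoulli error statistics; the stream length and the example weight and activation values are illustrative, not the DMNN's parameters.

    ```python
    import numpy as np

    def to_pulse_stream(p, length, rng):
        """Encode a probability p as a Bernoulli pulse sequence of the given length."""
        return rng.random(length) < p

    rng = np.random.default_rng(5)
    length = 4096                       # pulses per value; longer streams give less noise
    w, x = 0.7, 0.4                     # synaptic weight and neuron activation as probabilities

    # Multiplication of independent unipolar streams is a bitwise AND.
    product_stream = to_pulse_stream(w, length, rng) & to_pulse_stream(x, length, rng)
    estimate = product_stream.mean()    # average pulse occurrence rate ~ w * x

    # Error statistics of the estimate (mean of Bernoulli pulses with p = w * x).
    p = w * x
    std_error = np.sqrt(p * (1 - p) / length)
    ```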

  18. A product of independent beta probabilities dose escalation design for dual-agent phase I trials.

    PubMed

    Mander, Adrian P; Sweeting, Michael J

    2015-04-15

    Dual-agent trials are now increasingly common in oncology research, and many proposed dose-escalation designs are available in the statistical literature. Despite this, the translation from statistical design to practical application is slow, as has been highlighted in single-agent phase I trials, where a 3 + 3 rule-based design is often still used. To expedite this process, new dose-escalation designs need to be not only scientifically beneficial but also easy to understand and implement by clinicians. In this paper, we propose a curve-free (nonparametric) design for a dual-agent trial in which the model parameters are the probabilities of toxicity at each of the dose combinations. We show that it is relatively trivial for a clinician's prior beliefs or historical information to be incorporated in the model and updating is fast and computationally simple through the use of conjugate Bayesian inference. Monotonicity is ensured by considering only a set of monotonic contours for the distribution of the maximum tolerated contour, which defines the dose-escalation decision process. Varied experimentation around the contour is achievable, and multiple dose combinations can be recommended to take forward to phase II. Code for R, Stata and Excel is available for implementation. © 2015 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
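
    To make the conjugate updating concrete, the sketch below performs a single Beta-binomial update of the toxicity probability at one hypothetical dose combination; the prior parameters and patient counts are invented, and the design's monotonic-contour escalation logic is not reproduced.

    ```python
    # Minimal sketch of the conjugate updating step only: the probability of
    # toxicity at one dose combination has a Beta prior, and binomial toxicity
    # outcomes update it in closed form.
    from scipy import stats

    a_prior, b_prior = 1.0, 3.0      # hypothetical prior: weak belief toxicity ~25%
    patients_treated = 6
    toxicities_seen = 1

    a_post = a_prior + toxicities_seen
    b_post = b_prior + patients_treated - toxicities_seen
    posterior = stats.beta(a_post, b_post)

    print(f"posterior mean toxicity risk: {posterior.mean():.3f}")
    print(f"95% credible interval: {posterior.ppf([0.025, 0.975]).round(3)}")
    print(f"P(toxicity > 0.33): {1 - posterior.cdf(0.33):.3f}")
    ```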

  19. Interventions to Improve Patient Safety During Intubation in the Neonatal Intensive Care Unit

    PubMed Central

    Grubb, Peter H.; Lea, Amanda S.; Walsh, William F.; Markham, Melinda H.; Maynord, Patrick O.; Whitney, Gina M.; Stark, Ann R.; Ely, E. Wesley

    2016-01-01

    OBJECTIVE: To improve patient safety in our NICU by decreasing the incidence of intubation-associated adverse events (AEs). METHODS: We sequentially implemented and tested 3 interventions: standardized checklist for intubation, premedication algorithm, and computerized provider order entry set for intubation. We compared baseline data collected over 10 months (period 1) with data collected over a 10-month intervention and sustainment period (period 2). Outcomes were the percentage of intubations containing any prospectively defined AE and intubations with bradycardia or hypoxemia. We followed process measures for each intervention. We used risk ratios (RRs) and statistical process control methods in a time series design to assess differences between the 2 periods. RESULTS: AEs occurred in 126/273 (46%) intubations during period 1 and 85/236 (36%) intubations during period 2 (RR = 0.78; 95% confidence interval [CI], 0.63–0.97). Significantly fewer intubations with bradycardia (24.2% vs 9.3%, RR = 0.39; 95% CI, 0.25–0.61) and hypoxemia (44.3% vs 33.1%, RR = 0.75, 95% CI 0.6–0.93) occurred during period 2. Using statistical process control methods, we identified 2 cases of special cause variation with a sustained decrease in AEs and bradycardia after implementation of our checklist. All process measures increased, reflecting sustained improvement throughout data collection. CONCLUSIONS: Our interventions resulted in a 10% absolute reduction in AEs that was sustained. Implementation of a standardized checklist for intubation made the greatest impact, with reductions in both AEs and bradycardia. PMID:27694281
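
    The reported unadjusted risk ratio for any AE can be reproduced from the published counts with a log-scale Wald interval, as in the sketch below; the exact interval method used by the authors is an assumption.

    ```python
    # Sketch of the unadjusted risk-ratio calculation for the reported counts
    # (126/273 AEs in period 1 vs 85/236 in period 2), using a log-scale Wald
    # interval; the paper's exact CI method may differ slightly.
    import math

    a, n1 = 85, 236     # period 2: intubations with an AE / total intubations
    b, n2 = 126, 273    # period 1 (baseline)

    rr = (a / n1) / (b / n2)
    se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - 1.96 * se_log_rr)
    hi = math.exp(math.log(rr) + 1.96 * se_log_rr)
    print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")   # ~0.78 (0.63-0.97)
    ```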

  20. Indigenous Mortality (Revealed): The Invisible Illuminated

    PubMed Central

    Ring, Ian; Arambula Solomon, Teshia G.; Gachupin, Francine C.; Smylie, Janet; Cutler, Tessa Louise; Waldon, John A.

    2015-01-01

    Inaccuracies in the identification of Indigenous status and the collection of and access to vital statistics data impede the strategic implementation of evidence-based public health initiatives to reduce avoidable deaths. The impact of colonization and subsequent government initiatives has been commonly observed among the Indigenous peoples of Australia, Canada, New Zealand, and the United States. The quality of the Indigenous data that inform mortality statistics is similarly connected to these distal processes, which began with colonization. We discuss the methodological and technical challenges in measuring mortality for Indigenous populations within a historical and political context, and identify strategies for the accurate ascertainment and inclusion of Indigenous people in mortality statistics. PMID:25211754

  1. Algorithm for Identifying Erroneous Rain-Gauge Readings

    NASA Technical Reports Server (NTRS)

    Rickman, Doug

    2005-01-01

    An algorithm analyzes rain-gauge data to identify statistical outliers that could be deemed to be erroneous readings. Heretofore, analyses of this type have been performed in burdensome manual procedures that have involved subjective judgements. Sometimes, the analyses have included computational assistance for detecting values falling outside of arbitrary limits. The analyses have been performed without statistically valid knowledge of the spatial and temporal variations of precipitation within rain events. In contrast, the present algorithm makes it possible to automate such an analysis, makes the analysis objective, takes account of the spatial distribution of rain gauges in conjunction with the statistical nature of spatial variations in rainfall readings, and minimizes the use of arbitrary criteria. The algorithm implements an iterative process that involves nonparametric statistics.

  2. Generation of Non-Homogeneous Poisson Processes by Thinning: Programming Considerations and Comparison with Competing Algorithms.

    DTIC Science & Technology

    1978-12-01

    Poisson processes. The method is valid for Poisson processes with any given intensity function. The basic thinning algorithm is modified to exploit several refinements which reduce computer execution time by approximately one-third. The basic and modified thinning programs are compared with the Poisson decomposition and gap-statistics algorithm, which is easily implemented for Poisson processes with intensity functions of the form exp(a₀ + a₁t + a₂t²). The thinning programs are competitive in both execution
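
    A minimal software sketch of the basic thinning idea for an intensity of this quadratic-exponential form is shown below; the coefficients and time horizon are illustrative, and the execution-time refinements studied in the report are not included.

    ```python
    # Basic thinning sketch for a non-homogeneous Poisson process with intensity
    # lambda(t) = exp(a0 + a1*t + a2*t^2) on [0, T].
    import math
    import random

    def thinned_nhpp(a0, a1, a2, T, seed=0):
        """Simulate event times on [0, T] by thinning a homogeneous process."""
        random.seed(seed)
        lam = lambda t: math.exp(a0 + a1 * t + a2 * t * t)
        # Dominating rate: the quadratic exponent attains its maximum on [0, T]
        # at an endpoint or at the vertex -a1/(2*a2) if that lies in the interval.
        candidates = [0.0, T]
        if a2 != 0.0:
            vertex = -a1 / (2.0 * a2)
            if 0.0 < vertex < T:
                candidates.append(vertex)
        lam_max = max(lam(t) for t in candidates)
        events, t = [], 0.0
        while True:
            t += random.expovariate(lam_max)          # candidate from rate-lam_max process
            if t > T:
                return events
            if random.random() <= lam(t) / lam_max:   # keep with probability lambda(t)/lam_max
                events.append(t)

    print(len(thinned_nhpp(a0=1.0, a1=0.5, a2=-0.02, T=10.0)))
    ```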

  3. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Technical Reports Server (NTRS)

    Raiman, Laura B.

    1992-01-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  4. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    NASA Astrophysics Data System (ADS)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  5. [Design and implementation of online statistical analysis function in information system of air pollution and health impact monitoring].

    PubMed

    Lü, Yiran; Hao, Shuxin; Zhang, Guoqing; Liu, Jie; Liu, Yue; Xu, Dongqun

    2018-01-01

    To implement an online statistical analysis function in the information system for air pollution and health impact monitoring, and to obtain data analysis results in real time, descriptive statistics, time-series analysis and multivariate regression analysis were implemented online on top of the database software using SQL and visualization tools. The system generates basic statistical tables and summary tables of air pollution exposure and health impact data online; generates trend charts for each part of the data online, with interactive connection to the database; and generates export sheets that can be loaded directly into R, SAS and SPSS. The information system for air pollution and health impact monitoring thus implements the online statistical analysis function and can provide real-time analysis results to its users.

  6. Competitive Processes in Cross-Situational Word Learning

    PubMed Central

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B.

    2013-01-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. But the information that learners pick up from these regularities is dependent on their learning mechanism. This paper investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input, and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales – both within and across trials/situations – learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is then considered from the perspective of a process-level understanding of cross-situational learning. PMID:23607610

  7. Competitive processes in cross-situational word learning.

    PubMed

    Yurovsky, Daniel; Yu, Chen; Smith, Linda B

    2013-07-01

    Cross-situational word learning, like any statistical learning problem, involves tracking the regularities in the environment. However, the information that learners pick up from these regularities is dependent on their learning mechanism. This article investigates the role of one type of mechanism in statistical word learning: competition. Competitive mechanisms would allow learners to find the signal in noisy input and would help to explain the speed with which learners succeed in statistical learning tasks. Because cross-situational word learning provides information at multiple scales-both within and across trials/situations-learners could implement competition at either or both of these scales. A series of four experiments demonstrate that cross-situational learning involves competition at both levels of scale, and that these mechanisms interact to support rapid learning. The impact of both of these mechanisms is considered from the perspective of a process-level understanding of cross-situational learning. Copyright © 2013 Cognitive Science Society, Inc.

  8. Statistical physics approach to earthquake occurrence and forecasting

    NASA Astrophysics Data System (ADS)

    de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio

    2016-04-01

    There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and allow one to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.

  9. Review of the patient positioning reproducibility in head-and-neck radiotherapy using Statistical Process Control.

    PubMed

    Moore, Sarah J; Herst, Patries M; Louwe, Robert J W

    2018-05-01

    A remarkable improvement in patient positioning was observed after the implementation of various process changes aiming to increase the consistency of patient positioning throughout the radiotherapy treatment chain. However, no tool was available to describe these changes over time in a standardised way. This study reports on the feasibility of Statistical Process Control (SPC) to highlight changes in patient positioning accuracy and facilitate correlation of these changes with the underlying process changes. Metrics were designed to quantify the systematic and random patient deformation as input for the SPC charts. These metrics were based on data obtained from multiple local ROI matches for 191 patients who were treated for head-and-neck cancer during the period 2011-2016. SPC highlighted a significant improvement in patient positioning that coincided with multiple intentional process changes. The observed improvements could be described as a combination of a reduction in outliers and a systematic improvement in the patient positioning accuracy of all patients. SPC is able to track changes in the reproducibility of patient positioning in head-and-neck radiation oncology, and distinguish between systematic and random process changes. Identification of process changes underlying these trends requires additional statistical analysis and seems only possible when the changes do not overlap in time. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. A Model Fit Statistic for Generalized Partial Credit Model

    ERIC Educational Resources Information Center

    Liang, Tie; Wells, Craig S.

    2009-01-01

    Investigating the fit of a parametric model is an important part of the measurement process when implementing item response theory (IRT), but research examining it is limited. A general nonparametric approach for detecting model misfit, introduced by J. Douglas and A. S. Cohen (2001), has exhibited promising results for the two-parameter logistic…

  11. Automatic Coding of Short Text Responses via Clustering in Educational Assessment

    ERIC Educational Resources Information Center

    Zehner, Fabian; Sälzer, Christine; Goldhammer, Frank

    2016-01-01

    Automatic coding of short text responses opens new doors in assessment. We implemented and integrated baseline methods of natural language processing and statistical modelling by means of software components that are available under open licenses. The accuracy of automatic text coding is demonstrated by using data collected in the "Programme…

  12. Improved processes for meeting the data requirements for implementing the Highway Safety Manual (HSM) and Safety Analyst in Florida : [summary].

    DOT National Transportation Integrated Search

    2014-03-01

    Similar to an ill patient, road safety issues can also be diagnosed, if the right tools are available. Statistics on roadway incidents can locate areas that have a high rate of incidents and require a solution, such as better signage, lightin...

  13. An introduction to Bayesian statistics in health psychology.

    PubMed

    Depaoli, Sarah; Rus, Holly M; Clifton, James P; van de Schoot, Rens; Tiemensma, Jitske

    2017-09-01

    The aim of the current article is to provide a brief introduction to Bayesian statistics within the field of health psychology. Bayesian methods are increasing in prevalence in applied fields, and they have been shown in simulation research to improve the estimation accuracy of structural equation models, latent growth curve (and mixture) models, and hierarchical linear models. Likewise, Bayesian methods can be used with small sample sizes since they do not rely on large sample theory. In this article, we discuss several important components of Bayesian statistics as they relate to health-based inquiries. We discuss the incorporation and impact of prior knowledge into the estimation process and the different components of the analysis that should be reported in an article. We present an example implementing Bayesian estimation in the context of blood pressure changes after participants experienced an acute stressor. We conclude with final thoughts on the implementation of Bayesian statistics in health psychology, including suggestions for reviewing Bayesian manuscripts and grant proposals. We have also included an extensive amount of online supplementary material to complement the content presented here, including Bayesian examples using many different software programmes and an extensive sensitivity analysis examining the impact of priors.
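
    As a toy counterpart to such an example, the sketch below performs a conjugate normal-normal update (known data variance) for the mean change in systolic blood pressure after a stressor; the prior, the assumed variance, and the sample values are all invented, and the article's own worked example uses different data and software.

    ```python
    # Toy conjugate normal-normal sketch (known data variance) for the mean change
    # in systolic blood pressure after an acute stressor. All numbers are invented
    # for illustration only.
    import math

    mu0, tau0 = 5.0, 10.0      # hypothetical prior: mean change 5 mmHg, sd 10
    sigma = 8.0                # assumed known within-sample sd of changes (mmHg)
    changes = [12.0, 9.0, 15.0, 7.0, 11.0, 14.0, 10.0, 8.0]   # fabricated sample

    n = len(changes)
    ybar = sum(changes) / n
    post_prec = 1 / tau0**2 + n / sigma**2            # posterior precision
    post_mean = (mu0 / tau0**2 + n * ybar / sigma**2) / post_prec
    post_sd = math.sqrt(1 / post_prec)

    print(f"posterior mean change: {post_mean:.1f} mmHg (sd {post_sd:.1f})")
    print(f"95% credible interval: "
          f"({post_mean - 1.96 * post_sd:.1f}, {post_mean + 1.96 * post_sd:.1f})")
    ```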

  14. Quantum interference in heterogeneous superconducting-photonic circuits on a silicon chip.

    PubMed

    Schuck, C; Guo, X; Fan, L; Ma, X; Poot, M; Tang, H X

    2016-01-21

    Quantum information processing holds great promise for communicating and computing data efficiently. However, scaling current photonic implementation approaches to larger system size remains an outstanding challenge for realizing disruptive quantum technology. Two main ingredients of quantum information processors are quantum interference and single-photon detectors. Here we develop a hybrid superconducting-photonic circuit system to show how these elements can be combined in a scalable fashion on a silicon chip. We demonstrate the suitability of this approach for integrated quantum optics by interfering and detecting photon pairs directly on the chip with waveguide-coupled single-photon detectors. Using a directional coupler implemented with silicon nitride nanophotonic waveguides, we observe 97% interference visibility when measuring photon statistics with two monolithically integrated superconducting single-photon detectors. The photonic circuit and detector fabrication processes are compatible with standard semiconductor thin-film technology, making it possible to implement more complex and larger scale quantum photonic circuits on silicon chips.

  15. Discrimination of dynamical system models for biological and chemical processes.

    PubMed

    Lorenz, Sönke; Diederichs, Elmar; Telgmann, Regina; Schütte, Christof

    2007-06-01

    In technical chemistry, systems biology and biotechnology, the construction of predictive models has become an essential step in process design and product optimization. Accurate modelling of the reactions requires detailed knowledge about the processes involved. However, when concerned with the development of new products and production techniques, for example, this knowledge is often not available due to the lack of experimental data. Thus, when one has to work with a selection of proposed models, a main task of early development is to discriminate among these models. In this article, a new statistical approach to model discrimination is described that ranks models with respect to the probability with which they reproduce the given data. The article introduces the new approach, discusses its statistical background, presents numerical techniques for its implementation and illustrates the application to examples from biokinetics.

  16. Automated speech understanding: the next generation

    NASA Astrophysics Data System (ADS)

    Picone, J.; Ebel, W. J.; Deshmukh, N.

    1995-04-01

    Modern speech understanding systems merge interdisciplinary technologies from Signal Processing, Pattern Recognition, Natural Language, and Linguistics into a unified statistical framework. These systems, which have applications in a wide range of signal processing problems, represent a revolution in Digital Signal Processing (DSP). Once a field dominated by vector-oriented processors and linear algebra-based mathematics, the current generation of DSP-based systems rely on sophisticated statistical models implemented using a complex software paradigm. Such systems are now capable of understanding continuous speech input for vocabularies of several thousand words in operational environments. The current generation of deployed systems, based on small vocabularies of isolated words, will soon be replaced by a new technology offering natural language access to vast information resources such as the Internet, and provide completely automated voice interfaces for mundane tasks such as travel planning and directory assistance.

  17. Statistical synthesis of contextual knowledge to increase the effectiveness of theory-based behaviour change interventions.

    PubMed

    Hanbury, Andria; Thompson, Carl; Mannion, Russell

    2011-07-01

    Tailored implementation strategies targeting health professionals' adoption of evidence-based recommendations are currently being developed. Research has focused on how to select an appropriate theoretical base, how to use that theoretical base to explore the local context, and how to translate theoretical constructs associated with the key factors found to influence innovation adoption into feasible and tailored implementation strategies. The reasons why an intervention is thought not to have worked are often cited as being: inappropriate choice of theoretical base; unsystematic development of the implementation strategies; and a poor evidence base to guide the process. One area of implementation research that is commonly overlooked is how to synthesize the data collected in a local context in order to identify what factors to target with the implementation strategies. This is suggested to be a critical process in the development of a theory-based intervention. The potential of multilevel modelling techniques to synthesize data collected at different hierarchical levels, for example, individual attitudes and team level variables, is discussed. Future research is needed to explore further the potential of multilevel modelling for synthesizing contextual data in implementation studies, as well as techniques for synthesizing qualitative and quantitative data.

  18. Severe postpartum haemorrhage after vaginal delivery: a statistical process control chart to report seven years of continuous quality improvement.

    PubMed

    Dupont, Corinne; Occelli, Pauline; Deneux-Tharaux, Catherine; Touzet, Sandrine; Duclos, Antoine; Bouvier-Colle, Marie-Hélène; Rudigoz, René-Charles; Huissoud, Cyril

    2014-07-01

    To use statistical process control charts to describe trends in the prevalence of severe postpartum haemorrhage after vaginal delivery, assessed 7 years after we initiated a continuous quality improvement programme that began with regular criteria-based audits. Observational descriptive study in a French maternity unit in the Rhône-Alpes region, with quarterly clinical audit meetings to analyse all cases of severe postpartum haemorrhage after vaginal delivery and to provide feedback on quality of care with statistical process control tools. The primary outcomes were the prevalence of severe PPH after vaginal delivery and its quarterly monitoring with a control chart. The secondary outcomes included the global quality of care for women with severe postpartum haemorrhage, including the performance rate of each recommended procedure. Differences in these variables between 2005 and 2012 were tested. From 2005 to 2012, the prevalence of severe postpartum haemorrhage declined significantly, from 1.2% to 0.6% of vaginal deliveries (p<0.001). Since 2010, the quarterly rate of severe PPH has not exceeded the upper control limit, that is, it has not been out of statistical control. The proportion of cases that were managed consistently with the guidelines increased for all of their main components. Implementation of continuous quality improvement efforts began seven years ago and used, among other tools, statistical process control charts. During this period, the prevalence of severe postpartum haemorrhage after vaginal delivery has been reduced by 50%. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
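
    A small sketch of the kind of p-chart used for this quarterly monitoring is given below; the quarterly counts are invented for illustration, and the centre line and control limits follow the standard p-chart formulas, which may differ in detail from the unit's implementation.

    ```python
    # Sketch of a p-chart for quarterly severe-PPH prevalence: the centre line is
    # the pooled proportion and the limits are p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n_i).
    import math

    # (severe PPH cases, vaginal deliveries) per quarter -- illustrative only
    quarters = [(9, 800), (11, 820), (6, 790), (8, 810), (5, 805), (4, 815)]

    total_cases = sum(c for c, _ in quarters)
    total_deliveries = sum(n for _, n in quarters)
    p_bar = total_cases / total_deliveries            # pooled proportion (centre line)

    for i, (cases, n) in enumerate(quarters, start=1):
        p = cases / n
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        ucl = p_bar + 3 * sigma
        lcl = max(0.0, p_bar - 3 * sigma)
        flag = "out of control" if (p > ucl or p < lcl) else "in control"
        print(f"Q{i}: p={p:.4f}  LCL={lcl:.4f}  UCL={ucl:.4f}  {flag}")
    ```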

  19. Systematic comparisons between PRISM version 1.0.0, BAP, and CSMIP ground-motion processing

    USGS Publications Warehouse

    Kalkan, Erol; Stephens, Christopher

    2017-02-23

    A series of benchmark tests was run by comparing results of the Processing and Review Interface for Strong Motion data (PRISM) software version 1.0.0 to Basic Strong-Motion Accelerogram Processing Software (BAP; Converse and Brady, 1992), and to California Strong Motion Instrumentation Program (CSMIP) processing (Shakal and others, 2003, 2004). These tests were performed by using the MATLAB implementation of PRISM, which is equivalent to its public release version in the Java language. Systematic comparisons were made in the time and frequency domains of records processed in PRISM and BAP, and in CSMIP, by using a set of representative input motions with varying resolutions, frequency content, and amplitudes. Although the details of strong-motion records vary among the processing procedures, there are only minor differences among the waveforms for each component and within the frequency passband common to these procedures. A comprehensive statistical evaluation considering more than 1,800 ground-motion components demonstrates that differences in peak amplitudes of acceleration, velocity, and displacement time series obtained from PRISM and CSMIP processing are equal to or less than 4 percent for 99 percent of the data, and equal to or less than 2 percent for 96 percent of the data. Other statistical measures, including the Euclidean distance (L2 norm) and the windowed root mean square level of processed time series, also indicate that both processing schemes produce statistically similar products.
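
    As an illustration of the comparison metrics named above (peak-amplitude differences, the Euclidean L2 distance, and windowed RMS levels), the sketch below applies them to two synthetic, time-aligned traces; it is not the PRISM, BAP, or CSMIP code.

    ```python
    # Sketch of the comparison metrics applied to two synthetic, time-aligned
    # acceleration traces standing in for two processing schemes.
    import numpy as np

    rng = np.random.default_rng(2)
    t = np.linspace(0, 20, 2000)
    series_a = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.1 * t)   # reference trace
    series_b = series_a + rng.normal(0, 0.01, t.size)           # alternative processing

    peak_a = np.abs(series_a).max()
    peak_b = np.abs(series_b).max()
    peak_diff_pct = abs(peak_b - peak_a) / peak_a * 100         # peak amplitude difference

    l2_distance = np.linalg.norm(series_b - series_a)           # Euclidean (L2) distance

    def rms(x):
        return np.sqrt(np.mean(x ** 2))

    window = 200                                                # samples per window
    windowed_rms_diff = [
        abs(rms(series_b[i:i + window]) - rms(series_a[i:i + window]))
        for i in range(0, t.size - window + 1, window)
    ]

    print(f"peak amplitude difference: {peak_diff_pct:.2f}%")
    print(f"L2 distance: {l2_distance:.3f}")
    print(f"max windowed RMS difference: {max(windowed_rms_diff):.4f}")
    ```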

  20. Distributed Sensing and Processing for Multi-Camera Networks

    NASA Astrophysics Data System (ADS)

    Sankaranarayanan, Aswin C.; Chellappa, Rama; Baraniuk, Richard G.

    Sensor networks with large numbers of cameras are becoming increasingly prevalent in a wide range of applications, including video conferencing, motion capture, surveillance, and clinical diagnostics. In this chapter, we identify some of the fundamental challenges in designing such systems: robust statistical inference, computational efficiency, and opportunistic and parsimonious sensing. We show that the geometric constraints induced by the imaging process are extremely useful for identifying and designing optimal estimators for object detection and tracking tasks. We also derive pipelined and parallelized implementations of popular tools used for statistical inference in non-linear systems, of which multi-camera systems are examples. Finally, we highlight the use of the emerging theory of compressive sensing in reducing the amount of data sensed and communicated by a camera network.

  1. A fast exact simulation method for a class of Markov jump processes.

    PubMed

    Li, Yao; Hu, Lili

    2015-11-14

    A new method of the stochastic simulation algorithm (SSA), named the Hashing-Leaping method (HLM), for exact simulations of a class of Markov jump processes, is presented in this paper. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly implement a hash-table-like bucket sort algorithm for all times of occurrence covered by a time step with length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large scale problems.
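
    The toy sketch below illustrates only the hash-table-like bucketing of clock firing times into steps of length τ that the abstract describes; the rates, bucket width, and event handling are placeholders, and the published HLM's leaping and rejection logic is not reproduced.

    ```python
    # Toy illustration of the bucketing idea only: candidate firing times of many
    # exponential clocks are dropped into hash-map buckets keyed by floor(t / tau),
    # and each bucket is then sorted and processed in time order.
    import random
    from collections import defaultdict

    random.seed(3)
    rates = [random.uniform(0.1, 2.0) for _ in range(1000)]   # one rate per clock
    tau = 0.5                                                  # bucket (time step) width

    # Draw the first firing time of every clock and bucket it by time step.
    buckets = defaultdict(list)
    for clock_id, rate in enumerate(rates):
        t = random.expovariate(rate)
        buckets[int(t // tau)].append((t, clock_id))

    # Process buckets in increasing time order; within a bucket, sort the few
    # events it contains instead of maintaining one global priority queue.
    for key in sorted(buckets):
        for t, clock_id in sorted(buckets[key]):
            pass  # handle the event of clock `clock_id` occurring at time t

    print(f"{sum(len(v) for v in buckets.values())} events in {len(buckets)} buckets")
    ```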

  2. Implementation of Discovery Projects in Statistics

    ERIC Educational Resources Information Center

    Bailey, Brad; Spence, Dianna J.; Sinn, Robb

    2013-01-01

    Researchers and statistics educators consistently suggest that students will learn statistics more effectively by conducting projects through which they actively engage in a broad spectrum of tasks integral to statistical inquiry, in the authentic context of a real-world application. In keeping with these findings, we share an implementation of…

  3. Implementation of a community-based secondhand smoke reduction intervention for caregivers of urban children with asthma: process evaluation, successes and challenges

    PubMed Central

    Blaakman, Susan; Tremblay, Paul J.; Halterman, Jill S.; Fagnano, Maria; Borrelli, Belinda

    2013-01-01

    Many children, including those with asthma, remain exposed to secondhand smoke. This manuscript evaluates the process of implementing a secondhand smoke reduction counseling intervention using motivational interviewing (MI) for caregivers of urban children with asthma, including reach, dose delivered, dose received and fidelity. Challenges, strategies and successes in applying MI are highlighted. Data for 140 children (3–10 years) enrolled in the School Based Asthma Therapy trial, randomized to the treatment condition and living with one or more smokers, were analyzed. Summary statistics describe the sample, process measures related to intervention implementation, and primary caregiver (PCG) satisfaction with the intervention. The full intervention was completed by 79% of PCGs, but only 17% of other smoking caregivers. Nearly all (98%) PCGs were satisfied with the care study nurses provided and felt the program might be helpful to others. Despite challenges, this intervention was feasible and well received, reaching caregivers who were not actively seeking treatment for smoking cessation or secondhand smoke reduction. Anticipating the strategies required to implement such an intervention may help promote participant engagement and retention to enhance the program's ultimate success. PMID:22717938

  4. Clinical audit of diabetes management can improve the quality of care in a resource-limited primary care setting.

    PubMed

    Govender, Indira; Ehrlich, Rodney; Van Vuuren, Unita; De Vries, Elma; Namane, Mosedi; De Sa, Angela; Murie, Katy; Schlemmer, Arina; Govender, Strini; Isaacs, Abdul; Martell, Rob

    2012-12-01

    To determine whether clinical audit improved the performance of diabetic clinical processes in the health district in which it was implemented. Patient folders were systematically sampled annually for review. Primary health-care facilities in the Metro health district of the Western Cape Province in South Africa. Health-care workers involved in diabetes management. Clinical audit and feedback. The Skillings-Mack test was applied to median values of pooled audit results for nine diabetic clinical processes to measure whether there were statistically significant differences between annual audits performed in 2005, 2007, 2008 and 2009. Descriptive statistics were used to illustrate the order of values per process. A total of 40 community health centres participated in the baseline audit of 2005 that decreased to 30 in 2009. Except for two routine processes, baseline medians for six out of nine processes were below 50%. Pooled audit results showed statistically significant improvements in seven out of nine clinical processes. The findings indicate an association between the application of clinical audit and quality improvement in resource-limited settings. Co-interventions introduced after the baseline audit are likely to have contributed to improved outcomes. In addition, support from the relevant government health programmes and commitment of managers and frontline staff contributed to the audit's success.

  5. Putting Meaning Back Into the Mean: A Comment on the Misuse of Elementary Statistics in a Sample of Manuscripts Submitted to Clinical Therapeutics.

    PubMed

    Forrester, Janet E

    2015-12-01

    Errors in the statistical presentation and analyses of data in the medical literature remain common despite efforts to improve the review process, including the creation of guidelines for authors and the use of statistical reviewers. This article discusses common elementary statistical errors seen in manuscripts recently submitted to Clinical Therapeutics and describes some ways in which authors and reviewers can identify errors and thus correct them before publication. A nonsystematic sample of manuscripts submitted to Clinical Therapeutics over the past year was examined for elementary statistical errors. Clinical Therapeutics has many of the same errors that reportedly exist in other journals. Authors require additional guidance to avoid elementary statistical errors and incentives to use the guidance. Implementation of reporting guidelines for authors and reviewers by journals such as Clinical Therapeutics may be a good approach to reduce the rate of statistical errors. Copyright © 2015 Elsevier HS Journals, Inc. All rights reserved.

  6. Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.

    2014-11-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS Package for Observation Processing (KPOP) system for data assimilation, preprocessing and quality control modules for bending angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, physical values for the Earth radius of curvature, and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research (NCAR) Community Atmosphere Model-Spectral Element (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS-LETKF data assimilation system, which has been successfully implemented for a cubed-sphere model with fully unstructured quadrilateral meshes. As a result of data processing, the bending angle departure statistics between observation and background show significant improvement. Also, the first experiment assimilating GPS-RO bending angles from KPOP within KIAPS-LETKF shows encouraging results.

  7. Combat Ration Advanced Manufacturing Technology Demonstration (CRAMTD). ’Generic Inspection-Statistical Process Control System for a Combat Ration Manufacturing Facility’. Short Term Project (STP) Number 3.

    DTIC Science & Technology

    1996-01-01

    failure as due to an adhesive layer between the foil and inner polypropylene layers. Under subcontract, NFPA provided HACCP draft manuals for the...parameters of the production process and to ensure that they are within their target values. In addition, a HACCP program was used to assure product...played an important part in implementing Hazard Analysis Critical Control Points (HACCP) as part of the Process and Quality Control manual. The National

  8. NATIONAL WATER INFORMATION SYSTEM OF THE U. S. GEOLOGICAL SURVEY.

    USGS Publications Warehouse

    Edwards, Melvin D.

    1985-01-01

    National Water Information System (NWIS) has been designed as an interactive, distributed data system. It will integrate the existing, diverse data-processing systems into a common system. It will also provide easier, more flexible use as well as more convenient access and expanded computing, dissemination, and data-analysis capabilities. The NWIS is being implemented as part of a Distributed Information System (DIS) being developed by the Survey's Water Resources Division. The NWIS will be implemented on each node of the distributed network for the local processing, storage, and dissemination of hydrologic data collected within the node's area of responsibility. The processor at each node will also be used to perform hydrologic modeling, statistical data analysis, text editing, and some administrative work.

  9. Semivariogram Analysis of Bone Images Implemented on FPGA Architectures.

    PubMed

    Shirvaikar, Mukul; Lagadapati, Yamuna; Dong, Xuanliang

    2017-03-01

    Osteoporotic fractures are a major concern for the healthcare of elderly and female populations. Early diagnosis of patients with a high risk of osteoporotic fractures can be enhanced by introducing second-order statistical analysis of bone image data using techniques such as variogram analysis. Such analysis is computationally intensive, thereby creating an impediment for introduction into imaging machines found in common clinical settings. This paper investigates the fast implementation of the semivariogram algorithm, which has been proven to be effective in modeling bone strength, and should be of interest to readers in the areas of computer-aided diagnosis and quantitative image analysis. The semivariogram is a statistical measure of the spatial distribution of data, and is based on Markov Random Fields (MRFs). Semivariogram analysis is a computationally intensive algorithm that has typically seen applications in the geosciences and remote sensing areas. Recently, applications in the area of medical imaging have been investigated, resulting in the need for efficient real time implementation of the algorithm. A semi-variance, γ(h), is defined as half of the expected squared difference of pixel values between any two data locations with a lag distance of h. Due to the need to examine each pair of pixels in the image or sub-image being processed, the base algorithm complexity for an image window with n pixels is O(n²). Field Programmable Gate Arrays (FPGAs) are an attractive solution for such demanding applications due to their parallel processing capability. FPGAs also tend to operate at relatively modest clock rates measured in a few hundreds of megahertz. This paper presents a technique for the fast computation of the semivariogram using two custom FPGA architectures. A modular architecture approach is chosen to allow for replication of processing units. This allows for high throughput due to concurrent processing of pixel pairs. The current implementation is focused on isotropic semivariogram computations only. The algorithm is benchmarked using VHDL on a Xilinx XUPV5-LX110T development kit, which utilizes the Virtex5 FPGA. Medical image data from DXA scans are utilized for the experiments. Implementation results show that a significant advantage in computational speed is attained by the architectures with respect to implementation on a personal computer with an Intel i7 multi-core processor.
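
    For a software reference point, the sketch below computes the isotropic semivariance γ(h) of a small random patch by brute force over all pixel pairs (the O(n²) base algorithm that the FPGA architectures accelerate); the patch is a stand-in rather than DXA data.

    ```python
    # Reference sketch of the isotropic semivariance gamma(h) of a 2-D window:
    # half the mean squared pixel difference over all pairs whose distance rounds
    # to lag h, computed brute force over O(n^2) pairs.
    import numpy as np

    def isotropic_semivariogram(window, max_lag):
        """Return gamma(h) for h = 1..max_lag of a 2-D pixel window."""
        rows, cols = window.shape
        coords = np.array([(r, c) for r in range(rows) for c in range(cols)], dtype=float)
        values = window.ravel().astype(float)
        sums = np.zeros(max_lag + 1)
        counts = np.zeros(max_lag + 1)
        for i in range(len(values) - 1):               # every pixel pair (i, j), j > i
            diffs = coords[i] - coords[i + 1:]
            lags = np.rint(np.hypot(diffs[:, 0], diffs[:, 1])).astype(int)
            sq = (values[i] - values[i + 1:]) ** 2
            for h in range(1, max_lag + 1):
                mask = lags == h
                sums[h] += sq[mask].sum()
                counts[h] += mask.sum()
        return 0.5 * sums[1:] / counts[1:]             # gamma(h), h = 1..max_lag

    rng = np.random.default_rng(4)
    patch = rng.integers(0, 256, size=(16, 16))        # stand-in for a DXA sub-image window
    print(np.round(isotropic_semivariogram(patch, max_lag=5), 1))
    ```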

  10. Semivariogram Analysis of Bone Images Implemented on FPGA Architectures

    PubMed Central

    Shirvaikar, Mukul; Lagadapati, Yamuna; Dong, Xuanliang

    2016-01-01

    Osteoporotic fractures are a major concern for the healthcare of elderly and female populations. Early diagnosis of patients with a high risk of osteoporotic fractures can be enhanced by introducing second-order statistical analysis of bone image data using techniques such as variogram analysis. Such analysis is computationally intensive, thereby creating an impediment for introduction into imaging machines found in common clinical settings. This paper investigates the fast implementation of the semivariogram algorithm, which has been proven to be effective in modeling bone strength, and should be of interest to readers in the areas of computer-aided diagnosis and quantitative image analysis. The semivariogram is a statistical measure of the spatial distribution of data, and is based on Markov Random Fields (MRFs). Semivariogram analysis is a computationally intensive algorithm that has typically seen applications in the geosciences and remote sensing areas. Recently, applications in the area of medical imaging have been investigated, resulting in the need for efficient real time implementation of the algorithm. A semi-variance, γ(h), is defined as half of the expected squared difference of pixel values between any two data locations with a lag distance of h. Due to the need to examine each pair of pixels in the image or sub-image being processed, the base algorithm complexity for an image window with n pixels is O(n²). Field Programmable Gate Arrays (FPGAs) are an attractive solution for such demanding applications due to their parallel processing capability. FPGAs also tend to operate at relatively modest clock rates measured in a few hundreds of megahertz. This paper presents a technique for the fast computation of the semivariogram using two custom FPGA architectures. A modular architecture approach is chosen to allow for replication of processing units. This allows for high throughput due to concurrent processing of pixel pairs. The current implementation is focused on isotropic semivariogram computations only. The algorithm is benchmarked using VHDL on a Xilinx XUPV5-LX110T development kit, which utilizes the Virtex5 FPGA. Medical image data from DXA scans are utilized for the experiments. Implementation results show that a significant advantage in computational speed is attained by the architectures with respect to implementation on a personal computer with an Intel i7 multi-core processor. PMID:28428829

  11. Implementation of quality by design principles in the development of microsponges as drug delivery carriers: Identification and optimization of critical factors using multivariate statistical analyses and design of experiments studies.

    PubMed

    Simonoska Crcarevska, Maja; Dimitrovska, Aneta; Sibinovska, Nadica; Mladenovska, Kristina; Slavevska Raicki, Renata; Glavas Dodov, Marija

    2015-07-15

    Microsponges drug delivery system (MDDC) was prepared by double emulsion-solvent-diffusion technique using rotor-stator homogenization. Quality by design (QbD) concept was implemented for the development of MDDC with potential to be incorporated into semisolid dosage form (gel). Quality target product profile (QTPP) and critical quality attributes (CQA) were defined and identified, accordingly. Critical material attributes (CMA) and Critical process parameters (CPP) were identified using quality risk management (QRM) tool, failure mode, effects and criticality analysis (FMECA). CMA and CPP were identified based on results obtained from principal component analysis (PCA-X&Y) and partial least squares (PLS) statistical analysis along with literature data, product and process knowledge and understanding. FMECA identified amount of ethylcellulose, chitosan, acetone, dichloromethane, span 80, tween 80 and water ratio in primary/multiple emulsions as CMA and rotation speed and stirrer type used for organic solvent removal as CPP. The relationship between identified CPP and particle size as CQA was described in the design space using design of experiments - one-factor response surface method. Obtained results from statistically designed experiments enabled establishment of mathematical models and equations that were used for detailed characterization of influence of identified CPP upon MDDC particle size and particle size distribution and their subsequent optimization. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. The Triangulation Algorithmic: A Transformative Function for Designing and Deploying Effective Educational Technology Assessment Instruments

    ERIC Educational Resources Information Center

    Osler, James Edward

    2013-01-01

    This paper discusses the implementation of the Tri-Squared Test as an advanced statistical measure used to verify and validate the research outcomes of Educational Technology software. A mathematical and epistemological rational is provided for the transformative process of qualitative data into quantitative outcomes through the Tri-Squared Test…

  13. Comparing Data Input Requirements of Statistical vs. Process-based Watershed Models Applied for Prediction of Fecal Indicator and Pathogen Levels in Recreational Beaches

    EPA Science Inventory

    Same day prediction of fecal indicator bacteria (FIB) concentrations and bather protection from the risk of exposure to pathogens are two important goals of implementing a modeling program at recreational beaches. Sampling efforts for modelling applications can be expensive and t...

  14. Human Resource Information System implementation readiness in the Ethiopian health sector: a cross-sectional study.

    PubMed

    Dilu, Eyilachew; Gebreslassie, Measho; Kebede, Mihiretu

    2017-12-20

    Health workforce information systems in low-income countries tend to be defective with poor relationship to information sources. Human Resource Information System (HRIS) is currently in a pilot implementation phase in the Federal Ministry of Health and Regional Health Bureaus of Ethiopia. Before scaling up the implementation, it is important to understand the implementation readiness of hospitals and health departments. The aims of this study were to assess the readiness for HRIS implementation, identify associated factors, and explore the implementation challenges in public hospitals and health departments of the Amhara National Regional State, Ethiopia. An institution-based cross-sectional study supplemented with a qualitative study was conducted from the 15th of February to the 30th of March 2016 in 19 public hospitals and health departments of the Amhara National Regional State, Ethiopia. A self-administered questionnaire was used to collect the data. The questionnaire includes items on socio-demographic characteristics and questions measuring technical, personal, and organizational factors adapted from the 32-item questionnaire of the Management Science for Health (MSH) HRIS readiness assessment tool. The data were entered and analyzed with statistical software. Descriptive statistics and bivariate and multivariable logistic regression analyses were performed. Odds ratios with 95% confidence interval were computed to identify the factors statistically associated with readiness of HRIS implementation. In-depth interviews and observation checklists were used to collect qualitative data. Thematic content analysis was used to analyze the qualitative data. A total of 246 human resource (HR) employees and 16 key informants have been included in the study. The HR employee's level of readiness for HRIS implementation in this study was 35.8%. Employee's Internet access (AOR = 2.59, 95%CI = 1.19, 5.62), availability of separate HR section (AOR = 8.08, 95%CI = 3.69, 17.70), basic computer skills (AOR = 6.74, 95%CI = 2.75, 16.56), and fear of unemployment (AOR = 2.83, 95%CI = 1.27, 6.32) were associated with readiness of HRIS implementation. Poor logistic supply, lack of competency, poor commitment, and shortage of finance were the challenges of HRIS implementation. In this study, readiness of HRIS implementation was low. Strategies targeting to improve skills, awareness, and attitude of HR employees would facilitate the implementation process.

  15. Uncertainty quantification in structural health monitoring: Applications on cultural heritage buildings

    NASA Astrophysics Data System (ADS)

    Lorenzoni, Filippo; Casarin, Filippo; Caldon, Mauro; Islami, Kleidi; Modena, Claudio

    2016-01-01

    In the last decades the need for an effective seismic protection and vulnerability reduction of cultural heritage buildings and sites determined a growing interest in structural health monitoring (SHM) as a knowledge-based assessment tool to quantify and reduce uncertainties regarding their structural performance. Monitoring can be successfully implemented in some cases as an alternative to interventions or to control the medium- and long-term effectiveness of already applied strengthening solutions. The research group at the University of Padua, in collaboration with public administrations, has recently installed several SHM systems on heritage structures. The paper reports the application of monitoring strategies implemented to avoid (or at least minimize) the execution of strengthening interventions/repairs and control the response as long as a clear worsening or damaging process is detected. Two emblematic case studies are presented and discussed: the Roman Amphitheatre (Arena) of Verona and the Conegliano Cathedral. Both are excellent examples of on-going monitoring activities, performed through static and dynamic approaches in combination with automated procedures to extract meaningful structural features from collected data. In parallel to the application of innovative monitoring techniques, statistical models and data processing algorithms have been developed and applied in order to reduce uncertainties and exploit monitoring results for an effective assessment and protection of historical constructions. Processing software for SHM was implemented to perform the continuous real time treatment of static data and the identification of modal parameters based on the structural response to ambient vibrations. Statistical models were also developed to filter out the environmental effects and thermal cycles from the extracted features.

  16. Experimental statistical signature of many-body quantum interference

    NASA Astrophysics Data System (ADS)

    Giordani, Taira; Flamini, Fulvio; Pompili, Matteo; Viggianiello, Niko; Spagnolo, Nicolò; Crespi, Andrea; Osellame, Roberto; Wiebe, Nathan; Walschaers, Mattia; Buchleitner, Andreas; Sciarrino, Fabio

    2018-03-01

    Multi-particle interference is an essential ingredient for fundamental quantum mechanics phenomena and for quantum information processing to provide a computational advantage, as recently emphasized by boson sampling experiments. Hence, developing a reliable and efficient technique to witness its presence is pivotal in achieving the practical implementation of quantum technologies. Here, we experimentally identify genuine many-body quantum interference via a recent efficient protocol, which exploits statistical signatures at the output of a multimode quantum device. We successfully apply the test to validate three-photon experiments in an integrated photonic circuit, providing an extensive analysis on the resources required to perform it. Moreover, drawing upon established techniques of machine learning, we show how such tools help to identify the—a priori unknown—optimal features to witness these signatures. Our results provide evidence on the efficacy and feasibility of the method, paving the way for its adoption in large-scale implementations.

  17. Implementation strategies to promote community-engaged efforts to counter tobacco marketing at the point of sale.

    PubMed

    Leeman, Jennifer; Myers, Allison; Grant, Jennifer C; Wangen, Mary; Queen, Tara L

    2017-09-01

    The US tobacco industry spends $8.2 billion annually on marketing at the point of sale (POS), a practice known to increase tobacco use. Evidence-based policy interventions (EBPIs) are available to reduce exposure to POS marketing, and nationwide, states are funding community-based tobacco control partnerships to promote local enactment of these EBPIs. Little is known, however, about what implementation strategies best support community partnerships' success enacting EBPI. Guided by Kingdon's theory of policy change, Counter Tools provides tools, training, and other implementation strategies to support community partnerships' performance of five core policy change processes: document local problem, formulate policy solutions, engage partners, raise awareness of problems and solutions, and persuade decision makers to enact new policy. We assessed Counter Tools' impact at 1 year on (1) partnership coordinators' self-efficacy, (2) partnerships' performance of core policy change processes, (3) community progress toward EBPI enactment, and (4) salient contextual factors. Counter Tools provided implementation strategies to 30 partnerships. Data on self-efficacy were collected using a pre-post survey. Structured interviews assessed performance of core policy change processes. Data also were collected on progress toward EBPI enactment and contextual factors. Analysis included descriptive and bivariate statistics and content analysis. Following 1-year exposure to implementation strategies, coordinators' self-efficacy increased significantly. Partnerships completed the greatest proportion of activities within the "engage partners" and "document local problem" core processes. Communities made only limited progress toward policy enactment. Findings can inform delivery of implementation strategies and tests of their effects on community-level efforts to enact EBPIs.

  18. Multiscale Structure of UXO Site Characterization: Spatial Estimation and Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ostrouchov, George; Doll, William E.; Beard, Les P.

    2009-01-01

    Unexploded ordnance (UXO) site characterization must consider both how the contamination is generated and how we observe that contamination. Within the generation and observation processes, dependence structures can be exploited at multiple scales. We describe a conceptual site characterization process, the dependence structures available at several scales, and consider their statistical estimation aspects. It is evident that most of the statistical methods that are needed to address the estimation problems are known but their application-specific implementation may not be available. We demonstrate estimation at one scale and propose a representation for site contamination intensity that takes full account of uncertainty, is flexible enough to answer regulatory requirements, and is a practical tool for managing detailed spatial site characterization and remediation. The representation is based on point process spatial estimation methods that require modern computational resources for practical application. These methods have provisions for including prior and covariate information.

  19. Design and Production of Color Calibration Targets for Digital Input Devices

    DTIC Science & Technology

    2000-07-01

    gamuts. Fourth, the color transform from CIELCH to sRGB will be described. Fifth, the relevant target mockups will be created. Sixth, the quality will be... Implement statistical process controls; print, process and measure; reject; transfer the measured CIEXYZ of the target patches to sRGB; generate... Kodak Royal VII paper and sRGB. This plot shows all points on the a*-b* plane without information about the L*. The sRGB color gamut is obtained from

  20. Environmental flow allocation and statistics calculator

    USGS Publications Warehouse

    Konrad, Christopher P.

    2011-01-01

    The Environmental Flow Allocation and Statistics Calculator (EFASC) is a computer program that calculates hydrologic statistics based on a time series of daily streamflow values. EFASC will calculate statistics for daily streamflow in an input file or will generate synthetic daily flow series from an input file based on rules for allocating and protecting streamflow and then calculate statistics for the synthetic time series. The program reads dates and daily streamflow values from input files. The program writes statistics out to a series of worksheets and text files. Multiple sites can be processed in series as one run. EFASC is written in Microsoft Visual Basic for Applications and implemented as a macro in Microsoft Office Excel 2007. EFASC is intended as a research tool for users familiar with computer programming. The code for EFASC is provided so that it can be modified for specific applications. All users should review how output statistics are calculated and recognize that the algorithms may not comply with conventions used to calculate streamflow statistics published by the U.S. Geological Survey.
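    As a rough illustration of the kind of daily-flow statistics such a tool computes, the sketch below derives a few common summary measures from a date/flow series. It is not the EFASC Visual Basic code; the CSV file name and its "date"/"flow_cfs" columns are hypothetical.

```python
# Minimal sketch (not EFASC itself): a few common daily-flow statistics
# computed from a hypothetical two-column CSV with "date" and "flow_cfs".
import csv
from statistics import mean, median

def read_daily_flows(path):
    """Return a chronological list of (date_string, flow) tuples."""
    flows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            flows.append((row["date"], float(row["flow_cfs"])))
    return flows

def flow_statistics(flows):
    """Compute simple summary statistics of the daily series."""
    values = sorted(q for _, q in flows)
    n = len(values)
    return {
        "mean": mean(values),
        "median": median(values),
        "q10": values[int(0.10 * (n - 1))],   # simple low-flow percentile
        "q90": values[int(0.90 * (n - 1))],   # simple high-flow percentile
        # minimum 7-day average flow, if the record is long enough
        "min_7day": min(mean(v for _, v in flows[i:i + 7])
                        for i in range(n - 6)) if n >= 7 else None,
    }

if __name__ == "__main__":
    print(flow_statistics(read_daily_flows("daily_flow.csv")))
```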

  1. A fast exact simulation method for a class of Markov jump processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Yao, E-mail: yaoli@math.umass.edu; Hu, Lili, E-mail: lilyhu86@gmail.com

    2015-11-14

    This paper presents a new stochastic simulation algorithm (SSA), the Hashing-Leaping method (HLM), for exact simulation of a class of Markov jump processes. The HLM has a conditional constant computational cost per event, which is independent of the number of exponential clocks in the Markov process. The main idea of the HLM is to repeatedly apply a hash-table-like bucket sort algorithm to all occurrence times covered by a time step of length τ. This paper serves as an introduction to this new SSA method. We introduce the method, demonstrate its implementation, analyze its properties, and compare its performance with three other commonly used SSA methods in four examples. Our performance tests and CPU operation statistics show certain advantages of the HLM for large-scale problems.
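    The sketch below illustrates the bucket-sort idea only: next firing times of independent exponential clocks are hashed into buckets of width τ, and only the earliest non-empty bucket is sorted. It is a simplified stand-in, not the authors' HLM implementation, and all rates are illustrative.

```python
# Simplified illustration of bucketing exponential-clock firing times, in the
# spirit of a hashing/leaping SSA. Not the authors' HLM code.
import random
from collections import defaultdict

def simulate(rates, t_end, tau=1.0, seed=0):
    rng = random.Random(seed)
    draw = lambda rate: rng.expovariate(rate)           # exponential waiting time
    buckets = defaultdict(list)                         # bucket index -> [(time, clock)]
    for k, r in enumerate(rates):
        t = draw(r)
        buckets[int(t // tau)].append((t, k))

    t_now, events = 0.0, []
    while t_now < t_end and buckets:
        b = min(buckets)                                # earliest non-empty bucket
        buckets[b].sort()                               # sort only within this bucket
        t_now, k = buckets[b].pop(0)                    # next event overall
        if not buckets[b]:
            del buckets[b]
        if t_now >= t_end:
            break
        events.append((t_now, k))
        t_next = t_now + draw(rates[k])                 # reschedule the fired clock
        buckets[int(t_next // tau)].append((t_next, k))
    return events

print(len(simulate([0.5, 1.0, 2.0], t_end=100.0)))
```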

  2. GPUs for statistical data analysis in HEP: a performance study of GooFit on GPUs vs. RooFit on CPUs

    NASA Astrophysics Data System (ADS)

    Pompili, Alexis; Di Florio, Adriano; CMS Collaboration

    2016-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores, a high-statistics toy Monte Carlo technique has been implemented in both the ROOT/RooFit and GooFit frameworks with the purpose of estimating the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is a data analysis open tool under development that interfaces ROOT/RooFit to the CUDA platform on nVidia GPUs. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of the PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.

  3. Statistical significance estimation of a signal within the GooFit framework on GPUs

    NASA Astrophysics Data System (ADS)

    Cristella, Leonardo; Di Florio, Adriano; Pompili, Alexis

    2017-03-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B+ → J/ψϕK+. GooFit is a data analysis open tool under development that interfaces ROOT/RooFit to CUDA platform on nVidia GPU. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.

  4. Performance studies of GooFit on GPUs vs RooFit on CPUs while estimating the statistical significance of a new physical signal

    NASA Astrophysics Data System (ADS)

    Di Florio, Adriano

    2017-10-01

    In order to test the computing capabilities of GPUs with respect to traditional CPU cores a high-statistics toy Monte Carlo technique has been implemented both in ROOT/RooFit and GooFit frameworks with the purpose to estimate the statistical significance of the structure observed by CMS close to the kinematical boundary of the J/ψϕ invariant mass in the three-body decay B + → J/ψϕK +. GooFit is a data analysis open tool under development that interfaces ROOT/RooFit to CUDA platform on nVidia GPU. The optimized GooFit application running on GPUs hosted by servers in the Bari Tier2 provides striking speed-up performances with respect to the RooFit application parallelised on multiple CPUs by means of PROOF-Lite tool. The considerable resulting speed-up, evident when comparing concurrent GooFit processes allowed by CUDA Multi Process Service and a RooFit/PROOF-Lite process with multiple CPU workers, is presented and discussed in detail. By means of GooFit it has also been possible to explore the behaviour of a likelihood ratio test statistic in different situations in which the Wilks Theorem may or may not apply because its regularity conditions are not satisfied.
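    The three entries above describe the same toy Monte Carlo strategy; the sketch below shows the generic idea in plain Python/SciPy (not the GooFit/RooFit application): pseudo-experiments are generated under the background-only hypothesis, the likelihood-ratio statistic is computed for each, and its observed value is compared both with the toy distribution and with the naive Wilks chi-square approximation. The flat-background-plus-Gaussian model and all numbers are illustrative.

```python
# Generic toy-MC sketch of likelihood-ratio significance estimation.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

def nll(params, data, with_signal):
    """Negative log-likelihood of a flat background plus optional Gaussian peak on [0, 1]."""
    frac, mu, sigma = params
    bkg = np.ones_like(data)                       # flat pdf on [0, 1]
    sig = stats.norm.pdf(data, mu, sigma)
    f = frac if with_signal else 0.0
    return -np.sum(np.log((1 - f) * bkg + f * sig))

def q_statistic(data):
    nll_bkg = nll((0.0, 0.5, 0.05), data, with_signal=False)
    fit = optimize.minimize(nll, x0=(0.1, 0.5, 0.05), args=(data, True),
                            bounds=[(0, 0.5), (0.2, 0.8), (0.01, 0.2)],
                            method="L-BFGS-B")
    return 2.0 * (nll_bkg - fit.fun)

# Distribution of q under the null (background-only) hypothesis.
q_null = np.array([q_statistic(rng.uniform(0, 1, 500)) for _ in range(100)])
q_obs = q_statistic(np.concatenate([rng.uniform(0, 1, 450),
                                    rng.normal(0.5, 0.05, 50)]))
p_toys = np.mean(q_null >= q_obs)                  # p-value from toys
p_wilks = stats.chi2.sf(q_obs, df=3)               # naive Wilks approximation
print(f"q_obs={q_obs:.1f}  p(toys)={p_toys:.3f}  p(Wilks, 3 dof)={p_wilks:.3g}")
```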

  5. A comparison between EEG source localization and fMRI during the processing of emotional visual stimuli

    NASA Astrophysics Data System (ADS)

    Hu, Jin; Tian, Jie; Pan, Xiaohong; Liu, Jiangang

    2007-03-01

    The purpose of this paper is to compare EEG source localization with fMRI during emotional processing. 108 pictures for EEG (categorized as positive, negative and neutral) and 72 pictures for fMRI were presented to 24 healthy, right-handed subjects. The fMRI data were analyzed using statistical parametric mapping with SPM2. LORETA was applied to grand-averaged ERP data to localize intracranial sources. Statistical analysis was performed to compare the spatiotemporal activation revealed by fMRI and EEG. The fMRI results are in accordance with the EEG source localization to some extent, while partial mismatches in localization between the two methods were also observed. In the future we intend to apply simultaneous recording of EEG and fMRI to this study.

  6. A neural network model of metaphor understanding with dynamic interaction based on a statistical language analysis: targeting a human-like model.

    PubMed

    Terai, Asuka; Nakagawa, Masanori

    2007-08-01

    The purpose of this paper is to construct a model that represents the human process of understanding metaphors, focusing specifically on similes of the form "an A like B". Generally speaking, human beings are able to generate and understand many sorts of metaphors. This study constructs the model based on a probabilistic knowledge structure for concepts, which is computed from a statistical analysis of a large-scale corpus. Consequently, this model is able to cover the many kinds of metaphors that human beings can generate. Moreover, the model implements the dynamic process of metaphor understanding by using a neural network with dynamic interactions. Finally, the validity of the model is confirmed by comparing model simulations with the results from a psychological experiment.

  7. Structure-oriented versus process-oriented approach to enhance efficiency for emergency room operations: what lessons can we learn?

    PubMed

    Hwang, Taik Gun; Lee, Younsuk; Shin, Hojung

    2011-01-01

    The efficiency and quality of a healthcare system can be defined as interactions among the system structure, processes, and outcome. This article examines the effect of structural adjustment (change in floor plan or layout) and process improvement (critical pathway implementation) on performance of emergency room (ER) operations for acute cerebral infarction patients. Two large teaching hospitals participated in this study: Korea University (KU) Guro Hospital and KU Anam Hospital. The administration of Guro adopted a structure-oriented approach in improving its ER operations while the administration of Anam employed a process-oriented approach, facilitating critical pathways and protocols. To calibrate improvements, the data for time interval, length of stay, and hospital charges were collected, before and after the planned changes were implemented at each hospital. In particular, time interval is the most essential measure for handling acute stroke patients because patients' survival and recovery are affected by the promptness of diagnosis and treatment. Statistical analyses indicated that both redesign of layout at Guro and implementation of critical pathways at Anam had a positive influence on most of the performance measures. However, reduction in time interval was not consistent at Guro, demonstrating delays in processing time for a few processes. The adoption of critical pathways at Anam appeared more effective in reducing time intervals than the structural rearrangement at Guro, mainly as a result of the extensive employee training required for a critical pathway implementation. Thus, hospital managers should combine structure-oriented and process-oriented strategies to maximize effectiveness of improvement efforts.

  8. [Quality assessment in anesthesia].

    PubMed

    Kupperwasser, B

    1996-01-01

    Quality assessment (assurance/improvement) is the set of methods used to measure and improve the delivered care and the department's performance against pre-established criteria or standards. The four stages of the self-maintained quality assessment cycle are: problem identification, problem analysis, problem correction and evaluation of corrective actions. Quality assessment is a measurable entity for which it is necessary to define and calibrate measurement parameters (indicators) from available data gathered from the hospital anaesthesia environment. Problem identification comes from the accumulation of indicators. There are four types of quality indicators: structure, process, outcome and sentinel indicators. The latter signal a quality defect, are independent of outcomes, are easier to analyse by statistical methods and are closely related to processes and main targets of quality improvement. The three types of methods to analyse the problems (indicators) are: peer review, quantitative methods and risk management techniques. Peer review is performed by qualified anaesthesiologists. To improve its validity, the review process should be made explicit and conclusions based on standards of practice and literature references. The quantitative methods are statistical analyses applied to the collected data and presented in a graphic format (histogram, Pareto diagram, control charts). The risk management techniques include: a) critical incident analysis, which establishes an objective relationship between a 'critical' event and the associated human behaviours; b) system accident analysis, which, given that accidents continue to occur despite safety systems and sophisticated technologies, examines all the process components leading to the unpredictable outcome and not just the human factors; c) cause-effect diagrams, which facilitate problem analysis by reducing its causes to four fundamental components (persons, regulations, equipment, process). Definition and implementation of corrective measures, based on the findings of the two previous stages, are the third step of the evaluation cycle. The Hawthorne effect is an improvement in outcomes that occurs before the implementation of any corrective actions. Verification of the implemented actions is the final and mandatory step closing the evaluation cycle.
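    A minimal sketch of the Pareto analysis mentioned among the quantitative methods: tally incident categories and report their cumulative share so that the "vital few" causes stand out. The incident categories and counts are hypothetical.

```python
# Pareto analysis of hypothetical quality-indicator incidents.
from collections import Counter

incidents = (["drug labelling error"] * 14 + ["equipment check omitted"] * 9 +
             ["handover incomplete"] * 5 + ["documentation missing"] * 3 +
             ["other"] * 2)

counts = Counter(incidents).most_common()
total = sum(n for _, n in counts)

cumulative = 0
print(f"{'category':30s} {'count':>5s} {'cum %':>6s}")
for category, n in counts:
    cumulative += n
    print(f"{category:30s} {n:5d} {100 * cumulative / total:6.1f}")
```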

  9. Effects of performance feedback and coaching on the problem-solving process: Improving the integrity of implementation and enhancing student outcomes

    NASA Astrophysics Data System (ADS)

    Lundahl, Allison A.

    Schools implementing Response to Intervention (RtI) procedures frequently engage in team problem-solving processes to address the needs of students who require intensive and individualized services. Because the effectiveness of the problem-solving process will impact the overall success of RtI systems, the present study was designed to learn more about how to strengthen the integrity of the problem-solving process. Research suggests that school districts must ensure high quality training and ongoing support to enhance the effectiveness, acceptability, and sustainability of the problem-solving process within an RtI model; however, there is a dearth of research examining the effectiveness of methods to provide this training and support. Consequently, this study investigated the effects of performance feedback and coaching strategies on the integrity with which teams of educators conducted the problem-solving process in schools. In addition, the relationships between problem-solving integrity, teacher acceptability, and student outcomes were examined. Results suggested that the performance feedback increased problem-solving procedural integrity across two of the three participating schools. Findings on the effectiveness of (a) the coaching intervention and (b) the interventions implemented in the third school were inconclusive. Regression analyses indicated that the integrity with which the teams conducted the problem-solving process was a significant predictor of student outcomes. However, the relationship between problem-solving procedural integrity and teacher acceptability was not statistically significant.

  10. Fuzzy model-based fault detection and diagnosis for a pilot heat exchanger

    NASA Astrophysics Data System (ADS)

    Habbi, Hacene; Kidouche, Madjid; Kinnaert, Michel; Zelmat, Mimoun

    2011-04-01

    This article addresses the design and real-time implementation of a fuzzy model-based fault detection and diagnosis (FDD) system for a pilot co-current heat exchanger. The design method is based on a three-step procedure which involves the identification of data-driven fuzzy rule-based models, the design of a fuzzy residual generator and the evaluation of the residuals for fault diagnosis using statistical tests. The fuzzy FDD mechanism has been implemented and validated on the real co-current heat exchanger, and has been proven to be efficient in detecting and isolating process, sensor and actuator faults.
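    The sketch below illustrates only the residual-evaluation step with a simple windowed statistical test; the fuzzy models and residual generator of the article are not reproduced, and all signals and thresholds are illustrative.

```python
# Residual evaluation for fault detection: flag a fault when the mean of a
# sliding window of residuals departs significantly from the fault-free mean.
import numpy as np

rng = np.random.default_rng(0)

def detect_faults(residuals, mu0, sigma0, window=50, z_threshold=3.0):
    """Return indices where the windowed residual mean exceeds +/- z_threshold sigma."""
    alarms = []
    for i in range(window, len(residuals)):
        window_mean = residuals[i - window:i].mean()
        z = (window_mean - mu0) / (sigma0 / np.sqrt(window))
        if abs(z) > z_threshold:
            alarms.append(i)
    return alarms

# Fault-free residuals calibrate the test; a sensor offset fault is injected later.
calib = rng.normal(0.0, 0.2, 2000)
test = np.concatenate([rng.normal(0.0, 0.2, 1000),
                       rng.normal(0.5, 0.2, 1000)])   # offset fault after sample 1000
alarms = detect_faults(test, calib.mean(), calib.std(), window=50)
print("first alarm at sample:", alarms[0] if alarms else None)
```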

  11. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, Max

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that our techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. Our method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive methods.
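    As a crude stand-in for the stochastic-process and Bayesian estimator described above (which is not reproduced here), the sketch below smooths noisy per-node load samples with an exponentially weighted moving average to track the instantaneous average load.

```python
# Simplified load-estimation sketch: EWMA over noisy per-node load samples.
import math
import random

def average_load_estimates(samples_per_step, alpha=0.2):
    """samples_per_step: iterable of per-node load lists; yields smoothed averages."""
    estimate = None
    for node_loads in samples_per_step:
        observed = sum(node_loads) / len(node_loads)      # noisy instantaneous average
        estimate = observed if estimate is None else (
            alpha * observed + (1 - alpha) * estimate)    # EWMA update
        yield estimate

# Statistically periodic synthetic load: sinusoidal mean plus per-node noise.
rng = random.Random(42)
steps = [[5 + 2 * math.sin(2 * math.pi * t / 100) + rng.gauss(0, 1)
          for _ in range(16)] for t in range(300)]
for t, est in enumerate(average_load_estimates(steps)):
    if t % 100 == 0:
        print(f"t={t:3d}  estimated average load={est:.2f}")
```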

  12. Design of radar receivers

    NASA Astrophysics Data System (ADS)

    Sokolov, M. A.

    This handbook treats the design and analysis of pulsed radar receivers, with emphasis on elements (especially IC elements) that implement optimal and suboptimal algorithms. The design methodology is developed from the viewpoint of statistical communications theory. Particular consideration is given to the synthesis of single-channel and multichannel detectors, the design of analog and digital signal-processing devices, and the analysis of IF amplifiers.

  13. Topological color codes on Union Jack lattices: a stable implementation of the whole Clifford group

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katzgraber, Helmut G.; Theoretische Physik, ETH Zurich, CH-8093 Zurich; Bombin, H.

    We study the error threshold of topological color codes on Union Jack lattices that allow for the full implementation of the whole Clifford group of quantum gates. After mapping the error-correction process onto a statistical mechanical random three-body Ising model on a Union Jack lattice, we compute its phase diagram in the temperature-disorder plane using Monte Carlo simulations. Surprisingly, topological color codes on Union Jack lattices have a similar error stability to color codes on triangular lattices, as well as to the Kitaev toric code. The enhanced computational capabilities of the topological color codes on Union Jack lattices with respect to triangular lattices and the toric code combined with the inherent robustness of this implementation show good prospects for future stable quantum computer implementations.

  14. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    NASA Technical Reports Server (NTRS)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time with the development and refinement of the data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes' schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems which use numerical approximations even in cases where closed-form solutions exist. AutoBayes is implemented in Prolog and comprises approximately 75,000 lines of code. In this paper, we take one typical scientific data analysis problem, analyzing planetary nebulae images taken by the Hubble Space Telescope, and show how AutoBayes can be used to automate the implementation of the necessary analysis programs. We initially follow the analysis described by Knuth and Hajian [KH02] and use AutoBayes to derive code for the published models. We show the details of the code derivation process, including the symbolic computations and automatic integration of library procedures, and compare the results of the automatically generated and manually implemented code. We then go beyond the original analysis and use AutoBayes to derive code for a simple image segmentation procedure based on a mixture model which can be used to automate a manual preprocessing step. Finally, we combine the original approach with the simple segmentation, which yields a more detailed analysis. This also demonstrates that AutoBayes makes it easy to combine different aspects of data analysis.

  15. Implementation of a GPS-RO data processing system for the KIAPS-LETKF data assimilation system

    NASA Astrophysics Data System (ADS)

    Kwon, H.; Kang, J.-S.; Jo, Y.; Kang, J. H.

    2015-03-01

    The Korea Institute of Atmospheric Prediction Systems (KIAPS) has been developing a new global numerical weather prediction model and an advanced data assimilation system. As part of the KIAPS package for observation processing (KPOP) system for data assimilation, preprocessing, and quality control modules for bending-angle measurements of global positioning system radio occultation (GPS-RO) data have been implemented and examined. The GPS-RO data processing system is composed of several steps for checking observation locations, missing values, physical values for Earth radius of curvature, and geoid undulation. An observation-minus-background check is implemented by use of a one-dimensional observational bending-angle operator, and tangent point drift is also considered in the quality control process. We have tested GPS-RO observations utilized by the Korean Meteorological Administration (KMA) within KPOP, based on both the KMA global model and the National Center for Atmospheric Research Community Atmosphere Model with Spectral Element dynamical core (CAM-SE) as a model background. Background fields from the CAM-SE model are incorporated for the preparation of assimilation experiments with the KIAPS local ensemble transform Kalman filter (LETKF) data assimilation system, which has been successfully implemented to a cubed-sphere model with unstructured quadrilateral meshes. As a result of data processing, the bending-angle departure statistics between observation and background show significant improvement. Also, the first experiment in assimilating GPS-RO bending angle from KPOP within KIAPS-LETKF shows encouraging results.
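    The sketch below illustrates the kinds of checks described (missing values, plausible location, gross physical limits, and an observation-minus-background departure test) on a hypothetical bending-angle record; the field names, thresholds, and logic are stand-ins, not the KPOP implementation.

```python
# Illustrative GPS-RO quality-control checks on a hypothetical record layout.
from dataclasses import dataclass
from typing import Optional

@dataclass
class BendingAngleObs:
    lat: float
    lon: float
    impact_height_km: float
    bending_angle: Optional[float]       # radians
    background_angle: Optional[float]    # model-equivalent value from the forecast

def passes_qc(obs, max_rel_departure=0.1):
    if obs.bending_angle is None or obs.background_angle is None:
        return False                                      # missing value check
    if not (-90.0 <= obs.lat <= 90.0 and -180.0 <= obs.lon <= 360.0):
        return False                                      # location check
    if obs.bending_angle <= 0.0 or obs.impact_height_km > 60.0:
        return False                                      # gross physical check
    departure = abs(obs.bending_angle - obs.background_angle)
    return departure <= max_rel_departure * obs.background_angle   # O-B check

obs = BendingAngleObs(lat=37.5, lon=127.0, impact_height_km=12.0,
                      bending_angle=0.0021, background_angle=0.0020)
print("accepted" if passes_qc(obs) else "rejected")
```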

  16. Implementation of Lean System on Erbium Doped Fibre Amplifier Manufacturing Process to Reduce Production Time

    NASA Astrophysics Data System (ADS)

    Maneechote, T.; Luangpaiboon, P.

    2010-10-01

    The manufacturing process of erbium doped fibre amplifiers is complicated. It needs to meet the customers' requirements under a present economic climate in which products must be shipped to customers as soon as possible after purchase orders are placed. This research aims to study and improve the processes and production lines of erbium doped fibre amplifiers using lean manufacturing systems via an application of computer simulation. Three scenarios of lean tool-box systems are selected via the expert system. Firstly, a production schedule based on shipment date is combined with a first-in-first-out control system. The second scenario focuses on a redesigned flow-process plant layout. Finally, the flow-process plant layout is combined with the shipment-date-based production schedule and the first-in-first-out control system. A computer simulation with the limited data, via expected values, is used to observe the performance of all scenarios. The most preferable lean tool-box scenarios resulting from the computer simulation are selected for implementation in the real production process of erbium doped fibre amplifiers. A comparison is carried out to determine the actual performance measures via an analysis of variance of the response, the production time per unit achieved in each scenario. The adequacy of the linear statistical model is also checked through the experimental errors or residuals, verifying the normality, constant variance and independence of the residuals. The results show that a hybrid scenario of the lean manufacturing system with first-in-first-out control and a flow-process plant layout statistically leads to better performance in terms of the mean and variance of production times.
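    A minimal sketch of the statistical comparison described above: a one-way ANOVA of production time per unit across three scenarios, followed by a normality check on the residuals. The production-time samples are synthetic.

```python
# One-way ANOVA across three lean scenarios plus a residual normality check.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
scenario_a = rng.normal(52.0, 4.0, 30)     # schedule + FIFO
scenario_b = rng.normal(50.0, 4.0, 30)     # flow-process layout
scenario_c = rng.normal(47.0, 4.0, 30)     # combined scenario

f_stat, p_value = stats.f_oneway(scenario_a, scenario_b, scenario_c)
print(f"one-way ANOVA: F={f_stat:.2f}, p={p_value:.4f}")

# Residuals = observations minus their scenario mean; check normality.
residuals = np.concatenate([g - g.mean() for g in (scenario_a, scenario_b, scenario_c)])
w_stat, p_norm = stats.shapiro(residuals)
print(f"Shapiro-Wilk on residuals: W={w_stat:.3f}, p={p_norm:.3f}")
```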

  17. A low-cost vector processor boosting compute-intensive image processing operations

    NASA Technical Reports Server (NTRS)

    Adorf, Hans-Martin

    1992-01-01

    Low-cost vector processing (VP) is within reach of everyone seriously engaged in scientific computing. The advent of affordable add-on VP-boards for standard workstations complemented by mathematical/statistical libraries is beginning to impact compute-intensive tasks such as image processing. A case in point is the restoration of distorted images from the Hubble Space Telescope. A low-cost implementation is presented of the standard Tarasko-Richardson-Lucy restoration algorithm on an Intel i860-based VP-board which is seamlessly interfaced to a commercial, interactive image processing system. First experience is reported (including some benchmarks for standalone FFTs) and some conclusions are drawn.
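    For reference, the standard Richardson-Lucy update is sketched below in NumPy/SciPy; the article's contribution is its acceleration on an i860 vector-processor board, which is not reproduced here.

```python
# Standard Richardson-Lucy deconvolution iteration (illustration only).
import numpy as np
from scipy.signal import fftconvolve

def richardson_lucy(observed, psf, iterations=30, eps=1e-12):
    """Iteratively deconvolve `observed` with the (normalized) point-spread function."""
    psf = psf / psf.sum()
    psf_mirror = psf[::-1, ::-1]
    estimate = np.full_like(observed, observed.mean(), dtype=float)
    for _ in range(iterations):
        blurred = fftconvolve(estimate, psf, mode="same")
        ratio = observed / (blurred + eps)
        estimate *= fftconvolve(ratio, psf_mirror, mode="same")
    return estimate

# Tiny synthetic example: blur a point source, then sharpen it back.
truth = np.zeros((64, 64)); truth[32, 32] = 100.0
x = np.arange(-7, 8)
g = np.exp(-x**2 / 8.0)
psf = np.outer(g, g)
observed = fftconvolve(truth, psf / psf.sum(), mode="same")
restored = richardson_lucy(observed, psf)
print(f"observed peak {observed.max():.1f} -> restored peak {restored.max():.1f}")
```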

  18. The implementation of CMOS sensors within a real time digital mammography intelligent imaging system: The I-ImaS System

    NASA Astrophysics Data System (ADS)

    Esbrand, C.; Royle, G.; Griffiths, J.; Speller, R.

    2009-07-01

    The integration of technology with healthcare has undoubtedly propelled the medical imaging sector well into the twenty-first century. The concept of digital imaging introduced during the 1970s has since paved the way for established imaging techniques, where digital mammography, phase contrast imaging and CT imaging are just a few examples. This paper presents a prototype intelligent digital mammography system designed and developed by a European consortium. The final system, the I-ImaS system, utilises CMOS monolithic active pixel sensor (MAPS) technology promoting on-chip data processing, enabling data processing and image acquisition to be achieved simultaneously; consequently, statistical analysis of tissue is achievable in real-time for the purpose of x-ray beam modulation via a feedback mechanism during the image acquisition procedure. The imager implements a dual array of twenty 520 pixel × 40 pixel CMOS MAPS sensing devices with a 32μm pixel size, each individually coupled to a 100μm thick thallium doped structured CsI scintillator. This paper presents the first intelligent images of real excised breast tissue obtained from the prototype system, where the x-ray exposure was modulated via the statistical information extracted from the breast tissue itself. Conventional images were experimentally acquired and the statistical analysis of the data was done off-line, resulting in the production of simulated real-time intelligently optimised images. The results obtained indicate that real-time image optimisation using the statistical information extracted from the breast as a means of feedback is beneficial and foreseeable in the near future.

  19. ARK: Aggregation of Reads by K-Means for Estimation of Bacterial Community Composition.

    PubMed

    Koslicki, David; Chatterjee, Saikat; Shahrivar, Damon; Walker, Alan W; Francis, Suzanna C; Fraser, Louise J; Vehkaperä, Mikko; Lan, Yueheng; Corander, Jukka

    2015-01-01

    Estimation of bacterial community composition from high-throughput sequenced 16S rRNA gene amplicons is a key task in microbial ecology. Since the sequence data from each sample typically consist of a large number of reads and are adversely impacted by different levels of biological and technical noise, accurate analysis of such large datasets is challenging. There has been a recent surge of interest in using compressed sensing inspired and convex-optimization based methods to solve the estimation problem for bacterial community composition. These methods typically rely on summarizing the sequence data by frequencies of low-order k-mers and matching this information statistically with a taxonomically structured database. Here we show that the accuracy of the resulting community composition estimates can be substantially improved by aggregating the reads from a sample with an unsupervised machine learning approach prior to the estimation phase. The aggregation of reads is a pre-processing approach where we use a standard K-means clustering algorithm that partitions a large set of reads into subsets with reasonable computational cost to provide several vectors of first order statistics instead of only single statistical summarization in terms of k-mer frequencies. The output of the clustering is then processed further to obtain the final estimate for each sample. The resulting method is called Aggregation of Reads by K-means (ARK), and it is based on a statistical argument via mixture density formulation. ARK is found to improve the fidelity and robustness of several recently introduced methods, with only a modest increase in computational complexity. An open source, platform-independent implementation of the method in the Julia programming language is freely available at https://github.com/dkoslicki/ARK. A Matlab implementation is available at http://www.ee.kth.se/ctsoftware.
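    The released ARK implementations are in Julia and Matlab; the Python sketch below only illustrates the pre-processing idea: reads are represented by k-mer frequency vectors and partitioned with K-means, so that composition can later be estimated per cluster rather than from a single pooled k-mer summary. The reads are synthetic.

```python
# K-mer frequency vectors plus K-means partitioning of reads (pre-processing idea only).
from itertools import product
import numpy as np
from sklearn.cluster import KMeans

K = 3
KMERS = ["".join(p) for p in product("ACGT", repeat=K)]
INDEX = {kmer: i for i, kmer in enumerate(KMERS)}

def kmer_frequencies(read):
    v = np.zeros(len(KMERS))
    for i in range(len(read) - K + 1):
        v[INDEX[read[i:i + K]]] += 1
    return v / max(v.sum(), 1)

# Two synthetic "taxa" with different base composition.
rng = np.random.default_rng(3)
reads = (["".join(rng.choice(list("AACG"), 100)) for _ in range(200)] +
         ["".join(rng.choice(list("GTTC"), 100)) for _ in range(200)])

X = np.array([kmer_frequencies(r) for r in reads])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("cluster sizes:", np.bincount(labels))
# Downstream, each cluster's aggregated k-mer vector would be matched against
# the reference database separately, as in the ARK mixture formulation.
```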

  20. Quantum interference in heterogeneous superconducting-photonic circuits on a silicon chip

    PubMed Central

    Schuck, C.; Guo, X.; Fan, L.; Ma, X.; Poot, M.; Tang, H. X.

    2016-01-01

    Quantum information processing holds great promise for communicating and computing data efficiently. However, scaling current photonic implementation approaches to larger system size remains an outstanding challenge for realizing disruptive quantum technology. Two main ingredients of quantum information processors are quantum interference and single-photon detectors. Here we develop a hybrid superconducting-photonic circuit system to show how these elements can be combined in a scalable fashion on a silicon chip. We demonstrate the suitability of this approach for integrated quantum optics by interfering and detecting photon pairs directly on the chip with waveguide-coupled single-photon detectors. Using a directional coupler implemented with silicon nitride nanophotonic waveguides, we observe 97% interference visibility when measuring photon statistics with two monolithically integrated superconducting single-photon detectors. The photonic circuit and detector fabrication processes are compatible with standard semiconductor thin-film technology, making it possible to implement more complex and larger scale quantum photonic circuits on silicon chips. PMID:26792424

  1. A process evaluation of START NOW Skills Training for inmates with impulsive and aggressive behaviors.

    PubMed

    Shelton, Deborah; Wakai, Sara

    2011-01-01

    To conduct a formative evaluation of a treatment program designed for inmates with impulsive and aggressive behavior disorders in high-security facilities in Connecticut correctional facilities. Pencil-and-paper surveys and in-person inmate interviews were used to answer four evaluation questions. Descriptive statistics and content analyses were used to assess context, input, process, and products. A convenience sample of 26 adult male (18) and female (8) inmates participated in the study. Inmates were satisfied with the program (4-point scale, M = 3.38, SD = 0.75). Inmate hospital stays were reduced by 13.6%, and psychotropic medication use increased slightly (0.40%). Improved outcomes were noted for those inmates who attended more sessions. The findings of the formative evaluation were useful for moving the START NOW Skills Training treatment to the implementation phase. Recommendations for implementation modifications included development of an implementation team, reinforcement of training, and attention applied to uniform collection of outcome data to demonstrate its evidence base.

  2. Implementing QML for radiation hardness assurance

    NASA Astrophysics Data System (ADS)

    Winokur, P. S.; Sexton, F. W.; Fleetwood, D. M.; Terry, M. D.; Shaneyfelt, M. R.

    1990-12-01

    The US government has proposed a qualified manufacturers list (QML) methodology to qualify integrated circuits for high reliability and radiation hardness. An approach to implementing QML for single-event upset (SEU) immunity on 16k SRAMs that involves relating values of feedback resistance to system error rates is demonstrated. It is seen that the process capability indices, Cp and Cpk, for the manufacture of 400-k-ohm feedback resistors required to provide SEU tolerance, do not conform to 6-sigma quality standards. For total dose, interface-trap charge shifts, ΔVit, measured on transistors are correlated with circuit response in the space environment. Statistical process control (SPC) is illustrated for ΔVit, and violations of SPC rules are interpreted in terms of continuous improvement. Design validation for SEU and quality conformance inspections for total dose are identified as major obstacles to cost-effective QML implementation. Techniques and tools that will help QML provide real cost savings are identified as physical models, 3-D device-plus-circuit codes, and improved design simulators.
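    The process capability indices mentioned above follow the standard definitions Cp = (USL - LSL)/(6*sigma) and Cpk = min(USL - mean, mean - LSL)/(3*sigma); the sketch below computes them for an illustrative set of feedback-resistor values, not the paper's data.

```python
# Process capability indices from a sample of a measured characteristic.
import statistics

def capability_indices(values, lsl, usl):
    mu = statistics.mean(values)
    sigma = statistics.stdev(values)
    cp = (usl - lsl) / (6 * sigma)                       # potential capability
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)          # capability incl. centering
    return cp, cpk

# Hypothetical feedback-resistor values (kilo-ohms) against +/-20% spec limits.
resistors = [402, 395, 410, 388, 415, 405, 398, 392, 407, 411]
cp, cpk = capability_indices(resistors, lsl=320, usl=480)
# A common rule of thumb associates six-sigma performance with Cpk >= 1.5.
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}")
```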

  3. Implementation of the common phrase index method on the phrase query for information retrieval

    NASA Astrophysics Data System (ADS)

    Fatmawati, Triyah; Zaman, Badrus; Werdiningsih, Indah

    2017-08-01

    With the development of technology, finding information in news text has become easy, because news text is distributed not only in print media, such as newspapers, but also in electronic media that can be accessed using search engines. In the process of finding relevant documents with a search engine, a phrase is often used as a query. The number of words that make up the phrase query and their positions obviously affect the relevance of the documents produced. As a result, the accuracy of the information obtained will be affected. Based on the outlined problem, the purpose of this research was to analyze the implementation of the common phrase index method in information retrieval. The research was conducted on English news text and implemented in a prototype to determine the relevance level of the documents produced. The system is built with the stages of pre-processing, indexing, term weighting calculation, and cosine similarity calculation. The system then displays the document search results in a sequence based on the cosine similarity. Furthermore, system testing was conducted using 100 documents and 20 queries. The results were then used for the evaluation stage. First, the relevant documents were determined using a kappa statistic calculation. Second, the system success rate was determined using precision, recall, and F-measure calculations. In this research, the result of the kappa statistic calculation was 0.71, so the relevant documents are eligible for the system evaluation. The calculation of precision, recall, and F-measure produced a precision of 0.37, a recall of 0.50, and an F-measure of 0.43. From these results it can be said that the success rate of the system in producing relevant documents is low.
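    The sketch below illustrates the ranking and evaluation steps described above: documents and a phrase query are compared by cosine similarity of term-frequency vectors, and the retrieved set is scored with precision, recall, and F-measure against known relevant documents. The toy documents, query, and relevance judgements are hypothetical.

```python
# Cosine-similarity ranking and precision/recall/F-measure evaluation (toy example).
import math
from collections import Counter

def cosine(a, b):
    terms = set(a) | set(b)
    dot = sum(a[t] * b[t] for t in terms)
    norm = (math.sqrt(sum(v * v for v in a.values())) *
            math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

docs = {
    "d1": "stock markets rise as trade talks resume",
    "d2": "trade talks between the two countries resume today",
    "d3": "local team wins the national football cup",
}
query = Counter("trade talks".split())
ranking = sorted(docs, key=lambda d: cosine(Counter(docs[d].split()), query),
                 reverse=True)

retrieved, relevant = set(ranking[:2]), {"d1", "d2"}
precision = len(retrieved & relevant) / len(retrieved)
recall = len(retrieved & relevant) / len(relevant)
f_measure = (2 * precision * recall / (precision + recall)
             if precision + recall else 0.0)
print(ranking, precision, recall, f_measure)
```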

  4. A scalable moment-closure approximation for large-scale biochemical reaction networks

    PubMed Central

    Kazeroonian, Atefeh; Theis, Fabian J.; Hasenauer, Jan

    2017-01-01

    Abstract Motivation: Stochastic molecular processes are a leading cause of cell-to-cell variability. Their dynamics are often described by continuous-time discrete-state Markov chains and simulated using stochastic simulation algorithms. As these stochastic simulations are computationally demanding, ordinary differential equation models for the dynamics of the statistical moments have been developed. The number of state variables of these approximating models, however, grows at least quadratically with the number of biochemical species. This limits their application to small- and medium-sized processes. Results: In this article, we present a scalable moment-closure approximation (sMA) for the simulation of statistical moments of large-scale stochastic processes. The sMA exploits the structure of the biochemical reaction network to reduce the covariance matrix. We prove that sMA yields approximating models whose number of state variables depends predominantly on local properties, i.e. the average node degree of the reaction network, instead of the overall network size. The resulting complexity reduction is assessed by studying a range of medium- and large-scale biochemical reaction networks. To evaluate the approximation accuracy and the improvement in computational efficiency, we study models for JAK2/STAT5 signalling and NFκB signalling. Our method is applicable to generic biochemical reaction networks and we provide an implementation, including an SBML interface, which renders the sMA easily accessible. Availability and implementation: The sMA is implemented in the open-source MATLAB toolbox CERENA and is available from https://github.com/CERENADevelopers/CERENA. Contact: jan.hasenauer@helmholtz-muenchen.de or atefeh.kazeroonian@tum.de Supplementary information: Supplementary data are available at Bioinformatics online. PMID:28881983

  5. Hierarchical Bayesian modelling of gene expression time series across irregularly sampled replicates and clusters.

    PubMed

    Hensman, James; Lawrence, Neil D; Rattray, Magnus

    2013-08-20

    Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering.The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.

  6. A quality improvement project using statistical process control methods for type 2 diabetes control in a resource-limited setting.

    PubMed

    Flood, David; Douglas, Kate; Goldberg, Vera; Martinez, Boris; Garcia, Pablo; Arbour, MaryCatherine; Rohloff, Peter

    2017-08-01

    Quality improvement (QI) is a key strategy for improving diabetes care in low- and middle-income countries (LMICs). This study reports on a diabetes QI project in rural Guatemala whose primary aim was to improve glycemic control of a panel of adult diabetes patients. Formative research suggested multiple areas for programmatic improvement in ambulatory diabetes care. This project utilized the Model for Improvement and Agile Global Health, our organization's complementary healthcare implementation framework. A bundle of improvement activities were implemented at the home, clinic and institutional level. Control charts of mean hemoglobin A1C (HbA1C) and proportion of patients meeting target HbA1C showed improvement as special cause variation was identified 3 months after the intervention began. Control charts for secondary process measures offered insights into the value of different components of the intervention. Intensity of home-based diabetes education emerged as an important driver of panel glycemic control. Diabetes QI work is feasible in resource-limited settings in LMICs and can improve glycemic control. Statistical process control charts are a promising methodology for use with panels or registries of diabetes patients.
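    A minimal sketch of a control chart of the kind described above: monthly mean HbA1C plotted against 3-sigma limits estimated from a baseline period, with points beyond the limits flagged as special-cause variation. The monthly values are hypothetical, and plain sample-standard-deviation limits are used for brevity (an individuals chart would normally estimate sigma from the moving range).

```python
# Simple control chart: baseline-derived 3-sigma limits applied to monthly means.
import statistics

baseline = [9.1, 9.3, 9.0, 9.4, 9.2, 9.1]            # pre-intervention monthly means
follow_up = [9.2, 8.9, 8.6, 8.4, 8.3, 8.1]           # post-intervention monthly means

center = statistics.mean(baseline)
sigma = statistics.stdev(baseline)
ucl, lcl = center + 3 * sigma, center - 3 * sigma

for month, value in enumerate(baseline + follow_up, start=1):
    flag = "special cause" if value > ucl or value < lcl else ""
    print(f"month {month:2d}: mean HbA1C {value:.1f}  {flag}")
print(f"centre line {center:.2f}, limits [{lcl:.2f}, {ucl:.2f}]")
```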

  7. Development of the ICD-10 simplified version and field test.

    PubMed

    Paoin, Wansa; Yuenyongsuwan, Maliwan; Yokobori, Yukiko; Endo, Hiroyoshi; Kim, Sukil

    2018-05-01

    The International Statistical Classification of Diseases and Related Health Problems, 10th Revision (ICD-10) has been used in various Asia-Pacific countries for more than 20 years. Although ICD-10 is a powerful tool, clinical coding processes are complex; therefore, many developing countries have not been able to implement ICD-10-based health statistics (WHO-FIC APN, 2007). This study aimed to simplify ICD-10 clinical coding processes, to modify index terms to facilitate computer searching and to provide a simplified version of ICD-10 for use in developing countries. The World Health Organization Family of International Classifications Asia-Pacific Network (APN) developed a simplified version of the ICD-10 and conducted field testing in Cambodia during February and March 2016. Ten hospitals were selected to participate. Each hospital sent a team to join a training workshop before using the ICD-10 simplified version to code 100 cases. All hospitals subsequently sent their coded records to the researchers. Overall, there were 1038 coded records with a total of 1099 ICD clinical codes assigned. The average accuracy rate was calculated as 80.71% (66.67-93.41%). Three types of clinical coding errors were found. These related to errors relating to the coder (14.56%), those resulting from the physician documentation (1.27%) and those considered system errors (3.46%). The field trial results demonstrated that the APN ICD-10 simplified version is feasible for implementation as an effective tool to implement ICD-10 clinical coding for hospitals. Developing countries may consider adopting the APN ICD-10 simplified version for ICD-10 code assignment in hospitals and health care centres. The simplified version can be viewed as an introductory tool which leads to the implementation of the full ICD-10 and may support subsequent ICD-11 adoption.

  8. Proper Image Subtraction—Optimal Transient Detection, Photometry, and Hypothesis Testing

    NASA Astrophysics Data System (ADS)

    Zackay, Barak; Ofek, Eran O.; Gal-Yam, Avishay

    2016-10-01

    Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement, and any image-difference hypothesis testing. We derive a closed-form statistic that: (1) is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise, (2) is numerically stable, (3) for accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts, (4) allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance, (5) has uncorrelated white noise, (6) is a sufficient statistic for any further statistical test on the difference image, and, in particular, allows us to distinguish particle hits and other image artifacts from real transients, (7) is symmetric to the exchange of the new and reference images, (8) is at least an order of magnitude faster to compute than some popular methods, and (9) is straightforward to implement. Furthermore, we present extensions of this method that make it resilient to registration errors, color-refraction errors, and any noise source that can be modeled. In addition, we show that the optimal way to prepare a reference image is the proper image coaddition presented in Zackay & Ofek. We demonstrate this method on simulated data and real observations from the PTF data release 2. We provide an implementation of this algorithm in MATLAB and Python.

  9. MAGMA: analysis of two-channel microarrays made easy.

    PubMed

    Rehrauer, Hubert; Zoller, Stefan; Schlapbach, Ralph

    2007-07-01

    The web application MAGMA provides a simple and intuitive interface to identify differentially expressed genes from two-channel microarray data. While the underlying algorithms are not superior to those of similar web applications, MAGMA is particularly user friendly and can be used without prior training. The user interface guides the novice user through the most typical microarray analysis workflow consisting of data upload, annotation, normalization and statistical analysis. It automatically generates R-scripts that document MAGMA's entire data processing steps, thereby allowing the user to regenerate all results in his local R installation. The implementation of MAGMA follows the model-view-controller design pattern that strictly separates the R-based statistical data processing, the web-representation and the application logic. This modular design makes the application flexible and easily extendible by experts in one of the fields: statistical microarray analysis, web design or software development. State-of-the-art Java Server Faces technology was used to generate the web interface and to perform user input processing. MAGMA's object-oriented modular framework makes it easily extendible and applicable to other fields and demonstrates that modern Java technology is also suitable for rather small and concise academic projects. MAGMA is freely available at www.magma-fgcz.uzh.ch.

  10. Combustion Technology for Incinerating Wastes from Air Force Industrial Processes.

    DTIC Science & Technology

    1984-02-01

    The assumption of equilibrium between environmental compartments. The statistical extrapolations yielding "safe" doses of various constituents... would be contacted to identify the assumptions and data requirements needed to design, construct and implement the model. The model's primary objective... Recovery Planning Model (RRPLAN) is described. This section of the paper summarizes the model's assumptions, major components and modes of operation

  11. The Effect of Multispectral Image Fusion Enhancement on Human Efficiency

    DTIC Science & Technology

    2017-03-20

    performance of the ideal observer is indicative of the relative amount of information across various experimental manipulations. In our experimental design... registration and fusion processes, and contributed strongly to the statistical analyses. LMB contributed to the experimental design and writing structure. All... designed to be innovative, low-cost, and (relatively) easy-to-implement, and to provide support across the spectrum of possible users including

  12. Defining the best quality-control systems by design and inspection.

    PubMed

    Hinckley, C M

    1997-05-01

    Not all of the many approaches to quality control are equally effective. Nonconformities in laboratory testing are caused basically by excessive process variation and mistakes. Statistical quality control can effectively control process variation, but it cannot detect or prevent most mistakes. Because mistakes or blunders are frequently the dominant source of nonconformities, we conclude that statistical quality control by itself is not effective. I explore the 100% inspection methods essential for controlling mistakes. Unlike the inspection techniques that Deming described as ineffective, the new "source" inspection methods can detect mistakes and enable corrections before nonconformities are generated, achieving the highest degree of quality at a fraction of the cost of traditional methods. Key relationships between task complexity and nonconformity rates are also described, along with cultural changes that are essential for implementing the best quality-control practices.

  13. A statistical rain attenuation prediction model with application to the advanced communication technology satellite project. 3: A stochastic rain fade control algorithm for satellite link power via non linear Markow filtering theory

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1991-01-01

    The dynamic and composite nature of propagation impairments that are incurred on Earth-space communications links at frequencies in and above 30/20 GHz Ka band, i.e., rain attenuation, cloud and/or clear air scintillation, etc., combined with the need to counter such degradations after the small link margins have been exceeded, necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) Project by the implementation of optimal processing schemes derived through the use of the Rain Attenuation Prediction Model and nonlinear Markov filtering theory.

  14. Mapping the Energy Cascade in the North Atlantic Ocean: The Coarse-graining Approach

    DOE PAGES

    Aluie, Hussein; Hecht, Matthew; Vallis, Geoffrey K.

    2017-11-14

    A coarse-graining framework is implemented to analyze nonlinear processes, measure energy transfer rates and map out the energy pathways from simulated global ocean data. Traditional tools to measure the energy cascade from turbulence theory, such as spectral flux or spectral transfer rely on the assumption of statistical homogeneity, or at least a large separation between the scales of motion and the scales of statistical inhomogeneity. The coarse-graining framework allows for probing the fully nonlinear dynamics simultaneously in scale and in space, and is not restricted by those assumptions. This study describes how the framework can be applied to ocean flows.

  15. Mapping the Energy Cascade in the North Atlantic Ocean: The Coarse-graining Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aluie, Hussein; Hecht, Matthew; Vallis, Geoffrey K.

    A coarse-graining framework is implemented to analyze nonlinear processes, measure energy transfer rates and map out the energy pathways from simulated global ocean data. Traditional tools to measure the energy cascade from turbulence theory, such as spectral flux or spectral transfer rely on the assumption of statistical homogeneity, or at least a large separation between the scales of motion and the scales of statistical inhomogeneity. The coarse-graining framework allows for probing the fully nonlinear dynamics simultaneously in scale and in space, and is not restricted by those assumptions. This study describes how the framework can be applied to ocean flows.

  16. Statistical study of generalized nonlinear phase step estimation methods in phase-shifting interferometry.

    PubMed

    Langoju, Rajesh; Patil, Abhijit; Rastogi, Pramod

    2007-11-20

    Signal processing methods based on maximum-likelihood theory, discrete chirp Fourier transform, and spectral estimation methods have enabled accurate measurement of phase in phase-shifting interferometry in the presence of nonlinear response of the piezoelectric transducer to the applied voltage. We present the statistical study of these generalized nonlinear phase step estimation methods to identify the best method by deriving the Cramér-Rao bound. We also address important aspects of these methods for implementation in practical applications and compare the performance of the best-identified method with other bench marking algorithms in the presence of harmonics and noise.

  17. Apollo Quality Program.

    PubMed

    Sibal, Anupam; Dewan, Shaveta; Uberoi, R S; Kar, Sujoy; Loria, Gaurav; Fernandes, Clive; Yatheesh, G; Sharma, Karan

    2012-01-01

    Ensuring patient safety is a vital step for any hospital in achieving the best clinical outcomes. The Apollo Quality Program aimed at standardization of processes for clinical handovers, medication safety, surgical safety, patient identification, verbal orders, hand washing compliance and falls prevention across the hospitals in the Group. Thirty-two hospitals across the Group in settings varying from rural to semi urban, urban and metropolitan implemented the program and over a period of one year demonstrated a visible improvement in the compliance to processes for patient safety translating into better patient safety statistics.

  18. Robust statistical methods for impulse noise suppressing of spread spectrum induced polarization data, with application to a mine site, Gansu province, China

    NASA Astrophysics Data System (ADS)

    Liu, Weiqiang; Chen, Rujun; Cai, Hongzhu; Luo, Weibin

    2016-12-01

    In this paper, we investigated the robust processing of noisy spread spectrum induced polarization (SSIP) data. SSIP is a new frequency-domain induced polarization method that transmits a pseudo-random m-sequence as the source current, where the m-sequence is a broadband signal. Potential information at multiple frequencies can be obtained through measurement. Removing noise is a crucial problem in SSIP data processing. If the ordinary mean stack and digital filter cannot reduce impulse noise effectively, the impact of that noise remains in the complex resistivity spectrum and affects the interpretation of profile anomalies. We applied a robust statistical method to SSIP data processing. Robust least-squares regression is used to fit and remove the linear trend from the original data before stacking. A robust M estimate is used to stack the data of all periods. A robust smoothing filter is used to suppress the residual noise after stacking. For the robust statistical scheme, the most appropriate influence function and iterative algorithm are chosen by testing on simulated data to suppress the influence of outliers. We tested the benefits of robust SSIP data processing using examples of SSIP data recorded at a test site beside a mine in Gansu province, China.
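
    The abstract does not spell out the robust estimator itself; the sketch below shows one common choice, a Huber M-estimate of location computed by iteratively reweighted averaging with a MAD-based scale, applied to a hypothetical stack of period measurements containing impulsive outliers. The tuning constant and the data are assumptions for illustration only.

    ```python
    import numpy as np

    def huber_stack(x, c=1.345, tol=1e-6, max_iter=50):
        """Robust location estimate of stacked periods via Huber M-estimation (IRLS)."""
        x = np.asarray(x, dtype=float)
        mu = np.median(x)                            # robust starting value
        scale = 1.4826 * np.median(np.abs(x - mu))   # MAD-based scale estimate
        if scale == 0:
            return mu
        for _ in range(max_iter):
            r = (x - mu) / scale
            w = np.where(np.abs(r) <= c, 1.0, c / np.abs(r))  # Huber weights
            mu_new = np.sum(w * x) / np.sum(w)
            if abs(mu_new - mu) < tol * scale:
                break
            mu = mu_new
        return mu

    # Hypothetical stack of 20 period measurements with two impulsive outliers
    rng = np.random.default_rng(0)
    periods = np.concatenate([rng.normal(10.0, 0.1, 18), [14.0, 3.0]])
    print("plain mean:", round(periods.mean(), 2), "robust stack:", round(huber_stack(periods), 2))
    ```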

  19. The population health record: concepts, definition, design, and implementation.

    PubMed

    Friedman, Daniel J; Parrish, R Gibson

    2010-01-01

    In 1997, the American Medical Informatics Association proposed a US information strategy that included a population health record (PopHR). Despite subsequent progress on the conceptualization, development, and implementation of electronic health records and personal health records, minimal progress has occurred on the PopHR. Adapting International Organization for Standarization electronic health records standards, we define the PopHR as a repository of statistics, measures, and indicators regarding the state of and influences on the health of a defined population, in computer processable form, stored and transmitted securely, and accessible by multiple authorized users. The PopHR is based upon an explicit population health framework and a standardized logical information model. PopHR purpose and uses, content and content sources, functionalities, business objectives, information architecture, and system architecture are described. Barriers to implementation and enabling factors and a three-stage implementation strategy are delineated.

  20. Racial and Socioeconomic Differences Manifest in Process Measure Adherence for Enhanced Recovery After Surgery Pathway.

    PubMed

    Leeds, Ira L; Alimi, Yewande; Hobson, Deborah R; Efron, Jonathan E; Wick, Elizabeth C; Haut, Elliott R; Johnston, Fabian M

    2017-10-01

    Adherence to care processes and surgical outcomes varies by population subgroups for the same procedure. Enhanced recovery after surgery pathways are intended to standardize care, but their effect on process adherence and outcomes for population subgroups is unknown. This study aims to demonstrate the association between recovery pathway implementation, process measures, and short-term surgical outcomes by population subgroup. This study is a pre- and post-quality improvement implementation cohort study. This study was conducted at a tertiary academic medical center. A modified colorectal enhanced recovery after surgery pathway was implemented. Patients were included who had elective colon and rectal resections before (2013) and following (2014-2016) recovery pathway implementation. Thirty-day outcomes by race and socioeconomic status were analyzed using a difference-in-difference approach with correlation to process adherence. We identified 639 cases (199 preimplementation, 440 postimplementation). In these cases, 75.2% of the patients were white, and 91.7% had a high socioeconomic status. Groups were similar in terms of other preoperative characteristics. Following pathway implementation, median lengths of stay improved in all subgroups (-1.0 days overall, p ≤ 0.001), but with no statistical difference by race or socioeconomic status (p = 0.89 and p = 0.29). Complication rates in both racial and socioeconomic groups were no different (26.4% vs 28.8%, p = 0.73; 27.3% vs 25.0%, p = 0.86) and remained unchanged with implementation (p = 0.93, p = 0.84). By race, overall adherence was 31.7% in white patients and 26.5% in nonwhite patients (p = 0.32). Although stratification by socioeconomic status demonstrated decreased overall adherence in the low-status group (31.8% vs 17.1%, p = 0.05), white patients were more likely to have regional pain therapy (57.1% vs 44.1%, p = 0.02) with a similar trend seen with socioeconomic status. Data were collected primarily for quality improvement purposes. Differences in outcomes by race and socioeconomic status did not arise following implementation of an enhanced recovery pathway. Differences in process measures by population subgroups highlight differences in care that require further investigation. See Video Abstract at http://links.lww.com/DCR/A386.
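
    As a hedged illustration of the difference-in-difference idea used in the outcome analysis, the sketch below fits an ordinary least-squares model with a group-by-period interaction on simulated data; the variable names (post, minority, los), the effect sizes, and the use of statsmodels are assumptions, not the study's actual model or data.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical data: 'los' = length of stay, 'post' = after pathway rollout,
    # 'minority' = population subgroup indicator (all names are illustrative).
    rng = np.random.default_rng(1)
    n = 600
    df = pd.DataFrame({
        "post": rng.integers(0, 2, n),
        "minority": rng.integers(0, 2, n),
    })
    df["los"] = (5.0 - 1.0 * df["post"]                 # overall improvement after rollout
                 + 0.3 * df["minority"]
                 + 0.0 * df["post"] * df["minority"]    # true subgroup difference set to zero
                 + rng.normal(0, 1.5, n))

    # The coefficient on post:minority is the difference-in-difference estimate.
    model = smf.ols("los ~ post * minority", data=df).fit()
    print(round(model.params["post:minority"], 3), round(model.pvalues["post:minority"], 3))
    ```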

  1. Factors Influencing Implementation of a Physical Activity Intervention in Residential Children's Homes.

    PubMed

    Lau, Erica Y; Saunders, Ruth P; Pate, Russell R

    2016-11-01

    The Environmental Intervention in Children's Homes (ENRICH) study was the first published physical activity intervention undertaken in residential children's homes (RCHs). The study revealed differences in implementation across the homes, which may be a key factor that affects program effectiveness. The purpose of this study was to examine the direct and indirect effects of organizational capacity, provider characteristics, and quality of prevention support system on level of implementation of the ENRICH intervention. This study analyzed the ENRICH process evaluation data collected from 24 RCHs. Bayesian Path analysis was used to examine the direct and indirect effects of organizational capacity, provider characteristics, and quality of prevention support system on level of implementation. Level of implementation across RCHs was variable, ranging from 38 to 97 % (M = 68.3, SD = 14.45). Results revealed that organizational capacity and provider characteristics had significant direct associations with level of implementation. Neither direct nor indirect associations between quality of prevention support system and level of implementation reached statistical significance. Conducting formative assessments on organizational capacity and provider characteristics and incorporating such information in implementation planning may increase the likelihood of achieving higher levels of implementation in future studies.

  2. Using simulation modeling to improve patient flow at an outpatient orthopedic clinic.

    PubMed

    Rohleder, Thomas R; Lewkonia, Peter; Bischak, Diane P; Duffy, Paul; Hendijani, Rosa

    2011-06-01

    We report on the use of discrete event simulation modeling to support process improvements at an orthopedic outpatient clinic. The clinic was effective in treating patients, but waiting time and congestion in the clinic created patient dissatisfaction and staff morale issues. The modeling helped to identify improvement alternatives including optimized staffing levels, better patient scheduling, and an emphasis on staff arriving promptly. Quantitative results from the modeling provided motivation to implement the improvements. Statistical analysis of data taken before and after the implementation indicates that waiting time measures were significantly improved and overall patient time in the clinic was reduced.
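
    A minimal discrete event simulation in the spirit of the study (not the authors' model) is sketched below: a single-provider walk-in clinic with exponential interarrival and service times, returning the average waiting time. The arrival and service rates are made-up parameters.

    ```python
    import random

    def simulate_clinic(n_patients=500, mean_interarrival=10.0, mean_service=8.0, seed=0):
        """Single-provider walk-in clinic: returns average patient waiting time (minutes)."""
        random.seed(seed)
        t, arrivals = 0.0, []
        for _ in range(n_patients):
            t += random.expovariate(1.0 / mean_interarrival)   # exponential interarrival times
            arrivals.append(t)
        server_free_at, total_wait = 0.0, 0.0
        for arr in arrivals:
            start = max(arr, server_free_at)                   # wait if the provider is busy
            total_wait += start - arr
            server_free_at = start + random.expovariate(1.0 / mean_service)
        return total_wait / n_patients

    print("average wait (min):", round(simulate_clinic(), 1))
    ```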

  3. The OBRA-87 nursing home regulations and implementation of the Resident Assessment Instrument: effects on process quality.

    PubMed

    Hawes, C; Mor, V; Phillips, C D; Fries, B E; Morris, J N; Steele-Friedlob, E; Greene, A M; Nennstiel, M

    1997-08-01

    To characterize changes in key aspects of process quality received by nursing home residents before and after the implementation of the national nursing home Resident Assessment Instrument (RAI) and other aspects of the Omnibus Budget Reconciliation Act (OBRA) nursing home reforms. A quasi-experimental study using a complex, multistage probability-based sample design, with data collected before (1990) and after (1993) implementation of the RAI and other OBRA provisions. Two independent cohorts (n > 2000) of residents in a random sample of 254 nursing facilities located in metropolitan statistical areas in 10 states. OBRA-87 enhanced the regulation of nursing homes and included new requirements on quality of care, resident assessment, care planning, and the use of neuroleptic drugs and physical restraints. One of the key provisions, used to help implement the OBRA requirements in daily nursing home practice, was the mandatory use of a standardized, comprehensive system, known as the RAI, to assist in assessment and care planning. OBRA provisions went into effect in federal law on October 1, 1990, although delays issuing the regulations led to actual implementation of the RAI during the Spring of 1991. MEASUREMENTS AND ANALYSES: Research nurses spent an average of 4 days per facility in each data collection round, assessing a sample of residents, collecting data through interviews with and observations of residents, interviews with multiple shifts of direct staff caregivers for the sampled residents, and review of medical records, including physician's orders, treatment and care plans, nursing progress notes, and medication records. The RNs collected data on the characteristics of the sampled residents, on the care they received, and on facility practices. The effect of being a member of the 1990 pre-OBRA or the 1993 post-OBRA cohort was assessed on the accuracy of information in the residents' medical records, the comprehensiveness of care plans, and on other key aspects of process quality while controlling for any changes in resident case-mix. The data were analyzed using contingency tables and logistic regression and a special statistical software (SUDAAN) to assure proper variance estimation. Overall, the process of care in nursing homes improved in several important areas. The accuracy of information in residents' medical records increased substantially, as did the comprehensiveness of care plans. In addition, several problematic care practices declined during this period, including use of physical restraints (37.4 to 28.1% (P < .001)) and indwelling urinary catheters (9.8 to 7% (P < .001)). There were also increases in good care practices, such as the presence of advanced directives, participation in activities, and use of toileting programs for residents with bowel incontinence. These results were sustained after controlling for differences in the resident characteristics between 1990 and 1993. Other practices, such as use of antipsychotic drugs, behavior management programs, preventive skin care, and provision of therapies were unaffected, or the differences were not statistically significant, after adjusting for changes in resident case-mix. The OBRA reforms and introduction of the RAI constituted an unprecedented implementation of comprehensive geriatric assessment in Medicare- and Medicaid-certified nursing homes. The evaluation of the effects of these interventions demonstrates significant improvements in the quality of care provided to residents. 
At the same time, these findings suggest that more needs to be done to improve process quality. The results suggest the RAI is one tool that facility staff, therapists, pharmacy consultants, and physicians can use to support their continuing efforts to provide high quality of care and life to the nation's 1.7 million nursing home residents.

  4. Estimating error statistics for Chambon-la-Forêt observatory definitive data

    NASA Astrophysics Data System (ADS)

    Lesur, Vincent; Heumez, Benoît; Telali, Abdelkader; Lalanne, Xavier; Soloviev, Anatoly

    2017-08-01

    We propose a new algorithm for calibrating definitive observatory data with the goal of providing users with estimates of the data error standard deviations (SDs). The algorithm has been implemented and tested using Chambon-la-Forêt observatory (CLF) data. The calibration process uses all available data. It is set as a large, weakly non-linear, inverse problem that ultimately provides estimates of baseline values in three orthogonal directions, together with their expected standard deviations. For this inverse problem, absolute data error statistics are estimated from two series of absolute measurements made within a day. Similarly, variometer data error statistics are derived by comparing variometer data time series between different pairs of instruments over a few years. The comparisons of these time series led us to use an autoregressive process of order 1 (AR1 process) as a prior for the baselines. Therefore the obtained baselines do not vary smoothly in time. They have relatively small SDs, well below 300 pT when absolute data are recorded twice a week - i.e. within the daily to weekly measures recommended by INTERMAGNET. The algorithm was tested against the process traditionally used to derive baselines at CLF observatory, suggesting that statistics are less favourable when this latter process is used. Finally, two sets of definitive data were calibrated using the new algorithm. Their comparison shows that the definitive data SDs are less than 400 pT and may be slightly overestimated by our process: an indication that more work is required to obtain proper estimates of absolute data error statistics. For magnetic field modelling, the results show that even at isolated sites like CLF observatory, there are very localised signals over a large span of temporal frequencies that can be as large as 1 nT. The SDs reported here encompass signals with spatial scales of a few hundred metres and periods of less than a day.

  5. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    The thermal control subsystem keeps satellite components within their survival and operating temperature ranges. Its performance plays a key role in satisfying the satellite's operational requirements, and designing this subsystem is part of the overall satellite design. On the other hand, due to the lack of information provided by companies, designers still do not have a specific design process for this subsystem, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, and then we extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. Input parameters of the method are the mass, mission, and lifetime of the satellite. For this purpose, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is investigated. In the next part, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified using a case study: the comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results showed the methodology in this paper to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  6. ecode - Electron Transport Algorithm Testing v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Franke, Brian C.; Olson, Aaron J.; Bruss, Donald Eugene

    2016-10-05

    ecode is a Monte Carlo code used for testing algorithms related to electron transport. The code can read basic physics parameters, such as energy-dependent stopping powers and screening parameters. The code permits simple planar geometries of slabs or cubes. Parallelization consists of domain replication, with work distributed at the start of the calculation and statistical results gathered at the end of the calculation. Some basic routines (such as input parsing, random number generation, and statistics processing) are shared with the Integrated Tiger Series codes. A variety of algorithms for uncertainty propagation are incorporated based on the stochastic collocation and stochastic Galerkin methods. These permit uncertainty only in the total and angular scattering cross sections. The code contains algorithms for simulating stochastic mixtures of two materials. The physics is approximate, ranging from mono-energetic and isotropic scattering to screened Rutherford angular scattering and Rutherford energy-loss scattering (simple electron transport models). No production of secondary particles is implemented, and no photon physics is implemented.

  7. Clinical implementation of a GPU-based simplified Monte Carlo method for a treatment planning system of proton beam therapy.

    PubMed

    Kohno, R; Hotta, K; Nishioka, S; Matsubara, K; Tansho, R; Suzuki, T

    2011-11-21

    We implemented the simplified Monte Carlo (SMC) method on graphics processing unit (GPU) architecture under the compute unified device architecture (CUDA) platform developed by NVIDIA. The GPU-based SMC was clinically applied for four patients with head and neck, lung, or prostate cancer. The results were compared to those obtained by a traditional CPU-based SMC with respect to the computation time and discrepancy. In the CPU- and GPU-based SMC calculations, the estimated mean statistical errors of the calculated doses in the planning target volume region were within 0.5% rms. The dose distributions calculated by the GPU- and CPU-based SMCs were similar, within statistical errors. The GPU-based SMC showed 12.30-16.00 times faster performance than the CPU-based SMC. The computation time per beam arrangement using the GPU-based SMC for the clinical cases ranged from 9 to 67 s. The results demonstrate the successful application of the GPU-based SMC to clinical proton treatment planning.

  8. The effect of organizational climate on patient-centered medical home implementation.

    PubMed

    Reddy, Ashok; Shea, Judy A; Canamucio, Anne; Werner, Rachel M

    2015-01-01

    Organizational climate is a key determinant of successful adoption of innovations; however, its relation to medical home implementation is unknown. This study examined the association between primary care providers' (PCPs') perception of organization climate and medical home implementation in the Veterans Health Administration. Multivariate regression was used to test the hypothesis that organizational climate predicts medical home implementation. This analysis of 191 PCPs found that higher scores in 2 domains of organizational climate (communication and cooperation, and orientation to quality improvement) were associated with a statistically significantly higher percentage (from 7 to 10 percentage points) of PCPs implementing structural changes to support the medical home model. In addition, some aspects of a better organizational climate were associated with improved organizational processes of care, including a higher percentage of patients contacted within 2 days of hospital discharge (by 2 to 3 percentage points) and appointments made within 3 days of a patient request (by 2 percentage points). © The Author(s) 2014.

  9. Development of a Convergent Spray Technologies(tm) Spray Process for a Solventless Sprayable Coating, MCC-1

    NASA Technical Reports Server (NTRS)

    Patel, Anil K.; Meeks, C.

    1998-01-01

    This paper discusses the application of the Convergent Spray Technologies (TM) Spray Process to the development and successful implementation of Marshall Convergent Coating (MCC-1) as a primary Thermal Protection System (TPS) for the Space Shuttle Solid Rocket Boosters (SRBs). It describes the environmental and process benefits of the MCC-1 technology and shows the systematic steps taken in developing the technology, including statistical sensitivity studies of about 35 variables. Based on the process and post-flight successes on the SRB, the technology is shown to be "field-proven". Application of this technology to other aerospace and commercial programs is summarized to illustrate the wide range of possibilities.

  10. [Quality by design approaches for pharmaceutical development and manufacturing of Chinese medicine].

    PubMed

    Xu, Bing; Shi, Xin-Yuan; Wu, Zhi-Sheng; Zhang, Yan-Ling; Wang, Yun; Qiao, Yan-Jiang

    2017-03-01

    Pharmaceutical quality is built in by design, formed during the manufacturing process, and improved over the product's lifecycle. Based on a comprehensive literature review of pharmaceutical quality by design (QbD), the essential ideas and implementation strategies of pharmaceutical QbD were interpreted. Considering the complex nature of Chinese medicine, a "4H" model was proposed for implementing QbD in the pharmaceutical development and industrial manufacture of Chinese medicine products. "4H" is an acronym for holistic design, holistic information analysis, holistic quality control, and holistic process optimization, which is consistent with the holistic concept of Chinese medicine theory. Holistic design aims at constructing both the quality problem space, from patient requirements, and the quality solution space, from multidisciplinary knowledge. Holistic information analysis emphasizes understanding the quality pattern of Chinese medicine by integrating and mining multisource data and information at a relatively high level. Batch-to-batch quality consistency and manufacturing system reliability can be realized by the combined application of inspection-based, statistical, predictive, and intelligent quality control strategies. Holistic process optimization aims to improve product quality and process capability throughout product lifecycle management. Implementing QbD helps to eliminate contradictions in the pharmaceutical development and manufacturing ecosystem of Chinese medicine products and helps guarantee cost effectiveness. Copyright© by the Chinese Pharmaceutical Association.

  11. Increased Adoption of Quality Improvement Interventions to Implement Evidence-Based Practices for Pressure Ulcer Prevention in U.S. Academic Medical Centers.

    PubMed

    Padula, William V; Mishra, Manish K; Makic, Mary Beth F; Wald, Heidi L; Campbell, Jonathan D; Nair, Kavita V; Valuck, Robert J

    2015-12-01

    In 2008, the U.S. Centers for Medicare and Medicaid Services enacted a nonpayment policy for stage III and IV hospital-acquired pressure ulcers (HAPUs), which incentivized hospitals to improve prevention efforts. In response, hospitals looked for ways to support implementation of evidence-based practices for HAPU prevention, such as adoption of quality improvement (QI) interventions. The objective of this study was to quantify adoption patterns of QI interventions for supporting evidence-based practices for HAPU prevention. This study surveyed wound care specialists working at hospitals within the University HealthSystem Consortium. A questionnaire was used to retrospectively describe QI adoption patterns for 25 HAPU-specific QI interventions grouped into four domains: leadership, staff, information technology (IT), and performance and improvement. Respondents indicated QI interventions implemented between 2007 and 2012 to the nearest quarter and year. Descriptive statistics defined patterns of QI adoption. A t-test and statistical process control chart established a statistically significant increase in adoption following nonpayment policy enactment in October 2008. Increases are described in terms of scope (number of QI domains employed) and scale (number of QI interventions within domains). Fifty-three of the 55 hospitals surveyed reported implementing QI interventions for HAPU prevention. Leadership interventions were most frequent, increasing in scope from 40% to 63% between 2008 and 2012; "annual programs to promote pressure ulcer prevention" showed the greatest increase in scale. Staff interventions increased in scope from 32% to 53%; "frequent consult driven huddles" showed the greatest increase in scale. IT interventions increased in scope from 31% to 55%. Performance and improvement interventions increased in scope from 18% to 40%, with "new skin care products . . ." increasing the most. Academic medical centers increased adoption of QI interventions following changes in nonpayment policy. These QI interventions supported adherence to implementation of pressure ulcer prevention protocols. Changes in payment policies for prevention are effective in QI efforts. © 2015 Sigma Theta Tau International.
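
    The statistical process control chart mentioned here can be illustrated with an individuals (X) chart whose limits come from the average moving range; the sketch below flags out-of-control quarters in a hypothetical adoption series. The quarterly counts are invented for illustration and are not the study's data.

    ```python
    import numpy as np

    def individuals_chart(counts):
        """Individuals (X) control chart: center line and 3-sigma limits from moving ranges."""
        counts = np.asarray(counts, dtype=float)
        center = counts.mean()
        mr_bar = np.abs(np.diff(counts)).mean()        # average moving range
        sigma_hat = mr_bar / 1.128                     # d2 constant for subgroups of size 2
        ucl, lcl = center + 3 * sigma_hat, max(center - 3 * sigma_hat, 0.0)
        out_of_control = np.where((counts > ucl) | (counts < lcl))[0]
        return center, lcl, ucl, out_of_control

    # Hypothetical quarterly counts of newly adopted QI interventions (illustrative only)
    quarterly = [2, 3, 2, 1, 3, 2, 9, 11, 12, 10]      # jump after a policy change
    print(individuals_chart(quarterly))
    ```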

  12. Poisson-process generalization for the trading waiting-time distribution in a double-auction mechanism

    NASA Astrophysics Data System (ADS)

    Cincotti, Silvano; Ponta, Linda; Raberto, Marco; Scalas, Enrico

    2005-05-01

    In this paper, empirical analyses and computational experiments are presented on high-frequency data for a double-auction (book) market. The main objective of the paper is to generalize the order waiting-time process in order to properly model this empirical evidence. The empirical study is performed on the best bid and best ask data of 7 U.S. financial markets, for 30-stock time series. In particular, statistical properties of trading waiting times have been analyzed and the quality of fits is evaluated by suitable statistical tests, i.e., comparing empirical distributions with theoretical models. Starting from the statistical studies on real data, attention has been focused on the reproducibility of such results in an artificial market. The computational experiments have been performed within the Genoa Artificial Stock Market. In the market model, heterogeneous agents trade one risky asset in exchange for cash. Agents have zero intelligence and issue random limit or market orders depending on their budget constraints. The price is cleared by means of a limit order book. The order generation is modelled with a renewal process. Based on empirical trading estimation, the distribution of waiting times between two consecutive orders is modelled by a mixture of exponential processes. Results show that the empirical waiting-time distribution can be considered as a generalization of a Poisson process. Moreover, the renewal process can approximate real data, and its implementation in the artificial stock market can reproduce the trading activity in a realistic way.
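
    To see why a mixture of exponentials departs from a plain Poisson process, the sketch below simulates a two-component exponential mixture and checks that its waiting-time coefficient of variation exceeds 1, the value for exponential (Poissonian) waiting times. The mixture weights and rates are illustrative, not estimates from the market data.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Mixture of two exponential components (weights and mean waits are illustrative)
    w, mean_fast, mean_slow = 0.7, 2.0, 30.0            # seconds
    fast = rng.random(100_000) < w
    waits = np.where(fast,
                     rng.exponential(mean_fast, fast.size),
                     rng.exponential(mean_slow, fast.size))

    # For a Poisson process the waiting-time CV is 1; a mixture is over-dispersed (CV > 1).
    cv = waits.std() / waits.mean()
    print(f"mean wait = {waits.mean():.1f} s, coefficient of variation = {cv:.2f}")
    ```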

  13. An Improved Incremental Learning Approach for KPI Prognosis of Dynamic Fuel Cell System.

    PubMed

    Yin, Shen; Xie, Xiaochen; Lam, James; Cheung, Kie Chung; Gao, Huijun

    2016-12-01

    The key performance indicator (KPI) has an important practical value with respect to the product quality and economic benefits for modern industry. To cope with the KPI prognosis issue under nonlinear conditions, this paper presents an improved incremental learning approach based on available process measurements. The proposed approach takes advantage of the algorithm overlapping of locally weighted projection regression (LWPR) and partial least squares (PLS), implementing the PLS-based prognosis in each locally linear model produced by the incremental learning process of LWPR. The global prognosis results including KPI prediction and process monitoring are obtained from the corresponding normalized weighted means of all the local models. The statistical indicators for prognosis are enhanced as well by the design of novel KPI-related and KPI-unrelated statistics with suitable control limits for non-Gaussian data. For application-oriented purpose, the process measurements from real datasets of a proton exchange membrane fuel cell system are employed to demonstrate the effectiveness of KPI prognosis. The proposed approach is finally extended to a long-term voltage prediction for potential reference of further fuel cell applications.

  14. Online Denoising Based on the Second-Order Adaptive Statistics Model.

    PubMed

    Yi, Sheng-Lun; Jin, Xue-Bo; Su, Ting-Li; Tang, Zhen-Yun; Wang, Fa-Fa; Xiang, Na; Kong, Jian-Lei

    2017-07-20

    Online denoising is motivated by real-time applications in industrial processes, where the data must be utilizable soon after they are collected. Since the noise in practical processes is usually colored, it poses quite a challenge for denoising techniques. In this paper, a novel online denoising method was proposed to process practical measurement data with colored noise, and the characteristics of the colored noise were considered in the dynamic model via an adaptive parameter. The proposed method consists of two parts within a closed loop: the first estimates the system state based on the second-order adaptive statistics model, and the second updates the adaptive parameter in the model using the Yule-Walker algorithm. Specifically, the state estimation process was implemented recursively via the Kalman filter, so that online operation was attained. Experimental data from a reinforced concrete structure test were used to verify the effectiveness of the proposed method. Results show that the proposed method not only dealt with signals containing colored noise but also achieved a tradeoff between efficiency and accuracy.
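
    A stripped-down sketch of the two ingredients paired above, Yule-Walker estimation of an AR(1) coefficient and a recursive scalar Kalman filter that uses it, is given below. It is not the authors' second-order adaptive statistics model; the synthetic signal, noise levels, and filter variances are assumptions.

    ```python
    import numpy as np

    def yule_walker_ar1(x):
        """Order-1 Yule-Walker estimate: the AR coefficient equals the lag-1 autocorrelation."""
        x = np.asarray(x, float) - np.mean(x)
        return np.dot(x[:-1], x[1:]) / np.dot(x, x)

    def kalman_ar1(z, a, q, r):
        """Scalar Kalman filter for the model x_k = a*x_{k-1} + w_k, z_k = x_k + v_k."""
        x_hat, p, out = z[0], r, [z[0]]
        for zk in z[1:]:
            x_pred, p_pred = a * x_hat, a * a * p + q   # predict
            k = p_pred / (p_pred + r)                   # Kalman gain
            x_hat = x_pred + k * (zk - x_pred)          # correct with the new measurement
            p = (1 - k) * p_pred
            out.append(x_hat)
        return np.array(out)

    # Synthetic AR(1) (colored) signal observed in white measurement noise
    rng = np.random.default_rng(3)
    n, a_true = 2000, 0.9
    x = np.zeros(n)
    for i in range(1, n):
        x[i] = a_true * x[i - 1] + rng.normal(0, 0.5)
    z = x + rng.normal(0, 0.3, n)

    a_hat = yule_walker_ar1(z)        # adaptive parameter estimated from the noisy data
    smoothed = kalman_ar1(z, a=a_hat, q=0.25, r=0.09)
    print("estimated AR(1) coefficient:", round(a_hat, 2))
    ```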

  15. WASP (Write a Scientific Paper) using Excel - 1: Data entry and validation.

    PubMed

    Grech, Victor

    2018-02-01

    Data collection for the purposes of analysis, after the planning and execution of a research study, commences with data input and validation. The process of data entry and analysis may appear daunting to the uninitiated, but as pointed out in the 1970s in a series of papers by British Medical Journal Deputy Editor TDV Swinscow, modern hardware and software (he was then referring to the availability of hand calculators) permit the performance of statistical testing outside a computer laboratory. In this day and age, modern software, such as the ubiquitous and almost universally familiar Microsoft Excel™, greatly facilitates this process. This paper is the first of a collection that will emulate Swinscow's series, in his own words "addressed to readers who want to start at the beginning, not to those who are already skilled statisticians." These papers will have less focus on the actual arithmetic, and more emphasis on how to actually implement simple statistics, step by step, using Excel, thereby constituting the equivalent of Swinscow's papers in the personal computer age. Data entry can be facilitated by several underutilised features in Excel. This paper will explain Excel's little-known form function, data validation implementation at the input stage, simple coding tips and data cleaning tools. Copyright © 2018 Elsevier B.V. All rights reserved.

  16. The European Southern Observatory-MIDAS table file system

    NASA Technical Reports Server (NTRS)

    Peron, M.; Grosbol, P.

    1992-01-01

    The new and substantially upgraded version of the Table File System in MIDAS is presented as a scientific database system. MIDAS applications for performing database operations on tables are discussed, for instance, the exchange of data to and from the TFS, the selection of objects, the uncertainty joins across tables, and the graphical representation of data. This upgraded version of the TFS is a full implementation of the binary table extension of the FITS format; in addition, it also supports arrays of strings. Different storage strategies for optimal access of very large data sets are implemented and are addressed in detail. As a simple relational database, the TFS may be used for the management of personal data files. This opens the way to intelligent pipeline processing of large amounts of data. One of the key features of the Table File System is that it also provides an extensive set of tools for the analysis of the final results of a reduction process. Column operations using standard and special mathematical functions as well as statistical distributions can be carried out; commands for linear regression and model fitting using nonlinear least-squares methods and user-defined functions are available. Finally, statistical hypothesis tests and multivariate methods can also operate on tables.

  17. Application of quality by design principles to the development and technology transfer of a major process improvement for the manufacture of a recombinant protein.

    PubMed

    Looby, Mairead; Ibarra, Neysi; Pierce, James J; Buckley, Kevin; O'Donovan, Eimear; Heenan, Mary; Moran, Enda; Farid, Suzanne S; Baganz, Frank

    2011-01-01

    This study describes the application of quality by design (QbD) principles to the development and implementation of a major manufacturing process improvement for a commercially distributed therapeutic protein produced in Chinese hamster ovary cell culture. The intent of this article is to focus on QbD concepts, and provide guidance and understanding on how the various components combine together to deliver a robust process in keeping with the principles of QbD. A fed-batch production culture and a virus inactivation step are described as representative examples of upstream and downstream unit operations that were characterized. A systematic approach incorporating QbD principles was applied to both unit operations, involving risk assessment of potential process failure points, small-scale model qualification, design and execution of experiments, definition of operating parameter ranges and process validation acceptance criteria followed by manufacturing-scale implementation and process validation. Statistical experimental designs were applied to the execution of process characterization studies evaluating the impact of operating parameters on product quality attributes and process performance parameters. Data from process characterization experiments were used to define the proven acceptable range and classification of operating parameters for each unit operation. Analysis of variance and Monte Carlo simulation methods were used to assess the appropriateness of process design spaces. Successful implementation and validation of the process in the manufacturing facility and the subsequent manufacture of hundreds of batches of this therapeutic protein verifies the approaches taken as a suitable model for the development, scale-up and operation of any biopharmaceutical manufacturing process. Copyright © 2011 American Institute of Chemical Engineers (AIChE).

  18. Load Balancing Using Time Series Analysis for Soft Real Time Systems with Statistically Periodic Loads

    NASA Technical Reports Server (NTRS)

    Hailperin, M.

    1993-01-01

    This thesis provides design and analysis of techniques for global load balancing on ensemble architectures running soft-real-time object-oriented applications with statistically periodic loads. It focuses on estimating the instantaneous average load over all the processing elements. The major contribution is the use of explicit stochastic process models for both the loading and the averaging itself. These models are exploited via statistical time-series analysis and Bayesian inference to provide improved average load estimates, and thus to facilitate global load balancing. This thesis explains the distributed algorithms used and provides some optimality results. It also describes the algorithms' implementation and gives performance results from simulation. These results show that the authors' techniques allow more accurate estimation of the global system loading, resulting in fewer object migrations than local methods. The authors' method is shown to provide superior performance, relative not only to static load-balancing schemes but also to many adaptive load-balancing methods. Results from a preliminary analysis of another system and from simulation with a synthetic load provide some evidence of more general applicability.

  19. permGPU: Using graphics processing units in RNA microarray association studies.

    PubMed

    Shterev, Ivo D; Jung, Sin-Ho; George, Stephen L; Owzar, Kouros

    2010-06-16

    Many analyses of microarray association studies involve permutation, bootstrap resampling and cross-validation, that are ideally formulated as embarrassingly parallel computing problems. Given that these analyses are computationally intensive, scalable approaches that can take advantage of multi-core processor systems need to be developed. We have developed a CUDA based implementation, permGPU, that employs graphics processing units in microarray association studies. We illustrate the performance and applicability of permGPU within the context of permutation resampling for a number of test statistics. An extensive simulation study demonstrates a dramatic increase in performance when using permGPU on an NVIDIA GTX 280 card compared to an optimized C/C++ solution running on a conventional Linux server. permGPU is available as an open-source stand-alone application and as an extension package for the R statistical environment. It provides a dramatic increase in performance for permutation resampling analysis in the context of microarray association studies. The current version offers six test statistics for carrying out permutation resampling analyses for binary, quantitative and censored time-to-event traits.
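
    A CPU-side analogue of the permutation resampling that permGPU accelerates might look like the sketch below: a two-sample permutation test on the difference in means over simulated expression values for a single gene. The test statistic and data are illustrative; the package's own statistics and GPU kernels are not shown.

    ```python
    import numpy as np

    def perm_pvalue(x, y, n_perm=10_000, seed=0):
        """Two-sample permutation test on the difference in means (CPU analogue)."""
        rng = np.random.default_rng(seed)
        data = np.concatenate([x, y])
        observed = x.mean() - y.mean()
        count = 0
        for _ in range(n_perm):
            perm = rng.permutation(data)                      # relabel samples at random
            diff = perm[:x.size].mean() - perm[x.size:].mean()
            count += abs(diff) >= abs(observed)
        return (count + 1) / (n_perm + 1)                     # add-one correction

    rng = np.random.default_rng(1)
    tumor, normal = rng.normal(1.3, 1.0, 40), rng.normal(1.0, 1.0, 40)
    print("permutation p-value:", perm_pvalue(tumor, normal))
    ```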

  20. A rational approach to legacy data validation when transitioning between electronic health record systems.

    PubMed

    Pageler, Natalie M; Grazier G'Sell, Max Jacob; Chandler, Warren; Mailes, Emily; Yang, Christine; Longhurst, Christopher A

    2016-09-01

    The objective of this project was to use statistical techniques to determine the completeness and accuracy of data migrated during electronic health record conversion. Data validation during migration consists of mapped record testing and validation of a sample of the data for completeness and accuracy. We statistically determined a randomized sample size for each data type based on the desired confidence level and error limits. The only error identified in the post go-live period was a failure to migrate some clinical notes, which was unrelated to the validation process. No errors in the migrated data were found during the 12-month post-implementation period. Compared to the typical industry approach, we have demonstrated that a statistical approach to sample size for data validation can ensure consistent confidence levels while maximizing efficiency of the validation process during a major electronic health record conversion. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
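
    One way such a randomized validation sample size could be derived, assuming a normal approximation for an error proportion at a chosen confidence level and error limit, is sketched below; the article does not state its exact formula, so this is an illustrative stand-in.

    ```python
    from math import ceil
    from statistics import NormalDist

    def validation_sample_size(confidence=0.95, error_limit=0.02, p=0.5):
        """Records to sample so the estimated error rate is within +/- error_limit
        at the given confidence (normal approximation; p = 0.5 is the worst case)."""
        z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
        return ceil(z * z * p * (1 - p) / error_limit ** 2)

    print(validation_sample_size())             # ~2401 records per data type
    print(validation_sample_size(0.99, 0.05))   # higher confidence, looser error limit
    ```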

  1. Streamstats: U.S. Geological Survey Web Application for Streamflow Statistics for Connecticut

    USGS Publications Warehouse

    Ahearn, Elizabeth A.; Ries, Kernell G.; Steeves, Peter A.

    2006-01-01

    Introduction An important mission of the U. S. Geological Survey (USGS) is to provide information on streamflow in the Nation's rivers. Streamflow statistics are used by water managers, engineers, scientists, and others to protect people and property during floods and droughts, and to manage land, water, and biological resources. Common uses for streamflow statistics include dam, bridge, and culvert design; water-supply planning and management; water-use appropriations and permitting; wastewater and industrial discharge permitting; hydropower-facility design and regulation; and flood-plain mapping for establishing flood-insurance rates and land-use zones. In an effort to improve access to published streamflow statistics, and to make the process of computing streamflow statistics for ungaged stream sites easier, more accurate, and more consistent, the USGS and the Environmental Systems Research Institute, Inc. (ESRI) developed StreamStats (Ries and others, 2004). StreamStats is a Geographic Information System (GIS)-based Web application for serving previously published streamflow statistics and basin characteristics for USGS data-collection stations, and computing streamflow statistics and basin characteristics for ungaged stream sites. The USGS, in cooperation with the Connecticut Department of Environmental Protection and the Connecticut Department of Transportation, has implemented StreamStats for Connecticut.

  2. Proficiency Testing for Determination of Water Content in Toluene of Chemical Reagents by iteration robust statistic technique

    NASA Astrophysics Data System (ADS)

    Wang, Hao; Wang, Qunwei; He, Ming

    2018-05-01

    In order to investigate and improve domestic laboratories' ability to determine water content in liquid chemical reagents, proficiency testing provider PT0031 (CNAS) organized a proficiency testing program for water content in toluene; 48 laboratories from 18 provinces/cities/municipalities took part in the PT. This paper introduces the implementation process of the proficiency test for determination of water content in toluene, including sample preparation, homogeneity and stability testing, and the statistical analysis of results using the iterative robust statistic technique. It also summarizes and analyzes the results obtained with the different test standards widely used by the laboratories and puts forward technical suggestions for improving the quality of water-content testing. Satisfactory results were obtained by 43 laboratories, amounting to 89.6% of the total participating laboratories.
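
    The iterative robust technique is not spelled out in the abstract; the sketch below implements a winsorizing iteration in the spirit of ISO 13528 Algorithm A to obtain a robust assigned value and robust standard deviation from hypothetical laboratory results.

    ```python
    import numpy as np

    def robust_mean_sd(x, n_iter=20):
        """Iterative robust mean and SD (in the spirit of ISO 13528 Algorithm A)."""
        x = np.asarray(x, dtype=float)
        x_star = np.median(x)
        s_star = 1.483 * np.median(np.abs(x - x_star))
        for _ in range(n_iter):
            delta = 1.5 * s_star
            clipped = np.clip(x, x_star - delta, x_star + delta)  # winsorize extreme results
            x_star = clipped.mean()
            s_star = 1.134 * clipped.std(ddof=1)
        return x_star, s_star

    # Hypothetical water-content results (mg/kg) reported by participating laboratories
    results = [102, 98, 101, 100, 97, 103, 99, 100, 135, 61]      # two outliers
    print(robust_mean_sd(results))
    ```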

  3. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
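
    As a small illustration of the kind of statistical control rules chosen when planning IQC, the sketch below checks control measurements against the 1-3s and 2-2s rules only; the target value, SD, rule subset, and data are assumptions, not a full multirule implementation.

    ```python
    import numpy as np

    def control_rule_flags(values, target, sd):
        """Flag runs violating two common control rules: 1-3s and 2-2s (illustrative subset)."""
        z = (np.asarray(values, dtype=float) - target) / sd
        flags = []
        for i in range(len(z)):
            if abs(z[i]) > 3:
                flags.append((i, "1-3s"))                   # one point beyond 3 SD
            if i >= 1 and min(z[i], z[i - 1]) > 2:
                flags.append((i, "2-2s"))                   # two consecutive points beyond +2 SD
            if i >= 1 and max(z[i], z[i - 1]) < -2:
                flags.append((i, "2-2s"))                   # two consecutive points beyond -2 SD
        return flags

    # Daily control measurements for an analyte with target 5.0 and SD 0.2 (made up)
    qc = [5.1, 4.9, 5.0, 5.45, 5.5, 4.3, 5.0]
    print(control_rule_flags(qc, target=5.0, sd=0.2))
    ```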

  4. Mapping sea ice leads with a coupled numeric/symbolic system

    NASA Technical Reports Server (NTRS)

    Key, J.; Schweiger, A. J.; Maslanik, J. A.

    1990-01-01

    A method is presented which facilitates the detection and delineation of leads with single-channel Landsat data by coupling numeric and symbolic procedures. The procedure consists of three steps: (1) using the dynamic threshold method, an image is mapped to a lead/no lead binary image; (2) the likelihood that fragments are real leads is examined with a set of numeric rules; and (3) pairs of objects are examined geometrically and merged where possible. The processing ends when all fragments are merged and their statistical characteristics are determined, leaving a map of valid lead objects that summarizes useful physical information about the lead complexes. Direct implementation of domain knowledge and rapid prototyping are two benefits of the rule-based system. The approach is found to be more successfully applied to mid- and high-level processing, and the system can retrieve statistics about sea-ice leads as well as detect the leads.

  5. Treatment of automotive industry oily wastewater by electrocoagulation: statistical optimization of the operational parameters.

    PubMed

    GilPavas, Edison; Molina-Tirado, Kevin; Gómez-García, Miguel Angel

    2009-01-01

    An electrocoagulation process was used for the treatment of oily wastewater generated from an automotive industry in Medellín (Colombia). An electrochemical cell consisting of four parallel electrodes (Fe and Al) in bipolar configuration was implemented. A multifactorial experimental design was used for evaluating the influence of several parameters, including the type and arrangement of electrodes, pH, and current density. Oil and grease removal was defined as the response variable for the statistical analysis. Additionally, the BOD(5), COD, and TOC were monitored during the treatment process. According to the results, at the optimum parameter values (current density = 4.3 mA/cm(2), distance between electrodes = 1.5 cm, Fe as anode, and pH = 12) it was possible to reach approximately 95% oil and grease removal, with COD removal and mineralization of 87.4% and 70.6%, respectively. A final biodegradability (BOD(5)/COD) of 0.54 was reached.
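
    A hedged sketch of the kind of multifactorial (two-level full factorial) design and main-effect fit described here is shown below, using coded factors and invented removal percentages; it is not the study's actual design matrix or data.

    ```python
    import itertools
    import numpy as np

    # Two-level full factorial in three coded factors (-1/+1): electrode material,
    # current density, and pH. Responses are made-up oil-removal percentages.
    design = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)
    removal = np.array([62, 70, 65, 74, 80, 88, 83, 95], dtype=float)

    # Fit y = b0 + b1*x1 + b2*x2 + b3*x3 (main effects only) by least squares
    X = np.column_stack([np.ones(len(design)), design])
    coef, *_ = np.linalg.lstsq(X, removal, rcond=None)
    print("intercept and main effects:", np.round(coef, 2))
    ```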

  6. LES ARM Symbiotic Simulation and Observation (LASSO) Implementation Strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gustafson Jr., WI; Vogelmann, AM

    2015-09-01

    This document illustrates the design of the Large-Eddy Simulation (LES) ARM Symbiotic Simulation and Observation (LASSO) workflow to provide a routine, high-resolution modeling capability to augment the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Climate Research Facility’s high-density observations. LASSO will create a powerful new capability for furthering ARM’s mission to advance understanding of cloud, radiation, aerosol, and land-surface processes. The combined observational and modeling elements will enable a new level of scientific inquiry by connecting processes and context to observations and providing needed statistics for details that cannot be measured. The result will be improved process understanding that facilitates concomitant improvements in climate model parameterizations. The initial LASSO implementation will be for ARM’s Southern Great Plains site in Oklahoma and will focus on shallow convection, which is poorly simulated by climate models due in part to clouds’ typically small spatial scale compared to model grid spacing, and because the convection involves complicated interactions of microphysical and boundary layer processes.

  7. Evaluation of the implementation of an integrated program for musculoskeletal system care.

    PubMed

    Larrañaga, Igor; Soto-Gordoa, Myriam; Arrospide, Arantzazu; Jauregi, María Luz; Millas, Jesús; San Vicente, Ricardo; Aguirrebeña, Jabier; Mar, Javier

    The chronic nature of musculoskeletal diseases requires integrated care involving Primary Care and the specialities of Rheumatology, Traumatology and Rehabilitation. The aim of this study was to assess the implementation of an integrated organizational model for osteoporosis, low back pain, shoulder disease and knee disease using Deming's continuous improvement process and considering referrals and resource consumption. A simulation model was used in the planning stage to predict the evolution of musculoskeletal disease resource consumption and to carry out a Budget Impact Analysis from 2012 to 2020 in the Goierri-Alto Urola region. In the checking stage, the status of the process in 2014 was evaluated using statistical analysis to check the degree of achievement of the objectives for each speciality. Simulation models showed that the population with musculoskeletal disease in Goierri-Alto Urola will increase by 4.4% by 2020. Because of that, the expenses of a conventional healthcare system will have increased by 5.9%. However, if the intervention reaches its objectives, the budget would decrease by 8.5%. The statistical analysis evidenced a decline in referrals to the Traumatology service and a reduction of successive consultations in all specialities. The implementation of the integrated organizational model for osteoporosis, low back pain, shoulder disease and knee disease is still at an early stage. However, the empowerment of Primary Care has improved patient referrals and reduced costs. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Reumatología y Colegio Mexicano de Reumatología. All rights reserved.

  8. Continuous Evaluation of Fast Processes in Climate Models Using ARM Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhijin; Sha, Feng; Liu, Yangang

    2016-02-02

    This five-year award supports the project “Continuous Evaluation of Fast Processes in Climate Models Using ARM Measurements (FASTER)”. The goal of this project is to produce accurate, consistent and comprehensive data sets for initializing both single column models (SCMs) and cloud resolving models (CRMs) using data assimilation. A multi-scale three-dimensional variational data assimilation scheme (MS-3DVAR) has been implemented. This MS-3DVAR system is built on top of WRF/GSI. The Community Gridpoint Statistical Interpolation (GSI) system is an operational data assimilation system at the National Centers for Environmental Prediction (NCEP) and has been implemented in the Weather Research and Forecast (WRF) model. This MS-3DVAR is further enhanced by the incorporation of a land surface 3DVAR scheme and a comprehensive aerosol 3DVAR scheme. The data assimilation implementation focuses on the ARM SGP region. ARM measurements are assimilated along with other available satellite and radar data. Reanalyses are then generated for a few selected periods of time. This comprehensive data assimilation system has also been employed for other ARM-related applications.

  9. Automated Monitoring with a BSP Fault-Detection Test

    NASA Technical Reports Server (NTRS)

    Bickford, Randall L.; Herzog, James P.

    2003-01-01

    The figure schematically illustrates a method and procedure for automated monitoring of an asset, as well as a hardware- and-software system that implements the method and procedure. As used here, asset could signify an industrial process, power plant, medical instrument, aircraft, or any of a variety of other systems that generate electronic signals (e.g., sensor outputs). In automated monitoring, the signals are digitized and then processed in order to detect faults and otherwise monitor operational status and integrity of the monitored asset. The major distinguishing feature of the present method is that the fault-detection function is implemented by use of a Bayesian sequential probability (BSP) technique. This technique is superior to other techniques for automated monitoring because it affords sensitivity, not only to disturbances in the mean values, but also to very subtle changes in the statistical characteristics (variance, skewness, and bias) of the monitored signals.
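
    The Bayesian sequential probability technique is closely related to Wald's sequential probability ratio test; as a loose illustration only, the sketch below applies an SPRT for a Gaussian mean shift to synthetic sensor residuals, with the hypothesized means, noise level, and error rates chosen arbitrarily. In continuous monitoring the statistic would be reset after each decision.

    ```python
    import numpy as np

    def sprt_mean_shift(samples, mu0=0.0, mu1=1.0, sigma=1.0, alpha=0.01, beta=0.01):
        """Wald-style sequential test: decide 'nominal' (H0) or 'fault' (H1) for a mean shift."""
        upper = np.log((1 - beta) / alpha)     # accept H1 (fault) when exceeded
        lower = np.log(beta / (1 - alpha))     # accept H0 (nominal) when undercut
        llr = 0.0
        for k, x in enumerate(samples, start=1):
            llr += (mu1 - mu0) * (x - (mu0 + mu1) / 2) / sigma ** 2   # Gaussian log-likelihood ratio
            if llr >= upper:
                return "fault", k
            if llr <= lower:
                return "nominal", k
        return "undecided", len(samples)

    rng = np.random.default_rng(7)
    healthy = rng.normal(0.0, 1.0, 50)         # sensor residuals before a fault
    faulty = rng.normal(1.0, 1.0, 50)          # residuals after a subtle mean shift
    print(sprt_mean_shift(healthy))            # -> ('nominal', k) after a few samples
    print(sprt_mean_shift(faulty))             # -> ('fault', k) after a few samples
    ```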

  10. Evaluation of a parallel implementation of the learning portion of the backward error propagation neural network: experiments in artifact identification.

    PubMed Central

    Sittig, D. F.; Orr, J. A.

    1991-01-01

    Various methods have been proposed in an attempt to solve problems in artifact and/or alarm identification including expert systems, statistical signal processing techniques, and artificial neural networks (ANN). ANNs consist of a large number of simple processing units connected by weighted links. To develop truly robust ANNs, investigators are required to train their networks on huge training data sets, requiring enormous computing power. We implemented a parallel version of the backward error propagation neural network training algorithm in the widely portable parallel programming language C-Linda. A maximum speedup of 4.06 was obtained with six processors. This speedup represents a reduction in total run-time from approximately 6.4 hours to 1.5 hours. We conclude that use of the master-worker model of parallel computation is an excellent method for obtaining speedups in the backward error propagation neural network training algorithm. PMID:1807607

  11. Potential Application of a Graphical Processing Unit to Parallel Computations in the NUBEAM Code

    NASA Astrophysics Data System (ADS)

    Payne, J.; McCune, D.; Prater, R.

    2010-11-01

    NUBEAM is a comprehensive computational Monte Carlo based model for neutral beam injection (NBI) in tokamaks. NUBEAM computes NBI-relevant profiles in tokamak plasmas by tracking the deposition and the slowing of fast ions. At the core of NUBEAM are vector calculations used to track fast ions. These calculations have recently been parallelized to run on MPI clusters. However, cost and interlink bandwidth limit the ability to fully parallelize NUBEAM on an MPI cluster. Recent implementation of double precision capabilities for Graphical Processing Units (GPUs) presents a cost effective and high performance alternative or complement to MPI computation. Commercially available graphics cards can achieve up to 672 GFLOPS double precision and can handle hundreds of thousands of threads. The ability to execute at least one thread per particle simultaneously could significantly reduce the execution time and the statistical noise of NUBEAM. Progress on implementation on a GPU will be presented.

  12. Robust matching for voice recognition

    NASA Astrophysics Data System (ADS)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

    This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.

  13. Risk management for moisture related effects in dry manufacturing processes: a statistical approach.

    PubMed

    Quiroz, Jorge; Strong, John; Zhang, Lanju

    2016-03-01

    A risk- and science-based approach to controlling quality in pharmaceutical manufacturing includes a full understanding of how product attributes and process parameters relate to product performance through a proactive approach in formulation and process development. For dry manufacturing, where moisture content is not directly manipulated within the process, the variability in moisture of the incoming raw materials can impact both the processability and drug product quality attributes. A statistical approach is developed using individual raw material historical lots as a basis for the calculation of tolerance intervals for drug product moisture content so that risks associated with excursions in moisture content can be mitigated. The proposed method is based on a model-independent approach that uses available data to estimate parameters of interest that describe the population of blend moisture content values and which do not require knowledge of the individual blend moisture content values. Another advantage of the proposed tolerance intervals is that it does not require the use of tabulated values for tolerance factors. This facilitates implementation in any spreadsheet program, such as Microsoft Excel. A computational example is used to demonstrate the proposed method.
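
    One distribution-free way to set tolerance limits without tabulated tolerance factors, possibly in the spirit of (though not necessarily identical to) the proposed method, uses the order statistics of historical lots; the sketch below computes how many lots are needed so that the interval [min, max] covers 95% of the moisture distribution with 95% confidence. SciPy is assumed to be available.

    ```python
    from scipy.stats import beta

    def ti_confidence(n, p=0.95):
        """Confidence that [min, max] of n lots covers at least a proportion p of the
        underlying moisture distribution (distribution-free, order-statistic interval)."""
        # The coverage of [x_(1), x_(n)] follows a Beta(n - 1, 2) distribution.
        return beta.sf(p, n - 1, 2)

    def lots_required(p=0.95, confidence=0.95):
        n = 2
        while ti_confidence(n, p) < confidence:
            n += 1
        return n

    print(lots_required(p=0.95, confidence=0.95))   # -> 93 historical lots
    ```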

  14. Changing personnel behavior to promote quality care practices in an intensive care unit

    PubMed Central

    Cooper, Dominic; Farmery, Keith; Johnson, Martin; Harper, Christine; Clarke, Fiona L; Holton, Phillip; Wilson, Susan; Rayson, Paul; Bence, Hugh

    2005-01-01

    The delivery of safe, high-quality patient care is a major issue in clinical settings. However, the implementation of evidence-based practice and educational interventions are not always effective at improving performance. A staff-led behavioral management process was implemented in a large single-site acute (secondary and tertiary) hospital in the North of England for 26 weeks. A quasi-experimental, repeated-measures, within-groups design was used. Measurement focused on quality care behaviors (i.e., documentation, charting, hand washing). The results demonstrate the efficacy of a staff-led behavioral management approach for improving quality-care practices. Significant behavioral change (F [6, 19] = 5.37, p < 0.01) was observed. Correspondingly, statistically significant (t-test [t] = 3.49, df = 25, p < 0.01) reductions in methicillin-resistant Staphylococcus aureus (MRSA) were obtained. Discussion focuses on implementation issues. PMID:18360574

  15. Facilitating the Feedback Process on a Clinical Clerkship Using a Smartphone Application.

    PubMed

    Joshi, Aditya; Generalla, Jenilee; Thompson, Britta; Haidet, Paul

    2017-10-01

    This pilot study evaluated the effects of a smartphone-triggered method of feedback delivery on students' perceptions of the feedback process. An interactive electronic feedback form was made available to students through a smartphone app. Students were asked to evaluate various aspects of the feedback process. Responses from a previous year served as a control. In the first three quarters of academic year 2014-2015 (pre-implementation), only 65% of responders reported receiving oral feedback and 40% reported receiving written feedback. During the pilot phase (transition), these increased to 80% for both forms. Following full implementation in academic year 2015-2016 (post-implementation), 97% reported receiving oral feedback, and 92% reported receiving written feedback. A statistically significant difference was noted pre- to post-implementation for both oral and written feedback (p < 0.01). A significant increase from pre-implementation to transition was noted for written feedback (p < 0.01) and from transition to post-implementation for oral feedback (p < 0.01). Ninety-one percent and 94% of responders reported ease of access and timeliness of the feedback, respectively; 75% perceived the quality of the feedback to be good to excellent; 64% felt that receiving feedback via the app improved their performance; and 69% rated this feedback method as better than other methods. Students acknowledged the facilitation of conversation with supervisors and the convenience of receiving feedback, as well as the promptness with which feedback was provided. The use of a drop-down menu was thought to limit the scope of conversation. These data point to the effectiveness of this method in cueing supervisors to provide feedback to students.

  16. An investigation into pilot and system response to critical in-flight events. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Rockwell, T. H.; Griffin, W. C.

    1981-01-01

    Critical in-flight events (CIFE) that threaten the aircraft were studied. The scope of the CIFE was described and defined with emphasis on characterizing event development, detection, and assessment; pilot information requirements, sources, acquisition, and interpretation; pilot response options; decision processes; and decision implementation and event outcome. Detailed scenarios were developed for use in simulators and in paper-and-pencil testing, both for developing relationships between pilot performance and background information and for an analysis of pilot reaction, decision, and feedback processes. Statistical relationships among pilot characteristics and observed responses to CIFEs were developed.

  17. Statistical Techniques For Real-time Anomaly Detection Using Spark Over Multi-source VMware Performance Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Solaimani, Mohiuddin; Iftekhar, Mohammed; Khan, Latifur

    Anomaly detection refers to the identification of an irregular or unusual pattern which deviates from what is standard, normal, or expected. Such deviated patterns typically correspond to samples of interest and are assigned different labels in different domains, such as outliers, anomalies, exceptions, or malware. Detecting anomalies in fast, voluminous streams of data is a formidable challenge. This paper presents a novel, generic, real-time distributed anomaly detection framework for heterogeneous streaming data where anomalies appear as a group. We have developed a distributed statistical approach to build a model and later use it to detect anomalies. As a case study, we investigate group anomaly detection for a VMware-based cloud data center, which maintains a large number of virtual machines (VMs). We have built our framework using Apache Spark to get higher throughput and lower data processing time on streaming data. We have developed a window-based statistical anomaly detection technique to detect anomalies that appear sporadically. We then relaxed this constraint with higher accuracy by implementing a cluster-based technique to detect sporadic and continuous anomalies. We conclude that our cluster-based technique outperforms other statistical techniques with higher accuracy and lower processing time.
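    The window-based statistical idea can be illustrated without the Spark machinery: build a running model (mean and standard deviation) from a sliding window and flag samples that deviate strongly. The sketch below is a minimal stand-in; the window size, threshold, and synthetic metric are assumptions.

```python
# Minimal sketch of window-based statistical anomaly detection; the
# distributed Spark framework of the paper is omitted.
from collections import deque
import math

def detect(stream, window=50, z_thresh=4.0):
    buf, flags = deque(maxlen=window), []
    for x in stream:
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((v - mean) ** 2 for v in buf) / (window - 1)
            z = abs(x - mean) / math.sqrt(var) if var > 0 else 0.0
            flags.append(z > z_thresh)
        else:
            flags.append(False)       # not enough history yet
        buf.append(x)
    return flags

cpu_load = [0.3] * 100 + [0.95] * 5 + [0.3] * 100     # synthetic VM metric with a burst
print(sum(detect(cpu_load)), "anomalous samples flagged")
```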

  18. The writer independent online handwriting recognition system frog on hand and cluster generative statistical dynamic time warping.

    PubMed

    Bahlmann, Claus; Burkhardt, Hans

    2004-03-01

    In this paper, we give a comprehensive description of our writer-independent online handwriting recognition system frog on hand. The focus of this work concerns the presentation of the classification/training approach, which we call cluster generative statistical dynamic time warping (CSDTW). CSDTW is a general, scalable, HMM-based method for variable-sized, sequential data that holistically combines cluster analysis and statistical sequence modeling. It can handle general classification problems that rely on this sequential type of data, e.g., speech recognition, genome processing, robotics, etc. Contrary to previous attempts, clustering and statistical sequence modeling are embedded in a single feature space and use a closely related distance measure. We show character recognition experiments of frog on hand using CSDTW on the UNIPEN online handwriting database. The recognition accuracy is significantly higher than reported results of other handwriting recognition systems. Finally, we describe the real-time implementation of frog on hand on a Linux Compaq iPAQ embedded device.
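    The alignment core that CSDTW builds on is ordinary dynamic time warping between variable-length feature sequences. The sketch below shows only that building block; the cluster-generative statistical modeling of the paper is not reproduced, and the toy pen trajectories are made up.

```python
# Plain dynamic time warping (DTW) distance between two feature sequences,
# the alignment primitive underlying CSDTW (statistical modeling omitted).
import numpy as np

def dtw_distance(a, b):
    """DTW distance between sequences of feature vectors, shapes (n, d) and (m, d)."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Two toy pen trajectories (x, y samples) of different lengths:
s1 = np.array([[0, 0], [1, 1], [2, 2], [3, 2], [4, 1]], dtype=float)
s2 = np.array([[0, 0], [1, 1], [1, 1], [2, 2], [3, 2], [4, 1], [4, 1]], dtype=float)
print(dtw_distance(s1, s2))
```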

  19. Testing evidence routine practice: Using an implementation framework to embed a clinically proven asthma service in Australian community pharmacy.

    PubMed

    Fuller, Joanne M; Saini, Bandana; Bosnic-Anticevich, Sinthia; Garcia Cardenas, Victoria; Benrimoj, Shalom I; Armour, Carol

    Community pharmacists are well placed, and evidence clearly demonstrates that they can be suitably trained to deliver professional services that improve the management of asthma patients in clinical, economic and humanistic terms. However, the gap between this evidence and practice reality remains wide. In this study, we measure the implementation process as well as the service benefits of an asthma service model. Using an effectiveness-implementation hybrid design, a defined implementation process (progression from Exploration through Preparation and Testing to Operation stages) supporting an asthma service (promoting asthma control and inhaler technique) was tested in 17 community pharmacies across metropolitan Sydney. Seven pharmacies reached the Operation stage of implementation. Eight pharmacies reached the Testing stage of implementation, and two pharmacies did not progress beyond the Preparation stage of implementation. A total of 128 patients were enrolled in the asthma service, with 110 patients remaining enrolled at the close of the study. Asthma control showed a positive trend throughout the service, with the overall proportion of patients with 'poor' asthma control at baseline decreasing from 72% to 57% at study close. There was a statistically significant increase in the proportion of patients with correct inhaler technique from 12% at Baseline (Visit 1) to 33% at Visit 2 and 57% at study close. Implementation of the asthma service varied across pharmacies. Different strategies specific to practice sites at different stages of the implementation model may result in greater uptake of professional services. The asthma service led to improved patient outcomes overall, with a positive trend in asthma control and a significant change in inhaler technique. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Implementation Processes and Pay for Performance in Healthcare: A Systematic Review.

    PubMed

    Kondo, Karli K; Damberg, Cheryl L; Mendelson, Aaron; Motu'apuaka, Makalapua; Freeman, Michele; O'Neil, Maya; Relevo, Rose; Low, Allison; Kansagara, Devan

    2016-04-01

    Over the last decade, various pay-for-performance (P4P) programs have been implemented to improve quality in health systems, including the VHA. P4P programs are complex, and their effects may vary by design, context, and other implementation processes. We conducted a systematic review and key informant (KI) interviews to better understand the implementation factors that modify the effectiveness of P4P. We searched PubMed, PsycINFO, and CINAHL through April 2014, and reviewed reference lists. We included trials and observational studies of P4P implementation. Two investigators abstracted data and assessed study quality. We interviewed P4P researchers to gain further insight. Among 1363 titles and abstracts, we selected 509 for full-text review, and included 41 primary studies. Of these 41 studies, 33 examined P4P programs in ambulatory settings, 7 targeted hospitals, and 1 study applied to nursing homes. Related to implementation, 13 studies examined program design, 8 examined implementation processes, 6 the outer setting, 18 the inner setting, and 5 provider characteristics. Results suggest the importance of considering underlying payment models and using statistically stringent methods of composite measure development, and ensuring that high-quality care will be maintained after incentive removal. We found no conclusive evidence that provider or practice characteristics relate to P4P effectiveness. Interviews with 14 KIs supported limited evidence that effective P4P program measures should be aligned with organizational goals, that incentive structures should be carefully considered, and that factors such as a strong infrastructure and public reporting may have a large influence. There is limited evidence from which to draw firm conclusions related to P4P implementation. Findings from studies and KI interviews suggest that P4P programs should undergo regular evaluation and should target areas of poor performance. Additionally, measures and incentives should align with organizational priorities, and programs should allow for changes over time in response to data and provider input.

  1. Circuit model for single-energy-level trap centers in FETs

    NASA Astrophysics Data System (ADS)

    Albahrani, Sayed Ali; Parker, Anthony; Heimlich, Michael

    2016-12-01

    A circuit implementation of a single-energy-level trap center in an FET is presented. When included in transistor models, it explains the temperature- and potential-dependent time constants seen in the circuit manifestations of charge trapping, namely gate lag and drain overshoot. The implementation is suitable for both time-domain and harmonic-balance simulations. The proposed model is based on the Shockley-Read-Hall (SRH) statistics of the trapping process. The results of isothermal pulse measurements performed on a GaN HEMT are presented. These measurements allow charge trapping to be characterized in isolation from the effect of self-heating. The results are used to obtain the parameters of the proposed model.
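    For a single level, the SRH statistics underlying the model reduce to the occupancy rate equation df/dt = c_n*(1 - f) - e_n*f with capture and emission rates c_n and e_n. The sketch below integrates this equation with illustrative (not fitted) parameters; it is not the circuit implementation of the paper.

```python
# Hedged sketch: SRH occupancy rate equation for a single trap level,
# integrated with a simple Euler step. Parameter values are illustrative,
# not fitted GaN HEMT data.
import math

K_B = 8.617e-5                                               # Boltzmann constant (eV/K)

def srh_occupancy(T=300.0, n=1e17, sigma=1e-15, v_th=1e7, N_c=4.3e18, dE=0.5):
    c_n = sigma * v_th * n                                   # capture rate (1/s)
    e_n = sigma * v_th * N_c * math.exp(-dE / (K_B * T))     # emission rate (1/s)
    tau = 1.0 / (c_n + e_n)                                  # SRH time constant
    f, dt = 0.0, 0.05 * tau                                  # start with an empty trap
    for _ in range(200):                                     # integrate ~10 time constants
        f += dt * (c_n * (1.0 - f) - e_n * f)
    return f, tau

f_ss, tau = srh_occupancy()
print(f"time constant ~ {tau:.2e} s, occupancy after ~10*tau: {f_ss:.4f}")
```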

  2. Metrology: Calibration and measurement processes guidelines

    NASA Technical Reports Server (NTRS)

    Castrup, Howard T.; Eicke, Woodward G.; Hayes, Jerry L.; Mark, Alexander; Martin, Robert E.; Taylor, James L.

    1994-01-01

    The guide is intended as a resource to aid engineers and systems contractors in the design, implementation, and operation of metrology, calibration, and measurement systems, and to assist NASA personnel in the uniform evaluation of such systems supplied or operated by contractors. Methodologies and techniques acceptable in fulfilling metrology quality requirements for NASA programs are outlined. The measurement process is covered from a high level through more detailed discussions of key elements within the process. Emphasis is given to the flowdown of project requirements to measurement system requirements, and then through the activities that will provide measurements with defined quality. In addition, innovations and techniques for error analysis, development of statistical measurement process control, optimization of calibration recall systems, and evaluation of measurement uncertainty are presented.
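    As a minimal illustration of the uncertainty-evaluation element, the sketch below performs a Type A evaluation of measurement uncertainty from repeated readings and reports an expanded uncertainty with coverage factor k = 2; the readings are hypothetical and the calculation is a generic one, not taken from the guide.

```python
# Generic Type A uncertainty evaluation: standard uncertainty of the mean from
# repeated readings, expanded with coverage factor k = 2. Readings are made up.
import statistics as st

readings = [10.012, 10.015, 10.011, 10.014, 10.013, 10.016]   # hypothetical calibration readings
n = len(readings)
mean = st.mean(readings)
u = st.stdev(readings) / n ** 0.5          # standard uncertainty of the mean
U = 2 * u                                  # expanded uncertainty, k = 2 (~95 %)
print(f"result: {mean:.4f} +/- {U:.4f} (k=2)")
```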

  3. Color features as an approach for the automated screening of Salmonella strain

    NASA Astrophysics Data System (ADS)

    Trujillo, Alejandra Serrano; González, Viridiana Contreras; Andrade Rincón, Saulo E.; Palafox, Luis E.

    2016-11-01

    We present the implementation of a feature extraction approach for the automated screening of Salmonella sp., a task carried out visually by a microbiologist, in which the resulting color characteristics of the culture media plate indicate the presence of this strain. The screening of Salmonella sp. is based on the inoculation and incubation of a sample on an agar plate, allowing the isolation of this strain, if present. This process uses three media: Xylose lysine deoxycholate, Salmonella Shigella, and Brilliant Green agar plates, which exhibit specific color characteristics over the colonies and over the surrounding medium for a presumed positive interpretation. Under a controlled illumination environment, images of plates are captured and the characteristics found over each agar are processed separately. Each agar is analyzed using statistical descriptors for texture, to determine the presence of colonies, followed by the extraction of color features. A comparison among the color features seen over the three media, according to the FDA Bacteriological Analytical Manual, determines the presence of Salmonella sp. on a given sample. The implemented process demonstrates that this task can be accomplished with an image processing approach, paving the way for the future validation and automation of additional screening processes.
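    The color-feature step can be illustrated with simple per-channel statistics over a detected colony region, the kind of quantities that are then compared against the expected colors for each medium. The sketch below uses a synthetic image and mask and omits the texture-based colony detection described in the abstract.

```python
# Hedged sketch: per-channel color statistics over a colony region; the
# texture-based colony detection of the paper is not reproduced.
import numpy as np

def color_features(image_rgb, mask):
    """image_rgb: (H, W, 3) uint8 array; mask: (H, W) boolean colony region."""
    pixels = image_rgb[mask].astype(float)            # (N, 3) pixels inside the region
    return pixels.mean(axis=0), pixels.std(axis=0)

rng = np.random.default_rng(2)
img = rng.integers(0, 256, size=(120, 120, 3), dtype=np.uint8)   # stand-in plate image
colony = np.zeros((120, 120), dtype=bool)
colony[40:60, 40:60] = True                                       # stand-in colony region
mean_rgb, std_rgb = color_features(img, colony)
print("mean RGB:", mean_rgb.round(1), "std RGB:", std_rgb.round(1))
```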

  4. Jllumina - A comprehensive Java-based API for statistical Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data processing.

    PubMed

    Almeida, Diogo; Skov, Ida; Lund, Jesper; Mohammadnejad, Afsaneh; Silva, Artur; Vandin, Fabio; Tan, Qihua; Baumbach, Jan; Röttger, Richard

    2016-10-01

    Measuring differential methylation of DNA is nowadays the most common approach to linking epigenetic modifications to diseases (so-called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency, and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. However, the lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integral part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions covering reading and preprocessing the raw data, down to statistical assessment, permutation tests, and identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html.
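    Jllumina itself is a Java library; as a language-agnostic illustration of one of the statistical steps it supports, the sketch below runs a permutation test on synthetic beta values for a single locus.

```python
# Illustration only of a permutation test for one differentially methylated
# locus; data are synthetic and this is not Jllumina code.
import numpy as np

def permutation_p(beta_cases, beta_controls, n_perm=10_000, seed=0):
    rng = np.random.default_rng(seed)
    pooled = np.concatenate([beta_cases, beta_controls])
    n_cases = len(beta_cases)
    observed = abs(beta_cases.mean() - beta_controls.mean())
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                               # re-assign group labels at random
        if abs(pooled[:n_cases].mean() - pooled[n_cases:].mean()) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)

rng = np.random.default_rng(1)
cases = np.clip(rng.normal(0.65, 0.05, 40), 0, 1)         # synthetic beta values
controls = np.clip(rng.normal(0.55, 0.05, 40), 0, 1)
print("permutation p-value:", permutation_p(cases, controls))
```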

  5. Jllumina - A comprehensive Java-based API for statistical Illumina Infinium HumanMethylation450 and MethylationEPIC data processing.

    PubMed

    Almeida, Diogo; Skov, Ida; Lund, Jesper; Mohammadnejad, Afsaneh; Silva, Artur; Vandin, Fabio; Tan, Qihua; Baumbach, Jan; Röttger, Richard

    2016-12-18

    Measuring differential methylation of DNA is nowadays the most common approach to linking epigenetic modifications to diseases (so-called epigenome-wide association studies, EWAS). Owing to their low cost, efficiency, and easy handling, the Illumina HumanMethylation450 BeadChip and its successor, the Infinium MethylationEPIC BeadChip, are by far the most popular techniques for conducting EWAS in large patient cohorts. Despite the popularity of this chip technology, raw data processing and statistical analysis of the array data remain far from trivial and still lack dedicated software libraries enabling high-quality and statistically sound downstream analyses. As of yet, only R-based solutions are freely available for low-level processing of the Illumina chip data. However, the lack of alternative libraries poses a hurdle for the development of new bioinformatic tools, in particular when it comes to web services or applications where run time and memory consumption matter, or where EWAS data analysis is an integral part of a bigger framework or data analysis pipeline. We have therefore developed and implemented Jllumina, an open-source Java library for raw data manipulation of Illumina Infinium HumanMethylation450 and Infinium MethylationEPIC BeadChip data, supporting the developer with Java functions covering reading and preprocessing the raw data, down to statistical assessment, permutation tests, and identification of differentially methylated loci. Jllumina is fully parallelizable and publicly available at http://dimmer.compbio.sdu.dk/download.html.

  6. Racing to learn: statistical inference and learning in a single spiking neuron with adaptive kernels

    PubMed Central

    Afshar, Saeed; George, Libin; Tapson, Jonathan; van Schaik, André; Hamilton, Tara J.

    2014-01-01

    This paper describes the Synapto-dendritic Kernel Adapting Neuron (SKAN), a simple spiking neuron model that performs statistical inference and unsupervised learning of spatiotemporal spike patterns. SKAN is the first proposed neuron model to investigate the effects of dynamic synapto-dendritic kernels and demonstrate their computational power even at the single neuron scale. The rule-set defining the neuron is simple: there are no complex mathematical operations such as normalization, exponentiation or even multiplication. The functionalities of SKAN emerge from the real-time interaction of simple additive and binary processes. Like a biological neuron, SKAN is robust to signal and parameter noise, and can utilize both in its operations. At the network scale neurons are locked in a race with each other with the fastest neuron to spike effectively “hiding” its learnt pattern from its neighbors. The robustness to noise, high speed, and simple building blocks not only make SKAN an interesting neuron model in computational neuroscience, but also make it ideal for implementation in digital and analog neuromorphic systems which is demonstrated through an implementation in a Field Programmable Gate Array (FPGA). Matlab, Python, and Verilog implementations of SKAN are available at: http://www.uws.edu.au/bioelectronics_neuroscience/bens/reproducible_research. PMID:25505378

  7. Racing to learn: statistical inference and learning in a single spiking neuron with adaptive kernels.

    PubMed

    Afshar, Saeed; George, Libin; Tapson, Jonathan; van Schaik, André; Hamilton, Tara J

    2014-01-01

    This paper describes the Synapto-dendritic Kernel Adapting Neuron (SKAN), a simple spiking neuron model that performs statistical inference and unsupervised learning of spatiotemporal spike patterns. SKAN is the first proposed neuron model to investigate the effects of dynamic synapto-dendritic kernels and demonstrate their computational power even at the single neuron scale. The rule-set defining the neuron is simple: there are no complex mathematical operations such as normalization, exponentiation or even multiplication. The functionalities of SKAN emerge from the real-time interaction of simple additive and binary processes. Like a biological neuron, SKAN is robust to signal and parameter noise, and can utilize both in its operations. At the network scale neurons are locked in a race with each other with the fastest neuron to spike effectively "hiding" its learnt pattern from its neighbors. The robustness to noise, high speed, and simple building blocks not only make SKAN an interesting neuron model in computational neuroscience, but also make it ideal for implementation in digital and analog neuromorphic systems which is demonstrated through an implementation in a Field Programmable Gate Array (FPGA). Matlab, Python, and Verilog implementations of SKAN are available at: http://www.uws.edu.au/bioelectronics_neuroscience/bens/reproducible_research.

  8. Toward criteria for pragmatic measurement in implementation research and practice: a stakeholder-driven approach using concept mapping.

    PubMed

    Powell, Byron J; Stanick, Cameo F; Halko, Heather M; Dorsey, Caitlin N; Weiner, Bryan J; Barwick, Melanie A; Damschroder, Laura J; Wensing, Michel; Wolfenden, Luke; Lewis, Cara C

    2017-10-03

    Advancing implementation research and practice requires valid and reliable measures of implementation determinants, mechanisms, processes, strategies, and outcomes. However, researchers and implementation stakeholders are unlikely to use measures if they are not also pragmatic. The purpose of this study was to establish a stakeholder-driven conceptualization of the domains that comprise the pragmatic measure construct. It built upon a systematic review of the literature and semi-structured stakeholder interviews that generated 47 criteria for pragmatic measures, and aimed to further refine that set of criteria by identifying conceptually distinct categories of the pragmatic measure construct and providing quantitative ratings of the criteria's clarity and importance. Twenty-four stakeholders with expertise in implementation practice completed a concept mapping activity wherein they organized the initial list of 47 criteria into conceptually distinct categories and rated their clarity and importance. Multidimensional scaling, hierarchical cluster analysis, and descriptive statistics were used to analyze the data. The 47 criteria were meaningfully grouped into four distinct categories: (1) acceptable, (2) compatible, (3) easy, and (4) useful. Average ratings of clarity and importance at the category and individual criteria level will be presented. This study advances the field of implementation science and practice by providing clear and conceptually distinct domains of the pragmatic measure construct. Next steps will include a Delphi process to develop consensus on the most important criteria and the development of quantifiable pragmatic rating criteria that can be used to assess measures.

  9. SU-F-I-10: Spatially Local Statistics for Adaptive Image Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliopoulos, AS; Sun, X; Floros, D

    Purpose: To facilitate adaptive image filtering operations, addressing spatial variations in both noise and signal. Such issues are prevalent in cone-beam projections, where physical effects such as X-ray scattering result in spatially variant noise, violating common assumptions of homogeneous noise and challenging conventional filtering approaches to signal extraction and noise suppression. Methods: We present a computational mechanism for probing into and quantifying the spatial variance of noise throughout an image. The mechanism builds a pyramid of local statistics at multiple spatial scales; local statistical information at each scale includes (weighted) mean, median, standard deviation, median absolute deviation, as well as histogram or dynamic range after local mean/median shifting. Based on inter-scale differences of local statistics, the spatial scope of distinguishable noise variation is detected in a semi- or un-supervised manner. Additionally, we propose and demonstrate the incorporation of such information in globally parametrized (i.e., non-adaptive) filters, effectively transforming the latter into spatially adaptive filters. The multi-scale mechanism is materialized by efficient algorithms and implemented in parallel CPU/GPU architectures. Results: We demonstrate the impact of local statistics for adaptive image processing and analysis using cone-beam projections of a Catphan phantom, fitted within an annulus to increase X-ray scattering. The effective spatial scope of local statistics calculations is shown to vary throughout the image domain, necessitating multi-scale noise and signal structure analysis. Filtering results with and without spatial filter adaptation are compared visually, illustrating improvements in imaging signal extraction and noise suppression, and in preserving information in low-contrast regions. Conclusion: Local image statistics can be incorporated in filtering operations to equip them with spatial adaptivity to spatial signal/noise variations. An efficient multi-scale computational mechanism is developed to curtail processing latency. Spatially adaptive filtering may impact subsequent processing tasks such as reconstruction and numerical gradient computations for deformable registration. NIH Grant No. R01-184173.
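    A simplified stand-in for the multi-scale mechanism is shown below: local mean and standard deviation computed over windows of increasing size, whose inter-scale differences indicate how far spatial noise variation extends. The window sizes and synthetic projection are assumptions.

```python
# Hedged sketch of a pyramid of local image statistics at several window sizes.
import numpy as np
from scipy.ndimage import uniform_filter

def local_stats_pyramid(image, scales=(3, 7, 15, 31)):
    stats = {}
    img = image.astype(float)
    for s in scales:
        mean = uniform_filter(img, size=s)
        sq_mean = uniform_filter(img * img, size=s)
        std = np.sqrt(np.clip(sq_mean - mean * mean, 0.0, None))
        stats[s] = (mean, std)
    return stats

rng = np.random.default_rng(3)
projection = rng.normal(100.0, 5.0, size=(256, 256))          # synthetic cone-beam projection
pyramid = local_stats_pyramid(projection)
for s, (m, sd) in pyramid.items():
    print(f"window {s:2d}: median local std = {np.median(sd):.2f}")
```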

  10. Effects of advanced treatment of municipal wastewater on the White River near Indianapolis, Indiana; trends in water quality, 1978-86

    USGS Publications Warehouse

    Crawford, Charles G.; Wangsness, David J.

    1993-01-01

    The City of Indianapolis has constructed state-of-the-art advanced municipal wastewater-treatment systems to enlarge and upgrade the existing secondary-treatment processes at its Belmont and Southport treatment plants. These new advanced-wastewater-treatment plants became operational in 1983. A nonparametric statistical procedure--a modified form of the Wilcoxon-Mann-Whitney rank-sum test--was used to test for trends in time-series water-quality data from four sites on the White River and from the Belmont and Southport wastewater-treatment plants. Time-series data representative of pre-advanced- (1978-1980) and post-advanced- (1983--86) wastewater-treatment conditions were tested for trends, and the results indicate substantial changes in water quality of treated effluent and of the White River downstream from Indianapolis after implementation of advanced wastewater treatment. Water quality from 1981 through 1982 was highly variable due to plant construction. Therefore, this time period was excluded from the analysis. Water quality at sample sites located upstream from the wastewater-treatment plants was relatively constant during the period of study (1978-86). Analysis of data from the two plants and downstream from the plants indicates statistically significant decreasing trends in effluent concentrations of total ammonia, 5-day biochemical-oxygen demand, fecal-coliform bacteria, total phosphate, and total solids at all sites where sufficient data were available for testing. Because of in-plant nitrification, increases in nitrate concentration were statistically significant in the two plants and in the White River. The decrease in ammonia concentrations and 5-day biochemical-oxygen demand in the White River resulted in a statistically significant increasing trend in dissolved-oxygen concentration in the river because of reduced oxygen demand for nitrification and biochemical oxidation processes. Following implementation of advanced wastewater treatment, the number of river-quality samples that failed to meet the water-quality standards for ammonia and dissolved oxygen that apply to the White River decreased substantially.
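    The core of the trend test can be illustrated with a plain Wilcoxon-Mann-Whitney comparison of pre- versus post-treatment concentrations, as sketched below with synthetic data; the modified, seasonally adjusted form used in the study is not reproduced.

```python
# Hedged sketch: plain rank-sum comparison of pre- vs post-treatment
# concentrations; values are synthetic, not USGS data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
ammonia_pre = rng.lognormal(mean=1.0, sigma=0.3, size=36)    # 1978-80 samples (mg/L)
ammonia_post = rng.lognormal(mean=0.2, sigma=0.3, size=48)   # 1983-86 samples (mg/L)

u_stat, p_value = stats.mannwhitneyu(ammonia_pre, ammonia_post, alternative="greater")
print(f"U = {u_stat:.0f}, p = {p_value:.2e}  (evidence of a decreasing step change)")
```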

  11. Project IMPACT Pilot Report: Feasibility of Implementing a Hospital-to-Home Transition Bundle.

    PubMed

    Mallory, Leah A; Osorio, Snezana Nena; Prato, B Stephen; DiPace, Jennifer; Schmutter, Lisa; Soung, Paula; Rogers, Amanda; Woodall, William J; Burley, Kayla; Gage, Sandra; Cooperberg, David

    2017-03-01

    To improve hospital to home transitions, a 4-element pediatric patient-centered transition bundle was developed, including: a transition readiness checklist; predischarge teach-back education; timely and complete written handoff to the primary care provider; and a postdischarge phone call. The objective of this study was to demonstrate the feasibility of bundle implementation and report initial outcomes at 4 pilot sites. Outcome measures included postdischarge caregiver ability to teach-back key home management information and 30-day reuse rates. A multisite, observational time series using multiple planned sequential interventions to implement bundle components with non-technology-supported and technology-supported patients. Data were collected via electronic health record reviews and during postdischarge phone calls. Statistical process control charts were used to assess outcomes. Four pilot sites implemented the bundle between January 2014 and May 2015 for 2601 patients, of whom 1394 had postdischarge telephone encounters. Improvement was noted in the implementation of all bundle elements with the transitions readiness checklist posing the greatest feasibility challenge. Phone contact connection rates were 69%. Caregiver ability to teach-back essential home management information postdischarge improved from 18% to 82%. No improvement was noted in reuse rates, which differed dramatically between technology-supported and non-technology-supported patients. A pediatric care transition bundle was successfully tested and implemented, as demonstrated by improvement in all process measures, as well as caregiver home management skills. Important considerations for successful implementation and evaluation of the discharge bundle include the role of local context, electronic health record integration, and subgroup analysis for technology-supported patients. Copyright © 2017 by the American Academy of Pediatrics.
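    The statistical process control logic behind such charts is illustrated below for a proportion measure (for example, the monthly share of caregivers with correct teach-back), using the standard 3-sigma p-chart limits; the monthly counts are hypothetical.

```python
# Hedged sketch of a p-chart for a proportion outcome; counts are hypothetical.
import math

def p_chart(successes, totals):
    p_bar = sum(successes) / sum(totals)                  # overall proportion (centre line)
    points = []
    for x, n in zip(successes, totals):
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        ucl, lcl = p_bar + 3 * sigma, max(0.0, p_bar - 3 * sigma)
        p = x / n
        points.append((p, lcl, ucl, p < lcl or p > ucl))  # flag special-cause signals
    return p_bar, points

successes = [12, 15, 20, 34, 41, 47, 52]                  # caregivers with correct teach-back
totals = [60, 58, 61, 59, 62, 60, 63]                     # telephone contacts per month
centre, pts = p_chart(successes, totals)
for month, (p, lcl, ucl, signal) in enumerate(pts, 1):
    print(f"month {month}: p={p:.2f} limits=({lcl:.2f}, {ucl:.2f}) signal={signal}")
```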

  12. A model for indexing medical documents combining statistical and symbolic knowledge.

    PubMed

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-10-11

    To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. The use of several terminologies leads to more precise indexing. The improvement achieved in the model's implementation performance as a result of using semantic relationships is encouraging.

  13. Use of observational and model-derived fields and regime model output statistics in mesoscale forecasting

    NASA Technical Reports Server (NTRS)

    Forbes, G. S.; Pielke, R. A.

    1985-01-01

    Various empirical and statistical weather-forecasting studies which utilize stratification by weather regime are described. Objective classification was used to determine weather regime in some studies. In other cases, the weather pattern was determined on the basis of a parameter representing the physical and dynamical processes relevant to the anticipated mesoscale phenomena, such as low-level moisture convergence and convective precipitation, or the Froude number and the occurrence of cold-air damming. For mesoscale phenomena already in existence, new forecasting techniques were developed. The use of cloud models in operational forecasting is discussed. Models to calculate the spatial scales of forcings and resultant response for mesoscale systems are presented. The use of these models to represent the climatologically most prevalent systems, and to perform case-by-case simulations, is reviewed. Operational implementation of mesoscale data into weather forecasts, using both actual simulation output and model output statistics, is discussed.

  14. A Model for Indexing Medical Documents Combining Statistical and Symbolic Knowledge.

    PubMed Central

    Avillach, Paul; Joubert, Michel; Fieschi, Marius

    2007-01-01

    OBJECTIVES: To develop and evaluate an information processing method based on terminologies, in order to index medical documents in any given documentary context. METHODS: We designed a model using both symbolic general knowledge extracted from the Unified Medical Language System (UMLS) and statistical knowledge extracted from a domain of application. Using statistical knowledge allowed us to contextualize the general knowledge for every particular situation. For each document studied, the extracted terms are ranked to highlight the most significant ones. The model was tested on a set of 17,079 French standardized discharge summaries (SDSs). RESULTS: The most important ICD-10 term of each SDS was ranked 1st or 2nd by the method in nearly 90% of the cases. CONCLUSIONS: The use of several terminologies leads to more precise indexing. The improvement achieved in the model’s implementation performances as a result of using semantic relationships is encouraging. PMID:18693792

  15. When mechanism matters: Bayesian forecasting using models of ecological diffusion

    USGS Publications Warehouse

    Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.

    2017-01-01

    Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
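    The mechanistic core is the ecological diffusion equation, in which the motility coefficient sits inside the differential operator (in one dimension, du/dt = d^2(mu(x)*u)/dx^2). The explicit finite-difference sketch below illustrates only that forward model on a synthetic landscape; the hierarchical Bayesian inference layer of the paper is not shown, and all parameters are made up.

```python
# Hedged sketch: 1-D explicit finite-difference step of ecological diffusion,
# du/dt = d^2(mu(x)*u)/dx^2. Landscape and parameters are synthetic.
import numpy as np

def simulate(u0, mu, dx=1.0, dt=0.2, steps=500):
    u = u0.astype(float).copy()
    assert dt <= dx * dx / (2 * mu.max()), "explicit scheme stability condition"
    for _ in range(steps):
        v = mu * u                                     # motility enters inside the operator
        lap = np.zeros_like(u)
        lap[1:-1] = (v[2:] - 2 * v[1:-1] + v[:-2]) / dx ** 2
        u += dt * lap                                  # boundaries held at u = 0
    return u

x = np.arange(200)
mu = np.where((x > 80) & (x < 120), 0.5, 2.0)          # low motility = habitat, high = matrix
u0 = np.zeros(200)
u0[100] = 1.0                                          # prevalence seeded at one cell
print(f"mass retained in habitat: {simulate(u0, mu)[80:120].sum():.3f}")
```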

  16. Implementation of an Evidence Based Guideline for Assessment and Documentation of the Civil Commitment Process.

    PubMed

    Perrigo, Tabitha L; Williams, Kimberly A

    2016-11-01

    The purpose of this quality improvement project was to implement an evidence-based practice guideline for assessment and documentation of the civil commitment process. Participants included six civil commitment examiners who conduct court-ordered psychiatric evaluations at two crisis intervention centers in a rural area of a southeastern state. Data collection was conducted using a chart audit tool both pre- and post-intervention on 100 civil commitment evaluations. The intervention included the development of an evidence-based form for documentation of civil commitment evaluations, and a one-on-one educational training session was conducted for each participant. Descriptive statistics and t tests were used to analyze the data collected. The project demonstrated a significant increase: 25.5% of evaluations contained the American Psychiatric Association's recommended 11 domains of assessment prior to implementation, compared with 65.6% (p value = 0.018) post-implementation. Moreover, participants with family practice training showed an increase in commitment rates from 60% to 77.3% (p value = 0.066), whereas psychiatrically trained participants showed a decrease from 83.75% to 77.66% (p value = 0.38). These findings suggest that court-ordered evaluations guided by a standardized, evidence-based form affected examiners' recommendations for commitment.

  17. Using simulated historical time series to prioritize fuel treatments on landscapes across the United States: The LANDFIRE prototype project

    USGS Publications Warehouse

    Keane, Robert E.; Rollins, Matthew; Zhu, Zhi-Liang

    2007-01-01

    Canopy and surface fuels in many fire-prone forests of the United States have increased over the last 70 years as a result of modern fire exclusion policies, grazing, and other land management activities. The Healthy Forest Restoration Act and National Fire Plan establish a national commitment to reduce fire hazard and restore fire-adapted ecosystems across the USA. The primary index used to prioritize treatment areas across the nation is Fire Regime Condition Class (FRCC) computed as departures of current conditions from the historical fire and landscape conditions. This paper describes a process that uses an extensive set of ecological models to map FRCC from a departure statistic computed from simulated time series of historical landscape composition. This mapping process uses a data-driven, biophysical approach where georeferenced field data, biogeochemical simulation models, and spatial data libraries are integrated using spatial statistical modeling to map environmental gradients that are then used to predict vegetation and fuels characteristics over space. These characteristics are then fed into a landscape fire and succession simulation model to simulate a time series of historical landscape compositions that are then compared to the composition of current landscapes to compute departure, and the FRCC values. Intermediate products from this process are then used to create ancillary vegetation, fuels, and fire regime layers that are useful in the eventual planning and implementation of fuel and restoration treatments at local scales. The complex integration of varied ecological models at different scales is described and problems encountered during the implementation of this process in the LANDFIRE prototype project are addressed.

  18. Attitude and perception of farmers to the implementation of conservation farming in the mountainous area of South Sulawesi

    NASA Astrophysics Data System (ADS)

    Busthanul, N.; Lumoindong, Y.; Syafiuddin, M.; Heliawaty; Lanuhu, N.; Ibrahim, T.; Ambrosius, R. R.

    2018-05-01

    Farmers' attitudes and perceptions may underlie the ineffective implementation of conservation farming for agricultural sustainability, given how much the application of conservation techniques varies. The purpose of this research is to assess farmers' attitudes and perceptions toward the application of conservation techniques and to examine the correlation between those attitudes and perceptions. The research was carried out in Kanreapia Village, Tombolo Pao District, Gowa Regency, South Sulawesi Province, Indonesia. Thirty farmers were sampled randomly; the data were analyzed with non-parametric statistics using a quantitative and qualitative descriptive approach based on a Likert scale. The results show that the conservation practice rated highest (appropriate) in farmers' attitudes and perceptions is seasonal crop rotation, while the lowest (less appropriate) is tilling the land along the contour and planting accordingly. There is a very strong relationship between farmers' attitudes and perceptions. The implication of these findings is that improvements to the implementation of conservation farming techniques should be pursued through improved perceptions.

  19. A method to evaluate process performance by integrating time and resources

    NASA Astrophysics Data System (ADS)

    Wang, Yu; Wei, Qingjie; Jin, Shuang

    2017-06-01

    The purpose of process mining is to improve an enterprise's existing processes, so measuring process performance is particularly important. However, current research on performance evaluation methods remains insufficient: existing approaches rely mainly on simple time or resource statistics, which cannot characterize process performance well. In this paper, a method for evaluating process performance that integrates the time and resource dimensions is proposed. The method can be used to measure the utilization and redundancy of resources in a process. The paper introduces the design principle and formula of the evaluation algorithm, then describes the design and implementation of the evaluation method, and finally applies it to the event log of a telephone maintenance process to propose an optimization plan.
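    As a minimal illustration of integrating the time and resource dimensions, the sketch below computes per-resource utilization (busy time over the observed span) from a small event log; the log format, activities, and values are hypothetical, and the paper's actual formula is not reproduced.

```python
# Minimal sketch: per-resource utilization from a hypothetical event log.
from collections import defaultdict

# (case id, activity, resource, start hour, end hour)
event_log = [
    (1, "register fault", "agent_A", 0.0, 0.5),
    (1, "dispatch",       "agent_B", 0.5, 0.8),
    (2, "register fault", "agent_A", 1.0, 1.4),
    (2, "repair",         "tech_C",  1.5, 4.0),
    (3, "register fault", "agent_A", 2.0, 2.3),
]

busy = defaultdict(float)
for _, _, resource, start, end in event_log:
    busy[resource] += end - start                     # accumulate busy time per resource

span = max(e[4] for e in event_log) - min(e[3] for e in event_log)
for resource, hours in sorted(busy.items()):
    print(f"{resource}: busy {hours:.1f} h of {span:.1f} h -> utilization {hours / span:.0%}")
```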

  20. MuffinInfo: HTML5-Based Statistics Extractor from Next-Generation Sequencing Data.

    PubMed

    Alic, Andy S; Blanquer, Ignacio

    2016-09-01

    Usually, the information known a priori about a newly sequenced organism is limited. Even resequencing the same organism can generate unpredictable output. We introduce MuffinInfo, a FastQ/Fasta/SAM information extractor implemented in HTML5 capable of offering insights into next-generation sequencing (NGS) data. Our new tool can run on any software or hardware environment, in command line or graphically, and in browser or standalone. It presents information such as average length, base distribution, quality score distribution, k-mer histogram, and homopolymer analysis. MuffinInfo improves upon the existing extractors by adding the ability to save and then reload the results obtained after a run as a navigable file (also supporting saving pictures of the charts), by supporting custom statistics implemented by the user, and by offering user-adjustable parameters involved in the processing, all in a single piece of software. At the moment, the extractor works with all base-space technologies such as Illumina, Roche, Ion Torrent, Pacific Biosciences, and Oxford Nanopore. Owing to HTML5, our software demonstrates the readiness of web technologies for mildly intensive tasks encountered in bioinformatics.
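    MuffinInfo itself is implemented in HTML5/JavaScript; purely as an illustration of the kind of statistics it reports, the sketch below computes average read length and base composition from a FASTQ stream.

```python
# Illustration only (not MuffinInfo code): average read length and base
# composition from a FASTQ text stream.
from collections import Counter
import io

def fastq_stats(handle):
    lengths, bases = [], Counter()
    for i, line in enumerate(handle):
        if i % 4 == 1:                      # the sequence line of each 4-line record
            seq = line.strip().upper()
            lengths.append(len(seq))
            bases.update(seq)
    total = sum(bases.values())
    dist = {b: bases[b] / total for b in sorted(bases)}
    return sum(lengths) / len(lengths), dist

sample = io.StringIO(
    "@read1\nACGTACGT\n+\nIIIIIIII\n"
    "@read2\nGGGTACA\n+\nIIIIIII\n"
)
avg_len, base_dist = fastq_stats(sample)
print(f"average length: {avg_len:.1f}", base_dist)
```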

  1. IMPLEMENTATION AND VALIDATION OF STATISTICAL TESTS IN RESEARCH'S SOFTWARE HELPING DATA COLLECTION AND PROTOCOLS ANALYSIS IN SURGERY.

    PubMed

    Kuretzki, Carlos Henrique; Campos, Antônio Carlos Ligocki; Malafaia, Osvaldo; Soares, Sandramara Scandelari Kusano de Paula; Tenório, Sérgio Bernardo; Timi, Jorge Rufino Ribas

    2016-03-01

    Information technology is widely applied in healthcare. For scientific research, SINPE(c) - Integrated Electronic Protocols - was created as a tool to support researchers by offering clinical data standardization. Until then, SINPE(c) lacked statistical tests performed by automatic analysis. The aim was to add to SINPE(c) features for the automatic execution of the main statistical methods used in medicine. The study was divided into four topics: checking users' interest in the implementation of the tests; surveying the frequency of their use in healthcare; carrying out the implementation; and validating the results with researchers and their protocols. It was applied to a group of users of this software working on their stricto sensu master's and doctoral theses in one postgraduate program in surgery. To assess the reliability of the statistics, the data obtained automatically by SINPE(c) were compared with those computed manually by a statistician experienced in this type of study. There was interest in the use of automatic statistical tests, with good acceptance. The chi-square, Mann-Whitney, Fisher, and Student's t tests were identified as tests frequently used by participants in medical studies. These methods were implemented and subsequently approved as expected. The automatic SINPE(c) statistical analysis was shown to be reliable and equivalent to the manual analysis, validating its use as a tool for medical research.
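    SINPE(c) is a dedicated protocol system; the sketch below only illustrates how the four tests named above (chi-square, Mann-Whitney, Fisher exact, Student's t) can be executed automatically on protocol data, here with made-up counts and measurements.

```python
# Illustration only (not SINPE(c) code): the four tests named in the abstract
# run automatically with scipy on hypothetical protocol data.
import numpy as np
from scipy import stats

table = np.array([[18, 7],      # e.g., complication yes/no for technique A
                  [9, 21]])     # and for technique B (hypothetical counts)
group_a = np.array([12.1, 13.4, 11.8, 12.9, 13.1, 12.5])
group_b = np.array([14.2, 15.0, 13.8, 14.6, 15.3, 14.1])

chi2, p_chi2, _, _ = stats.chi2_contingency(table)
odds, p_fisher = stats.fisher_exact(table)
u, p_mw = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
t, p_t = stats.ttest_ind(group_a, group_b)

print(f"chi-square p={p_chi2:.4f}, Fisher p={p_fisher:.4f}, "
      f"Mann-Whitney p={p_mw:.4f}, t-test p={p_t:.4f}")
```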

  2. Parents' and professionals' perceptions of family-centered care for children with autism spectrum disorder across service sectors.

    PubMed

    Hodgetts, Sandra; Nicholas, David; Zwaigenbaum, Lonnie; McConnell, David

    2013-11-01

    Family-centered care (FCC) has been linked with improved parent and child outcomes, yet its implementation can be challenging due to family, professional, organizational and systemic factors and policies. This study aims to increase knowledge and understanding of how families with children with autism spectrum disorder (ASD) experience FCC in Alberta, Canada. 152 parents with a child with ASD completed the Measure of Processes of Care, separately for each utilized service sector, and 146 professionals working with persons with ASD completed the Measure of Processes of Care - Service Providers. Additionally, in-depth interviews were conducted with a sub-sample of 19 parents, purposefully sampled for diversity in child and family characteristics. Data were collected in 2011. Descriptive and inferential statistics were used to analyze quantitative data. Interview transcripts were analyzed using grounded theory constant comparison methods, yielding a data generated theoretical model depicting families' experiences with FCC over time and across service sectors. There were no statistically significant differences in FCC scores across service sectors, but statistically significant differences in FCC scores between parents' and professionals' were found. Qualitative data revealed positive experiences and perceptions of receiving FCC from professionals "on the ground" across sectors, but negative experiences and perceptions of FCC at the systems level (i.e., administration, funders). These broad experiences emerged as a core theme "System of Exclusion", which integrated the key themes: (1) "The Fight", (2) "Roles and Restrictions of Care", and (3) "Therapeutic Rapport". Professionals and service providers can use findings to ensure that services reflect current conceptualizations of FCC, and decision and policy makers can use findings to recognize systemic barriers to implementing FCC and inform policy change. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Analysis of STAT laboratory turnaround times before and after conversion of the hospital information system.

    PubMed

    Lowe, Gary R; Griffin, Yolanda; Hart, Michael D

    2014-08-01

    Modern electronic health record systems (EHRS) reportedly offer advantages including improved quality, error prevention, cost reduction, and increased efficiency. This project reviewed the impact on specimen turnaround times (TAT) and percent compliance for specimens processed in a STAT laboratory after implementation of an upgraded EHRS. Before EHRS implementation, laboratory personnel received instruction and training for specimen processing. One laboratory member per shift received additional training. TAT and percent compliance data sampling occurred 4 times monthly for 13 months post-conversion and were compared with the mean of data collected for 3 months pre-conversion. Percent compliance was gauged using a benchmark of reporting 95% of all specimens within 7 min from receipt. Control charts were constructed for TAT and percent compliance with control limits set at 2 SD and applied continuously through the data collection period. TAT recovered to pre-conversion levels by the 6th month post-conversion. Percent compliance consistently returned to pre-conversion levels by the 10th month post-conversion. Statistical analyses revealed the TAT were significantly longer for 3 months post-conversion (P < .001) compared with pre-conversion levels. Statistical significance was not observed for subsequent groups. Percent compliance results were significantly lower for 6 months post-conversion (P < .001). Statistical significance was not observed for subsequent groups. Extensive efforts were made to train and prepare personnel for challenges expected after the EHRS upgrade. Specific causes identified with the upgraded EHRS included multiple issues involving personnel and the EHRS. These data suggest that system and user issues contributed to delays in returning to pre-conversion TAT and percent compliance levels following the upgrade in the EHRS.

  4. Mathematical problems in the application of multilinear models to facial emotion processing experiments

    NASA Astrophysics Data System (ADS)

    Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.

    2000-10-01

    In this paper, we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (fMRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science. That is, although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein, and an application is given in the context of facial emotion processing experiments.

  5. The Importance of Time and Frequency Reference in Quantum Astronomy and Quantum Communications

    DTIC Science & Technology

    2007-11-01

    simulator, but the same general results are valid for optical fiber and also different quantum state transmission technologies (i.e. Entangled Photons ...protocols [6]). The Matlab simulation starts from a sequence of pulses of duration Ton; the number of photons per pulse has been implemented like a...astrophysical emission mechanisms or scattering processes by measuring the statistics of the arrival time of each incoming photon. This line of research will be

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory Reaman

    The initiative will enable the COG Biopathology Center (Biospecimen Repository), the Molecular Genetics Laboratory and other participating reference laboratories to upload large data sets to the eRDES. The capability streamlines data currency and accuracy allowing the centers to export data from local systems and import the defined data to the eRDES. The process will aid in the best practices which have been defined by the Office of Biorepository and Biospecimen Research (OBBR) and the Group Banking Committee (GBC). The initiative allows for batch import and export, a data validation process and reporting mechanism, and a model for other labs to incorporate. All objectives are complete. The solutions provided and the defined process eliminate dual data entry, resulting in data consistency. The audit trail capabilities allow for complete tracking of the data exchange between laboratories and the Statistical Data Center (SDC). The impact is directly on time and efforts. In return, the process will save money and improve the data utilized by the COG. Ongoing efforts include implementing new technologies to further enhance the current solutions and process currently in place. Web Services and Reporting Services are technologies that have become industry standards and will allow for further harmonization with caBIG (cancer BioInformatics Grid). Additional testing and implementation of the model for other laboratories is in process.

  7. Assessment of MSFCs Process for the Development and Activation of Space Act Agreements

    NASA Technical Reports Server (NTRS)

    Daugherty, Rachel A.

    2014-01-01

    A Space Act Agreement (SAA) is a contractual vehicle that NASA utilizes to form partnerships with non-NASA entities to stimulate cutting-edge innovation within the science and technology communities while concurrently supporting the NASA missions. SAAs are similar to traditional contracts in that they involve the commitment of Agency resources but allow more flexibility and are more cost effective to implement than traditional contracts. Consequently, the use of SAAs to develop partnerships has greatly increased over the past several years. To facilitate this influx of SAAs, Marshall Space Flight Center (MSFC) developed a process during a kaizen event to streamline and improve the quality of SAAs developed at the Center level. This study assessed the current SAA process to determine if improvements could be implemented to increase productivity, decrease time to activation, and improve the quality of deliverables. Using a combination of direct procedural observation, personnel interviews, and statistical analysis, elements of the process in need of remediation were identified and potential solutions developed. The findings focus primarily on the difficulties surrounding tracking and enforcing process adherence and communication issues among stakeholders. Potential solutions include utilizing customer relationship management (CRM) software to facilitate process coordination and co-locating or potentially merging the two separate organizations involved in SAA development and activation at MSFC.

  8. A Novel Approach for Adaptive Signal Processing

    NASA Technical Reports Server (NTRS)

    Chen, Ya-Chin; Juang, Jer-Nan

    1998-01-01

    Adaptive linear predictors have been used extensively in practice in a wide variety of forms. In the main, their theoretical development is based upon the assumption of stationarity of the signals involved, particularly with respect to the second order statistics. On this basis, the well-known normal equations can be formulated. If high-order statistical stationarity is assumed, then the equivalent normal equations involve high-order signal moments. In either case, the cross moments (second or higher) are needed. This renders the adaptive prediction procedure non-blind. A novel procedure for blind adaptive prediction has been proposed, and considerable implementation work has been carried out in our contributions over the past year. The approach is based upon a suitable interpretation of blind equalization methods that satisfy the constant modulus property and offers significant deviations from the standard prediction methods. These blind adaptive algorithms are derived by formulating Lagrange equivalents from mechanisms of constrained optimization. In this report, other new update algorithms are derived from the fundamental concepts of advanced system identification to carry out the proposed blind adaptive prediction. The results of the work can be extended to a number of control-related problems, such as disturbance identification. The basic principles are outlined in this report, and differences from other existing methods are discussed. The applications implemented are in speech processing, such as coding and synthesis. Simulations are included to verify the novel modelling method.
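    The constant modulus property referred to above drives the textbook CMA update, sketched below for a real-valued linear equalizer on a synthetic dispersive channel; this is a generic illustration, not the specific algorithms derived in the report.

```python
# Hedged sketch: textbook constant modulus algorithm (CMA) update for a
# real-valued linear filter; not the report's derived algorithms.
import numpy as np

def cma_equalize(x, n_taps=8, mu=1e-3, R2=1.0):
    w = np.zeros(n_taps)
    w[0] = 1.0                                            # centre-spike initialization
    y = np.zeros(len(x))
    for n in range(n_taps, len(x)):
        window = x[n - n_taps:n][::-1]                    # most recent samples first
        y[n] = w @ window
        e = y[n] * (y[n] ** 2 - R2)                       # constant-modulus error term
        w -= mu * e * window                              # stochastic gradient update
    return y, w

rng = np.random.default_rng(5)
symbols = rng.choice([-1.0, 1.0], size=5000)              # unit-modulus source
channel = np.array([1.0, 0.4, -0.2])                      # unknown dispersive channel
received = np.convolve(symbols, channel, mode="full")[:len(symbols)]
output, taps = cma_equalize(received)
print("output modulus spread:", np.std(np.abs(output[1000:])).round(3))
```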

  9. An Exploratory Study of OEE Implementation in Indian Manufacturing Companies

    NASA Astrophysics Data System (ADS)

    Kumar, J.; Soni, V. K.

    2015-04-01

    Globally, the implementation of Overall Equipment Effectiveness (OEE) has proven to be highly effective in improving availability, performance rate and quality rate while reducing unscheduled breakdowns and the wastage that stems from equipment. This paper investigates the present status and future scope of OEE metrics in Indian manufacturing companies through an extensive survey. In this survey, the opinions of production and maintenance managers have been analyzed statistically to explore the relationship between factors, perspectives on OEE and the potential use of OEE metrics. Although the sample is diverse in terms of product, process type, size, and geographic location of the companies, all are compelled to implement improvement techniques such as OEE metrics to improve performance. The findings reveal that OEE metrics have huge potential and scope to improve performance. Responses indicate that Indian companies are aware of OEE but are not utilizing the full potential of OEE metrics.

  10. Implementation of a quantum random number generator based on the optimal clustering of photocounts

    NASA Astrophysics Data System (ADS)

    Balygin, K. A.; Zaitsev, V. I.; Klimov, A. N.; Kulik, S. P.; Molotkov, S. N.

    2017-10-01

    To implement quantum random number generators, it is fundamentally important to have a mathematically provable and experimentally testable process of measurements of the system from which the initial random sequence is generated; this ensures that the randomness indeed has a quantum nature. A quantum random number generator has been implemented using the detection of quasi-single-photon radiation by a silicon photomultiplier (SiPM) matrix, which makes it possible to reliably reach Poisson statistics of photocounts. The choice and use of the optimal clustering of photocounts for the initial sequence of photodetection events, together with a method of extracting a random sequence of 0's and 1's that is polynomial in the length of the sequence, have made it possible to reach an output rate of 64 Mbit/s for the genuinely random output sequence.
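    The paper's polynomial-time extraction procedure is tied to its optimal clustering of photocounts; as a generic illustration of turning biased detector bits into unbiased output, here is the classic von Neumann extractor in Python (a stand-in for illustration, not the authors' method): bit pairs 01 map to 0, 10 map to 1, and 00/11 are discarded.

    ```python
    import numpy as np

    def von_neumann_extract(bits):
        """Classic von Neumann debiasing: for each non-overlapping pair of raw
        bits, emit 0 for (0,1), 1 for (1,0), and discard (0,0)/(1,1). Output is
        unbiased if the raw bits are independent with a fixed (unknown) bias."""
        pairs = bits[: len(bits) // 2 * 2].reshape(-1, 2)
        keep = pairs[:, 0] != pairs[:, 1]
        return pairs[keep, 0]      # first bit of each kept pair is the output

    # Toy usage: biased raw bits, e.g. thresholded photocount parities.
    rng = np.random.default_rng(1)
    raw = (rng.random(100_000) < 0.7).astype(np.uint8)   # 70% ones
    out = von_neumann_extract(raw)
    print(len(out), out.mean())    # mean is ~0.5 despite the biased input
    ```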

  11. Separation of man-made and natural patterns in high-altitude imagery of agricultural areas

    NASA Technical Reports Server (NTRS)

    Samulon, A. S.

    1975-01-01

    A nonstationary linear digital filter is designed and implemented which extracts the natural features from high-altitude imagery of agricultural areas. Essentially, from an original image a new image is created which displays information related to soil properties, drainage patterns, crop disease, and other natural phenomena, and contains no information about crop type or row spacing. A model is developed to express the recorded brightness in a narrow-band image in terms of man-made and natural contributions and which describes statistically the spatial properties of each. The form of the minimum mean-square error linear filter for estimation of the natural component of the scene is derived and a suboptimal filter is implemented. Nonstationarity of the two-dimensional random processes contained in the model requires a unique technique for deriving the optimum filter. Finally, the filter depends on knowledge of field boundaries. An algorithm for boundary location is proposed, discussed, and implemented.

  12. Speeding Up Non-Parametric Bootstrap Computations for Statistics Based on Sample Moments in Small/Moderate Sample Size Applications

    PubMed Central

    Chaibub Neto, Elias

    2015-01-01

    In this paper we propose a vectorized implementation of the non-parametric bootstrap for statistics based on sample moments. Basically, we adopt the multinomial sampling formulation of the non-parametric bootstrap, and compute bootstrap replications of sample moment statistics by simply weighting the observed data according to multinomial counts instead of evaluating the statistic on a resampled version of the observed data. Using this formulation we can generate a matrix of bootstrap weights and compute the entire vector of bootstrap replications with a few matrix multiplications. Vectorization is particularly important for matrix-oriented programming languages such as R, where matrix/vector calculations tend to be faster than scalar operations implemented in a loop. We illustrate the application of the vectorized implementation in real and simulated data sets, when bootstrapping Pearson’s sample correlation coefficient, and compared its performance against two state-of-the-art R implementations of the non-parametric bootstrap, as well as a straightforward one based on a for loop. Our investigations spanned varying sample sizes and number of bootstrap replications. The vectorized bootstrap compared favorably against the state-of-the-art implementations in all cases tested, and was remarkably/considerably faster for small/moderate sample sizes. The same results were observed in the comparison with the straightforward implementation, except for large sample sizes, where the vectorized bootstrap was slightly slower than the straightforward implementation due to increased time expenditures in the generation of weight matrices via multinomial sampling. PMID:26125965
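    A minimal Python/NumPy transcription of the multinomial-weighting idea described above (the paper's reference implementations are in R): instead of physically resampling the data, draw multinomial counts, convert them to weights, and obtain all bootstrap replications of a moment statistic with a few matrix products. The Pearson-correlation wrapper below is an illustrative sketch, not the authors' code.

    ```python
    import numpy as np

    def bootstrap_weighted_corr(x, y, n_boot=10_000, seed=0):
        """Vectorized non-parametric bootstrap of Pearson's correlation using
        the multinomial-sampling formulation: weight the observed data by
        multinomial counts instead of resampling it."""
        rng = np.random.default_rng(seed)
        n = len(x)
        # One row of weights per bootstrap replication; each row sums to 1.
        w = rng.multinomial(n, np.full(n, 1.0 / n), size=n_boot) / n
        # Weighted sample moments for all replications at once.
        mx, my = w @ x, w @ y
        mxx, myy, mxy = w @ (x * x), w @ (y * y), w @ (x * y)
        cov = mxy - mx * my
        return cov / np.sqrt((mxx - mx**2) * (myy - my**2))

    # Toy usage: percentile bootstrap interval for the correlation.
    rng = np.random.default_rng(1)
    x = rng.normal(size=50)
    y = 0.6 * x + rng.normal(scale=0.8, size=50)
    reps = bootstrap_weighted_corr(x, y)
    print(np.percentile(reps, [2.5, 97.5]))
    ```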

  13. An introduction to statistical process control in research proteomics.

    PubMed

    Bramwell, David

    2013-12-16

    Statistical process control is a well-established and respected method which provides a general purpose, and consistent framework for monitoring and improving the quality of a process. It is routinely used in many industries where the quality of final products is critical and is often required in clinical diagnostic laboratories [1,2]. To date, the methodology has been little utilised in research proteomics. It has been shown to be capable of delivering quantitative QC procedures for qualitative clinical assays [3] making it an ideal methodology to apply to this area of biological research. To introduce statistical process control as an objective strategy for quality control and show how it could be used to benefit proteomics researchers and enhance the quality of the results they generate. We demonstrate that rules which provide basic quality control are easy to derive and implement and could have a major impact on data quality for many studies. Statistical process control is a powerful tool for investigating and improving proteomics research work-flows. The process of characterising measurement systems and defining control rules forces the exploration of key questions that can lead to significant improvements in performance. This work asserts that QC is essential to proteomics discovery experiments. Every experimenter must know the current capabilities of their measurement system and have an objective means for tracking and ensuring that performance. Proteomic analysis work-flows are complicated and multi-variate. QC is critical for clinical chemistry measurements and huge strides have been made in ensuring the quality and validity of results in clinical biochemistry labs. This work introduces some of these QC concepts and works to bridge their use from single analyte QC to applications in multi-analyte systems. This article is part of a Special Issue entitled: Standardization and Quality Control in Proteomics. Copyright © 2013 The Author. Published by Elsevier B.V. All rights reserved.
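    To give a flavour of how simple such control rules can be, the sketch below (an illustrative Python example, not taken from the article) builds a Shewhart individuals chart for a hypothetical QC metric, such as the recovery of a spiked standard per run, and flags the common 3-sigma and two-of-three-beyond-2-sigma rules.

    ```python
    import numpy as np

    def control_chart_flags(values, baseline_n=20):
        """Shewhart individuals chart: estimate the centre line and sigma from
        an in-control baseline, then flag (a) single points beyond 3 sigma and
        (b) two of three consecutive points beyond 2 sigma on the same side."""
        base = np.asarray(values[:baseline_n], dtype=float)
        centre = base.mean()
        # Sigma from the average moving range (d2 = 1.128 for subgroups of 2).
        sigma = np.abs(np.diff(base)).mean() / 1.128
        z = (np.asarray(values, dtype=float) - centre) / sigma
        rule1 = np.abs(z) > 3
        rule2 = np.zeros_like(rule1)
        for i in range(2, len(z)):
            window = z[i - 2:i + 1]
            rule2[i] = (np.sum(window > 2) >= 2) or (np.sum(window < -2) >= 2)
        return centre, sigma, rule1, rule2

    # Toy usage: a measurement process that drifts after run 30.
    rng = np.random.default_rng(2)
    qc = np.concatenate([rng.normal(100, 2, 30), rng.normal(104, 2, 10)])
    centre, sigma, r1, r2 = control_chart_flags(qc)
    print(np.where(r1 | r2)[0])    # runs flagged for investigation
    ```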

  14. A statistical metadata model for clinical trials' data management.

    PubMed

    Vardaki, Maria; Papageorgiou, Haralambos; Pentaris, Fragkiskos

    2009-08-01

    We introduce a statistical, process-oriented metadata model to describe the process of medical research data collection, management, results analysis and dissemination. Our approach explicitly provides a structure for the pieces of information used in Clinical Study Data Management Systems, enabling a more active role for any associated metadata. Using the object-oriented paradigm, we describe the classes of our model that participate during the design of a clinical trial and the subsequent collection and management of the relevant data. The advantage of our approach is that we focus on presenting the structural inter-relation of these classes during dataset manipulation, by proposing transformations that model the simultaneous processing of both data and metadata. Our solution reduces the possibility of human error and allows for the tracking of all changes made during the dataset lifecycle. The explicit modeling of processing steps improves data quality and assists in the problem of handling data collected in different clinical trials. The case study illustrates the applicability of the proposed framework, demonstrating conceptually the simultaneous handling of datasets collected during two randomized clinical studies. Finally, we provide the main considerations for implementing the proposed framework in a modern Metadata-enabled Information System.

  15. Statistical Calibration and Validation of a Homogeneous Ventilated Wall-Interference Correction Method for the National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Walker, Eric L.

    2005-01-01

    Wind tunnel experiments will continue to be a primary source of validation data for many types of mathematical and computational models in the aerospace industry. The increased emphasis on accuracy of data acquired from these facilities requires understanding of the uncertainty of not only the measurement data but also any correction applied to the data. One of the largest and most critical corrections made to these data is due to wall interference. In an effort to understand the accuracy and suitability of these corrections, a statistical validation process for wall interference correction methods has been developed. This process is based on the use of independent cases which, after correction, are expected to produce the same result. Comparison of these independent cases with respect to the uncertainty in the correction process establishes a domain of applicability based on the capability of the method to provide reasonable corrections with respect to customer accuracy requirements. The statistical validation method was applied to the version of the Transonic Wall Interference Correction System (TWICS) recently implemented in the National Transonic Facility at NASA Langley Research Center. The TWICS code generates corrections for solid and slotted wall interference in the model pitch plane based on boundary pressure measurements. Before validation could be performed on this method, it was necessary to calibrate the ventilated wall boundary condition parameters. Discrimination comparisons are used to determine the most representative of three linear boundary condition models which have historically been used to represent longitudinally slotted test section walls. Of the three linear boundary condition models implemented for ventilated walls, the general slotted wall model was the most representative of the data. The TWICS code using the calibrated general slotted wall model was found to be valid to within the process uncertainty for test section Mach numbers less than or equal to 0.60. The scatter among the mean corrected results of the bodies of revolution validation cases was within one count of drag on a typical transport aircraft configuration for Mach numbers at or below 0.80 and two counts of drag for Mach numbers at or below 0.90.

  16. Collective behavior of networks with linear (VLSI) integrate-and-fire neurons.

    PubMed

    Fusi, S; Mattia, M

    1999-04-01

    We analyze in detail the statistical properties of the spike emission process of a canonical integrate-and-fire neuron, with a linear integrator and a lower bound for the depolarization, as often used in VLSI implementations (Mead, 1989). The spike statistics of such neurons appear to be qualitatively similar to conventional (exponential) integrate-and-fire neurons, which exhibit a wide variety of characteristics observed in cortical recordings. We also show that, contrary to current opinion, the dynamics of a network composed of such neurons has two stable fixed points, even in the purely excitatory network, corresponding to two different states of reverberating activity. The analytical results are compared with numerical simulations and are found to be in good agreement.
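    For readers unfamiliar with the model class analysed above, the following Python sketch simulates a linear (constant-leak) integrate-and-fire neuron with a reflecting lower bound on the depolarization, driven by Poissonian excitatory and inhibitory input. The parameter values are illustrative assumptions, not those of the paper.

    ```python
    import numpy as np

    def linear_if_spikes(T=10.0, dt=1e-4, beta=10.0, theta=1.0,
                         nu_e=1000.0, nu_i=300.0, j_e=0.02, j_i=0.02, seed=0):
        """Linear integrator V' = -beta + synaptic input, with a reflecting
        barrier at V = 0 (the VLSI lower bound) and spike/reset at V = theta."""
        rng = np.random.default_rng(seed)
        v, spikes = 0.0, []
        for k in range(int(T / dt)):
            # Poissonian counts of excitatory/inhibitory events in this step.
            v += -beta * dt + j_e * rng.poisson(nu_e * dt) - j_i * rng.poisson(nu_i * dt)
            v = max(v, 0.0)                 # lower bound on the depolarization
            if v >= theta:
                spikes.append(k * dt)
                v = 0.0                     # reset after the spike
        return np.array(spikes)

    spikes = linear_if_spikes()
    isi = np.diff(spikes)
    print(len(spikes), isi.mean(), isi.std() / isi.mean())  # rate and ISI CV
    ```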

  17. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings, from a course on Bayesian methods in reliability, encompass Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and the nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  18. PyEvolve: a toolkit for statistical modelling of molecular evolution.

    PubMed

    Butterfield, Andrew; Vedagiri, Vivek; Lang, Edward; Lawrence, Cath; Wakefield, Matthew J; Isaev, Alexander; Huttley, Gavin A

    2004-01-05

    Examining the distribution of variation has proven an extremely profitable technique in the effort to identify sequences of biological significance. Most approaches in the field, however, evaluate only the conserved portions of sequences - ignoring the biological significance of sequence differences. A suite of sophisticated likelihood based statistical models from the field of molecular evolution provides the basis for extracting the information from the full distribution of sequence variation. The number of different problems to which phylogeny-based maximum likelihood calculations can be applied is extensive. Available software packages that can perform likelihood calculations suffer from a lack of flexibility and scalability, or employ error-prone approaches to model parameterisation. Here we describe the implementation of PyEvolve, a toolkit for the application of existing, and development of new, statistical methods for molecular evolution. We present the object architecture and design schema of PyEvolve, which includes an adaptable multi-level parallelisation schema. The approach for defining new methods is illustrated by implementing a novel dinucleotide model of substitution that includes a parameter for mutation of methylated CpG's, which required 8 lines of standard Python code to define. Benchmarking was performed using either a dinucleotide or codon substitution model applied to an alignment of BRCA1 sequences from 20 mammals, or a 10 species subset. Up to five-fold parallel performance gains over serial were recorded. Compared to leading alternative software, PyEvolve exhibited significantly better real world performance for parameter rich models with a large data set, reducing the time required for optimisation from approximately 10 days to approximately 6 hours. PyEvolve provides flexible functionality that can be used either for statistical modelling of molecular evolution, or the development of new methods in the field. The toolkit can be used interactively or by writing and executing scripts. The toolkit uses efficient processes for specifying the parameterisation of statistical models, and implements numerous optimisations that make highly parameter rich likelihood functions solvable within hours on multi-cpu hardware. PyEvolve can be readily adapted in response to changing computational demands and hardware configurations to maximise performance. PyEvolve is released under the GPL and can be downloaded from http://cbis.anu.edu.au/software.

  19. Gender-Mainstreaming in Technical and Vocational Education and Training

    NASA Astrophysics Data System (ADS)

    Nurhaeni, I. D. A.; Kurniawan, Y.

    2018-02-01

    Gender differences should be considered in vocational high schools so that women and men can develop their potential without being inhibited by gender bias. Gender mainstreaming in vocational high schools is a strategy for integrating gender differences at all stages of the teaching-learning process to achieve gender equality and equity. This research evaluates the implementation of gender mainstreaming in vocational high schools across its seven key components. Four vocational high schools in Sragen Regency, Indonesia, were purposively selected. The data were obtained through in-depth interviews and documentation studies and analyzed using Kabeer's model of gender analysis. The findings show that not all key components of gender mainstreaming have been implemented in vocational high schools. Most vocational high schools have implemented three of the seven key components, namely political will and leadership, policy framework, and gender statistics. The other four components, namely structure and mechanism, resources, infrastructure, and civil society, have not been well implemented. In conclusion, gender mainstreaming has not been implemented effectively in vocational high schools. Accordingly, the government's education office should continue to encourage and publish guidelines on the implementation of gender mainstreaming in vocational high schools.

  20. Patient safety in the clinical laboratory: a longitudinal analysis of specimen identification errors.

    PubMed

    Wagar, Elizabeth A; Tamashiro, Lorraine; Yasin, Bushra; Hilborne, Lee; Bruckner, David A

    2006-11-01

    Patient safety is an increasingly visible and important mission for clinical laboratories. Attention to improving processes related to patient identification and specimen labeling is being paid by accreditation and regulatory organizations because errors in these areas that jeopardize patient safety are common and avoidable through improvement in the total testing process. To assess patient identification and specimen labeling improvement after multiple implementation projects using longitudinal statistical tools. Specimen errors were categorized by a multidisciplinary health care team. Patient identification errors were grouped into 3 categories: (1) specimen/requisition mismatch, (2) unlabeled specimens, and (3) mislabeled specimens. Specimens with these types of identification errors were compared preimplementation and postimplementation for 3 patient safety projects: (1) reorganization of phlebotomy (4 months); (2) introduction of an electronic event reporting system (10 months); and (3) activation of an automated processing system (14 months) for a 24-month period, using trend analysis and Student t test statistics. Of 16,632 total specimen errors, mislabeled specimens, requisition mismatches, and unlabeled specimens represented 1.0%, 6.3%, and 4.6% of errors, respectively. Student t test showed a significant decrease in the most serious error, mislabeled specimens (P < .001) when compared to before implementation of the 3 patient safety projects. Trend analysis demonstrated decreases in all 3 error types for 26 months. Applying performance-improvement strategies that focus longitudinally on specimen labeling errors can significantly reduce errors, therefore improving patient safety. This is an important area in which laboratory professionals, working in interdisciplinary teams, can improve safety and outcomes of care.

  1. Helping Students Develop Statistical Reasoning: Implementing a Statistical Reasoning Learning Environment

    ERIC Educational Resources Information Center

    Garfield, Joan; Ben-Zvi, Dani

    2009-01-01

    This article describes a model for an interactive, introductory secondary- or tertiary-level statistics course that is designed to develop students' statistical reasoning. This model is called a "Statistical Reasoning Learning Environment" and is built on the constructivist theory of learning.

  2. Technology and medication errors: impact in nursing homes.

    PubMed

    Baril, Chantal; Gascon, Viviane; St-Pierre, Liette; Lagacé, Denis

    2014-01-01

    The purpose of this paper is to study a medication distribution technology's (MDT) impact on medication errors reported in public nursing homes in Québec Province. The work was carried out in six nursing homes (800 patients). Medication error data were collected from nursing staff through a voluntary reporting process before and after MDT was implemented. The errors were analysed using: total errors; medication error type; severity; and patient consequences. A statistical analysis verified whether there was a significant difference between the variables before and after introducing MDT. The results show that the MDT detected medication errors. The authors' analysis also indicates that errors are detected more rapidly resulting in less severe consequences for patients. MDT is a step towards safer and more efficient medication processes. Our findings should convince healthcare administrators to implement technology such as electronic prescriber or bar code medication administration systems to improve medication processes and to provide better healthcare to patients. Few studies have been carried out in long-term healthcare facilities such as nursing homes. The authors' study extends what is known about MDT's impact on medication errors in nursing homes.

  3. Can There Ever Be Enough to Impact Water Quality? Evaluating BMPs in Elliot Ditch, Indiana Using the LTHIA-LID Model

    NASA Astrophysics Data System (ADS)

    Rahman, M. S.; Hoover, F. A.; Bowling, L. C.

    2017-12-01

    Elliot Ditch is an urban/urbanizing watershed located in the city of Lafayette, IN, USA. The city continues to struggle with stormwater management and combined sewer overflow (CSO) events. Several best-management practices (BMP) such as rain gardens, green roofs, and bioswales have been implemented in the watershed, but the level of adoption needed to achieve meaningful impact is currently unknown. This study's goal is to determine what level of BMP coverage is needed to impact water quality, whether meaningful impact is determined by achieving water quality targets or statistical significance. A power analysis was performed using water quality data for total suspended solids (TSS), E.coli, total phosphorus (TP) and nitrate (NO3-N) from Elliot Ditch from 2011 to 2015. The minimum detectable difference (MDD) was calculated as the percent reduction in load needed to detect a significant change in the watershed. The water quality targets were proposed by stakeholders as part of a watershed management planning process. The water quality targets and the MDD percentages were then compared to simulated load reductions due to BMP implementation using the Long-term Hydrologic Impact Assessment-Low Impact Development (LTHIA-LID) model. Seven baseline model scenarios were simulated by implementing the maximum number of each of six types of BMPs (rain barrels, permeable patios, green roofs, grassed swale/bioswales, bioretention/rain gardens, and porous pavement), as well as all the practices combined in the watershed. These provide the baseline for targeted implementation scenarios designed to determine if statistically and physically meaningful load reductions can be achieved through BMP implementation alone.
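    To make the minimum detectable difference (MDD) idea concrete, here is a hedged Python sketch of one common formulation: the percent reduction in mean load detectable with a two-sample t-test at a given alpha, power and sampling variability. The exact formulation and inputs used in the study may differ; the CV and sample sizes below are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    def minimum_detectable_difference(cv, n_pre, n_post, alpha=0.05, power=0.8):
        """Approximate MDD, expressed as a percent reduction in the mean load,
        for a one-sided two-sample t-test. cv is the coefficient of variation
        of the load observations."""
        df = n_pre + n_post - 2
        t_alpha = stats.t.ppf(1 - alpha, df)
        t_beta = stats.t.ppf(power, df)
        se = cv * np.sqrt(1.0 / n_pre + 1.0 / n_post)  # SE relative to the mean
        return 100.0 * (t_alpha + t_beta) * se         # percent of baseline mean

    # Toy usage: ~40 pre- and post-BMP samples with a CV of 0.6 for TSS loads.
    print(round(minimum_detectable_difference(cv=0.6, n_pre=40, n_post=40), 1))
    ```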

  4. [Implementation of a best practice guideline for the prevention of falls: Perception among hospitalized patients and its caregivers].

    PubMed

    Saiz-Vinuesa, M D; Muñoz-Mansilla, E; Muñoz-Serrano, T; Córcoles-Jiménez, M P; Ruiz-García, M V; Fernández-Pallarés, P; Herreros-Sáez, L; Calero-Yáñez, F

    To analyze the influence that implementation of a fall-prevention Best Practice Guideline (BPG) could have on the perception of patients and their caregivers of the usefulness of the activities implemented and of the care provided during admission, and on adherence to the recommendations received at discharge. Design. Quasi-experimental study. Patients >65 years admitted for ≥48 h to the Medical Area of the General Hospital of Albacete; 104 subjects (consecutive sampling, January-March 2013). Experimental group (EG): patients admitted to BPG implementation units. Control group (CG): usual-care units. Variables: sociodemographic characteristics; falls before and during admission; cognitive status (Pfeiffer); independence in activities of daily living (ADLs); satisfaction with the care and information provided; perceived usefulness; adherence to recommendations at discharge. Data were collected by interview and from clinical histories, with statistical analysis in SPSS 15.0 (descriptive and bivariate statistics; relative risk with 95% CI). Of the 104 patients, 46.2% (48) were in the EG and 53.8% (56) in the CG; 51.9% were women, mean age 79.9 years (SD=7.8), mean Pfeiffer score 4.3 (SD=3.7), and 31.1% had previous falls. During admission there was one fall in each group. There were statistically significant differences between the EG and CG in age, cognitive status and independence in ADLs. In the EG, the proportion perceiving the fall-prevention recommendations as useful was higher (P<.001), adherence to them was greater (P=.0002), and more patients were very or quite satisfied with the information (P<.00004) and the care received (P=.002). Implementing recommendations from an evidence-based BPG to prevent falls in older people was associated, among users and caregivers, with greater satisfaction, a better perception of its usefulness and greater adherence to the recommendations. Copyright © 2016 SECA. Published by Elsevier España, S.L.U. All rights reserved.

  5. Information processing requirements for on-board monitoring of automatic landing

    NASA Technical Reports Server (NTRS)

    Sorensen, J. A.; Karmarkar, J. S.

    1977-01-01

    A systematic procedure is presented for determining the information processing requirements for on-board monitoring of automatic landing systems. The monitoring system detects landing anomalies through use of appropriate statistical tests. The time-to-correct aircraft perturbations is determined from covariance analyses using a sequence of suitable aircraft/autoland/pilot models. The covariance results are used to establish landing safety and a fault recovery operating envelope via an event outcome tree. This procedure is demonstrated with examples using the NASA Terminal Configured Vehicle (B-737 aircraft). The procedure can also be used to define decision height, assess monitoring implementation requirements, and evaluate alternate autoland configurations.

  6. A new computer code for discrete fracture network modelling

    NASA Astrophysics Data System (ADS)

    Xu, Chaoshui; Dowd, Peter

    2010-03-01

    The authors describe a comprehensive software package for two- and three-dimensional stochastic rock fracture simulation using marked point processes. Fracture locations can be modelled by a Poisson, a non-homogeneous, a cluster or a Cox point process; fracture geometries and properties are modelled by their respective probability distributions. Virtual sampling tools such as plane, window and scanline sampling are included in the software together with a comprehensive set of statistical tools including histogram analysis, probability plots, rose diagrams and hemispherical projections. The paper describes in detail the theoretical basis of the implementation and provides a case study in rock fracture modelling to demonstrate the application of the software.
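    A minimal Python illustration of the marked-point-process idea (not the authors' package): fracture centres from a homogeneous Poisson process in a 2-D window, with trace length and orientation attached as marks drawn from their own distributions. The intensity and mark distributions are illustrative assumptions.

    ```python
    import numpy as np

    def simulate_fracture_network(intensity=0.05, width=100.0, height=100.0, seed=0):
        """Homogeneous Poisson marked point process: the number of fractures is
        Poisson(intensity * area), centres are uniform in the window, and each
        centre carries marks (trace length ~ lognormal, orientation ~ von Mises)."""
        rng = np.random.default_rng(seed)
        n = rng.poisson(intensity * width * height)
        centres = np.column_stack([rng.uniform(0, width, n),
                                   rng.uniform(0, height, n)])
        lengths = rng.lognormal(mean=1.5, sigma=0.5, size=n)     # trace lengths
        angles = rng.vonmises(mu=np.pi / 3, kappa=4.0, size=n)   # preferred set
        # Segment end points, usable for plotting or scanline/window sampling.
        d = np.column_stack([np.cos(angles), np.sin(angles)]) * (lengths[:, None] / 2)
        return centres - d, centres + d

    p0, p1 = simulate_fracture_network()
    print(len(p0), "fractures simulated")
    ```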

  7. A Data Warehouse Architecture for DoD Healthcare Performance Measurements.

    DTIC Science & Technology

    1999-09-01

    This thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of DoD healthcare metrics.

  8. Using experimental design modules for process characterization in manufacturing/materials processes laboratories

    NASA Technical Reports Server (NTRS)

    Ankenman, Bruce; Ermer, Donald; Clum, James A.

    1994-01-01

    Modules dealing with statistical experimental design (SED), process modeling and improvement, and response surface methods have been developed and tested in two laboratory courses. One course was a manufacturing processes course in Mechanical Engineering and the other course was a materials processing course in Materials Science and Engineering. Each module is used as an 'experiment' in the course with the intent that subsequent course experiments will use SED methods for analysis and interpretation of data. Evaluation of the modules' effectiveness has been done by both survey questionnaires and inclusion of the module methodology in course examination questions. Results of the evaluation have been very positive. Those evaluation results and details of the modules' content and implementation are presented. The modules represent an important component for updating laboratory instruction and to provide training in quality for improved engineering practice.

  9. Students' Emergent Articulations of Statistical Models and Modeling in Making Informal Statistical Inferences

    ERIC Educational Resources Information Center

    Braham, Hana Manor; Ben-Zvi, Dani

    2017-01-01

    A fundamental aspect of statistical inference is representation of real-world data using statistical models. This article analyzes students' articulations of statistical models and modeling during their first steps in making informal statistical inferences. An integrated modeling approach (IMA) was designed and implemented to help students…

  10. Statistical power in parallel group point exposure studies with time-to-event outcomes: an empirical comparison of the performance of randomized controlled trials and the inverse probability of treatment weighting (IPTW) approach.

    PubMed

    Austin, Peter C; Schuster, Tibor; Platt, Robert W

    2015-10-15

    Estimating statistical power is an important component of the design of both randomized controlled trials (RCTs) and observational studies. Methods for estimating statistical power in RCTs have been well described and can be implemented simply. In observational studies, statistical methods must be used to remove the effects of confounding that can occur due to non-random treatment assignment. Inverse probability of treatment weighting (IPTW) using the propensity score is an attractive method for estimating the effects of treatment using observational data. However, sample size and power calculations have not been adequately described for these methods. We used an extensive series of Monte Carlo simulations to compare the statistical power of an IPTW analysis of an observational study with time-to-event outcomes with that of an analysis of a similarly-structured RCT. We examined the impact of four factors on the statistical power function: number of observed events, prevalence of treatment, the marginal hazard ratio, and the strength of the treatment-selection process. We found that, on average, an IPTW analysis had lower statistical power compared to an analysis of a similarly-structured RCT. The difference in statistical power increased as the magnitude of the treatment-selection model increased. The statistical power of an IPTW analysis tended to be lower than the statistical power of a similarly-structured RCT.
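    To make the IPTW step concrete, here is a hedged Python sketch (not the authors' simulation code) of computing stabilized inverse probability of treatment weights from a fitted propensity score model; the weighted data would then be passed to a survival model to estimate the marginal hazard ratio. The data-generating values are illustrative.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def stabilized_iptw(X, treated):
        """Stabilized IPTW: weight = P(Z=z) / P(Z=z | X), with the propensity
        score P(Z=1 | X) estimated by logistic regression on confounders X."""
        ps = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
        p_treat = treated.mean()
        w = np.where(treated == 1, p_treat / ps, (1 - p_treat) / (1 - ps))
        return w, ps

    # Toy usage: treatment assignment confounded by X.
    rng = np.random.default_rng(3)
    X = rng.normal(size=(2000, 3))
    treated = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1]))))
    w, ps = stabilized_iptw(X, treated)
    print(w.mean(), w.max())   # stabilized weights should average close to 1
    ```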

  11. Evaluation of probabilistic forecasts with the scoringRules package

    NASA Astrophysics Data System (ADS)

    Jordan, Alexander; Krüger, Fabian; Lerch, Sebastian

    2017-04-01

    Over the last decades probabilistic forecasts in the form of predictive distributions have become popular in many scientific disciplines. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way in order to better understand sources of prediction errors and to improve the models. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. In coherence with decision-theoretic principles, they allow the comparison of alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that is available in many situations. This contribution presents the software package scoringRules for the statistical programming language R, which provides functions to compute popular scoring rules such as the continuous ranked probability score for a variety of distributions F that come up in applied work. For univariate variables, two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically, but are indirectly described through a sample of simulation draws. For example, ensemble weather forecasts take this form. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state of the art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices. Recent developments include the addition of scoring rules to evaluate multivariate forecast distributions. The use of the scoringRules package is illustrated in an example on post-processing ensemble forecasts of temperature.
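    As an example of the kind of closed-form expression such packages collect, the continuous ranked probability score of a Gaussian predictive distribution has a well-known analytic form. The Python transcription below is a hedged illustration and is not taken from the scoringRules source.

    ```python
    import numpy as np
    from scipy import stats

    def crps_normal(y, mu, sigma):
        """Closed-form CRPS for a N(mu, sigma^2) forecast and observation y
        (lower is better; reduces to |y - mu| as sigma -> 0)."""
        z = (y - mu) / sigma
        return sigma * (z * (2 * stats.norm.cdf(z) - 1)
                        + 2 * stats.norm.pdf(z) - 1 / np.sqrt(np.pi))

    # A sharp, well-calibrated forecast scores better (lower) on average.
    rng = np.random.default_rng(4)
    obs = rng.normal(15.0, 2.0, 10_000)
    print(crps_normal(obs, 15.0, 2.0).mean(), crps_normal(obs, 15.0, 5.0).mean())
    ```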

  12. Urban land use monitoring from computer-implemented processing of airborne multispectral data

    NASA Technical Reports Server (NTRS)

    Todd, W. J.; Mausel, P. W.; Baumgardner, M. F.

    1976-01-01

    Machine processing techniques were applied to multispectral data obtained from airborne scanners at an elevation of 600 meters over central Indianapolis in August, 1972. Computer analysis of these spectral data indicate that roads (two types), roof tops (three types), dense grass (two types), sparse grass (two types), trees, bare soil, and water (two types) can be accurately identified. Using computers, it is possible to determine land uses from analysis of type, size, shape, and spatial associations of earth surface images identified from multispectral data. Land use data developed through machine processing techniques can be programmed to monitor land use changes, simulate land use conditions, and provide impact statistics that are required to analyze stresses placed on spatial systems.

  13. Optimization of business processes in banks through flexible workflow

    NASA Astrophysics Data System (ADS)

    Postolache, V.

    2017-08-01

    This article describes an integrated business model of a commercial bank and gives examples of the components that go into its composition: models of business processes, strategic goals, organizational structure, system architecture, operational and marketing risk models, etc. Practice has shown that developing and implementing an integrated business model of the bank significantly increases operating efficiency and the quality of its management, and ensures stable organizational and technological development. Considering the evolution of business processes in the banking sector, their common characteristics should be analysed. From the author's point of view, a business process is a set of activities of a commercial bank whose input is one or more financial and material resources and whose output is a banking product that has value to the consumer. With workflow technology, managing business process efficiency becomes a matter of managing the integration of resources and the sequence of actions aimed at achieving the goal: managing the interaction of jobs or functions, synchronizing assignment periods, reducing delays in the transmission of results, and so on. Workflow technology is important for managers at all levels, as they can use it to strengthen control over what is happening in a particular unit and in the bank as a whole. The manager is able to plan, implement rules, interact within the framework of the company's procedures, distribute and control the execution of tasks, receive alerts on their implementation, and obtain statistical data on the effectiveness of operating procedures. Development and active use of an integrated bank business model is one of the key success factors that contribute to the long-term and stable development of the bank, increase employee and business process efficiency, and help implement the strategic objectives.

  14. Safe and effective nursing shift handover with NURSEPASS: An interrupted time series.

    PubMed

    Smeulers, Marian; Dolman, Christine D; Atema, Danielle; van Dieren, Susan; Maaskant, Jolanda M; Vermeulen, Hester

    2016-11-01

    Implementation of a locally developed evidence based nursing shift handover blueprint with a bedside-safety-check to determine the effect size on quality of handover. A mixed methods design with: (1) an interrupted time series analysis to determine the effect on handover quality in six domains; (2) descriptive statistics to analyze the intercepted discrepancies by the bedside-safety-check; (3) evaluation sessions to gather experiences with the new handover process. We observed a continued trend of improvement in handover quality and a significant improvement in two domains of handover: organization/efficiency and contents. The bedside-safety-check successfully identified discrepancies on drains, intravenous medications, bandages or general condition and was highly appreciated. Use of the nursing shift handover blueprint showed promising results on effectiveness as well as on feasibility and acceptability. However, to enable long term measurement on effectiveness, evaluation with large scale interrupted times series or statistical process control is needed. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Efficient and Flexible Climate Analysis with Python in a Cloud-Based Distributed Computing Framework

    NASA Astrophysics Data System (ADS)

    Gannon, C.

    2017-12-01

    As climate models become progressively more advanced, and spatial resolution further improved through various downscaling projects, climate projections at a local level are increasingly insightful and valuable. However, the raw size of climate datasets presents numerous hurdles for analysts wishing to develop customized climate risk metrics or perform site-specific statistical analysis. Four Twenty Seven, a climate risk consultancy, has implemented a Python-based distributed framework to analyze large climate datasets in the cloud. With the freedom afforded by efficiently processing these datasets, we are able to customize and continually develop new climate risk metrics using the most up-to-date data. Here we outline our process for using Python packages such as XArray and Dask to evaluate netCDF files in a distributed framework, StarCluster to operate in a cluster-computing environment, cloud computing services to access publicly hosted datasets, and how this setup is particularly valuable for generating climate change indicators and performing localized statistical analysis.
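    A hedged sketch of the pattern described above, using xarray with Dask-backed chunks to compute a simple local indicator (days per year above a temperature threshold) from a collection of netCDF files. The file pattern, the variable name `tasmax` and the coordinate names `lat`/`lon` are assumptions for illustration, not Four Twenty Seven's actual pipeline.

    ```python
    import xarray as xr

    # Lazily open many netCDF files as one Dask-backed dataset (no data read yet).
    ds = xr.open_mfdataset("downscaled/tasmax_*.nc", combine="by_coords",
                           chunks={"time": 365})

    # Example indicator: days per year above 35 degC (308.15 K) in each grid cell.
    hot_days = (ds["tasmax"] > 308.15).groupby("time.year").sum("time")

    # Subset to a site of interest and trigger the (distributed) computation.
    site = hot_days.sel(lat=37.8, lon=-122.3, method="nearest").compute()
    print(site.values)
    ```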

  16. Certification Strategies using Run-Time Safety Assurance for Part 23 Autopilot Systems

    NASA Technical Reports Server (NTRS)

    Hook, Loyd R.; Clark, Matthew; Sizoo, David; Skoog, Mark A.; Brady, James

    2016-01-01

    Part 23 aircraft operation, and in particular general aviation, is relatively unsafe when compared to other common forms of vehicle travel. Currently, there exist technologies that could improve the safety statistics for these aircraft; however, the high burden and cost of performing the requisite safety-critical certification processes for these systems limit their proliferation. For this reason, many entities, including the Federal Aviation Administration, NASA, and the US Air Force, are considering new options for certification for technologies that will improve aircraft safety. Of particular interest are low cost autopilot systems for general aviation aircraft, as these systems have the potential to positively and significantly affect safety statistics. This paper proposes new systems and techniques, leveraging run-time verification, for the assurance of general aviation autopilot systems, which would be used to supplement the current certification process and provide a viable path for near-term low-cost implementation. In addition, discussions on preliminary experimentation and building the assurance case for a system based on these principles are provided.

  17. Development of an errorable car-following driver model

    NASA Astrophysics Data System (ADS)

    Yang, H.-H.; Peng, H.

    2010-06-01

    An errorable car-following driver model is presented in this paper. An errorable driver model is one that emulates human driver's functions and can generate both nominal (error-free), as well as devious (with error) behaviours. This model was developed for evaluation and design of active safety systems. The car-following data used for developing and validating the model were obtained from a large-scale naturalistic driving database. The stochastic car-following behaviour was first analysed and modelled as a random process. Three error-inducing behaviours were then introduced. First, human perceptual limitation was studied and implemented. Distraction due to non-driving tasks was then identified based on the statistical analysis of the driving data. Finally, time delay of human drivers was estimated through a recursive least-square identification process. By including these three error-inducing behaviours, rear-end collisions with the lead vehicle could occur. The simulated crash rate was found to be similar but somewhat higher than that reported in traffic statistics.
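    The parameter and time-delay identification step lends itself to a short illustration: below is a generic recursive least-squares (RLS) sketch in Python for a linear car-following law with an assumed reaction delay. It is not the authors' exact formulation; the forgetting factor, delay and synthetic data are illustrative assumptions.

    ```python
    import numpy as np

    def rls_identify(phi, y, lam=0.99, delta=1000.0):
        """Recursive least squares: estimate theta in y_k ~= phi_k . theta with
        exponential forgetting factor lam; phi has one regressor row per sample."""
        n = phi.shape[1]
        theta = np.zeros(n)
        P = delta * np.eye(n)
        for k in range(len(y)):
            x = phi[k]
            Px = P @ x
            gain = Px / (lam + x @ Px)
            theta = theta + gain * (y[k] - x @ theta)
            P = (P - np.outer(gain, Px)) / lam
        return theta

    # Toy usage: acceleration responds to delayed range and range-rate errors.
    rng = np.random.default_rng(5)
    range_err, rate_err = rng.normal(size=2000), rng.normal(size=2000)
    delay = 8                                  # assumed reaction delay in samples
    a = 0.4 * np.roll(range_err, delay) + 0.9 * np.roll(rate_err, delay)
    a += rng.normal(scale=0.05, size=2000)
    phi = np.column_stack([np.roll(range_err, delay), np.roll(rate_err, delay)])
    print(rls_identify(phi[delay:], a[delay:]))   # recovers roughly [0.4, 0.9]
    ```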

  18. Implementation of an Algorithm for Prosthetic Joint Infection: Deviations and Problems.

    PubMed

    Mühlhofer, Heinrich M L; Kanz, Karl-Georg; Pohlig, Florian; Lenze, Ulrich; Lenze, Florian; Toepfer, Andreas; von Eisenhart-Rothe, Ruediger; Schauwecker, Johannes

    The outcome of revision surgery in arthroplasty is based on a precise diagnosis. In addition, the treatment varies based on whether the prosthetic failure is caused by aseptic or septic loosening. Algorithms can help to identify periprosthetic joint infections (PJI) and standardize diagnostic steps, however, algorithms tend to oversimplify the treatment of complex cases. We conducted a process analysis during the implementation of a PJI algorithm to determine problems and deviations associated with the implementation of this algorithm. Fifty patients who were treated after implementing a standardized algorithm were monitored retrospectively. Their treatment plans and diagnostic cascades were analyzed for deviations from the implemented algorithm. Each diagnostic procedure was recorded, compared with the algorithm, and evaluated statistically. We detected 52 deviations while treating 50 patients. In 25 cases, no discrepancy was observed. Synovial fluid aspiration was not performed in 31.8% of patients (95% confidence interval [CI], 18.1%-45.6%), while white blood cell counts (WBCs) and neutrophil differentiation were assessed in 54.5% of patients (95% CI, 39.8%-69.3%). We also observed that the prolonged incubation of cultures was not requested in 13.6% of patients (95% CI, 3.5%-23.8%). In seven of 13 cases (63.6%; 95% CI, 35.2%-92.1%), arthroscopic biopsy was performed; 6 arthroscopies were performed in discordance with the algorithm (12%; 95% CI, 3%-21%). Self-critical analysis of diagnostic processes and monitoring of deviations using algorithms are important and could increase the quality of treatment by revealing recurring faults.

  19. Implementation status of accrual accounting system in health sector.

    PubMed

    Mehrolhassani, Mohammad Hossien; Khayatzadeh-Mahani, Akram; Emami, Mozhgan

    2014-07-29

    Management of financial resources in health systems is one of the major issues of concern for policy makers globally. As a subset of financial management, the accounting system is of paramount importance. In this paper, which presents part of the results of a wider research project on the transition from a cash accounting system to an accrual accounting system, we look at the impact of the components of change on implementation of the new system. Implementing change is fraught with many obstacles, and surveying these challenges will help policy makers to better overcome them. The study applied a quantitative approach in 2012 at Kerman University of Medical Sciences in Iran. For the evaluation, a researcher-developed validated questionnaire with a Likert scale was used (Cronbach's alpha of 0.89), covering 7 components of change in the accounting system. The study population was 32 subordinate units of Kerman University of Medical Sciences, and descriptive and inferential statistics and correlation coefficients in SPSS version 19 were used for data analysis. The effect of all components on implementation was below average (5.06±1.86), except for the component "management & leadership" (3.46±2.25; undesirable from the external evaluators' viewpoint) and the components "technology" (6.61±1.92) and "work processes" (6.35±2.19; middle to high from the internal evaluators' viewpoint). Results showed that the establishment of an accrual accounting system faces infrastructural challenges, especially in the components of leadership and management and of followers. As such, effective measures to overcome implementation obstacles should target these components.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kegel, T.M.

    Calibration laboratories are faced with the need to become accredited or registered to one or more quality standards. One requirement common to all of these standards is the need to have in place a measurement assurance program. What is a measurement assurance program? Brian Belanger, in Measurement Assurance Programs: Part 1, describes it as a "quality assurance program for a measurement process that quantifies the total uncertainty of the measurements (both random and systematic components of error) with respect to national or designated standards and demonstrates that the total uncertainty is sufficiently small to meet the user's requirements." Rolf Schumacher is more specific in Measurement Assurance in Your Own Laboratory. He states, "Measurement assurance is the application of broad quality control principles to measurements of calibrations." Here, the focus is on one important part of any measurement assurance program: implementation of statistical process control (SPC). Paraphrasing Juran's Quality Control Handbook, a process is in statistical control if the only observed variations are those that can be attributed to random causes. Conversely, a process that exhibits variations due to assignable causes is not in a state of statistical control. Finally, Carrol Croarkin states, "In the measurement assurance context the measurement algorithm including instrumentation, reference standards and operator interactions is the process that is to be controlled, and its direct product is the measurement per se. The measurements are assumed to be valid if the measurement algorithm is operating in a state of control." Implicit in this statement is the important fact that an out-of-control process cannot produce valid measurements. 7 figs.

  1. Implementation of a Systematic Accountability Framework in 2014 to Improve the Performance of the Nigerian Polio Program

    PubMed Central

    Tegegne, Sisay G.; MKanda, Pascal; Yehualashet, Yared G.; Erbeto, Tesfaye B.; Touray, Kebba; Nsubuga, Peter; Banda, Richard; Vaz, Rui G.

    2016-01-01

    Background. An accountability framework is a central feature of managing human and financial resources. One of its primary goals is to improve program performance through close monitoring of selected priority activities. The principal objective of this study was to determine the contribution of a systematic accountability framework to improving the performance of the World Health Organization (WHO)–Nigeria polio program staff, as well as the program itself. Methods. The effect of implementation of the accountability framework was evaluated using data on administrative actions and select process indicators associated with acute flaccid paralysis (AFP) surveillance, routine immunization, and polio supplemental immunization activities. Data were collected in 2014 during supportive supervision, using Magpi software (a company that provides service to collect data using mobile phones). A total of 2500 staff were studied. Results. Data on administrative actions and process indicators from quarters 2–4 in 2014 were compared. With respect to administrative actions, 1631 personnel (74%) received positive feedback (written or verbal commendation) in quarter 4 through the accountability framework, compared with 1569 (73%) and 1152 (61%) during quarters 3 and 2, respectively. These findings accorded with data on process indicators associated with AFP surveillance and routine immunization, showing statistically significant improvements in staff performance at the end of quarter 4, compared with other quarters. Conclusions. Improvements in staff performance and process indicators were observed for the WHO-Nigeria polio program after implementation of a systematic accountability framework. PMID:26823334

  2. Contribution au developpement d'une methode de controle des procedes dans une usine de bouletage

    NASA Astrophysics Data System (ADS)

    Gosselin, Claude

    This thesis, a collaborative effort between Ecole de technologie superieure and ArcelorMittal Company, presents the development of a methodology for monitoring and quality control of multivariable industrial production processes. This innovation research mandate was developed at the ArcelorMittal Exploitation Miniere (AMEM) pellet plant in Port-Cartier (Quebec, Canada). With this undertaking, ArcelorMittal is striving to maintain its world-class level of excellence and continues to pursue initiatives that can augment its competitive advantage worldwide. The plant's gravimetric classification process was retained as a prototype and development laboratory due to its effect on the company's competitiveness and its impact on the subsequent steps leading to final production of iron oxide pellets. Concretely, the development of this expertise in process control and in situ monitoring will establish a firm knowledge base in the fields of complex system physical modeling, data reconciliation, statistical observers, multivariable control and quality control using real-time monitoring of the desirability function. The hydraulic classifier is mathematically modeled. Using planned disturbances on the production line, an identification procedure was established to provide empirical estimations of the model's structural parameters. A new sampling campaign and a previously unpublished data collection and consolidation policy were implemented plant-wide. Access to these invaluable data sources has enabled the establishment of new thresholds that govern the production process and its control. Finally, as a substitute for the traditional quality control process, we have implemented a new strategy based on the use of the desirability function. Our innovation is not in using this function as an indicator of overall (economic) satisfaction in the production process, but rather in proposing it as an "observer" of the system's state. The first implementation steps have already demonstrated the method's feasibility as well as numerous other industrial impacts on production processes within the company, namely the emergence of the economic aspect as a strategic variable that assures better governance of production processes where quality variables present strategic issues.
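    For readers unfamiliar with the desirability function used here as a state "observer", the sketch below gives the classic Derringer-Suich one-sided desirability and its geometric-mean aggregation in Python. The quality variables, targets and weights are illustrative assumptions, not the plant's actual settings.

    ```python
    import numpy as np

    def desirability_larger_is_better(y, low, target, s=1.0):
        """Derringer-Suich one-sided desirability: 0 below `low`, 1 above
        `target`, and a power ramp in between (s > 1 penalizes shortfalls)."""
        d = np.clip((np.asarray(y, dtype=float) - low) / (target - low), 0.0, 1.0)
        return d ** s

    def overall_desirability(ds, weights=None):
        """Weighted geometric mean of the individual desirabilities."""
        ds = np.asarray(ds, dtype=float)
        w = np.ones(len(ds)) if weights is None else np.asarray(weights, float)
        return float(np.exp(np.sum(w * np.log(np.maximum(ds, 1e-12))) / w.sum()))

    # Toy usage: concentrate grade and recovery combined into one overall index.
    d_grade = desirability_larger_is_better(66.2, low=65.0, target=67.0)
    d_recovery = desirability_larger_is_better(0.78, low=0.75, target=0.85, s=2.0)
    print(overall_desirability([d_grade, d_recovery]))
    ```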

  3. Lessons Learned from the Implementation of Total Quality Management at the Naval Aviation Depot, North Island, California

    DTIC Science & Technology

    1988-12-01

    Kaoru Ishikawa recognized the potential of statistical process control during one of Dr. Deming's many instructional visits to Japan. He wrote the Guide to Quality Control, which has been utilized for both self-study and classroom training. In the Guide to Quality Control, Dr. Ishikawa describes ... job data are essential for making a proper evaluation (Ishikawa, p. 14). The gathering of data and its subsequent analysis are the foundation of ...

  4. Prognostics

    NASA Technical Reports Server (NTRS)

    Goebel, Kai; Vachtsevanos, George; Orchard, Marcos E.

    2013-01-01

    Knowledge discovery, statistical learning, and more specifically an understanding of the system evolution in time when it undergoes undesirable fault conditions, are critical for an adequate implementation of successful prognostic systems. Prognosis may be understood as the generation of long-term predictions describing the evolution in time of a particular signal of interest or fault indicator, with the purpose of estimating the remaining useful life (RUL) of a failing component/subsystem. Predictions are made using a thorough understanding of the underlying processes and factor in the anticipated future usage.
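    A toy illustration of the long-term prediction step (not any specific prognostic system from the chapter): fit a simple exponential degradation model to a fault indicator and extrapolate it to a failure threshold to obtain a remaining-useful-life estimate. The model form, threshold and data are assumptions for illustration.

    ```python
    import numpy as np

    def rul_exponential(t, health, threshold):
        """Fit log(health) = a + b*t by least squares (exponential degradation)
        and return the remaining useful life until `threshold` is crossed."""
        b, a = np.polyfit(t, np.log(health), 1)
        t_fail = (np.log(threshold) - a) / b    # time at which the fit hits the threshold
        return max(t_fail - t[-1], 0.0), (a, b)

    # Toy usage: a crack-growth-like fault indicator observed up to cycle 290.
    rng = np.random.default_rng(6)
    t = np.arange(0, 300, 10.0)
    health = 0.1 * np.exp(0.004 * t) * np.exp(rng.normal(0, 0.03, len(t)))
    rul, (a, b) = rul_exponential(t, health, threshold=1.0)
    print(round(rul), "cycles of remaining useful life (point estimate)")
    ```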

  5. A Statistical Estimate of the Validity and Reliability of a Rubric Developed by Connecticut's State Education Resource Center to Evaluate the Quality of Individualized Education Programs for Students with Disabilities

    ERIC Educational Resources Information Center

    Mearman, Kimberly A.

    2013-01-01

    Because of the critical function of the IEP in the planning and implementation of effective instruction for students with disabilities, educators need a reference to determine the standards of a quality IEP and a process by which to compare an IEP to those standards. A rubric can support educators in examining the quality of IEPs. This study used…

  6. Teaching Biology through Statistics: Application of Statistical Methods in Genetics and Zoology Courses

    PubMed Central

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A.

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math–biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology. PMID:21885822

  7. Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.

    PubMed

    Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S

    2018-05-05

    Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data across a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from the assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume, and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.
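    A minimal Python sketch of the item response theory building block referenced above: the two-parameter logistic item characteristic curve and its Fisher information, which is how markers from different modalities can be placed on, and add precision along, a single latent severity continuum. The parameter values are illustrative only, not estimates from ADNI.

    ```python
    import numpy as np

    def icc_2pl(theta, a, b):
        """Two-parameter logistic item characteristic curve: probability of an
        'impaired' response at latent severity theta, with discrimination a and
        location b (the severity at which P = 0.5)."""
        return 1.0 / (1.0 + np.exp(-a * (theta - b)))

    def item_information(theta, a, b):
        """Fisher information contributed by the item at each severity level."""
        p = icc_2pl(theta, a, b)
        return a**2 * p * (1.0 - p)

    theta = np.linspace(-3, 3, 121)             # latent AD severity continuum
    # Hypothetical items: an imaging marker locates earlier (lower b) than an
    # adaptive-function marker, so it adds information in the milder range.
    info_imaging = item_information(theta, a=1.8, b=-1.0)
    info_adaptive = item_information(theta, a=1.5, b=1.5)
    print(theta[np.argmax(info_imaging)], theta[np.argmax(info_adaptive)])
    ```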

  8. Functional annotation of regulatory pathways.

    PubMed

    Pandey, Jayesh; Koyutürk, Mehmet; Kim, Yohan; Szpankowski, Wojciech; Subramaniam, Shankar; Grama, Ananth

    2007-07-01

    Standardized annotations of biomolecules in interaction networks (e.g. Gene Ontology) provide comprehensive understanding of the function of individual molecules. Extending such annotations to pathways is a critical component of functional characterization of cellular signaling at the systems level. We propose a framework for projecting gene regulatory networks onto the space of functional attributes using multigraph models, with the objective of deriving statistically significant pathway annotations. We first demonstrate that annotations of pairwise interactions do not generalize to indirect relationships between processes. Motivated by this result, we formalize the problem of identifying statistically overrepresented pathways of functional attributes. We establish the hardness of this problem by demonstrating the non-monotonicity of common statistical significance measures. We propose a statistical model that emphasizes the modularity of a pathway, evaluating its significance based on the coupling of its building blocks. We complement the statistical model by an efficient algorithm and software, Narada, for computing significant pathways in large regulatory networks. Comprehensive results from our methods applied to the Escherichia coli transcription network demonstrate that our approach is effective in identifying known, as well as novel biological pathway annotations. Narada is implemented in Java and is available at http://www.cs.purdue.edu/homes/jpandey/narada/.
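
    As a point of reference for the overrepresentation problem discussed above, the sketch below scores a single functional attribute with a hypergeometric tail test. This is a generic enrichment baseline under invented counts, not Narada's multigraph pathway model, which explicitly accounts for pathway modularity.

    ```python
    from scipy.stats import hypergeom

    def enrichment_p(total_genes, annotated_in_genome, pathway_size, annotated_in_pathway):
        """Upper-tail P(X >= k) for a pathway drawn without replacement from the genome."""
        # sf(k - 1) gives P(X >= k) under the hypergeometric null
        return hypergeom.sf(annotated_in_pathway - 1,
                            total_genes, annotated_in_genome, pathway_size)

    # Hypothetical example: 4,500 genes, 300 carry the attribute,
    # and a candidate pathway of 40 genes contains 12 of them.
    p = enrichment_p(4500, 300, 40, 12)
    print(f"hypergeometric enrichment p-value: {p:.3e}")
    ```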

  9. Teaching biology through statistics: application of statistical methods in genetics and zoology courses.

    PubMed

    Colon-Berlingeri, Migdalisel; Burrowes, Patricia A

    2011-01-01

    Incorporation of mathematics into biology curricula is critical to underscore for undergraduate students the relevance of mathematics to most fields of biology and the usefulness of developing quantitative process skills demanded in modern biology. At our institution, we have made significant changes to better integrate mathematics into the undergraduate biology curriculum. The curricular revision included changes in the suggested course sequence, addition of statistics and precalculus as prerequisites to core science courses, and incorporating interdisciplinary (math-biology) learning activities in genetics and zoology courses. In this article, we describe the activities developed for these two courses and the assessment tools used to measure the learning that took place with respect to biology and statistics. We distinguished the effectiveness of these learning opportunities in helping students improve their understanding of the math and statistical concepts addressed and, more importantly, their ability to apply them to solve a biological problem. We also identified areas that need emphasis in both biology and mathematics courses. In light of our observations, we recommend best practices that biology and mathematics academic departments can implement to train undergraduates for the demands of modern biology.

  10. TreSpEx—Detection of Misleading Signal in Phylogenetic Reconstructions Based on Tree Information

    PubMed Central

    Struck, Torsten H

    2014-01-01

    Phylogenies of species or genes are commonplace nowadays in many areas of comparative biological studies. However, in phylogenetic reconstructions one must watch for artificial signals such as paralogy, long-branch attraction, saturation, or conflict between different datasets. These signals might eventually mislead the reconstruction even in phylogenomic studies employing hundreds of genes. Unfortunately, there has been no program allowing the detection of such effects in combination with an implementation into automatic process pipelines. TreSpEx (Tree Space Explorer) now combines different approaches (including statistical tests), which utilize tree-based information like nodal support or patristic distances (PDs) to identify misleading signals. The program enables the parallel analysis of hundreds of trees and/or predefined gene partitions, and being command-line driven, it can be integrated into automatic process pipelines. TreSpEx is implemented in Perl and supported on Linux, Mac OS X, and MS Windows. Source code, binaries, and additional material are freely available at http://www.annelida.de/research/bioinformatics/software.html. PMID:24701118

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, Janine Camille; Thompson, David; Pebay, Philippe Pierre

    Statistical analysis is typically used to reduce the dimensionality of and infer meaning from data. A key challenge of any statistical analysis package aimed at large-scale, distributed data is to address the orthogonal issues of parallel scalability and numerical stability. Many statistical techniques, e.g., descriptive statistics or principal component analysis, are based on moments and co-moments and, using robust online update formulas, can be computed in an embarrassingly parallel manner, amenable to a map-reduce style implementation. In this paper we focus on contingency tables, through which numerous derived statistics such as joint and marginal probability, point-wise mutual information, information entropy, and χ² independence statistics can be directly obtained. However, contingency tables can become large as data size increases, requiring a correspondingly large amount of communication between processors. This potential increase in communication prevents optimal parallel speedup and is the main difference with moment-based statistics (which we discussed in [1]) where the amount of inter-processor communication is independent of data size. Here we present the design trade-offs which we made to implement the computation of contingency tables in parallel. We also study the parallel speedup and scalability properties of our open source implementation. In particular, we observe optimal speed-up and scalability when the contingency statistics are used in their appropriate context, namely, when the data input is not quasi-diffuse.
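
    A map-reduce flavour of this computation can be sketched as follows: each worker counts category pairs in its chunk, the partial tables are merged, and derived statistics such as the χ² independence test follow from the aggregated table. This is a minimal single-machine illustration of the design described above, not the open-source implementation itself.

    ```python
    from collections import Counter
    from functools import reduce
    import numpy as np
    from scipy.stats import chi2_contingency

    def map_count(chunk):
        """Map step: count (x, y) category pairs in one data chunk."""
        return Counter(chunk)

    def reduce_counts(c1, c2):
        """Reduce step: merge two partial contingency counts."""
        return c1 + c2

    # Hypothetical categorical data split across four "processors".
    rng = np.random.default_rng(0)
    data = list(zip(rng.choice(["a", "b", "c"], 9000), rng.choice(["x", "y"], 9000)))
    chunks = [data[i::4] for i in range(4)]

    total = reduce(reduce_counts, (map_count(c) for c in chunks))

    # Assemble the dense table and derive the chi-square independence statistic.
    rows = sorted({r for r, _ in total})
    cols = sorted({c for _, c in total})
    table = np.array([[total[(r, c)] for c in cols] for r in rows])
    chi2, p, dof, _ = chi2_contingency(table)
    print(table)
    print(f"chi2 = {chi2:.2f}, p = {p:.3f}, dof = {dof}")
    ```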

  12. Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.

    PubMed

    Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N

    2009-10-27

    The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile yet flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB-implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with the MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
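
    The statistical-selection step in a pipeline of this kind typically amounts to a per-gene test followed by multiple-testing correction. The sketch below runs a per-gene t-test with Benjamini-Hochberg adjustment on simulated expression values; it is a generic illustration, not Gene ARMADA's MATLAB code.

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    def benjamini_hochberg(pvals):
        """Return Benjamini-Hochberg adjusted p-values (q-values)."""
        p = np.asarray(pvals)
        n = len(p)
        order = np.argsort(p)
        ranked = p[order] * n / np.arange(1, n + 1)
        # enforce monotonicity from the largest p-value downwards
        q = np.minimum.accumulate(ranked[::-1])[::-1]
        out = np.empty(n)
        out[order] = np.clip(q, 0, 1)
        return out

    rng = np.random.default_rng(1)
    n_genes = 2000
    control = rng.normal(0, 1, size=(n_genes, 6))   # 6 control arrays
    treated = rng.normal(0, 1, size=(n_genes, 6))   # 6 treated arrays
    treated[:100] += 3.0                            # 100 truly changed genes

    t, p = ttest_ind(treated, control, axis=1)
    q = benjamini_hochberg(p)
    print("genes called at q < 0.05:", int((q < 0.05).sum()))
    ```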

  13. Improvements to an earth observing statistical performance model with applications to LWIR spectral variability

    NASA Astrophysics Data System (ADS)

    Zhao, Runchen; Ientilucci, Emmett J.

    2017-05-01

    Hyperspectral remote sensing systems provide spectral data composed of hundreds of narrow spectral bands. Spectral remote sensing systems can be used, for example, to identify targets without physical interaction. Often it is of interest to characterize the spectral variability of targets or objects. The purpose of this paper is to identify and characterize the LWIR spectral variability of targets based on an improved earth observing statistical performance model, known as the Forecasting and Analysis of Spectroradiometric System Performance (FASSP) model. FASSP contains three basic modules: a scene model, a sensor model and a processing model. Instead of using only the mean surface reflectance as input to the model, FASSP transfers user-defined statistical characteristics of a scene through the image chain (i.e., from source to sensor). The radiative transfer model MODTRAN is used to simulate the radiative transfer based on user-defined atmospheric parameters. To retrieve class emissivity and temperature statistics, or temperature / emissivity separation (TES), a LWIR atmospheric compensation method is necessary. The FASSP model has a method to transform statistics in the visible (i.e., ELM) but currently does not have a LWIR TES algorithm in place. This paper addresses the implementation of such a TES algorithm and its associated transformation of statistics.
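
    The core idea of transferring class statistics through the image chain can be illustrated with an affine, band-wise radiance model: if the at-sensor radiance is L = g·ρ + b per band, the class mean and covariance propagate as Gμ + b and GΣGᵀ. The numbers below are hypothetical, and the model is a deliberately simplified stand-in for the MODTRAN-driven chain used by FASSP.

    ```python
    import numpy as np

    def propagate_affine(mean_refl, cov_refl, gain, offset):
        """Push class reflectance statistics through L = gain * rho + offset (band-wise)."""
        g = np.diag(gain)
        return gain * mean_refl + offset, g @ cov_refl @ g.T

    # Hypothetical 3-band class statistics and band-wise path terms.
    mean_refl = np.array([0.12, 0.25, 0.30])
    cov_refl = np.array([[4e-4, 1e-4, 0.0],
                         [1e-4, 6e-4, 2e-4],
                         [0.0,  2e-4, 9e-4]])
    gain = np.array([80.0, 60.0, 40.0])    # transmittance x solar term per band
    offset = np.array([5.0, 4.0, 3.0])     # path radiance per band

    mean_rad, cov_rad = propagate_affine(mean_refl, cov_refl, gain, offset)
    print("at-sensor mean radiance:", mean_rad)
    print("at-sensor variance per band:", np.diag(cov_rad))
    ```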

  14. [The main directions of reforming the service of medical statistics in Ukraine].

    PubMed

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    2018-01-01

    Introduction: The implementation of new methods of information support for managerial decision-making should underpin effective health system reform and create conditions for improving the quality of operational management, sound planning of medical care and more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyzes the current situation and substantiates the main directions of reform of the Medical Statistics Service of Ukraine. Material and methods: A range of methods was used: content analysis, a bibliosemantic approach and a systematic approach. The information base of the research comprised WHO strategic and program documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that achieving this goal requires: improving the system of statistical indicators for an adequate assessment of the performance of health institutions, including in economic terms; creating a well-developed medical-statistical base for administrative territories; changing the existing technologies for building information resources; strengthening the material and technical base of the structural units of the Medical Statistics Service; improving the system of training and retraining of personnel for the medical statistics service; developing international cooperation in the methodology and practice of medical statistics, including the implementation of internationally accepted methods for collecting, processing, analyzing and disseminating medical-statistical information; and creating a medical-statistical service that is adapted to the specifics of market relations in health care and is flexible and responsive to changes in international methodologies and standards. Conclusions: Medical statistics are the basis for managerial decisions at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of this reform are: the introduction of information technologies, better training of personnel for the service, improved material and technical equipment, and maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  15. [The main directions of reforming the service of medical statistics in Ukraine].

    PubMed

    Golubchykov, Mykhailo V; Orlova, Nataliia M; Bielikova, Inna V

    Introduction: The implementation of new methods of information support for managerial decision-making should underpin effective health system reform and create conditions for improving the quality of operational management, sound planning of medical care and more efficient use of system resources. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The aim: This work analyzes the current situation and substantiates the main directions of reform of the Medical Statistics Service of Ukraine. Material and methods: A range of methods was used: content analysis, a bibliosemantic approach and a systematic approach. The information base of the research comprised WHO strategic and program documents and data of the Medical Statistics Center of the Ministry of Health of Ukraine. Review: The Medical Statistics Service of Ukraine has a complete and effective structure, headed by the State Institution "Medical Statistics Center of the Ministry of Health of Ukraine." This institution reports on behalf of the Ministry of Health of Ukraine to the State Statistical Service of Ukraine, the WHO European Office and other international organizations. An analysis of the current situation showed that achieving this goal requires: improving the system of statistical indicators for an adequate assessment of the performance of health institutions, including in economic terms; creating a well-developed medical-statistical base for administrative territories; changing the existing technologies for building information resources; strengthening the material and technical base of the structural units of the Medical Statistics Service; improving the system of training and retraining of personnel for the medical statistics service; developing international cooperation in the methodology and practice of medical statistics, including the implementation of internationally accepted methods for collecting, processing, analyzing and disseminating medical-statistical information; and creating a medical-statistical service that is adapted to the specifics of market relations in health care and is flexible and responsive to changes in international methodologies and standards. Conclusions: Medical statistics are the basis for managerial decisions at all levels of health care. Reform of the Medical Statistics Service of Ukraine should be considered only in the context of the reform of the entire health system. The main directions of this reform are: the introduction of information technologies, better training of personnel for the service, improved material and technical equipment, and maximum reuse of the data obtained, which requires the unification of primary data and of the system of indicators. The most difficult area is the formation of information funds and the introduction of modern information technologies.

  16. The design and hardware implementation of a low-power real-time seizure detection algorithm

    NASA Astrophysics Data System (ADS)

    Raghunathan, Shriram; Gupta, Sumeet K.; Ward, Matthew P.; Worth, Robert M.; Roy, Kaushik; Irazoqui, Pedro P.

    2009-10-01

    Epilepsy affects more than 1% of the world's population. Responsive neurostimulation is emerging as an alternative therapy for the 30% of the epileptic patient population that does not benefit from pharmacological treatment. Efficient seizure detection algorithms will enable closed-loop epilepsy prostheses by stimulating the epileptogenic focus within an early onset window. Critically, this is expected to reduce neuronal desensitization over time and lead to longer-term device efficacy. This work presents a novel event-based seizure detection algorithm along with a low-power digital circuit implementation. Hippocampal depth-electrode recordings from six kainate-treated rats are used to validate the algorithm and hardware performance in this preliminary study. The design process illustrates crucial trade-offs in translating mathematical models into hardware implementations and validates statistical optimizations made with empirical data analyses on results obtained using a real-time functioning hardware prototype. Using quantitatively predicted thresholds from the depth-electrode recordings, the auto-updating algorithm performs with an average sensitivity and selectivity of 95.3 ± 0.02% and 88.9 ± 0.01% (mean ± SE, α = 0.05), respectively, on untrained data with a detection delay of 8.5 s [5.97, 11.04] from electrographic onset. The hardware implementation is shown feasible using CMOS circuits consuming under 350 nW of power from a 250 mV supply voltage from simulations on the MIT 180 nm SOI process.
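
    Event-based detection of this kind typically reduces to computing a short-window feature and comparing it against an adaptive threshold. The fragment below is a minimal software sketch assuming a line-length feature and a mean-plus-k-sigma threshold; it is not the authors' hardware algorithm, and the signal and parameters are synthetic.

    ```python
    import numpy as np

    def line_length(x, win):
        """Moving-average line-length feature, a common low-complexity seizure marker."""
        ll = np.abs(np.diff(x, prepend=x[0]))
        return np.convolve(ll, np.ones(win) / win, mode="same")

    def adaptive_threshold(feature, baseline_len, k=4.0):
        """Threshold = mean + k * std of an early baseline segment of the feature."""
        base = feature[:baseline_len]
        return base.mean() + k * base.std()

    # Synthetic recording: background noise plus a high-amplitude oscillatory burst.
    fs = 500
    t = np.arange(0, 60, 1 / fs)
    x = np.random.default_rng(2).normal(0, 1, t.size)
    x[20 * fs:25 * fs] += 10 * np.sin(2 * np.pi * 25 * t[20 * fs:25 * fs])

    feat = line_length(x, win=fs)                   # 1 s feature window
    thr = adaptive_threshold(feat, baseline_len=10 * fs)
    flags = feat > thr
    print(f"threshold = {thr:.2f}")
    if flags.any():
        print(f"first detection at t = {t[np.argmax(flags)]:.2f} s")
    ```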

  17. Pragmatic trial of a multidisciplinary lung cancer care model in a community healthcare setting: study design, implementation evaluation, and baseline clinical results

    PubMed Central

    Smeltzer, Matthew P.; Rugless, Fedoria E.; Jackson, Bianca M.; Berryman, Courtney L.; Faris, Nicholas R.; Ray, Meredith A.; Meadows, Meghan; Patel, Anita A.; Roark, Kristina S.; Kedia, Satish K.; DeBon, Margaret M.; Crossley, Fayre J.; Oliver, Georgia; McHugh, Laura M.; Hastings, Willeen; Osborne, Orion; Osborne, Jackie; Ill, Toni; Ill, Mark; Jones, Wynett; Lee, Hyo K.; Signore, Raymond S.; Fox, Roy C.; Li, Jingshan; Robbins, Edward T.; Ward, Kenneth D.; Klesges, Lisa M.

    2018-01-01

    Background Responsible for 25% of all US cancer deaths, lung cancer presents complex care-delivery challenges. Adoption of the highly recommended multidisciplinary care model suffers from a dearth of good quality evidence. Leading up to a prospective comparative-effectiveness study of multidisciplinary vs. serial care, we studied the implementation of a rigorously benchmarked multidisciplinary lung cancer clinic. Methods We used a mixed-methods approach to conduct a patient-centered, combined implementation and effectiveness study of a multidisciplinary model of lung cancer care. We established a co-located multidisciplinary clinic to study the implementation of this care-delivery model. We identified and engaged key stakeholders from the onset, used their input to develop the program structure, processes, performance benchmarks, and study endpoints (outcome-related process measures, patient- and caregiver-reported outcomes, survival). In this report, we describe the study design, process of implementation, comparative populations, and how they contrast with patients within the local and regional healthcare system. Trial Registration: ClinicalTrials.gov Identifier: NCT02123797. Results Implementation: the multidisciplinary clinic obtained an overall treatment concordance rate of 90% (target >85%). Satisfaction scores were high, with >95% of patients and caregivers rating themselves as being “very satisfied” with all aspects of care from the multidisciplinary team (patient/caregiver response rate >90%). The Reach of the multidisciplinary clinic included a higher proportion of minority patients, more women, and younger patients than the regional population. Comparative effectiveness: The comparative effectiveness trial conducted in the last phase of the study met the planned enrollment per statistical design, with 178 patients in the multidisciplinary arm and 348 in the serial care arm. The multidisciplinary cohort had older age and a higher percentage of racial minorities, with a higher proportion of stage IV patients in the serial care arm. Conclusions This study demonstrates a comprehensive implementation of a multidisciplinary model of lung cancer care, which will advance the science behind implementing this much-advocated clinical care model. PMID:29535915

  18. A tale of two audits: statistical process control for improving diabetes care in primary care settings.

    PubMed

    Al-Hussein, Fahad Abdullah

    2008-01-01

    Diabetes constitutes a major burden of disease globally. Both primary and secondary prevention need to improve in order to face this challenge. Improving management of diabetes in primary care is therefore of fundamental importance. The objective of this series of audits was to find means of improving diabetes management in chronic disease mini-clinics in primary health care. In the process, we were able to study the effect and practical usefulness of different audit designs - those measuring clinical outcomes, process of care, or both. Setting: King Saud City Family and Community Medicine Centre, Saudi National Guard Health Affairs in Riyadh city, Saudi Arabia. Simple random samples of 30 files were selected every two weeks from a sampling frame of file numbers for all diabetes clients seen over the period. Information was transferred to a form, entered on the computer and an automated response was generated regarding the appropriateness of management, a criterion mutually agreed upon by care providers. The results were plotted on statistical process control charts (p charts) displayed for all employees. Data extraction, archiving, entry, analysis, plotting, and design and preparation of p charts were managed by nursing staff specially trained for the purpose by physicians with relevant previous experience. Audit series with mixed outcome and process measures failed to detect any changes in the proportion of non-conforming cases over a period of one year. The process measures series, on the other hand, showed improvement in care corresponding to a reduction in the proportion non-conforming by 10% within a period of 3 months. Non-conformities dropped from a mean of 5.0 to 1.4 over the year (P < 0.001). It is possible to improve providers' behaviour regarding implementation of given guidelines through periodic process audits and feedback. Frequent process audits in the context of statistical process control should be supplemented with concurrent outcome audits, once or twice a year.
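
    The p charts used in these audits plot the proportion of non-conforming files per sample against limits derived from the binomial approximation, p̄ ± 3·sqrt(p̄(1 − p̄)/n). A minimal sketch with hypothetical biweekly audit counts (not the study data):

    ```python
    import numpy as np

    def p_chart_limits(nonconforming, sample_sizes):
        """Centre line and 3-sigma control limits for a p chart."""
        d = np.asarray(nonconforming, dtype=float)
        n = np.asarray(sample_sizes, dtype=float)
        p_bar = d.sum() / n.sum()
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)
        return p_bar, np.clip(p_bar - 3 * sigma, 0, None), p_bar + 3 * sigma

    # Hypothetical biweekly audits of 30 files each.
    defects = [6, 5, 7, 4, 12, 3, 4, 2, 3, 2, 1, 2]
    sizes = [30] * len(defects)
    p_bar, lcl, ucl = p_chart_limits(defects, sizes)
    proportions = np.array(defects) / np.array(sizes)
    signals = np.where((proportions > ucl) | (proportions < lcl))[0]
    print(f"centre line = {p_bar:.3f}, LCL = {lcl[0]:.3f}, UCL = {ucl[0]:.3f}")
    print("samples signalling special-cause variation:", signals.tolist())
    ```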

  19. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum

    PubMed Central

    Milic, Natasa M.; Trajkovic, Goran Z.; Bukumiric, Zoran M.; Cirkovic, Andja; Nikolic, Ivan M.; Milin, Jelena S.; Milic, Nikola V.; Savic, Marko D.; Corac, Aleksandar M.; Marinkovic, Jelena M.; Stanisavljevic, Dejana M.

    2016-01-01

    Background Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. Methods This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013–14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Results Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). Conclusion This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics. PMID:26859832

  20. Improving Education in Medical Statistics: Implementing a Blended Learning Model in the Existing Curriculum.

    PubMed

    Milic, Natasa M; Trajkovic, Goran Z; Bukumiric, Zoran M; Cirkovic, Andja; Nikolic, Ivan M; Milin, Jelena S; Milic, Nikola V; Savic, Marko D; Corac, Aleksandar M; Marinkovic, Jelena M; Stanisavljevic, Dejana M

    2016-01-01

    Although recent studies report on the benefits of blended learning in improving medical student education, there is still no empirical evidence on the relative effectiveness of blended over traditional learning approaches in medical statistics. We implemented blended along with on-site (i.e. face-to-face) learning to further assess the potential value of web-based learning in medical statistics. This was a prospective study conducted with third year medical undergraduate students attending the Faculty of Medicine, University of Belgrade, who passed (440 of 545) the final exam of the obligatory introductory statistics course during 2013-14. Student statistics achievements were stratified based on the two methods of education delivery: blended learning and on-site learning. Blended learning included a combination of face-to-face and distance learning methodologies integrated into a single course. Mean exam scores for the blended learning student group were higher than for the on-site student group for both final statistics score (89.36±6.60 vs. 86.06±8.48; p = 0.001) and knowledge test score (7.88±1.30 vs. 7.51±1.36; p = 0.023) with a medium effect size. There were no differences in sex or study duration between the groups. Current grade point average (GPA) was higher in the blended group. In a multivariable regression model, current GPA and knowledge test scores were associated with the final statistics score after adjusting for study duration and learning modality (p<0.001). This study provides empirical evidence to support educator decisions to implement different learning environments for teaching medical statistics to undergraduate medical students. Blended and on-site training formats led to similar knowledge acquisition; however, students with higher GPA preferred the technology assisted learning format. Implementation of blended learning approaches can be considered an attractive, cost-effective, and efficient alternative to traditional classroom training in medical statistics.
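
    The group comparison reported in both versions of this record boils down to an independent-samples t-test plus a standardized effect size. A minimal sketch on simulated exam scores (the numbers below only echo the reported means and SDs; they are not the study data):

    ```python
    import numpy as np
    from scipy.stats import ttest_ind

    def cohens_d(a, b):
        """Cohen's d with a pooled standard deviation."""
        na, nb = len(a), len(b)
        pooled = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))
        return (a.mean() - b.mean()) / pooled

    rng = np.random.default_rng(3)
    blended = rng.normal(89.4, 6.6, 220)   # simulated final scores, blended group
    on_site = rng.normal(86.1, 8.5, 220)   # simulated final scores, on-site group

    t, p = ttest_ind(blended, on_site)
    print(f"t = {t:.2f}, p = {p:.4f}, Cohen's d = {cohens_d(blended, on_site):.2f}")
    ```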

  1. Parallel ICA and its hardware implementation in hyperspectral image analysis

    NASA Astrophysics Data System (ADS)

    Du, Hongtao; Qi, Hairong; Peterson, Gregory D.

    2004-04-01

    Advances in hyperspectral imaging have dramatically boosted remote sensing applications by providing abundant information using hundreds of contiguous spectral bands. However, the high volume of information also results in an excessive computational burden. Since most materials have specific characteristics only at certain bands, much of this information is redundant. This property of hyperspectral images has motivated many researchers to study various dimensionality reduction algorithms, including Projection Pursuit (PP), Principal Component Analysis (PCA), wavelet transform, and Independent Component Analysis (ICA), where ICA is one of the most popular techniques. It searches for a linear or nonlinear transformation which minimizes the statistical dependence between spectral bands. Through this process, ICA can eliminate superfluous information while retaining practical information, given only the observations of hyperspectral images. One hurdle of applying ICA in hyperspectral image (HSI) analysis, however, is its long computation time, especially for high volume hyperspectral data sets. Even the most efficient method, FastICA, is a very time-consuming process. In this paper, we present a parallel ICA (pICA) algorithm derived from FastICA. During the unmixing process, pICA divides the estimation of the weight matrix into sub-processes which can be conducted in parallel on multiple processors. The decorrelation process is decomposed into internal decorrelation and external decorrelation, which perform weight vector decorrelations within individual processors and between cooperative processors, respectively. In order to further improve the performance of pICA, we seek hardware solutions in the implementation of pICA. To date, there are very few hardware designs for ICA-related processes due to the complicated and iterative computation. This paper discusses capacity limitations of FPGA implementations for pICA in HSI analysis. An Application-Specific Integrated Circuit (ASIC) synthesis is designed for pICA-based dimensionality reduction in HSI analysis. The pICA design is implemented using standard-height cells and aimed at a TSMC 0.18 micron process. During the synthesis procedure, three ICA-related reconfigurable components are developed for reuse and retargeting purposes. Preliminary results show that the standard-height-cell-based ASIC synthesis provides an effective solution for pICA and ICA-related processes in HSI analysis.
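
    For orientation, a serial FastICA reduction of a hyperspectral cube takes only a few lines with scikit-learn; the weight-matrix estimation and the internal/external decorrelation described above are the parts that the pICA and ASIC work distribute across processors. The cube below is synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import FastICA

    # Synthetic hyperspectral cube: 64 x 64 pixels, 120 spectral bands.
    rng = np.random.default_rng(4)
    rows, cols, bands = 64, 64, 120
    spectra = rng.random((3, bands))                      # 3 hypothetical endmember spectra
    abundances = rng.dirichlet(np.ones(3), size=rows * cols)
    cube = abundances @ spectra + 0.01 * rng.normal(size=(rows * cols, bands))

    # Each pixel is one observation; ICA unmixes statistically independent components.
    ica = FastICA(n_components=3, random_state=0, max_iter=1000)
    components = ica.fit_transform(cube)                  # shape: (pixels, 3)
    print("reduced representation:", components.reshape(rows, cols, 3).shape)
    ```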

  2. The new final Clinical Skills examination in human medicine in Switzerland: Essential steps of exam development, implementation and evaluation, and central insights from the perspective of the national Working Group

    PubMed Central

    Berendonk, Christoph; Schirlo, Christian; Balestra, Gianmarco; Bonvin, Raphael; Feller, Sabine; Huber, Philippe; Jünger, Ernst; Monti, Matteo; Schnabel, Kai; Beyeler, Christine; Guttormsen, Sissel; Huwendiek, Sören

    2015-01-01

    Objective: Since 2011, the new national final examination in human medicine has been implemented in Switzerland, with a structured clinical-practical part in the OSCE format. From the perspective of the national Working Group, the current article describes the essential steps in the development, implementation and evaluation of the Federal Licensing Examination Clinical Skills (FLE CS) as well as the applied quality assurance measures. Finally, central insights gained from the last years are presented. Methods: Based on the principles of action research, the FLE CS is in a constant state of further development. On the foundation of systematically documented experiences from previous years, in the Working Group, unresolved questions are discussed and resulting solution approaches are substantiated (planning), implemented in the examination (implementation) and subsequently evaluated (reflection). The presented results are the product of this iterative procedure. Results: The FLE CS is created by experts from all faculties and subject areas in a multistage process. The examination is administered in German and French on a decentralised basis and consists of twelve interdisciplinary stations per candidate. As important quality assurance measures, the national Review Board (content validation) and the meetings of the standardised patient trainers (standardisation) have proven worthwhile. The statistical analyses show good measurement reliability and support the construct validity of the examination. Among the central insights of the past years, it has been established that the consistent implementation of the principles of action research contributes to the successful further development of the examination. Conclusion: The centrally coordinated, collaborative-iterative process, incorporating experts from all faculties, makes a fundamental contribution to the quality of the FLE CS. The processes and insights presented here can be useful for others planning a similar undertaking. PMID:26483853

  3. The First Six Years of Building and Implementing a Return-to-Work Service for Patients with Acquired Brain Injury. The Rapid-Return-to-Work-Cohort-Study.

    PubMed

    Haveraaen, L; Brouwers, E P M; Sveen, U; Skarpaas, L S; Sagvaag, H; Aas, R W

    2017-12-01

    Background and objective: Despite extensive activity worldwide in building and implementing new return-to-work (RTW) services, few studies have focused on how such implementation processes develop. The aim of this study was to examine the development of patient and service characteristics during the first six years of implementing a RTW service for persons with acquired brain injury (ABI). Methods: The study was designed as a cohort study (n=189). Data were collected by questionnaires filled out by the service providers. The material was divided into two implementation phases and analyzed accordingly. Non-parametric statistical methods and hierarchical regression analyses were applied to the material. Results: The number of patients increased significantly, and the patient group became more homogeneous. Both the duration of the service and the number of consultations and group session days were significantly reduced. Conclusion: The patient group became more homogeneous, but also significantly larger, during the first six years of building the RTW service. At the same time, the duration of the service decreased. This study therefore raises the question of whether there is a lack of consensus on the intensity of work rehabilitation for this group.

  4. The Path Toward Universal Health Coverage.

    PubMed

    Yassoub, Rami; Alameddine, Mohamad; Saleh, Shadi

    2017-04-01

    Lebanon is a middle-income country with a market-maximized healthcare system that provides limited social protection for its citizens. Estimates reveal that half of the population lacks sufficient health coverage and resorts to out-of-pocket payments. This study triangulated data from a comprehensive review of health packages of countries similar to Lebanon, the Ministry of Public Health statistics, and services suggested by the World Health Organization for inclusion in a health benefits package (HBP). To determine the acceptability and viability of implementing the HBP, a stakeholder analysis was conducted to identify the knowledge, positions, and available resources for the package. The results revealed that the private health sector, having the most resources, is least in favor of implementing the package, whereas the political and civil society sectors support implementation. The main divergence in opinions among stakeholders was on the abolishment of out-of-pocket payments, mainly attributed to the potential abuse of the HBP's services by users. The study's findings encourage health decision makers to capitalize on the current political readiness by proposing the HBP for implementation in the path toward universal health coverage. This requires a consultative process, involving all stakeholders, in devising the strategy and implementation framework of a HBP.

  5. Methods for determining and processing 3D errors and uncertainties for AFM data analysis

    NASA Astrophysics Data System (ADS)

    Klapetek, P.; Nečas, D.; Campbellová, A.; Yacoot, A.; Koenders, L.

    2011-02-01

    This paper describes the processing of three-dimensional (3D) scanning probe microscopy (SPM) data. It is shown that 3D volumetric calibration error and uncertainty data can be acquired for both metrological atomic force microscope systems and commercial SPMs. These data can be used within nearly all the standard SPM data processing algorithms to determine local values of uncertainty of the scanning system. If the error function of the scanning system is determined for the whole measurement volume of an SPM, it can be converted to yield local dimensional uncertainty values that can in turn be used for evaluation of uncertainties related to the acquired data and for further data processing applications (e.g. area, ACF, roughness) within direct or statistical measurements. These have been implemented in the software package Gwyddion.
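
    One way to turn a local (per-pixel) uncertainty map into an uncertainty on a derived quantity such as roughness, in the spirit of the approach above, is plain Monte Carlo propagation. The sketch below perturbs a synthetic height map with its local sigma field and reports the spread of the RMS roughness Sq; it is an illustration of the idea, not the Gwyddion implementation.

    ```python
    import numpy as np

    def sq_roughness(z):
        """RMS roughness of a height map."""
        return np.sqrt(np.mean((z - z.mean()) ** 2))

    rng = np.random.default_rng(5)
    z = rng.normal(0, 2e-9, size=(256, 256))            # synthetic heights [m]
    sigma = 0.05e-9 + 0.02e-9 * rng.random(z.shape)     # local uncertainty map [m]

    samples = [sq_roughness(z + rng.normal(0, sigma)) for _ in range(500)]
    print(f"Sq = {sq_roughness(z) * 1e9:.3f} nm "
          f"+/- {np.std(samples) * 1e9:.4f} nm (Monte Carlo, k = 1)")
    ```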

  6. Emotional metacontrol of attention: Top-down modulation of sensorimotor processes in a robotic visual search task.

    PubMed

    Belkaid, Marwen; Cuperlier, Nicolas; Gaussier, Philippe

    2017-01-01

    Emotions play a significant role in internal regulatory processes. In this paper, we advocate four key ideas. First, novelty detection can be grounded in the sensorimotor experience and allow higher order appraisal. Second, cognitive processes, such as those involved in self-assessment, influence emotional states by eliciting affects like boredom and frustration. Third, emotional processes such as those triggered by self-assessment influence attentional processes. Last, close emotion-cognition interactions implement an efficient feedback loop for the purpose of top-down behavior regulation. The latter is what we call 'Emotional Metacontrol'. We introduce a model based on artificial neural networks. This architecture is used to control a robotic system in a visual search task. The emotional metacontrol intervenes to bias the robot visual attention during active object recognition. Through a behavioral and statistical analysis, we show that this mechanism increases the robot performance and fosters the exploratory behavior to avoid deadlocks.

  7. The second phase in creating the cardiac center for the next generation: beyond structure to process improvement.

    PubMed

    Woods, J

    2001-01-01

    The third generation cardiac institute will build on the successes of the past in structuring the service line, re-organizing to assimilate specialist interests, and re-positioning to expand cardiac services into cardiovascular services. To meet the challenges of an increasingly competitive marketplace and complex delivery system, the focus for this new model will shift away from improved structures, and toward improved processes. This shift will require a sound methodology for statistically measuring and sustaining process changes related to the optimization of cardiovascular care. In recent years, GE Medical Systems has successfully applied Six Sigma methodologies to enable cardiac centers to control key clinical and market development processes through its DMADV, DMAIC and Change Acceleration processes. Data indicates Six Sigma is having a positive impact within organizations across the United States, and when appropriately implemented, this approach can serve as a solid foundation for building the next generation cardiac institute.

  8. Design and implementation of a hot-wire probe for simultaneous velocity and vorticity vector measurements in boundary layers

    NASA Astrophysics Data System (ADS)

    Zimmerman, S.; Morrill-Winter, C.; Klewicki, J.

    2017-10-01

    A multi-sensor hot-wire probe for simultaneously measuring all three components of velocity and vorticity in boundary layers has been designed, fabricated and implemented in experiments up to large Reynolds numbers. The probe consists of eight hot-wires, compactly arranged in two pairs of orthogonal ×-wire arrays. The ×-wire sub-arrays are symmetrically configured such that the full velocity and vorticity vectors are resolved about a single central location. During its design phase, the capacity of this sensor to accurately measure each component of velocity and vorticity was first evaluated via a synthetic experiment in a set of well-resolved DNS fields. The synthetic experiments clarified probe geometry effects, allowed assessment of various processing schemes, and predicted the effects of finite wire length and wire separation on turbulence statistics. The probe was subsequently fabricated and employed in large Reynolds number experiments in the Flow Physics Facility wind tunnel at the University of New Hampshire. Comparisons of statistics from the actual probe with those from the simulated sensor exhibit very good agreement in trend, but with some differences in magnitude. These comparisons also reveal that the use of gradient information in processing the probe data can significantly improve the accuracy of the spanwise velocity measurement near the wall. To the authors' knowledge, the present are the largest Reynolds number laboratory-based measurements of all three vorticity components in boundary layers.

  9. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part III: Application to statistical modal analysis

    NASA Astrophysics Data System (ADS)

    Yan, Wang-Ji; Ren, Wei-Xin

    2018-01-01

    This study applies the theoretical findings on the circularly-symmetric complex normal ratio distribution (Yan and Ren (2016) [1,2]) to transmissibility-based modal analysis from a statistical viewpoint. A probabilistic model of the transmissibility function in the vicinity of the resonant frequency is formulated in the modal domain, and some insightful comments are offered. It theoretically reveals that the statistics of the transmissibility function around the resonant frequency depend solely on the 'noise-to-signal' ratio and the mode shapes. As a sequel to the development of the probabilistic model of the transmissibility function in the modal domain, this study casts the process of modal identification in a Bayesian framework. Implementation issues unique to the proposed approach are resolved by a Lagrange multiplier approach. This study also explores the possibility of applying Bayesian analysis to distinguishing harmonic components from structural ones. The approaches are verified with simulated data and experimental test data. The uncertainty behavior due to the variation of different factors is also discussed in detail.

  10. Image preprocessing for improving computational efficiency in implementation of restoration and superresolution algorithms.

    PubMed

    Sundareshan, Malur K; Bhattacharjee, Supratik; Inampudi, Radhika; Pang, Ho-Yuen

    2002-12-10

    Computational complexity is a major impediment to the real-time implementation of image restoration and superresolution algorithms in many applications. Although powerful restoration algorithms have been developed within the past few years utilizing sophisticated mathematical machinery (based on statistical optimization and convex set theory), these algorithms are typically iterative in nature and require a sufficient number of iterations to be executed to achieve the desired resolution improvement that may be needed to meaningfully perform postprocessing image exploitation tasks in practice. Additionally, recent technological breakthroughs have facilitated novel sensor designs (focal plane arrays, for instance) that make it possible to capture megapixel imagery data at video frame rates. A major challenge in the processing of these large-format images is to complete the execution of the image processing steps within the frame capture times and to keep up with the output rate of the sensor so that all data captured by the sensor can be efficiently utilized. Consequently, development of novel methods that facilitate real-time implementation of image restoration and superresolution algorithms is of significant practical interest and is the primary focus of this study. The key to designing computationally efficient processing schemes lies in strategically introducing appropriate preprocessing steps together with the superresolution iterations to tailor optimized overall processing sequences for imagery data of specific formats. For substantiating this assertion, three distinct methods for tailoring a preprocessing filter and integrating it with the superresolution processing steps are outlined. These methods consist of a region-of-interest extraction scheme, a background-detail separation procedure, and a scene-derived information extraction step for implementing a set-theoretic restoration of the image that is less demanding in computation compared with the superresolution iterations. A quantitative evaluation of the performance of these algorithms for restoring and superresolving various imagery data captured by diffraction-limited sensing operations is also presented.
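
    The region-of-interest idea in the first of these preprocessing methods can be sketched directly: crop the ROI, then run the iterative restoration only on the cropped data. The loop below uses a plain Richardson-Lucy deconvolution as a stand-in for the restoration/superresolution iterations; the frame, PSF and ROI coordinates are synthetic.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
        """Plain Richardson-Lucy deconvolution (iterative restoration)."""
        estimate = np.full_like(observed, observed.mean())
        psf_mirror = psf[::-1, ::-1]
        for _ in range(n_iter):
            blurred = fftconvolve(estimate, psf, mode="same") + eps
            estimate *= fftconvolve(observed / blurred, psf_mirror, mode="same")
        return estimate

    # Synthetic large-format frame with one bright object of interest.
    rng = np.random.default_rng(8)
    frame = rng.poisson(5.0, size=(1024, 1024)).astype(float)
    frame[480:520, 480:520] += 50.0
    g = np.exp(-np.linspace(-2, 2, 9) ** 2)
    psf = np.outer(g, g)
    psf /= psf.sum()
    blurred = fftconvolve(frame, psf, mode="same")

    roi = blurred[448:576, 448:576]                 # ROI extraction before iterating
    restored_roi = richardson_lucy(roi, psf, n_iter=30)
    print("full-frame pixels:", blurred.size, "| ROI pixels processed:", roi.size)
    ```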

  11. Bridging the gap between the economic evaluation literature and daily practice in occupational health: a qualitative study among decision-makers in the healthcare sector.

    PubMed

    van Dongen, Johanna M; Tompa, Emile; Clune, Laurie; Sarnocinska-Hart, Anna; Bongers, Paulien M; van Tulder, Maurits W; van der Beek, Allard J; van Wier, Marieke F

    2013-06-03

    Continued improvements in occupational health can only be ensured if decisions regarding the implementation and continuation of occupational health and safety interventions (OHS interventions) are based on the best available evidence. To ensure that this is the case, scientific evidence should meet the needs of decision-makers. As a first step in bridging the gap between the economic evaluation literature and daily practice in occupational health, this study aimed to provide insight into the occupational health decision-making process and information needs of decision-makers. An exploratory qualitative study was conducted with a purposeful sample of occupational health decision-makers in the Ontario healthcare sector. Eighteen in-depth interviews were conducted to explore the process by which occupational health decisions are made and the importance given to the financial implications of OHS interventions. Twenty-five structured telephone interviews were conducted to explore the sources of information used during the decision-making process, and decision-makers' knowledge on economic evaluation methods. In-depth interview data were analyzed according to the constant comparative method. For the structured telephone interviews, summary statistics were prepared. The occupational health decision-making process generally consists of three stages: initiation stage, establishing the need for an intervention; pre-implementation stage, developing an intervention and its business case in order to receive senior management approval; and implementation and evaluation stage, implementing and evaluating an intervention. During this process, information on the financial implications of OHS interventions was found to be of great importance, especially the employer's costs and benefits. However, scientific evidence was rarely consulted, sound ex-post program evaluations were hardly ever performed, and there seemed to be a need to advance the economic evaluation skill set of decision-makers. Financial information is particularly important at the front end of implementation decisions, and can be a key deciding factor of whether to go forward with a new OHS intervention. In addition, it appears that current practice in occupational health in the healthcare sector is not solidly grounded in evidence-based decision-making and strategies should be developed to improve this.

  12. Bridging the gap between the economic evaluation literature and daily practice in occupational health: a qualitative study among decision-makers in the healthcare sector

    PubMed Central

    2013-01-01

    Background Continued improvements in occupational health can only be ensured if decisions regarding the implementation and continuation of occupational health and safety interventions (OHS interventions) are based on the best available evidence. To ensure that this is the case, scientific evidence should meet the needs of decision-makers. As a first step in bridging the gap between the economic evaluation literature and daily practice in occupational health, this study aimed to provide insight into the occupational health decision-making process and information needs of decision-makers. Methods An exploratory qualitative study was conducted with a purposeful sample of occupational health decision-makers in the Ontario healthcare sector. Eighteen in-depth interviews were conducted to explore the process by which occupational health decisions are made and the importance given to the financial implications of OHS interventions. Twenty-five structured telephone interviews were conducted to explore the sources of information used during the decision-making process, and decision-makers’ knowledge on economic evaluation methods. In-depth interview data were analyzed according to the constant comparative method. For the structured telephone interviews, summary statistics were prepared. Results The occupational health decision-making process generally consists of three stages: initiation stage, establishing the need for an intervention; pre-implementation stage, developing an intervention and its business case in order to receive senior management approval; and implementation and evaluation stage, implementing and evaluating an intervention. During this process, information on the financial implications of OHS interventions was found to be of great importance, especially the employer’s costs and benefits. However, scientific evidence was rarely consulted, sound ex-post program evaluations were hardly ever performed, and there seemed to be a need to advance the economic evaluation skill set of decision-makers. Conclusions Financial information is particularly important at the front end of implementation decisions, and can be a key deciding factor of whether to go forward with a new OHS intervention. In addition, it appears that current practice in occupational health in the healthcare sector is not solidly grounded in evidence-based decision-making and strategies should be developed to improve this. PMID:23731570

  13. Professional development in statistics, technology, and cognitively demanding tasks: classroom implementation and obstacles

    NASA Astrophysics Data System (ADS)

    Foley, Gregory D.; Bakr Khoshaim, Heba; Alsaeed, Maha; Nihan Er, S.

    2012-03-01

    Attending professional development programmes can support teachers in applying new strategies for teaching mathematics and statistics. This study investigated (a) the extent to which the participants in a professional development programme subsequently used the techniques they had learned when teaching mathematics and statistics and (b) the obstacles they encountered in enacting cognitively demanding instructional tasks in their classrooms. The programme created an intellectual learning community among the participants and helped them gain confidence as teachers of statistics, and the students of participating teachers became actively engaged in deep mathematical thinking. The participants indicated, however, that time, availability of resources and students' prior achievement critically affected the implementation of cognitively demanding instructional activities.

  14. Implementing New Reform Guidelines in Teaching Introductory College Statistics Courses

    ERIC Educational Resources Information Center

    Everson, Michelle; Zieffler, Andrew; Garfield, Joan

    2008-01-01

    This article introduces the recently adopted Guidelines for the Assessment and Instruction in Statistics Education (GAISE) and provides two examples of introductory statistics courses that have been redesigned to better align with these guidelines.

  15. Preservice Secondary Mathematics Teachers' Statistical Knowledge: A Snapshot of Strengths and Weaknesses

    ERIC Educational Resources Information Center

    Lovett, Jennifer N.; Lee, Hollylynne S.

    2017-01-01

    Amid the implementation of new curriculum standard regarding statistics and new recommendations for preservice secondary mathematics teachers [PSMTs] to teach statistics, there is a need to examine the current state of PSMTs' common statistical knowledge. This study reports on the statistical knowledge 217 PSMTs from a purposeful sample of 18…

  16. Neural Systems with Numerically Matched Input-Output Statistic: Isotonic Bivariate Statistical Modeling

    PubMed Central

    Fiori, Simone

    2007-01-01

    Bivariate statistical modeling from incomplete data is a useful statistical tool that allows one to discover the model underlying two data sets when the data in the two sets correspond neither in size nor in ordering. Such a situation may occur when the sizes of the two data sets do not match (i.e., there are “holes” in the data) or when the data sets have been acquired independently. Also, statistical modeling is useful when the amount of available data is sufficient to show the relevant statistical features of the phenomenon underlying the data. We propose to tackle the problem of statistical modeling via a neural (nonlinear) system that is able to match its input-output statistic to the statistic of the available data sets. A key point of the new implementation proposed here is that it is based on look-up-table (LUT) neural systems, which guarantee a computationally advantageous way of implementing neural systems. A number of numerical experiments, performed on both synthetic and real-world data sets, illustrate the features of the proposed modeling procedure. PMID:18566641
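
    A monotone (isotonic) map that matches the statistic of one data set to that of another, without requiring paired samples, can be approximated by quantile matching on the empirical distributions. The look-up table below is a minimal sketch of that idea, not the neural LUT system of the paper.

    ```python
    import numpy as np

    def quantile_matching_lut(x, y, n_points=256):
        """Monotone LUT mapping the distribution of x onto the distribution of y."""
        q = np.linspace(0, 1, n_points)
        x_q = np.quantile(x, q)          # input grid
        y_q = np.quantile(y, q)          # target values at matching quantiles
        return lambda v: np.interp(v, x_q, y_q)

    rng = np.random.default_rng(6)
    x = rng.normal(0, 1, 5000)             # source data set
    y = rng.exponential(2.0, 3000)         # target data set (different size, no pairing)

    f = quantile_matching_lut(x, y)
    mapped = f(x)
    print("target mean/std :", y.mean().round(3), y.std().round(3))
    print("mapped mean/std :", mapped.mean().round(3), mapped.std().round(3))
    ```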

  17. Implementation of Insight Responsibilities in Process Engineering

    NASA Technical Reports Server (NTRS)

    Osborne, Deborah M.

    1997-01-01

    This report describes an approach for evaluating flight readiness (COFR) and contractor performance evaluation (award fee) as part of the insight role of NASA Process Engineering at Kennedy Space Center. Several evaluation methods are presented, including systems engineering evaluations and use of systems performance data. The transition from an oversight function to the insight function is described. The types of analytical tools appropriate for achieving the flight readiness and contractor performance evaluation goals are described and examples are provided. Special emphasis is placed upon short and small run statistical quality control techniques. Training requirements for system engineers are delineated. The approach described herein would be equally appropriate in other directorates at Kennedy Space Center.

  18. Software Analytical Instrument for Assessment of the Process of Casting Slabs

    NASA Astrophysics Data System (ADS)

    Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš

    2010-06-01

    The paper describes the original design and functionality of the software for assessing the slab-casting process. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on the continuous steel casting equipment (hereafter ECC). The system works on a data warehouse of casting process parameters and slab quality parameters. It enables an ECC technologist to analyze the course of casting a melt and, using statistical methods, to determine the influence of individual process parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.

  19. Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.

    2018-04-01

    One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We show how to estimate the statistical uncertainty given the output of just a single radiative-transfer simulation in which the number of photon packets follows a Poisson distribution and the weight (e.g. energy or luminosity) of a single packet may follow an arbitrary distribution. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for a large number of packets show that we generalise existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations including particle physics and importance sampling. It is particularly powerful in extracting information when the available data are sparse or quantities are small.
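
    As a rough numerical illustration (not the authors' analytic Bayesian posterior), both sources of uncertainty in one bin, the Poisson-distributed packet count and the packet-weight scatter, can be propagated by sampling a Gamma posterior for the Poisson rate and bootstrapping the observed weights; all names and the choice of a Jeffreys prior are assumptions made here.

      import numpy as np

      def bin_luminosity_posterior(weights, n_draws=10000, seed=1):
          """Approximate posterior of the summed packet weight in one bin.

          weights : array of packet weights that landed in the bin.
          Uses a Jeffreys Gamma(n + 0.5, 1) posterior for the Poisson mean
          packet count and a bootstrap over the observed weights.
          """
          rng = np.random.default_rng(seed)
          w = np.asarray(weights, dtype=float)
          n = w.size
          lam = rng.gamma(n + 0.5, 1.0, size=n_draws)   # posterior packet rate
          draws = np.empty(n_draws)
          for i, lam_i in enumerate(lam):
              k = rng.poisson(lam_i)                    # predicted packet count
              if k == 0 or n == 0:
                  draws[i] = 0.0
              else:
                  draws[i] = rng.choice(w, size=k, replace=True).sum()
          return draws

      draws = bin_luminosity_posterior(np.random.default_rng(2).exponential(size=25))
      print(np.percentile(draws, [16, 50, 84]))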

  20. Bayesian modelling of uncertainties of Monte Carlo radiative-transfer simulations

    NASA Astrophysics Data System (ADS)

    Beaujean, Frederik; Eggers, Hans C.; Kerzendorf, Wolfgang E.

    2018-07-01

    One of the big challenges in astrophysics is the comparison of complex simulations to observations. As many codes do not directly generate observables (e.g. hydrodynamic simulations), the last step in the modelling process is often a radiative-transfer treatment. For this step, the community relies increasingly on Monte Carlo radiative transfer due to the ease of implementation and scalability with computing power. We consider simulations in which the number of photon packets is Poisson distributed, while the weight assigned to a single photon packet follows any distribution of choice. We show how to estimate the statistical uncertainty of the sum of weights in each bin from the output of a single radiative-transfer simulation. Our Bayesian approach produces a posterior distribution that is valid for any number of packets in a bin, even zero packets, and is easy to implement in practice. Our analytic results for a large number of packets show that we generalize existing methods that are valid only in limiting cases. The statistical problem considered here appears in identical form in a wide range of Monte Carlo simulations including particle physics and importance sampling. It is particularly powerful in extracting information when the available data are sparse or quantities are small.

  1. Recurrent network dynamics reconciles visual motion segmentation and integration.

    PubMed

    Medathati, N V Kartheek; Rankin, James; Meso, Andrew I; Kornprobst, Pierre; Masson, Guillaume S

    2017-09-12

    In sensory systems, a range of computational rules are presumed to be implemented by neuronal subpopulations with different tuning functions. For instance, in primate cortical area MT, different classes of direction-selective cells have been identified and related either to motion integration, segmentation or transparency. Still, how such different tuning properties are constructed is unclear. The dominant theoretical viewpoint, based on a linear-nonlinear feed-forward cascade, does not account for their complex temporal dynamics and their versatility when facing different input statistics. Here, we demonstrate that a recurrent network model of visual motion processing can reconcile these different properties. Using a ring network, we show how excitatory and inhibitory interactions can implement different computational rules such as vector averaging, winner-take-all or superposition. The model also captures ordered temporal transitions between these behaviors. In particular, depending on the inhibition regime, the network can switch from motion integration to segmentation, and is thus able either to compute a single pattern motion or to superpose multiple inputs, as in motion transparency. We thus demonstrate that recurrent architectures can adaptively give rise to different cortical computational regimes depending upon the input statistics, from sensory flow integration to segmentation.
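
    A toy rate-model sketch of this kind of ring network (not the authors' model of area MT): direction-tuned units with local excitation and global inhibition, driven by two motion components; all parameter values are arbitrary assumptions, and the printout simply lets one compare how strongly the weaker input is suppressed as inhibition grows.

      import numpy as np

      def simulate_ring(inhibition, n=64, steps=3000, dt=0.01):
          """Euler-integrate dr/dt = -r + relu(W @ r + I) on a ring of
          direction-tuned units with local excitation and global inhibition."""
          theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
          diff = theta[:, None] - theta[None, :]
          excite = np.exp(8.0 * (np.cos(diff) - 1.0))        # local excitatory kernel
          W = 0.9 * excite / excite.sum(1, keepdims=True) - inhibition / n
          # Two motion components, the second slightly weaker.
          I = 1.0 * np.exp(8.0 * (np.cos(theta) - 1.0)) \
            + 0.8 * np.exp(8.0 * (np.cos(theta - np.pi / 2) - 1.0))
          r = np.zeros(n)
          for _ in range(steps):
              r = r + dt * (-r + np.maximum(W @ r + I, 0.0))
          return theta, r

      for g in (0.5, 5.0):                                   # weak vs strong inhibition
          theta, r = simulate_ring(g)
          i0 = np.argmin(np.abs(theta - 0.0))
          i1 = np.argmin(np.abs(theta - np.pi / 2))
          print(f"inhibition={g}: r(90deg)/r(0deg) = {r[i1] / (r[i0] + 1e-9):.2f}")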

  2. ICAP: An Interactive Cluster Analysis Procedure for analyzing remotely sensed data. [to classify the radiance data to produce a thematic map

    NASA Technical Reports Server (NTRS)

    Wharton, S. W.

    1980-01-01

    An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates the clusters and may elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. The ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically; and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.

  3. Damage localization by statistical evaluation of signal-processed mode shapes

    NASA Astrophysics Data System (ADS)

    Ulriksen, M. D.; Damkilde, L.

    2015-07-01

    Due to their inherent ability to provide structural information on a local level, mode shapes and their derivatives are utilized extensively for structural damage identification. Typically, more or less advanced mathematical methods are implemented to identify damage-induced discontinuities in the spatial mode shape signals, hereby potentially facilitating damage detection and/or localization. However, by being based on distinguishing damage-induced discontinuities from other signal irregularities, an intrinsic deficiency in these methods is the high sensitivity towards measurement noise. The present article introduces a damage localization method which, compared to the conventional mode shape-based methods, has greatly enhanced robustness towards measurement noise. The method is based on signal processing of spatial mode shapes by means of continuous wavelet transformation (CWT) and subsequent application of a generalized discrete Teager-Kaiser energy operator (GDTKEO) to identify damage-induced mode shape discontinuities. In order to evaluate whether the identified discontinuities are in fact damage-induced, outlier analysis of principal components of the signal-processed mode shapes is conducted on the basis of T2-statistics. The proposed method is demonstrated in the context of analytical work with a free-vibrating Euler-Bernoulli beam under noisy conditions.
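
    A compact sketch of the two signal-processing ingredients mentioned, a continuous wavelet transform of a spatial mode shape followed by the discrete Teager-Kaiser energy operator, implemented directly with NumPy; this is only an illustration under assumed parameters, not the article's full GDTKEO/T2 procedure, and wavelet normalization constants are omitted.

      import numpy as np

      def ricker(points, a):
          """Mexican-hat (Ricker) wavelet sampled on `points` points, width `a`
          (normalization constant omitted for this illustration)."""
          t = np.arange(points) - (points - 1) / 2.0
          return (1 - (t / a) ** 2) * np.exp(-0.5 * (t / a) ** 2)

      def cwt(signal, widths):
          """Continuous wavelet transform by direct convolution."""
          return np.array([np.convolve(signal, ricker(min(10 * w, len(signal)), w),
                                       mode="same") for w in widths])

      def teager_kaiser(x):
          """Discrete Teager-Kaiser operator psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
          psi = np.zeros_like(x)
          psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
          return psi

      # Synthetic first bending mode of a beam with a small local stiffness loss.
      xi = np.linspace(0.0, 1.0, 200)
      mode = np.sin(np.pi * xi)
      mode[100:105] += 0.005 * np.hanning(5)          # weak damage-induced kink
      mode += 0.001 * np.random.default_rng(0).normal(size=xi.size)

      coeffs = cwt(mode, widths=[2, 4, 8])
      energy = teager_kaiser(coeffs[0])
      interior = slice(20, -20)                        # avoid convolution edge effects
      peak = np.argmax(np.abs(energy[interior])) + 20
      print("largest energy response near x =", xi[peak])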

  4. Systematic review of the use of Statistical Process Control methods to measure the success of pressure ulcer prevention.

    PubMed

    Clark, Michael; Young, Trudie; Fallon, Maureen

    2018-06-01

    Successful prevention of pressure ulcers is the end product of a complex series of care processes including, but not limited to, the assessment of vulnerability to pressure damage; skin assessment and care; nutritional support; repositioning; and the use of beds, mattresses, and cushions to manage mechanical loads on the skin and soft tissues. The purpose of this review was to examine where and how Statistical Process Control (SPC) measures have been used to assess the success of quality improvement initiatives intended to improve pressure ulcer prevention. A search of 7 electronic bibliographic databases was performed on May 17th, 2017, for studies that met the inclusion criteria. SPC methods have been reported in 9 publications since 2010 to interpret changes in the incidence of pressure ulcers over time. While these methods offer more rapid interpretation of changes in incidence than is gained from a comparison of 2 arbitrarily selected time points pre- and post-implementation of change, more work is required to ensure that the clinical and scientific communities adopt the most appropriate SPC methods. © 2018 Medicalhelplines.com Inc and John Wiley & Sons Ltd.
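
    One common SPC choice for incidence data of this kind is a u-chart of pressure ulcers per 1,000 patient-days; the review does not prescribe this particular chart, and the monthly counts below are made up purely for illustration.

      import numpy as np

      def u_chart(counts, exposures):
          """u-chart statistics: events per unit exposure with 3-sigma limits."""
          counts = np.asarray(counts, float)
          exposures = np.asarray(exposures, float)      # e.g. patient-days / 1000
          u = counts / exposures
          u_bar = counts.sum() / exposures.sum()
          sigma = np.sqrt(u_bar / exposures)
          return u, u_bar, u_bar - 3 * sigma, u_bar + 3 * sigma

      # Hypothetical monthly ulcer counts and patient-days (in thousands).
      ulcers = [12, 9, 14, 7, 5, 4, 6, 3]
      patient_days = [3.1, 2.9, 3.3, 3.0, 3.2, 3.0, 3.1, 2.8]
      u, centre, lcl, ucl = u_chart(ulcers, patient_days)
      for month, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
          flag = "signal" if (rate < lo or rate > hi) else "in control"
          print(f"month {month}: {rate:.2f} per 1000 patient-days ({flag})")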

  5. Dynamic rain fade compensation techniques for the advanced communications technology satellite

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1992-01-01

    The dynamic and composite nature of propagation impairments that are incurred on earth-space communications links at frequencies in and above the 30/20 GHz Ka band necessitate the use of dynamic statistical identification and prediction processing of the fading signal in order to optimally estimate and predict the levels of each of the deleterious attenuation components. Such requirements are being met in NASA's Advanced Communications Technology Satellite (ACTS) project by the implementation of optimal processing schemes derived through the use of the ACTS Rain Attenuation Prediction Model and nonlinear Markov filtering theory. The ACTS Rain Attenuation Prediction Model discerns climatological variations on the order of 0.5 deg in latitude and longitude in the continental U.S. The time-dependent portion of the model gives precise availability predictions for the 'spot beam' links of ACTS. However, the structure of the dynamic portion of the model, which yields performance parameters such as fade duration probabilities, is isomorphic to the state-variable approach of stochastic control theory and is amenable to the design of such statistical fade processing schemes which can be made specific to the particular climatological location at which they are employed.

  6. Long-memory and the sea level-temperature relationship: a fractional cointegration approach.

    PubMed

    Ventosa-Santaulària, Daniel; Heres, David R; Martínez-Hernández, L Catalina

    2014-01-01

    Through thermal expansion of oceans and melting of land-based ice, global warming is very likely contributing to the sea level rise observed during the 20th century. The amount by which further increases in global average temperature could affect sea level is only known with large uncertainties due to the limited capacity of physics-based models to predict sea levels from global surface temperatures. Semi-empirical approaches have been implemented to estimate the statistical relationship between these two variables, providing an alternative measure on which to base assessments of potentially disruptive impacts on coastal communities and ecosystems. However, only a few of these semi-empirical applications have addressed the spurious inference that is likely to be drawn when one nonstationary process is regressed on another. Furthermore, it has been shown that spurious effects are not eliminated by stationary processes when these possess strong long memory. Our results indicate that both global temperature and sea level indeed present the characteristics of long memory processes. Nevertheless, we find that these variables are fractionally cointegrated when sea-ice extent is incorporated as an instrumental variable for temperature, which in our estimations has a statistically significant positive impact on global sea level.

  7. The development of mini project interactive media on junior statistical materials (developmental research in junior high school)

    NASA Astrophysics Data System (ADS)

    Fauziah, D.; Mardiyana; Saputro, D. R. S.

    2018-05-01

    Assessment is an integral part of the learning process. The process and the result should be in line, with regard to measuring the ability of learners. Authentic assessment refers to a form of assessment that measures the competence of attitudes, knowledge, and skills. In fact, many teachers, including mathematics teachers who have implemented the 2013 curriculum, feel confused and find it difficult to master the use of authentic assessment instruments. Therefore, it is necessary to design an authentic assessment instrument in the form of interactive mini project media that teachers can adopt in their assessment. The type of this research is developmental research, which refers to the 4D development model consisting of four stages: define, design, develop and disseminate. The purpose of the research is to create valid mini project interactive media on statistical materials in junior high school. Based on expert judgment, the instrument was rated 3.1 for the construction aspect, 3.2 for the presentation aspect, 3.25 for the content aspect, and 2.9 for the didactic aspect. The research produced mini project interactive media on statistical materials using Adobe Flash, which can help teachers and students achieve the learning objectives.

  8. Redesigning the ICU nursing discharge process: a quality improvement study.

    PubMed

    Chaboyer, Wendy; Lin, Frances; Foster, Michelle; Retallick, Lorraine; Panuwatwanich, Kriengsak; Richards, Brent

    2012-02-01

    To evaluate the impact of a redesigned intensive care unit (ICU) nursing discharge process on ICU discharge delay, hospital mortality, and ICU readmission within 72 hours. A quality improvement study using a time series design and statistical process control analysis was conducted in one Australian general ICU. The primary outcome measure was hours of discharge delay per patient discharged alive per month, measured for 15 months prior to, and for 12 months after, the redesigned process was implemented. The redesign process included appointing a change agent to facilitate process improvement, developing a patient handover sheet, requesting ward staff to nominate an estimated transfer time, and designing a daily ICU discharge alert sheet that included an expected date of discharge. A total of 1,787 ICU discharges were included in this study, 1,001 in the 15 months before and 786 in the 12 months after the implementation of the new discharge processes. There was no difference in in-hospital mortality after discharge from ICU or ICU readmission within 72 hours during the study period. However, process improvement was demonstrated by a reduction in the average patient discharge delay time of 3.2 hours (from a 4.6-hour baseline to 1.0 hours post-intervention). Involving both ward and ICU staff in the redesign process may have contributed to a shared situational awareness of the problems, which led to more timely and effective ICU discharge processes. The use of a change agent, whose ongoing role involved follow-up of patients discharged from ICU, may have helped to embed the new process into practice. ©2011 Sigma Theta Tau International.

  9. Strategic planning, implementation, and evaluation processes in hospital systems: a survey from Iran.

    PubMed

    Sadeghifar, Jamil; Jafari, Mehdi; Tofighi, Shahram; Ravaghi, Hamid; Maleki, Mohammad Reza

    2014-09-28

    Strategic planning has been presented as an important management practice. However, evidence of its deployment in healthcare systems in low-income and middle-income countries (LMICs) is limited. This study investigated the strategic management process in Iranian hospitals. The present study was accomplished in 24 teaching hospitals in Tehran, Iran from September 2012 to March 2013. The data collection instrument was a questionnaire including 130 items. This questionnaire measured the situation of formulation, implementation, and evaluation of the strategic plan as well as the requirements, facilitators, and benefits in the studied hospitals. All the investigated hospitals had a strategic plan. The obtained percentages for the items "the rate of the compliance to requirements" and "the quantity of planning facilitators" (68.75%), attention to the stakeholder participation in the planning (55.74%), attention to the planning components (62.22%), the status of evaluating the strategic plan (59.94%) and the benefits of strategic planning for hospitals (65.15%) were in the medium range. However, the status of implementation of the strategic plan (53.71%) was found to be weak. Significant statistical correlations were observed between the incentive for developing a strategic plan and the status of the evaluation phase (P=0.04), and between the status of the implementation phase and having a documented strategic plan (P=0.03). According to the results, it seems that the absence of an appropriate internal incentive for formulating and implementing strategies led more hospitals to start formulating strategic plans in accordance with the legal requirements of the Ministry of Health. Consequently, even though all the investigated hospitals had a documented strategic plan, the plans have not been implemented efficiently and a valid evaluation of results is yet to be achieved.

  10. Strategic Planning, Implementation, and Evaluation Processes in Hospital Systems: A Survey From Iran

    PubMed Central

    Sadeghifar, Jamil; Jafari, Mehdi; Tofighi, Shahram; Ravaghi, Hamid; Maleki, Mohammad Reza

    2015-01-01

    Aim & Background: Strategic planning has been presented as an important management practice. However, evidence of its deployment in healthcare systems in low-income and middle-income countries (LMICs) is limited. This study investigated the strategic management process in Iranian hospitals. Methods: The present study was accomplished in 24 teaching hospitals in Tehran, Iran from September 2012 to March 2013. The data collection instrument was a questionnaire including 130 items. This questionnaire measured the situation of formulation, implementation, and evaluation of the strategic plan as well as the requirements, facilitators, and benefits in the studied hospitals. Results: All the investigated hospitals had a strategic plan. The obtained percentages for the items “the rate of the compliance to requirements” and “the quantity of planning facilitators” (68.75%), attention to the stakeholder participation in the planning (55.74%), attention to the planning components (62.22%), the status of evaluating the strategic plan (59.94%) and the benefits of strategic planning for hospitals (65.15%) were in the medium range. However, the status of implementation of the strategic plan (53.71%) was found to be weak. Significant statistical correlations were observed between the incentive for developing a strategic plan and the status of the evaluation phase (P=0.04), and between the status of the implementation phase and having a documented strategic plan (P=0.03). Conclusion: According to the results, it seems that the absence of an appropriate internal incentive for formulating and implementing strategies led more hospitals to start formulating strategic plans in accordance with the legal requirements of the Ministry of Health. Consequently, even though all the investigated hospitals had a documented strategic plan, the plans have not been implemented efficiently and a valid evaluation of results is yet to be achieved. PMID:25716385

  11. Impact of Mobile Dose-Tracking Technology on Medication Distribution at an Academic Medical Center.

    PubMed

    Kelm, Matthew; Campbell, Udobi

    2016-05-01

    Medication dose-tracking technologies have the potential to improve efficiency and reduce costs associated with re-dispensing doses reported as missing. Data describing this technology and its impact on the medication use process are limited. The purpose of this study is to assess the impact of dose-tracking technology on pharmacy workload and drug expense at an academic, acute care medical center. Dose-tracking technology was implemented in June 2014. Pre-implementation data were collected from February to April 2014. Post-implementation data were collected from July to September 2014. The primary endpoint was the percent of re-dispensed oral syringe and compounded sterile product (CSP) doses within the pre- and post-implementation periods per 1,000 discharges. Secondary endpoints included pharmaceutical expense generated from re-dispensing doses, labor costs, and staff satisfaction with the medication distribution process. We observed an average 6% decrease in re-dispensing of oral syringe and CSP doses from pre- to post-implementation (15,440 vs 14,547 doses; p = .047). However, when values were adjusted per 1,000 discharges, this trend did not reach statistical significance (p = .074). Pharmaceutical expense generated from re-dispensing doses was significantly reduced from pre- to post-implementation ($834,830 vs $746,466 [savings of $88,364]; p = .047). We estimated that $2,563 worth of technician labor was avoided in re-dispensing missing doses. We also saw significant improvement in staff perception of technology assisting in reducing missing doses (p = .0003), as well as improvement in effectiveness of resolving or minimizing missing doses (p = .01). The use of mobile dose-tracking technology demonstrated meaningful reductions in both the number of doses re-dispensed and cost of pharmaceuticals dispensed.

  12. What characterizes the work culture at a hospital unit that successfully implements change - a correlation study.

    PubMed

    André, Beate; Sjøvold, Endre

    2017-07-14

    To successfully achieve change in healthcare, a balance between technology and "people ware", the human resources, is necessary. However, the human aspect of the change implementation process has received less attention than the technological issues. The aim was to explore the factors that characterize the work culture in a hospital unit that successfully implemented change compared with the factors that characterize the work culture of a hospital unit with unsuccessful implementation. The Systematizing Person-Group Relations method was used for gathering and analyzing data to explore what dominates the behavior in a particular work environment, identifying challenges, limitations and opportunities. This method applied six different dimensions, each representing different behavior in a work culture: Synergy, Withdrawal, Opposition, Dependence, Control and Nurture. We compared two different units at the same hospital, one that successfully implemented change and one that was unsuccessful. There were significant statistical differences between healthcare personnel working at a unit that successfully implemented change contrasted with the unit with unsuccessful implementation. These significant differences were found in both the synergy and control dimensions, which are important positive qualities in a work culture. The results of this study show that healthcare personnel at a unit with a successful implementation of change have a working environment with many positive qualities. This indicates that a work environment with a high focus on goal achievement and task orientation can handle the challenges of implementing changes.

  13. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  14. A Diagnostics Tool to detect ensemble forecast system anomaly and guide operational decisions

    NASA Astrophysics Data System (ADS)

    Park, G. H.; Srivastava, A.; Shrestha, E.; Thiemann, M.; Day, G. N.; Draijer, S.

    2017-12-01

    The hydrologic community is moving toward using ensemble forecasts to take uncertainty into account during the decision-making process. The New York City Department of Environmental Protection (DEP) implements several types of ensemble forecasts in their decision-making process: ensemble products for a statistical model (Hirsch and enhanced Hirsch); the National Weather Service (NWS) Advanced Hydrologic Prediction Service (AHPS) forecasts based on the classical Ensemble Streamflow Prediction (ESP) technique; and the new NWS Hydrologic Ensemble Forecasting Service (HEFS) forecasts. To remove structural error and apply the forecasts to additional forecast points, the DEP post-processes both the AHPS and the HEFS forecasts. These ensemble forecasts provide large quantities of complex data, and drawing conclusions from these forecasts is time-consuming and difficult. The complexity of these forecasts also makes it difficult to identify system failures resulting from poor data, missing forecasts, and server breakdowns. To address these issues, we developed a diagnostic tool that summarizes ensemble forecasts and provides additional information such as historical forecast statistics, forecast skill, and model forcing statistics. This additional information highlights the key information that enables operators to evaluate the forecast in real-time, dynamically interact with the data, and review additional statistics, if needed, to make better decisions. We used Bokeh, a Python interactive visualization library, and a multi-database management system to create this interactive tool. This tool compiles and stores data into HTML pages that allow operators to readily analyze the data with built-in user interaction features. This paper will present a brief description of the ensemble forecasts, forecast verification results, and the intended applications for the diagnostic tool.
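
    The kind of interactive summary the tool provides can be sketched with Bokeh (which the abstract names): plot the ensemble median and an inter-quantile band for one forecast point and save a standalone HTML page. The data, file name and layout here are invented, not DEP's actual pages, and the width/height arguments assume a recent Bokeh version.

      import numpy as np
      import pandas as pd
      from bokeh.plotting import figure, output_file, save

      rng = np.random.default_rng(0)
      dates = pd.date_range("2017-10-01", periods=30)           # hypothetical horizon
      ensemble = rng.gamma(4.0, 25.0, size=(50, 30)).cumsum(1)  # 50 fake flow traces

      lo, med, hi = np.percentile(ensemble, [10, 50, 90], axis=0)

      p = figure(x_axis_type="datetime", width=700, height=300,
                 title="Ensemble streamflow forecast (illustrative data)")
      p.varea(x=dates, y1=lo, y2=hi, fill_alpha=0.3, legend_label="10-90% band")
      p.line(dates, med, line_width=2, legend_label="ensemble median")
      p.yaxis.axis_label = "Cumulative flow"

      output_file("ensemble_summary.html")
      save(p)       # writes a standalone interactive HTML page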

  15. Automatic Image Processing Workflow for the Keck/NIRC2 Vortex Coronagraph

    NASA Astrophysics Data System (ADS)

    Xuan, Wenhao; Cook, Therese; Ngo, Henry; Zawol, Zoe; Ruane, Garreth; Mawet, Dimitri

    2018-01-01

    The Keck/NIRC2 camera, equipped with the vortex coronagraph, is an instrument targeted at the high contrast imaging of extrasolar planets. To uncover a faint planet signal from the overwhelming starlight, we utilize the Vortex Image Processing (VIP) library, which carries out principal component analysis to model and remove the stellar point spread function. To bridge the gap between data acquisition and data reduction, we implement a workflow that 1) downloads, sorts, and processes data with VIP, 2) stores the analysis products into a database, and 3) displays the reduced images, contrast curves, and auxiliary information on a web interface. Both angular differential imaging and reference star differential imaging are implemented in the analysis module. A real-time version of the workflow runs during observations, allowing observers to make educated decisions about time distribution on different targets, hence optimizing science yield. The post-night version performs a standardized reduction after the observation, building up a valuable database that not only helps uncover new discoveries, but also enables a statistical study of the instrument itself. We present the workflow, and an examination of the contrast performance of the NIRC2 vortex with respect to factors including target star properties and observing conditions.
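
    The core principal-component step that the workflow delegates to VIP can be sketched in a few lines of NumPy: model the stellar point spread function of each frame from the leading principal components of the image cube and subtract it. This is only a schematic of the technique; the actual VIP routines additionally handle derotation, masking and frame selection, and the array sizes below are arbitrary.

      import numpy as np

      def pca_psf_subtract(cube, n_comp=5):
          """Subtract a low-rank PSF model from each frame of an image cube.

          cube : array of shape (n_frames, ny, nx).
          Returns the residual cube of the same shape.
          """
          n_frames, ny, nx = cube.shape
          X = cube.reshape(n_frames, -1)
          mean = X.mean(axis=0)
          Xc = X - mean
          # Principal components of the (frame x pixel) matrix via SVD.
          _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
          basis = Vt[:n_comp]                     # leading PSF modes
          model = Xc @ basis.T @ basis + mean     # low-rank PSF reconstruction
          return (X - model).reshape(n_frames, ny, nx)

      cube = np.random.default_rng(1).normal(size=(40, 64, 64))  # stand-in data
      residuals = pca_psf_subtract(cube, n_comp=5)
      print(residuals.shape, residuals.std())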

  16. Implementation of Malaria Dynamic Models in Municipality Level Early Warning Systems in Colombia. Part I: Description of Study Sites

    PubMed Central

    Ruiz, Daniel; Cerón, Viviana; Molina, Adriana M.; Quiñónes, Martha L.; Jiménez, Mónica M.; Ahumada, Martha; Gutiérrez, Patricia; Osorio, Salua; Mantilla, Gilma; Connor, Stephen J.; Thomson, Madeleine C.

    2014-01-01

    As part of the Integrated National Adaptation Pilot project and the Integrated Surveillance and Control System, the Colombian National Institute of Health is working on the design and implementation of a Malaria Early Warning System framework, supported by seasonal climate forecasting capabilities, weather and environmental monitoring, and malaria statistical and dynamic models. In this report, we provide an overview of the local ecoepidemiologic settings where four malaria process-based mathematical models are currently being implemented at a municipal level. The description includes general characteristics, malaria situation (predominant type of infection, malaria-positive cases data, malaria incidence, and seasonality), entomologic conditions (primary and secondary vectors, mosquito densities, and feeding frequencies), climatic conditions (climatology and long-term trends), key drivers of epidemic outbreaks, and non-climatic factors (populations at risk, control campaigns, and socioeconomic conditions). Selected pilot sites exhibit different ecoepidemiologic settings that must be taken into account in the development of the integrated surveillance and control system. PMID:24891460

  17. Extending the Peak Bandwidth of Parameters for Softmax Selection in Reinforcement Learning.

    PubMed

    Iwata, Kazunori

    2016-05-11

    Softmax selection is one of the most popular methods for action selection in reinforcement learning. Although various recently proposed methods may be more effective with full parameter tuning, implementing a complicated method that requires the tuning of many parameters can be difficult. Thus, softmax selection is still worth revisiting, considering the cost savings of its implementation and tuning. In fact, this method works adequately in practice with only one parameter appropriately set for the environment. The aim of this paper is to improve the variable setting of this method to extend the bandwidth of good parameters, thereby reducing the cost of implementation and parameter tuning. To achieve this, we take advantage of the asymptotic equipartition property in a Markov decision process to extend the peak bandwidth of softmax selection. Using a variety of episodic tasks, we show that our setting is effective in extending the bandwidth and that it yields a better policy in terms of stability. The bandwidth is quantitatively assessed in a series of statistical tests.
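
    For reference, standard softmax (Boltzmann) action selection, the method being revisited: actions are drawn with probability proportional to exp(Q/t), where the temperature t is the single parameter whose useful bandwidth the paper seeks to extend. The action values below are illustrative.

      import numpy as np

      def softmax_select(q_values, temperature, rng):
          """Sample an action index with Boltzmann probabilities exp(Q/t)/sum."""
          q = np.asarray(q_values, dtype=float)
          z = (q - q.max()) / temperature           # subtract max for stability
          probs = np.exp(z) / np.exp(z).sum()
          return rng.choice(len(q), p=probs), probs

      rng = np.random.default_rng(0)
      q = [1.0, 1.5, 0.2]                           # illustrative action values
      for t in (0.1, 1.0, 10.0):                    # near-greedy -> nearly uniform
          action, probs = softmax_select(q, t, rng)
          print(f"temperature={t}: probs={np.round(probs, 3)}, sampled action={action}")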

  18. Cross-Identification of Astronomical Catalogs on Multiple GPUs

    NASA Astrophysics Data System (ADS)

    Lee, M. A.; Budavári, T.

    2013-10-01

    One of the most fundamental problems in observational astronomy is the cross-identification of sources. Observations are made in different wavelengths, at different times, and from different locations and instruments, resulting in a large set of independent observations. The scientific outcome is often limited by our ability to quickly perform meaningful associations between detections. The matching, however, is difficult scientifically, statistically, as well as computationally. The former two require detailed physical modeling and advanced probabilistic concepts; the latter is due to the large volumes of data and the problem's combinatorial nature. In order to tackle the computational challenge and to prepare for future surveys, whose measurements will be exponentially increasing in size past the scale of feasible CPU-based solutions, we developed a new implementation which addresses the issue by performing the associations on multiple Graphics Processing Units (GPUs). Our implementation utilizes up to 6 GPUs in combination with the Thrust library to achieve an over 40x speed-up versus the previous best implementation running on a multi-CPU SQL Server.

  19. A Brain-Machine Interface Operating with a Real-Time Spiking Neural Network Control Algorithm.

    PubMed

    Dethier, Julie; Nuyujukian, Paul; Eliasmith, Chris; Stewart, Terry; Elassaad, Shauki A; Shenoy, Krishna V; Boahen, Kwabena

    2011-01-01

    Motor prostheses aim to restore function to disabled patients. Despite compelling proof of concept systems, barriers to clinical translation remain. One challenge is to develop a low-power, fully-implantable system that dissipates only minimal power so as not to damage tissue. To this end, we implemented a Kalman-filter based decoder via a spiking neural network (SNN) and tested it in brain-machine interface (BMI) experiments with a rhesus monkey. The Kalman filter was trained to predict the arm's velocity and mapped on to the SNN using the Neural Engineering Framework (NEF). A 2,000-neuron embedded Matlab SNN implementation runs in real-time and its closed-loop performance is quite comparable to that of the standard Kalman filter. The success of this closed-loop decoder holds promise for hardware SNN implementations of statistical signal processing algorithms on neuromorphic chips, which may offer power savings necessary to overcome a major obstacle to the successful clinical translation of neural motor prostheses.
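
    A bare-bones sketch of the kind of Kalman-filter decoder the SNN approximates: the state is the arm velocity, the observation is a vector of firing rates, and the matrices A, C, W, Q would normally be fit to training data. Here they are made up, so this is only an illustration of the recursion, not the study's trained decoder.

      import numpy as np

      def kalman_decode(rates, A, W, C, Q):
          """Run a linear Kalman filter over a sequence of firing-rate vectors.

          rates : (T, n_neurons) observations; returns (T, n_state) estimates.
          """
          n = A.shape[0]
          x = np.zeros(n)
          P = np.eye(n)
          out = []
          for y in rates:
              # Predict.
              x = A @ x
              P = A @ P @ A.T + W
              # Update.
              S = C @ P @ C.T + Q
              K = P @ C.T @ np.linalg.inv(S)
              x = x + K @ (y - C @ x)
              P = (np.eye(n) - K @ C) @ P
              out.append(x.copy())
          return np.array(out)

      rng = np.random.default_rng(0)
      A = 0.95 * np.eye(2)                 # velocity persistence
      W = 0.01 * np.eye(2)                 # process noise
      C = rng.normal(size=(30, 2))         # tuning of 30 hypothetical neurons
      Q = np.eye(30)                       # observation noise
      rates = rng.normal(size=(100, 30))   # stand-in firing rates
      print(kalman_decode(rates, A, W, C, Q).shape)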

  20. Flexible, fast and accurate sequence alignment profiling on GPGPU with PaSWAS.

    PubMed

    Warris, Sven; Yalcin, Feyruz; Jackson, Katherine J L; Nap, Jan Peter

    2015-01-01

    To obtain large-scale sequence alignments in a fast and flexible way is an important step in the analyses of next generation sequencing data. Applications based on the Smith-Waterman (SW) algorithm are often either not fast enough, limited to dedicated tasks or not sufficiently accurate due to statistical issues. Current SW implementations that run on graphics hardware do not report the alignment details necessary for further analysis. With the Parallel SW Alignment Software (PaSWAS) it is possible (a) to have easy access to the computational power of NVIDIA-based general purpose graphics processing units (GPGPUs) to perform high-speed sequence alignments, and (b) retrieve relevant information such as score, number of gaps and mismatches. The software reports multiple hits per alignment. The added value of the new SW implementation is demonstrated with two test cases: (1) tag recovery in next generation sequence data and (2) isotype assignment within an immunoglobulin 454 sequence data set. Both cases show the usability and versatility of the new parallel Smith-Waterman implementation.
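
    For reference, the sequential Smith-Waterman recurrence that PaSWAS parallelises on the GPU, local alignment with a linear gap penalty, can be written compactly; this plain-Python version with assumed scoring parameters is for clarity, not speed, and is not the PaSWAS code itself.

      import numpy as np

      def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
          """Local alignment score and end position (linear gap penalty)."""
          H = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
          best, best_pos = 0, (0, 0)
          for i in range(1, len(a) + 1):
              for j in range(1, len(b) + 1):
                  s = match if a[i - 1] == b[j - 1] else mismatch
                  H[i, j] = max(0,
                                H[i - 1, j - 1] + s,   # match / mismatch
                                H[i - 1, j] + gap,     # gap in b
                                H[i, j - 1] + gap)     # gap in a
                  if H[i, j] > best:
                      best, best_pos = H[i, j], (i, j)
          return best, best_pos

      print(smith_waterman("ACACACTA", "AGCACACA"))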

  1. To Assess Prerequisites Before an Implementation Strategy in an Orthopaedic Department in Sweden.

    PubMed

    Bahtsevani, Christel; Idvall, Ewa

    2016-01-01

    Promoting Action on Research Implementation in Health Services (PARiHS) asserts that the success of knowledge implementation relates to multiple factors in a complex and dynamic way, and therefore the effects of implementation strategies vary by method and context. An instrument based on the PARiHS framework was developed to help assess critical factors influencing implementation strategies so that strategies can be tailored to promote implementation. The purpose of this study was to use the Evaluation Before Implementation Questionnaire (EBIQ), to describe staff perceptions in one orthopaedic department, and to investigate differences between wards. Staff members in four different wards at one orthopaedic department at a university hospital in Sweden were invited to complete a questionnaire related to planning for the implementation of a clinical practice guideline. The 23 items in the EBIQ were expected to capture staff perceptions about the evidence, context, and facilitation factors that influence the implementation process. Descriptive statistics and differences between wards were analyzed. Although the overall response rate was low (n = 49), two of the four wards accounted for most of the completed questionnaires (n = 25 and n = 12, respectively), enabling a comparison of these wards. We found significant differences between respondents' perceptions at the two wards in six items regarding context and facilitation in terms of receptiveness to change, forms of leadership, and evaluation and presence of feedback and facilitators. The EBIQ instrument requires further testing, but there appears to be initial support for pre-implementation use of the EBIQ as a means to enhance planning for implementation.

  2. Implementation Status of Accrual Accounting System in Health Sector

    PubMed Central

    Mehrolhassani, Mohammad Hossien; Khayatzadeh-Mahani, Akram; Emami, Mozhgan

    2015-01-01

    Introduction: Management of financial resources in health systems is one of the major issues of concern for policy makers globally. As a sub-set of financial management, the accounting system is of paramount importance. In this paper, which presents part of the results of a wider research project on the transition process from a cash accounting system to an accrual accounting system, we look at the impact of components of change on implementation of the new system. Implementing changes is fraught with many obstacles, and surveying these challenges will help policy makers to better overcome them. Methods: The study applied a quantitative approach in 2012 at Kerman University of Medical Sciences in Iran. For the evaluation, a valid teacher-made questionnaire with a Likert scale was used (Cronbach’s alpha of 0.89), which covered 7 change components of the accounting system. The study population was 32 subordinate units of Kerman University of Medical Sciences, and for data analysis, descriptive and inferential statistics and correlation coefficients in SPSS version 19 were used. Results: The level of effect of all components on the implementation was medium to low (5.06±1.86), except for the components “management & leadership (3.46±2.25)” (undesirable from external evaluators’ viewpoint) and “technology (6.61±1.92) and work processes (6.35±2.19)” (middle to high from internal evaluators’ viewpoint). Conclusions: Results showed that the establishment of an accrual accounting system faces infrastructural challenges, especially regarding the components of leadership and management and followers. As such, developing effective measures to overcome implementation obstacles should target these components. PMID:25560337

  3. Integrating community-based verbal autopsy into civil registration and vital statistics (CRVS): system-level considerations

    PubMed Central

    de Savigny, Don; Riley, Ian; Chandramohan, Daniel; Odhiambo, Frank; Nichols, Erin; Notzon, Sam; AbouZahr, Carla; Mitra, Raj; Cobos Muñoz, Daniel; Firth, Sonja; Maire, Nicolas; Sankoh, Osman; Bronson, Gay; Setel, Philip; Byass, Peter; Jakob, Robert; Boerma, Ties; Lopez, Alan D.

    2017-01-01

    ABSTRACT Background: Reliable and representative cause of death (COD) statistics are essential to inform public health policy, respond to emerging health needs, and document progress towards Sustainable Development Goals. However, less than one-third of deaths worldwide are assigned a cause. Civil registration and vital statistics (CRVS) systems in low- and lower-middle-income countries are failing to provide timely, complete and accurate vital statistics, and it will still be some time before they can provide physician-certified COD for every death. Proposals: Verbal autopsy (VA) is a method to ascertain the probable COD and, although imperfect, it is the best alternative in the absence of medical certification. There is extensive experience with VA in research settings but only a few examples of its use on a large scale. Data collection using electronic questionnaires on mobile devices and computer algorithms to analyse responses and estimate probable COD have increased the potential for VA to be routinely applied in CRVS systems. However, a number of CRVS and health system integration issues should be considered in planning, piloting and implementing a system-wide intervention such as VA. These include addressing the multiplicity of stakeholders and sub-systems involved, integration with existing CRVS work processes and information flows, linking VA results to civil registration records, information technology requirements and data quality assurance. Conclusions: Integrating VA within CRVS systems is not simply a technical undertaking. It will have profound system-wide effects that should be carefully considered when planning for an effective implementation. This paper identifies and discusses the major system-level issues and emerging practices, provides a planning checklist of system-level considerations and proposes an overview for how VA can be integrated into routine CRVS systems. PMID:28137194

  4. Assessing a learning process with functional ANOVA estimators of EEG power spectral densities.

    PubMed

    Gutiérrez, David; Ramírez-Moreno, Mauricio A

    2016-04-01

    We propose to assess the process of learning a task using electroencephalographic (EEG) measurements. In particular, we quantify changes in brain activity associated with the progression of the learning experience through functional analysis-of-variance (FANOVA) estimators of the EEG power spectral density (PSD). Such functional estimators provide a sense of the effect of training on the EEG dynamics. For that purpose, we implemented an experiment to monitor the process of learning to type using the Colemak keyboard layout during a twelve-lesson training. Hence, our aim is to identify statistically significant changes in the PSD of various EEG rhythms at different stages and difficulty levels of the learning process. Those changes are taken into account only when a probabilistic measure of the cognitive state ensures the high engagement of the volunteer with the training. Based on this, a series of statistical tests are performed in order to determine the personalized frequencies and sensors at which changes in PSD occur; then the FANOVA estimates are computed and analyzed. Our experimental results showed a significant decrease in the power of [Formula: see text] and [Formula: see text] rhythms for ten volunteers during the learning process, and such a decrease happens regardless of the difficulty of the lesson. These results are in agreement with previous reports of changes in PSD being associated with feature binding and memory encoding.
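
    The per-rhythm PSD quantities that feed such estimators can be obtained with a standard Welch estimate; below is a sketch comparing band power between an early and a late training session on simulated data. The channel, bands and sampling rate are assumptions, and real EEG rather than white noise would of course be used in practice.

      import numpy as np
      from scipy.signal import welch

      def band_power(x, fs, band):
          """Average Welch PSD of signal x within a frequency band (Hz)."""
          f, pxx = welch(x, fs=fs, nperseg=2 * fs)
          mask = (f >= band[0]) & (f <= band[1])
          return pxx[mask].mean()

      fs = 256                                        # assumed sampling rate
      rng = np.random.default_rng(0)
      early = rng.normal(size=60 * fs)                # stand-in EEG, lesson 1
      late = rng.normal(size=60 * fs)                 # stand-in EEG, lesson 12

      for name, band in {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}.items():
          p_early, p_late = band_power(early, fs, band), band_power(late, fs, band)
          print(f"{name}: early={p_early:.3e}, late={p_late:.3e}")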

  5. Exploring how ward staff engage with the implementation of a patient safety intervention: a UK-based qualitative process evaluation

    PubMed Central

    Sheard, Laura; Marsh, Claire; O’Hara, Jane; Armitage, Gerry; Wright, John; Lawton, Rebecca

    2017-01-01

    Objectives A patient safety intervention was tested in a 33-ward randomised controlled trial. No statistically significant difference between intervention and control wards was found. We conducted a process evaluation of the trial and our aim in this paper is to understand staff engagement across the 17 intervention wards. Design Large qualitative process evaluation of the implementation of a patient safety intervention. Setting and participants National Health Service staff based on 17 acute hospital wards located at five hospital sites in the North of England. Data We concentrate on three sources here: (1) analysis of taped discussion between ward staff during action planning meetings; (2) facilitators’ field notes and (3) follow-up telephone interviews with staff focusing on whether action plans had been achieved. The analysis involved the use of pen portraits and adaptive theory. Findings First, there were palpable differences in the ways that the 17 ward teams engaged with the key components of the intervention. Five main engagement typologies were evident across the life course of the study: consistent, partial, increasing, decreasing and disengaged. Second, the intensity of support for the intervention at the level of the organisation does not predict the strength of engagement at the level of the individual ward team. Third, the standardisation of facilitative processes provided by the research team does not ensure that implementation standardisation of the intervention occurs by ward staff. Conclusions A dilution of the intervention occurred during the trial because wards engaged with Patient Reporting and Action for a Safe Environment (PRASE) in divergent ways, despite the standardisation of key components. Facilitative processes were not sufficiently adequate to enable intervention wards to successfully engage with PRASE components. PMID:28710206

  6. GTest: a software tool for graphical assessment of empirical distributions' Gaussianity.

    PubMed

    Barca, E; Bruno, E; Bruno, D E; Passarella, G

    2016-03-01

    In the present paper, the novel software GTest is introduced, designed for testing the normality of a user-specified empirical distribution. It has been implemented with two unusual characteristics: the first is the user option of selecting four different versions of the normality test, each of them suited to a specific dataset or goal, and the second is the inferential paradigm that informs the output of such tests: it is basically graphical and intrinsically self-explanatory. The concept of inference-by-eye is an emerging inferential approach which will find successful application in the near future due to the growing need of widening the audience of users of statistical methods to people with informal statistical skills. For instance, the latest European regulation concerning environmental issues introduced strict protocols for data handling (data quality assurance, outliers detection, etc.) and information exchange (areal statistics, trend detection, etc.) between regional and central environmental agencies. Therefore, more and more frequently, laboratory and field technicians will be requested to utilize complex software applications for subjecting data coming from monitoring, surveying or laboratory activities to specific statistical analyses. Unfortunately, inferential statistics, which actually influence the decision processes for the correct management of environmental resources, are often implemented in a way which expresses their outcomes in a numerical form with brief comments in strict statistical jargon (degrees of freedom, level of significance, accepted/rejected H0, etc.). Therefore, the interpretation of such outcomes is often really difficult for people with poor statistical knowledge. In such a framework, the paradigm of visual inference can contribute to fill this gap, providing outcomes in self-explanatory graphical form with a brief comment in common language. Actually, the difficulties experienced by colleagues and their request for an effective tool for addressing such difficulties motivated us to adopt the inference-by-eye paradigm and to implement an easy-to-use, quick and reliable statistical tool. GTest visualizes its outcomes as a modified version of the Q-Q plot. The application has been developed in Visual Basic for Applications (VBA) within MS Excel 2010, which proved to have all the characteristics of robustness and reliability needed. GTest provides true graphical normality tests which are as reliable as any quantitative statistical approach but much easier to understand. The Q-Q plots have been integrated with the outlining of an acceptance region around the representation of the theoretical distribution, defined in accordance with the alpha level of significance and the data sample size. The test decision rule is the following: if the empirical scatterplot falls completely within the acceptance region, then it can be concluded that the empirical distribution fits the theoretical one at the given alpha level. A comprehensive case study has been carried out with simulated and real-world data in order to check the robustness and reliability of the software.
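
    The inference-by-eye idea can be reproduced independently in a few lines (this is not GTest's VBA code or its exact band): draw a normal Q-Q plot together with a pointwise acceptance band derived from the Beta distribution of uniform order statistics; if the empirical points stay inside the band, normality is not rejected at the chosen level. NumPy, SciPy and matplotlib are assumed to be available, and the plotting positions and band construction are one reasonable choice among several.

      import numpy as np
      from scipy.stats import beta, norm
      import matplotlib.pyplot as plt

      def qq_with_band(sample, alpha=0.05):
          """Normal Q-Q plot with a pointwise acceptance band (order statistics)."""
          x = np.sort(np.asarray(sample, float))
          n = x.size
          i = np.arange(1, n + 1)
          p = (i - 0.375) / (n + 0.25)                  # Blom plotting positions
          theo = norm.ppf(p)
          # Pointwise band: order statistics of uniforms are Beta(i, n-i+1).
          lo = norm.ppf(beta.ppf(alpha / 2, i, n - i + 1))
          hi = norm.ppf(beta.ppf(1 - alpha / 2, i, n - i + 1))
          mu, sd = x.mean(), x.std(ddof=1)
          plt.fill_between(theo, mu + sd * lo, mu + sd * hi, alpha=0.3,
                           label=f"{100 * (1 - alpha):.0f}% acceptance region")
          plt.plot(theo, x, "o", ms=3, label="empirical quantiles")
          plt.plot(theo, mu + sd * theo, "--", label="theoretical normal")
          plt.xlabel("theoretical quantiles")
          plt.ylabel("sample quantiles")
          plt.legend()
          plt.savefig("qq_band.png", dpi=150)

      qq_with_band(np.random.default_rng(0).normal(10.0, 2.0, size=80))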

  7. New Optical Transforms For Statistical Image Recognition

    NASA Astrophysics Data System (ADS)

    Lee, Sing H.

    1983-12-01

    In optical implementation of statistical image recognition, new optical transforms on large images for real-time recognition are of special interest. Several important linear transformations frequently used in statistical pattern recognition have now been optically implemented, including the Karhunen-Loeve transform (KLT), the Fukunaga-Koontz transform (FKT) and the least-squares linear mapping technique (LSLMT).1-3 The KLT performs principle components analysis on one class of patterns for feature extraction. The FKT performs feature extraction for separating two classes of patterns. The LSLMT separates multiple classes of patterns by maximizing the interclass differences and minimizing the intraclass variations.

  8. Statistical innovations in diagnostic device evaluation.

    PubMed

    Yu, Tinghui; Li, Qin; Gray, Gerry; Yue, Lilly Q

    2016-01-01

    Due to rapid technological development, innovations in diagnostic devices are proceeding at an extremely fast pace. Accordingly, the needs for adopting innovative statistical methods have emerged in the evaluation of diagnostic devices. Statisticians in the Center for Devices and Radiological Health at the Food and Drug Administration have provided leadership in implementing statistical innovations. The innovations discussed in this article include: the adoption of bootstrap and Jackknife methods, the implementation of appropriate multiple reader multiple case study design, the application of robustness analyses for missing data, and the development of study designs and data analyses for companion diagnostics.
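
    One of the methods named above, the bootstrap, applied to a simple diagnostic accuracy measure: a percentile confidence interval for sensitivity from paired test/reference results. The data are synthetic and the resampling scheme is a generic illustration, not a regulatory recommendation.

      import numpy as np

      def bootstrap_sensitivity_ci(test_pos, truth_pos, n_boot=5000, alpha=0.05, seed=0):
          """Percentile bootstrap CI for sensitivity = P(test+ | disease+)."""
          rng = np.random.default_rng(seed)
          test_pos = np.asarray(test_pos, bool)
          truth_pos = np.asarray(truth_pos, bool)
          idx_diseased = np.flatnonzero(truth_pos)
          stats = []
          for _ in range(n_boot):
              resample = rng.choice(idx_diseased, size=idx_diseased.size, replace=True)
              stats.append(test_pos[resample].mean())
          point = test_pos[idx_diseased].mean()
          lo, hi = np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])
          return point, (lo, hi)

      rng = np.random.default_rng(1)
      truth = rng.random(200) < 0.3                       # 30% disease prevalence
      test = np.where(truth, rng.random(200) < 0.85,      # 85% sensitivity
                             rng.random(200) < 0.10)      # 10% false-positive rate
      print(bootstrap_sensitivity_ci(test, truth))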

  9. Design of Secure ECG-Based Biometric Authentication in Body Area Sensor Networks

    PubMed Central

    Peter, Steffen; Pratap Reddy, Bhanu; Momtaz, Farshad; Givargis, Tony

    2016-01-01

    Body area sensor networks (BANs) utilize wireless communicating sensor nodes attached to a human body for convenience, safety, and health applications. Physiological characteristics of the body, such as the heart rate or Electrocardiogram (ECG) signals, are promising means to simplify the setup process and to improve security of BANs. This paper describes the design and implementation steps required to realize an ECG-based authentication protocol to identify sensor nodes attached to the same human body. Therefore, the first part of the paper addresses the design of a body-area sensor system, including the hardware setup, analogue and digital signal processing, and required ECG feature detection techniques. A model-based design flow is applied, and strengths and limitations of each design step are discussed. Real-world measured data originating from the implemented sensor system are then used to set up and parametrize a novel physiological authentication protocol for BANs. The authentication protocol utilizes statistical properties of expected and detected deviations to limit the number of false positive and false negative authentication attempts. The result of the described holistic design effort is the first practical implementation of biometric authentication in BANs that reflects timing and data uncertainties in the physical and cyber parts of the system. PMID:27110785

  10. Enhancing professionalism using ethics education as part of a dental licensure board's disciplinary action. Part 2. Evidence of the process.

    PubMed

    Bebeau, Muriel J

    2009-01-01

    Pretest scores were analyzed for 41 professionals referred for ethics assessment by a dental licensing board. Two were exempt from instruction based on pretest performance on five well-validated measures; 38 completed an individualized course designed to remediate deficiencies in ethical abilities. Statistically significant change (effect sizes ranging from .55 to 5.0) was observed for ethical sensitivity (DEST scores), moral reasoning (DIT scores), and role concept (essays and PROI scores). Analysis of the relationships between ability deficiencies and disciplinary actions supports the explanatory power of Rest's Four Component Model of Morality. Of particular interest is the way the model helped referred professionals deconstruct summary judgments about character and see them as capacities that can be further developed. The performance-based assessments, especially the DEST, were particularly useful in identifying shortcomings in ethical implementation. Referred practitioners highly valued the emphasis on ethical implementation, suggesting the importance of addressing what to do and say in ethically challenging cases. Finally, the required self-assessments of learning confirm the value of the process for professional renewal (i.e., a renewed commitment to professional ideals) and of enhanced abilities not only to reason about moral problems, but to implement actions.

  11. iCHAMPSS: Usability and Psychosocial Impact for Increasing Implementation of Sexual Health Education.

    PubMed

    Hernandez, Belinda F; Peskin, Melissa F; Shegog, Ross; Gabay, Efrat K; Cuccaro, Paula M; Addy, Robert C; Ratliff, Eric; Emery, Susan T; Markham, Christine M

    2017-05-01

    Diffusion of sexual health evidence-based programs (EBPs) in schools is a complex and challenging process. iCHAMPSS (CHoosing And Maintaining effective Programs for Sex education in Schools) is an innovative theory- and Web-based decision support system that may help facilitate this process. The purpose of this study was to pilot-test iCHAMPSS for usability and short-term psychosocial impact. School district stakeholders from across Texas were recruited ( N = 16) and given access to iCHAMPSS for 3 weeks in fall 2014. Pre- and posttests were administered to measure usability parameters and short-term psychosocial outcomes. Data were analyzed using descriptive statistics and the Wilcoxon signed-rank test. Most participants reported that iCHAMPSS was easy to use, credible, helpful, and of sufficient motivational appeal. iCHAMPSS significantly increased participants' self-efficacy to obtain approval from their board of trustees to implement a sexual health EBP. Positive, though nonsignificant, trends included increased knowledge to locate EBPs, skills to prioritize sexual health education at the district level, and ability to choose an EBP that best meets district needs. iCHAMPSS is an innovative decision support system that could accelerate uptake of EBPs by facilitating diffusion and advance the field of dissemination and implementation science for the promotion of sexual health EBPs.

  12. A simplified implementation of edge detection in MATLAB is faster and more sensitive than fast fourier transform for actin fiber alignment quantification.

    PubMed

    Kemeny, Steven Frank; Clyne, Alisa Morss

    2011-04-01

    Fiber alignment plays a critical role in the structure and function of cells and tissues. While fiber alignment quantification is important to experimental analysis and several different methods for quantifying fiber alignment exist, many studies focus on qualitative rather than quantitative analysis perhaps due to the complexity of current fiber alignment methods. Speed and sensitivity were compared in edge detection and fast Fourier transform (FFT) for measuring actin fiber alignment in cells exposed to shear stress. While edge detection using matrix multiplication was consistently more sensitive than FFT, image processing time was significantly longer. However, when MATLAB functions were used to implement edge detection, MATLAB's efficient element-by-element calculations and fast filtering techniques reduced computation cost 100 times compared to the matrix multiplication edge detection method. The new computation time was comparable to the FFT method, and MATLAB edge detection produced well-distributed fiber angle distributions that statistically distinguished aligned and unaligned fibers in half as many sample images. When the FFT sensitivity was improved by dividing images into smaller subsections, processing time grew larger than the time required for MATLAB edge detection. Implementation of edge detection in MATLAB is simpler, faster, and more sensitive than FFT for fiber alignment quantification.
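
    The edge-detection idea can be reproduced outside MATLAB as well: estimate local gradient orientations with Sobel filters and summarise the fibre angle distribution with an axial (angle-doubled) alignment index. This is a generic sketch with an assumed synthetic image, not the authors' MATLAB implementation.

      import numpy as np
      from scipy import ndimage

      def fiber_orientation_stats(image, magnitude_quantile=0.75):
          """Estimate fibre orientations from image gradients (Sobel filters).

          Returns the dominant angle (degrees) and an alignment index in [0, 1],
          computed with angle doubling because orientations are axial (0 == 180).
          """
          gy = ndimage.sobel(image, axis=0)
          gx = ndimage.sobel(image, axis=1)
          mag = np.hypot(gx, gy)
          mask = mag > np.quantile(mag, magnitude_quantile)   # keep strong edges
          theta = np.arctan2(gy[mask], gx[mask]) + np.pi / 2  # fibre normal to gradient
          z = np.exp(2j * theta)
          alignment = np.abs(z.mean())                        # 1 = perfectly aligned
          dominant = 0.5 * np.angle(z.mean())
          return np.degrees(dominant) % 180, alignment

      # Synthetic striped test image; the stripes (fibres) run at roughly 120 degrees.
      yy, xx = np.mgrid[0:256, 0:256]
      img = np.sin(0.2 * (xx * np.cos(np.radians(30)) + yy * np.sin(np.radians(30))))
      img += 0.2 * np.random.default_rng(0).normal(size=img.shape)
      print(fiber_orientation_stats(img))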

  13. Design of Secure ECG-Based Biometric Authentication in Body Area Sensor Networks.

    PubMed

    Peter, Steffen; Reddy, Bhanu Pratap; Momtaz, Farshad; Givargis, Tony

    2016-04-22

    Body area sensor networks (BANs) utilize wireless communicating sensor nodes attached to a human body for convenience, safety, and health applications. Physiological characteristics of the body, such as the heart rate or Electrocardiogram (ECG) signals, are promising means to simplify the setup process and to improve security of BANs. This paper describes the design and implementation steps required to realize an ECG-based authentication protocol to identify sensor nodes attached to the same human body. To that end, the first part of the paper addresses the design of a body-area sensor system, including the hardware setup, analogue and digital signal processing, and required ECG feature detection techniques. A model-based design flow is applied, and strengths and limitations of each design step are discussed. Real-world measured data originating from the implemented sensor system are then used to set up and parametrize a novel physiological authentication protocol for BANs. The authentication protocol utilizes statistical properties of expected and detected deviations to limit the number of false positive and false negative authentication attempts. The result of the described holistic design effort is the first practical implementation of biometric authentication in BANs that reflects timing and data uncertainties in the physical and cyber parts of the system.

  14. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to characterize other production processes and to quantify a reliable operating region. © 2016 American Institute of Chemical Engineers. Biotechnol. Prog., 32:799-812, 2016.
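
    A toy Monte Carlo sketch of the Bayesian predictive idea follows: for each candidate operating point, the probability that a quality attribute meets its specification is estimated from hypothetical posterior samples of a linear response-surface model. The model form, coefficient distributions, and specification limit are invented for illustration and are not the study's model.

```python
# Toy sketch of the multivariate Bayesian predictive idea: at each candidate
# operating point, simulate the quality attribute from (hypothetical) posterior
# parameter samples and estimate the probability of meeting the specification.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior samples for a linear response surface:
#   CQA = b0 + b1*temp + b2*pH + b3*temp*pH + noise
n_post = 4000
b0 = rng.normal(10.0, 0.3, n_post)
b1 = rng.normal(-0.8, 0.1, n_post)
b2 = rng.normal(0.5, 0.1, n_post)
b3 = rng.normal(0.05, 0.02, n_post)
sigma = np.abs(rng.normal(0.4, 0.05, n_post))

spec_lower = 8.0                      # invented lower specification limit

temps = np.linspace(-2, 2, 21)        # coded CPP levels
phs = np.linspace(-2, 2, 21)

prob = np.zeros((len(temps), len(phs)))
for i, t in enumerate(temps):
    for j, p in enumerate(phs):
        pred = b0 + b1 * t + b2 * p + b3 * t * p + rng.normal(0, sigma)
        prob[i, j] = np.mean(pred > spec_lower)

# A proven acceptable region could be defined as the points with, say, >= 95%
# predicted probability of meeting the specification.
acceptable = prob >= 0.95
print(f"{acceptable.sum()} of {acceptable.size} grid points meet the 95% criterion")
```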

  15. User manual for Blossom statistical package for R

    USGS Publications Warehouse

    Talbert, Marian; Cade, Brian S.

    2005-01-01

    Blossom is an R package with functions for making statistical comparisons with distance-function based permutation tests developed by P.W. Mielke, Jr. and colleagues at Colorado State University (Mielke and Berry, 2001) and for testing parameters estimated in linear models with permutation procedures developed by B. S. Cade and colleagues at the Fort Collins Science Center, U.S. Geological Survey. This manual is intended to document the statistical methods and interpretations identically to the manual by Cade and Richards (2005) for the original Fortran program, with command inputs and outputs revised to reflect the new implementation as a package for R (R Development Core Team, 2012). This implementation in R has allowed for numerous improvements not supported by the Cade and Richards (2005) Fortran implementation, including use of categorical predictor variables in most routines.
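
    Blossom itself is an R package; as a language-neutral illustration of the permutation idea behind its distance-function tests, here is a generic two-sample permutation test in Python with simulated data and the difference in means as the test statistic. It is not the Blossom interface.

```python
# Generic two-sample permutation test on the difference in means, illustrating
# the permutation idea behind distance-function permutation tests.
import numpy as np

rng = np.random.default_rng(42)
group_a = rng.normal(10.0, 2.0, 25)
group_b = rng.normal(11.5, 2.0, 25)

observed = group_a.mean() - group_b.mean()
pooled = np.concatenate([group_a, group_b])
n_a = len(group_a)

n_perm = 10000
perm_stats = np.empty(n_perm)
for k in range(n_perm):
    rng.shuffle(pooled)                       # random relabeling of observations
    perm_stats[k] = pooled[:n_a].mean() - pooled[n_a:].mean()

# Two-sided p-value: fraction of permuted statistics at least as extreme.
p_value = np.mean(np.abs(perm_stats) >= abs(observed))
print(f"observed difference = {observed:.2f}, permutation p = {p_value:.4f}")
```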

  16. The effect of implementation strength of basic emergency obstetric and newborn care (BEmONC) on facility deliveries and the met need for BEmONC at the primary health care level in Ethiopia.

    PubMed

    Tiruneh, Gizachew Tadele; Karim, Ali Mehryar; Avan, Bilal Iqbal; Zemichael, Nebreed Fesseha; Wereta, Tewabech Gebrekiristos; Wickremasinghe, Deepthi; Keweti, Zinar Nebi; Kebede, Zewditu; Betemariam, Wuleta Aklilu

    2018-05-02

    Basic emergency obstetric and newborn care (BEmONC) is a primary health care level initiative promoted in low- and middle-income countries to reduce maternal and newborn mortality. Tailored support, including BEmONC training to providers, mentoring and monitoring through supportive supervision, provision of equipment and supplies, strengthening referral linkages, and improving infection-prevention practice, was provided in a package of interventions to 134 health centers, covering 91 rural districts of Ethiopia to ensure timely BEmONC care. In recent years, there has been a growing interest in measuring program implementation strength to evaluate public health gains. To assess the effectiveness of the BEmONC initiative, this study measures its implementation strength and examines the effect of its variability across intervention health centers on the rate of facility deliveries and the met need for BEmONC. Before and after data from 134 intervention health centers were collected in April 2013 and July 2015. A BEmONC implementation strength index was constructed from seven input and five process indicators measured through observation, record review, and provider interview; while facility delivery rate and the met need for expected obstetric complications were measured from service statistics and patient records. We estimated the dose-response relationships between outcome and explanatory variables of interest using regression methods. The BEmONC implementation strength index score, which ranged between zero and 10, increased statistically significantly from 4.3 at baseline to 6.7 at follow-up (p < .05). Correspondingly, the health center delivery rate significantly increased from 24% to 56% (p < .05). There was a dose-response relationship between the explanatory and outcome variables. For every unit increase in BEmONC implementation strength score there was a corresponding average of 4.5 percentage points (95% confidence interval: 2.1-6.9) increase in facility-based deliveries; while a higher score for BEmONC implementation strength of a health facility at follow-up was associated with a higher met need. The BEmONC initiative was effective in improving institutional deliveries and may have also improved the met need for BEmONC services. The BEmONC implementation strength index can be potentially used to monitor the implementation of BEmONC interventions.
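
    The dose-response analysis described above can be sketched as a simple regression of the change in facility delivery rate on the implementation strength score; the data below are simulated and the model is deliberately minimal, not the study's regression specification.

```python
# Sketch of a dose-response regression: change in facility delivery rate
# (percentage points) regressed on a BEmONC implementation strength score.
# The data are simulated for illustration, not the study's records.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 134
strength = rng.uniform(3, 10, n)                          # index score, 0-10 scale
delivery_change = 4.5 * strength + rng.normal(0, 15, n)   # invented relationship

X = sm.add_constant(strength)
model = sm.OLS(delivery_change, X).fit()
print(model.params)        # slope ~ percentage-point change per unit of strength
print(model.conf_int())    # 95% confidence intervals
```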

  17. Error threshold for color codes and random three-body Ising models.

    PubMed

    Katzgraber, Helmut G; Bombin, H; Martin-Delgado, M A

    2009-08-28

    We study the error threshold of color codes, a class of topological quantum codes that allow a direct implementation of quantum Clifford gates suitable for entanglement distillation, teleportation, and fault-tolerant quantum computation. We map the error-correction process onto a statistical mechanical random three-body Ising model and study its phase diagram via Monte Carlo simulations. The obtained error threshold of p(c) = 0.109(2) is very close to that of Kitaev's toric code, showing that enhanced computational capabilities do not necessarily imply lower resistance to noise.

  18. Next generation of weather generators on web service framework

    NASA Astrophysics Data System (ADS)

    Chinnachodteeranun, R.; Hung, N. D.; Honda, K.; Ines, A. V. M.

    2016-12-01

    A weather generator is a statistical model that stochastically synthesizes possible realizations of future weather from statistical analysis of long-term historical records, producing several tens to hundreds of realizations. These realizations are essential inputs to crop models for simulating crop growth and yield, and they also support analyses of weather uncertainty at each crop development stage and decision support systems for, e.g., water and fertilizer management. Running a weather generator, however, requires multidisciplinary skills, which tends to restrict its use to the research group that developed it and raises a barrier for newcomers. To standardize both the procedure for running weather generators and the way realizations are obtained, we implemented a framework that exposes weather generators as interoperable web services. Legacy weather generator programs were wrapped in the web service framework, and the service interfaces were implemented following an international standard, the Sensor Observation Service (SOS) defined by the Open Geospatial Consortium (OGC). Clients can request realizations generated by the model through the SOS Web service. The hierarchical data preparation processes required by the weather generator are also implemented as web services and seamlessly wired together, so analysts and applications can easily invoke them over a network. The services facilitate the development of agricultural applications, reduce the analysts' workload for iterative data preparation, and encapsulate the legacy weather generator programs. This architectural design and implementation can serve as a prototype for constructing further services on top of an interoperable sensor network system, and the framework opens an opportunity for application developers and scientists in other fields to utilize weather generators.

  19. Improving Clinical Trial Efficiency: Thinking outside the Box.

    PubMed

    Mandrekar, Sumithra J; Dahlberg, Suzanne E; Simon, Richard

    2015-01-01

    Clinical trial design strategies have evolved over the past few years as a means to accelerate the drug development process so that the right therapies can be delivered to the right patients. Basket, umbrella, and adaptive enrichment strategies represent a class of novel designs for testing targeted therapeutics in oncology. Umbrella trials include a central infrastructure for screening and identification of patients, and focus on a single tumor type or histology with multiple subtrials, each testing a targeted therapy within a molecularly defined subset. Basket trial designs offer the possibility to include multiple molecularly defined subpopulations, often across histology or tumor types, but included in one cohesive design to evaluate the targeted therapy in question. Adaptive enrichment designs offer the potential to enrich for patients with a particular molecular feature that is predictive of benefit for the test treatment based on accumulating evidence from the trial. This review will aim to discuss the fundamentals of these design strategies, the underlying statistical framework, the logistical barriers of implementation, and, ultimately, the interpretation of the trial results. New statistical approaches, extensive multidisciplinary collaboration, and state of the art data capture technologies are needed to implement these strategies in practice. Logistical challenges to implementation arising from centralized assay testing, requirement of multiple specimens, multidisciplinary collaboration, and infrastructure requirements will also be discussed. This review will present these concepts in the context of the National Cancer Institute's precision medicine initiative trials: MATCH, ALCHEMIST, Lung MAP, as well as other trials such as FOCUS4.

  20. Study design and implementation for population pharmacokinetics of Chinese medicine: An expert consensus.

    PubMed

    Jiang, Jun-jie; Zhang, Wen; Xie, Yan-ming; Wang, Jian-nong; He, Fu-yuan; Xiong, Xin

    2016-02-01

    Although many population pharmacokinetic (PPK) studies have been conducted on chemical drugs, few have addressed Chinese medicine (CM). Because each ingredient in a CM has different pharmacokinetic characteristics, methods for PPK studies of CMs are needed to identify differences in CM drug safety and efficacy among population subgroups and to quantify the determinants of CM drug concentrations. To develop an expert consensus on study design and implementation for PPK of CM, six experts in PPK, CM pharmacology, and statistics met in August 2013 to discuss problems with the PPK research protocol for CMs, and a consensus was reached. Medicines with toxicity and narrow therapeutic windows, with a wide target population, or with frequent adverse reactions were selected. Components with definite therapeutic effects were chosen as indices, and specific time points and sample sizes were designed according to standard PPK design methods. Target components were assayed using various chromatographic methods. Total quantity statistical moment analysis was used to estimate the PPK parameters of each component and to build PPK models reflecting the behavior of CMs, which assists in reasonable adjustment of clinical dosage. This consensus specifies the study design and implementation process for PPK of CMs and provides guidance for post-marketing clinical studies, in vivo investigations of metabolism in different populations, and the development and clinical adjustment of CM dosages.

  1. Routine Microsecond Molecular Dynamics Simulations with AMBER on GPUs. 2. Explicit Solvent Particle Mesh Ewald.

    PubMed

    Salomon-Ferrer, Romelia; Götz, Andreas W; Poole, Duncan; Le Grand, Scott; Walker, Ross C

    2013-09-10

    We present an implementation of explicit solvent all atom classical molecular dynamics (MD) within the AMBER program package that runs entirely on CUDA-enabled GPUs. First released publicly in April 2010 as part of version 11 of the AMBER MD package and further improved and optimized over the last two years, this implementation supports the three most widely used statistical mechanical ensembles (NVE, NVT, and NPT), uses particle mesh Ewald (PME) for the long-range electrostatics, and runs entirely on CUDA-enabled NVIDIA graphics processing units (GPUs), providing results that are statistically indistinguishable from the traditional CPU version of the software and with performance that exceeds that achievable by the CPU version of AMBER software running on all conventional CPU-based clusters and supercomputers. We briefly discuss three different precision models developed specifically for this work (SPDP, SPFP, and DPDP) and highlight the technical details of the approach as it extends beyond previously reported work [Götz et al., J. Chem. Theory Comput. 2012, DOI: 10.1021/ct200909j; Le Grand et al., Comp. Phys. Comm. 2013, DOI: 10.1016/j.cpc.2012.09.022]. We highlight the substantial improvements in performance that are seen over traditional CPU-only machines and provide validation of our implementation and precision models. We also provide evidence supporting our decision to deprecate the previously described fully single precision (SPSP) model from the latest release of the AMBER software package.

  2. An add-in implementation of the RESAMPLING syntax under Microsoft EXCEL.

    PubMed

    Meineke, I

    2000-10-01

    The RESAMPLING syntax defines a set of powerful commands, which allow the programming of probabilistic statistical models with few, easily memorized statements. This paper presents an implementation of the RESAMPLING syntax using Microsoft EXCEL with Microsoft WINDOWS(R) as a platform. Two examples are given to demonstrate typical applications of RESAMPLING in biomedicine. Details of the implementation with special emphasis on the programming environment are discussed at length. The add-in is available electronically to interested readers upon request. The add-in makes it convenient to perform numerical statistical analyses of data from within EXCEL.
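
    The add-in itself targets EXCEL; as an illustration of the kind of probabilistic resampling model the RESAMPLING syntax expresses, here is a plain NumPy bootstrap sketch using invented data.

```python
# Bootstrap estimate of a mean and its confidence interval -- the kind of
# resampling model the RESAMPLING syntax is designed to express concisely.
import numpy as np

rng = np.random.default_rng(7)
data = np.array([4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 4.4, 5.8, 4.1, 5.0])

n_boot = 10000
boot_means = np.array([
    rng.choice(data, size=data.size, replace=True).mean()
    for _ in range(n_boot)
])

lo, hi = np.percentile(boot_means, [2.5, 97.5])
print(f"mean = {data.mean():.2f}, 95% bootstrap CI = ({lo:.2f}, {hi:.2f})")
```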

  3. A novel approach to simulate gene-environment interactions in complex diseases.

    PubMed

    Amato, Roberto; Pinelli, Michele; D'Andrea, Daniel; Miele, Gennaro; Nicodemi, Mario; Raiconi, Giancarlo; Cocozza, Sergio

    2010-01-05

    Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite a large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies on their interactions in the epidemiological literature. One reason can be the incomplete knowledge of the power of statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones. We present a mathematical approach that models gene-environment interactions. By this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in a Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics by using standard epidemiological measures and to implement constraints to make the simulator behaviour biologically meaningful. By the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex disease where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population and a Monte Carlo process allows random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of the statistical power when designing a study.
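
    A heavily simplified sketch of this kind of simulation is shown below: case-control status is generated from a logistic risk model with a single gene-environment interaction term. The allele frequency, exposure prevalence, and effect sizes are invented, and this is not the GENS multi-logistic implementation.

```python
# Simplified simulation of case-control data with a one gene - one environment
# interaction acting on disease risk through a logistic model.
import numpy as np

rng = np.random.default_rng(3)
n = 20000

maf = 0.3                                   # minor allele frequency (assumed)
genotype = rng.binomial(2, maf, n)          # 0/1/2 risk-allele count
exposure = rng.binomial(1, 0.4, n)          # binary environmental exposure

# log-odds: baseline + genetic effect + environmental effect + interaction
logit = -2.0 + 0.2 * genotype + 0.4 * exposure + 0.5 * genotype * exposure
p_disease = 1.0 / (1.0 + np.exp(-logit))
disease = rng.binomial(1, p_disease)

cases, controls = disease == 1, disease == 0
print("prevalence:", disease.mean().round(3))
print("mean allele count, cases vs controls:",
      genotype[cases].mean().round(2), genotype[controls].mean().round(2))
```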

  4. AESS: Accelerated Exact Stochastic Simulation

    NASA Astrophysics Data System (ADS)

    Jenkins, David D.; Peterson, Gregory D.

    2011-12-01

    The Stochastic Simulation Algorithm (SSA) developed by Gillespie provides a powerful mechanism for exploring the behavior of chemical systems with small species populations or with important noise contributions. Gene circuit simulations for systems biology commonly employ the SSA method, as do ecological applications. This algorithm tends to be computationally expensive, so researchers seek an efficient implementation of SSA. In this program package, the Accelerated Exact Stochastic Simulation Algorithm (AESS) contains optimized implementations of Gillespie's SSA that improve the performance of individual simulation runs or ensembles of simulations used for sweeping parameters or to provide statistically significant results. Program summary: Program title: AESS. Catalogue identifier: AEJW_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJW_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: University of Tennessee copyright agreement. No. of lines in distributed program, including test data, etc.: 10 861. No. of bytes in distributed program, including test data, etc.: 394 631. Distribution format: tar.gz. Programming language: C for processors, CUDA for NVIDIA GPUs. Computer: Developed and tested on various x86 computers and NVIDIA C1060 Tesla and GTX 480 Fermi GPUs; the system targets x86 workstations, optionally with multicore processors or NVIDIA GPUs as accelerators. Operating system: Tested under Ubuntu Linux OS and CentOS 5.5 Linux OS. Classification: 3, 16.12. Nature of problem: Simulation of chemical systems, particularly with low species populations, can be accurately performed using Gillespie's method of stochastic simulation. Numerous variations on the original stochastic simulation algorithm have been developed, including approaches that produce results with statistics that exactly match the chemical master equation (CME) as well as other approaches that approximate the CME. Solution method: The Accelerated Exact Stochastic Simulation (AESS) tool provides implementations of a wide variety of popular variations on the Gillespie method. Users can select the specific algorithm considered most appropriate. Comparisons between the methods and with other available implementations indicate that AESS provides the fastest known implementation of Gillespie's method for a variety of test models. Users may wish to execute ensembles of simulations to sweep parameters or to obtain better statistical results, so AESS supports acceleration of ensembles of simulation using parallel processing with MPI, SSE vector units on x86 processors, and/or using NVIDIA GPUs with CUDA.
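
    For orientation, a minimal direct-method Gillespie SSA for a birth-death process is sketched below in Python; it illustrates the algorithm that AESS optimizes but is unrelated to the AESS C/CUDA code.

```python
# Minimal direct-method Gillespie SSA for a birth-death process
# (0 --k1--> X, X --k2--> 0). Rate constants are illustrative.
import numpy as np

rng = np.random.default_rng(0)
k1, k2 = 10.0, 0.1        # birth rate, per-molecule death rate
x, t, t_end = 0, 0.0, 100.0
times, counts = [t], [x]

while t < t_end:
    a1 = k1                # propensity of birth
    a2 = k2 * x            # propensity of death
    a0 = a1 + a2
    t += rng.exponential(1.0 / a0)          # time to next reaction
    if rng.random() * a0 < a1:              # choose which reaction fires
        x += 1
    else:
        x -= 1
    times.append(t)
    counts.append(x)

print("final time:", round(times[-1], 1), "molecule count:", counts[-1])
print("steady-state expectation k1/k2 =", k1 / k2)
```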

  5. Feasibility Study on the Use of On-line Multivariate Statistical Process Control for Safeguards Applications in Natural Uranium Conversion Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ladd-Lively, Jennifer L

    2014-01-01

    The objective of this work was to determine the feasibility of using on-line multivariate statistical process control (MSPC) for safeguards applications in natural uranium conversion plants. Multivariate statistical process control is commonly used throughout industry for the detection of faults. For safeguards applications in uranium conversion plants, faults could include the diversion of intermediate products such as uranium dioxide, uranium tetrafluoride, and uranium hexafluoride. This study was limited to a 100 metric ton of uranium (MTU) per year natural uranium conversion plant (NUCP) using the wet solvent extraction method for the purification of uranium ore concentrate. A key component in the multivariate statistical methodology is the Principal Component Analysis (PCA) approach for the analysis of data, development of the base case model, and evaluation of future operations. The PCA approach was implemented through the use of singular value decomposition of the data matrix where the data matrix represents normal operation of the plant. Component mole balances were used to model each of the process units in the NUCP. However, this approach could be applied to any data set. The monitoring framework developed in this research could be used to determine whether or not a diversion of material has occurred at an NUCP as part of an International Atomic Energy Agency (IAEA) safeguards system. This approach can be used to identify the key monitoring locations, as well as locations where monitoring is unimportant. Detection limits at the key monitoring locations can also be established using this technique. Several faulty scenarios were developed to test the monitoring framework after the base case or normal operating conditions of the PCA model were established. In all of the scenarios, the monitoring framework was able to detect the fault. Overall this study was successful at meeting the stated objective.
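
    A generic sketch of PCA-based multivariate SPC follows: a model of normal operation is built via singular value decomposition, and new observations are flagged using Hotelling's T² and the squared prediction error (SPE/Q). The data, retained components, and fault magnitude are illustrative only, not the NUCP mole-balance model from the study.

```python
# Generic PCA-based multivariate SPC sketch: build a model of normal operation
# via SVD, then score new observations with Hotelling's T^2 and SPE/Q.
import numpy as np

rng = np.random.default_rng(5)

# "Normal operation" training data: 500 samples x 6 correlated process variables.
latent = rng.normal(size=(500, 2))
loadings = rng.normal(size=(2, 6))
X = latent @ loadings + 0.1 * rng.normal(size=(500, 6))

mu, sd = X.mean(axis=0), X.std(axis=0)
Xs = (X - mu) / sd

# PCA via SVD; retain k components.
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2
P = Vt[:k].T                         # loadings (6 x k)
lam = (S[:k] ** 2) / (len(Xs) - 1)   # variances of retained scores

def monitor(x_new):
    """Return (T^2, SPE) for one new observation."""
    xs = (x_new - mu) / sd
    scores = xs @ P
    t2 = np.sum(scores ** 2 / lam)           # Hotelling's T^2
    residual = xs - scores @ P.T
    spe = np.sum(residual ** 2)              # squared prediction error (Q)
    return t2, spe

normal_sample = X[0]
faulty_sample = X[0] + np.array([0, 0, 4 * sd[2], 0, 0, 0])  # simulated fault
print("normal :", monitor(normal_sample))
print("faulty :", monitor(faulty_sample))
```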

  6. The impact of an electronic health record on nurse sensitive patient outcomes: an interrupted time series analysis.

    PubMed

    Dowding, Dawn W; Turley, Marianne; Garrido, Terhilda

    2012-01-01

    To evaluate the impact of electronic health record (EHR) implementation on nursing care processes and outcomes. Interrupted time series analysis, 2003-2009. A large US not-for-profit integrated health care organization. 29 hospitals in Northern and Southern California. An integrated EHR including computerized physician order entry, nursing documentation, risk assessment tools, and documentation tools. Percentage of patients with completed risk assessments for hospital acquired pressure ulcers (HAPUs) and falls (process measures) and rates of HAPU and falls (outcome measures). EHR implementation was significantly associated with an increase in documentation rates for HAPU risk (coefficient 2.21, 95% CI 0.67 to 3.75); the increase for fall risk was not statistically significant (0.36; -3.58 to 4.30). EHR implementation was associated with a 13% decrease in HAPU rates (coefficient -0.76, 95% CI -1.37 to -0.16) but no decrease in fall rates (-0.091; -0.29 to 0.11). Irrespective of EHR implementation, HAPU rates decreased significantly over time (-0.16; -0.20 to -0.13), while fall rates did not (0.0052; -0.01 to 0.02). Hospital region was a significant predictor of variation for both HAPU (0.72; 0.30 to 1.14) and fall rates (0.57; 0.41 to 0.72). The introduction of an integrated EHR was associated with a reduction in the number of HAPUs but not in patient fall rates. Other factors, such as changes over time and hospital region, were also associated with variation in outcomes. The findings suggest that EHR impact on nursing care processes and outcomes is dependent on a number of factors that should be further explored.
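
    A generic segmented-regression (interrupted time series) sketch is shown below, with simulated monthly rates and level/trend change terms at an assumed go-live month; it is not the study's data or exact model.

```python
# Generic interrupted time series (segmented regression) sketch: model a
# monthly outcome with a baseline trend plus level and trend changes at an
# assumed EHR go-live point. Data are simulated for illustration.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n_months = 72
go_live = 36

t = np.arange(n_months)
post = (t >= go_live).astype(float)             # level change indicator
t_after = np.where(post == 1, t - go_live, 0)   # trend change after go-live

# Simulated HAPU rate: slow secular decline plus a drop at implementation.
rate = 5.0 - 0.02 * t - 0.7 * post - 0.01 * t_after + rng.normal(0, 0.2, n_months)

X = sm.add_constant(np.column_stack([t, post, t_after]))
fit = sm.OLS(rate, X).fit()
print(fit.params)        # [intercept, baseline trend, level change, trend change]
```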

  7. On Time Performance Pressure

    NASA Technical Reports Server (NTRS)

    Connell, Linda; Wichner, David; Jakey, Abegael

    2013-01-01

    Within many operations, the pressures for on-time performance are high. Each month, on-time statistics are reported to the Department of Transportation and made public. There is a natural tendency for employees under pressure to do their best to meet these objectives. As a result, pressure to get the job done within the allotted time may cause personnel to deviate from procedures and policies. Additionally, inadequate or unavailable resources may drive employees to work around standard processes that are seen as barriers. However, bypassing practices to enable on-time performance may affect more than the statistics. ASRS reports often highlight how on-time performance pressures can affect every workgroup striving to meet schedule. Reporters often provide in-depth insights into their experiences, which industry can use to identify and implement systemic fixes.

  8. Quality of Electronic Nursing Records: The Impact of Educational Interventions During a Hospital Accreditation Process.

    PubMed

    Nomura, Aline Tsuma Gaedke; Pruinelli, Lisiane; da Silva, Marcos Barragan; Lucena, Amália de Fátima; Almeida, Miriam de Abreu

    2018-03-01

    Hospital accreditation is a strategy for the pursuit of quality of care and safety for patients and professionals. Targeted educational interventions could help support this process. This study aimed to evaluate the quality of electronic nursing records during the hospital accreditation process. A retrospective study comparing 112 nursing records during the hospital accreditation process was conducted. Educational interventions were implemented, and records were evaluated preintervention and postintervention. Mann-Whitney and χ² tests were used for data analysis. Results showed that there was a significant improvement in the nursing documentation quality postintervention. When comparing records preintervention and postintervention, results showed a statistically significant difference (P < .001) between the two periods. The comparison between items showed that most scores were significant. Findings indicated that educational interventions performed by nurses led to a positive change that improved nursing documentation and, consequently, supported better care practices.

  9. Improvements in Cz silicon PV module manufacturing

    NASA Astrophysics Data System (ADS)

    King, Richard R.; Mitchell, Kim W.; Jester, Theresa L.

    1997-02-01

    Work focused on reducing the cost per watt of Cz Si photovoltaic modules under Phase I of Siemens Solar Industries' DOE/NREL PVMaT 4A subcontract is described. Module cost components are analyzed and solutions to high-cost items are discussed in terms of specific module designs. The approaches of using larger cells and modules to reduce per-part processing cost, and of minimizing yield loss are particularly leveraging. Yield components for various parts of the fabrication process and various types of defects are shown, and measurements of the force required to break wafers throughout the cell fabrication sequence are given. The most significant type of yield loss is mechanical breakage. The implementation of statistical process control on key manufacturing processes at Siemens Solar Industries is described. Module configurations prototyped during Phase I of this project and scheduled to begin production in Phase II have a projected cost per watt reduction of 19%.
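
    As a minimal illustration of the statistical process control mentioned above, the following sketch computes Shewhart X-bar/R control limits for a monitored measurement such as wafer breakage force; the data are simulated and A2 = 0.577 is the standard control-chart constant for subgroups of five.

```python
# Minimal Shewhart X-bar/R chart sketch for a monitored process measurement
# (e.g., wafer breakage force). Subgroups of size 5 are simulated.
import numpy as np

rng = np.random.default_rng(2)
subgroups = rng.normal(loc=12.0, scale=0.8, size=(25, 5))  # 25 subgroups of 5

xbar = subgroups.mean(axis=1)
ranges = subgroups.max(axis=1) - subgroups.min(axis=1)     # subgroup ranges

xbarbar, rbar = xbar.mean(), ranges.mean()
A2 = 0.577                                                 # constant for n = 5
ucl, lcl = xbarbar + A2 * rbar, xbarbar - A2 * rbar

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"center = {xbarbar:.2f}, UCL = {ucl:.2f}, LCL = {lcl:.2f}")
print("out-of-control subgroups:", out_of_control)
```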

  10. Description and testing of the Geo Data Portal: Data integration framework and Web processing services for environmental science collaboration

    USGS Publications Warehouse

    Blodgett, David L.; Booth, Nathaniel L.; Kunicki, Thomas C.; Walker, Jordan I.; Viger, Roland J.

    2011-01-01

    Interest in sharing interdisciplinary environmental modeling results and related data is increasing among scientists. The U.S. Geological Survey Geo Data Portal project enables data sharing by assembling open-standard Web services into an integrated data retrieval and analysis Web application design methodology that streamlines time-consuming and resource-intensive data management tasks. Data-serving Web services allow Web-based processing services to access Internet-available data sources. The Web processing services developed for the project create commonly needed derivatives of data in numerous formats. Coordinate reference system manipulation and spatial statistics calculation components implemented for the Web processing services were confirmed using ArcGIS 9.3.1, a geographic information science software package. Outcomes of the Geo Data Portal project support the rapid development of user interfaces for accessing and manipulating environmental data.

  11. Image processing with cellular nonlinear networks implemented on field-programmable gate arrays for real-time applications in nuclear fusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Palazzo, S.; Vagliasindi, G.; Arena, P.

    2010-08-15

    In the past years cameras have become increasingly common tools in scientific applications. They are now quite systematically used in magnetic confinement fusion, to the point that infrared imaging is starting to be used systematically for real-time machine protection in major devices. However, in order to guarantee that the control system can always react rapidly in case of critical situations, the time required for the processing of the images must be as predictable as possible. The approach described in this paper combines the new computational paradigm of cellular nonlinear networks (CNNs) with field-programmable gate arrays and has been tested in an application for the detection of hot spots on the plasma facing components in JET. The developed system is able to perform real-time hot spot recognition, by processing the image stream captured by JET wide angle infrared camera, with the guarantee that computational time is constant and deterministic. The statistical results obtained from a quite extensive set of examples show that this solution approximates very well an ad hoc serial software algorithm, with no false or missed alarms and an almost perfect overlapping of alarm intervals. The computational time can be reduced to a millisecond time scale for 8 bit 496x560-sized images. Moreover, in our implementation, the computational time, besides being deterministic, is practically independent of the number of iterations performed by the CNN - unlike software CNN implementations.

  12. 45 CFR 153.350 - Risk adjustment data validation standards.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... implementation of any risk adjustment software and ensure proper validation of a statistically valid sample of... respect to implementation of risk adjustment software or as a result of data validation conducted pursuant... implementation of risk adjustment software or data validation. ...

  13. Implementation of an Inpatient Pediatric Sepsis Identification Pathway.

    PubMed

    Bradshaw, Chanda; Goodman, Ilyssa; Rosenberg, Rebecca; Bandera, Christopher; Fierman, Arthur; Rudy, Bret

    2016-03-01

    Early identification and treatment of severe sepsis and septic shock improves outcomes. We sought to identify and evaluate children with possible sepsis on a pediatric medical/surgical unit through successful implementation of a sepsis identification pathway. The sepsis identification pathway, a vital sign screen and subsequent physician evaluation, was implemented in October 2013. Quality improvement interventions were used to improve physician and nursing adherence with the pathway. We reviewed charts of patients with positive screens on a monthly basis to assess for nursing recognition/physician notification, physician evaluation for sepsis, and subsequent physician diagnosis of sepsis and severe sepsis/septic shock. Adherence data were analyzed on a run chart and statistical process control p-chart. Nursing and physician pathway adherence of >80% was achieved over a 6-month period and sustained for the following 6 months. The direction of improvements met standard criteria for special causes. Over a 1-year period, there were 963 admissions to the unit. Positive screens occurred in 161 (16.7%) of these admissions and 38 (23.5%) of these had a physician diagnosis of sepsis, severe sepsis, or septic shock. One patient with neutropenia and septic shock had a negative sepsis screen due to lack of initial fever. Using quality improvement methodology, we successfully implemented a sepsis identification pathway on our pediatric unit. The pathway provided a standardized process to identify and evaluate children with possible sepsis requiring timely evaluation and treatment. Copyright © 2016 by the American Academy of Pediatrics.
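
    The p-chart idea used for adherence monitoring can be sketched as follows; the monthly denominators and adherent counts below are simulated, not the study's data.

```python
# Statistical process control p-chart sketch for monthly pathway adherence.
import numpy as np

rng = np.random.default_rng(9)
n_screened = rng.integers(8, 20, size=12)               # positive screens per month
adherent = rng.binomial(n_screened, 0.85)               # adherent evaluations

p = adherent / n_screened
p_bar = adherent.sum() / n_screened.sum()               # center line

# 3-sigma limits vary with each month's denominator.
sigma = np.sqrt(p_bar * (1 - p_bar) / n_screened)
ucl = np.minimum(p_bar + 3 * sigma, 1.0)
lcl = np.maximum(p_bar - 3 * sigma, 0.0)

for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "SPECIAL CAUSE" if (pi < lo or pi > hi) else ""
    print(f"month {month:2d}: p = {pi:.2f}  limits = ({lo:.2f}, {hi:.2f}) {flag}")
```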

  14. Voronoi Tessellation for reducing the processing time of correlation functions

    NASA Astrophysics Data System (ADS)

    Cárdenas-Montes, Miguel; Sevilla-Noarbe, Ignacio

    2018-01-01

    The increasing data volume in cosmology is motivating the search for new solutions to the difficulties associated with long processing times and the precision of calculations. This is especially true for several relevant statistics of the galaxy distribution of the Large Scale Structure of the Universe, namely the two- and three-point angular correlation functions. For these, the processing time has grown steeply with the size of the data sample. Beyond parallel implementations to overcome the barrier of processing time, space partitioning algorithms are necessary to reduce the computational load. These can delimit the elements involved in the correlation function estimation to those that can potentially contribute to the final result. In this work, Voronoi Tessellation is used to reduce the processing time of the two-point and three-point angular correlation functions. The results of this proof-of-concept show a significant reduction of the processing time when preprocessing the galaxy positions with Voronoi Tessellation.
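
    The role of space partitioning can be illustrated with the sketch below, which substitutes a k-d tree (scipy.spatial.cKDTree) for the paper's Voronoi Tessellation to restrict pair counting to the separations of interest; the toy positions and bins are invented.

```python
# Spatial partitioning to limit pair counting for a two-point correlation
# estimate. The paper uses Voronoi Tessellation; this sketch uses a k-d tree
# to make the same point: only pairs within the maximum separation of
# interest are ever visited.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(4)
points = rng.uniform(0, 1, size=(20000, 2))    # toy 2-D "galaxy" positions

bins = np.linspace(0.0, 0.05, 11)              # separation bins of interest
tree = cKDTree(points)

# Cumulative pair counts within each radius; differencing gives per-bin counts.
# (Each unordered pair is counted twice here, which is fine for a relative
# comparison in this sketch.)
cumulative = tree.count_neighbors(tree, bins)
dd = np.diff(cumulative)
print("pair counts per separation bin:", dd)
```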

  15. Hardware design and implementation of fast DOA estimation method based on multicore DSP

    NASA Astrophysics Data System (ADS)

    Guo, Rui; Zhao, Yingxiao; Zhang, Yue; Lin, Qianqiang; Chen, Zengping

    2016-10-01

    In this paper, we present a high-speed real-time signal processing hardware platform based on a multicore digital signal processor (DSP). The real-time signal processing platform shows several excellent characteristics, including high performance computing, low power consumption, large-capacity data storage, and high speed data transmission, which enable it to meet the constraints of real-time direction of arrival (DOA) estimation. To reduce the high computational complexity of the DOA estimation algorithm, a novel real-valued MUSIC estimator is used. The algorithm is decomposed into several independent steps, and the time consumption of each step is counted. Based on the statistics of the time consumption, we present a new parallel processing strategy to distribute the task of DOA estimation to different cores of the real-time signal processing hardware platform. Experimental results demonstrate that the high processing capability of the signal processing platform meets the constraint of real-time DOA estimation.
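
    For reference, a standard complex-valued MUSIC estimator on a simulated uniform linear array is sketched below; the paper's contribution is a real-valued MUSIC variant and its DSP parallelization, neither of which is reproduced here.

```python
# Standard MUSIC DOA estimation on a simulated half-wavelength uniform linear
# array. The array geometry, source angles, and noise level are illustrative.
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(6)
M, N = 8, 200                                  # sensors, snapshots
true_doas = np.deg2rad([-20.0, 35.0])

def steering(theta):
    # ULA steering vectors, shape (M, len(theta)).
    return np.exp(1j * np.pi * np.arange(M)[:, None] * np.sin(theta))

A = steering(true_doas)
S = (rng.standard_normal((2, N)) + 1j * rng.standard_normal((2, N))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N)))
X = A @ S + noise

R = X @ X.conj().T / N                         # sample covariance
w, V = np.linalg.eigh(R)                       # eigenvalues in ascending order
En = V[:, : M - 2]                             # noise subspace (2 sources assumed)

grid = np.deg2rad(np.linspace(-90, 90, 1801))
proj = np.abs(En.conj().T @ steering(grid)) ** 2
spectrum = 1.0 / proj.sum(axis=0)              # MUSIC pseudo-spectrum

peaks, _ = find_peaks(spectrum)
top = peaks[np.argsort(spectrum[peaks])[-2:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top])).round(1))
```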

  16. Implementing a Web-Based Decision Support System to Spatially and Statistically Analyze Ecological Conditions of the Sierra Nevada

    NASA Astrophysics Data System (ADS)

    Nguyen, A.; Mueller, C.; Brooks, A. N.; Kislik, E. A.; Baney, O. N.; Ramirez, C.; Schmidt, C.; Torres-Perez, J. L.

    2014-12-01

    The Sierra Nevada is experiencing changes in hydrologic regimes, such as decreases in snowmelt and peak runoff, which affect forest health and the availability of water resources. Currently, the USDA Forest Service Region 5 is undergoing Forest Plan revisions to include climate change impacts into mitigation and adaptation strategies. However, there are few processes in place to conduct quantitative assessments of forest conditions in relation to mountain hydrology, while easily and effectively delivering that information to forest managers. To assist the USDA Forest Service, this study is the final phase of a three-term project to create a Decision Support System (DSS) to allow ease of access to historical and forecasted hydrologic, climatic, and terrestrial conditions for the entire Sierra Nevada. This data is featured within three components of the DSS: the Mapping Viewer, Statistical Analysis Portal, and Geospatial Data Gateway. Utilizing ArcGIS Online, the Sierra DSS Mapping Viewer enables users to visually analyze and locate areas of interest. Once the areas of interest are targeted, the Statistical Analysis Portal provides subbasin level statistics for each variable over time by utilizing a recently developed web-based data analysis and visualization tool called Plotly. This tool allows users to generate graphs and conduct statistical analyses for the Sierra Nevada without the need to download the dataset of interest. For more comprehensive analysis, users are also able to download datasets via the Geospatial Data Gateway. The third phase of this project focused on Python-based data processing, the adaptation of the multiple capabilities of ArcGIS Online and Plotly, and the integration of the three Sierra DSS components within a website designed specifically for the USDA Forest Service.

  17. Automatic tissue characterization from ultrasound imagery

    NASA Astrophysics Data System (ADS)

    Kadah, Yasser M.; Farag, Aly A.; Youssef, Abou-Bakr M.; Badawi, Ahmed M.

    1993-08-01

    In this work, feature extraction algorithms are proposed to extract the tissue characterization parameters from liver images. Then the resulting parameter set is further processed to obtain the minimum number of parameters representing the most discriminating pattern space for classification. This preprocessing step was applied to over 120 pathology-investigated cases to obtain the learning data for designing the classifier. The extracted features are divided into independent training and test sets and are used to construct both statistical and neural classifiers. The optimal criteria for these classifiers are set to have minimum error, ease of implementation and learning, and the flexibility for future modifications. Various algorithms for implementing various classification techniques are presented and tested on the data. The best performance was obtained using a single layer tensor model functional link network. Also, the voting k-nearest neighbor classifier provided comparably good diagnostic rates.
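
    A minimal sketch of the voting k-nearest-neighbor classification with an independent train/test split is given below; the features and labels are simulated, and the best-performing functional link network from the study is not reproduced.

```python
# Minimal voting k-nearest-neighbor classification over extracted texture
# features, with an independent train/test split. Features are simulated.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(8)
n_per_class = 60

# Two invented "tissue" classes in a 4-feature characterization space.
normal = rng.normal(loc=[0.2, 1.0, 3.0, 0.5], scale=0.3, size=(n_per_class, 4))
pathologic = rng.normal(loc=[0.6, 1.4, 2.4, 0.9], scale=0.3, size=(n_per_class, 4))

X = np.vstack([normal, pathologic])
y = np.array([0] * n_per_class + [1] * n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = KNeighborsClassifier(n_neighbors=5)        # voting k-NN classifier
clf.fit(X_train, y_train)
print("diagnostic accuracy on held-out cases:", clf.score(X_test, y_test))
```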

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spycher, Nicolas; Peiffer, Loic; Finsterle, Stefan

    GeoT implements the multicomponent geothermometry method developed by Reed and Spycher (1984, Geochim. Cosmochim. Acta 46 513–528) into a stand-alone computer program, to ease the application of this method and to improve the prediction of geothermal reservoir temperatures using full and integrated chemical analyses of geothermal fluids. Reservoir temperatures are estimated from statistical analyses of mineral saturation indices computed as a function of temperature. The reconstruction of the deep geothermal fluid compositions, and geothermometry computations, are all implemented into the same computer program, allowing unknown or poorly constrained input parameters to be estimated by numerical optimization using existing parameter estimation software, such as iTOUGH2, PEST, or UCODE. This integrated geothermometry approach presents advantages over classical geothermometers for fluids that have not fully equilibrated with reservoir minerals and/or that have been subject to processes such as dilution and gas loss.

  19. Evaluating a policing strategy intended to disrupt an illicit street-level drug market.

    PubMed

    Corsaro, Nicholas; Brunson, Rod K; McGarrell, Edmund F

    2010-12-01

    The authors examined a strategic policing initiative that was implemented in a high-crime Nashville, Tennessee, neighborhood by utilizing a mixed-methodological evaluation approach in order to provide (a) a descriptive process assessment of program fidelity; (b) an interrupted time-series analysis relying upon generalized linear models; and (c) in-depth resident interviews. Results revealed that the initiative corresponded with a statistically significant reduction in drug and narcotics incidents as well as perceived changes in neighborhood disorder within the target community. There was less-clear evidence, however, of a significant impact on other outcomes examined. The finding that an intensive crime prevention strategy corresponded with a reduction in specific forms of neighborhood crime illustrates the complex considerations that law enforcement officials face when deciding to implement this type of crime prevention initiative.

  20. Efforts to improve international migration statistics: a historical perspective.

    PubMed

    Kraly, E P; Gnanasekaran, K S

    1987-01-01

    During the past decade, the international statistical community has made several efforts to develop standards for the definition, collection and publication of statistics on international migration. This article surveys the history of official initiatives to standardize international migration statistics by reviewing the recommendations of the International Statistical Institute, International Labor Organization, and the UN, and reports a recently proposed agenda for moving toward comparability among national statistical systems. Heightening awareness of the benefits of exchange and creating motivation to implement international standards requires a 3-pronged effort from the international statistical community. 1st, it is essential to continue discussion about the significance of improvement, specifically standardization, of international migration statistics. The move from theory to practice in this area requires ongoing focus by migration statisticians so that conformity to international standards itself becomes a criterion by which national statistical practices are examined and assessed. 2nd, the countries should be provided with technical documentation to support and facilitate the implementation of the recommended statistical systems. Documentation should be developed with an understanding that conformity to international standards for migration and travel statistics must be achieved within existing national statistical programs. 3rd, the call for statistical research in this area requires more efforts by the community of migration statisticians, beginning with the mobilization of bilateral and multilateral resources to undertake the preceding list of activities.

  1. Towards interoperable and reproducible QSAR analyses: Exchange of datasets.

    PubMed

    Spjuth, Ola; Willighagen, Egon L; Guha, Rajarshi; Eklund, Martin; Wikberg, Jarl Es

    2010-06-30

    QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community.

  2. Towards interoperable and reproducible QSAR analyses: Exchange of datasets

    PubMed Central

    2010-01-01

    Background: QSAR is a widely used method to relate chemical structures to responses or properties based on experimental observations. Much effort has been made to evaluate and validate the statistical modeling in QSAR, but these analyses treat the dataset as fixed. An overlooked but highly important issue is the validation of the setup of the dataset, which comprises addition of chemical structures as well as selection of descriptors and software implementations prior to calculations. This process is hampered by the lack of standards and exchange formats in the field, making it virtually impossible to reproduce and validate analyses and drastically constraining collaborations and re-use of data. Results: We present a step towards standardizing QSAR analyses by defining interoperable and reproducible QSAR datasets, consisting of an open XML format (QSAR-ML) which builds on an open and extensible descriptor ontology. The ontology provides an extensible way of uniquely defining descriptors for use in QSAR experiments, and the exchange format supports multiple versioned implementations of these descriptors. Hence, a dataset described by QSAR-ML makes its setup completely reproducible. We also provide a reference implementation as a set of plugins for Bioclipse which simplifies setup of QSAR datasets, and allows for exporting in QSAR-ML as well as old-fashioned CSV formats. The implementation facilitates addition of new descriptor implementations from locally installed software and remote Web services; the latter is demonstrated with REST and XMPP Web services. Conclusions: Standardized QSAR datasets open up new ways to store, query, and exchange data for subsequent analyses. QSAR-ML supports completely reproducible creation of datasets, solving the problems of defining which software components were used and their versions, and the descriptor ontology eliminates confusions regarding descriptors by defining them crisply. This makes it easy to join, extend, and combine datasets and hence work collectively, but also allows for analyzing the effect descriptors have on the statistical model's performance. The presented Bioclipse plugins equip scientists with graphical tools that make QSAR-ML easily accessible for the community. PMID:20591161

  3. Primarily Statistics: Developing an Introductory Statistics Course for Pre-Service Elementary Teachers

    ERIC Educational Resources Information Center

    Green, Jennifer L.; Blankenship, Erin E.

    2013-01-01

    We developed an introductory statistics course for pre-service elementary teachers. In this paper, we describe the goals and structure of the course, as well as the assessments we implemented. Additionally, we use example course work to demonstrate pre-service teachers' progress both in learning statistics and as novice teachers. Overall, the…

  4. Distinguishing Positive Selection From Neutral Evolution: Boosting the Performance of Summary Statistics

    PubMed Central

    Lin, Kao; Li, Haipeng; Schlötterer, Christian; Futschik, Andreas

    2011-01-01

    Summary statistics are widely used in population genetics, but they suffer from the drawback that no simple sufficient summary statistic exists, which captures all information required to distinguish different evolutionary hypotheses. Here, we apply boosting, a recent statistical method that combines simple classification rules to maximize their joint predictive performance. We show that our implementation of boosting has a high power to detect selective sweeps. Demographic events, such as bottlenecks, do not result in a large excess of false positives. A comparison shows that our boosting implementation performs well relative to other neutrality tests. Furthermore, we evaluated the relative contribution of different summary statistics to the identification of selection and found that for recent sweeps integrated haplotype homozygosity is very informative whereas older sweeps are better detected by Tajima's π. Overall, Watterson's θ was found to contribute the most information for distinguishing between bottlenecks and selection. PMID:21041556
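
    A rough sketch of boosting over summary statistics is shown below using scikit-learn's gradient boosting; the simulated features merely stand in for statistics such as haplotype homozygosity (they are not coalescent simulations), and the feature-importance readout only loosely mirrors the contribution analysis described.

```python
# Sketch of boosting over summary statistics to separate "selective sweep"
# from "neutral" scenarios. Feature values are simulated stand-ins.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(12)
n = 2000

# Neutral simulations: summary statistics centered near zero.
neutral = rng.normal(0.0, 1.0, size=(n, 3))
# Sweep simulations: shifted means mimic reduced diversity / extended haplotypes.
sweep = rng.normal([-1.0, 1.2, 0.8], 1.0, size=(n, 3))

X = np.vstack([neutral, sweep])
y = np.array([0] * n + [1] * n)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = GradientBoostingClassifier(n_estimators=200, max_depth=2)
clf.fit(X_tr, y_tr)
print("held-out accuracy:", round(clf.score(X_te, y_te), 3))
print("relative contribution of each statistic:", clf.feature_importances_.round(3))
```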

  5. Designing Summer Research Experiences for Teachers and Students That Promote Classroom Science Inquiry Projects and Produce Research Results

    NASA Astrophysics Data System (ADS)

    George, L. A.; Parra, J.; Rao, M.; Offerman, L.

    2007-12-01

    Research experiences for science teachers are an important mechanism for increasing classroom teachers' science content knowledge and facility with "real world" research processes. We have developed and implemented a summer scientific research and education workshop model for high school teachers and students which promotes classroom science inquiry projects and produces important research results supporting our overarching scientific agenda. The summer training includes development of a scientific research framework, design and implementation of preliminary studies, extensive field research and training in and access to instruments, measurement techniques and statistical tools. The development and writing of scientific papers is used to reinforce the scientific research process. Using these skills, participants collaborate with scientists to produce research quality data and analysis. Following the summer experience, teachers report increased incorporation of research inquiry in their classrooms and student participation in science fair projects. This workshop format was developed for an NSF Biocomplexity Research program focused on the interaction of urban climates, air quality and human response and can be easily adapted for other scientific research projects.

  6. Radiation Transport in Random Media With Large Fluctuations

    NASA Astrophysics Data System (ADS)

    Olson, Aaron; Prinja, Anil; Franke, Brian

    2017-09-01

    Neutral particle transport in media exhibiting large and complex material property spatial variation is modeled by representing cross sections as lognormal random functions of space, generated through a nonlinear memory-less transformation of a Gaussian process with covariance uniquely determined by the covariance of the cross section. A Karhunen-Loève decomposition of the Gaussian process is implemented to efficiently generate realizations of the random cross sections, and Woodcock Monte Carlo is used to transport particles on each realization and generate benchmark solutions for the mean and variance of the particle flux as well as probability densities of the particle reflectance and transmittance. A computationally efficient stochastic collocation method is implemented to directly compute the statistical moments such as the mean and variance, while a polynomial chaos expansion in conjunction with stochastic collocation provides a convenient surrogate model that also produces probability densities of output quantities of interest. Extensive numerical testing demonstrates that use of stochastic reduced-order modeling provides an accurate and cost-effective alternative to random sampling for particle transport in random media.
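
    A discrete Karhunen-Loève sketch follows: the exponential covariance of the underlying Gaussian process is eigendecomposed on a 1-D grid, a truncated realization is generated, and exponentiation yields a lognormal cross section. The grid size, correlation length, and truncation order are assumptions, and no transport is performed.

```python
# Discrete Karhunen-Loeve sketch: eigendecompose an exponential covariance on a
# 1-D grid, build a truncated Gaussian-process realization, and exponentiate to
# obtain a lognormal cross-section realization.
import numpy as np

n, L, corr_len, sigma = 200, 10.0, 1.0, 0.5
x = np.linspace(0, L, n)

# Exponential covariance of the underlying Gaussian process.
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

eigval, eigvec = np.linalg.eigh(C)
idx = np.argsort(eigval)[::-1]                 # sort modes by decreasing energy
eigval, eigvec = eigval[idx], eigvec[:, idx]

k = 20                                          # truncation order (assumed)
rng = np.random.default_rng(0)
xi = rng.standard_normal(k)

mean_log = 0.0                                  # mean of log cross section (assumed)
g = mean_log + eigvec[:, :k] @ (np.sqrt(eigval[:k]) * xi)
cross_section = np.exp(g)                       # one lognormal realization

print("captured variance fraction:", round(eigval[:k].sum() / eigval.sum(), 3))
print("min/max cross section:",
      cross_section.min().round(3), cross_section.max().round(3))
```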

  7. Impact of a Virtual Clinic in a Paediatric Cardiology Network on Northeast Brazil.

    PubMed

    de Araújo, Juliana Sousa Soares; Dias Filho, Adalberto Vieira; Silva Gomes, Renata Grigório; Regis, Cláudio Teixeira; Rodrigues, Klecida Nunes; Siqueira, Nicoly Negreiros; Albuquerque, Fernanda Cruz de Lira; Mourato, Felipe Alves; Mattos, Sandra da Silva

    2015-01-01

    Introduction. Congenital heart diseases (CHD) affect approximately 1% of live births and are an important cause of neonatal morbidity and mortality. Despite that, there is a shortage of paediatric cardiologists in Brazil, mainly in the northern and northeastern regions. In this context, the implementation of virtual outpatient clinics with the aid of different telemedicine resources may help in the care of children with heart defects. Methods. Patients under 18 years of age treated in virtual outpatient clinics between January 2013 and May 2014 were selected. They were divided into 2 groups: those who had and those who had not undergone a screening process for CHD in the neonatal period. Clinical and demographic characteristics were collected for further statistical analysis. Results. A total of 653 children and teenagers were treated in the virtual outpatient clinics. From these, 229 had undergone a neonatal screening process. Fewer abnormalities were observed on the physical examination of the screened patients. Conclusion. The implementation of pediatric cardiology virtual outpatient clinics can have a positive impact on the care provided in areas that lack skilled professionals.

  8. Reliability Centered Maintenance - Methodologies

    NASA Technical Reports Server (NTRS)

    Kammerer, Catherine C.

    2009-01-01

    Journal article about Reliability Centered Maintenance (RCM) methodologies used by United Space Alliance, LLC (USA) in support of the Space Shuttle Program at Kennedy Space Center. The USA Reliability Centered Maintenance program differs from traditional RCM programs because various methodologies are utilized to take advantage of their respective strengths for each application. Based on operational experience, USA has customized the traditional RCM methodology into a streamlined lean logic path and has implemented the use of statistical tools to drive the process. USA RCM has integrated many of the L6S tools into both RCM methodologies. The tools utilized in the Measure, Analyze, and Improve phases of a Lean Six Sigma project lend themselves to application in the RCM process. All USA RCM methodologies meet the requirements defined in SAE JA 1011, Evaluation Criteria for Reliability-Centered Maintenance (RCM) Processes. The proposed article explores these methodologies.

  9. RankProd 2.0: a refactored bioconductor package for detecting differentially expressed features in molecular profiling datasets.

    PubMed

    Del Carratore, Francesco; Jankevics, Andris; Eisinga, Rob; Heskes, Tom; Hong, Fangxin; Breitling, Rainer

    2017-09-01

    The Rank Product (RP) is a statistical technique widely used to detect differentially expressed features in molecular profiling experiments such as transcriptomics, metabolomics and proteomics studies. An implementation of the RP and the closely related Rank Sum (RS) statistics has been available in the RankProd Bioconductor package for several years. However, several recent advances in the understanding of the statistical foundations of the method have made a complete refactoring of the existing package desirable. We implemented a completely refactored version of the RankProd package, which provides a more principled implementation of the statistics for unpaired datasets. Moreover, the permutation-based P-value estimation methods have been replaced by exact methods, providing faster and more accurate results. RankProd 2.0 is available at Bioconductor (https://www.bioconductor.org/packages/devel/bioc/html/RankProd.html) and as part of the mzMatch pipeline (http://www.mzmatch.sourceforge.net). rainer.breitling@manchester.ac.uk. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.
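
    As a point of reference for the statistic itself, the sketch below computes the rank product (the geometric mean of within-replicate ranks) with a simple permutation-based p-value in Python. It only illustrates the statistic; it is not the RankProd 2.0 code, which relies on exact methods rather than permutation.

      import numpy as np

      def rank_product(fold_changes):
          """fold_changes: (n_genes, n_replicates); smaller rank = more strongly up-regulated."""
          # Rank genes within each replicate (rank 1 = largest fold change).
          ranks = (-fold_changes).argsort(axis=0).argsort(axis=0) + 1
          k = fold_changes.shape[1]
          return np.prod(ranks.astype(float), axis=1) ** (1.0 / k)   # geometric mean of ranks

      def rp_pvalues(fold_changes, n_perm=1000, seed=0):
          """Permutation p-values: how often a shuffled ranking gives an RP this small."""
          rng = np.random.default_rng(seed)
          observed = rank_product(fold_changes)
          exceed = np.zeros(fold_changes.shape[0])
          for _ in range(n_perm):
              permuted = rng.permuted(fold_changes, axis=0)   # shuffle genes within each replicate
              exceed += rank_product(permuted) <= observed
          return (exceed + 1) / (n_perm + 1)

      # Toy data: 100 genes, 4 replicates, first 5 genes truly up-regulated.
      rng = np.random.default_rng(1)
      fc = rng.normal(0, 1, size=(100, 4))
      fc[:5] += 2.0
      print(rp_pvalues(fc)[:10])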

  10. Plan Recognition using Statistical Relational Models

    DTIC Science & Technology

    2014-08-25

    arguments. Section 4 describes several variants of MLNs for plan recognition. All MLN models were implemented using Alchemy (Kok et al., 2010), an... For both MLN approaches, we used MC-SAT (Poon and Domingos, 2006) as implemented in the Alchemy system on both Monroe and Linux. Evaluation Metric We... Singla P, Poon H, Lowd D, Wang J, Nath A, Domingos P. The Alchemy System for Statistical Relational AI. Technical Report; Department of Computer Science

  11. Training of lay health educators to implement an evidence-based behavioral weight loss intervention in rural senior centers.

    PubMed

    Krukowski, Rebecca A; Lensing, Shelly; Love, Sharhonda; Prewitt, T Elaine; Adams, Becky; Cornell, Carol E; Felix, Holly C; West, Delia

    2013-02-01

    Lay health educators (LHEs) offer great promise for facilitating the translation of evidence-based health promotion programs to underserved areas; yet, there is little guidance on how to train LHEs to implement these programs, particularly in the crucial area of empirically validated obesity interventions. This article describes experiences in recruiting, training, and retaining 20 LHEs who delivered a 12-month evidence-based behavioral lifestyle intervention (based on the Diabetes Prevention Program) in senior centers across a rural state. A mixed-methods approach was used which incorporated collecting the following: quantitative data on sociodemographic characteristics of LHEs; process data related to training, recruitment, intervention implementation, and retention of LHEs; and a quantitative program evaluation questionnaire, which was supplemented by a qualitative program evaluation questionnaire. Descriptive statistics were calculated for quantitative data, and qualitative data were analyzed using content analysis. The training program was well received, and the LHEs effectively recruited participants and implemented the lifestyle intervention in senior centers following a structured protocol. The methods used in this study produced excellent long-term retention of LHEs and good adherence to intervention protocol, and as such may provide a model that could be effective for others seeking to implement LHE-delivered health promotion programs.

  12. Novel Application of a Reverse Triage Protocol Providing Increased Access to Care in an Outpatient, Primary Care Clinic Setting.

    PubMed

    Sacino, Amanda N; Shuster, Jonathan J; Nowicki, Kamil; Carek, Peter J; Wegman, Martin P; Listhaus, Alyson; Gibney, Joseph M; Chang, Ku-Lang

    2016-02-01

    As the number of patients with access to care increases, outpatient clinics will need to implement innovative strategies to maintain or enhance clinic efficiency. One viable alternative involves reverse triage. A reverse triage protocol was implemented during a student-run free clinic. Each patient's chief complaint(s) were obtained at the beginning of the clinic session and ranked by increasing complexity. "Complexity" was defined as the subjective amount of time required to provide a full, thorough evaluation of a patient. Less complex cases were prioritized first since they could be expedited through clinic processing and allow for more time and resources to be dedicated to complex cases. Descriptive statistics were used to characterize and summarize the data obtained. Categorical variables were analyzed using chi-square. A time series analysis of the outcome versus centered time in weeks was also conducted. The average number of patients seen per clinic session increased by 35% (9.5 versus 12.8) from pre-implementation of the reverse triage protocol to 6 months after the implementation of the protocol. The implementation of a reverse triage in an outpatient setting significantly increased clinic efficiency as noted by a significant increase in the number of patients seen during a clinic session.

  13. Quantitative Thermochemical Measurements in High-Pressure Gaseous Combustion

    NASA Technical Reports Server (NTRS)

    Kojima, Jun J.; Fischer, David G.

    2012-01-01

    We present our strategic experiment and thermochemical analyses of combustion flow using subframe burst gating (SBG) Raman spectroscopy. This unconventional laser diagnostic technique has a promising ability to enhance the accuracy of quantitative scalar measurements in a point-wise single-shot fashion. In the presentation, we briefly describe an experimental methodology that generates a transferable calibration standard for the routine implementation of the diagnostics in hydrocarbon flames. The diagnostic technology was applied to simultaneous measurements of temperature and chemical species in a swirl-stabilized turbulent flame with gaseous methane fuel at elevated pressure (17 atm). Statistical analyses of the space-/time-resolved thermochemical data provide insights into the nature of the mixing process and its impact on the subsequent combustion process in the model combustor.

  14. Customized Molecular Phenotyping by Quantitative Gene Expression and Pattern Recognition Analysis

    PubMed Central

    Akilesh, Shreeram; Shaffer, Daniel J.; Roopenian, Derry

    2003-01-01

    Description of the molecular phenotypes of pathobiological processes in vivo is a pressing need in genomic biology. We have implemented a high-throughput real-time PCR strategy to establish quantitative expression profiles of a customized set of target genes. It enables rapid, reproducible data acquisition from limited quantities of RNA, permitting serial sampling of mouse blood during disease progression. We developed an easy-to-use statistical algorithm, Global Pattern Recognition, to readily identify genes whose expression has changed significantly from healthy baseline profiles. This approach provides unique molecular signatures for rheumatoid arthritis, systemic lupus erythematosus, and graft versus host disease, and can also be applied to defining the molecular phenotype of a variety of other normal and pathological processes. PMID:12840047

  15. Evaluating the Process of Generating a Clinical Trial Protocol

    PubMed Central

    Franciosi, Lui G.; Butterfield, Noam N.; MacLeod, Bernard A.

    2002-01-01

    The research protocol is the principal document in the conduct of a clinical trial. Its generation requires knowledge about the research problem, the potential experimental confounders, and the relevant Good Clinical Practices for conducting the trial. However, such information is not always available to authors during the writing process. A checklist of over 80 items has been developed to better understand the considerations made by authors in generating a protocol. It is based on the most cited requirements for designing and implementing a randomised controlled trial. Items are categorised according to the trial's research question, experimental design, statistics, ethics, and standard operating procedures. This quality assessment tool evaluates the extent to which a generated protocol deviates from the best-planned clinical trial.

  16. Hospital implementation of health information technology and quality of care: are they related?

    PubMed

    Restuccia, Joseph D; Cohen, Alan B; Horwitt, Jedediah N; Shwartz, Michael

    2012-09-27

    Recently, there has been considerable effort to promote the use of health information technology (HIT) in order to improve health care quality. However, relatively little is known about the extent to which HIT implementation is associated with hospital patient care quality. We undertook this study to determine the association of various HITs with: hospital quality improvement (QI) practices and strategies; adherence to process of care measures; risk-adjusted inpatient mortality; patient satisfaction; and assessment of patient care quality by hospital quality managers and front-line clinicians. We conducted surveys of quality managers and front-line clinicians (physicians and nurses) in 470 short-term, general hospitals to obtain data on hospitals' extent of HIT implementation, QI practices and strategies, assessments of quality performance, commitment to quality, and sufficiency of resources for QI. Of the 470 hospitals, 401 submitted complete data necessary for analysis. We also developed measures of hospital performance from several publicly available data sources: Hospital Compare adherence to process of care measures; the Medicare Provider Analysis and Review (MEDPAR) file; and the Hospital Consumer Assessment of Healthcare Providers and Systems (HCAHPS®) survey. We used Poisson regression analysis to examine the association between HIT implementation and QI practices and strategies, and general linear models to examine the relationship between HIT implementation and hospital performance measures. Controlling for potential confounders, we found that hospitals with high levels of HIT implementation engaged in a statistically significant greater number of QI practices and strategies, and had significantly better performance on mortality rates, patient satisfaction measures, and assessments of patient care quality by hospital quality managers; there was weaker evidence of higher assessments of patient care quality by front-line clinicians. Hospital implementation of HIT was positively associated with activities intended to improve patient care quality and with higher performance on four of six performance measures.
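
    A minimal sketch of the type of Poisson regression analysis described above, using simulated data: the count of QI practices and strategies is regressed on an HIT implementation score with one confounder. All variable names and values here are hypothetical, not the study's dataset.

      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(0)
      n = 401                                        # hospitals with complete data
      hit_score = rng.uniform(0, 10, n)              # extent of HIT implementation (hypothetical scale)
      bed_size = rng.normal(250, 75, n)              # a potential confounder (hypothetical)
      lam = np.exp(0.8 + 0.08 * hit_score + 0.001 * bed_size)
      n_qi_practices = rng.poisson(lam)              # simulated outcome count

      X = sm.add_constant(np.column_stack([hit_score, bed_size]))
      model = sm.GLM(n_qi_practices, X, family=sm.families.Poisson()).fit()
      print(model.summary())
      # exp(coefficient) on the HIT score is the multiplicative change in the expected
      # number of QI practices per one-unit increase in HIT implementation.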

  17. Fast semivariogram computation using FPGA architectures

    NASA Astrophysics Data System (ADS)

    Lagadapati, Yamuna; Shirvaikar, Mukul; Dong, Xuanliang

    2015-02-01

    The semivariogram is a statistical measure of the spatial distribution of data and is based on Markov Random Fields (MRFs). Semivariogram analysis is a computationally intensive algorithm that has typically seen applications in the geosciences and remote sensing areas. Recently, applications in the area of medical imaging have been investigated, resulting in the need for efficient real-time implementation of the algorithm. The semivariogram is a plot of semivariances for different lag distances between pixels. A semivariance, γ(h), is defined as half of the expected squared difference of pixel values between any two data locations with a lag distance of h. Due to the need to examine each pair of pixels in the image or sub-image being processed, the base algorithm complexity for an image window with n pixels is O(n²). Field Programmable Gate Arrays (FPGAs) are an attractive solution for such demanding applications due to their parallel processing capability. FPGAs also tend to operate at relatively modest clock rates measured in a few hundreds of megahertz, but they can perform tens of thousands of calculations per clock cycle while operating in the low range of power. This paper presents a technique for the fast computation of the semivariogram using two custom FPGA architectures. The design consists of several modules dedicated to the constituent computational tasks. A modular architecture approach is chosen to allow for replication of processing units. This allows for high throughput due to concurrent processing of pixel pairs. The current implementation is focused on isotropic semivariogram computations only. Anisotropic semivariogram implementation is anticipated to be an extension of the current architecture, ostensibly based on refinements to the current modules. The algorithm is benchmarked using VHDL on a Xilinx XUPV5-LX110T development kit, which utilizes the Virtex5 FPGA. Medical image data from MRI scans are utilized for the experiments. Computational speedup is measured with respect to a Matlab implementation on a personal computer with an Intel i7 multi-core processor. Preliminary simulation results indicate that a significant advantage in speed can be attained by the architectures, making the algorithm viable for implementation in medical devices.
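
    For reference, the sketch below computes an isotropic empirical semivariogram for a small image window on the CPU with NumPy; the FPGA architectures described above parallelize the same O(n²) pixel-pair accumulation. The bin width and maximum lag are illustrative choices.

      import numpy as np

      def semivariogram(image, max_lag, bin_width=1.0):
          """gamma(h) = 0.5 * E[(z_i - z_j)^2] over pixel pairs at lag distance ~h."""
          rows, cols = np.indices(image.shape)
          coords = np.column_stack([rows.ravel(), cols.ravel()]).astype(float)
          values = image.ravel().astype(float)

          # All pixel pairs (O(n^2), as in the base algorithm described above).
          d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
          sq_diff = (values[:, None] - values[None, :]) ** 2

          iu = np.triu_indices_from(d, k=1)            # each unordered pair counted once
          lags, sq_diff = d[iu], sq_diff[iu]

          bins = np.arange(bin_width, max_lag + bin_width, bin_width)
          gamma = np.empty(len(bins))
          for k, h in enumerate(bins):
              mask = (lags > h - bin_width) & (lags <= h)
              gamma[k] = 0.5 * sq_diff[mask].mean() if mask.any() else np.nan
          return bins, gamma

      # Example on a small synthetic window (keep sizes modest: memory grows as n^2).
      img = np.random.default_rng(0).normal(size=(32, 32))
      h, g = semivariogram(img, max_lag=10)
      print(np.round(g, 3))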

  18. Processing Functional Near Infrared Spectroscopy Signal with a Kalman Filter to Assess Working Memory during Simulated Flight.

    PubMed

    Durantin, Gautier; Scannella, Sébastien; Gateau, Thibault; Delorme, Arnaud; Dehais, Frédéric

    2015-01-01

    Working memory (WM) is a key executive function for operating aircraft, especially when pilots have to recall series of air traffic control instructions. There is a need to implement tools to monitor WM as its limitation may jeopardize flight safety. An innovative way to address this issue is to adopt a Neuroergonomics approach that merges knowledge and methods from Human Factors, System Engineering, and Neuroscience. A challenge of great importance for Neuroergonomics is to implement efficient brain imaging techniques to measure the brain at work and to design Brain Computer Interfaces (BCI). We used functional near infrared spectroscopy as it has already been successfully tested to measure WM capacity in complex environments with air traffic controllers (ATC), pilots, or unmanned vehicle operators. However, the extraction of relevant features from the raw signal in ecological environments is still a critical issue due to the complexity of implementing real-time signal processing techniques without a priori knowledge. We proposed to implement the Kalman filtering approach, a signal processing technique that is efficient when the dynamics of the signal can be modeled. We based our approach on the Boynton model of the hemodynamic response. We conducted a first experiment with nine participants involving a basic WM task to estimate the noise covariances of the Kalman filter. We then conducted a more ecological experiment in our flight simulator with 18 pilots who interacted with ATC instructions (two levels of difficulty). The data were processed with the same Kalman filter settings implemented in the first experiment. This filter was benchmarked with a classical pass-band IIR filter and a Moving Average Convergence Divergence (MACD) filter. Statistical analysis revealed that the Kalman filter was the most efficient at separating the two levels of load, by increasing the observed effect size in prefrontal areas involved in WM. In addition, the use of a Kalman filter increased the performance of the classification of WM levels based on brain signal. The results suggest that the Kalman filter is a suitable approach for real-time improvement of near infrared spectroscopy signal in ecological situations and the development of BCI.
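
    As a simplified illustration of the filtering step, the sketch below applies a scalar Kalman filter with a random-walk state model to a noisy, slowly varying signal. This is not the Boynton-model-based filter used in the paper, and the noise covariances q and r are placeholders that would normally be estimated from calibration data, as the authors did in their first experiment.

      import numpy as np

      def kalman_smooth(y, q=1e-4, r=1e-1, x0=0.0, p0=1.0):
          """One-dimensional Kalman filter: x_k = x_{k-1} + w_k, y_k = x_k + v_k."""
          x, p = x0, p0
          out = np.empty_like(y, dtype=float)
          for k, yk in enumerate(y):
              # Predict (random-walk state transition).
              p = p + q
              # Update with the new observation.
              kgain = p / (p + r)
              x = x + kgain * (yk - x)
              p = (1.0 - kgain) * p
              out[k] = x
          return out

      # Toy example: a slow "activation" trend buried in measurement noise.
      t = np.linspace(0, 60, 600)                      # 60 s at 10 Hz
      truth = 0.5 * (1 - np.cos(2 * np.pi * t / 60))   # slow hemodynamic-like trend
      y = truth + np.random.default_rng(0).normal(0, 0.3, t.size)
      print(np.round(kalman_smooth(y)[:10], 3))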

  19. Processing Functional Near Infrared Spectroscopy Signal with a Kalman Filter to Assess Working Memory during Simulated Flight

    PubMed Central

    Durantin, Gautier; Scannella, Sébastien; Gateau, Thibault; Delorme, Arnaud; Dehais, Frédéric

    2016-01-01

    Working memory (WM) is a key executive function for operating aircraft, especially when pilots have to recall series of air traffic control instructions. There is a need to implement tools to monitor WM as its limitation may jeopardize flight safety. An innovative way to address this issue is to adopt a Neuroergonomics approach that merges knowledge and methods from Human Factors, System Engineering, and Neuroscience. A challenge of great importance for Neuroergonomics is to implement efficient brain imaging techniques to measure the brain at work and to design Brain Computer Interfaces (BCI). We used functional near infrared spectroscopy as it has already been successfully tested to measure WM capacity in complex environments with air traffic controllers (ATC), pilots, or unmanned vehicle operators. However, the extraction of relevant features from the raw signal in ecological environments is still a critical issue due to the complexity of implementing real-time signal processing techniques without a priori knowledge. We proposed to implement the Kalman filtering approach, a signal processing technique that is efficient when the dynamics of the signal can be modeled. We based our approach on the Boynton model of the hemodynamic response. We conducted a first experiment with nine participants involving a basic WM task to estimate the noise covariances of the Kalman filter. We then conducted a more ecological experiment in our flight simulator with 18 pilots who interacted with ATC instructions (two levels of difficulty). The data were processed with the same Kalman filter settings implemented in the first experiment. This filter was benchmarked with a classical pass-band IIR filter and a Moving Average Convergence Divergence (MACD) filter. Statistical analysis revealed that the Kalman filter was the most efficient at separating the two levels of load, by increasing the observed effect size in prefrontal areas involved in WM. In addition, the use of a Kalman filter increased the performance of the classification of WM levels based on brain signal. The results suggest that the Kalman filter is a suitable approach for real-time improvement of near infrared spectroscopy signal in ecological situations and the development of BCI. PMID:26834607

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wurtz, R.; Kaplan, A.

    Pulse shape discrimination (PSD) is a variety of statistical classifier. Fully-realized statistical classifiers rely on a comprehensive set of tools for designing, building, and implementing. PSD advances rely on improvements to the implemented algorithm, and such improvements can draw on conventional statistical classifier or machine learning methods. This paper provides the reader with a glossary of classifier-building elements and their functions in a fully-designed and operational classifier framework that can be used to discover opportunities for improving PSD classifier projects. This paper recommends reporting the PSD classifier's receiver operating characteristic (ROC) curve and its behavior at a gamma rejection rate (GRR) relevant for realistic applications.

  1. Statistics on continuous IBD data: Exact distribution evaluation for a pair of full(half)-sibs and a pair of a (great-) grandchild with a (great-) grandparent

    PubMed Central

    Stefanov, Valeri T

    2002-01-01

    Background Pairs of related individuals are widely used in linkage analysis. Most of the tests for linkage analysis are based on statistics associated with identity by descent (IBD) data. Current biotechnology provides data on very densely packed loci and may therefore yield almost continuous IBD data for pairs of closely related individuals. The distribution theory for statistics on continuous IBD data is thus of interest. In particular, distributional results which allow the evaluation of p-values for relevant tests are of importance. Results A technology is provided for numerical evaluation, with any given accuracy, of the cumulative probabilities of some statistics on continuous genome data for pairs of closely related individuals. In the case of a pair of full-sibs, the following statistics are considered: (i) the proportion of genome with 2 (at least 1) haplotypes shared identical-by-descent (IBD) on a chromosomal segment, (ii) the number of distinct pieces (subsegments) of a chromosomal segment, on each of which exactly 2 (at least 1) haplotypes are shared IBD. The natural counterparts of these statistics for the other relationships are also considered. Relevant Maple codes are provided for a rapid evaluation of the cumulative probabilities of such statistics. The genomic continuum model, with Haldane's model for the crossover process, is assumed. Conclusions A technology, together with relevant software codes for its automated implementation, is provided for exact evaluation of the distributions of relevant statistics associated with continuous genome data on closely related individuals. PMID:11996673

  2. Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students

    ERIC Educational Resources Information Center

    Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar

    2015-01-01

    This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…

  3. 32 CFR 2402.10 - Maintenance of statistics.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    32 CFR § 2402.10 (2013), National Defense; Other Regulations Relating to National Defense; Office of Science and Technology Policy; Regulations Implementing the Freedom of Information Act; Maintenance of statistics. (a...

  4. 29 CFR 2201.10 - Maintenance of statistics.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    29 CFR § 2201.10 (2011), Labor; Regulations Relating to Labor (Continued); Occupational Safety and Health Review Commission; Regulations Implementing the Freedom of Information Act; Maintenance of statistics. (a) The FOIA Disclosure...

  5. 29 CFR 2201.10 - Maintenance of statistics.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    29 CFR § 2201.10 (2010), Labor; Regulations Relating to Labor (Continued); Occupational Safety and Health Review Commission; Regulations Implementing the Freedom of Information Act; Maintenance of statistics. (a) The FOIA Disclosure...

  6. Implementing a 6-day physiotherapy service in rehabilitation: exploring staff perceptions.

    PubMed

    Caruana, Erin L; Kuys, Suzanne S; Clarke, Jane; Brauer, Sandra G

    2017-11-20

    Objective Australian weekend rehabilitation therapy provision is increasing. Staff engagement optimises service delivery. The present mixed-methods process evaluation explored staff perceptions regarding implementation of a 6-day physiotherapy service in a private rehabilitation unit. Methods All multidisciplinary staff working in the rehabilitation unit were surveyed regarding barriers, facilitators and perceptions of the effect of a 6-day physiotherapy service on length of stay (LOS) and patient goal attainment at three time points: before and after implementation, as well as after modification of a 6-day physiotherapy service. Descriptive statistics and thematic analysis were used to analyse the data. Results Fifty-one staff (50%) responded. Before implementation, all staff identified barriers, the most common being staffing (62%) and patient selection (29%). After implementation, only 30% of staff identified barriers, which differed to those identified before implementation, and included staff rostering and experience (20%), timing of therapy (10%) and increasing the allocation of patients (5%). Over time, staff perceptions changed from being unsure to being positive about the effect of the 6-day service on LOS and patient goal attainment. Conclusion Staff perceived a large number of barriers before implementation of a 6-day rehabilitation service, but these did not eventuate following implementation. Staff perceived improved LOS and patient goal attainment after implementation of a 6-day rehabilitation service incorporating staff feedback. What is known about this topic? Rehabilitation weekend services improve patient quality of life and functional independence while reducing LOS. What does this study add? Staff feedback during implementation and modification of new services is important to address potential barriers and ensure staff satisfaction and support. What are the implications for practitioners? Staff engagement and open communication are important to successfully implement a new service in rehabilitation.

  7. FPGA Implementation of Metastability-Based True Random Number Generator

    NASA Astrophysics Data System (ADS)

    Hata, Hisashi; Ichikawa, Shuichi

    True random number generators (TRNGs) are important as a basis for computer security. Though there are some TRNGs composed of analog circuits, the use of digital circuits is desired for the application of TRNGs to logic LSIs. Some of the digital TRNGs utilize jitter in free-running ring oscillators as a source of entropy, but such oscillators consume considerable power. Another type of TRNG exploits the metastability of a latch to generate entropy. Although this kind of TRNG has been mostly implemented with full-custom LSI technology, this study presents an implementation based on common FPGA technology. Our TRNG is comprised of logic gates only, and can be integrated in any kind of logic LSI. The RS latch in our TRNG is implemented as a hard-macro to guarantee the quality of randomness by minimizing the signal skew and load imbalance of internal nodes. To improve the quality and throughput, the outputs of 64-256 latches are XOR'ed. The derived design was verified on a Xilinx Virtex-4 FPGA (XC4VFX20), and passed the NIST statistical test suite without post-processing. Our TRNG with 256 latches occupies 580 slices, while achieving 12.5Mbps throughput.
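
    The sketch below illustrates, in software, the XOR combining step and a simple monobit frequency check in the spirit of the NIST frequency test. The latch count and bias value are illustrative assumptions; this is neither the FPGA design itself nor the full NIST statistical test suite.

      import numpy as np
      from math import erfc, sqrt

      rng = np.random.default_rng(0)
      n_latches, n_bits, bias = 256, 100_000, 0.55          # each latch slightly biased toward 1

      latch_bits = rng.random((n_latches, n_bits)) < bias   # simulated raw latch outputs
      combined = np.bitwise_xor.reduce(latch_bits.astype(np.uint8), axis=0)

      def monobit_p_value(bits):
          """Frequency (monobit) statistic: p-value for the +/-1 sum of the bit stream."""
          s = np.sum(2 * bits.astype(int) - 1)
          return erfc(abs(s) / sqrt(2.0 * bits.size))

      print("raw latch p-value    :", monobit_p_value(latch_bits[0]))   # essentially zero (biased)
      print("XOR-combined p-value :", monobit_p_value(combined))        # bias removed, non-tiny p expected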

  8. The impact of a lean rounding process in a pediatric intensive care unit.

    PubMed

    Vats, Atul; Goin, Kristin H; Villarreal, Monica C; Yilmaz, Tuba; Fortenberry, James D; Keskinocak, Pinar

    2012-02-01

    Poor workflow associated with physician rounding can produce inefficiencies that decrease time for essential activities, delay clinical decisions, and reduce staff and patient satisfaction. Workflow and provider resources were not optimized when a pediatric intensive care unit increased by 22,000 square feet (to 33,000) and by nine beds (to 30). Lean methods (focusing on essential processes) and scenario analysis were used to develop and implement a patient-centric standardized rounding process, which we hypothesize would lead to improved rounding efficiency, decrease required physician resources, improve satisfaction, and enhance throughput. Human factors techniques and statistical tools were used to collect and analyze observational data for 11 rounding events before and 12 rounding events after process redesign. Actions included: 1) recording rounding events, times, and patient interactions and classifying them as essential, nonessential, or nonvalue added; 2) comparing rounding duration and time per patient to determine the impact on efficiency; 3) analyzing discharge orders for timeliness; 4) conducting staff surveys to assess improvements in communication and care coordination; and 5) analyzing customer satisfaction data to evaluate impact on patient experience. Thirty-bed pediatric intensive care unit in a children's hospital with academic affiliation. Eight attending pediatric intensivists and their physician rounding teams. Eight attending physician-led teams were observed for 11 rounding events before and 12 rounding events after implementation of a standardized lean rounding process focusing on essential processes. Total rounding time decreased significantly (157 ± 35 mins before vs. 121 ± 20 mins after), through a reduction in time spent on nonessential (53 ± 30 vs. 9 ± 6 mins) activities. The previous process required three attending physicians for an average of 157 mins (7.55 attending physician man-hours), while the new process required two attending physicians for an average of 121 mins (4.03 attending physician man-hours). Cumulative distribution of completed patient rounds by hour of day showed an improvement from 40% to 80% of patients rounded by 9:30 AM. Discharge data showed pediatric intensive care unit patients were discharged an average of 58.05 mins sooner (p < .05). Staff surveys showed a significant increase in satisfaction with the new process (including increased efficiency, improved physician identification, and clearer understanding of process). Customer satisfaction scores showed improvement after implementing the new process. Implementation of a lean-focused, patient-centric rounding structure stressing essential processes was associated with increased timeliness and efficiency of rounds, improved staff and customer satisfaction, improved throughput, and reduced attending physician man-hours.

  9. Computer Aided Statistics Instruction Protocol (CASIP) Restructuring Undergraduate Statistics in Psychology: An Integration of Computers into Instruction and Evaluation.

    ERIC Educational Resources Information Center

    Rah, Ki-Young; Scuello, Michael

    As a result of the development of two computer statistics laboratories in the psychology department at New York's Brooklyn College, a project was undertaken to develop and implement computer program modules in undergraduate and graduate statistics courses. Rather than use the technology to merely make course presentations more exciting, the…

  10. Implementing mentor mothers in family practice to support abused mothers: study protocol.

    PubMed

    Loeffen, Maartje Jw; Lo Fo Wong, Sylvie H; Wester, Fred Pjf; Laurant, Miranda Gh; Lagro-Janssen, Antoine Lm

    2011-10-18

    Intimate partner violence is highly prevalent and mostly affects women, with negative consequences for their physical and mental health. Children often witness the violence, which has negative consequences for their well-being too. Care offered by family physicians is often rejected because abused women experience too high a threshold. Mentor mother support, a low-threshold intervention for abused mothers in family practice, proved to be feasible and effective in Rotterdam, the Netherlands. The primary aim of this study is to investigate which factors facilitate or hinder the implementation of mentor mother support in family practice. In addition, we evaluate the effect of mentor mother support in a different region. An observational study with pre- and posttests will be performed. Mothers with home-living children or pregnant women who are victims of intimate partner violence will be offered mentor mother support by the participating family physicians. The implementation process evaluation consists of focus groups, interviews and questionnaires. In the effect evaluation, intimate partner violence, the general health of the abused mother, the mother-child relationship, social support, and acceptance of professional help will be measured twice (t = 0 and t = 6 months) by questionnaires, reporting forms, medical records and interviews with the abused mothers. Qualitative coding will be used to analyze the data from the reporting forms, medical records, focus groups, interviews, and questionnaires. Quantitative data will be analyzed with descriptive statistics, the chi-square test and the matched-pairs t-test. While other intervention studies only evaluate the feasibility and effectiveness of the intervention, our primary aim is to evaluate the implementation process and thereby investigate which factors facilitate or hinder implementation of mentor mother support in family practice.

  11. Source-Modeling Auditory Processes of EEG Data Using EEGLAB and Brainstorm.

    PubMed

    Stropahl, Maren; Bauer, Anna-Katharina R; Debener, Stefan; Bleichner, Martin G

    2018-01-01

    Electroencephalography (EEG) source localization approaches are often used to disentangle the spatial patterns mixed up in scalp EEG recordings. However, approaches differ substantially between experiments, may be strongly parameter-dependent, and results are not necessarily meaningful. In this paper we provide a pipeline for EEG source estimation, from raw EEG data pre-processing using EEGLAB functions up to source-level analysis as implemented in Brainstorm. The pipeline is tested using a data set of 10 individuals performing an auditory attention task. The analysis approach estimates sources of 64-channel EEG data without the prerequisite of individual anatomies or individually digitized sensor positions. First, we show advanced EEG pre-processing using EEGLAB, which includes artifact attenuation using independent component analysis (ICA). ICA is a linear decomposition technique that aims to reveal the underlying statistical sources of mixed signals and is further a powerful tool to attenuate stereotypical artifacts (e.g., eye movements or heartbeat). Data submitted to ICA are pre-processed to facilitate good-quality decompositions. Aiming toward an objective approach on component identification, the semi-automatic CORRMAP algorithm is applied for the identification of components representing prominent and stereotypic artifacts. Second, we present a step-wise approach to estimate active sources of auditory cortex event-related processing, on a single subject level. The presented approach assumes that no individual anatomy is available and therefore the default anatomy ICBM152, as implemented in Brainstorm, is used for all individuals. Individual noise modeling in this dataset is based on the pre-stimulus baseline period. For EEG source modeling we use the OpenMEEG algorithm as the underlying forward model based on the symmetric Boundary Element Method (BEM). We then apply the method of dynamical statistical parametric mapping (dSPM) to obtain physiologically plausible EEG source estimates. Finally, we show how to perform group level analysis in the time domain on anatomically defined regions of interest (auditory scout). The proposed pipeline needs to be tailored to the specific datasets and paradigms. However, the straightforward combination of EEGLAB and Brainstorm analysis tools may be of interest to others performing EEG source localization.

  12. Promoting physical activity, healthy eating and gross motor skills development among preschoolers attending childcare centers: Process evaluation of the Healthy Start-Départ Santé intervention using the RE-AIM framework.

    PubMed

    Ward, Stéphanie; Chow, Amanda Froehlich; Humbert, M Louise; Bélanger, Mathieu; Muhajarine, Nazeem; Vatanparast, Hassan; Leis, Anne

    2018-06-01

    The Healthy Start-Départ Santé intervention was developed to promote physical activity, gross motor skills and healthy eating among preschoolers attending childcare centers. This process evaluation aimed to report the reach, effectiveness, adoption, implementation and maintenance of the Healthy Start-Départ Santé intervention. The RE-AIM framework was used to guide this process evaluation. Data were collected across 140 childcare centers who received the Healthy Start-Départ Santé intervention in the provinces of Saskatchewan and New Brunswick, Canada. Quantitative data were collected through director questionnaires at 10 months and 2 years after the initial training and analyzed using descriptive statistics. Qualitative data were collected throughout the intervention. The intervention was successful in reaching a large number of childcare centres and engaging both rural and urban communities across Saskatchewan and New Brunswick. Centres reported increasing opportunities for physical activity and healthy eating, which were generally low-cost, easy and quick to implement. However, these changes were rarely transformed into formal written policies. A total of 87% of centers reported using the physical activity resource and 68% using the nutrition resource on a weekly basis. Implementation fidelity of the initial training was high. Of those centers who received the initial training, 75% participated in the mid-point booster session training. Two year post-implementation questionnaires indicated that 47% of centers were still using the Active Play Equipment kit, while 42% were still using the physical activity resource and 37% were still using the nutrition resource. Key challenges to implementation and sustainability identified during the evaluation were consistent among all of the REAIM elements. These challenges included lack of time, lack of support from childcare staff and low parental engagement. Findings from this study suggest the implementation of Healthy Start-Départ Santé may be improved further by addressing resistance to change and varied levels of engagement among childcare staff. In addition, further work is needed to provide parents with opportunities to engage in HSDS with their children. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Professional Development in Statistics, Technology, and Cognitively Demanding Tasks: Classroom Implementation and Obstacles

    ERIC Educational Resources Information Center

    Foley, Gregory D.; Khoshaim, Heba Bakr; Alsaeed, Maha; Er, S. Nihan

    2012-01-01

    Attending professional development programmes can support teachers in applying new strategies for teaching mathematics and statistics. This study investigated (a) the extent to which the participants in a professional development programme subsequently used the techniques they had learned when teaching mathematics and statistics and (b) the…

  14. Antecedents to Organizational Performance: Theoretical and Practical Implications for Aircraft Maintenance Officer Force Development

    DTIC Science & Technology

    2015-03-26

    to my reader, Lieutenant Colonel Robert Overstreet, for helping solidify my research, coaching me through the statistical analysis, and positive... Descriptive Statistics... common-method bias requires careful assessment of potential sources of bias and implementing procedural and statistical control methods. Podsakoff

  15. Statistical Learning of Phonetic Categories: Insights from a Computational Approach

    ERIC Educational Resources Information Center

    McMurray, Bob; Aslin, Richard N.; Toscano, Joseph C.

    2009-01-01

    Recent evidence (Maye, Werker & Gerken, 2002) suggests that statistical learning may be an important mechanism for the acquisition of phonetic categories in the infant's native language. We examined the sufficiency of this hypothesis and its implications for development by implementing a statistical learning mechanism in a computational model…

  16. The Impact of Student-Directed Projects in Introductory Statistics

    ERIC Educational Resources Information Center

    Spence, Dianna J.; Bailey, Brad; Sharp, Julia L.

    2017-01-01

    A multi-year study investigated the impact of incorporating student-directed discovery projects into introductory statistics courses. Pilot instructors at institutions across the United States taught statistics implementing student-directed projects with the help of a common set of instructional materials designed to facilitate such projects.…

  17. Optimizing human activity patterns using global sensitivity analysis.

    PubMed

    Fairchild, Geoffrey; Hickmann, Kyle S; Mniszewski, Susan M; Del Valle, Sara Y; Hyman, James M

    2014-12-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule's regularity for a population. We show how to tune an activity's regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.
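
    A minimal sketch of the SampEn statistic used above to quantify schedule regularity: a direct O(N²) implementation for a one-dimensional sequence, with the common parameter choices m = 2 and r = 0.2 times the standard deviation. It is illustrative only and is not the DASim tuning code.

      import numpy as np

      def sample_entropy(x, m=2, r=None):
          """SampEn = -ln(A/B), where B and A count template matches of length m and m+1."""
          x = np.asarray(x, dtype=float)
          if r is None:
              r = 0.2 * x.std()
          n = len(x)

          def count_matches(mm):
              # Overlapping templates of length mm; keep n - m of them so counts are comparable.
              templates = np.lib.stride_tricks.sliding_window_view(x, mm)[:n - m]
              # Chebyshev distance between every pair of templates.
              d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=-1)
              k = len(templates)
              return ((d <= r).sum() - k) / 2       # exclude self-matches, count each pair once

          b, a = count_matches(m), count_matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.inf

      rng = np.random.default_rng(0)
      regular = np.sin(np.linspace(0, 20 * np.pi, 500))           # highly regular "schedule"
      irregular = rng.normal(size=500)                            # irregular one
      print(sample_entropy(regular), sample_entropy(irregular))   # low vs. high SampEn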

  18. Optimizing human activity patterns using global sensitivity analysis

    PubMed Central

    Hickmann, Kyle S.; Mniszewski, Susan M.; Del Valle, Sara Y.; Hyman, James M.

    2014-01-01

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. We use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations. PMID:25580080

  19. Optimizing human activity patterns using global sensitivity analysis

    DOE PAGES

    Fairchild, Geoffrey; Hickmann, Kyle S.; Mniszewski, Susan M.; ...

    2013-12-10

    Implementing realistic activity patterns for a population is crucial for modeling, for example, disease spread, supply and demand, and disaster response. Using the dynamic activity simulation engine, DASim, we generate schedules for a population that capture regular (e.g., working, eating, and sleeping) and irregular activities (e.g., shopping or going to the doctor). We use the sample entropy (SampEn) statistic to quantify a schedule’s regularity for a population. We show how to tune an activity’s regularity by adjusting SampEn, thereby making it possible to realistically design activities when creating a schedule. The tuning process sets up a computationally intractable high-dimensional optimization problem. To reduce the computational demand, we use Bayesian Gaussian process regression to compute global sensitivity indices and identify the parameters that have the greatest effect on the variance of SampEn. Here we use the harmony search (HS) global optimization algorithm to locate global optima. Our results show that HS combined with global sensitivity analysis can efficiently tune the SampEn statistic with few search iterations. We demonstrate how global sensitivity analysis can guide statistical emulation and global optimization algorithms to efficiently tune activities and generate realistic activity patterns. Finally, though our tuning methods are applied to dynamic activity schedule generation, they are general and represent a significant step in the direction of automated tuning and optimization of high-dimensional computer simulations.

  20. Regional effects of agricultural conservation practices on nutrient transport in the Upper Mississippi River Basin

    USGS Publications Warehouse

    Garcia, Ana Maria.; Alexander, Richard B.; Arnold, Jeffrey G.; Norfleet, Lee; White, Michael J.; Robertson, Dale M.; Schwarz, Gregory E.

    2016-01-01

    Despite progress in the implementation of conservation practices, related improvements in water quality have been challenging to measure in larger river systems. In this paper we quantify these downstream effects by applying the empirical U.S. Geological Survey water-quality model SPARROW to investigate whether spatial differences in conservation intensity were statistically correlated with variations in nutrient loads. In contrast to other forms of water quality data analysis, the application of SPARROW controls for confounding factors such as hydrologic variability, multiple sources and environmental processes. A measure of conservation intensity was derived from the USDA-CEAP regional assessment of the Upper Mississippi River and used as an explanatory variable in a model of the Upper Midwest. The spatial pattern of conservation intensity was negatively correlated (p = 0.003) with the total nitrogen loads in streams in the basin. Total phosphorus loads were weakly negatively correlated with conservation (p = 0.25). Regional nitrogen reductions were estimated to range from 5 to 34% and phosphorus reductions from 1 to 10% in major river basins of the Upper Mississippi region. The statistical associations between conservation and nutrient loads are consistent with hydrological and biogeochemical processes such as denitrification. The results provide empirical evidence at the regional scale that conservation practices have had a larger statistically detectable effect on nitrogen than on phosphorus loadings in streams and rivers of the Upper Mississippi Basin.

  1. Regional Effects of Agricultural Conservation Practices on Nutrient Transport in the Upper Mississippi River Basin.

    PubMed

    García, Ana María; Alexander, Richard B; Arnold, Jeffrey G; Norfleet, Lee; White, Michael J; Robertson, Dale M; Schwarz, Gregory

    2016-07-05

    Despite progress in the implementation of conservation practices, related improvements in water quality have been challenging to measure in larger river systems. In this paper we quantify these downstream effects by applying the empirical U.S. Geological Survey water-quality model SPARROW to investigate whether spatial differences in conservation intensity were statistically correlated with variations in nutrient loads. In contrast to other forms of water quality data analysis, the application of SPARROW controls for confounding factors such as hydrologic variability, multiple sources and environmental processes. A measure of conservation intensity was derived from the USDA-CEAP regional assessment of the Upper Mississippi River and used as an explanatory variable in a model of the Upper Midwest. The spatial pattern of conservation intensity was negatively correlated (p = 0.003) with the total nitrogen loads in streams in the basin. Total phosphorus loads were weakly negatively correlated with conservation (p = 0.25). Regional nitrogen reductions were estimated to range from 5 to 34% and phosphorus reductions from 1 to 10% in major river basins of the Upper Mississippi region. The statistical associations between conservation and nutrient loads are consistent with hydrological and biogeochemical processes such as denitrification. The results provide empirical evidence at the regional scale that conservation practices have had a larger statistically detectable effect on nitrogen than on phosphorus loadings in streams and rivers of the Upper Mississippi Basin.

  2. SU-E-T-760: Tolerance Design for Site-Specific Range in Proton Patient QA Process Using the Six Sigma Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lah, J; Shin, D; Kim, G

    Purpose: To show how tolerance design and tolerancing approaches can be used to predict and improve the site-specific range in the patient QA process when implementing the Six Sigma model. Methods: In this study, patient QA plans were selected according to 6 site-treatment groups: head & neck (94 cases), spine (76 cases), lung (89 cases), liver (53 cases), pancreas (55 cases), and prostate (121 cases), treated between 2007 and 2013. We evaluated a Six Sigma model that determines allowable deviations in design parameters and process variables in patient-specific QA; where possible, tolerance may be loosened, then customized if necessary to meet the functional requirements. The Six Sigma problem-solving methodology is known as the DMAIC phases, which stand for: Define a problem or improvement opportunity, Measure process performance, Analyze the process to determine the root causes of poor performance, Improve the process by fixing root causes, Control the improved process to hold the gains. Results: The process capability for patient-specific range QA is 0.65 with only ±1 mm of tolerance criteria. Our results suggested a tolerance level of ±2–3 mm for prostate and liver cases and ±5 mm for lung cases. We found that customized tolerances between calculated and measured range reduce patient QA plan failures, and almost all sites had failure rates less than 1%. The average QA time also improved from 2 hr to less than 1 hr for all sites, including the planning and converting process, depth-dose measurement and evaluation. Conclusion: The objective of tolerance design is to achieve optimization beyond that obtained through QA process improvement and statistical analysis function detailing, in order to implement a Six Sigma capable design.
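
    For context, the sketch below shows the process-capability calculation underlying the quoted figure: Cp compares the tolerance width with the natural spread (6σ) of the calculated-minus-measured range differences, and Cpk additionally accounts for an off-center mean. The simulated differences are hypothetical, not the clinical QA data.

      import numpy as np

      def process_capability(diffs_mm, tol_mm):
          """Cp = (USL - LSL) / (6*sigma) for a symmetric tolerance of +/- tol_mm."""
          sigma = np.std(diffs_mm, ddof=1)
          cp = (2 * tol_mm) / (6 * sigma)
          cpk = (tol_mm - abs(np.mean(diffs_mm))) / (3 * sigma)   # penalizes an off-center mean
          return cp, cpk

      # Hypothetical range differences (mm) between calculation and measurement.
      rng = np.random.default_rng(0)
      diffs = rng.normal(loc=0.1, scale=0.5, size=120)

      for tol in (1.0, 2.0, 3.0):
          cp, cpk = process_capability(diffs, tol)
          print(f"tolerance +/-{tol:.0f} mm: Cp = {cp:.2f}, Cpk = {cpk:.2f}")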

  3. Team-training in healthcare: a narrative synthesis of the literature.

    PubMed

    Weaver, Sallie J; Dy, Sydney M; Rosen, Michael A

    2014-05-01

    Patients are safer and receive higher quality care when providers work as a highly effective team. Investment in optimising healthcare teamwork has swelled in the last 10 years. Consequently, evidence regarding the effectiveness for these interventions has also grown rapidly. We provide an updated review concerning the current state of team-training science and practice in acute care settings. A PubMed search for review articles examining team-training interventions in acute care settings published between 2000 and 2012 was conducted. Following identification of relevant reviews with searches terminating in 2008 and 2010, PubMed and PSNet were searched for additional primary studies published in 2011 and 2012. Primary outcomes included patient outcomes and quality indices. Secondary outcomes included teamwork behaviours, knowledge and attitudes. Both simulation and classroom-based team-training interventions can improve teamwork processes (eg, communication, coordination and cooperation), and implementation has been associated with improvements in patient safety outcomes. Thirteen studies published between 2011 and 2012 reported statistically significant changes in teamwork behaviours, processes or emergent states and 10 reported significant improvement in clinical care processes or patient outcomes, including mortality and morbidity. Effects were reported across a range of clinical contexts. Larger effect sizes were reported for bundled team-training interventions that included tools and organisational changes to support sustainment and transfer of teamwork competencies into daily practice. Overall, moderate-to-high-quality evidence suggests team-training can positively impact healthcare team processes and patient outcomes. Additionally, toolkits are available to support intervention development and implementation. Evidence suggests bundled team-training interventions and implementation strategies that embed effective teamwork as a foundation for other improvement efforts may offer greatest impact on patient outcomes.

  4. Utilization and Harmonization of Adult Accelerometry Data: Review and Expert Consensus.

    PubMed

    Wijndaele, Katrien; Westgate, Kate; Stephens, Samantha K; Blair, Steven N; Bull, Fiona C; Chastin, Sebastien F M; Dunstan, David W; Ekelund, Ulf; Esliger, Dale W; Freedson, Patty S; Granat, Malcolm H; Matthews, Charles E; Owen, Neville; Rowlands, Alex V; Sherar, Lauren B; Tremblay, Mark S; Troiano, Richard P; Brage, Søren; Healy, Genevieve N

    2015-10-01

    This study aimed to describe the scope of accelerometry data collected internationally in adults and to obtain a consensus from measurement experts regarding the optimal strategies to harmonize international accelerometry data. In March 2014, a comprehensive review was undertaken to identify studies that collected accelerometry data in adults (sample size, n ≥ 400). In addition, 20 physical activity experts were invited to participate in a two-phase Delphi process to obtain consensus on the following: unique research opportunities available with such data, additional data required to address these opportunities, strategies for enabling comparisons between studies/countries, requirements for implementing/progressing such strategies, and value of a global repository of accelerometry data. The review identified accelerometry data from more than 275,000 adults from 76 studies across 36 countries. Consensus was achieved after two rounds of the Delphi process; 18 experts participated in one or both rounds. The key opportunities highlighted were the ability for cross-country/cross-population comparisons and the analytic options available with the larger heterogeneity and greater statistical power. Basic sociodemographic and anthropometric data were considered a prerequisite for this. Disclosure of monitor specifications and protocols for data collection and processing were deemed essential to enable comparison and data harmonization. There was strong consensus that standardization of data collection, processing, and analytical procedures was needed. To implement these strategies, communication and consensus among researchers, development of an online infrastructure, and methodological comparison work were required. There was consensus that a global accelerometry data repository would be beneficial and worthwhile. This foundational resource can lead to implementation of key priority areas and identification of future directions in physical activity epidemiology, population monitoring, and burden of disease estimates.

  5. Cross-Approximate Entropy parallel computation on GPUs for biomedical signal analysis. Application to MEG recordings.

    PubMed

    Martínez-Zarzuela, Mario; Gómez, Carlos; Díaz-Pernas, Francisco Javier; Fernández, Alberto; Hornero, Roberto

    2013-10-01

    Cross-Approximate Entropy (Cross-ApEn) is a useful measure to quantify the statistical dissimilarity of two time series. In spite of the advantage of Cross-ApEn over its one-dimensional counterpart (Approximate Entropy), only a few studies have applied it to biomedical signals, mainly due to its high computational cost. In this paper, we propose a fast GPU-based implementation of Cross-ApEn that makes its use over large amounts of multidimensional data feasible. The scheme followed is fully scalable, thus maximizing the use of the GPU regardless of the number of neural signals being processed. The approach consists of processing many trials or epochs simultaneously, independently of their origin. In the case of MEG data, these trials can come from different input channels or subjects. The proposed implementation achieves an average speedup greater than 250× against a CPU parallel version running on a processor containing six cores. A dataset of 30 subjects containing 148 MEG channels (49 epochs of 1024 samples per channel) can be analyzed using our development in about 30 min. The same processing takes 5 days on six cores and 15 days when running on a single core. The speedup is much larger if compared to a basic sequential Matlab(®) implementation, which would need 58 days per subject. To our knowledge, this is the first contribution of Cross-ApEn measure computation using GPUs. This study demonstrates that this hardware is, to date, the best option for the signal processing of biomedical data with Cross-ApEn. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
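
    A minimal CPU sketch of the Cross-ApEn statistic is given below; the GPU implementation described above parallelizes exactly this kind of all-pairs template comparison across epochs and channels. The parameters m and r follow common conventions and are illustrative, and the toy signals are not MEG data.

      import numpy as np

      def cross_apen(u, v, m=2, r=0.2):
          """Cross-ApEn(m, r) = Phi_m - Phi_{m+1}, using Chebyshev distances on standardized series."""
          u = (u - u.mean()) / u.std()
          v = (v - v.mean()) / v.std()

          def phi(mm):
              xu = np.lib.stride_tricks.sliding_window_view(u, mm)
              xv = np.lib.stride_tricks.sliding_window_view(v, mm)
              # Chebyshev distance between every template of u and every template of v.
              d = np.max(np.abs(xu[:, None, :] - xv[None, :, :]), axis=-1)
              c = (d <= r).mean(axis=1)       # fraction of v-templates close to each u-template
              c = c[c > 0]                    # guard against templates with no match at all
              return np.log(c).mean()

          return phi(m) - phi(m + 1)

      rng = np.random.default_rng(0)
      t = np.linspace(0, 10 * np.pi, 1000)
      a = np.sin(t) + 0.1 * rng.normal(size=t.size)
      b = np.sin(t + 0.5) + 0.1 * rng.normal(size=t.size)   # similar dynamics to a
      c = np.sin(2 * t) + 0.1 * rng.normal(size=t.size)     # faster, different dynamics
      print(cross_apen(a, b), cross_apen(a, c))             # smaller value = more similar patterns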

  6. Strategic planning processes and financial performance among hospitals in Lebanon.

    PubMed

    Saleh, Shadi; Kaissi, Amer; Semaan, Adele; Natafgi, Nabil Maher

    2013-01-01

    Strategic planning has been presented as a valuable management tool. However, evidence of its deployment in healthcare and its effect on organizational performance is limited in low-income and middle-income countries (LMICs). The study aimed to explore the use of strategic planning processes in Lebanese hospitals and to investigate their association with financial performance. The study comprised 79 hospitals and assessed occupancy rate (OR) and revenue-per-bed (RPB) as performance measures. The strategic planning process included six domains: having a plan, plan development, plan implementation, responsibility for planning activities, governing board involvement, and physicians' involvement. Approximately 90% of hospitals have strategic plans that are moderately developed (mean score of 4.9 on a 1-7 scale) and implemented (score of 4.8). In 46% of the hospitals, the CEO has responsibility for the plan. The level of governing board involvement in the process is moderate to high (score of 5.1), whereas physician involvement is lower (score of 4.1). The OR and RPB amounted to 70% and 59,304, respectively, among hospitals with a strategic plan, as compared with 62% and 33,564 for those lacking such a plan. No statistical association between having a strategic plan and either of the two measures was detected. However, the findings revealed that among hospitals that had a strategic plan, higher implementation levels were associated with lower OR (p < 0.05). In an LMIC healthcare environment characterized by resource limitation, complexity, and political and economic volatility, flexibility rather than rigid planning allows organizations to better cope with environmental turbulence. Copyright © 2012 John Wiley & Sons, Ltd.

  7. Predicting on-site environmental impacts of municipal engineering works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gangolells, Marta, E-mail: marta.gangolells@upc.edu; Casals, Miquel, E-mail: miquel.casals@upc.edu; Forcada, Núria, E-mail: nuria.forcada@upc.edu

    2014-01-15

    The research findings fill a gap in the body of knowledge by presenting an effective way to evaluate the significance of on-site environmental impacts of municipal engineering works prior to the construction stage. First, 42 on-site environmental impacts of municipal engineering works were identified by means of a process-oriented approach. Then, 46 indicators and their corresponding significance limits were determined on the basis of a statistical analysis of 25 new-build and remodelling municipal engineering projects. In order to ensure the objectivity of the assessment process, direct and indirect indicators were always based on quantitative data from the municipal engineering project documents. Finally, two case studies were analysed to illustrate the practical use of the proposed model. The model highlights the significant environmental impacts of a particular municipal engineering project prior to the construction stage. Consequently, preventive actions can be planned and implemented during on-site activities. The results of the model also allow a comparison of proposed municipal engineering projects and alternatives with respect to the overall on-site environmental impact and the absolute importance of a particular environmental aspect. These findings are useful within the framework of the environmental impact assessment process, as they help to improve the identification and evaluation of on-site environmental aspects of municipal engineering works. The findings may also be of use to construction companies that are willing to implement an environmental management system or simply wish to improve on-site environmental performance in municipal engineering projects. -- Highlights: • We present a model to predict the environmental impacts of municipal engineering works. • It highlights significant on-site environmental impacts prior to the construction stage. • Findings are useful within the environmental impact assessment process. • They also help contractors to implement environmental management systems.

  8. Team-training in healthcare: a narrative synthesis of the literature

    PubMed Central

    Weaver, Sallie J; Dy, Sydney M; Rosen, Michael A

    2014-01-01

    Background Patients are safer and receive higher quality care when providers work as a highly effective team. Investment in optimising healthcare teamwork has swelled in the last 10 years. Consequently, evidence regarding the effectiveness for these interventions has also grown rapidly. We provide an updated review concerning the current state of team-training science and practice in acute care settings. Methods A PubMed search for review articles examining team-training interventions in acute care settings published between 2000 and 2012 was conducted. Following identification of relevant reviews with searches terminating in 2008 and 2010, PubMed and PSNet were searched for additional primary studies published in 2011 and 2012. Primary outcomes included patient outcomes and quality indices. Secondary outcomes included teamwork behaviours, knowledge and attitudes. Results Both simulation and classroom-based team-training interventions can improve teamwork processes (eg, communication, coordination and cooperation), and implementation has been associated with improvements in patient safety outcomes. Thirteen studies published between 2011 and 2012 reported statistically significant changes in teamwork behaviours, processes or emergent states and 10 reported significant improvement in clinical care processes or patient outcomes, including mortality and morbidity. Effects were reported across a range of clinical contexts. Larger effect sizes were reported for bundled team-training interventions that included tools and organisational changes to support sustainment and transfer of teamwork competencies into daily practice. Conclusions Overall, moderate-to-high-quality evidence suggests team-training can positively impact healthcare team processes and patient outcomes. Additionally, toolkits are available to support intervention development and implementation. Evidence suggests bundled team-training interventions and implementation strategies that embed effective teamwork as a foundation for other improvement efforts may offer greatest impact on patient outcomes. PMID:24501181

  9. @neurIST complex information processing toolchain for the integrated management of cerebral aneurysms

    PubMed Central

    Villa-Uriol, M. C.; Berti, G.; Hose, D. R.; Marzo, A.; Chiarini, A.; Penrose, J.; Pozo, J.; Schmidt, J. G.; Singh, P.; Lycett, R.; Larrabide, I.; Frangi, A. F.

    2011-01-01

    Cerebral aneurysms are a multi-factorial disease with severe consequences. A core part of the European project @neurIST was the physical characterization of aneurysms to find candidate risk factors associated with aneurysm rupture. The project investigated measures based on morphological, haemodynamic and aneurysm wall structure analyses for more than 300 cases of ruptured and unruptured aneurysms, extracting descriptors suitable for statistical studies. This paper deals with the unique challenges associated with this task, and the implemented solutions. The consistency of results required by the subsequent statistical analyses, given the heterogeneous image data sources and multiple human operators, was met by a highly automated toolchain combined with training. A testimonial to the successful automation is the positive evaluation of the toolchain by over 260 clinicians during various hands-on workshops. The specification of the analyses required thorough investigation of modelling and processing choices, discussed in a detailed analysis protocol. Finally, an abstract data model governing the management of the simulation-related data provides a framework for data provenance and supports future use of the data and toolchain. This is achieved by enabling easy modification of the modelling approaches and solution details through abstract problem descriptions, removing the need to repeat manual processing work. PMID:22670202

  10. Improving medium-range ensemble streamflow forecasts through statistical post-processing

    NASA Astrophysics Data System (ADS)

    Mendoza, Pablo; Wood, Andy; Clark, Elizabeth; Nijssen, Bart; Clark, Martyn; Ramos, Maria-Helena; Nowak, Kenneth; Arnold, Jeffrey

    2017-04-01

    Probabilistic hydrologic forecasts are a powerful source of information for decision-making in water resources operations. A common approach is the hydrologic model-based generation of streamflow forecast ensembles, which can be implemented to account for different sources of uncertainties - e.g., from initial hydrologic conditions (IHCs), weather forecasts, and hydrologic model structure and parameters. In practice, hydrologic ensemble forecasts typically have biases and spread errors stemming from errors in the aforementioned elements, resulting in a degradation of probabilistic properties. In this work, we compare several statistical post-processing techniques applied to medium-range ensemble streamflow forecasts obtained with the System for Hydromet Applications, Research and Prediction (SHARP). SHARP is a fully automated prediction system for the assessment and demonstration of short-term to seasonal streamflow forecasting applications, developed by the National Center for Atmospheric Research, University of Washington, U.S. Army Corps of Engineers, and U.S. Bureau of Reclamation. The suite of post-processing techniques includes linear blending, quantile mapping, extended logistic regression, quantile regression, ensemble analogs, and the generalized linear model post-processor (GLMPP). We assess and compare these techniques using multi-year hindcasts in several river basins in the western US. This presentation discusses preliminary findings about the effectiveness of the techniques for improving probabilistic skill, reliability, discrimination, sharpness and resolution.

  11. Descriptive Statistics of the Genome: Phylogenetic Classification of Viruses.

    PubMed

    Hernandez, Troy; Yang, Jie

    2016-10-01

    The typical process for classifying and submitting a newly sequenced virus to the NCBI database involves two steps. First, a BLAST search is performed to determine likely family candidates. That is followed by checking the candidate families with the pairwise sequence alignment tool for similar species. The submitter's judgment is then used to determine the most likely species classification. The aim of this article is to show that this process can be automated into a fast, accurate, one-step process using the proposed alignment-free method and properly implemented machine learning techniques. We present a new family of alignment-free vectorizations of the genome, the generalized vector, that maintains the speed of existing alignment-free methods while outperforming all available methods. This new alignment-free vectorization uses the frequency of genomic words (k-mers), as is done in the composition vector, and incorporates descriptive statistics of those k-mers' positional information, as inspired by the natural vector. We analyze five different characterizations of genome similarity using k-nearest neighbor classification and evaluate these on two collections of viruses totaling over 10,000 viruses. We show that our proposed method performs better than, or as well as, other methods at every level of the phylogenetic hierarchy. The data and R code are available upon request.
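
    The combination of k-mer composition with positional descriptive statistics can be illustrated with a toy vectorization and nearest-neighbour classifier, sketched below. The exact feature definition of the paper's generalized vector is not reproduced here, and the sequences and labels are hypothetical.

```python
import itertools
import numpy as np

def feature_vector(seq, k=2):
    """Illustrative alignment-free vectorization: k-mer frequencies plus the
    mean and standard deviation of each k-mer's positions along the genome.
    This mirrors the idea of combining composition with positional
    descriptive statistics; it is not the paper's exact 'generalized vector'.
    """
    kmers = ["".join(p) for p in itertools.product("ACGT", repeat=k)]
    n = len(seq) - k + 1
    feats = []
    for w in kmers:
        pos = np.array([i for i in range(n) if seq[i:i + k] == w])
        freq = len(pos) / n
        mean = pos.mean() / n if len(pos) else 0.0
        std = pos.std() / n if len(pos) else 0.0
        feats.extend([freq, mean, std])
    return np.array(feats)

def knn_predict(query, train_seqs, train_labels, k=2):
    """1-nearest-neighbour classification on Euclidean feature distance."""
    q = feature_vector(query, k)
    dists = [np.linalg.norm(q - feature_vector(s, k)) for s in train_seqs]
    return train_labels[int(np.argmin(dists))]

# Toy usage with hypothetical sequences and family labels.
train = ["ACGTACGTACGTAA", "GGGGCCCCGGGGCC"]
labels = ["family_A", "family_B"]
print(knn_predict("ACGTACGAACGTAC", train, labels))
```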

  12. Simulating the impact of dust cooling on the statistical properties of the intra-cluster medium

    NASA Astrophysics Data System (ADS)

    Pointecouteau, Etienne; da Silva, Antonio; Catalano, Andrea; Montier, Ludovic; Lanoux, Joseph; Roncarelli, Mauro; Giard, Martin

    2009-08-01

    From the first stages of star and galaxy formation, non-gravitational processes such as ram pressure stripping, SNs, galactic winds, AGNs, galaxy-galaxy mergers, etc. lead to the enrichment of the IGM in stars, metals and dust, via the ejection of galactic material into the IGM. We now know that these processes shape, side by side with gravitation, the formation and evolution of structures. We present hydrodynamic simulations of large-scale structure formation that implement the effect of cooling by dust. We focus on the scale of galaxy clusters and study their statistical properties. Here, we present our results on the TX-M and LX-M scaling relations, which exhibit changes in both slope and normalization when cooling by dust is added to the standard radiative cooling model. For example, the normalization of the TX-M relation changes by a maximum of only 2% at M = 10^14 M⊙, whereas the normalization of the LX-TX relation changes by as much as 10% at TX = 1 keV for models that include dust cooling. Our study shows that dust cooling is an additional non-gravitational process that contributes to shaping the thermodynamical state of the hot ICM gas.

  13. Examining the Return on Investment of a Security Information and Event Management Solution in a Notional Department of Defense Network Environment

    DTIC Science & Technology

    2013-06-01

    collection are the facts that devices lack encryption or compression methods and that the log file must be saved on the host system prior to transfer...time. Statistical correlation utilizes numerical algorithms to detect deviations from normal event levels and other routine activities (Chuvakin...can also assist in detecting low-volume threats. Although easy and logical to implement, the implementation of statistical correlation algorithms
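
    A toy example of the statistical-correlation idea described in this report is sketched below: flag time bins whose event volume deviates strongly from the baseline. The bin size, threshold, and synthetic log counts are assumptions for illustration, not SIEM defaults.

```python
import numpy as np

def flag_anomalous_hours(event_counts, threshold=3.0):
    """Flag time bins whose event volume deviates from the overall baseline
    by more than `threshold` standard deviations (a simple z-score rule)."""
    counts = np.asarray(event_counts, dtype=float)
    mean, std = counts.mean(), counts.std()
    z = (counts - mean) / std if std > 0 else np.zeros_like(counts)
    return np.where(np.abs(z) > threshold)[0]

# Hourly log-event counts with one unusual surge at hour 30.
rng = np.random.default_rng(1)
hourly = rng.poisson(lam=200, size=48)
hourly[30] += 120
print(flag_anomalous_hours(hourly))   # expected to flag hour 30
```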

  14. Emotional metacontrol of attention: Top-down modulation of sensorimotor processes in a robotic visual search task

    PubMed Central

    Cuperlier, Nicolas; Gaussier, Philippe

    2017-01-01

    Emotions play a significant role in internal regulatory processes. In this paper, we advocate four key ideas. First, novelty detection can be grounded in the sensorimotor experience and allow higher order appraisal. Second, cognitive processes, such as those involved in self-assessment, influence emotional states by eliciting affects like boredom and frustration. Third, emotional processes such as those triggered by self-assessment influence attentional processes. Last, close emotion-cognition interactions implement an efficient feedback loop for the purpose of top-down behavior regulation. The latter is what we call ‘Emotional Metacontrol’. We introduce a model based on artificial neural networks. This architecture is used to control a robotic system in a visual search task. The emotional metacontrol intervenes to bias the robot visual attention during active object recognition. Through a behavioral and statistical analysis, we show that this mechanism increases the robot performance and fosters the exploratory behavior to avoid deadlocks. PMID:28934291

  15. Needs assessment under the Maternal and Child Health Services Block Grant: Massachusetts.

    PubMed

    Guyer, B; Schor, L; Messenger, K P; Prenney, B; Evans, F

    1984-09-01

    The Massachusetts maternal and child health (MCH) agency has developed a needs assessment process which includes four components: a statistical measure of need based on indirect, proxy health and social indicators; clinical standards for services to be provided; an advisory process which guides decision making and involves constituency groups; and a management system for implementing funds distribution, namely open competitive bidding in response to a Request for Proposals. In Fiscal Years 1982 and 1983, the process was applied statewide in the distribution of primary prenatal (MIC) and pediatric (C&Y) care services and lead poisoning prevention projects. Both processes resulted in clearer definitions of services to be provided under contract to the state as well as redistribution of funds to serve localities that had previously received no resources. Although the needs assessment process does not provide a direct measure of unmet need in a complex system of private and public services, it can be used to advocate for increased MCH funding and guide the distribution of new MCH service dollars.

  16. LakeMetabolizer: An R package for estimating lake metabolism from free-water oxygen using diverse statistical models

    USGS Publications Warehouse

    Winslow, Luke; Zwart, Jacob A.; Batt, Ryan D.; Dugan, Hilary; Woolway, R. Iestyn; Corman, Jessica; Hanson, Paul C.; Read, Jordan S.

    2016-01-01

    Metabolism is a fundamental process in ecosystems that crosses multiple scales of organization from individual organisms to whole ecosystems. To improve sharing and reuse of published metabolism models, we developed LakeMetabolizer, an R package for estimating lake metabolism from in situ time series of dissolved oxygen, water temperature, and, optionally, additional environmental variables. LakeMetabolizer implements 5 different metabolism models with diverse statistical underpinnings: bookkeeping, ordinary least squares, maximum likelihood, Kalman filter, and Bayesian. Each of these 5 metabolism models can be combined with 1 of 7 models for computing the coefficient of gas exchange across the air–water interface (k). LakeMetabolizer also features a variety of supporting functions that compute conversions and implement calculations commonly applied to raw data prior to estimating metabolism (e.g., oxygen saturation and optical conversion models). These tools have been organized into an R package that contains example data, example use-cases, and function documentation. The release package version is available on the Comprehensive R Archive Network (CRAN), and the full open-source GPL-licensed code is freely available for examination and extension online. With this unified, open-source, and freely available package, we hope to improve access and facilitate the application of metabolism in studies and management of lentic ecosystems.
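
    As a rough illustration of the simplest model family mentioned above (the bookkeeping approach), the sketch below estimates daily GPP and respiration from a synthetic diel oxygen record. It is a generic approximation under assumed conventions, written in Python rather than R, and does not reproduce the package's functions.

```python
import numpy as np

def bookkeeping_metabolism(do_obs, do_sat, k_gas, is_day, dt_hours=1.0):
    """Back-of-the-envelope 'bookkeeping' metabolism estimate from a diel
    dissolved-oxygen record (mg/L): respiration is the mean night-time O2
    change corrected for gas exchange, and GPP is the daytime change
    corrected for gas exchange minus respiration. A generic sketch only.
    """
    do_obs, do_sat, is_day = map(np.asarray, (do_obs, do_sat, is_day))
    flux = k_gas * (do_sat[:-1] - do_obs[:-1]) * dt_hours   # atmosphere -> lake
    d_do = np.diff(do_obs)                                  # observed change per step
    nep = d_do - flux                                       # biological signal only
    day = is_day[:-1].astype(bool)
    r_hourly = nep[~day].mean()                             # respiration (negative)
    gpp_hourly = nep[day].mean() - r_hourly                 # gross production
    return gpp_hourly * 24, r_hourly * 24                   # daily rates

# Synthetic two-day record at hourly resolution.
t = np.arange(48)
day_mask = (t % 24 >= 6) & (t % 24 < 18)
do_sat = np.full(48, 9.0)
do = 8.0 + 0.05 * np.where(day_mask, 1, -1).cumsum() * 0.5
print(bookkeeping_metabolism(do, do_sat, k_gas=0.02, is_day=day_mask))
```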

  17. The Creation and Statistical Evaluation of a Deterministic Model of the Human Bronchial Tree from HRCT Images.

    PubMed

    Montesantos, Spyridon; Katz, Ira; Pichelin, Marine; Caillibotte, Georges

    2016-01-01

    A quantitative description of the morphology of lung structure is essential prior to any form of predictive modeling of ventilation or aerosol deposition implemented within the lung. The human lung is a very complex organ, with airway structures that span two orders of magnitude and have a multitude of interfaces between air, tissue and blood. As such, current medical imaging protocols cannot provide medical practitioners and researchers with in-vivo knowledge of deeper lung structures. In this work a detailed algorithm for the generation of an individualized 3D deterministic model of the conducting part of the human tracheo-bronchial tree is described. Distinct initial conditions were obtained from the high-resolution computed tomography (HRCT) images of seven healthy volunteers. The algorithm developed is fractal in nature and is implemented as a self-similar space sub-division procedure. The expansion process utilizes physiologically realistic relationships and thresholds to produce an anatomically consistent human airway tree. The model was validated through extensive statistical analysis of the results and comparison of the most common morphological features with previously published morphometric studies and other equivalent models. The resulting trees were shown to be in good agreement with published human lung geometric characteristics and can be used to study, among other things, structure-function relationships in simulation studies.

  18. Building integral projection models: a user's guide

    PubMed Central

    Rees, Mark; Childs, Dylan Z; Ellner, Stephen P; Coulson, Tim

    2014-01-01

    In order to understand how changes in individual performance (growth, survival or reproduction) influence population dynamics and evolution, ecologists are increasingly using parameterized mathematical models. For continuously structured populations, where some continuous measure of individual state influences growth, survival or reproduction, integral projection models (IPMs) are commonly used. We provide a detailed description of the steps involved in constructing an IPM, explaining how to: (i) translate your study system into an IPM; (ii) implement your IPM; and (iii) diagnose potential problems with your IPM. We emphasize how the study organism's life cycle, and the timing of censuses, together determine the structure of the IPM kernel and important aspects of the statistical analysis used to parameterize an IPM using data on marked individuals. An IPM based on population studies of Soay sheep is used to illustrate the complete process of constructing, implementing and evaluating an IPM fitted to sample data. We then look at very general approaches to parameterizing an IPM, using a wide range of statistical techniques (e.g. maximum likelihood methods, generalized additive models, nonparametric kernel density estimators). Methods for selecting models for parameterizing IPMs are briefly discussed. We conclude with key recommendations and a brief overview of applications that extend the basic model. The online Supporting Information provides commented R code for all our analyses. PMID:24219157
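
    A minimal numerical sketch of the midpoint-rule discretization of an IPM kernel is given below; the vital-rate functions and coefficients are invented for illustration (in practice they come from regressions fitted to marked-individual data, and the paper's own worked code is in R).

```python
import numpy as np

# Minimal integral projection model (IPM) discretised with the midpoint rule.
def survival(z):                       # logistic survival vs. size z (toy coefficients)
    return 1 / (1 + np.exp(-(-2.0 + 0.8 * z)))

def growth_kernel(z_new, z):           # Gaussian growth to size z_new
    mu, sd = 1.0 + 0.7 * z, 0.5
    return np.exp(-(z_new - mu) ** 2 / (2 * sd ** 2)) / np.sqrt(2 * np.pi * sd ** 2)

def fecundity(z_new, z):               # offspring number x recruit-size density
    n_off, mu, sd = np.exp(-1.0 + 0.5 * z), 0.8, 0.3
    return n_off * np.exp(-(z_new - mu) ** 2 / (2 * sd ** 2)) / np.sqrt(2 * np.pi * sd ** 2)

n, z_lo, z_hi = 100, 0.0, 10.0
h = (z_hi - z_lo) / n
z = z_lo + (np.arange(n) + 0.5) * h            # midpoints of the size mesh
Z_new, Z = np.meshgrid(z, z, indexing="ij")

P = h * growth_kernel(Z_new, Z) * survival(Z)  # survival-growth kernel
F = h * fecundity(Z_new, Z)                    # reproduction kernel
K = P + F

lam = np.max(np.real(np.linalg.eigvals(K)))    # asymptotic population growth rate
print(f"lambda = {lam:.3f}")
```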

  19. Adding statistical regularity results in a global slowdown in visual search.

    PubMed

    Vaskevich, Anna; Luria, Roy

    2018-05-01

    Current statistical learning theories predict that embedding implicit regularities within a task should further improve online performance, beyond general practice. We challenged this assumption by contrasting performance in a visual search task containing either a consistent-mapping (regularity) condition, a random-mapping condition, or both conditions, mixed. Surprisingly, performance in a random visual search, without any regularity, was better than performance in a mixed design search that contained a beneficial regularity. This result was replicated using different stimuli and different regularities, suggesting that mixing consistent and random conditions leads to an overall slowing down of performance. Relying on the predictive-processing framework, we suggest that this global detrimental effect depends on the validity of the regularity: when its predictive value is low, as it is in the case of a mixed design, reliance on all prior information is reduced, resulting in a general slowdown. Our results suggest that our cognitive system does not maximize speed, but rather continues to gather and implement statistical information at the expense of a possible slowdown in performance. Copyright © 2018 Elsevier B.V. All rights reserved.

  20. BCM: toolkit for Bayesian analysis of Computational Models using samplers.

    PubMed

    Thijssen, Bram; Dijkstra, Tjeerd M H; Heskes, Tom; Wessels, Lodewyk F A

    2016-10-21

    Computational models in biology are characterized by a large degree of uncertainty. This uncertainty can be analyzed with Bayesian statistics, however, the sampling algorithms that are frequently used for calculating Bayesian statistical estimates are computationally demanding, and each algorithm has unique advantages and disadvantages. It is typically unclear, before starting an analysis, which algorithm will perform well on a given computational model. We present BCM, a toolkit for the Bayesian analysis of Computational Models using samplers. It provides efficient, multithreaded implementations of eleven algorithms for sampling from posterior probability distributions and for calculating marginal likelihoods. BCM includes tools to simplify the process of model specification and scripts for visualizing the results. The flexible architecture allows it to be used on diverse types of biological computational models. In an example inference task using a model of the cell cycle based on ordinary differential equations, BCM is significantly more efficient than existing software packages, allowing more challenging inference problems to be solved. BCM represents an efficient one-stop-shop for computational modelers wishing to use sampler-based Bayesian statistics.
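
    For orientation, the sketch below shows the simplest member of the sampler family such a toolkit bundles: a random-walk Metropolis sampler applied to a toy decay model. It is an independent illustration, not BCM's API or code.

```python
import numpy as np

def metropolis(log_post, x0, n_steps=5000, step=0.5, seed=0):
    """Generic random-walk Metropolis sampler for a log-posterior function."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    chain = np.empty((n_steps, x.size))
    for i in range(n_steps):
        prop = x + step * rng.standard_normal(x.size)
        lp_prop = log_post(prop)
        if np.log(rng.random()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        chain[i] = x
    return chain

# Toy 'computational model': posterior of a rate constant k given noisy
# observations of exponential decay y = exp(-k t).
t = np.linspace(0, 5, 20)
y_obs = np.exp(-0.8 * t) + 0.05 * np.random.default_rng(1).standard_normal(20)

def log_post(theta):
    k = theta[0]
    if k <= 0:                        # flat prior on k > 0
        return -np.inf
    resid = y_obs - np.exp(-k * t)
    return -0.5 * np.sum(resid ** 2) / 0.05 ** 2

samples = metropolis(log_post, x0=[1.0])
print("posterior mean k ≈", samples[1000:, 0].mean())
```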

  1. Variable system: An alternative approach for the analysis of mediated moderation.

    PubMed

    Kwan, Joyce Lok Yin; Chan, Wai

    2018-06-01

    Mediated moderation (meMO) occurs when the moderation effect of the moderator (W) on the relationship between the independent variable (X) and the dependent variable (Y) is transmitted through a mediator (M). To examine this process empirically, 2 different model specifications (Type I meMO and Type II meMO) have been proposed in the literature. However, both specifications are found to be problematic, either conceptually or statistically. For example, it can be shown that each type of meMO model is statistically equivalent to a particular form of moderated mediation (moME), another process that examines the condition when the indirect effect from X to Y through M varies as a function of W. Consequently, it is difficult for one to differentiate these 2 processes mathematically. This study therefore has 2 objectives. First, we attempt to differentiate moME and meMO by proposing an alternative specification for meMO. Conceptually, this alternative specification is intuitively meaningful and interpretable, and, statistically, it offers meMO a unique representation that is no longer identical to its moME counterpart. Second, using structural equation modeling, we propose an integrated approach for the analysis of meMO as well as for other general types of conditional path models. VS, a computer software program that implements the proposed approach, has been developed to facilitate the analysis of conditional path models for applied researchers. Real examples are considered to illustrate how the proposed approach works in practice and to compare its performance against the traditional methods. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  2. Look@NanoSIMS--a tool for the analysis of nanoSIMS data in environmental microbiology.

    PubMed

    Polerecky, Lubos; Adam, Birgit; Milucka, Jana; Musat, Niculina; Vagner, Tomas; Kuypers, Marcel M M

    2012-04-01

    We describe an open-source freeware programme for high throughput analysis of nanoSIMS (nanometre-scale secondary ion mass spectrometry) data. The programme implements basic data processing and analytical functions, including display and drift-corrected accumulation of scanned planes, interactive and semi-automated definition of regions of interest (ROIs), and export of the ROIs' elemental and isotopic composition in graphical and text-based formats. Additionally, the programme offers new functions that were custom-designed to address the needs of environmental microbiologists. Specifically, it allows manual and automated classification of ROIs based on the information that is derived either from the nanoSIMS dataset itself (e.g. from labelling achieved by halogen in situ hybridization) or is provided externally (e.g. as a fluorescence in situ hybridization image). Moreover, by implementing post-processing routines coupled to built-in statistical tools, the programme allows rapid synthesis and comparative analysis of results from many different datasets. After validation of the programme, we illustrate how these new processing and analytical functions increase flexibility, efficiency and depth of the nanoSIMS data analysis. Through its custom-made and open-source design, the programme provides an efficient, reliable and easily expandable tool that can help a growing community of environmental microbiologists and researchers from other disciplines process and analyse their nanoSIMS data. © 2012 Society for Applied Microbiology and Blackwell Publishing Ltd.

  3. Facility Monitoring: A Qualitative Theory for Sensor Fusion

    NASA Technical Reports Server (NTRS)

    Figueroa, Fernando

    2001-01-01

    Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
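
    A toy version of the qualitative feature extraction performed by a highly autonomous sensor is sketched below; the detection rules and thresholds (robust z-score on differences for spikes, trend-line slope for drift, a fixed band for off-limit excursions) are illustrative assumptions rather than the paper's definitions.

```python
import numpy as np

def qualitative_features(signal, limits=(-5.0, 5.0), spike_z=4.0, drift_tol=0.01):
    """Reduce a raw sample stream to simple qualitative events: spikes,
    drift, and off-limit excursions. Thresholds are illustrative only."""
    x = np.asarray(signal, dtype=float)
    report = {}
    # Spikes: sample-to-sample jumps far from typical, in robust z-score terms.
    d = np.diff(x)
    mad = np.median(np.abs(d - np.median(d))) + 1e-12
    z = np.abs(d - np.median(d)) / (1.4826 * mad)
    report["spike_indices"] = np.where(z > spike_z)[0] + 1
    # Drift: slope of a least-squares trend line through the record.
    slope = np.polyfit(np.arange(len(x)), x, 1)[0]
    report["drift_detected"] = bool(abs(slope) > drift_tol)
    # Off-limit excursions: samples outside the allowed operating band.
    lo, hi = limits
    report["off_limit_indices"] = np.where((x < lo) | (x > hi))[0]
    return report

# A slowly drifting signal with one spike near sample 120.
rng = np.random.default_rng(2)
sig = 0.02 * np.arange(200) + 0.1 * rng.standard_normal(200)
sig[120] += 3.0
print(qualitative_features(sig))
```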

  4. Horsetail matching: a flexible approach to optimization under uncertainty

    NASA Astrophysics Data System (ADS)

    Cook, L. W.; Jarrett, J. P.

    2018-04-01

    It is important to design engineering systems to be robust with respect to uncertainties in the design process. Often, this is done by considering statistical moments, but over-reliance on statistical moments when formulating a robust optimization can produce designs that are stochastically dominated by other feasible designs. This article instead proposes a formulation for optimization under uncertainty that minimizes the difference between a design's cumulative distribution function and a target. A standard target is proposed that produces stochastically non-dominated designs, but the formulation also offers enough flexibility to recover existing approaches for robust optimization. A numerical implementation is developed that employs kernels to give a differentiable objective function. The method is applied to algebraic test problems and a robust transonic airfoil design problem where it is compared to multi-objective, weighted-sum and density matching approaches to robust optimization; several advantages over these existing methods are demonstrated.
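
    The core idea, measuring the distance between a design's empirical inverse-CDF and a target, can be sketched in a few lines; the constant "ideal value" target and the omission of the kernel smoothing that makes the metric differentiable are simplifications for illustration.

```python
import numpy as np

def horsetail_metric(samples, target):
    """Illustrative horsetail-matching objective: the L2 difference between a
    design's empirical inverse-CDF (sorted samples of the quantity of
    interest) and a target inverse-CDF t(h)."""
    q = np.sort(np.asarray(samples, dtype=float))
    h = (np.arange(len(q)) + 0.5) / len(q)        # plotting positions in (0,1)
    return np.sqrt(np.mean((q - target(h)) ** 2))

# Compare two hypothetical designs under the same input uncertainty
# (quantity of interest is a penalty, so lower is better).
rng = np.random.default_rng(3)
design_a = 1.00 + 0.10 * rng.standard_normal(2000)           # low mean, low spread
design_b = 0.95 + 0.40 * np.abs(rng.standard_normal(2000))   # heavier upper tail

target = lambda h: 0.9 * np.ones_like(h)                     # ideal value for every h
print("design A:", horsetail_metric(design_a, target))
print("design B:", horsetail_metric(design_b, target))
```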

  5. Load research manual. Volume 2. Fundamentals of implementing load research procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brandenburg, L.; Clarkson, G.; Grund, Jr., C.

    This three-volume manual presents technical guidelines for electric utility load research. Special attention is given to issues raised by the load data reporting requirements of the Public Utility Regulatory Policies Act of 1978 and to problems faced by smaller utilities that are initiating load research programs. In Volumes 1 and 2, procedures are suggested for determining data requirements for load research, establishing the size and customer composition of a load survey sample, selecting and using equipment to record customer electricity usage, processing data tapes from the recording equipment, and analyzing the data. Statistical techniques used in customer sampling are discussed in detail. The costs of load research also are estimated, and ongoing load research programs at three utilities are described. The manual includes guides to load research literature and glossaries of load research and statistical terms.

  6. Enabling High-Energy, High-Voltage Lithium-Ion Cells: Standardization of Coin-Cell Assembly, Electrochemical Testing, and Evaluation of Full Cells

    DOE PAGES

    Long, Brandon R.; Rinaldo, Steven G.; Gallagher, Kevin G.; ...

    2016-11-09

    Coin-cells are often the test format of choice for laboratories engaged in battery research and development as they provide a convenient platform for rapid testing of new materials on a small scale. However, obtaining reliable, reproducible data via the coin-cell format is inherently difficult, particularly in the full-cell configuration. In addition, statistical evaluation to prove the consistency and reliability of such data is often neglected. Herein we report on several studies aimed at formalizing physical process parameters and coin-cell construction related to full cells. Statistical analysis and performance benchmarking approaches are advocated as a means to more confidently track changes in cell performance. Finally, we show that trends in the electrochemical data obtained from coin-cells can be reliable and informative when standardized approaches are implemented in a consistent manner.

  7. Influence of radiation on metastability-based TRNG

    NASA Astrophysics Data System (ADS)

    Wieczorek, Piotr Z.; Wieczorek, Zbigniew

    2017-08-01

    This paper presents a True Random Number Generator (TRNG) based on flip-flops with violated timing constraints. The proposed circuit has been implemented in a Xilinx Spartan 6 device. The TRNG circuit utilizes the metastability phenomenon as a source of randomness; therefore, the paper discusses the influence of timing constraints on flip-flop metastability proximity. The metastable range of operation enhances the influence of noise on flip-flop behavior. Therefore, the influence of an external stochastic source on flip-flop operation is also investigated; for this purpose, a radioactive source was used. According to the results shown in the paper, radiation increases the unpredictability of the metastable process in the flip-flops that serve as the randomness source of the TRNG. The statistical properties of the TRNG operating under increased radiation were verified with the NIST battery of statistical tests.
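
    One element of such a verification, the NIST SP 800-22 frequency (monobit) test, is sketched below on a software-generated stream standing in for the TRNG output; the full battery applies many further tests.

```python
import math
import numpy as np

def monobit_test(bits):
    """NIST SP 800-22 frequency (monobit) test: checks whether the numbers of
    ones and zeros in the bit stream are close enough to equal for a random
    sequence. A single test only, not the full battery."""
    bits = np.asarray(bits, dtype=int)
    s = np.sum(2 * bits - 1)                 # +1 for each 1, -1 for each 0
    s_obs = abs(s) / math.sqrt(len(bits))
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value                           # conventionally pass if p >= 0.01

# Toy check on a software PRNG stream (stand-in for the TRNG output).
stream = np.random.default_rng(4).integers(0, 2, size=100_000)
print(f"monobit p-value = {monobit_test(stream):.3f}")
```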

  8. An integrated user-friendly ArcMAP tool for bivariate statistical modeling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusof, Z.; Tehrany, M. S.

    2014-10-01

    Modeling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modeling. Bivariate statistical analysis (BSA) assists in hazard modeling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, BSM (bivariate statistical modeler), for BSA technique is proposed. Three popular BSA techniques such as frequency ratio, weights-of-evidence, and evidential belief function models are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and is created by a simple graphical user interface, which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.
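
    The frequency-ratio calculation at the heart of one of these techniques can be illustrated outside ArcMAP with a small raster example; the arrays and class values below are hypothetical.

```python
import numpy as np

def frequency_ratio(factor_classes, hazard_mask):
    """Frequency-ratio (FR) values for each class of one conditioning factor:
    the share of hazard cells falling in a class divided by that class's
    share of all cells. FR > 1 marks classes over-represented among hazard
    locations."""
    factor_classes = np.asarray(factor_classes).ravel()
    hazard_mask = np.asarray(hazard_mask, dtype=bool).ravel()
    out = {}
    n_total, n_hazard = len(factor_classes), hazard_mask.sum()
    for c in np.unique(factor_classes):
        in_class = factor_classes == c
        pct_hazard = (hazard_mask & in_class).sum() / n_hazard
        pct_area = in_class.sum() / n_total
        out[int(c)] = pct_hazard / pct_area
    return out

# Toy raster: slope reclassified into 3 classes, plus landslide inventory cells.
slope_class = np.array([[1, 1, 2, 3], [1, 2, 2, 3], [2, 3, 3, 3]])
landslides  = np.array([[0, 0, 0, 1], [0, 0, 1, 1], [0, 0, 1, 1]])
print(frequency_ratio(slope_class, landslides))
```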

  9. An integrated user-friendly ArcMAP tool for bivariate statistical modelling in geoscience applications

    NASA Astrophysics Data System (ADS)

    Jebur, M. N.; Pradhan, B.; Shafri, H. Z. M.; Yusoff, Z. M.; Tehrany, M. S.

    2015-03-01

    Modelling and classification difficulties are fundamental issues in natural hazard assessment. A geographic information system (GIS) is a domain that requires users to use various tools to perform different types of spatial modelling. Bivariate statistical analysis (BSA) assists in hazard modelling. To perform this analysis, several calculations are required and the user has to transfer data from one format to another. Most researchers perform these calculations manually by using Microsoft Excel or other programs. This process is time-consuming and carries a degree of uncertainty. The lack of proper tools to implement BSA in a GIS environment prompted this study. In this paper, a user-friendly tool, bivariate statistical modeler (BSM), for BSA technique is proposed. Three popular BSA techniques, such as frequency ratio, weight-of-evidence (WoE), and evidential belief function (EBF) models, are applied in the newly proposed ArcMAP tool. This tool is programmed in Python and created by a simple graphical user interface (GUI), which facilitates the improvement of model performance. The proposed tool implements BSA automatically, thus allowing numerous variables to be examined. To validate the capability and accuracy of this program, a pilot test area in Malaysia is selected and all three models are tested by using the proposed program. Area under curve (AUC) is used to measure the success rate and prediction rate. Results demonstrate that the proposed program executes BSA with reasonable accuracy. The proposed BSA tool can be used in numerous applications, such as natural hazard, mineral potential, hydrological, and other engineering and environmental applications.

  10. Strategists and Non-Strategists in Austrian Enterprises—Statistical Approaches

    NASA Astrophysics Data System (ADS)

    Duller, Christine

    2011-09-01

    The purpose of this work is to determine, with a modern statistical approach, which variables can indicate whether an arbitrary enterprise uses strategic management as its basic business concept. "Strategic management is an ongoing process that evaluates and controls the business and the industries in which the company is involved; assesses its competitors and sets goals and strategies to meet all existing and potential competitors; and then reassesses each strategy annually or quarterly (i.e. regularly) to determine how it has been implemented and whether it has succeeded or needs replacement by a new strategy to meet changed circumstances, new technology, new competitors, a new economic environment or a new social, financial or political environment." [12] In Austria, 70% to 80% of all enterprises can be classified as family firms. In the literature, the empirically untested hypothesis can be found that family firms tend to have less formalised management accounting systems than non-family enterprises. But it is unknown whether the use of strategic management accounting systems is influenced more by structure (family or non-family enterprise) or by size (number of employees). Therefore, the goal is to split enterprises into two subgroups, namely strategists and non-strategists, and to obtain information on the variables of influence (size, structure, branch of industry, etc.). Two statistical approaches are used: on the one hand, a classical cluster analysis is implemented to construct the two subgroups, and on the other hand, a latent class model is built for this problem. After a description of the theoretical background, first results of both strategies are compared.
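
    The first of the two approaches, a two-group cluster analysis, can be sketched as follows on synthetic firm data; the variables and their relationships are invented for illustration, and the latent class model is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Split synthetic firms into two clusters and inspect which variables
# separate 'strategists' from 'non-strategists'. Real analyses would use
# the Austrian survey responses rather than this simulated data.
rng = np.random.default_rng(5)
n = 200
employees   = rng.lognormal(mean=3.0, sigma=1.0, size=n)     # firm size
family_firm = rng.integers(0, 2, size=n)                     # 1 = family firm
# Synthetic 'uses strategic management' score, loosely tied to size and structure.
strategy_score = 0.4 * np.log(employees) - 0.3 * family_firm + rng.normal(0, 0.5, n)

X = StandardScaler().fit_transform(
    np.column_stack([employees, family_firm, strategy_score]))
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

for g in (0, 1):
    print(f"cluster {g}: mean employees = {employees[labels == g].mean():.0f}, "
          f"family-firm share = {family_firm[labels == g].mean():.2f}")
```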

  11. Process Evaluation and Costing of a Multifaceted Population-Wide Intervention to Reduce Salt Consumption in Fiji.

    PubMed

    Webster, Jacqui; Pillay, Arti; Suku, Arleen; Gohil, Paayal; Santos, Joseph Alvin; Schultz, Jimaima; Wate, Jillian; Trieu, Kathy; Hope, Silvia; Snowdon, Wendy; Moodie, Marj; Jan, Stephen; Bell, Colin

    2018-01-30

    This paper reports the process evaluation and costing of a national salt reduction intervention in Fiji. The population-wide intervention included engaging food industry to reduce salt in foods, strategic health communication and a hospital program. The evaluation showed a 1.4 g/day drop in salt intake from the 11.7 g/day at baseline; however, this was not statistically significant. To better understand intervention implementation, we collated data to assess intervention fidelity, reach, context and costs. Government and management changes affected intervention implementation, meaning fidelity was relatively low. There was no active mechanism for ensuring food companies adhered to the voluntary salt reduction targets. Communication activities had wide reach but most activities were one-off, meaning the overall dose was low and impact on behavior limited. Intervention costs were moderate (FJD $277,410 or $0.31 per person) but the strategy relied on multi-sector action which was not fully operationalised. The cyclone also delayed monitoring and likely impacted the results. However, 73% of people surveyed had heard about the campaign and salt reduction policies have been mainstreamed into government programs. Longer-term monitoring of salt intake is planned through future surveys and lessons from this process evaluation will be used to inform future strategies in the Pacific Islands and globally.

  12. Process Evaluation and Costing of a Multifaceted Population-Wide Intervention to Reduce Salt Consumption in Fiji

    PubMed Central

    Webster, Jacqui; Pillay, Arti; Suku, Arleen; Gohil, Paayal; Santos, Joseph Alvin; Schultz, Jimaima; Wate, Jillian; Trieu, Kathy; Hope, Silvia; Snowdon, Wendy; Moodie, Marj; Jan, Stephen; Bell, Colin

    2018-01-01

    This paper reports the process evaluation and costing of a national salt reduction intervention in Fiji. The population-wide intervention included engaging food industry to reduce salt in foods, strategic health communication and a hospital program. The evaluation showed a 1.4 g/day drop in salt intake from the 11.7 g/day at baseline; however, this was not statistically significant. To better understand intervention implementation, we collated data to assess intervention fidelity, reach, context and costs. Government and management changes affected intervention implementation, meaning fidelity was relatively low. There was no active mechanism for ensuring food companies adhered to the voluntary salt reduction targets. Communication activities had wide reach but most activities were one-off, meaning the overall dose was low and impact on behavior limited. Intervention costs were moderate (FJD $277,410 or $0.31 per person) but the strategy relied on multi-sector action which was not fully operationalised. The cyclone also delayed monitoring and likely impacted the results. However, 73% of people surveyed had heard about the campaign and salt reduction policies have been mainstreamed into government programs. Longer-term monitoring of salt intake is planned through future surveys and lessons from this process evaluation will be used to inform future strategies in the Pacific Islands and globally. PMID:29385758

  13. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: Dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, Sparse tensorization methods[2] utilizing node-nested hierarchies, Sampling methods[4] for high-dimensional random variable spaces.
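
    As a minimal illustration of the sampling ingredient listed above, the sketch below computes a Monte Carlo estimate of an output statistic together with its central-limit sampling-error bound; it does not reproduce the package's tensorization bases or its a-posteriori discretization-error bounds.

```python
import numpy as np

def mc_mean_with_bound(model, sampler, n_samples=10_000, conf_z=1.96, seed=6):
    """Plain Monte Carlo estimate of an output statistic plus a sampling
    error bound (central-limit standard error, 95% by default)."""
    rng = np.random.default_rng(seed)
    outputs = np.array([model(sampler(rng)) for _ in range(n_samples)])
    mean = outputs.mean()
    half_width = conf_z * outputs.std(ddof=1) / np.sqrt(n_samples)
    return mean, half_width

# Toy stand-in for a CFD output: a drag-like quantity of two uncertain inputs.
model = lambda xi: 1.0 + 0.3 * xi[0] + 0.1 * xi[1] ** 2
sampler = lambda rng: rng.standard_normal(2)
mean, hw = mc_mean_with_bound(model, sampler)
print(f"E[Q] ≈ {mean:.4f} ± {hw:.4f} (sampling error only)")
```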

  14. Evaluation of the impact of fetal fibronectin test implementation on hospital admissions for preterm labour in Ontario: a multiple baseline time-series design.

    PubMed

    Fell, D B; Sprague, A E; Grimshaw, J M; Yasseen, A S; Coyle, D; Dunn, S I; Perkins, S L; Peterson, W E; Johnson, M; Bunting, P S; Walker, M C

    2014-03-01

    To determine the impact of a health system-wide fetal fibronectin (fFN) testing programme on the rates of hospital admission for preterm labour (PTL). Multiple baseline time-series design. Canadian province of Ontario. A retrospective population-based cohort of antepartum and delivered obstetrical admissions in all Ontario hospitals between 1 April 2002 and 31 March 2010. International Classification of Diseases codes in a health system-wide hospital administrative database were used to identify the study population and define the outcome measure. An aggregate time series of monthly rates of hospital admissions for PTL was analysed using segmented regression models after aligning the fFN test implementation date for each institution. Rate of obstetrical hospital admission for PTL. Estimated rates of hospital admission for PTL following fFN implementation were lower than predicted had pre-implementation trends prevailed. The reduction in the rate was modest, but statistically significant, when estimated at 12 months following fFN implementation (-0.96 hospital admissions for PTL per 100 preterm births; 95% confidence interval [CI], -1.02 to -0.90, P = 0.04). The statistically significant reduction was sustained at 24 and 36 months following implementation. Using a robust quasi-experimental study design to overcome confounding as a result of underlying secular trends or concurrent interventions, we found evidence of a small but statistically significant reduction in the health system-level rate of hospital admissions for PTL following implementation of fFN testing in a large Canadian province. © 2013 Royal College of Obstetricians and Gynaecologists.
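
    The segmented-regression step can be illustrated for a single series as follows; the simulated data, implementation month, and coefficients are assumptions for illustration, and the actual study aligned implementation dates across hospitals in a multiple-baseline design.

```python
import numpy as np
import statsmodels.api as sm

# Minimal segmented-regression sketch for one interrupted time series:
# monthly admission rates regressed on time, a post-implementation level
# shift, and a post-implementation slope change.
rng = np.random.default_rng(7)
months = np.arange(96)                       # 8 years of monthly rates
impl = 60                                    # implementation month (assumed)
level = (months >= impl).astype(float)       # step change after implementation
slope = np.clip(months - impl, 0, None)      # slope change after implementation

true_rate = 10 - 0.01 * months - 0.9 * level - 0.02 * slope
rate = true_rate + rng.normal(0, 0.4, size=months.size)

X = sm.add_constant(np.column_stack([months, level, slope]))
fit = sm.OLS(rate, X).fit()
print(fit.params)   # [intercept, pre-trend, level change, slope change]
```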

  15. The System-Wide Effect of Real-Time Audiovisual Feedback and Postevent Debriefing for In-Hospital Cardiac Arrest: The Cardiopulmonary Resuscitation Quality Improvement Initiative.

    PubMed

    Couper, Keith; Kimani, Peter K; Abella, Benjamin S; Chilwan, Mehboob; Cooke, Matthew W; Davies, Robin P; Field, Richard A; Gao, Fang; Quinton, Sarah; Stallard, Nigel; Woolley, Sarah; Perkins, Gavin D

    2015-11-01

    To evaluate the effect of implementing real-time audiovisual feedback with and without postevent debriefing on survival and quality of cardiopulmonary resuscitation quality at in-hospital cardiac arrest. A two-phase, multicentre prospective cohort study. Three UK hospitals, all part of one National Health Service Acute Trust. One thousand three hundred and ninety-five adult patients who sustained an in-hospital cardiac arrest at the study hospitals and were treated by hospital emergency teams between November 2009 and May 2013. During phase 1, quality of cardiopulmonary resuscitation and patient outcomes were measured with no intervention implemented. During phase 2, staff at hospital 1 received real-time audiovisual feedback, whereas staff at hospital 2 received real-time audiovisual feedback supplemented by postevent debriefing. No intervention was implemented at hospital 3 during phase 2. The primary outcome was return of spontaneous circulation. Secondary endpoints included other patient-focused outcomes, such as survival to hospital discharge, and process-focused outcomes, such as chest compression depth. Random-effect logistic and linear regression models, adjusted for baseline patient characteristics, were used to analyze the effect of the interventions on study outcomes. In comparison with no intervention, neither real-time audiovisual feedback (adjusted odds ratio, 0.62; 95% CI, 0.31-1.22; p=0.17) nor real-time audiovisual feedback supplemented by postevent debriefing (adjusted odds ratio, 0.65; 95% CI, 0.35-1.21; p=0.17) was associated with a statistically significant improvement in return of spontaneous circulation or any process-focused outcome. Despite this, there was evidence of a system-wide improvement in phase 2, leading to improvements in return of spontaneous circulation (adjusted odds ratio, 1.87; 95% CI, 1.06-3.30; p=0.03) and process-focused outcomes. Implementation of real-time audiovisual feedback with or without postevent debriefing did not lead to a measured improvement in patient or process-focused outcomes at individual hospital sites. However, there was an unexplained system-wide improvement in return of spontaneous circulation and process-focused outcomes during the second phase of the study.

  16. A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping

    PubMed Central

    2011-01-01

    Background The goal of this study was to assess potential differences between administrators/policymakers and those involved in direct practice regarding factors believed to be barriers or facilitating factors to evidence-based practice (EBP) implementation in a large public mental health service system in the United States. Methods Participants included mental health system county officials, agency directors, program managers, clinical staff, administrative staff, and consumers. As part of concept mapping procedures, brainstorming groups were conducted with each target group to identify specific factors believed to be barriers or facilitating factors to EBP implementation in a large public mental health system. Statements were sorted by similarity and rated by each participant in regard to their perceived importance and changeability. Multidimensional scaling, cluster analysis, descriptive statistics and t-tests were used to analyze the data. Results A total of 105 statements were distilled into 14 clusters using concept-mapping procedures. Perceptions of importance of factors affecting EBP implementation varied between the two groups, with those involved in direct practice assigning significantly higher ratings to the importance of Clinical Perceptions and the impact of EBP implementation on clinical practice. Consistent with previous studies, financial concerns (costs, funding) were rated among the most important and least likely to change by both groups. Conclusions EBP implementation is a complex process, and different stakeholders may hold different opinions regarding the relative importance of the impact of EBP implementation. Implementation efforts must include input from stakeholders at multiple levels to bring divergent and convergent perspectives to light. PMID:21899754

  17. A comparison of policy and direct practice stakeholder perceptions of factors affecting evidence-based practice implementation using concept mapping.

    PubMed

    Green, Amy E; Aarons, Gregory A

    2011-09-07

    The goal of this study was to assess potential differences between administrators/policymakers and those involved in direct practice regarding factors believed to be barriers or facilitating factors to evidence-based practice (EBP) implementation in a large public mental health service system in the United States. Participants included mental health system county officials, agency directors, program managers, clinical staff, administrative staff, and consumers. As part of concept mapping procedures, brainstorming groups were conducted with each target group to identify specific factors believed to be barriers or facilitating factors to EBP implementation in a large public mental health system. Statements were sorted by similarity and rated by each participant in regard to their perceived importance and changeability. Multidimensional scaling, cluster analysis, descriptive statistics and t-tests were used to analyze the data. A total of 105 statements were distilled into 14 clusters using concept-mapping procedures. Perceptions of importance of factors affecting EBP implementation varied between the two groups, with those involved in direct practice assigning significantly higher ratings to the importance of Clinical Perceptions and the impact of EBP implementation on clinical practice. Consistent with previous studies, financial concerns (costs, funding) were rated among the most important and least likely to change by both groups. EBP implementation is a complex process, and different stakeholders may hold different opinions regarding the relative importance of the impact of EBP implementation. Implementation efforts must include input from stakeholders at multiple levels to bring divergent and convergent perspectives to light.

  18. Development and Implementation of a Mental Health Work Rehabilitation Program: Results of a Developmental Evaluation.

    PubMed

    Sylvain, Chantal; Durand, Marie-José; Velasquez Sanchez, Astrid; Lessard, Nathalie; Maillette, Pascale

    2018-05-23

    Purpose Long-term work disability due to common mental disorders (CMDs) is a growing problem. Yet optimal interventions remain unclear and little is known about implementation challenges in everyday practice. This study aimed to support and evaluate, in real time, the development and implementation of a work rehabilitation program (WRP) designed to promote post-CMD return-to-work (RTW). Methods A 2-year developmental evaluation was performed using a participatory approach. At program outset, the researchers held five work meetings to revise the program's logic model and discuss its underlying change theory with clinicians. Data collection tools used throughout the study period were structured charts of activities conducted with workers (n = 41); in-depth interviews with program clinicians and managers (n = 9); and participant observation during work meetings. Quantitative data were analyzed using descriptive statistics. Qualitative data underwent thematic analysis using a processual approach. Results Three types of activity were developed and implemented: individual and group interventions targeting workers, and joint activities targeting partners (physicians, employers, others). While worker-targeted activities were generally implemented as planned, joint activities were sporadic. Analysis of the implementation process revealed five challenges faced by clinicians. Determinants included clinicians, host organization, sociopolitical context and resources provided by the evaluation. Conclusion The program studied is original in that it is based on the best available scientific knowledge, yet adapted to contextual particularities. The identified implementation challenges highlight the need for greater importance to be placed on the external, non-program context to ensure sustainable implementation in everyday practice.

  19. Design of Student Information Management Database Application System for Office and Departmental Target Responsibility System

    NASA Astrophysics Data System (ADS)

    Zhou, Hui

    Carrying out an office and departmental target responsibility system is an inevitable outcome of higher education reform, and the statistical processing of student information is an important part of student performance review under such a system. On the basis of an analysis of student evaluation, this paper designs a student information management database application system using relational database management system software. To implement the student information management functions, the functional requirements, overall structure, data sheets and fields, data sheet associations, and software code are designed in detail.
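
    A hypothetical minimal schema of the kind of data sheets, fields, and associations such a system requires is sketched below; the table and column names are invented for the example, not taken from the original system.

```python
import sqlite3

# Minimal relational schema: a student sheet and an evaluation sheet linked
# by a foreign key, plus one statistical query of the kind used in a
# departmental target responsibility review.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE student (
    student_id   INTEGER PRIMARY KEY,
    name         TEXT NOT NULL,
    department   TEXT NOT NULL
);
CREATE TABLE evaluation (
    eval_id      INTEGER PRIMARY KEY,
    student_id   INTEGER NOT NULL REFERENCES student(student_id),
    term         TEXT NOT NULL,
    score        REAL NOT NULL
);
""")
conn.execute("INSERT INTO student VALUES (1, 'Li Wei', 'Mathematics')")
conn.execute("INSERT INTO evaluation VALUES (1, 1, '2010-1', 87.5)")

# Average evaluation score per department.
for row in conn.execute("""
    SELECT s.department, AVG(e.score)
    FROM student s JOIN evaluation e ON e.student_id = s.student_id
    GROUP BY s.department
"""):
    print(row)
```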

  20. Implementation and audit of 'Fast-Track Surgery' in gynaecological oncology surgery.

    PubMed

    Sidhu, Verinder S; Lancaster, Letitia; Elliott, David; Brand, Alison H

    2012-08-01

    Fast-track surgery is a multidisciplinary approach to surgery that results in faster recovery from surgery and decreased length of stay (LOS). The aims of this study were as follows: (i) to report on the processes required for the introduction of fast-track surgery to a gynaecological oncology unit and (ii) to report the results of a clinical audit conducted after the protocol's implementation. A fast-track protocol, specific to our unit, was developed after a series of multidisciplinary meetings. The protocol, agreed upon by those involved in the care of women in our unit, was then introduced into clinical practice. An audit was conducted of all women undergoing laparotomy, with known or suspected malignancy. Information on LOS, complication and readmission rates was collected. Descriptive statistics and Poisson regression were used for statistical analysis. The developed protocol involved a multidisciplinary approach to pre-, intra- and postoperative care. The audit included 104 consecutive women over a 6-month period, who were followed for 6 weeks postoperatively. The median LOS was 4 days. The readmission rate was 7% and the complication rate was 19% (1% intraoperative, 4% major and 14% minor). Multivariate analysis revealed that increased duration of surgery and increasing age were predictors of longer LOS. The development of a fast-track protocol is achievable in a gynaecological oncology unit, with input from a multidisciplinary team. Effective implementation of the protocol can result in a short LOS, with acceptable complication and readmission rates when applied non-selectively to gynaecological oncology patients. © 2012 The Authors ANZJOG © 2012 The Royal Australian and New Zealand College of Obstetricians and Gynaecologists.
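
    The audit's statistical step can be illustrated with a Poisson regression of LOS on candidate predictors, fitted here with statsmodels on simulated data. The variable names, sample values, and coefficients below are assumptions for the sketch, not the study's data or results.

    ```python
    # Illustrative Poisson regression of length of stay (LOS, in days) on age and
    # duration of surgery, echoing the audit's analysis. Data are simulated; the
    # fitted coefficients are not the study's results.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 104
    df = pd.DataFrame({
        "age": rng.normal(62, 10, n),             # patient age in years
        "op_minutes": rng.normal(150, 40, n),     # duration of surgery
    })
    rate = np.exp(0.6 + 0.01 * df["age"] + 0.004 * df["op_minutes"])
    df["los"] = rng.poisson(rate)                 # simulated LOS counts

    model = smf.poisson("los ~ age + op_minutes", data=df).fit()
    print(model.summary())
    ```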

  1. System development over the monitoring for the purpose of early warning of population from the threat of landslides. (Invited)

    NASA Astrophysics Data System (ADS)

    Zakhidova, D. V.; Kadyrhodjaev, A.; Scientific Team Of Hydroengeo Institute On Natural Hazards

    2010-12-01

    Timely warning of the population about possible landslide threats is one of the main prerequisites for safe and stable national development. A monitoring system for dangerous geological processes includes such components as observation, forecasting, control, and management, with forecasting occupying a special place. Given a sufficiently long series of observations, regularities in the phenomena can be revealed and used as a basis for forecasting. We reviewed many forecasting approaches used in different countries. Analysis of the available work leads to the conclusion that landslide forecasting requires a systems approach that takes the interacting components of nature into account. The study of landslide processes shows that they lie within the engineering-geological branch of the science and also interact with tectonics, geomorphology, hydrogeology, hydrology, climate change, technogenesis, and so on. Given the need for a systems approach and the achievements of modern science and technology, the most expedient basis for decision-making in landslide forecasting is a probabilistic-statistical method that combines geological and satellite data, with imagery processed in geoinformation systems. A probabilistic-statistical approach, reflecting the characteristics of the interacting natural system, makes it possible to account for the multi-factor processes of landslide activation. Among the many factors influencing landslide activation, some cannot be characterized numerically: their parameters are descriptive and qualitative rather than quantitative, and ignoring them is not reasonable. The proposed approach has the additional advantage of accommodating non-numeric as well as numerical parameters. The combination of a multidisciplinary, systematic treatment, multi-factor accounting of conditions, probabilistic and statistical methods of calculation, the joint use of geological and satellite data, and modern technology for processing and analysing information is brought together in the authors' approach to delineating areas of possible landslide activation. The proposed method could become part of a monitoring system for early warning of landslide activation. The authors therefore propose to initiate the project “System development over the monitoring for the purpose of early warning of population from the threat of landslides”. Implementation of the project is expected to deliver: 1. a system of geo-indicators for early warning of fast-moving landslide processes; 2. a unified, interconnected system of remote, surface, and underground observations of the geo-indicators; and 3. a system for notifying the population of an impending threat by means of alerts, light signals, and mobilization of municipalities and the relevant ministries. Economic, technical, and social outputs are expected as a result of the project.

  2. Improving performances of the knee replacement surgery process by applying DMAIC principles.

    PubMed

    Improta, Giovanni; Balato, Giovanni; Romano, Maria; Ponsiglione, Alfonso Maria; Raiola, Eliana; Russo, Mario Alessandro; Cuccaro, Patrizia; Santillo, Liberatina Carmela; Cesarelli, Mario

    2017-12-01

    The work is part of a project on the application of Lean Six Sigma to improve health care processes. A previously published work regarding hip replacement surgery has shown promising results. Here, we propose an application of the DMAIC (Define, Measure, Analyse, Improve, and Control) cycle to improve quality and reduce costs related to prosthetic knee replacement surgery by decreasing patients' length of hospital stay (LOS). The DMAIC cycle was adopted to decrease patients' LOS. The University Hospital "Federico II" of Naples, one of the most important university hospitals in Southern Italy, participated in this study. Data on 148 patients who underwent prosthetic knee replacement between 2010 and 2013 were used. Process mapping, statistical measures, brainstorming activities, and comparative analysis were performed to identify factors influencing LOS and improvement strategies. The study allowed the identification of variables influencing the prolongation of the LOS and the implementation of corrective actions to improve the process of care. The adopted actions reduced the LOS by 42%, from a mean value of 14.2 to 8.3 days (the standard deviation also decreased, from 5.2 to 2.3 days). The DMAIC approach has proven to be a helpful strategy ensuring a significant decrease in LOS. Furthermore, through its implementation, a significant reduction in the average costs of hospital stay can be achieved. Such a versatile approach could be applied to improve a wide range of health care processes. © 2017 John Wiley & Sons, Ltd.

  3. Path statistics, memory, and coarse-graining of continuous-time random walks on networks

    PubMed Central

    Kion-Crosby, Willow; Morozov, Alexandre V.

    2015-01-01

    Continuous-time random walks (CTRWs) on discrete state spaces, ranging from regular lattices to complex networks, are ubiquitous across physics, chemistry, and biology. Models with coarse-grained states (for example, those employed in studies of molecular kinetics) or spatial disorder can give rise to memory and non-exponential distributions of waiting times and first-passage statistics. However, existing methods for analyzing CTRWs on complex energy landscapes do not address these effects. Here we use statistical mechanics of the nonequilibrium path ensemble to characterize first-passage CTRWs on networks with arbitrary connectivity, energy landscape, and waiting time distributions. Our approach can be applied to calculating higher moments (beyond the mean) of path length, time, and action, as well as statistics of any conservative or non-conservative force along a path. For homogeneous networks, we derive exact relations between length and time moments, quantifying the validity of approximating a continuous-time process with its discrete-time projection. For more general models, we obtain recursion relations, reminiscent of transfer matrix and exact enumeration techniques, to efficiently calculate path statistics numerically. We have implemented our algorithm in PathMAN (Path Matrix Algorithm for Networks), a Python script that users can apply to their model of choice. We demonstrate the algorithm on a few representative examples which underscore the importance of non-exponential distributions, memory, and coarse-graining in CTRWs. PMID:26646868
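
    A minimal Monte Carlo sketch of the quantities discussed above (first-passage time and path-length statistics for a CTRW with non-exponential waiting times on a small network) is given below. It is not the PathMAN algorithm; the network, the Pareto waiting-time model, and the node labels are assumptions made for illustration.

    ```python
    # Small Monte Carlo sketch of a continuous-time random walk (CTRW) on a network
    # with heavy-tailed (non-exponential) waiting times, estimating first-passage
    # time and path-length statistics from a source to a target node.
    import numpy as np

    rng = np.random.default_rng(2)
    # Adjacency list of a small ring network with a chord.
    neighbors = {0: [1, 4], 1: [0, 2], 2: [1, 3, 0], 3: [2, 4], 4: [3, 0]}
    source, target = 0, 3

    def first_passage(alpha=2.5):
        """One CTRW realization; Pareto(alpha) waiting time at each node."""
        node, t, steps = source, 0.0, 0
        while node != target:
            t += rng.pareto(alpha) + 1.0          # heavy-tailed waiting time
            node = rng.choice(neighbors[node])    # unbiased jump to a neighbour
            steps += 1
        return t, steps

    samples = np.array([first_passage() for _ in range(20000)])
    times, lengths = samples[:, 0], samples[:, 1]
    print("mean first-passage time:", times.mean(),
          " second moment:", (times ** 2).mean())
    print("mean path length:", lengths.mean())
    ```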

  4. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    PubMed

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
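
    CERENA itself is a MATLAB toolbox; as a plain-Python illustration of the stochastic simulation algorithm it implements for the microscopic description, the sketch below runs a Gillespie SSA for a simple birth-death model of gene expression. The rate constants are arbitrary.

    ```python
    # Minimal Gillespie stochastic simulation algorithm (SSA) for a birth-death
    # model: production at rate k, degradation at rate g * x. Rates are arbitrary
    # and chosen only to illustrate the "microscopic description" referred to above.
    import numpy as np

    rng = np.random.default_rng(3)
    k, g = 10.0, 0.1            # production and degradation rate constants
    x, t, t_end = 0, 0.0, 100.0
    trajectory = [(t, x)]

    while t < t_end:
        a_prod, a_deg = k, g * x              # reaction propensities
        a_total = a_prod + a_deg
        t += rng.exponential(1.0 / a_total)   # time to next reaction
        if rng.random() < a_prod / a_total:   # choose which reaction fires
            x += 1
        else:
            x -= 1
        trajectory.append((t, x))

    counts = np.array([c for _, c in trajectory])
    print("late-time mean copy number ~", counts[len(counts) // 2:].mean(),
          "(theory k/g =", k / g, ")")
    ```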

  5. CERENA: ChEmical REaction Network Analyzer—A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics

    PubMed Central

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J.; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed, however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduced CERENA, a toolbox for the analysis of stochastic chemical kinetics using Approximations of the Chemical Master Equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/. PMID:26807911

  6. Robots in food systems: a review and assessment of potential uses.

    PubMed

    Adams, E A; Messersmith, A M

    1986-04-01

    Management personnel in foodservice, food processing, and robot industries were surveyed to evaluate potential job functions for robots in the food industry. The survey instrument listed 64 different food-related job functions that participants were asked to assess as appropriate or not appropriate for robotic implementation. Demographic data were collected from each participant to determine any positive or negative influence on job function responses. The survey responses were statistically evaluated using frequencies and the chi-square test of significance. Sixteen of the 64 job functions were identified as appropriate for robot implementation in food industries by both robot manufacturing and food managers. The study indicated, first, that food managers lack knowledge about robots and robot manufacturing managers lack knowledge about food industries. Second, robots are not currently being used to any extent in the food industry. Third, analysis of the demographic data in relation to the 16 identified job functions showed no significant differences in responses.

  7. Real-time quality assurance testing using photonic techniques: Application to iodine water system

    NASA Technical Reports Server (NTRS)

    Arendale, W. F.; Hatcher, Richard; Garlington, Yadilett; Harwell, Jack; Everett, Tracey

    1990-01-01

    A feasibility study has been conducted on the use of inspection systems incorporating photonic sensors and multivariate analyses to provide an instrumentation system that, in real time, assures both product quality and that the system is in control. A system is in control when the near future of the product quality is predictable. Off-line chemical analyses can be used for a chemical process when slow kinetics allows time to take a sample to the laboratory and the system provides a recovery mechanism that returns the system to statistical control without intervention of the operator. The objective for this study has been the implementation of do-it-right-the-first-time and just-in-time philosophies. The Environment Control and Life Support Systems (ECLSS) water reclamation system that adds iodine for biocidal control is an ideal candidate for the study and implementation of do-it-right-the-first-time technologies.

  8. Assessing electronic health record systems in emergency departments: Using a decision analytic Bayesian model.

    PubMed

    Ben-Assuli, Ofir; Leshno, Moshe

    2016-09-01

    In the last decade, health providers have implemented information systems to improve accuracy in medical diagnosis and decision-making. This article evaluates the impact of an electronic health record on emergency department physicians' diagnosis and admission decisions. A decision analytic approach using a decision tree was constructed to model the admission decision process to assess the added value of medical information retrieved from the electronic health record. Using a Bayesian statistical model, this method was evaluated on two coronary artery disease scenarios. The results show that the cases of coronary artery disease were better diagnosed when the electronic health record was consulted and led to more informed admission decisions. Furthermore, the value of medical information required for a specific admission decision in emergency departments could be quantified. The findings support the notion that physicians and patient healthcare can benefit from implementing electronic health record systems in emergency departments. © The Author(s) 2015.
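
    The decision-analytic idea can be illustrated with a toy Bayesian update: information retrieved from the electronic health record shifts the probability of coronary artery disease, which is then compared with an admission threshold. All probabilities and the threshold below are invented for the sketch and are not taken from the study.

    ```python
    # Toy Bayesian update illustrating the decision-analytic approach above: extra
    # EHR information updates P(coronary artery disease), which is then compared
    # against an admission threshold. All numbers are invented for illustration.
    def posterior(prior, sensitivity, specificity, positive):
        """P(disease | finding) via Bayes' rule for a binary finding."""
        if positive:
            num = sensitivity * prior
            den = num + (1 - specificity) * (1 - prior)
        else:
            num = (1 - sensitivity) * prior
            den = num + specificity * (1 - prior)
        return num / den

    prior = 0.15                         # pre-test probability from ED presentation
    p_no_ehr = posterior(prior, 0.80, 0.70, positive=True)       # ED findings only
    p_with_ehr = posterior(p_no_ehr, 0.90, 0.85, positive=True)  # plus EHR history

    threshold = 0.30                     # admit if P(CAD) exceeds this
    for label, p in [("without EHR", p_no_ehr), ("with EHR", p_with_ehr)]:
        decision = "admit" if p > threshold else "discharge"
        print(f"{label}: P(CAD) = {p:.2f} -> {decision}")
    ```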

  9. Statistics of software vulnerability detection in certification testing

    NASA Astrophysics Data System (ADS)

    Barabanov, A. V.; Markov, A. S.; Tsirlov, V. L.

    2018-05-01

    The paper discusses practical aspects of introducing software vulnerability detection methods into the day-to-day activities of an accredited testing laboratory. It presents the results of trialling the vulnerability detection methods in the study of open source software and of software submitted as test objects for certification testing under information security requirements, including software for communication networks. Results of the study showing the distribution of identified vulnerabilities by type of attack, country of origin, programming language used in development, vulnerability detection method, and other characteristics are given. The experience of foreign information security certification systems with detecting vulnerabilities in certified software is analyzed. The main conclusion of the study is the need to embed secure software development practices in the development life cycle. Conclusions and recommendations for testing laboratories on implementing the vulnerability analysis methods are laid down.

  10. Evaluation of Secure Computation in a Distributed Healthcare Setting.

    PubMed

    Kimura, Eizen; Hamada, Koki; Kikuchi, Ryo; Chida, Koji; Okamoto, Kazuya; Manabe, Shirou; Kuroda, Tomohiko; Matsumura, Yasushi; Takeda, Toshihiro; Mihara, Naoki

    2016-01-01

    Issues related to ensuring patient privacy and data ownership in clinical repositories prevent the growth of translational research. Previous studies have used an aggregator agent to obscure clinical repositories from the data user, and to ensure the privacy of output using statistical disclosure control. However, there remain several issues that must be considered. One such issue is that a data breach may occur when multiple nodes conspire. Another is that the agent may eavesdrop on or leak a user's queries and their results. We have implemented a secure computing method so that the data used by each party can be kept confidential even if all of the other parties conspire to crack the data. We deployed our implementation at three geographically distributed nodes connected to a high-speed layer two network. The performance of our method, with respect to processing times, suggests suitability for practical use.

  11. Optical recognition of statistical patterns

    NASA Astrophysics Data System (ADS)

    Lee, S. H.

    1981-12-01

    Optical implementation of the Fukunaga-Koontz transform (FKT) and the Least-Squares Linear Mapping Technique (LSLMT) is described. The FKT is a linear transformation which performs image feature extraction for a two-class image classification problem. The LSLMT performs a transform from large dimensional feature space to small dimensional decision space for separating multiple image classes by maximizing the interclass differences while minimizing the intraclass variations. The FKT and the LSLMT were optically implemented by utilizing a coded phase optical processor. The transform was used for classifying birds and fish. After the F-K basis functions were calculated, those most useful for classification were incorporated into a computer generated hologram. The output of the optical processor, consisting of the squared magnitude of the F-K coefficients, was detected by a T.V. camera, digitized, and fed into a micro-computer for classification. A simple linear classifier based on only two F-K coefficients was able to separate the images into two classes, indicating that the F-K transform had chosen good features. Two advantages of optically implementing the FKT and LSLMT are parallel and real time processing.
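
    The FKT step described above can also be reproduced numerically. The sketch below computes the shared F-K basis for two simulated classes with NumPy (rather than optically) and prints the complementary eigenvalues that make the transform useful for two-class feature selection; dimensions and data are illustrative.

    ```python
    # Numerical sketch of the Fukunaga-Koontz transform (FKT) for a two-class
    # feature-extraction problem, computed digitally with NumPy. Data are simulated.
    import numpy as np

    rng = np.random.default_rng(4)
    d, n = 16, 200                                   # feature dimension, samples
    X1 = rng.normal(size=(n, d)) @ rng.normal(size=(d, d))   # class 1 ("birds")
    X2 = rng.normal(size=(n, d)) @ rng.normal(size=(d, d))   # class 2 ("fish")

    S1 = X1.T @ X1 / n                               # class autocorrelation matrices
    S2 = X2.T @ X2 / n

    # Whiten the sum S1 + S2, then diagonalize the transformed S1.
    evals, P = np.linalg.eigh(S1 + S2)
    W = P @ np.diag(evals ** -0.5)                   # whitening transform
    mu, V = np.linalg.eigh(W.T @ S1 @ W)             # shared eigenvectors
    basis = W @ V                                    # F-K basis functions

    # Eigenvalues near 1 favour class 1 and near 0 favour class 2; the class-2
    # eigenvalues are 1 - mu, the complementarity exploited for classification.
    print("largest class-1 eigenvalues:", np.round(mu[-3:], 3))
    print("smallest class-1 eigenvalues:", np.round(mu[:3], 3))
    ```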

  12. Optical recognition of statistical patterns

    NASA Technical Reports Server (NTRS)

    Lee, S. H.

    1981-01-01

    Optical implementation of the Fukunaga-Koontz transform (FKT) and the Least-Squares Linear Mapping Technique (LSLMT) is described. The FKT is a linear transformation which performs image feature extraction for a two-class image classification problem. The LSLMT performs a transform from large dimensional feature space to small dimensional decision space for separating multiple image classes by maximizing the interclass differences while minimizing the intraclass variations. The FKT and the LSLMT were optically implemented by utilizing a coded phase optical processor. The transform was used for classifying birds and fish. After the F-K basis functions were calculated, those most useful for classification were incorporated into a computer generated hologram. The output of the optical processor, consisting of the squared magnitude of the F-K coefficients, was detected by a T.V. camera, digitized, and fed into a micro-computer for classification. A simple linear classifier based on only two F-K coefficients was able to separate the images into two classes, indicating that the F-K transform had chosen good features. Two advantages of optically implementing the FKT and LSLMT are parallel and real time processing.

  13. Implementation of malaria dynamic models in municipality level early warning systems in Colombia. Part I: description of study sites.

    PubMed

    Ruiz, Daniel; Cerón, Viviana; Molina, Adriana M; Quiñónes, Martha L; Jiménez, Mónica M; Ahumada, Martha; Gutiérrez, Patricia; Osorio, Salua; Mantilla, Gilma; Connor, Stephen J; Thomson, Madeleine C

    2014-07-01

    As part of the Integrated National Adaptation Pilot project and the Integrated Surveillance and Control System, the Colombian National Institute of Health is working on the design and implementation of a Malaria Early Warning System framework, supported by seasonal climate forecasting capabilities, weather and environmental monitoring, and malaria statistical and dynamic models. In this report, we provide an overview of the local ecoepidemiologic settings where four malaria process-based mathematical models are currently being implemented at a municipal level. The description includes general characteristics, malaria situation (predominant type of infection, malaria-positive cases data, malaria incidence, and seasonality), entomologic conditions (primary and secondary vectors, mosquito densities, and feeding frequencies), climatic conditions (climatology and long-term trends), key drivers of epidemic outbreaks, and non-climatic factors (populations at risk, control campaigns, and socioeconomic conditions). Selected pilot sites exhibit different ecoepidemiologic settings that must be taken into account in the development of the integrated surveillance and control system. © The American Society of Tropical Medicine and Hygiene.

  14. Implementing the WHO/UNICEF Baby Friendly Initiative in the community: a 'hearts and minds' approach.

    PubMed

    Thomson, Gill; Bilson, Andy; Dykes, Fiona

    2012-04-01

    To describe a 'hearts and minds' approach to community Baby Friendly Initiative implementation developed from the views of multidisciplinary professionals. A qualitative descriptive study utilising focus groups and interviews, with thematic networks analysis conducted. Forty-seven professionals were consulted from two primary health-care facilities located in the North-West of England. Thematic networks analysis generated a global theme of a 'hearts and minds approach' to BFI implementation, which embodies emotional and rational engagement. The three underpinning organising themes (and their associated basic themes): 'credible leadership', 'engagement of key partners' and 'changing attitudes and practice' reflect the context, processes and outcomes of a 'hearts and minds' approach. A 'hearts and minds' approach transcends the prescriptive aspects of a macro-level intervention with its emphasis upon audits, training, statistics and 'hard' evidence through valuing other professionals and engaging staff at all levels. It offers insights into how organisational change may move beyond traditional top-down mechanisms for driving change to incorporate ways that value others and promote cooperation and reflection. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Robust crop and weed segmentation under uncontrolled outdoor illumination.

    PubMed

    Jeon, Hong Y; Tian, Lei F; Zhu, Heping

    2011-01-01

    An image processing algorithm for detecting individual weeds was developed and evaluated. The weed detection steps included normalized excessive green conversion, statistical threshold value estimation, adaptive image segmentation, median filtering, morphological feature calculation, and an Artificial Neural Network (ANN). The algorithm was validated for its ability to identify and detect weeds and crop plants under uncontrolled outdoor illumination. A field robot equipped with machine vision captured field images under outdoor illumination, and the image processing algorithm processed them automatically without manual adjustment. The error of the algorithm, when processing 666 field images, ranged from 2.1 to 2.9%. The ANN correctly detected 72.6% of crop plants among the identified plants and considered the rest weeds. However, the ANN identification rate for crop plants improved to 95.1% once the error sources in the algorithm were addressed. The developed weed detection and image processing algorithm provides a novel method to identify plants against a soil background under uncontrolled outdoor illumination and to differentiate weeds from crop plants. Thus, the proposed machine vision and processing algorithm may be useful for outdoor applications including plant-specific direct applications (PSDA).
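
    The first two stages of the pipeline, the normalized excessive green (ExG) conversion and statistical thresholding, can be sketched as below on a synthetic image. Otsu's method stands in for the paper's threshold estimation, and the median filter, morphological features, and ANN are omitted.

    ```python
    # Minimal sketch of excessive-green conversion plus a statistical threshold for
    # plant/soil segmentation. Otsu's method is a stand-in for the paper's
    # threshold estimation, and the image is synthetic.
    import numpy as np
    from skimage.filters import threshold_otsu

    rng = np.random.default_rng(5)
    h, w = 64, 64
    img = rng.uniform(0.2, 0.5, size=(h, w, 3))      # brownish "soil" background
    img[20:40, 20:40] = [0.2, 0.7, 0.2]              # a green "plant" patch

    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    total = r + g + b + 1e-9
    exg = 2 * g / total - r / total - b / total      # normalized excess green index

    mask = exg > threshold_otsu(exg)                 # adaptive plant/soil mask
    print("plant pixels:", int(mask.sum()), "of", h * w)
    ```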

  16. SUNPLIN: Simulation with Uncertainty for Phylogenetic Investigations

    PubMed Central

    2013-01-01

    Background Phylogenetic comparative analyses usually rely on a single consensus phylogenetic tree in order to study evolutionary processes. However, most phylogenetic trees are incomplete with regard to species sampling, which may critically compromise analyses. Some approaches have been proposed to integrate non-molecular phylogenetic information into incomplete molecular phylogenies. An expanded tree approach consists of adding missing species to random locations within their clade. The information contained in the topology of the resulting expanded trees can be captured by the pairwise phylogenetic distance between species and stored in a matrix for further statistical analysis. Thus, the random expansion and processing of multiple phylogenetic trees can be used to estimate the phylogenetic uncertainty through a simulation procedure. Because of the computational burden required, unless this procedure is efficiently implemented, the analyses are of limited applicability. Results In this paper, we present efficient algorithms and implementations for randomly expanding and processing phylogenetic trees so that simulations involved in comparative phylogenetic analysis with uncertainty can be conducted in a reasonable time. We propose algorithms for both randomly expanding trees and calculating distance matrices. We made available the source code, which was written in the C++ language. The code may be used as a standalone program or as a shared object in the R system. The software can also be used as a web service through the link: http://purl.oclc.org/NET/sunplin/. Conclusion We compare our implementations to similar solutions and show that significant performance gains can be obtained. Our results open up the possibility of accounting for phylogenetic uncertainty in evolutionary and ecological analyses of large datasets. PMID:24229408

  17. SUNPLIN: simulation with uncertainty for phylogenetic investigations.

    PubMed

    Martins, Wellington S; Carmo, Welton C; Longo, Humberto J; Rosa, Thierson C; Rangel, Thiago F

    2013-11-15

    Phylogenetic comparative analyses usually rely on a single consensus phylogenetic tree in order to study evolutionary processes. However, most phylogenetic trees are incomplete with regard to species sampling, which may critically compromise analyses. Some approaches have been proposed to integrate non-molecular phylogenetic information into incomplete molecular phylogenies. An expanded tree approach consists of adding missing species to random locations within their clade. The information contained in the topology of the resulting expanded trees can be captured by the pairwise phylogenetic distance between species and stored in a matrix for further statistical analysis. Thus, the random expansion and processing of multiple phylogenetic trees can be used to estimate the phylogenetic uncertainty through a simulation procedure. Because of the computational burden required, unless this procedure is efficiently implemented, the analyses are of limited applicability. In this paper, we present efficient algorithms and implementations for randomly expanding and processing phylogenetic trees so that simulations involved in comparative phylogenetic analysis with uncertainty can be conducted in a reasonable time. We propose algorithms for both randomly expanding trees and calculating distance matrices. We made available the source code, which was written in the C++ language. The code may be used as a standalone program or as a shared object in the R system. The software can also be used as a web service through the link: http://purl.oclc.org/NET/sunplin/. We compare our implementations to similar solutions and show that significant performance gains can be obtained. Our results open up the possibility of accounting for phylogenetic uncertainty in evolutionary and ecological analyses of large datasets.

  18. Statistical Research of Investment Development of Russian Regions

    ERIC Educational Resources Information Center

    Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.

    2016-01-01

    This article is concerned with substantiating procedures that ensure the implementation of statistical research and monitoring of the investment development of Russian regions, which would be pertinent to the modern development of state statistics. The aim of the study is to develop the methodological framework in order to estimate…

  19. A Realistic Experimental Design and Statistical Analysis Project

    ERIC Educational Resources Information Center

    Muske, Kenneth R.; Myers, John A.

    2007-01-01

    A realistic applied chemical engineering experimental design and statistical analysis project is documented in this article. This project has been implemented as part of the professional development and applied statistics courses at Villanova University over the past five years. The novel aspects of this project are that the students are given a…

  20. The Effect of "Clickers" on Attendance in an Introductory Statistics Course: An Action Research Study

    ERIC Educational Resources Information Center

    Amstelveen, Raoul H.

    2013-01-01

    The purpose of this study was to design and implement a Classroom Response System, also known as a "clicker," to increase attendance in introductory statistics courses at an undergraduate university. Since 2010, non-attendance had been prevalent in introductory statistics courses. Moreover, non-attendance created undesirable classrooms…

  1. Improving Statistics Education through Simulations: The Case of the Sampling Distribution.

    ERIC Educational Resources Information Center

    Earley, Mark A.

    This paper presents a summary of action research investigating statistics students' understandings of the sampling distribution of the mean. With four sections of an introductory Statistics in Education course (n=98 students), a computer simulation activity (R. delMas, J. Garfield, and B. Chance, 1999) was implemented and evaluated to show…
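
    The simulation activity is built around a standard experiment that is easy to reproduce: draw repeated samples from a skewed population, record each sample mean, and compare the spread of those means with sigma/sqrt(n). The sketch below does this with simulated data; the population, sample size, and number of replications are arbitrary choices.

    ```python
    # Small simulation of the sampling distribution of the mean, of the kind the
    # classroom activity above is built around. All settings are illustrative.
    import numpy as np

    rng = np.random.default_rng(6)
    population = rng.exponential(scale=2.0, size=100_000)   # a skewed population

    n, reps = 30, 5_000
    sample_means = np.array([rng.choice(population, size=n).mean()
                             for _ in range(reps)])

    print("population mean      :", population.mean())
    print("mean of sample means :", sample_means.mean())
    print("SE (simulated)       :", sample_means.std(ddof=1))
    print("sigma / sqrt(n)      :", population.std(ddof=1) / np.sqrt(n))
    ```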

  2. A Matlab user interface for the statistically assisted fluid registration algorithm and tensor-based morphometry

    NASA Astrophysics Data System (ADS)

    Yepes-Calderon, Fernando; Brun, Caroline; Sant, Nishita; Thompson, Paul; Lepore, Natasha

    2015-01-01

    Tensor-Based Morphometry (TBM) is an increasingly popular method for group analysis of brain MRI data. The main steps in the analysis consist of a nonlinear registration to align each individual scan to a common space, and a subsequent statistical analysis to determine morphometric differences, or difference in fiber structure between groups. Recently, we implemented the Statistically-Assisted Fluid Registration Algorithm or SAFIRA, which is designed for tracking morphometric differences among populations. To this end, SAFIRA allows the inclusion of statistical priors extracted from the populations being studied as regularizers in the registration. This flexibility and degree of sophistication limit the tool to expert use, even more so considering that SAFIRA was initially implemented in command line mode. Here, we introduce a new, intuitive, easy to use, Matlab-based graphical user interface for SAFIRA's multivariate TBM. The interface also generates different choices for the TBM statistics, including both the traditional univariate statistics on the Jacobian matrix, and comparison of the full deformation tensors. This software will be freely disseminated to the neuroimaging research community.

  3. Optical Processing of Speckle Images with Bacteriorhodopsin for Pattern Recognition

    NASA Technical Reports Server (NTRS)

    Downie, John D.; Tucker, Deanne (Technical Monitor)

    1994-01-01

    Logarithmic processing of images with multiplicative noise characteristics can be utilized to transform the image into one with an additive noise distribution. This simplifies subsequent image processing steps for applications such as image restoration or correlation for pattern recognition. One particularly common form of multiplicative noise is speckle, for which the logarithmic operation not only produces additive noise, but also makes it of constant variance (signal-independent). We examine the optical transmission properties of some bacteriorhodopsin films here and find them well suited to implement such a pointwise logarithmic transformation optically in a parallel fashion. We present experimental results of the optical conversion of speckle images into transformed images with additive, signal-independent noise statistics using the real-time photochromic properties of bacteriorhodopsin. We provide an example of improved correlation performance in terms of correlation peak signal-to-noise for such a transformed speckle image.
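
    The effect of the pointwise logarithmic transform can be checked digitally: multiplying a clean image by speckle and taking the logarithm yields additive noise whose variance no longer depends on the local signal level. The sketch below simulates this with NumPy; the bacteriorhodopsin film performs the same transform optically.

    ```python
    # Digital illustration of the logarithmic transform discussed above:
    # multiplicative speckle becomes additive, signal-independent noise.
    import numpy as np

    rng = np.random.default_rng(7)
    clean = np.linspace(0.1, 1.0, 256).reshape(16, 16)       # synthetic intensity image
    speckle = rng.exponential(scale=1.0, size=clean.shape)   # fully developed speckle
    noisy = clean * speckle                                  # multiplicative noise

    log_img = np.log(noisy)                                  # pointwise log transform
    residual = log_img - np.log(clean)                       # additive noise term

    # The noise variance no longer depends on the local signal level.
    dark, bright = residual[clean < 0.3], residual[clean > 0.7]
    print("noise variance, dark regions  :", dark.var())
    print("noise variance, bright regions:", bright.var())
    ```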

  4. Fermi-level effects in semiconductor processing: A modeling scheme for atomistic kinetic Monte Carlo simulators

    NASA Astrophysics Data System (ADS)

    Martin-Bragado, I.; Castrillo, P.; Jaraiz, M.; Pinacho, R.; Rubio, J. E.; Barbolla, J.; Moroz, V.

    2005-09-01

    Atomistic process simulation is expected to play an important role for the development of next generations of integrated circuits. This work describes an approach for modeling electric charge effects in a three-dimensional atomistic kinetic Monte Carlo process simulator. The proposed model has been applied to the diffusion of electrically active boron and arsenic atoms in silicon. Several key aspects of the underlying physical mechanisms are discussed: (i) the use of the local Debye length to smooth out the atomistic point-charge distribution, (ii) algorithms to correctly update the charge state in a physically accurate and computationally efficient way, and (iii) an efficient implementation of the drift of charged particles in an electric field. High-concentration effects such as band-gap narrowing and degenerate statistics are also taken into account. The efficiency, accuracy, and relevance of the model are discussed.

  5. Experiences in using DISCUS for visualizing human communication

    NASA Astrophysics Data System (ADS)

    Groehn, Matti; Nieminen, Marko; Haho, Paeivi; Smeds, Riitta

    2000-02-01

    In this paper, we present further improvements to the DISCUS software, which can be used to record and analyze the flow and content of discussion in business process simulation sessions. The tool was initially introduced at the 'Visual Data Exploration and Analysis IV' conference. The initial features of the tool enabled visualization of discussion flow in business process simulation sessions and the creation of SOM analyses. The improvements to the tool consist of additional visualization possibilities that enable quick on-line analyses and improved graphical statistics. We have also created the first interface to audio data and implemented two ways to visualize it. We also outline additional possibilities for using the tool in other application areas, including usability testing and capturing design rationale in a product development process. The data gathered with DISCUS may be used in other applications, and further work may be done with data mining techniques.

  6. Real-time image dehazing using local adaptive neighborhoods and dark-channel-prior

    NASA Astrophysics Data System (ADS)

    Valderrama, Jesus A.; Díaz-Ramírez, Víctor H.; Kober, Vitaly; Hernandez, Enrique

    2015-09-01

    A real-time algorithm for single image dehazing is presented. The algorithm is based on calculation of local neighborhoods of a hazed image inside a moving window. The local neighborhoods are constructed by computing rank-order statistics. Next the dark-channel-prior approach is applied to the local neighborhoods to estimate the transmission function of the scene. By using the suggested approach there is no need for applying a refining algorithm to the estimated transmission such as the soft matting algorithm. To achieve high-rate signal processing the proposed algorithm is implemented exploiting massive parallelism on a graphics processing unit (GPU). Computer simulation results are carried out to test the performance of the proposed algorithm in terms of dehazing efficiency and speed of processing. These tests are performed using several synthetic and real images. The obtained results are analyzed and compared with those obtained with existing dehazing algorithms.
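
    Two ingredients of the method (a local dark channel over a moving window and the transmission estimate derived from it) are sketched below on a synthetic image. The window size, the weight 0.95, and the crude atmospheric-light estimate are common illustrative choices, not necessarily the authors' settings, and no refinement step is applied.

    ```python
    # Sketch of the dark channel computed over a moving window and the resulting
    # dark-channel-prior transmission estimate t = 1 - w * dark / A.
    import numpy as np

    def dark_channel(img, patch=7):
        """Minimum over colour channels and a patch x patch neighbourhood."""
        h, w, _ = img.shape
        min_rgb = img.min(axis=2)
        pad = patch // 2
        padded = np.pad(min_rgb, pad, mode="edge")
        out = np.empty_like(min_rgb)
        for i in range(h):
            for j in range(w):
                out[i, j] = padded[i:i + patch, j:j + patch].min()
        return out

    rng = np.random.default_rng(8)
    hazy = rng.uniform(0.3, 0.9, size=(32, 32, 3))   # synthetic hazy image

    dark = dark_channel(hazy)
    A = hazy.max()                                   # crude atmospheric light estimate
    transmission = 1.0 - 0.95 * dark / A             # dark-channel-prior estimate
    print("transmission range:", transmission.min(), transmission.max())
    ```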

  7. [Development and clinical evaluation of an anesthesia information management system].

    PubMed

    Feng, Jing-yi; Chen, Hua; Zhu, Sheng-mei

    2010-09-21

    To study the design, implementation, and clinical evaluation of an anesthesia information management system. To record, process, and store perioperative patient data automatically, bedside monitoring equipment of all kinds is connected to the system using information integration technology. After statistical analysis of the patient data with data mining techniques, patient status can be evaluated automatically against a risk prediction standard and a decision support system, allowing the anesthetist to carry out reasonable and safe clinical processes. With clinical processes recorded electronically, standard record tables can be generated and the clinical workflow is optimized as well. With the system, various kinds of patient data can be collected, stored, analyzed, and archived; various anesthesia documents can be generated; and patient status can be evaluated to support clinical decisions. The anesthesia information management system is useful for improving anesthesia quality, decreasing risk to patients and clinicians, and helping to provide clinical evidence.

  8. Development and Testing of Data Mining Algorithms for Earth Observation

    NASA Technical Reports Server (NTRS)

    Glymour, Clark

    2005-01-01

    The new algorithms developed under this project included a principled procedure for classification of objects, events or circumstances according to a target variable when a very large number of potential predictor variables is available but the number of cases that can be used for training a classifier is relatively small. These "high dimensional" problems require finding a minimal set of variables (called the Markov Blanket) sufficient for predicting the value of the target variable. An algorithm, the Markov Blanket Fan Search, was developed, implemented and tested on both simulated and real data in conjunction with a graphical model classifier, which was also implemented. Another algorithm developed and implemented in TETRAD IV for time series elaborated on work by C. Granger and N. Swanson, which in turn exploited some of our earlier work. The algorithms in question learn a linear time series model from data. Given such a time series, the simultaneous residual covariances, after factoring out time dependencies, may provide information about causal processes that occur more rapidly than the time series representation allows, so-called simultaneous or contemporaneous causal processes. Working with A. Monetta, a graduate student from Italy, we produced the correct statistics for estimating the contemporaneous causal structure from time series data using the TETRAD IV suite of algorithms. Two economists, David Bessler and Kevin Hoover, have independently published applications using TETRAD-style algorithms to the same purpose. These implementations and algorithmic developments were separately used in two kinds of studies of climate data: short time series of geographically proximate climate variables predicting agricultural effects in California, and longer duration climate measurements of temperature teleconnections.

  9. The knowledge value-chain of genetic counseling for breast cancer: an empirical assessment of prediction and communication processes.

    PubMed

    Amara, Nabil; Blouin-Bougie, Jolyane; Jbilou, Jalila; Halilem, Norrin; Simard, Jacques; Landry, Réjean

    2016-01-01

    The aim of this paper is twofold: to analyze the genetic counseling process for breast cancer with a theoretical knowledge transfer lens and to compare generalists, medical specialists, and genetic counselors with regards to their genetic counseling practices. This paper presents the genetic counseling process occurring within a chain of value-adding activities of four main stages describing health professionals' clinical practices: (1) evaluation, (2) investigation, (3) information, and (4) decision. It also presents the results of a cross-sectional study based on a Canadian medical doctors and genetic counselors survey (n = 176) realized between July 2012 and March 2013. The statistical exercise included descriptive statistics, one-way ANOVA and post-hoc tests. The results indicate that even though all types of health professionals are involved in the entire process of genetic counseling for breast cancer, genetic counselors are more involved in the evaluation of breast cancer risk, while medical doctors are more active in the decision toward breast cancer risk management strategies. The results secondly demonstrate the relevance and the key role of genetic counselors in the care provided to women at-risk of familial breast cancer. This paper presents an integrative framework to understand the current process of genetic counseling for breast cancer in Canada, and to shed light on how and where health professionals contribute to the process. It also offers a starting point for assessing clinical practices in genetic counseling in order to establish more clearly where and to what extent efforts should be undertaken to implement future genetic services.

  10. Process control charts in infection prevention: Make it simple to make it happen.

    PubMed

    Wiemken, Timothy L; Furmanek, Stephen P; Carrico, Ruth M; Mattingly, William A; Persaud, Annuradha K; Guinn, Brian E; Kelley, Robert R; Ramirez, Julio A

    2017-03-01

    Quality improvement is central to Infection Prevention and Control (IPC) programs. Challenges may occur when applying quality improvement methodologies like process control charts, often due to the limited exposure of typical IPs. Because of this, our team created an open-source database with a process control chart generator for IPC programs. The objectives of this report are to outline the development of the application and demonstrate application using simulated data. We used Research Electronic Data Capture (REDCap Consortium, Vanderbilt University, Nashville, TN), R (R Foundation for Statistical Computing, Vienna, Austria), and R Studio Shiny (R Foundation for Statistical Computing) to create an open source data collection system with automated process control chart generation. We used simulated data to test and visualize both in-control and out-of-control processes for commonly used metrics in IPC programs. The R code for implementing the control charts and Shiny application can be found on our Web site (https://github.com/ul-research-support/spcapp). Screen captures of the workflow and simulated data indicating both common cause and special cause variation are provided. Process control charts can be easily developed based on individual facility needs using freely available software. Through providing our work free to all interested parties, we hope that others will be able to harness the power and ease of use of the application for improving the quality of care and patient safety in their facilities. Copyright © 2017 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
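
    As an illustration of the kind of chart such a tool produces, the sketch below builds a u-chart for monthly infections per 1,000 device-days with 3-sigma limits. The authors' application is an R/Shiny system; this is a plain-Python stand-in on simulated data.

    ```python
    # Minimal u-chart for monthly infections per device-day with varying 3-sigma
    # limits. Data are simulated; the chart type is one common choice for IPC metrics.
    import numpy as np

    rng = np.random.default_rng(9)
    device_days = rng.integers(800, 1200, size=24)          # monthly denominators
    infections = rng.poisson(0.003 * device_days)           # monthly counts

    u = infections / device_days                            # per device-day rates
    u_bar = infections.sum() / device_days.sum()            # centre line
    ucl = u_bar + 3 * np.sqrt(u_bar / device_days)          # upper control limits
    lcl = np.clip(u_bar - 3 * np.sqrt(u_bar / device_days), 0, None)

    for month, (rate, lo, hi) in enumerate(zip(u, lcl, ucl), start=1):
        flag = "OUT OF CONTROL" if (rate > hi or rate < lo) else "in control"
        print(f"month {month:2d}: {1000 * rate:5.2f} per 1,000 device-days  [{flag}]")
    ```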

  11. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with the following characteristics. A facility to monitor the performance of the process by obtaining and analyzing the data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.

  12. Effect of an institutional development plan for user participation on professionals' knowledge, practice, and attitudes. A controlled study

    PubMed Central

    2011-01-01

    Background Governments in several countries attempt to strengthen user participation through instructing health care organisations to plan and implement activities such as user representation in administrational boards, improved information to users, and more individual user participation in clinical work. The professionals are central in implementing initiatives to enhance user participation in organisations, but no controlled studies have been conducted on the effect on professionals from implementing institutional development plans. The objective was to investigate whether implementing a development plan intending to enhance user participation in a mental health hospital had any effect on the professionals' knowledge, practice, or attitudes towards user participation. Methods This was a non-randomized controlled study including professionals from three mental health hospitals in Central Norway. A development plan intended to enhance user participation was implemented in one of the hospitals as a part of a larger re-organizational process. The plan included i.e. establishing a patient education centre and a user office, purchasing of user expertise, appointing contact professionals for next of kin, and improving of the centre's information and the professional culture. The professionals at the intervention hospital thus constituted the intervention group, while the professionals at two other hospitals participated as control group. All professionals were invited to answer the Consumer Participation Questionnaire (CPQ) and additional questions, focusing on knowledge, practice, and attitudes towards user participation, two times with a 16 months interval. Results A total of 438 professionals participated (55% response rate). Comparing the changes in the intervention group with the changes in the control group revealed no statistically significant differences at a 0.05 level. The implementation of the development plan thus had no measurable effect on the professionals' knowledge, practice, or attitudes at the intervention hospital, compared to the control hospitals. Conclusion This is the first controlled study on the effect on professionals from implementing a development plan to enhance user participation in a mental health hospital. The plan had no effect on professionals' knowledge, practice, or attitudes. This can be due to the quality of the development plan, the implementation process, and/or the suitability of the outcome measures. PMID:22047466

  13. Effect of an institutional development plan for user participation on professionals' knowledge, practice, and attitudes. A controlled study.

    PubMed

    Rise, Marit By; Grimstad, Hilde; Solbjør, Marit; Steinsbekk, Aslak

    2011-11-02

    Governments in several countries attempt to strengthen user participation through instructing health care organisations to plan and implement activities such as user representation in administrational boards, improved information to users, and more individual user participation in clinical work. The professionals are central in implementing initiatives to enhance user participation in organisations, but no controlled studies have been conducted on the effect on professionals from implementing institutional development plans. The objective was to investigate whether implementing a development plan intending to enhance user participation in a mental health hospital had any effect on the professionals' knowledge, practice, or attitudes towards user participation. This was a non-randomized controlled study including professionals from three mental health hospitals in Central Norway. A development plan intended to enhance user participation was implemented in one of the hospitals as a part of a larger re-organizational process. The plan included i.e. establishing a patient education centre and a user office, purchasing of user expertise, appointing contact professionals for next of kin, and improving of the centre's information and the professional culture. The professionals at the intervention hospital thus constituted the intervention group, while the professionals at two other hospitals participated as control group. All professionals were invited to answer the Consumer Participation Questionnaire (CPQ) and additional questions, focusing on knowledge, practice, and attitudes towards user participation, two times with a 16 months interval. A total of 438 professionals participated (55% response rate). Comparing the changes in the intervention group with the changes in the control group revealed no statistically significant differences at a 0.05 level. The implementation of the development plan thus had no measurable effect on the professionals' knowledge, practice, or attitudes at the intervention hospital, compared to the control hospitals. This is the first controlled study on the effect on professionals from implementing a development plan to enhance user participation in a mental health hospital. The plan had no effect on professionals' knowledge, practice, or attitudes. This can be due to the quality of the development plan, the implementation process, and/or the suitability of the outcome measures.

  14. Hammerstein system representation of financial volatility processes

    NASA Astrophysics Data System (ADS)

    Capobianco, E.

    2002-05-01

    We show new modeling aspects of stock return volatility processes by first representing them through Hammerstein systems and then approximating the observed and transformed dynamics with wavelet-based atomic dictionaries. We thus propose a hybrid statistical methodology for volatility approximation and non-parametric estimation, and aim to use the information embedded in a bank of volatility sources obtained by decomposing the observed signal with multiresolution techniques. Scale-dependent information refers both to market activity inherent to different temporally aggregated trading horizons and to a variable degree of sparsity in representing the signal. A decomposition of the expansion coefficients into least dependent coordinates is then implemented through Independent Component Analysis. Based on the described steps, the features of volatility can be more effectively detected through global and greedy algorithms.
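
    A rough sketch of the multiresolution-plus-ICA pipeline described above is given below: a volatility proxy (absolute returns) is split into scale-by-scale wavelet components, which are then rotated into least-dependent coordinates with FastICA. The returns are simulated, and the wavelet, decomposition depth, and volatility proxy are assumptions for the illustration rather than the paper's exact specification.

    ```python
    # Wavelet multiresolution decomposition of a volatility proxy followed by ICA,
    # illustrating the two-stage idea above. All modeling choices are illustrative.
    import numpy as np
    import pywt
    from sklearn.decomposition import FastICA

    rng = np.random.default_rng(11)
    n = 512
    returns = rng.standard_t(df=4, size=n) * 0.01        # heavy-tailed "returns"
    vol_proxy = np.abs(returns)                          # simple volatility proxy

    # Per-scale components: reconstruct the signal keeping one coefficient band.
    coeffs = pywt.wavedec(vol_proxy, "db4", level=3)
    components = []
    for k in range(len(coeffs)):
        kept = [c if i == k else np.zeros_like(c) for i, c in enumerate(coeffs)]
        components.append(pywt.waverec(kept, "db4")[:n])
    X = np.column_stack(components)                      # bank of volatility sources

    # Rotate the scale components into least-dependent coordinates.
    sources = FastICA(n_components=3, random_state=0).fit_transform(X)
    print("independent component matrix shape:", sources.shape)
    ```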

  15. Harnessing QbD, Programming Languages, and Automation for Reproducible Biology.

    PubMed

    Sadowski, Michael I; Grant, Chris; Fell, Tim S

    2016-03-01

    Building robust manufacturing processes from biological components is a task that is highly complex and requires sophisticated tools to describe processes, inputs, and measurements and administrate management of knowledge, data, and materials. We argue that for bioengineering to fully access biological potential, it will require application of statistically designed experiments to derive detailed empirical models of underlying systems. This requires execution of large-scale structured experimentation for which laboratory automation is necessary. This requires development of expressive, high-level languages that allow reusability of protocols, characterization of their reliability, and a change in focus from implementation details to functional properties. We review recent developments in these areas and identify what we believe is an exciting trend that promises to revolutionize biotechnology. Copyright © 2015 The Authors. Published by Elsevier Ltd.. All rights reserved.

  16. Reducing statistics anxiety and enhancing statistics learning achievement: effectiveness of a one-minute strategy.

    PubMed

    Chiou, Chei-Chang; Wang, Yu-Min; Lee, Li-Tze

    2014-08-01

    Statistical knowledge is widely used in academia; however, statistics teachers struggle with the issue of how to reduce students' statistics anxiety and enhance students' statistics learning. This study assesses the effectiveness of a "one-minute paper strategy" in reducing students' statistics-related anxiety and in improving students' statistics-related achievement. Participants were 77 undergraduates from two classes enrolled in applied statistics courses. An experiment was implemented according to a pretest/posttest comparison group design. The quasi-experimental design showed that the one-minute paper strategy significantly reduced students' statistics anxiety and improved students' statistics learning achievement. The strategy was a better instructional tool than the textbook exercise for reducing students' statistics anxiety and improving students' statistics achievement.

  17. Evaluating and implementing temporal, spatial, and spatio-temporal methods for outbreak detection in a local syndromic surveillance system

    PubMed Central

    Lall, Ramona; Levin-Rector, Alison; Sell, Jessica; Paladini, Marc; Konty, Kevin J.; Olson, Don; Weiss, Don

    2017-01-01

    The New York City Department of Health and Mental Hygiene has operated an emergency department syndromic surveillance system since 2001, using temporal and spatial scan statistics run on a daily basis for cluster detection. Since the system was originally implemented, a number of new methods have been proposed for use in cluster detection. We evaluated six temporal and four spatial/spatio-temporal detection methods using syndromic surveillance data spiked with simulated injections. The algorithms were compared on several metrics, including sensitivity, specificity, positive predictive value, coherence, and timeliness. We also evaluated each method’s implementation, programming time, run time, and the ease of use. Among the temporal methods, at a set specificity of 95%, a Holt-Winters exponential smoother performed the best, detecting 19% of the simulated injects across all shapes and sizes, followed by an autoregressive moving average model (16%), a generalized linear model (15%), a modified version of the Early Aberration Reporting System’s C2 algorithm (13%), a temporal scan statistic (11%), and a cumulative sum control chart (<2%). Of the spatial/spatio-temporal methods we tested, a spatial scan statistic detected 3% of all injects, a Bayes regression found 2%, and a generalized linear mixed model and a space-time permutation scan statistic detected none at a specificity of 95%. Positive predictive value was low (<7%) for all methods. Overall, the detection methods we tested did not perform well in identifying the temporal and spatial clusters of cases in the inject dataset. The spatial scan statistic, our current method for spatial cluster detection, performed slightly better than the other tested methods across different inject magnitudes and types. Furthermore, we found the scan statistics, as applied in the SaTScan software package, to be the easiest to program and implement for daily data analysis. PMID:28886112
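
    The best-performing temporal method above, a Holt-Winters exponential smoother, can be sketched as a simple one-step-ahead detector: fit a smoother with a weekly seasonal component to a baseline of daily counts and flag a day whose observed count exceeds the forecast by more than a chosen number of residual standard deviations. The counts and the 3-sigma threshold below are simulated and illustrative.

    ```python
    # One-step-ahead Holt-Winters detector for daily syndromic counts with a weekly
    # seasonal component. Baseline counts, today's count, and the threshold are
    # simulated/illustrative, not the surveillance system's configuration.
    import numpy as np
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(10)
    days = 8 * 7
    weekly = 10 * (1 + 0.3 * np.sin(2 * np.pi * np.arange(days) / 7))
    baseline = rng.poisson(weekly).astype(float)          # historical daily counts
    today = 28.0                                          # today's observed count

    fit = ExponentialSmoothing(baseline, trend="add", seasonal="add",
                               seasonal_periods=7).fit()
    forecast = fit.forecast(1)[0]
    resid_sd = np.std(baseline - fit.fittedvalues, ddof=1)

    threshold = forecast + 3 * resid_sd                   # 3-sigma style alarm rule
    status = "ALARM" if today > threshold else "no alarm"
    print(f"forecast {forecast:.1f}, threshold {threshold:.1f}, observed {today}: {status}")
    ```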

  19. Application of microarray analysis on computer cluster and cloud platforms.

    PubMed

    Bernau, C; Boulesteix, A-L; Knaus, J

    2013-01-01

    Analysis of recent high-dimensional biological data tends to be computationally intensive, as many common approaches such as resampling or permutation tests require the basic statistical analysis to be repeated many times. A crucial advantage of these methods is that they can be easily parallelized, because the resampling or permutation iterations are computationally independent; this has prompted many statistics departments to establish their own computer clusters. An alternative is to rent computing resources in the cloud, e.g. from Amazon Web Services. In this article we analyze whether a selection of statistical projects recently implemented at our department can be efficiently realized on such cloud resources, and we illustrate an opportunity to combine computer cluster and cloud resources. To compare the efficiency of computer cluster and cloud implementations and their respective parallelizations, we run microarray analysis procedures and measure their runtimes on the different platforms. Amazon Web Services provides various instance types that meet the particular needs of the different statistical projects analyzed in this paper; moreover, the network capacity is sufficient and the parallelization is comparable in efficiency to standard computer cluster implementations. Our results suggest that many statistical projects can be efficiently realized on cloud resources, although workflows can change substantially as a result of a shift from computer cluster to cloud computing.
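
    The computational independence that makes these procedures easy to parallelize can be sketched with a simple permutation test distributed over local worker processes; the same pattern transfers to a cluster scheduler or cloud instances by swapping the executor. The data, group sizes, and names are illustrative assumptions, not the paper's microarray pipeline (Python).

      import numpy as np
      from concurrent.futures import ProcessPoolExecutor

      rng = np.random.default_rng(3)
      group_a = rng.normal(0.0, 1.0, 50)   # stand-in for expression values, group A
      group_b = rng.normal(0.4, 1.0, 50)   # stand-in for expression values, group B
      observed = group_a.mean() - group_b.mean()
      pooled = np.concatenate([group_a, group_b])

      def one_permutation(seed):
          """One independent relabeling of the pooled samples -> one null statistic."""
          perm = np.random.default_rng(seed).permutation(pooled)
          return perm[:50].mean() - perm[50:].mean()

      if __name__ == "__main__":
          # Each iteration is independent, so the work can be distributed freely.
          with ProcessPoolExecutor() as pool:
              null = list(pool.map(one_permutation, range(10_000), chunksize=500))
          p_value = (np.sum(np.abs(null) >= abs(observed)) + 1) / (len(null) + 1)
          print(f"permutation p-value = {p_value:.4f}")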

  20. Small-window parametric imaging based on information entropy for ultrasound tissue characterization

    PubMed Central

    Tsui, Po-Hsiang; Chen, Chin-Kuo; Kuo, Wen-Hung; Chang, King-Jen; Fang, Jui; Ma, Hsiang-Yang; Chou, Dean

    2017-01-01

    Constructing ultrasound statistical parametric images with a sliding window is a widely adopted strategy for characterizing tissues, but limited spatial resolution, boundary artifacts, and the need to assume a particular data distribution restrict the practicability of statistical parametric imaging. In this study, small-window entropy parametric imaging was proposed to overcome these problems. Simulations and phantom measurements were performed to acquire backscattered radiofrequency (RF) signals, which were processed to explore the feasibility of small-window entropy imaging in detecting scatterer properties. To validate the ability of entropy imaging to characterize tissue, measurements of benign and malignant breast tumors (n = 63) were used to compare the performance of conventional statistical parametric imaging (based on the Nakagami distribution) and entropy imaging by receiver operating characteristic (ROC) curve analysis. The simulation and phantom results revealed that entropy images constructed with a small sliding window (side length = 1 pulse length) adequately describe changes in scatterer properties. The area under the ROC curve for classifying tumors was 0.89 for small-window entropy imaging, higher than the 0.79 obtained with statistical parametric imaging, and boundary artifacts were largely suppressed in the proposed technique. Entropy thus enables ultrasound parametric imaging with a small window. PMID:28106118
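
    A minimal sketch of the small-window idea: compute the Shannon entropy of the amplitude distribution inside a small sliding window over the envelope of the backscattered RF data. The synthetic RF data, window size, and bin count below are assumptions for illustration, not the authors' parameters (Python).

      import numpy as np
      from scipy.signal import hilbert

      def shannon_entropy(patch, bins=32):
          """Shannon entropy (bits) of the amplitude histogram inside one window."""
          hist, _ = np.histogram(patch, bins=bins)
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))

      # Synthetic stand-in for beamformed RF data (scanlines x depth samples)
      rng = np.random.default_rng(4)
      rf = rng.normal(0, 1, (64, 256))
      envelope = np.abs(hilbert(rf, axis=1))       # amplitude envelope per scanline

      win, half = 9, 4                             # assumed "small" window (~1 pulse length)
      entropy_map = np.zeros_like(envelope)
      for i in range(half, envelope.shape[0] - half):
          for j in range(half, envelope.shape[1] - half):
              patch = envelope[i - half:i + half + 1, j - half:j + half + 1]
              entropy_map[i, j] = shannon_entropy(patch)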
