Sample records for statistical procedures employed

  1. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
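The control-chart machinery this record refers to is easy to sketch. Below is a minimal, hedged Python example of an individuals-and-moving-range (XmR) chart of the kind Wheeler advocates; the weekly behavior counts are invented for illustration.

```python
from statistics import mean

def xmr_limits(values):
    """Return (center, lower, upper) natural process limits for an XmR chart.

    Uses Wheeler's constant 2.66 = 3/d2 (d2 = 1.128 for moving ranges of 2).
    """
    moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
    center = mean(values)
    mr_bar = mean(moving_ranges)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def out_of_control(values):
    """Indices of points falling outside the natural process limits."""
    _, lo, hi = xmr_limits(values)
    return [i for i, v in enumerate(values) if v < lo or v > hi]

# Example: weekly counts of a target behavior; the final week is a clear shift.
counts = [12, 14, 11, 13, 12, 15, 13, 12, 14, 30]
print(out_of_control(counts))   # the shifted final week is flagged
```

Points flagged this way are the "signals" a clinician would investigate, exactly the data-based decision rule the abstract describes.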

  2. Statistical methods in personality assessment research.

    PubMed

    Schinka, J A; LaLone, L; Broeckel, J A

    1997-06-01

Emerging models of personality structure and advances in the measurement of personality and psychopathology suggest that research in personality and personality assessment has entered a stage of advanced development. In this article we examine whether researchers in these areas have taken advantage of new and evolving statistical procedures. We conducted a review of articles published in the Journal of Personality Assessment during the past 5 years. Of the 449 articles that included some form of data analysis, 12.7% used only descriptive statistics, most employed only univariate statistics, and fewer than 10% used multivariate methods of data analysis. We discuss the cost of using limited statistical methods, the possible reasons for the apparent reluctance to employ advanced statistical procedures, and potential solutions to this technical shortcoming.

  3. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  4. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

Summary: Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  5. Round-off errors in cutting plane algorithms based on the revised simplex procedure

    NASA Technical Reports Server (NTRS)

    Moore, J. E.

    1973-01-01

This report statistically analyzes computational round-off errors associated with the cutting plane approach to solving linear integer programming problems. Cutting plane methods require that the inverse of a sequence of matrices be computed. The problem basically reduces to one of minimizing round-off errors in the sequence of inverses. Two procedures for mitigating this problem are presented, and their influence on error accumulation is statistically analyzed. One procedure employs a very small tolerance factor to round computed values to zero. The other procedure is a numerical analysis technique for reinverting or improving the approximate inverse of a matrix. The results indicated that round-off accumulation can be effectively minimized by employing a tolerance factor which reflects the number of significant digits carried for each calculation and by applying the reinversion procedure once to each computed inverse. If 18 significant digits plus an exponent are carried for each variable during computations, then a tolerance value of 0.1 × 10⁻¹² is reasonable.
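The two error-control devices the report describes can be sketched in a few lines. The hedged Python example below (small dense matrices, illustrative tolerance) shows a tolerance factor that rounds tiny computed values to exact zero, and one step of iterative refinement of an approximate inverse via the Newton-Schulz update X' = X(2I − AX); the report's own reinversion technique may differ in detail.

```python
def matmul(A, B):
    """Dense matrix product for lists-of-lists."""
    n, m, p = len(A), len(B), len(B[0])
    return [[sum(A[i][k] * B[k][j] for k in range(m)) for j in range(p)]
            for i in range(n)]

def clean(A, tol=1e-12):
    """Round entries smaller in magnitude than the tolerance factor to zero."""
    return [[0.0 if abs(x) < tol else x for x in row] for row in A]

def refine_inverse(A, X):
    """One Newton-Schulz reinversion step: X' = X (2I - A X)."""
    n = len(A)
    AX = matmul(A, X)
    M = [[(2.0 if i == j else 0.0) - AX[i][j] for j in range(n)]
         for i in range(n)]
    return matmul(X, M)

A = [[4.0, 1.0], [2.0, 3.0]]
exact = [[0.3, -0.1], [-0.2, 0.4]]    # A^-1 = (1/10) [[3, -1], [-2, 4]]
X = [[0.29, -0.11], [-0.19, 0.41]]    # deliberately perturbed approximate inverse
X1 = refine_inverse(A, X)             # one step sharply reduces the error
```

One refinement step shrinks each entry's error from about 0.01 to a few 10⁻⁴, illustrating why a single reinversion pass per computed inverse was found effective.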

  6. 14 CFR 21.303 - Replacement and modification parts.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... determination can be made. Statistical quality control procedures may be employed where it is shown that a... AIRCRAFT CERTIFICATION PROCEDURES FOR PRODUCTS AND PARTS Approval of Materials, Parts, Processes, and... the configuration of the part; and (ii) Information on dimensions, materials, and processes necessary...

  7. The change and development of statistical methods used in research articles in child development 1930-2010.

    PubMed

    Køppe, Simo; Dammeyer, Jesper

    2014-09-01

The evolution of developmental psychology has been characterized by the use of different quantitative and qualitative methods and procedures. But how does the use of methods and procedures change over time? This study explores the change and development of statistical methods used in articles published in Child Development from 1930 to 2010. The methods used in every article in the first issue of every volume were categorized into four categories. Until 1980, relatively simple statistical methods were used. During the last 30 years there has been explosive growth in the use of more advanced statistical methods, and articles employing no statistics, or only simple methods, have all but disappeared.

  8. Alternating Renewal Process Models for Behavioral Observation: Simulation Methods, Software, and Validity Illustrations

    ERIC Educational Resources Information Center

    Pustejovsky, James E.; Runyon, Christopher

    2014-01-01

    Direct observation recording procedures produce reductive summary measurements of an underlying stream of behavior. Previous methodological studies of these recording procedures have employed simulation methods for generating random behavior streams, many of which amount to special cases of a statistical model known as the alternating renewal…
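A minimal sketch of the simulation idea, assuming exponential event and gap durations (the general alternating renewal model allows other distributions): generate a random behavior stream, then score it with momentary time sampling, one of the recording procedures such studies examine. All rates and session lengths are illustrative.

```python
import random

def simulate_stream(mean_event, mean_gap, session, rng):
    """Return [(start, end), ...] behavior episodes over [0, session)."""
    episodes, t, active = [], 0.0, False
    while t < session:
        dur = rng.expovariate(1.0 / (mean_event if active else mean_gap))
        if active:
            episodes.append((t, min(t + dur, session)))
        t += dur
        active = not active
    return episodes

def momentary_time_sampling(episodes, session, interval):
    """Fraction of interval endpoints at which the behavior is occurring."""
    ticks = [k * interval for k in range(1, int(session / interval) + 1)]
    hits = sum(any(s <= tick < e for s, e in episodes) for tick in ticks)
    return hits / len(ticks)

rng = random.Random(1)
eps = simulate_stream(mean_event=1.0, mean_gap=1.0, session=100.0, rng=rng)
true_prev = sum(e - s for s, e in eps) / 100.0   # true fraction of time active
mts = momentary_time_sampling(eps, 100.0, 2.0)   # what the observer records
```

Comparing `mts` with `true_prev` across many simulated streams is exactly the kind of validity check the abstract refers to.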

  9. Statistics in the pharmacy literature.

    PubMed

    Lee, Charlene M; Soin, Herpreet K; Einarson, Thomas R

    2004-09-01

Research in statistical methods is essential for maintenance of high quality of the published literature. The objective was to update previous reports of the types and frequencies of statistical terms and procedures in research studies of selected professional pharmacy journals. We obtained all research articles published in 2001 in 6 journals: American Journal of Health-System Pharmacy, The Annals of Pharmacotherapy, Canadian Journal of Hospital Pharmacy, Formulary, Hospital Pharmacy, and Journal of the American Pharmaceutical Association. Two independent reviewers identified and recorded descriptive and inferential statistical terms/procedures found in the methods, results, and discussion sections of each article. Results were determined by tallying the total number of times, as well as the percentage, that each statistical term or procedure appeared in the articles. One hundred forty-four articles were included. Ninety-eight percent employed descriptive statistics; of these, 28% used only descriptive statistics. The most common descriptive statistical terms were percentage (90%), mean (74%), standard deviation (58%), and range (46%). Sixty-nine percent of the articles used inferential statistics, the most frequent being chi-square (33%), Student's t-test (26%), Pearson's correlation coefficient r (18%), ANOVA (14%), and logistic regression (11%). Statistical terms and procedures were found in nearly all of the research articles published in pharmacy journals. Thus, pharmacy education should aim to provide current and future pharmacists with an understanding of the common statistical terms and procedures identified to facilitate the appropriate appraisal and consequential utilization of the information available in research articles.

  10. Confidentiality of Research and Statistical Data. A Compendium of State Legislation.

    ERIC Educational Resources Information Center

    Knerr, Charles

    During the first half of 1977, the Law Enforcement Assistance Administration (LEAA) initiated a survey of State laws pertaining to the privacy and security of research and statistical information. Two data collection procedures were employed: a library search of State Statutes, and written inquiries to State Attorneys General and to other State…

  11. Interactive boundary delineation of agricultural lands using graphics workstations

    NASA Technical Reports Server (NTRS)

    Cheng, Thomas D.; Angelici, Gary L.; Slye, Robert E.; Ma, Matt

    1992-01-01

    A review is presented of the computer-assisted stratification and sampling (CASS) system developed to delineate the boundaries of sample units for survey procedures. CASS stratifies the sampling units by land-cover and land-use type, employing image-processing software and hardware. This procedure generates coverage areas and the boundaries of stratified sampling units that are utilized for subsequent sampling procedures from which agricultural statistics are developed.

  12. The effects of estimation of censoring, truncation, transformation and partial data vectors

    NASA Technical Reports Server (NTRS)

    Hartley, H. O.; Smith, W. B.

    1972-01-01

The purpose of this research was to attack statistical problems concerning the estimation of distributions for purposes of predicting and measuring assembly performance as it appears in biological and physical situations. Various statistical procedures were proposed to attack problems of this sort, that is, to produce the statistical distributions of the outcomes of biological and physical situations which employ characteristics measured on constituent parts. The techniques are described.

  13. The Effect of General Statistical Fiber Misalignment on Predicted Damage Initiation in Composites

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Aboudi, Jacob; Arnold, Steven M.

    2014-01-01

    A micromechanical method is employed for the prediction of unidirectional composites in which the fiber orientation can possess various statistical misalignment distributions. The method relies on the probability-weighted averaging of the appropriate concentration tensor, which is established by the micromechanical procedure. This approach provides access to the local field quantities throughout the constituents, from which initiation of damage in the composite can be predicted. In contrast, a typical macromechanical procedure can determine the effective composite elastic properties in the presence of statistical fiber misalignment, but cannot provide the local fields. Fully random fiber distribution is presented as a special case using the proposed micromechanical method. Results are given that illustrate the effects of various amounts of fiber misalignment in terms of the standard deviations of in-plane and out-of-plane misalignment angles, where normal distributions have been employed. Damage initiation envelopes, local fields, effective moduli, and strengths are predicted for polymer and ceramic matrix composites with given normal distributions of misalignment angles, as well as fully random fiber orientation.

  14. A new statistical method for transfer coefficient calculations in the framework of the general multiple-compartment model of transport for radionuclides in biological systems.

    PubMed

    Garcia, F; Arruda-Neto, J D; Manso, M V; Helene, O M; Vanin, V R; Rodriguez, O; Mesa, J; Likhachev, V P; Filho, J W; Deppman, A; Perez, G; Guzman, F; de Camargo, S P

    1999-10-01

    A new and simple statistical procedure (STATFLUX) for the calculation of transfer coefficients of radionuclide transport to animals and plants is proposed. The method is based on the general multiple-compartment model, which uses a system of linear equations involving geometrical volume considerations. By using experimentally available curves of radionuclide concentrations versus time, for each animal compartment (organs), flow parameters were estimated by employing a least-squares procedure, whose consistency is tested. Some numerical results are presented in order to compare the STATFLUX transfer coefficients with those from other works and experimental data.
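For the simplest one-compartment case, the least-squares step described above can be sketched as follows; this is an illustrative reduction, not the STATFLUX procedure itself. The transfer (elimination) coefficient k in C(t) = C0·exp(−kt) is recovered by linear least squares on log-concentrations, using synthetic data.

```python
import math

def fit_transfer_coefficient(times, concentrations):
    """Return (k, C0) from a log-linear least-squares fit of C(t) = C0*exp(-k t)."""
    ys = [math.log(c) for c in concentrations]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return -slope, math.exp(ybar - slope * tbar)

# Synthetic concentration-versus-time data from k = 0.25, C0 = 10
# (noise-free, so the fit recovers the true parameters).
times = [0.0, 1.0, 2.0, 4.0, 8.0]
conc = [10.0 * math.exp(-0.25 * t) for t in times]
k, c0 = fit_transfer_coefficient(times, conc)
```

The multi-compartment case replaces this scalar fit with a linear system over all inter-compartment flows, but the least-squares principle is the same.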

  15. Randomised controlled trial to assess the effect of a Just-in-Time training on procedural performance: a proof-of-concept study to address procedural skill decay.

    PubMed

    Branzetti, Jeremy B; Adedipe, Adeyinka A; Gittinger, Matthew J; Rosenman, Elizabeth D; Brolliar, Sarah; Chipman, Anne K; Grand, James A; Fernandez, Rosemarie

    2017-11-01

A subset of high-risk procedures present significant safety threats due to their (1) infrequent occurrence, (2) execution under time constraints and (3) immediate necessity for patient survival. A Just-in-Time (JIT) intervention could provide real-time bedside guidance to improve high-risk procedural performance and address procedural deficits associated with skill decay. To evaluate the impact of a novel JIT intervention on transvenous pacemaker (TVP) placement during a simulated patient event. This was a prospective, randomised controlled study to determine the effect of a JIT intervention on performance of TVP placement. Subjects included board-certified emergency medicine physicians from two hospitals. The JIT intervention consisted of a portable, bedside computer-based procedural adjunct. The primary outcome was performance during a simulated patient encounter requiring TVP placement, as assessed by trained raters using a technical skills checklist. Secondary outcomes included global performance ratings, time to TVP placement, number of critical omissions and System Usability Scale scores (intervention only). Groups were similar at baseline across all outcomes. Compared with the control group, the intervention group demonstrated statistically significant improvement in the technical checklist score (11.45 vs 23.44, p<0.001, Cohen's d effect size 4.64), the global rating scale (2.27 vs 4.54, p<0.001, Cohen's d effect size 3.76), and a statistically significant reduction in critical omissions (2.23 vs 0.68, p<0.001, Cohen's d effect size -1.86). The difference in time to procedural completion was not statistically significant between conditions (11.15 min vs 12.80 min, p=0.12, Cohen's d effect size 0.65). System Usability Scale scores demonstrated excellent usability. A JIT intervention improved procedural performance, suggesting a role for JIT interventions in rarely performed procedures.
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  16. Regression assumptions in clinical psychology research practice-a systematic review of common misconceptions.

    PubMed

    Ernst, Anja F; Albers, Casper J

    2017-01-01

Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking.
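The misconception flagged here — that the variables themselves, rather than the errors, must be normal — can be illustrated directly. In this hedged sketch, the predictor is heavily skewed and the response inherits that skew, yet the residuals from an ordinary least-squares fit are approximately normal because the errors are; all data are simulated.

```python
import random
from statistics import mean, stdev

def skewness(xs):
    """Standardized third moment; ~0 for normal data."""
    m, s = mean(xs), stdev(xs)
    return sum(((x - m) / s) ** 3 for x in xs) / len(xs)

def ols_residuals(xs, ys):
    """Residuals from a simple least-squares regression of y on x."""
    xbar, ybar = mean(xs), mean(ys)
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = ybar - b * xbar
    return [y - (a + b * x) for x, y in zip(xs, ys)]

rng = random.Random(0)
xs = [rng.expovariate(1.0) for _ in range(5000)]   # heavily skewed predictor
ys = [2.0 * x + rng.gauss(0, 1) for x in xs]       # normal errors
res = ols_residuals(xs, ys)
# ys is clearly skewed; the residuals are not. Checking normality of the
# variables here would wrongly condemn a perfectly valid regression model.
```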

  17. Regression assumptions in clinical psychology research practice—a systematic review of common misconceptions

    PubMed Central

    Ernst, Anja F.

    2017-01-01

Misconceptions about the assumptions behind the standard linear regression model are widespread and dangerous. These lead to using linear regression when inappropriate, and to employing alternative procedures with less statistical power when unnecessary. Our systematic literature review investigated employment and reporting of assumption checks in twelve clinical psychology journals. Findings indicate that normality of the variables themselves, rather than of the errors, was wrongly held to be a necessary assumption in 4% of papers that use regression. Furthermore, 92% of all papers using linear regression were unclear about their assumption checks, violating APA recommendations. This paper appeals for heightened awareness of, and increased transparency in, the reporting of statistical assumption checking. PMID:28533971

  18. Statistics for demodulation RFI in inverting operational amplifier circuits

    NASA Astrophysics Data System (ADS)

    Sutu, Y.-H.; Whalen, J. J.

An investigation was conducted to determine statistical variations of RFI demodulation responses in operational amplifier (op amp) circuits. Attention is given to the experimental procedures employed, a three-stage op amp LED experiment, NCAP (Nonlinear Circuit Analysis Program) simulations of demodulation RFI in 741 op amps, and a comparison of RFI in four op amp types. Three major recommendations for future investigations are presented on the basis of the obtained results. One concerns additional measurements of demodulation RFI in inverting amplifiers; another suggests the use of an automatic measurement system. It is also proposed that additional NCAP simulations be conducted in which parasitic effects are accounted for more thoroughly.

  19. Exposure of the surgeon's hands to radiation during hand surgery procedures.

    PubMed

    Żyluk, Andrzej; Puchalski, Piotr; Szlosser, Zbigniew; Dec, Paweł; Chrąchol, Joanna

    2014-01-01

The objective of the study was to assess the time of exposure of the surgeon's hands to radiation and to calculate the equivalent dose absorbed during surgery of hand and wrist fractures with C-arm fluoroscope guidance. The necessary data specified by the objective of the study were acquired from operations of 287 patients with fractures of fingers, metacarpals, wrist bones and distal radius. 218 operations (78%) were percutaneous procedures and 60 (22%) were performed by open method. Data on the time of exposure and dose of radiation were acquired from the display of the fluoroscope, where they were automatically generated. These data were assigned to the individual patient, type of fracture, method of surgery and the operating surgeon. Fixations of distal radial fractures required longer times of radiation exposure (mean 61 sec.) than fractures of the wrist/metacarpals and fingers (38 and 32 sec., respectively), which was associated with absorption of significantly higher equivalent doses. Fixations of distal radial fractures by open method were associated with statistically significantly higher equivalent doses (0.41 mSv) than percutaneous procedures (0.3 mSv). Fixations of wrist and metacarpal bone fractures by open method were associated with lower equivalent doses (0.34 mSv) than percutaneous procedures (0.37 mSv), but the difference was not significant. Fixations of finger fractures by open method were associated with lower equivalent doses (0.13 mSv) than percutaneous procedures (0.24 mSv), the difference being statistically non-significant. Statistically significant differences in exposure time and equivalent doses were noted between 4 surgeons participating in the study, but no definitive relationship was found between these parameters and surgeons' employment time. 1. Hand surgery procedures under fluoroscopic guidance are associated with mild exposure of the surgeons' hands to radiation. 2. The equivalent dose was related to the type of fracture, operative technique and, to some degree, the time of employment of the surgeon.

  20. Mimic expert judgement through automated procedure for selecting rainfall events responsible for shallow landslide: A statistical approach to validation

    NASA Astrophysics Data System (ADS)

    Giovanna, Vessia; Luca, Pisano; Carmela, Vennari; Mauro, Rossi; Mario, Parise

    2016-01-01

This paper proposes an automated method for the selection of rainfall data (duration, D, and cumulated rainfall, E) responsible for shallow landslide initiation. The method mimics an expert identifying D and E from rainfall records through a manual procedure whose rules are applied according to his or her judgement. The comparison between the two methods is based on 300 D-E pairs drawn from temporal rainfall data series recorded in a 30-day time-lag before the landslide occurrence. Statistical tests, applied to the D and E samples as both paired and independent values to verify whether they belong to the same population, show that the automated procedure is able to replicate the pairs drawn by expert judgement. Furthermore, a criterion based on cumulative distribution functions (CDFs) is proposed to select, among the 6 pairs drawn from the coded procedure, the D-E pair most closely related to the expert one for tracing the empirical rainfall threshold line.

  1. A Review of Structural Equation Modeling Applications in Turkish Educational Science Literature, 2010-2015

    ERIC Educational Resources Information Center

    Karakaya-Ozyer, Kubra; Aksu-Dunya, Beyza

    2018-01-01

    Structural equation modeling (SEM) is one of the most popular multivariate statistical techniques in Turkish educational research. This study elaborates the SEM procedures employed by 75 educational research articles which were published from 2010 to 2015 in Turkey. After documenting and coding 75 academic papers, categorical frequencies and…

  2. Exploring Preferences of Mentoring Activities among Generational Groups of Registered Nurses in Florida

    ERIC Educational Resources Information Center

    Posey-Goodwin, Patricia Ann

    2013-01-01

    The purpose of this study was to explore differences in perceptions of mentoring activities from four generations of registered nurses in Florida, using the Alleman Mentoring Activities Questionnaire ® (AMAQ ®). Statistical procedures of analysis of variance (ANOVA) were employed to explore differences among 65 registered nurses in Florida from…

  3. Cortisol Release in Infants in Response to Inoculation.

    ERIC Educational Resources Information Center

    Lewis, Michael; Thomas, David

    1990-01-01

    Data provide strong evidence that studies of stress and cortisol release in infants must take into account basal level, circadian rhythm, and behavioral effects and employ appropriate statistical procedures. Participants were infants of two, four, and six months of age from whom salivary cortisol was obtained before and 15 minutes after an…

  4. Exploring Emotion in the Higher Education Workplace: Capturing Contrasting Perspectives Using Q Methodology

    ERIC Educational Resources Information Center

    Woods, Charlotte

    2012-01-01

    This article presents an original application of Q methodology in investigating the challenging arena of emotion in the Higher Education (HE) workplace. Q's strength lies in capturing holistic, subjective accounts of complex and contested phenomena but is unusual in employing a statistical procedure within an interpretivist framework. Here Q is…

  5. Professional School Counseling (PSC) Publication Pattern Review: A Meta-Study of Author and Article Characteristics from the First 15 Years

    ERIC Educational Resources Information Center

    Erford, Bradley T.; Giguere, Monica; Glenn, Kacie; Ciarlone, Hallie

    2015-01-01

    Patterns of articles published in "Professional School Counseling" (PSC) from the first 15 volumes were reviewed in this meta-study. Author characteristics (e.g., sex, employment setting, nation of domicile) and article characteristics (e.g., topic, type, design, sample, sample size, participant type, statistical procedures and…

  6. A cloud and radiation model-based algorithm for rainfall retrieval from SSM/I multispectral microwave measurements

    NASA Technical Reports Server (NTRS)

    Xiang, Xuwu; Smith, Eric A.; Tripoli, Gregory J.

    1992-01-01

    A hybrid statistical-physical retrieval scheme is explored which combines a statistical approach with an approach based on the development of cloud-radiation models designed to simulate precipitating atmospheres. The algorithm employs the detailed microphysical information from a cloud model as input to a radiative transfer model which generates a cloud-radiation model database. Statistical procedures are then invoked to objectively generate an initial guess composite profile data set from the database. The retrieval algorithm has been tested for a tropical typhoon case using Special Sensor Microwave/Imager (SSM/I) data and has shown satisfactory results.

  7. Analysis of the procedures used to evaluate suicide crime scenes in Brazil: a statistical approach to interpret reports.

    PubMed

    Bruni, Aline Thaís; Velho, Jesus Antonio; Ferreira, Arthur Serra Lopes; Tasso, Maria Júlia; Ferrari, Raíssa Santos; Yoshida, Ricardo Luís; Dias, Marcos Salvador; Leite, Vitor Barbanti Pereira

    2014-08-01

This study uses statistical techniques to evaluate reports on suicide scenes; it utilizes 80 reports from different locations in Brazil, randomly collected from both federal and state jurisdictions. We aimed to assess a heterogeneous group of cases in order to obtain an overall perspective of the problem. We evaluated variables regarding the characteristics of the crime scene, such as the detected traces (blood, instruments and clothes) that were found, and we addressed the methodology employed by the experts. A qualitative approach using basic statistics revealed a wide distribution as to how the issue was addressed in the documents. We examined a quantitative approach involving an empirical equation and we used multivariate procedures to validate the quantitative methodology proposed for this empirical equation. The methodology successfully identified the main differences in the information presented in the reports, showing that there is no standardized method of analyzing evidence. Copyright © 2014 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.

  8. Time series modeling and forecasting using memetic algorithms for regime-switching models.

    PubMed

    Bergmeir, Christoph; Triguero, Isaac; Molina, Daniel; Aznarte, José Luis; Benitez, José Manuel

    2012-11-01

    In this brief, we present a novel model fitting procedure for the neuro-coefficient smooth transition autoregressive model (NCSTAR), as presented by Medeiros and Veiga. The model is endowed with a statistically founded iterative building procedure and can be interpreted in terms of fuzzy rule-based systems. The interpretability of the generated models and a mathematically sound building procedure are two very important properties of forecasting models. The model fitting procedure employed by the original NCSTAR is a combination of initial parameter estimation by a grid search procedure with a traditional local search algorithm. We propose a different fitting procedure, using a memetic algorithm, in order to obtain more accurate models. An empirical evaluation of the method is performed, applying it to various real-world time series originating from three forecasting competitions. The results indicate that we can significantly enhance the accuracy of the models, making them competitive to models commonly used in the field.
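The memetic idea — an evolutionary search whose offspring are polished by a local search step — can be sketched on a toy problem. The example below minimizes a simple one-dimensional objective rather than fitting an NCSTAR model, and all algorithm settings are illustrative assumptions.

```python
import random

def local_search(f, x, step=0.1, iters=25):
    """Greedy hill-climbing refinement of a single candidate solution."""
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
        step *= 0.7
    return x

def memetic_minimize(f, rng, pop_size=20, generations=30, span=10.0):
    """Evolutionary search; each child is refined by local search (the memetic step)."""
    pop = [rng.uniform(-span, span) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=f)
        parents = pop[: pop_size // 2]              # elitist selection
        children = [rng.choice(parents) + rng.gauss(0, 0.5) for _ in parents]
        pop = parents + [local_search(f, c) for c in children]
    return min(pop, key=f)

# Toy objective with minimum near x = 2.75.
f = lambda x: (x - 3.0) ** 2 + 0.5 * abs(x)
best = memetic_minimize(f, random.Random(42))
```

Replacing the toy objective with a model's fitting error turns this skeleton into the kind of hybrid global/local parameter search the abstract describes.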

  9. Box-Counting Dimension Revisited: Presenting an Efficient Method of Minimizing Quantization Error and an Assessment of the Self-Similarity of Structural Root Systems

    PubMed Central

    Bouda, Martin; Caplan, Joshua S.; Saiers, James E.

    2016-01-01

    Fractal dimension (FD), estimated by box-counting, is a metric used to characterize plant anatomical complexity or space-filling characteristic for a variety of purposes. The vast majority of published studies fail to evaluate the assumption of statistical self-similarity, which underpins the validity of the procedure. The box-counting procedure is also subject to error arising from arbitrary grid placement, known as quantization error (QE), which is strictly positive and varies as a function of scale, making it problematic for the procedure's slope estimation step. Previous studies either ignore QE or employ inefficient brute-force grid translations to reduce it. The goals of this study were to characterize the effect of QE due to translation and rotation on FD estimates, to provide an efficient method of reducing QE, and to evaluate the assumption of statistical self-similarity of coarse root datasets typical of those used in recent trait studies. Coarse root systems of 36 shrubs were digitized in 3D and subjected to box-counts. A pattern search algorithm was used to minimize QE by optimizing grid placement and its efficiency was compared to the brute force method. The degree of statistical self-similarity was evaluated using linear regression residuals and local slope estimates. QE, due to both grid position and orientation, was a significant source of error in FD estimates, but pattern search provided an efficient means of minimizing it. Pattern search had higher initial computational cost but converged on lower error values more efficiently than the commonly employed brute force method. Our representations of coarse root system digitizations did not exhibit details over a sufficient range of scales to be considered statistically self-similar and informatively approximated as fractals, suggesting a lack of sufficient ramification of the coarse root systems for reiteration to be thought of as a dominant force in their development. 
FD estimates did not characterize the scaling of our digitizations well: the scaling exponent was a function of scale. Our findings serve as a caution against applying FD under the assumption of statistical self-similarity without rigorously evaluating it first. PMID:26925073
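The box-counting procedure, including translation of the grid to reduce quantization error, can be sketched in two dimensions (the study works with 3D digitizations and a pattern search optimizer; a coarse brute-force translation stands in for it here). Points on a line segment should recover a dimension near 1.

```python
import math

def box_count(points, size, offset=(0.0, 0.0)):
    """Number of grid boxes of the given size occupied by the point set."""
    boxes = {(math.floor((x - offset[0]) / size),
              math.floor((y - offset[1]) / size)) for x, y in points}
    return len(boxes)

def min_count_over_translations(points, size, grid=4):
    """Reduce quantization error by minimizing the count over grid offsets."""
    shifts = [size * k / grid for k in range(grid)]
    return min(box_count(points, size, (dx, dy))
               for dx in shifts for dy in shifts)

points = [(t / 1000.0, t / 1000.0) for t in range(1001)]   # a line segment
sizes = [2.0 ** -k for k in range(2, 7)]
counts = [min_count_over_translations(points, s) for s in sizes]

# Slope of log N(s) versus log(1/s) estimates the box-counting dimension.
xs = [math.log(1.0 / s) for s in sizes]
ys = [math.log(n) for n in counts]
xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
fd = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
      / sum((x - xbar) ** 2 for x in xs))
```

Checking that the local slopes are stable across scales — not just fitting one global slope — is precisely the self-similarity assessment the abstract argues is usually skipped.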

  10. A spin column-free approach to sodium hydroxide-based glycan permethylation.

    PubMed

    Hu, Yueming; Borges, Chad R

    2017-07-24

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues-yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. 
When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.

  11. A spin column-free approach to sodium hydroxide-based glycan permethylation†

    PubMed Central

    Hu, Yueming; Borges, Chad R.

    2018-01-01

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues—yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based “glycan node” analysis results. 
When applied to blood plasma samples from stage III–IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens. PMID:28635997

  12. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
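The diffusivity fit that underlies such analyses can be sketched as a linear regression of the ensemble-averaged mean squared displacement (MSD) against time. A minimal 1-D illustration on synthetic random-walk data follows; the function name and parameter values are ours, not from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_diffusivity(positions, dt):
    """Estimate a 1-D diffusivity from an ensemble of trajectories.

    positions: array (n_traj, n_steps) of particle positions
    dt: time step between frames
    Returns D from a linear fit of MSD(t) = 2*D*t.
    """
    disp = positions - positions[:, :1]      # displacement from start
    msd = np.mean(disp**2, axis=0)           # ensemble-averaged MSD
    t = np.arange(positions.shape[1]) * dt
    slope = np.polyfit(t, msd, 1)[0]         # least-squares slope
    return slope / 2.0                       # MSD = 2*D*t in one dimension

# synthetic random walk with known D = 0.5 (steps ~ N(0, 2*D*dt))
dt, D_true, n_traj, n_steps = 0.1, 0.5, 2000, 200
steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(n_traj, n_steps))
traj = np.cumsum(steps, axis=1)
D_est = estimate_diffusivity(traj, dt)
```

With few trajectories or short runs the fitted slope fluctuates strongly, which is exactly the statistical variance the paper quantifies.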

  13. Decision rules for unbiased inventory estimates

    NASA Technical Reports Server (NTRS)

    Argentiero, P. D.; Koch, D.

    1979-01-01

    An efficient and accurate procedure for estimating inventories from remote sensing scenes is presented. In place of the conventional and expensive full dimensional Bayes decision rule, a one-dimensional feature extraction and classification technique was employed. It is shown that this efficient decision rule can be used to develop unbiased inventory estimates and that for large sample sizes typical of satellite derived remote sensing scenes, resulting accuracies are comparable or superior to more expensive alternative procedures. Mathematical details of the procedure are provided in the body of the report and in the appendix. Results of a numerical simulation of the technique using statistics obtained from an observed LANDSAT scene are included. The simulation demonstrates the effectiveness of the technique in computing accurate inventory estimates.
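A standard way to obtain unbiased inventory estimates from a biased classifier, in the spirit described above, is to invert the misclassification-probability matrix; the matrix values below are illustrative placeholders, not figures from the report:

```python
import numpy as np

# P[i, j] = probability that a pixel whose true class is j is labeled i
# (columns sum to 1); in practice P is estimated from training data.
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

observed = np.array([500.0, 500.0])      # raw classifier counts per class
unbiased = np.linalg.solve(P, observed)  # invert E[observed] = P @ true
```

Because each column of P sums to one, the correction redistributes counts between classes without changing the total.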

  14. Motivations for seeking minimally invasive cosmetic procedures in an academic outpatient setting.

    PubMed

    Sobanko, Joseph F; Taglienti, Anthony J; Wilson, Anthony J; Sarwer, David B; Margolis, David J; Dai, Julia; Percec, Ivona

    2015-11-01

The demand for minimally invasive cosmetic procedures has continued to rise, yet few studies have examined this patient population. This study sought to define the demographics, social characteristics, and motivations of patients seeking minimally invasive facial cosmetic procedures. A prospective, single-institution cohort study of 72 patients was conducted from 2011 through 2014 at an urban academic medical center. Patients were aged 25 through 70 years; presented for botulinum toxin or soft tissue filler injections; and completed demographic, informational, and psychometric questionnaires before treatment. Descriptive statistics were conducted using Stata statistical software. The average patient was 47.8 years old, was married, had children, was employed, possessed a college or advanced degree, and reported an above-average income. Most patients felt that the first signs of aging occurred around their eyes (74.6%), and a similar percentage expressed that this area was the site most desired for rejuvenation. Almost one-third of patients experienced a "major life event" within the preceding year, nearly half had sought prior counseling from a mental health specialist, and 23.6% were being actively prescribed psychiatric medication at the time of treatment. Patients undergoing injectable aesthetic treatments in an urban outpatient academic center were mostly employed, highly educated, affluent women who believed that their procedure would positively impact their appearance. A significant minority experienced a major life event within the past year, which an astute clinician should address during the initial patient consultation. This study helps to better understand the psychosocial factors characterizing this patient population. Level of Evidence: 4 (Therapeutic). © 2015 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  15. Automated Box-Cox Transformations for Improved Visual Encoding.

    PubMed

    Maciejewski, Ross; Pattath, Avin; Ko, Sungahn; Hafen, Ryan; Cleveland, William S; Ebert, David S

    2013-01-01

    The concept of preconditioning data (utilizing a power transformation as an initial step) for analysis and visualization is well established within the statistical community and is employed as part of statistical modeling and analysis. Such transformations condition the data to various inherent assumptions of statistical inference procedures, as well as making the data more symmetric and easier to visualize and interpret. In this paper, we explore the use of the Box-Cox family of power transformations to semiautomatically adjust visual parameters. We focus on time-series scaling, axis transformations, and color binning for choropleth maps. We illustrate the usage of this transformation through various examples, and discuss the value and some issues in semiautomatically using these transformations for more effective data visualization.
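A minimal sketch of the preconditioning step, using SciPy's implementation of the Box-Cox family (maximum-likelihood selection of the power parameter is the standard approach; the data here are synthetic):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
skewed = rng.lognormal(mean=0.0, sigma=1.0, size=5000)  # right-skewed data

# boxcox returns the transformed values and the MLE of the power parameter;
# for lognormal data the fitted lambda should be near 0 (a log transform).
transformed, lam = stats.boxcox(skewed)
```

The transformed values are far more symmetric, which is what makes axis scaling and color binning behave better downstream.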

  16. A statistical probe into variability within total ozone time series over Arosa, Switzerland (9.68°E, 46.78°N)

    NASA Astrophysics Data System (ADS)

    Chakraborthy, Parthasarathi; Chattopadhyay, Surajit

    2013-02-01

The endeavor of the present paper is to investigate the statistical properties of the total ozone concentration time series over Arosa, Switzerland (9.68°E, 46.78°N). For this purpose, different statistical data analysis procedures have been employed to analyze the mean monthly total ozone concentration data collected over a period of 40 years (1932-1971) at the above location. Based on computations on the available data set, the study reports different degrees of variation in different months. The month of July is reported as the month of lowest variability. April and May are found to be the most correlated months with respect to total ozone concentration.

  17. Satellite temperature monitoring and prediction system

    NASA Technical Reports Server (NTRS)

    Barnett, U. R.; Martsolf, J. D.; Crosby, F. L.

    1980-01-01

The paper describes the Florida Satellite Freeze Forecast System (SFFS) in its current state. All data collection options have been demonstrated, and data collected over a three-year period have been stored for future analysis. Presently, specific minimum temperature forecasts are issued routinely from November through March. The procedures for issuing these forecasts are discussed. The automated data acquisition and processing system is described, and the physical and statistical models employed are examined.

  18. The compartment bag test (CBT) for enumerating fecal indicator bacteria: Basis for design and interpretation of results.

    PubMed

    Gronewold, Andrew D; Sobsey, Mark D; McMahan, Lanakila

    2017-06-01

    For the past several years, the compartment bag test (CBT) has been employed in water quality monitoring and public health protection around the world. To date, however, the statistical basis for the design and recommended procedures for enumerating fecal indicator bacteria (FIB) concentrations from CBT results have not been formally documented. Here, we provide that documentation following protocols for communicating the evolution of similar water quality testing procedures. We begin with an overview of the statistical theory behind the CBT, followed by a description of how that theory was applied to determine an optimal CBT design. We then provide recommendations for interpreting CBT results, including procedures for estimating quantiles of the FIB concentration probability distribution, and the confidence of compliance with recognized water quality guidelines. We synthesize these values in custom user-oriented 'look-up' tables similar to those developed for other FIB water quality testing methods. Modified versions of our tables are currently distributed commercially as part of the CBT testing kit. Published by Elsevier B.V.
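The statistical basis referred to is the most-probable-number (MPN) framework: under Poisson sampling, a compartment of volume v is positive with probability 1 - exp(-c*v). A sketch of the maximum-likelihood concentration estimate follows; the compartment volumes are illustrative placeholders, not the actual CBT design values:

```python
import numpy as np
from scipy.optimize import minimize_scalar

def mpn_estimate(volumes, positive):
    """Most-probable-number MLE of concentration (organisms per mL).

    volumes: compartment volumes in mL; positive: boolean growth results.
    Under Poisson sampling, P(compartment positive) = 1 - exp(-c*v).
    """
    volumes = np.asarray(volumes, float)
    positive = np.asarray(positive, bool)

    def nll(log_c):  # negative log-likelihood, parameterized on log scale
        c = np.exp(log_c)
        p_pos = 1.0 - np.exp(-c * volumes)
        ll = (np.log(np.clip(p_pos[positive], 1e-300, None)).sum()
              - c * volumes[~positive].sum())
        return -ll

    res = minimize_scalar(nll, bounds=(-10, 5), method="bounded")
    return np.exp(res.x)

# illustrative volumes (mL) -- not the actual CBT compartment sizes
vols = [1, 3, 10, 30, 56]
mpn1 = mpn_estimate(vols, positive=[False, False, True, True, True])
mpn2 = mpn_estimate(vols, positive=[False, False, False, True, True])
```

More positive compartments imply a higher estimated concentration, which is the monotonicity the look-up tables encode.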

  19. A Low-Cost Method for Multiple Disease Prediction.

    PubMed

    Bayati, Mohsen; Bhaskar, Sonia; Montanari, Andrea

    Recently, in response to the rising costs of healthcare services, employers that are financially responsible for the healthcare costs of their workforce have been investing in health improvement programs for their employees. A main objective of these so called "wellness programs" is to reduce the incidence of chronic illnesses such as cardiovascular disease, cancer, diabetes, and obesity, with the goal of reducing future medical costs. The majority of these wellness programs include an annual screening to detect individuals with the highest risk of developing chronic disease. Once these individuals are identified, the company can invest in interventions to reduce the risk of those individuals. However, capturing many biomarkers per employee creates a costly screening procedure. We propose a statistical data-driven method to address this challenge by minimizing the number of biomarkers in the screening procedure while maximizing the predictive power over a broad spectrum of diseases. Our solution uses multi-task learning and group dimensionality reduction from machine learning and statistics. We provide empirical validation of the proposed solution using data from two different electronic medical records systems, with comparisons to a statistical benchmark.
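Multi-task feature selection of the kind described can be sketched with scikit-learn's MultiTaskLasso, whose group penalty forces each biomarker to be kept or dropped jointly across all diseases; the data and parameter choices below are synthetic illustrations, not the authors' method:

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, tasks = 200, 30, 4               # patients, biomarkers, diseases
X = rng.normal(size=(n, p))
W = np.zeros((p, tasks))
W[:5] = rng.uniform(0.5, 1.5, size=(5, tasks))  # only 5 biomarkers matter
Y = X @ W + 0.1 * rng.normal(size=(n, tasks))

# the L2,1 penalty zeroes whole biomarker rows across all tasks at once
model = MultiTaskLasso(alpha=0.1).fit(X, Y)
selected = np.where(np.any(model.coef_ != 0, axis=0))[0]  # coef_: (tasks, p)
```

The screening panel then consists only of the `selected` biomarkers, shrinking the assay while retaining predictive power across all diseases simultaneously.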

  20. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
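The ROS idea can be sketched for the single-detection-limit case: fit a line to the log concentrations of detected values versus normal quantiles at their overall ranks, then impute the censored tail from that line. This simplified version (the function name and plotting-position choice are ours; the published method handles multiple detection limits) looks like:

```python
import numpy as np
from scipy import stats

def ros_summary(detects, n_censored):
    """Minimal regression-on-order-statistics for one detection limit.

    detects: observed (uncensored) values; n_censored: count of nondetects,
    all below min(detects). Fits log(value) vs normal quantile on detects,
    then imputes the censored tail from that line. A sketch only --
    multiple detection limits require the full Helsel/Cohn positions.
    """
    n = len(detects) + n_censored
    order = np.sort(detects)
    ranks = n_censored + np.arange(1, len(detects) + 1)  # overall ranks
    pp = (ranks - 0.375) / (n + 0.25)                    # Blom positions
    q = stats.norm.ppf(pp)
    slope, intercept = np.polyfit(q, np.log(order), 1)
    # impute nondetects from the fitted line at their plotting positions
    pp_cens = (np.arange(1, n_censored + 1) - 0.375) / (n + 0.25)
    imputed = np.exp(intercept + slope * stats.norm.ppf(pp_cens))
    full = np.concatenate([imputed, order])
    return full.mean(), full.std(ddof=1)

# lognormal sample censored at its 25th percentile
rng = np.random.default_rng(2)
x = rng.lognormal(0.0, 1.0, size=200)
dl = np.quantile(x, 0.25)
mean_est, sd_est = ros_summary(x[x >= dl], int((x < dl).sum()))
```

Because the detects enter unchanged and only the censored tail is modeled, the summary statistics are robust to moderate departures from the fitted distribution.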

  1. Remote secure proof of identity using biometrics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, S. K.; Pearson, P.; Strait, R.S.

    1997-06-10

Biometric measurements derived from finger- or voiceprints, hand geometry, retinal vessel pattern, iris texture characteristics, etc. can be identifiers of individuals. In each case, the measurements can be coded into a statistically unique bit-string for each individual. While in electronic commerce and other electronic transactions the proof of identity of an individual is provided by the use of either public key cryptography or biometric data, more secure applications can be achieved by employing both. However, the former requires the use of exact bit patterns. An error correction procedure allows us to successfully combine the use of both to provide a general procedure for remote secure proof of identity using a generic biometric device. One such procedure has been demonstrated using a device based on hand geometry.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Das, Raj, E-mail: rajdas@nhs.net, E-mail: raj.das@stgeorges.nhs.uk; Lucatelli, Pierleone, E-mail: pierleone.lucatelli@gmail.com; Wang, Haofan, E-mail: wwhhff123@gmail.com

Aim: A clear understanding of operator experience is important in improving technical success whilst minimising risk to patients undergoing endovascular procedures, and there is a need to ensure that trainees have the appropriate skills as primary operators. The aim of the study is to retrospectively analyse uterine artery embolisation (UAE) procedures performed by interventional radiology (IR) trainees at an IR training unit, analysing fluoroscopy times and radiation dose as surrogate markers of technical skill. Methods: Ten IR fellows were primary operator in 200 UAE procedures over a 5-year period. We compared fluoroscopy times, radiation dose and complications after categorising procedures into three groups: Group 1, initial five; Group 2, >5 procedures; and Group 3, penultimate five UAE procedures. We documented factors that may affect screening time (number of vials employed and use of microcatheters). Results: Mean fluoroscopy time was 18.4 (±8.1), 17.3 (±9.0) and 16.3 (±8.4) min in Groups 1, 2 and 3, respectively. There was no statistically significant difference between these groups (p > 0.05) with respect to fluoroscopy time or radiation dose. Analysis after correction for confounding factors showed no statistical significance (p > 0.05). All procedures were technically successful, and the total complication rate was 4%. Conclusion: UAE was chosen as a highly standardised procedure followed by IR practitioners. Although there is a non-significant trend towards shorter screening times with experience, technical success and safety were not compromised with appropriate consultant supervision, which illustrates a safe construct for IR training. This is important and reassuring information for patients undergoing a procedure in a training unit.

  3. Prevention of the Posttraumatic Fibrotic Response in Joints

    DTIC Science & Technology

    2015-10-01

    surgical procedures and subsequent collection of tissues have been developed and are currently used on a regular basis. Major Task 4: Evaluating the...needed to evaluate the utility of the inhibitory antibody to reduce the flexion contracture of injured knee joints. The employed techniques include...second surgery to remove a pin, and it did not change by the end of the 32nd week 1. Major Task 5: Task 4. Data analysis and statistical evaluation

  4. Weighted statistical parameters for irregularly sampled time series

    NASA Astrophysics Data System (ADS)

    Rimoldini, Lorenzo

    2014-01-01

    Unevenly spaced time series are common in astronomy because of the day-night cycle, weather conditions, dependence on the source position in the sky, allocated telescope time and corrupt measurements, for example, or inherent to the scanning law of satellites like Hipparcos and the forthcoming Gaia. Irregular sampling often causes clumps of measurements and gaps with no data which can severely disrupt the values of estimators. This paper aims at improving the accuracy of common statistical parameters when linear interpolation (in time or phase) can be considered an acceptable approximation of a deterministic signal. A pragmatic solution is formulated in terms of a simple weighting scheme, adapting to the sampling density and noise level, applicable to large data volumes at minimal computational cost. Tests on time series from the Hipparcos periodic catalogue led to significant improvements in the overall accuracy and precision of the estimators with respect to the unweighted counterparts and those weighted by inverse-squared uncertainties. Automated classification procedures employing statistical parameters weighted by the suggested scheme confirmed the benefits of the improved input attributes. The classification of eclipsing binaries, Mira, RR Lyrae, Delta Cephei and Alpha2 Canum Venaticorum stars employing exclusively weighted descriptive statistics achieved an overall accuracy of 92 per cent, about 6 per cent higher than with unweighted estimators.
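One generic form of such density-adaptive weighting assigns each measurement half the time span to its neighbours, so points in a clump share weight that an isolated point keeps to itself; this is a simplified stand-in for the paper's scheme, not its exact formulation:

```python
import numpy as np

def gap_weighted_mean_std(t, y):
    """Mean/std weighted by local sampling density for uneven time series.

    Each point is weighted by half the time span to its neighbours, so
    clumps of nearly simultaneous measurements do not dominate.
    """
    t = np.asarray(t, float)
    y = np.asarray(y, float)
    idx = np.argsort(t)
    t, y = t[idx], y[idx]
    gaps = np.diff(t)
    w = np.empty_like(t)
    w[0], w[-1] = gaps[0] / 2, gaps[-1] / 2
    w[1:-1] = (gaps[:-1] + gaps[1:]) / 2
    w /= w.sum()
    mean = np.sum(w * y)
    var = np.sum(w * (y - mean) ** 2)
    return mean, np.sqrt(var)

# clumped sampling: 50 points bunched near t=0, 5 spread over [1, 10]
t = np.concatenate([np.linspace(0, 0.01, 50), np.linspace(1, 10, 5)])
y = np.concatenate([np.zeros(50), np.ones(5)])
m_w, s_w = gap_weighted_mean_std(t, y)
```

The unweighted mean of `y` is dominated by the clump (about 0.09), whereas the gap-weighted mean reflects the time-averaged signal.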

  5. 29 CFR 1926.1406 - Assembly/Disassembly-employer procedures-general requirements.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 8 2012-07-01 2012-07-01 false Assembly/Disassembly-employer procedures-general... CONSTRUCTION Cranes and Derricks in Construction § 1926.1406 Assembly/Disassembly—employer procedures—general requirements. (a) When using employer procedures instead of manufacturer procedures for assembly/disassembly...

  6. 29 CFR 1926.1406 - Assembly/Disassembly-employer procedures-general requirements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 8 2014-07-01 2014-07-01 false Assembly/Disassembly-employer procedures-general... CONSTRUCTION Cranes and Derricks in Construction § 1926.1406 Assembly/Disassembly—employer procedures—general requirements. (a) When using employer procedures instead of manufacturer procedures for assembly/disassembly...

  7. 29 CFR 1926.1406 - Assembly/Disassembly-employer procedures-general requirements.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 8 2013-07-01 2013-07-01 false Assembly/Disassembly-employer procedures-general... CONSTRUCTION Cranes and Derricks in Construction § 1926.1406 Assembly/Disassembly—employer procedures—general requirements. (a) When using employer procedures instead of manufacturer procedures for assembly/disassembly...

  8. 29 CFR 1926.1406 - Assembly/Disassembly-employer procedures-general requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 8 2011-07-01 2011-07-01 false Assembly/Disassembly-employer procedures-general... CONSTRUCTION Cranes and Derricks in Construction § 1926.1406 Assembly/Disassembly—employer procedures—general requirements. (a) When using employer procedures instead of manufacturer procedures for assembly/disassembly...

  9. Statistical analysis of radioimmunoassay. In comparison with bioassay (in Japanese)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nakano, R.

    1973-01-01

Using the data of RIA (radioimmunoassay), statistical procedures for dealing with two problems, the linearization of the dose-response curve and the calculation of relative potency, were described. There were three methods for linearization of the dose-response curve of RIA, plotting the following parameters on the horizontal and vertical axes: dose x vs. (B/T)^-1; c/(x + c) vs. B/T (c: the dose at which B/T is 50%); and log x vs. logit B/T. Among them, the last method seems to be the most practical. The statistical procedures for bioassay were employed for calculating the relative potency of unknown samples compared to standard samples from the dose-response curves of standard and unknown samples using the regression coefficient. It is desirable that relative potency be calculated by plotting more than 5 points on the standard curve and more than 2 points for unknown samples. For examining the statistical limit of precision of measurement, LH activity of gonadotropin in urine was measured, and the relative potency, precision coefficient and the upper and lower limits of relative potency at the 95% confidence limit were calculated. In parallel, bioassay (by the ovarian ascorbic acid reduction method and the anterior lobe of prostate weighing method) was done on the same samples, and the precision was compared with that of RIA. In these examinations with RIA, the upper and lower limits of the relative potency at the 95% confidence limit were near each other, while in bioassay a considerable difference was observed between the upper and lower limits. The necessity of standardization and systematization of the statistical procedures for increasing the precision of RIA was pointed out. (JA)
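The logit-log linearization and parallel-line potency calculation described can be sketched as follows; the dose-response model here is a synthetic illustration of a B/T = 1/(1 + x/C) curve, not actual assay data:

```python
import numpy as np

def logit(p):
    return np.log(p / (1 - p))

# standard curve: B/T = 1/(1 + x/C) with C = 4, so logit(B/T) = ln C - ln x
dose_std = np.array([1, 2, 4, 8, 16], float)
bt_std = 1 / (1 + dose_std / 4)
slope, intercept = np.polyfit(np.log(dose_std), logit(bt_std), 1)

# unknown sample behaving like the standard diluted 1:2 (true potency 0.5)
dose_unk = np.array([2, 4, 8], float)
bt_unk = 1 / (1 + (0.5 * dose_unk) / 4)
i_unk = np.polyfit(np.log(dose_unk), logit(bt_unk), 1)[1]

# parallel-line assay: the horizontal shift between the two fitted lines
# on the log-dose axis gives the log relative potency
log_potency = (i_unk - intercept) / slope
potency = np.exp(log_potency)
```

Because both curves are linear and parallel in logit-log coordinates, the relative potency falls out as a simple intercept difference divided by the common slope.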

  10. Statistical inference for Hardy-Weinberg proportions in the presence of missing genotype information.

    PubMed

    Graffelman, Jan; Sánchez, Milagros; Cook, Samantha; Moreno, Victor

    2013-01-01

    In genetic association studies, tests for Hardy-Weinberg proportions are often employed as a quality control checking procedure. Missing genotypes are typically discarded prior to testing. In this paper we show that inference for Hardy-Weinberg proportions can be biased when missing values are discarded. We propose to use multiple imputation of missing values in order to improve inference for Hardy-Weinberg proportions. For imputation we employ a multinomial logit model that uses information from allele intensities and/or neighbouring markers. Analysis of an empirical data set of single nucleotide polymorphisms possibly related to colon cancer reveals that missing genotypes are not missing completely at random. Deviation from Hardy-Weinberg proportions is mostly due to a lack of heterozygotes. Inbreeding coefficients estimated by multiple imputation of the missings are typically lowered with respect to inbreeding coefficients estimated by discarding the missings. Accounting for missings by multiple imputation qualitatively changed the results of 10 to 17% of the statistical tests performed. Estimates of inbreeding coefficients obtained by multiple imputation showed high correlation with estimates obtained by single imputation using an external reference panel. Our conclusion is that imputation of missing data leads to improved statistical inference for Hardy-Weinberg proportions.
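The quality-control check referred to is typically a one-degree-of-freedom chi-square goodness-of-fit test against Hardy-Weinberg expected genotype counts; a minimal sketch follows (the genotype counts are invented):

```python
import numpy as np
from scipy.stats import chi2

def hwe_chisq(n_AA, n_Aa, n_aa):
    """Pearson chi-square test of Hardy-Weinberg proportions (1 df)."""
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)   # estimated frequency of allele A
    expected = np.array([n * p**2, 2 * n * p * (1 - p), n * (1 - p)**2])
    observed = np.array([n_AA, n_Aa, n_aa])
    stat = np.sum((observed - expected) ** 2 / expected)
    return stat, chi2.sf(stat, df=1)

stat, pval = hwe_chisq(298, 489, 213)     # near-HWE counts
stat2, pval2 = hwe_chisq(400, 200, 400)   # strong heterozygote deficit
```

A heterozygote deficit of the kind the paper attributes to non-random missingness inflates the statistic, as the second call illustrates.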

  11. Comparison of Fatigue Life Estimation Using Equivalent Linearization and Time Domain Simulation Methods

    NASA Technical Reports Server (NTRS)

    Mei, Chuh; Dhainaut, Jean-Michel

    2000-01-01

    The Monte Carlo simulation method in conjunction with the finite element large deflection modal formulation are used to estimate fatigue life of aircraft panels subjected to stationary Gaussian band-limited white-noise excitations. Ten loading cases varying from 106 dB to 160 dB OASPL with bandwidth 1024 Hz are considered. For each load case, response statistics are obtained from an ensemble of 10 response time histories. The finite element nonlinear modal procedure yields time histories, probability density functions (PDF), power spectral densities and higher statistical moments of the maximum deflection and stress/strain. The method of moments of PSD with Dirlik's approach is employed to estimate the panel fatigue life.
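The method-of-moments machinery starts from spectral moments of the stress PSD; the sketch below computes those moments and two derived quantities (expected peak rate and irregularity factor) for a band-limited white PSD, leaving out Dirlik's full amplitude-distribution formula:

```python
import numpy as np

def spectral_moments(f, psd, orders=(0, 1, 2, 4)):
    """Spectral moments m_k = integral of f^k * S(f) df (one-sided PSD)."""
    return {k: np.trapz(f**k * psd, f) for k in orders}

# flat (band-limited white) stress PSD from ~0 to 1024 Hz, unit level
f = np.linspace(0.1, 1024, 4096)
psd = np.ones_like(f)
m = spectral_moments(f, psd)

# quantities used by moment-based fatigue methods such as Dirlik's
peak_rate = np.sqrt(m[4] / m[2])        # expected rate of peaks (Hz)
gamma = m[2] / np.sqrt(m[0] * m[4])     # irregularity factor, ~1 narrowband
```

Dirlik's approach combines these moments into an empirical rainflow amplitude distribution, which is then integrated against the S-N curve to give fatigue damage per unit time.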

  12. Robust matching for voice recognition

    NASA Astrophysics Data System (ADS)

    Higgins, Alan; Bahler, L.; Porter, J.; Blais, P.

    1994-10-01

This paper describes an automated method of comparing a voice sample of an unknown individual with samples from known speakers in order to establish or verify the individual's identity. The method is based on a statistical pattern matching approach that employs a simple training procedure, requires no human intervention (transcription, word or phonetic marking, etc.), and makes no assumptions regarding the expected form of the statistical distributions of the observations. The content of the speech material (vocabulary, grammar, etc.) is not assumed to be constrained in any way. An algorithm is described which incorporates frame pruning and channel equalization processes designed to achieve robust performance with reasonable computational resources. An experimental implementation demonstrating the feasibility of the concept is described.

  13. Prospective multi-centre Voxel Based Morphometry study employing scanner specific segmentations: Procedure development using CaliBrain structural MRI data

    PubMed Central

    2009-01-01

    Background Structural Magnetic Resonance Imaging (sMRI) of the brain is employed in the assessment of a wide range of neuropsychiatric disorders. In order to improve statistical power in such studies it is desirable to pool scanning resources from multiple centres. The CaliBrain project was designed to provide for an assessment of scanner differences at three centres in Scotland, and to assess the practicality of pooling scans from multiple-centres. Methods We scanned healthy subjects twice on each of the 3 scanners in the CaliBrain project with T1-weighted sequences. The tissue classifier supplied within the Statistical Parametric Mapping (SPM5) application was used to map the grey and white tissue for each scan. We were thus able to assess within scanner variability and between scanner differences. We have sought to correct for between scanner differences by adjusting the probability mappings of tissue occupancy (tissue priors) used in SPM5 for tissue classification. The adjustment procedure resulted in separate sets of tissue priors being developed for each scanner and we refer to these as scanner specific priors. Results Voxel Based Morphometry (VBM) analyses and metric tests indicated that the use of scanner specific priors reduced tissue classification differences between scanners. However, the metric results also demonstrated that the between scanner differences were not reduced to the level of within scanner variability, the ideal for scanner harmonisation. Conclusion Our results indicate the development of scanner specific priors for SPM can assist in pooling of scan resources from different research centres. This can facilitate improvements in the statistical power of quantitative brain imaging studies. PMID:19445668

  14. Reporting Practices and Use of Quantitative Methods in Canadian Journal Articles in Psychology.

    PubMed

    Counsell, Alyssa; Harlow, Lisa L

    2017-05-01

With recent focus on the state of research in psychology, it is essential to assess the nature of the statistical methods and analyses used and reported by psychological researchers. To that end, we investigated the prevalence of different statistical procedures and the nature of statistical reporting practices in recent articles from the four major Canadian psychology journals. The majority of authors evaluated their research hypotheses through the use of analysis of variance (ANOVA), t-tests, and multiple regression. Multivariate approaches were less common. Null hypothesis significance testing remains a popular strategy, but the majority of authors reported a standardized or unstandardized effect size measure alongside their significance test results. Confidence intervals on effect sizes were infrequently employed. Many authors provided minimal details about their statistical analyses, and fewer than a third of the articles reported on data complications such as missing data and violations of statistical assumptions. Strengths of and areas needing improvement for reporting quantitative results are highlighted. The paper concludes with recommendations for how researchers and reviewers can improve comprehension and transparency in statistical reporting.
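As an illustration of the underreported quantities, Cohen's d with an approximate confidence interval can be computed in a few lines (a large-sample normal approximation, not an exact small-sample interval; the data are synthetic):

```python
import numpy as np
from scipy import stats

def cohens_d_ci(x, y, conf=0.95):
    """Standardized mean difference with a normal-approximation CI.

    Uses the common large-sample variance formula for d; adequate for
    illustration, not a substitute for exact noncentral-t intervals.
    """
    nx, ny = len(x), len(y)
    sp = np.sqrt(((nx - 1) * np.var(x, ddof=1) + (ny - 1) * np.var(y, ddof=1))
                 / (nx + ny - 2))                     # pooled SD
    d = (np.mean(x) - np.mean(y)) / sp
    se = np.sqrt((nx + ny) / (nx * ny) + d**2 / (2 * (nx + ny)))
    z = stats.norm.ppf(0.5 + conf / 2)
    return d, (d - z * se, d + z * se)

rng = np.random.default_rng(3)
d, (lo, hi) = cohens_d_ci(rng.normal(0.5, 1, 80), rng.normal(0.0, 1, 80))
```

Reporting the interval alongside d conveys the precision of the effect estimate, which a bare p-value does not.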

  15. CRISM Hyperspectral Data Filtering with Application to MSL Landing Site Selection

    NASA Astrophysics Data System (ADS)

    Seelos, F. P.; Parente, M.; Clark, T.; Morgan, F.; Barnouin-Jha, O. S.; McGovern, A.; Murchie, S. L.; Taylor, H.

    2009-12-01

    We report on the development and implementation of a custom filtering procedure for Compact Reconnaissance Imaging Spectrometer for Mars (CRISM) IR hyperspectral data that is suitable for incorporation into the CRISM Reduced Data Record (RDR) calibration pipeline. Over the course of the Mars Reconnaissance Orbiter (MRO) Primary Science Phase (PSP) and the ongoing Extended Science Phase (ESP) CRISM has operated with an IR detector temperature between ~107 K and ~127 K. This ~20 K range in operational temperature has resulted in variable data quality, with observations acquired at higher detector temperatures exhibiting a marked increase in both systematic and stochastic noise. The CRISM filtering procedure consists of two main data processing capabilities. The primary systematic noise component in CRISM IR data appears as along track or column oriented striping. This is addressed by the robust derivation and application of an inter-column ratio correction frame. The correction frame is developed through the serial evaluation of band specific column ratio statistics and so does not compromise the spectral fidelity of the image cube. The dominant CRISM IR stochastic noise components appear as isolated data spikes or column oriented segments of variable length with erroneous data values. The non-systematic noise is identified and corrected through the application of an iterative-recursive kernel modeling procedure which employs a formal statistical outlier test as the iteration control and recursion termination criterion. This allows the filtering procedure to make a statistically supported determination between high frequency (spatial/spectral) signal and high frequency noise based on the information content of a given multidimensional data kernel. The governing statistical test also allows the kernel filtering procedure to be self regulating and adaptive to the intrinsic noise level in the data. 
The CRISM IR filtering procedure is scheduled to be incorporated into the next augmentation of the CRISM IR calibration (version 3). The filtering algorithm will be applied to the I/F data (IF) delivered to the Planetary Data System (PDS), but the radiance on sensor data (RA) will remain unfiltered. The development of CRISM hyperspectral analysis products in support of the Mars Science Laboratory (MSL) landing site selection process has motivated the advance of CRISM-specific data processing techniques. The quantitative results of the CRISM IR filtering procedure as applied to CRISM observations acquired in support of MSL landing site selection will be presented.
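The iteration-controlled outlier filtering described in this record can be illustrated with a one-dimensional sketch in Python: a modified z-score against the kernel median serves as the formal outlier test, and the loop terminates when the test flags nothing. All parameters here are hypothetical, and the real procedure operates on multidimensional spatial/spectral kernels.

```python
import statistics

def despike(values, kernel=5, z_crit=3.5, max_iter=20):
    """Iteratively replace statistical outliers with a local median estimate.

    A simplified, hypothetical analogue of the iterative kernel filtering
    idea: a formal outlier test (a modified z-score against the kernel
    median) controls the iteration, so the filter adapts to the intrinsic
    noise level instead of using a fixed absolute threshold.
    """
    data = list(values)
    half = kernel // 2
    for _ in range(max_iter):
        changed = False
        for i in range(len(data)):
            lo, hi = max(0, i - half), min(len(data), i + half + 1)
            window = data[lo:i] + data[i + 1:hi]   # exclude the tested sample
            med = statistics.median(window)
            mad = statistics.median(abs(v - med) for v in window)
            if mad == 0:
                if data[i] != med:   # neighbours agree; isolated spike
                    data[i] = med
                    changed = True
                continue
            z = 0.6745 * abs(data[i] - med) / mad   # modified z-score
            if z > z_crit:
                data[i] = med
                changed = True
        if not changed:   # the statistical test terminates the iteration
            break
    return data
```

For example, `despike([1.0]*5 + [40.0] + [1.0]*5)` returns the spike-free baseline.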

  16. How many records should be used in ASCE/SEI-7 ground motion scaling procedure?

    USGS Publications Warehouse

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in the ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic–perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
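Given spectral ordinates of a record and of the design spectrum at common periods, a single least-squares amplitude scale factor can be computed per record, and the mean spectrum of the scaled set compared to the target. The Python sketch below is an illustrative analogue of spectrum matching by scaling, not the ASCE/SEI-7 procedure itself.

```python
import math

def scale_factor(record_sa, target_sa):
    """Single amplitude-scaling factor matching a record's response spectrum
    to a target spectrum in a least-squares sense on the log scale.
    Inputs are spectral accelerations sampled at the same periods."""
    logs = [math.log(t / r) for r, t in zip(record_sa, target_sa)]
    return math.exp(sum(logs) / len(logs))

def average_spectrum(spectra):
    """Period-by-period mean spectrum of a set of (scaled) records."""
    n = len(spectra)
    return [sum(col) / n for col in zip(*spectra)]
```

As more records enter `average_spectrum`, the record-to-record scatter around the target averages out, echoing the finding that seven or more records improve the response estimate.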

  17. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models.

    PubMed

    Gelfand, Lois A; MacKinnon, David P; DeRubeis, Robert J; Baraldi, Amanda N

    2016-01-01

Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome: underestimation in LIFEREG and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results.
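The product-of-coefficients logic of an AFT-style mediation model can be sketched by regressing the mediator on treatment, then log survival time on both, and multiplying the two path coefficients. The plain-Python sketch below ignores censoring entirely (unlike LIFEREG) and is purely illustrative of the mediation arithmetic, not of the SAS procedures.

```python
def ols(X, y):
    """Least-squares coefficients (intercept first) via normal equations."""
    rows = [[1.0] + list(x) for x in X]
    k = len(rows[0])
    A = [[sum(r[i] * r[j] for r in rows) for j in range(k)] for i in range(k)]
    b = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(k)]
    for col in range(k):                     # Gaussian elimination, pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in reversed(range(k)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j]
                              for j in range(i + 1, k))) / A[i][i]
    return beta

def indirect_effect(x, m, log_t):
    """Product-of-coefficients mediated effect a*b on the log-survival-time
    (AFT) scale, with censoring ignored for clarity."""
    a = ols([[xi] for xi in x], m)[1]      # treatment -> mediator
    b = ols(list(zip(x, m)), log_t)[2]     # mediator -> log T, given x
    return a * b
```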

  18. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    NASA Astrophysics Data System (ADS)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

The increasing demand for carbon neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments using novel wind turbine blade (WTB) designs characterized by high flexibilities and lower buckling capacities. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples that represent the current state. This is done for multivariate damage-sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses and principal component analysis scores from PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments with a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
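The partial autocorrelation coefficients used as damage-sensitive features can be computed with the Durbin-Levinson recursion from the sample autocorrelation function. A plain-Python sketch (not the authors' implementation):

```python
def pacc(series, max_lag):
    """Partial autocorrelation coefficients via the Durbin-Levinson
    recursion, computed from a single vibration response."""
    n = len(series)
    mean = sum(series) / n
    dev = [v - mean for v in series]
    c0 = sum(d * d for d in dev)
    acf = [sum(dev[i] * dev[i + k] for i in range(n - k)) / c0
           for k in range(max_lag + 1)]
    phi = [[0.0] * (max_lag + 1) for _ in range(max_lag + 1)]
    pac = []
    for k in range(1, max_lag + 1):
        if k == 1:
            phi[1][1] = acf[1]
        else:
            num = acf[k] - sum(phi[k - 1][j] * acf[k - j] for j in range(1, k))
            den = 1.0 - sum(phi[k - 1][j] * acf[j] for j in range(1, k))
            phi[k][k] = num / den
            for j in range(1, k):
                phi[k][j] = phi[k - 1][j] - phi[k][k] * phi[k - 1][k - j]
        pac.append(phi[k][k])
    return pac
```

For an AR(1) process the lag-1 PACC approaches the AR coefficient and higher-lag PACCs approach zero, which is what makes them sensitive, compact features.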

  19. A wavelet-based intermittency detection technique from PIV investigations in transitional boundary layers

    NASA Astrophysics Data System (ADS)

    Simoni, Daniele; Lengani, Davide; Guida, Roberto

    2016-09-01

The transition process of the boundary layer growing over a flat plate with pressure gradient simulating the suction side of a low-pressure turbine blade and elevated free-stream turbulence intensity level has been analyzed by means of PIV and hot-wire measurements. A detailed view of the instantaneous flow field in the wall-normal plane highlights the physics characterizing the complex process leading to the formation of large-scale coherent structures during breaking down of the ordered motion of the flow, thus generating randomized oscillations (i.e., turbulent spots). This analysis gives the basis for the development of a new procedure aimed at determining the intermittency function describing (statistically) the transition process. To this end, a wavelet-based method has been employed for the identification of the large-scale structures created during the transition process. Subsequently, a probability density function of these events has been defined so that an intermittency function can be deduced. The latter corresponds closely to the intermittency function of the transitional flow computed through a classical procedure based on hot-wire data. The agreement between the two procedures in the intermittency shape and spot production rate demonstrates the method's capability to provide a statistical representation of the transition process. The main advantages of the proposed procedure are that it is applicable to PIV data, that it does not require a threshold level to discriminate the first- and/or second-order time derivatives of hot-wire time traces (making the method operator-independent), and that it provides clear evidence of the connection between the flow physics and the statistical representation of transition based on the theory of turbulent spot propagation.
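A toy version of wavelet-based event identification can be built from a discrete Ricker (Mexican-hat) wavelet. Note that the procedure in this record is specifically designed to avoid operator-chosen thresholds, whereas this sketch flags events with a simple relative threshold and is illustrative only.

```python
import math

def ricker(length, width):
    """Discrete Ricker (Mexican-hat) wavelet of a given width."""
    return [(1 - (t / width) ** 2) * math.exp(-t * t / (2 * width ** 2))
            for t in (i - length // 2 for i in range(length))]

def wavelet_events(signal, width, frac=0.5):
    """Flag samples whose wavelet coefficient magnitude exceeds a fraction
    of the maximum: a toy stand-in for the event-identification step."""
    w = ricker(6 * width + 1, width)
    half = len(w) // 2
    coeffs = []
    for i in range(len(signal)):
        s = sum(w[j] * signal[i + j - half]
                for j in range(len(w))
                if 0 <= i + j - half < len(signal))
        coeffs.append(s)
    peak = max(abs(c) for c in coeffs)
    return [abs(c) >= frac * peak for c in coeffs]
```

Averaging such event indicators over many snapshots yields an empirical probability of the turbulent state, i.e. an intermittency-function estimate.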

  20. Statistical Learning Analysis in Neuroscience: Aiming for Transparency

    PubMed Central

    Hanke, Michael; Halchenko, Yaroslav O.; Haxby, James V.; Pollmann, Stefan

    2009-01-01

    Encouraged by a rise of reciprocal interest between the machine learning and neuroscience communities, several recent studies have demonstrated the explanatory power of statistical learning techniques for the analysis of neural data. In order to facilitate a wider adoption of these methods, neuroscientific research needs to ensure a maximum of transparency to allow for comprehensive evaluation of the employed procedures. We argue that such transparency requires “neuroscience-aware” technology for the performance of multivariate pattern analyses of neural data that can be documented in a comprehensive, yet comprehensible way. Recently, we introduced PyMVPA, a specialized Python framework for machine learning based data analysis that addresses this demand. Here, we review its features and applicability to various neural data modalities. PMID:20582270

  1. GWAR: robust analysis and meta-analysis of genome-wide association studies.

    PubMed

    Dimou, Niki L; Tsirigos, Konstantinos D; Elofsson, Arne; Bagos, Pantelis G

    2017-05-15

In the context of genome-wide association studies (GWAS), a variety of statistical techniques is available for conducting the analysis, but the underlying genetic model is usually unknown. Under these circumstances, the classical Cochran-Armitage trend test (CATT) is suboptimal. Robust procedures that maximize the power and preserve the nominal type I error rate are preferable. Moreover, performing a meta-analysis using robust procedures is of great interest and has not been addressed previously. The primary goal of this work is to implement several robust methods for analysis and meta-analysis in the statistical package Stata and subsequently to make the software available to the scientific community. The CATT under a recessive, additive and dominant model of inheritance as well as robust methods based on the Maximum Efficiency Robust Test statistic, the MAX statistic and the MIN2 were implemented in Stata. Concerning MAX and MIN2, we calculated their asymptotic null distributions relying on numerical integration, resulting in a great gain in computational time without loss of accuracy. All the aforementioned approaches were employed in a fixed or a random effects meta-analysis setting using summary data with weights equal to the reciprocal of the combined cases and controls. Overall, this is the first complete effort to implement procedures for analysis and meta-analysis in GWAS using Stata. A Stata program and a web-server are freely available for academic users at http://www.compgen.org/tools/GWAR. pbagos@compgen.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
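The classical Cochran-Armitage trend test mentioned above has a closed form for a 2x3 genotype table; the choice of weights encodes the assumed genetic model, which is why the test is suboptimal when that model is unknown. A plain-Python sketch (not the GWAR Stata implementation):

```python
import math

def catt(cases, controls, weights=(0, 1, 2)):
    """Cochran-Armitage trend test for a 2x3 genotype table.

    `cases`/`controls` are counts for the three genotypes; the default
    weights encode the additive model (use (0, 1, 1) for dominant and
    (0, 0, 1) for recessive). Returns the Z statistic and a two-sided
    p-value from the standard normal approximation.
    """
    n = [a + b for a, b in zip(cases, controls)]
    r, s = sum(cases), sum(controls)
    big_n = r + s
    t = sum(w * (big_n * a - r * ni) for w, a, ni in zip(weights, cases, n))
    var = (r * s / big_n) * (
        big_n * sum(w * w * ni for w, ni in zip(weights, n))
        - sum(w * ni for w, ni in zip(weights, n)) ** 2)
    z = t / math.sqrt(var)
    return z, math.erfc(abs(z) / math.sqrt(2))
```

Robust alternatives such as MAX effectively take the extreme of the Z statistics over several weight vectors, which is why their null distributions require numerical integration.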

  2. Using web-based observations to identify thresholds of a person's stability in a flow

    NASA Astrophysics Data System (ADS)

    Milanesi, L.; Pilotti, M.; Bacchi, B.

    2016-10-01

Flood risk assessment and mitigation are important tasks that should take advantage of rational vulnerability models to increase their effectiveness. These models are usually identified through a relevant set of laboratory experiments. However, there is growing evidence that these tests are not fully representative of the variety of conditions that characterize real flood hazard situations. This paper proposes an innovative, citizen-science-based approach to obtain information from web resources for the calibration of people's vulnerability models. A comprehensive study employing commonly used web engines allowed the collection of a wide set of documents showing real risk situations for people impacted by floods, classified according to the stability of the involved subjects. A procedure to extrapolate the flow depth and velocity from the video frames is developed and its reliability is verified by comparing the results with observations. The procedure is based on the statistical distribution of the population height employing a direct uncertainty propagation method. The results complement the experimental literature data and conceptual models. The growing availability of online information will progressively increase the sample size on which the procedure is based and will eventually lead to the identification of a probability surface describing the transition between stability and instability conditions of individuals in a flow.
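The uncertainty-propagation step, from a person's submerged fraction read off a video frame to a flow-depth estimate, can be sketched as Monte Carlo sampling of the population-height distribution. The height parameters below are illustrative placeholders, not values from the study.

```python
import random
import statistics

def depth_distribution(rel_depth, height_mean=1.71, height_sd=0.09,
                       n=20000, seed=42):
    """Propagate population-height uncertainty into flow depth by Monte
    Carlo, given the submerged fraction `rel_depth` of a person observed
    in a frame. Returns the mean and standard deviation of the depth."""
    rng = random.Random(seed)
    depths = [rel_depth * rng.gauss(height_mean, height_sd)
              for _ in range(n)]
    return statistics.mean(depths), statistics.stdev(depths)
```

For a linear relation such as this one, the Monte Carlo result reproduces the analytic propagation (depth sd = rel_depth x height sd), but the sampling approach extends directly to nonlinear stability criteria.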

  3. 28 CFR 42.601 - Purpose and application.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 42.601 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of... procedures for processing and resolving complaints of employment discrimination filed against recipients of...

  4. A critical evaluation of ecological indices for the comparative analysis of microbial communities based on molecular datasets.

    PubMed

    Lucas, Rico; Groeneveld, Jürgen; Harms, Hauke; Johst, Karin; Frank, Karin; Kleinsteuber, Sabine

    2017-01-01

    In times of global change and intensified resource exploitation, advanced knowledge of ecophysiological processes in natural and engineered systems driven by complex microbial communities is crucial for both safeguarding environmental processes and optimising rational control of biotechnological processes. To gain such knowledge, high-throughput molecular techniques are routinely employed to investigate microbial community composition and dynamics within a wide range of natural or engineered environments. However, for molecular dataset analyses no consensus about a generally applicable alpha diversity concept and no appropriate benchmarking of corresponding statistical indices exist yet. To overcome this, we listed criteria for the appropriateness of an index for such analyses and systematically scrutinised commonly employed ecological indices describing diversity, evenness and richness based on artificial and real molecular datasets. We identified appropriate indices warranting interstudy comparability and intuitive interpretability. The unified diversity concept based on 'effective numbers of types' provides the mathematical framework for describing community composition. Additionally, the Bray-Curtis dissimilarity as a beta-diversity index was found to reflect compositional changes. The employed statistical procedure is presented comprising commented R-scripts and example datasets for user-friendly trial application. © FEMS 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
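The 'effective numbers of types' (Hill numbers) and the Bray-Curtis dissimilarity highlighted in this record have compact definitions. The paper ships commented R-scripts, so the Python sketch below is a translation of the concepts, not of their code.

```python
import math

def hill_number(abundances, q):
    """Effective number of types of order q (the unified diversity
    concept): q=0 gives richness, q->1 gives exp(Shannon entropy),
    q=2 gives the inverse Simpson index."""
    total = sum(abundances)
    p = [a / total for a in abundances if a > 0]
    if q == 1:
        return math.exp(-sum(pi * math.log(pi) for pi in p))
    return sum(pi ** q for pi in p) ** (1 / (1 - q))

def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance profiles."""
    num = sum(abs(a - b) for a, b in zip(x, y))
    return num / (sum(x) + sum(y))
```

For a perfectly even community all Hill numbers equal the number of types; increasing unevenness lowers the higher-order numbers, which is what makes the family interpretable across studies.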

  5. Estimation of urban runoff and water quality using remote sensing and artificial intelligence.

    PubMed

    Ha, S R; Park, S Y; Park, D H

    2003-01-01

Water quality and quantity of runoff are strongly dependent on the landuse and landcover (LULC) criteria. In this study, we developed an improved parameter estimation procedure for the environmental model using remote sensing (RS) and artificial intelligence (AI) techniques. Landsat TM multi-band (7 bands) and Korea Multi-Purpose Satellite (KOMPSAT) panchromatic data were selected for input data processing. We employed two kinds of artificial intelligence techniques, RBF-NN (radial-basis-function neural network) and ANN (artificial neural network), to classify LULC of the study area. A bootstrap resampling method, a statistical technique, was employed to generate the confidence intervals and distribution of the unit load. SWMM was used to simulate the urban runoff and water quality and applied to the study watershed. Urban flow and non-point source contamination were simulated with rainfall-runoff and measured water quality data. The estimated total runoff, peak time, and pollutant generation varied considerably according to the classification accuracy and percentile unit load applied. The proposed procedure can be efficiently applied to water quality and runoff simulation in a rapidly changing urban area.
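The bootstrap step, resampling to put a confidence interval on the unit load, can be sketched as a percentile bootstrap; the statistic and parameters below are illustrative.

```python
import random
import statistics

def bootstrap_ci(sample, stat=statistics.mean, n_boot=5000,
                 alpha=0.05, seed=1):
    """Percentile bootstrap confidence interval for a statistic: resample
    with replacement, recompute the statistic, and read off the empirical
    alpha/2 and 1-alpha/2 quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(sample) for _ in sample])
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi
```

The same machinery yields the full bootstrap distribution of the unit load, not only its interval, simply by returning `reps`.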

  6. Temporal variation and scale in movement-based resource selection functions

    USGS Publications Warehouse

    Hooten, M.B.; Hanks, E.M.; Johnson, D.S.; Alldredge, M.W.

    2013-01-01

    A common population characteristic of interest in animal ecology studies pertains to the selection of resources. That is, given the resources available to animals, what do they ultimately choose to use? A variety of statistical approaches have been employed to examine this question and each has advantages and disadvantages with respect to the form of available data and the properties of estimators given model assumptions. A wealth of high resolution telemetry data are now being collected to study animal population movement and space use and these data present both challenges and opportunities for statistical inference. We summarize traditional methods for resource selection and then describe several extensions to deal with measurement uncertainty and an explicit movement process that exists in studies involving high-resolution telemetry data. Our approach uses a correlated random walk movement model to obtain temporally varying use and availability distributions that are employed in a weighted distribution context to estimate selection coefficients. The temporally varying coefficients are then weighted by their contribution to selection and combined to provide inference at the population level. The result is an intuitive and accessible statistical procedure that uses readily available software and is computationally feasible for large datasets. These methods are demonstrated using data collected as part of a large-scale mountain lion monitoring study in Colorado, USA.
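The correlated random walk at the core of the movement model can be sketched in a few lines of Python. The step-length and turning-angle distributions here (exponential steps, Gaussian turns) are illustrative assumptions, and the weighted-distribution selection step is omitted.

```python
import math
import random

def correlated_random_walk(n_steps, step_scale=1.0, turn_sd=0.4, seed=7):
    """Simulate a correlated random walk: each heading is the previous
    heading plus a small Gaussian turning angle, so movement is
    directionally persistent. Returns the list of (x, y) positions."""
    rng = random.Random(seed)
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        heading += rng.gauss(0.0, turn_sd)          # persistent direction
        step = rng.expovariate(1.0 / step_scale)    # exponential step length
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        path.append((x, y))
    return path
```

In the resource-selection setting, positions simulated from such a model define the temporally varying availability distribution against which observed use is weighted.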

  7. Reconsidering barriers to wind power projects: community engagement, developer transparency and place

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Firestone, Jeremy; Hoen, Ben; Rand, Joseph

In 2016, we undertook a nationally representative wind power perceptions survey of individuals living within 8 km of over 600 projects in the United States, generating 1705 telephone, web, and mail responses. We sought information on a variety of topics, including procedural fairness and its relationship to project attitude, the foci of the present analysis. Here, we present a series of descriptive statistics and regression results, emphasizing those residents who were aware of their local project prior to construction. Sample weighting is employed to account for stratification and non-response. We find that a developer being open and transparent, a community being able to influence the outcome, and having a say in the planning process are all statistically significant predictors of a process perceived as being ‘fair,’ with an open and transparent developer having the largest effect. We also find developer transparency and ability to influence outcomes to have statistically significant relationships to a more positive attitude, with those findings holding when aesthetics, landscape, and wind turbine sound considerations are controlled for. The results indicate that jurisdictions might consider developing procedures that ensure citizens are consulted and heard, and benchmarks or best practices for developer interaction with communities and citizens.

  8. Reconsidering barriers to wind power projects: community engagement, developer transparency and place

    DOE PAGES

    Firestone, Jeremy; Hoen, Ben; Rand, Joseph; ...

    2017-12-21

In 2016, we undertook a nationally representative wind power perceptions survey of individuals living within 8 km of over 600 projects in the United States, generating 1705 telephone, web, and mail responses. We sought information on a variety of topics, including procedural fairness and its relationship to project attitude, the foci of the present analysis. Here, we present a series of descriptive statistics and regression results, emphasizing those residents who were aware of their local project prior to construction. Sample weighting is employed to account for stratification and non-response. We find that a developer being open and transparent, a community being able to influence the outcome, and having a say in the planning process are all statistically significant predictors of a process perceived as being ‘fair,’ with an open and transparent developer having the largest effect. We also find developer transparency and ability to influence outcomes to have statistically significant relationships to a more positive attitude, with those findings holding when aesthetics, landscape, and wind turbine sound considerations are controlled for. The results indicate that jurisdictions might consider developing procedures that ensure citizens are consulted and heard, and benchmarks or best practices for developer interaction with communities and citizens.

  9. Modelling of electronic excitation and radiation in the Direct Simulation Monte Carlo Macroscopic Chemistry Method

    NASA Astrophysics Data System (ADS)

    Goldsworthy, M. J.

    2012-10-01

    One of the most useful tools for modelling rarefied hypersonic flows is the Direct Simulation Monte Carlo (DSMC) method. Simulator particle movement and collision calculations are combined with statistical procedures to model thermal non-equilibrium flow-fields described by the Boltzmann equation. The Macroscopic Chemistry Method for DSMC simulations was developed to simplify the inclusion of complex thermal non-equilibrium chemistry. The macroscopic approach uses statistical information which is calculated during the DSMC solution process in the modelling procedures. Here it is shown how inclusion of macroscopic information in models of chemical kinetics, electronic excitation, ionization, and radiation can enhance the capabilities of DSMC to model flow-fields where a range of physical processes occur. The approach is applied to the modelling of a 6.4 km/s nitrogen shock wave and results are compared with those from existing shock-tube experiments and continuum calculations. Reasonable agreement between the methods is obtained. The quality of the comparison is highly dependent on the set of vibrational relaxation and chemical kinetic parameters employed.

  10. Blood pressure and heart rate response to posteriorly directed pressure applied to the cervical spine in young, pain-free individuals: a randomized, repeated-measures, double-blind, placebo-controlled study.

    PubMed

    Yung, Emmanuel; Wong, Michael; Williams, Haddie; Mache, Kyle

    2014-08-01

Randomized clinical trial. Objectives: To compare the blood pressure (BP) and heart rate (HR) response of healthy volunteers to posteriorly directed (anterior-to-posterior [AP]) pressure applied to the cervical spine versus placebo. Manual therapists employ cervical spine AP mobilizations for various cervical-shoulder pain conditions. However, there is a paucity of literature describing the procedure, cardiovascular response, and safety profile. Thirty-nine (25 female) healthy participants (mean ± SD age, 24.7 ± 1.9 years) were randomly assigned to 1 of 2 groups. Group 1 received a placebo, consisting of light touch applied to the right C6 costal process. Group 2 received AP pressure at the same location. Blood pressure and HR were measured prior to, during, and after the application of AP pressure. One-way analysis of variance and paired-difference statistics were used for data analysis. There was no statistically significant difference between groups for mean systolic BP, mean diastolic BP, and mean HR (P >.05) for all time points. Within-group comparisons indicated statistically significant differences between baseline and post-AP pressure HR (-2.8 bpm; 95% confidence interval: -4.6, -1.1) and between baseline and post-AP pressure systolic BP (-2.4 mmHg; 95% confidence interval: -3.7, -1.0) in the AP group, and between baseline and postplacebo systolic BP (-2.6 mmHg; 95% confidence interval: -4.2, -1.0) in the placebo group. No participants reported any adverse reactions or side effects within 24 hours of testing. AP pressure caused a statistically significant physiologic response that resulted in a minor drop in HR (without causing asystole or vasodepression) after the procedure, whereas this cardiovascular change did not occur for those in the placebo group. Within both groups, there was a small but statistically significant reduction in systolic BP following the procedure.
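The within-group comparisons reported above are mean paired differences with confidence intervals. A minimal Python sketch of that computation, using the normal critical value 1.96 as an approximation to the t quantile and hypothetical data:

```python
import math
import statistics

def paired_diff_ci(before, after, z=1.96):
    """Mean paired difference (after minus before) with an approximate
    95% confidence interval; the normal critical value stands in for the
    t quantile, adequate only for moderate-to-large samples."""
    d = [a - b for b, a in zip(before, after)]
    m = statistics.mean(d)
    se = statistics.stdev(d) / math.sqrt(len(d))
    return m, (m - z * se, m + z * se)
```

An interval that excludes zero corresponds to a statistically significant within-group change, as with the HR and systolic BP drops reported in the record.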

  11. 29 CFR 1601.26 - Confidentiality of endeavors.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Confidentiality of endeavors. 1601.26 Section 1601.26 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful Employment...

  12. 29 CFR 1601.26 - Confidentiality of endeavors.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Confidentiality of endeavors. 1601.26 Section 1601.26 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful Employment...

  13. 41 CFR 301-75.3 - What governing policies and procedures must we establish related to pre-employment interview travel?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... and procedures must we establish related to pre-employment interview travel? 301-75.3 Section 301-75.3... ALLOWANCES AGENCY RESPONSIBILITIES 75-PRE-EMPLOYMENT INTERVIEW TRAVEL General Rules § 301-75.3 What governing policies and procedures must we establish related to pre-employment interview travel? You must establish...

  14. Detection of non-Gaussian fluctuations in a quantum point contact.

    PubMed

    Gershon, G; Bomze, Yu; Sukhorukov, E V; Reznikov, M

    2008-07-04

    An experimental study of current fluctuations through a tunable transmission barrier, a quantum point contact, is reported. We measure the probability distribution function of transmitted charge with precision sufficient to extract the first three cumulants. To obtain the intrinsic quantities, corresponding to voltage-biased barrier, we employ a procedure that accounts for the response of the external circuit and the amplifier. The third cumulant, obtained with a high precision, is found to agree with the prediction for the statistics of transport in the non-Poissonian regime.
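For a sample, the first three cumulants reduce to the mean, the variance, and the third central moment; for a Poisson process all cumulants equal the mean, so the behavior of the third cumulant distinguishes the non-Poissonian regime. A minimal Python sketch of the sample estimates, not the authors' circuit-corrected procedure:

```python
def first_three_cumulants(samples):
    """First three sample cumulants: kappa1 = mean, kappa2 = variance,
    kappa3 = third central moment (biased, population-style estimates)."""
    n = len(samples)
    mean = sum(samples) / n
    k2 = sum((s - mean) ** 2 for s in samples) / n
    k3 = sum((s - mean) ** 3 for s in samples) / n
    return mean, k2, k3
```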

  15. Detection of Non-Gaussian Fluctuations in a Quantum Point Contact

    NASA Astrophysics Data System (ADS)

    Gershon, G.; Bomze, Yu.; Sukhorukov, E. V.; Reznikov, M.

    2008-07-01

    An experimental study of current fluctuations through a tunable transmission barrier, a quantum point contact, is reported. We measure the probability distribution function of transmitted charge with precision sufficient to extract the first three cumulants. To obtain the intrinsic quantities, corresponding to voltage-biased barrier, we employ a procedure that accounts for the response of the external circuit and the amplifier. The third cumulant, obtained with a high precision, is found to agree with the prediction for the statistics of transport in the non-Poissonian regime.

  16. 29 CFR 1601.24 - Conciliation: Procedure and authority.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Conciliation: Procedure and authority. 1601.24 Section 1601.24 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful...

  17. 29 CFR 1601.24 - Conciliation: Procedure and authority.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Conciliation: Procedure and authority. 1601.24 Section 1601.24 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful...

  18. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE.

    PubMed

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.

  19. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    PubMed Central

    Quaglio, Pietro; Yegenoglu, Alper; Torre, Emiliano; Endres, Dominik M.; Grün, Sonja

    2017-01-01

    Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis. PMID:28596729

  20. Detection of crossover time scales in multifractal detrended fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Ge, Erjia; Leung, Yee

    2013-04-01

    Fractal analysis is employed in this paper as a scale-based method for identifying the scaling behavior of time series. Many spatial and temporal processes exhibiting complex multi(mono)-scaling behaviors are fractals. One of the important concepts in fractals is the crossover time scale(s) that separates distinct regimes having different fractal scaling behaviors. A common method for characterizing such behavior is multifractal detrended fluctuation analysis (MF-DFA). The detection of crossover time scale(s) is, however, relatively subjective, since it has typically been made without rigorous statistical procedures, generally by eyeballing or subjective observation. Crossover time scales so determined may be spurious and problematic, and may not reflect the genuine underlying scaling behavior of a time series. The purpose of this paper is to propose a statistical procedure to model complex fractal scaling behaviors and reliably identify the crossover time scales under MF-DFA. The scaling-identification regression model, grounded on a solid statistical foundation, is first proposed to describe the multi-scaling behaviors of fractals. Through regression analysis and statistical inference, we can (1) identify crossover time scales that cannot be detected by eyeballing, (2) determine the number and locations of the genuine crossover time scales, (3) give confidence intervals for the crossover time scales, and (4) establish a statistically significant regression model depicting the underlying scaling behavior of a time series. To substantiate our argument, the regression model is applied to analyze the multi-scaling behaviors of avian-influenza outbreaks, water consumption, daily mean temperature, and rainfall in Hong Kong. Through the proposed model, we can gain a deeper understanding of fractals in general and a statistical approach to identifying multi-scaling behavior under MF-DFA in particular.
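    The kind of scaling-identification regression this record describes can be sketched as a continuous piecewise-linear fit in log-log coordinates, where the fitted breakpoint estimates the crossover scale. The code below is a minimal illustration with assumed slopes and noise levels, not the authors' model.

```python
import numpy as np

# Hypothetical sketch: locate a single crossover scale in a log-log
# fluctuation function F(s) by fitting a continuous two-segment linear
# regression and choosing the breakpoint that minimizes the residual
# sum of squares. All variable names and values are illustrative.
rng = np.random.default_rng(0)
log_s = np.linspace(1.0, 3.0, 60)             # log10 of time scales s
true_break = 2.0                               # true crossover (log10 scale)
# Slope 0.5 below the crossover, 0.8 above it.
log_F = 0.5 * log_s + 0.3 * np.maximum(log_s - true_break, 0.0)
log_F += rng.normal(0.0, 0.01, log_s.size)     # measurement noise

def sse_for_break(b):
    # Design matrix for a continuous piecewise-linear fit with hinge at b.
    X = np.column_stack([np.ones_like(log_s), log_s,
                         np.maximum(log_s - b, 0.0)])
    beta, *_ = np.linalg.lstsq(X, log_F, rcond=None)
    resid = log_F - X @ beta
    return float(resid @ resid)

candidates = log_s[5:-5]                       # keep a few points per segment
est_break = min(candidates, key=sse_for_break)
```

    Standard regression inference on the fitted hinge then yields the confidence intervals and significance tests the paper argues for.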

  1. Canadian Health Measures Survey pre-test: design, methods, results.

    PubMed

    Tremblay, Mark; Langlois, Renée; Bryan, Shirley; Esliger, Dale; Patterson, Julienne

    2007-01-01

    The Canadian Health Measures Survey (CHMS) pre-test was conducted to provide information about the challenges and costs associated with administering a physical health measures survey in Canada. To achieve the specific objectives of the pre-test, protocols were developed and tested, and methods for household interviewing and clinic testing were designed and revised. The cost, logistics and suitability of using fixed sites for the CHMS were assessed. Although data collection, transfer and storage procedures are complex, the pre-test experience confirmed Statistics Canada's ability to conduct a direct health measures survey and the willingness of Canadians to participate in such a health survey. Many operational and logistical procedures worked well and, with minor modifications, are being employed in the main survey. Fixed sites were problematic, and survey costs were higher than expected.

  2. Learning algorithm in restricted Boltzmann machines using Kullback-Leibler importance estimation procedure

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Sakurai, Tetsuharu; Tanaka, Kazuyuki

    Restricted Boltzmann machines (RBMs) are bipartite structured statistical neural networks consisting of two layers: a layer of visible units and a layer of hidden units. Within each layer, units are not connected to each other. RBMs have high flexibility and rich structure and are expected to apply to a variety of tasks, for example, image and pattern recognition, face detection, and so on. However, most computations in RBMs are intractable and often belong to the class of NP-hard problems. In this paper, in order to construct a practical learning algorithm for them, we apply the Kullback-Leibler Importance Estimation Procedure (KLIEP) to RBMs, and give a new scheme of a practical approximate learning algorithm for RBMs based on the KLIEP.
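    KLIEP itself is a density-ratio estimation procedure. The sketch below is a generic one-dimensional illustration of that core idea, not the authors' RBM learning scheme: it models the ratio of two densities as a non-negative mixture of Gaussian kernels, maximizing the mean log-ratio on "numerator" samples under the constraint that the ratio averages to one on "denominator" samples. All settings here are assumptions.

```python
import numpy as np

# Minimal KLIEP-style density-ratio estimation (illustrative sketch).
rng = np.random.default_rng(1)
x_nu = rng.normal(1.0, 1.0, 200)    # samples from the numerator density p_nu
x_de = rng.normal(0.0, 1.0, 200)    # samples from the denominator density p_de
centers, width = x_nu[:20], 1.0     # kernel centers drawn from numerator data

def phi(x):
    # Gaussian kernel features evaluated at the chosen centers.
    return np.exp(-0.5 * ((np.asarray(x)[:, None] - centers[None, :]) / width) ** 2)

A = phi(x_nu)                        # features of numerator samples
b = phi(x_de).mean(axis=0)           # constraint: mean ratio on denominator = 1
alpha = np.ones(centers.size)
for _ in range(2000):                # projected gradient ascent on mean log-ratio
    grad = (A / (A @ alpha)[:, None]).mean(axis=0)
    alpha = np.maximum(alpha + 0.01 * grad, 0.0)   # keep weights non-negative
    alpha /= b @ alpha               # re-impose the normalization constraint

ratio = phi([-1.5, 1.5]) @ alpha     # estimated p_nu/p_de at two test points
```

    Since p_nu is shifted to the right of p_de, the estimated ratio should be larger at 1.5 than at -1.5.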

  3. 29 CFR 1626.20 - Procedure for requesting an opinion letter.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 1626.20 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.20 Procedure for requesting an opinion letter. (a) A request for an opinion letter should be submitted in writing to the Chairman, Equal Employment Opportunity...

  4. 29 CFR 1626.20 - Procedure for requesting an opinion letter.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 1626.20 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.20 Procedure for requesting an opinion letter. (a) A request for an opinion letter should be submitted in writing to the Chairman, Equal Employment Opportunity...

  5. 29 CFR 1626.20 - Procedure for requesting an opinion letter.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 1626.20 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.20 Procedure for requesting an opinion letter. (a) A request for an opinion letter should be submitted in writing to the Chairman, Equal Employment Opportunity...

  6. 29 CFR 1626.20 - Procedure for requesting an opinion letter.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 1626.20 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.20 Procedure for requesting an opinion letter. (a) A request for an opinion letter should be submitted in writing to the Chairman, Equal Employment Opportunity...

  7. 20 CFR 636.4 - Grievance procedures at the employer level.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Grievance procedures at the employer level. 636.4 Section 636.4 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR... benefits. Where local law, personnel rules, or other applicable requirements specify procedures (including...

  8. Mediation Analysis with Survival Outcomes: Accelerated Failure Time vs. Proportional Hazards Models

    PubMed Central

    Gelfand, Lois A.; MacKinnon, David P.; DeRubeis, Robert J.; Baraldi, Amanda N.

    2016-01-01

    Objective: Survival time is an important type of outcome variable in treatment research. Currently, limited guidance is available regarding performing mediation analyses with survival outcomes, which generally do not have normally distributed errors, and contain unobserved (censored) events. We present considerations for choosing an approach, using a comparison of semi-parametric proportional hazards (PH) and fully parametric accelerated failure time (AFT) approaches for illustration. Method: We compare PH and AFT models and procedures in their integration into mediation models and review their ability to produce coefficients that estimate causal effects. Using simulation studies modeling Weibull-distributed survival times, we compare statistical properties of mediation analyses incorporating PH and AFT approaches (employing SAS procedures PHREG and LIFEREG, respectively) under varied data conditions, some including censoring. A simulated data set illustrates the findings. Results: AFT models integrate more easily than PH models into mediation models. Furthermore, mediation analyses incorporating LIFEREG produce coefficients that can estimate causal effects, and demonstrate superior statistical properties. Censoring introduces bias in the coefficient estimate representing the treatment effect on outcome—underestimation in LIFEREG, and overestimation in PHREG. With LIFEREG, this bias can be addressed using an alternative estimate obtained from combining other coefficients, whereas this is not possible with PHREG. Conclusions: When Weibull assumptions are not violated, there are compelling advantages to using LIFEREG over PHREG for mediation analyses involving survival-time outcomes. Irrespective of the procedures used, the interpretation of coefficients, effects of censoring on coefficient estimates, and statistical properties should be taken into account when reporting results. PMID:27065906
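    Why AFT models mesh naturally with product-of-coefficients mediation can be seen in a small simulation: under a Weibull AFT model with no censoring, log survival time is linear in treatment and mediator, so the X -> M -> T indirect effect on the log-time scale is the product of the mediator-model slope and the mediator's AFT coefficient. The sketch below uses assumed coefficients and plain least squares for illustration; it is not SAS LIFEREG.

```python
import numpy as np

# Simulated Weibull AFT mediation, no censoring (illustrative assumptions).
rng = np.random.default_rng(4)
n = 20000
X = rng.integers(0, 2, n).astype(float)        # treatment indicator
a, bX, bM, sigma = 0.8, 0.3, 0.5, 0.5          # assumed true coefficients
M = a * X + rng.normal(0.0, 1.0, n)            # mediator model: M = a*X + e
eps = np.log(rng.exponential(1.0, n))          # extreme-value error -> Weibull T
logT = 1.0 + bX * X + bM * M + sigma * eps     # AFT model on the log-time scale

def ols_slopes(y, cols):
    # Ordinary least squares; returns slope estimates (intercept dropped).
    Z = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(Z, y, rcond=None)[0][1:]

a_hat = ols_slopes(M, [X])[0]
bX_hat, bM_hat = ols_slopes(logT, [X, M])
indirect = a_hat * bM_hat                      # product-of-coefficients estimate
```

    With these assumed values the indirect effect should recover a * bM = 0.40; censoring, as the abstract notes, would bias the treatment coefficient and requires the corrections discussed there.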

  9. [A procedure for estimating the rate of occupational accidents in non-European-Union workers with irregular immigrant status].

    PubMed

    Marchiori, L; Marangi, G; Mazzoccoli, P; Scoizzato, L; Buja, Alessandra; Mastrangelo, G

    2008-01-01

    Statistics on occupational accidents provided by the Italian Institute for Occupational Diseases and Accidents (INAIL, Italian acronym) include only events that occurred in workers with regular employment status. The aim of the study was to establish a procedure in order to estimate the rate of occupational accidents in non-European-Union (non-EU) workers with irregular employment status and/or irregular immigrant status. The sources of data were the clinical records of the Emergency Department of San Bonifacio Hospital, and the population data of District 4 of Local Health Authority 20 of Verona, which was considered the catchment area of this hospital. Of the 419 cases of accidents recorded, 146 occurred in irregular non-EU workers and constituted the numerator of the rate. The denominator of the rate was estimated by calculating: (1) the subjects of working age resident in District 4 (= 83714); (2) the total number of non-EU workers, assuming that the percentage was similar to that in San Bonifacio Municipality (= 0.115); (3) the number of irregular non-EU workers, assuming that the percentage was similar to that in north-eastern Italy (= 0.103). Non-EU workers with irregular employment status and/or irregular immigrant status should, according to these calculations, be 992 (= 83714 x 0.115 x 0.103). The rate of 147.2 (= 146/992) occupational accidents per 1000 irregular non-EU workers is more than twice as high as that calculated in 2004 in Italy in regular non-EU workers (approximately 65 accidents per 1000). The difference can be explained by the fact that irregular workers find employment mainly in agriculture, building and the metallurgic industry, which have a high frequency of accidents, and are more willing to accept risky work and longer work shifts.
On the assumption that the rate of occupational accidents in the 500,000 irregular workers living in Italy in 2004 was 147.2 per 1000 (as in the catchment area of the San Bonifacio Hospital), the number of accidents would be 73,600, against the 116,000 that occurred among regular non-EU workers in 2004 according to INAIL. Official INAIL statistics on occupational accidents therefore show a considerable underestimation.
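    The denominator and rate reported above follow from simple arithmetic, reproduced here as a check (all values are taken directly from the abstract):

```python
# Reproducing the abstract's denominator and rate estimates.
working_age = 83714            # working-age residents of District 4
frac_non_eu = 0.115            # assumed share of non-EU workers (San Bonifacio)
frac_irregular = 0.103         # assumed share with irregular status (NE Italy)
irregular_workers = working_age * frac_non_eu * frac_irregular   # about 992
rate_per_1000 = 146 / irregular_workers * 1000                   # about 147.2
```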

  10. Effect of the image resolution on the statistical descriptors of heterogeneous media.

    PubMed

    Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime

    2018-02-01

    The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction, due to the progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function deviates from the reference function. Moreover, the equally weighted sum of the average of the squared difference, between the discrete correlation functions of the decimated images and the reference functions, leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. 
At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost, when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintaining the statistical quality of the digitized sample.
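    The two-point correlation function used as a statistical descriptor above can be sketched in a few lines. The example below is an illustrative toy on an uncorrelated synthetic two-phase medium, not the authors' code: it computes S2 along one axis with periodic boundaries, before and after a simple keep-every-other-pixel decimation.

```python
import numpy as np

# Two-point probability function S2(r) of a binary image (illustrative sketch).
rng = np.random.default_rng(2)
img = (rng.random((128, 128)) < 0.4).astype(float)   # synthetic two-phase medium

def s2(image, max_r):
    # Probability that two pixels separated by r along the rows are both
    # phase 1; np.roll gives periodic (wrap-around) boundaries.
    return np.array([(image * np.roll(image, -r, axis=1)).mean()
                     for r in range(max_r)])

s2_full = s2(img, 10)
s2_decimated = s2(img[::2, ::2], 10)
# For an uncorrelated medium, S2(0) equals the volume fraction phi and
# S2(r > 0) is approximately phi squared.
```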

  11. Effect of the image resolution on the statistical descriptors of heterogeneous media

    NASA Astrophysics Data System (ADS)

    Ledesma-Alonso, René; Barbosa, Romeli; Ortegón, Jaime

    2018-02-01

    The characterization and reconstruction of heterogeneous materials, such as porous media and electrode materials, involve the application of image processing methods to data acquired by scanning electron microscopy or other microscopy techniques. Among them, binarization and decimation are critical in order to compute the correlation functions that characterize the microstructure of the above-mentioned materials. In this study, we present a theoretical analysis of the effects of the image-size reduction, due to the progressive and sequential decimation of the original image. Three different decimation procedures (random, bilinear, and bicubic) were implemented and their consequences on the discrete correlation functions (two-point, line-path, and pore-size distribution) and the coarseness (derived from the local volume fraction) are reported and analyzed. The chosen statistical descriptors (correlation functions and coarseness) are typically employed to characterize and reconstruct heterogeneous materials. A normalization for each of the correlation functions has been performed. When the loss of statistical information has not been significant for a decimated image, its normalized correlation function is forecast by the trend of the original image (reference function). In contrast, when the decimated image does not hold statistical evidence of the original one, the normalized correlation function deviates from the reference function. Moreover, the equally weighted sum of the average of the squared difference, between the discrete correlation functions of the decimated images and the reference functions, leads to a definition of an overall error. During the first stages of the gradual decimation, the error remains relatively small and independent of the decimation procedure. Above a threshold defined by the correlation length of the reference function, the error becomes a function of the number of decimation steps. 
At this stage, some statistical information is lost and the error becomes dependent on the decimation procedure. These results may help us to restrict the amount of information that one can afford to lose during a decimation process, in order to reduce the computational and memory cost, when one aims to diminish the time consumed by a characterization or reconstruction technique, yet maintaining the statistical quality of the digitized sample.

  12. 78 FR 43002 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-07-18

    ... comments concerning statistical sampling in Sec. 274 Context. DATES: Written comments should be received on... INFORMATION: Title: Statistical Sampling in Sec. 274 Context. OMB Number: 1545-1847. Revenue Procedure Number: Revenue Procedure 2004-29. Abstract: Revenue Procedure 2004-29 prescribes the statistical sampling...

  13. Internal quality control: planning and implementation strategies.

    PubMed

    Westgard, James O

    2003-11-01

    The first essential in setting up internal quality control (IQC) of a test procedure in the clinical laboratory is to select the proper IQC procedure to implement, i.e. choosing the statistical criteria or control rules, and the number of control measurements, according to the quality required for the test and the observed performance of the method. Then the right IQC procedure must be properly implemented. This review focuses on strategies for planning and implementing IQC procedures in order to improve the quality of the IQC. A quantitative planning process is described that can be implemented with graphical tools such as power function or critical-error graphs and charts of operating specifications. Finally, a total QC strategy is formulated to minimize cost and maximize quality. A general strategy for IQC implementation is recommended that employs a three-stage design in which the first stage provides high error detection, the second stage low false rejection and the third stage prescribes the length of the analytical run, making use of an algorithm involving the average of normal patients' data.
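    As a concrete illustration of the statistical control rules such a planning process selects among, the sketch below implements two classic Westgard-style criteria (the 1-3s and 2-2s rules) against an assumed target mean and standard deviation; in practice these parameters come from method validation.

```python
import numpy as np

# Two common internal quality control rules (illustrative sketch; the
# target mean and SD here are assumed, not from any real method).
target_mean, target_sd = 100.0, 2.0

def violates_1_3s(values):
    # Reject the run if any single control exceeds the mean by 3 SD.
    z = (np.asarray(values) - target_mean) / target_sd
    return bool(np.any(np.abs(z) > 3))

def violates_2_2s(values):
    # Reject if two consecutive controls exceed 2 SD on the same side.
    z = (np.asarray(values) - target_mean) / target_sd
    return bool(np.any((z[:-1] > 2) & (z[1:] > 2)) or
                np.any((z[:-1] < -2) & (z[1:] < -2)))

run = [99.1, 100.8, 104.5, 104.9, 100.2]   # two consecutive points above +2 SD
```

    This run passes the 1-3s rule but fails 2-2s, illustrating how multi-rule designs trade error detection against false rejection.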

  14. Hospital volume and mortality for 25 types of inpatient treatment in German hospitals: observational study using complete national data from 2009 to 2014.

    PubMed

    Nimptsch, Ulrike; Mansky, Thomas

    2017-09-06

    To explore the existence and strength of a relationship between hospital volume and mortality, to estimate minimum volume thresholds and to assess the potential benefit of centralisation of services. Observational population-based study using complete German hospital discharge data (Diagnosis-Related Group Statistics (DRG Statistics)). All acute care hospitals in Germany. All adult patients hospitalised for 1 out of 25 common or medically important types of inpatient treatment from 2009 to 2014. Risk-adjusted inhospital mortality. Lower inhospital mortality in association with higher hospital volume was observed in 20 out of the 25 studied types of treatment when volume was categorised in quintiles and persisted in 17 types of treatment when volume was analysed as a continuous variable. Such a relationship was found in some of the studied emergency conditions and low-risk procedures. It was more consistently present regarding complex surgical procedures. For example, about 22 000 patients receiving open repair of abdominal aortic aneurysm were analysed. In very high-volume hospitals, risk-adjusted mortality was 4.7% (95% CI 4.1 to 5.4) compared with 7.8% (7.1 to 8.7) in very low volume hospitals. The minimum volume above which risk of death would fall below the average mortality was estimated as 18 cases per year. If all hospitals providing this service would perform at least 18 cases per year, one death among 104 (76 to 166) patients could potentially be prevented. Based on complete national hospital discharge data, the results confirmed volume-outcome relationships for many complex surgical procedures, as well as for some emergency conditions and low-risk procedures. Following these findings, the study identified areas where centralisation would provide a benefit for patients undergoing the specific type of treatment in German hospitals and quantified the possible impact of centralisation efforts. 
© Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  15. Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems

    NASA Astrophysics Data System (ADS)

    Reddy, Lauren M.

    In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted on stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholder groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. 
To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.

  16. The influence of common method bias on the relationship of the socio-ecological model in predicting physical activity behavior.

    PubMed

    Wingate, Savanna; Sng, Eveleen; Loprinzi, Paul D

    2018-01-01

    Background: The purpose of this study was to evaluate the extent, if any, that the association between socio-ecological parameters and physical activity may be influenced by common method bias (CMB). Methods: This study took place between February and May of 2017 at a Southeastern University in the United States. A randomized controlled experiment was employed among 119 young adults. Participants were randomized into either group 1 (the group we attempted to minimize CMB) or group 2 (control group). In group 1, CMB was minimized via various procedural remedies, such as separating the measurement of predictor and criterion variables by introducing a time lag (temporal; 2 visits several days apart), creating a cover story (psychological), and approximating measures to have data collected in different media (computer-based vs. paper and pencil) and different locations to control method variance when collecting self-report measures from the same source. Socio-ecological parameters (self-efficacy; friend support; family support) and physical activity were self-reported. Results: Exercise self-efficacy was significantly associated with physical activity. This association (β = 0.74, 95% CI: 0.33-1.1; P = 0.001) was only observed in group 2 (control), but not in group 1 (experimental group) (β = 0.03; 95% CI: -0.57-0.63; P = 0.91). The difference in these coefficients (i.e., β = 0.74 vs. β = 0.03) was statistically significant (P = 0.04). Conclusion: Future research in this field, when feasible, may wish to consider employing procedural and statistical remedies to minimize CMB.
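    The comparison of the two regression coefficients can be approximated from the published estimates alone. The sketch below is an assumption for illustration, not the authors' analysis: it recovers each standard error from its 95% CI and applies a large-sample z-test for the difference. This approximation lands near, though not exactly at, the reported P = 0.04, since the article's exact test may differ.

```python
import math

# Back-of-envelope z-test for the difference of two reported coefficients.
b1, lo1, hi1 = 0.74, 0.33, 1.10    # control group estimate and 95% CI
b2, lo2, hi2 = 0.03, -0.57, 0.63   # CMB-minimized group estimate and 95% CI
se1 = (hi1 - lo1) / (2 * 1.96)     # SE recovered from CI width
se2 = (hi2 - lo2) / (2 * 1.96)
z = (b1 - b2) / math.sqrt(se1 ** 2 + se2 ** 2)
# Two-sided p-value from the standard normal CDF (via the error function).
p_two_sided = 2 * (1 - 0.5 * (1 + math.erf(z / math.sqrt(2))))
```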

  17. The influence of common method bias on the relationship of the socio-ecological model in predicting physical activity behavior

    PubMed Central

    Wingate, Savanna; Sng, Eveleen; Loprinzi, Paul D.

    2018-01-01

    Background: The purpose of this study was to evaluate the extent, if any, that the association between socio-ecological parameters and physical activity may be influenced by common method bias (CMB). Methods: This study took place between February and May of 2017 at a Southeastern University in the United States. A randomized controlled experiment was employed among 119 young adults. Participants were randomized into either group 1 (the group we attempted to minimize CMB) or group 2 (control group). In group 1, CMB was minimized via various procedural remedies, such as separating the measurement of predictor and criterion variables by introducing a time lag (temporal; 2 visits several days apart), creating a cover story (psychological), and approximating measures to have data collected in different media (computer-based vs. paper and pencil) and different locations to control method variance when collecting self-report measures from the same source. Socio-ecological parameters (self-efficacy; friend support; family support) and physical activity were self-reported. Results: Exercise self-efficacy was significantly associated with physical activity. This association (β = 0.74, 95% CI: 0.33-1.1; P = 0.001) was only observed in group 2 (control), but not in group 1 (experimental group) (β = 0.03; 95% CI: -0.57-0.63; P = 0.91). The difference in these coefficients (i.e., β = 0.74 vs. β = 0.03) was statistically significant (P = 0.04). Conclusion: Future research in this field, when feasible, may wish to consider employing procedural and statistical remedies to minimize CMB. PMID:29423361

  18. 28 CFR 42.605 - Agency processing of complaints of employment discrimination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... employment discrimination. 42.605 Section 42.605 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal Financial Assistance § 42.605 Agency processing of complaints...

  19. Legal Consequences of Employer Discharge Procedures

    ERIC Educational Resources Information Center

    Joseph, Stephanie

    2008-01-01

    The employment contract is sometimes misunderstood by both employees and employers. Drafters of employee manuals, policies, and procedures should be aware that the nature of the at-will employment relationship can be transformed into a binding employment contract by the words and phrases chosen. In this article, the author uses the case of Eric,…

  20. Protocol for monitoring metals in Ozark National Scenic Riverways, Missouri: Version 1.0

    USGS Publications Warehouse

    Schmitt, Christopher J.; Brumbaugh, William G.; Besser, John M.; Hinck, Jo Ellen; Bowles, David E.; Morrison, Lloyd W.; Williams, Michael H.

    2008-01-01

    The National Park Service is developing a monitoring plan for the Ozark National Scenic Riverways in southeastern Missouri. Because of concerns about the release of lead, zinc, and other metals from lead-zinc mining to streams, the monitoring plan will include mining-related metals. After considering a variety of alternatives, the plan will consist of measuring the concentrations of cadmium, cobalt, lead, nickel, and zinc in composite samples of crayfish (Orconectes luteus or alternate species) and Asian clam (Corbicula fluminea) collected periodically from selected sites. This document, which comprises a protocol narrative and supporting standard operating procedures, describes the methods to be employed prior to, during, and after collection of the organisms, along with procedures for their chemical analysis and quality assurance; statistical analysis, interpretation, and reporting of the data; and for modifying the protocol narrative and supporting standard operating procedures. A list of supplies and equipment, data forms, and sample labels are also included. An example based on data from a pilot study is presented.

  1. Investigation of radiative interaction in laminar flows using Monte Carlo simulation

    NASA Technical Reports Server (NTRS)

    Liu, Jiwen; Tiwari, S. N.

    1993-01-01

    The Monte Carlo method (MCM) is employed to study the radiative interactions in fully developed laminar flow between two parallel plates. Taking advantage of the MCM's ease of mathematical treatment, a general numerical procedure is developed for nongray radiative interaction. The nongray model is based on the statistical narrow band model with an exponential-tailed inverse intensity distribution. To validate the Monte Carlo simulation for nongray radiation problems, the results of radiative dissipation from the MCM are compared with two available solutions for a given temperature profile between two plates. After this validation, the MCM is employed to solve the present physical problem and results for the bulk temperature are compared with available solutions. In general, good agreement is noted and reasons for some discrepancies in certain ranges of parameters are explained.
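    The validation step described above, comparing Monte Carlo estimates against known solutions, can be illustrated with a geometric toy problem. The sketch below is an assumed example, not the paper's nongray model: it traces cosine-distributed diffuse rays to estimate the radiative view factor between two parallel unit-square plates. Moving the plates closer should raise the view factor toward one.

```python
import numpy as np

# Monte Carlo view-factor estimate between parallel unit squares (toy sketch).
rng = np.random.default_rng(5)

def view_factor(h, n_rays=200_000):
    # Uniform emission points on the lower plate (z = 0).
    x0 = rng.random(n_rays); y0 = rng.random(n_rays)
    # Cosine-weighted (diffuse) direction sampling.
    u1 = rng.random(n_rays); phi = 2 * np.pi * rng.random(n_rays)
    sin_t = np.sqrt(u1); cos_t = np.sqrt(1.0 - u1)
    t = h / cos_t                               # path length to the plane z = h
    x1 = x0 + t * sin_t * np.cos(phi)
    y1 = y0 + t * sin_t * np.sin(phi)
    hits = (0 <= x1) & (x1 <= 1) & (0 <= y1) & (y1 <= 1)
    return hits.mean()                          # fraction of rays intercepted

f_near, f_far = view_factor(0.1), view_factor(1.0)
```

    The same machinery, with absorption sampled from a narrow-band spectral model, underlies nongray exchange calculations of the kind the abstract describes.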

  2. 49 CFR 218.97 - Good faith challenge procedures.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... the railroad's operating rules implementing the requirements of this subpart. (b) General procedures... requirements of this subpart. (1) Each railroad or employer shall adopt and implement written procedures which... fulfill the requirements of this subpart. Each railroad or employer's written procedures shall provide for...

  3. 49 CFR 218.97 - Good faith challenge procedures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... the railroad's operating rules implementing the requirements of this subpart. (b) General procedures... requirements of this subpart. (1) Each railroad or employer shall adopt and implement written procedures which... fulfill the requirements of this subpart. Each railroad or employer's written procedures shall provide for...

  4. 49 CFR 218.97 - Good faith challenge procedures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... the railroad's operating rules implementing the requirements of this subpart. (b) General procedures... requirements of this subpart. (1) Each railroad or employer shall adopt and implement written procedures which... fulfill the requirements of this subpart. Each railroad or employer's written procedures shall provide for...

  5. 49 CFR 218.97 - Good faith challenge procedures.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the railroad's operating rules implementing the requirements of this subpart. (b) General procedures... requirements of this subpart. (1) Each railroad or employer shall adopt and implement written procedures which... fulfill the requirements of this subpart. Each railroad or employer's written procedures shall provide for...

  6. 49 CFR 218.97 - Good faith challenge procedures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... the railroad's operating rules implementing the requirements of this subpart. (b) General procedures... requirements of this subpart. (1) Each railroad or employer shall adopt and implement written procedures which... fulfill the requirements of this subpart. Each railroad or employer's written procedures shall provide for...

  7. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  8. 28 CFR Appendix D to Part 61 - Office of Justice Assistance, Research, and Statistics Procedures Relating to the Implementation...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., and Statistics Procedures Relating to the Implementation of the National Environmental Policy Act D... Assistance, Research, and Statistics Procedures Relating to the Implementation of the National Environmental... Statistics (OJARS) assists State and local units of government in strengthening and improving law enforcement...

  9. Outcomes of a novel minimalist approach for the treatment of cubital tunnel syndrome.

    PubMed

    Lan, Zheng D; Tatsui, Claudio E; Jalali, Ali; Humphries, William E; Rilea, Katheryn; Patel, Akash; Ehni, Bruce L

    2015-06-01

    We describe a minimalist approach to perform in situ decompression of the ulnar nerve. Our technique employs a unique small skin incision strategically placed to minimize postoperative scarring over the ulnar nerve and potentially decrease the risk of iatrogenic injury to the medial antebrachial cutaneous nerve. We retrospectively report the outcomes of patients who underwent this procedure at our institution, the Michael E. DeBakey Veterans Affairs Medical Center, from January 1, 2007 through November 29, 2010. All individuals underwent in situ decompression via the previously described minimalist approach. Outcome variables were Louisiana State University Medical Center (LSU) ulnar neuropathy grade, patient satisfaction, subjective improvement, complications, and re-operation rate. A total of 44 procedures were performed in this cohort of 41 patients. Overall, patients' postoperative LSU grades showed a statistically significant improvement (p=0.0019) compared with preoperative grades. Improvement of at least one grade on the LSU scale was observed in 50% of the procedures with a preoperative grade of four or less. The overall procedure satisfaction rate was 88% (39 of 44), with 70% (31 of 44) of the procedures resulting in improvement of symptoms. There were no intraoperative or postoperative complications. One patient required re-operation due to failure of neurological improvement. Our minimalist approach to in situ decompression of the ulnar nerve at the cubital tunnel is both safe and effective. We observed a statistically significant improvement in LSU ulnar neuropathy grades and a success rate comparable to those reported for other, more extensive surgical techniques, while providing the benefit of a smaller incision, less scarring, decreased risk of iatrogenic nerve injury, and minimal complications.

  10. A procedure for combining acoustically induced and mechanically induced loads (first passage failure design criterion)

    NASA Technical Reports Server (NTRS)

    Crowe, D. R.; Henricks, W.

    1983-01-01

    The combined load statistics are developed by taking the acoustically induced load to be a random population, assumed to be stationary. Each element of this ensemble of acoustically induced loads is assumed to have the same power spectral density (PSD), obtained previously from a random response analysis employing the given acoustic field in the STS cargo bay as a stationary random excitation. The mechanically induced load is treated as either (1) a known deterministic transient, or (2) a nonstationary random variable of known first and second statistical moments which vary with time. A method is then shown for determining the probability that the combined load would, at any time, have a value equal to or less than a certain level. Having obtained a statistical representation of how the acoustic and mechanical loads are expected to combine, an analytical approximation for defining design levels for these loads is presented using the First Passage failure criterion.
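    The combination described above can be sketched numerically: at any instant, the combined load is the known mechanical transient plus a zero-mean Gaussian acoustic load whose variance is the area under its PSD. The PSD shape, load level, and transient values below are illustrative assumptions, not from the report.

    ```python
    import numpy as np
    from math import erf, sqrt

    def normal_cdf(x):
        """Standard normal CDF via the error function."""
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    # Toy one-sided PSD of the acoustically induced load (units: load^2/Hz).
    freqs = np.linspace(1.0, 200.0, 400)
    psd = 0.5 * np.exp(-freqs / 50.0)

    # RMS of the stationary acoustic load is the square root of the PSD area
    # (trapezoidal integration, done by hand for portability).
    sigma_acoustic = np.sqrt(np.sum(0.5 * (psd[1:] + psd[:-1]) * np.diff(freqs)))

    def prob_not_exceeded(level, mech_load, sigma=sigma_acoustic):
        """P(combined load <= level) at one instant, treating the acoustic
        part as zero-mean Gaussian added to a known mechanical transient."""
        return normal_cdf((level - mech_load) / sigma)

    # Example: a deterministic mechanical transient sampled at three times.
    for t, m in [(0.0, 0.0), (0.5, 2.0), (1.0, 0.5)]:
        p = prob_not_exceeded(level=10.0, mech_load=m)
        print(f"t={t:.1f}s  P(load <= 10) = {p:.4f}")
    ```

    A full first-passage analysis would further account for how often the process up-crosses the design level over the mission duration; the instantaneous probability above is only the building block.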

  11. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data.

    PubMed

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J; Intarapanich, Apichart; Tongsima, Sissades; Piriyapongsa, Jittima

    2017-01-01

    Biochemical methods are available for enriching the 5' ends of RNAs in prokaryotes, and are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5' ends in these data by statistical analysis of the enrichment. Although statistically based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data: the more efficient enrichment in Cappable-seq compared with dRNA-seq could affect the data distribution and thus algorithm performance. We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase the power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5' ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5' ends than TSSAR. In general, the transcript 5' ends detected by ToNER but not TSSAR occur in regions that cannot be locally modeled by TSSAR. ToNER uses a simple yet robust statistical modeling approach that can be used for detecting RNA 5' ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied to other RNA-seq datasets in which enrichment for other structural features of RNA is employed. The program is freely available for download at the ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and the GitHub repository (https://github.com/PavitaKae/ToNER).
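    The core idea of the abstract (a global Box-Cox transform of per-nucleotide enrichment ratios, followed by tail tests on the transformed scale) can be sketched on simulated counts. This is not ToNER itself; the count model, pseudocount, and significance cutoff are assumptions for illustration.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)

    # Simulated per-nucleotide read counts: a background library and an
    # enriched library where positions 10 and 50 are true 5' ends (toy data).
    n = 200
    unenriched = rng.poisson(20, n).astype(float)
    enriched = rng.poisson(20, n).astype(float)
    enriched[[10, 50]] += 400  # strong enrichment at two sites

    # Per-nucleotide enrichment ratios (pseudocount keeps ratios positive,
    # as Box-Cox requires strictly positive input).
    ratios = (enriched + 1.0) / (unenriched + 1.0)

    # Global Box-Cox transform toward normality, then z-scores on that scale.
    transformed, lam = stats.boxcox(ratios)
    z = (transformed - transformed.mean()) / transformed.std(ddof=1)

    # Declare sites significant where the one-sided normal tail is tiny.
    pvals = stats.norm.sf(z)
    hits = np.where(pvals < 1e-4)[0]
    print("Box-Cox lambda:", round(lam, 3))
    print("significant positions:", hits)
    ```

    Meta-analysis across replicates, as offered by the tool, would combine such per-replicate evidence (e.g., by combining p-values) before calling sites.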

  12. Hands-on 2.0: improving transfer of training via the Society of American Gastrointestinal and Endoscopic Surgeons (SAGES) Acquisition of Data for Outcomes and Procedure Transfer (ADOPT) program.

    PubMed

    Dort, Jonathan; Trickey, Amber; Paige, John; Schwarz, Erin; Dunkin, Brian

    2017-08-01

    Practicing surgeons commonly learn new procedures and techniques by attending a "hands-on" course, though such training is often ineffective at promoting subsequent adoption of the procedure in practice. We describe the implementation of a new program within the SAGES All Things Hernia Hands-On Course: Acquisition of Data for Outcomes and Procedure Transfer (ADOPT), which employs standardized, proven teaching techniques and one year of mentorship. Attendee confidence and procedure adoption are compared between the standard and ADOPT programs. For the pilot ADOPT course implementation, a hands-on course focusing on abdominal wall hernia repair was chosen. ADOPT participants were recruited among enrollees in the standard Hands-On Hernia Course. Enrollment in ADOPT was capped at 10 participants and limited to a 2:1 student-to-faculty ratio, compared with the standard course's 22 participants and 4:1 student-to-faculty ratio. ADOPT mentors interacted with participants through webinars, phone conferences, and continuous email availability throughout the year. All participants were asked to complete pre- and post-course surveys reporting the number of targeted hernia procedures performed and their related confidence level. Four of 10 ADOPT participants (40%) and six of 22 standard training participants (27%) returned questionnaires. Over the 3 months following the course, ADOPT participants performed more ventral hernia mesh insertion procedures than standard training participants (median 13 vs. 0.5, p = 0.010) and considerably more total combined procedures (median 26 vs. 7, p = 0.054). Compared with standard training, learners who participated in ADOPT reported greater confidence improvements in employing a components separation via an open approach (p = 0.051) and in performing an open transversus abdominis release, though the latter difference did not achieve statistical significance (p = 0.14). These results suggest that the ADOPT program, with standardized and structured teaching, telementoring, and a longitudinal educational approach, is effective and leads to better transfer of learned skills and procedures to clinical practice.

  13. 28 CFR 105.23 - Procedure for requesting criminal history record check.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 28 Judicial Administration 2 2013-07-01 2013-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...

  14. 28 CFR 105.23 - Procedure for requesting criminal history record check.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 2 2014-07-01 2014-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...

  15. 28 CFR 105.23 - Procedure for requesting criminal history record check.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 28 Judicial Administration 2 2011-07-01 2011-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...

  16. 28 CFR 105.23 - Procedure for requesting criminal history record check.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 28 Judicial Administration 2 2012-07-01 2012-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...

  17. 28 CFR 105.23 - Procedure for requesting criminal history record check.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 2 2010-07-01 2010-07-01 false Procedure for requesting criminal history... HISTORY BACKGROUND CHECKS Private Security Officer Employment § 105.23 Procedure for requesting criminal history record check. These procedures only apply to participating states. An authorized employer may...

  18. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 8 2013-07-01 2013-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  19. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 8 2012-07-01 2012-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  20. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 8 2014-07-01 2014-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  1. 29 CFR 1926.1403 - Assembly/Disassembly-selection of manufacturer or employer procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 8 2011-07-01 2011-07-01 false Assembly/Disassembly-selection of manufacturer or employer... CONSTRUCTION Cranes and Derricks in Construction § 1926.1403 Assembly/Disassembly—selection of manufacturer or... applicable to assembly and disassembly, or (b) Employer procedures for assembly and disassembly. Employer...

  2. 28 CFR 42.612 - Interagency consultation.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 42.612 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of... unlawful discrimination or initiating formal administrative enforcement procedures on that basis, an agency...

  3. A Comparative Study of Microleakage on Dental Surfaces Bonded with Three Self-Etch Adhesive Systems Treated with the Er:YAG Laser and Bur

    PubMed Central

    Sanhadji El Haddar, Youssef; Cetik, Sibel; Bahrami, Babak; Atash, Ramin

    2016-01-01

    Aim. This study sought to compare the microleakage of three adhesive systems in Er:YAG laser and diamond bur cavity preparations. Materials and Methods. Standardized Class V cavities were prepared in 72 extracted human teeth by means of diamond burs or the Er:YAG laser. The samples were randomly divided into six groups of 12, testing three adhesive systems (Clearfil s3 Bond Plus, Xeno® Select, and Futurabond U) for each preparation method. Cavities were restored with composite resin before thermocycling and immersion in 2% methylene blue for 24 h. Slices were prepared using a microtome, and optical microscope photography was employed to measure dye penetration. Results. No statistically significant differences in microleakage were found between bur and laser preparation, nor between adhesive systems. The only statistically significant differences were observed when comparing enamel with cervical walls (p < 0.001). Conclusion. It can be concluded that the Er:YAG laser is as efficient as the diamond bur with respect to microleakage in adhesive restoration procedures, thus constituting an alternative tool for tooth preparation. PMID:27419128

  4. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.
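    EMA generalizes the Bulletin 17B fit to censored historical data, but the baseline it retains is the method-of-moments Log Pearson Type 3 fit, which can be sketched directly. The synthetic peak-flow record below is an assumption for illustration; `scipy.stats.pearson3` supplies the Pearson III quantile function on the log scale.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)

    # Synthetic 80-year annual-peak record (log-normal-ish flows, illustrative only).
    peaks = rng.lognormal(mean=6.0, sigma=0.5, size=80)

    # Bulletin 17B-style fit: method of moments on log10 flows
    # (Log Pearson Type 3 = Pearson III applied to log10 of the data).
    x = np.log10(peaks)
    mu, sd = x.mean(), x.std(ddof=1)
    skew = stats.skew(x, bias=False)

    def lp3_quantile(p, mu=mu, sd=sd, skew=skew):
        """Flood quantile with annual non-exceedance probability p."""
        return 10.0 ** stats.pearson3.ppf(p, skew, loc=mu, scale=sd)

    q10 = lp3_quantile(0.90)    # "10-year" flood
    q100 = lp3_quantile(0.99)   # "100-year" flood
    print(f"10-yr: {q10:.0f}   100-yr: {q100:.0f}")
    ```

    EMA replaces the sample moments above with expected moments that honor censored historical observations (e.g., "the flood of 1890 exceeded some threshold"); the paper's contribution is closed-form confidence intervals for the resulting quantile estimates.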

  5. 28 CFR 42.613 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... (f) Joint complaint means a complaint of employment discrimination covered by title VII or the Equal... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal...

  6. 75 FR 38871 - Proposed Collection; Comment Request for Revenue Procedure 2004-29

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-06

    ... comments concerning Revenue Procedure 2004-29, Statistical Sampling in Sec. 274 Context. DATES: Written... Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling in Sec...: Revenue Procedure 2004-29 prescribes the statistical sampling methodology by which taxpayers under...

  7. Finding the Root Causes of Statistical Inconsistency in Community Earth System Model Output

    NASA Astrophysics Data System (ADS)

    Milroy, D.; Hammerling, D.; Baker, A. H.

    2017-12-01

    Baker et al (2015) developed the Community Earth System Model Ensemble Consistency Test (CESM-ECT) to provide a metric for software quality assurance by determining statistical consistency between an ensemble of CESM outputs and new test runs. The test has proved useful for detecting statistical differences caused by compiler bugs and errors in physical modules. However, detection is only the necessary first step in finding the causes of statistical difference. The CESM is a vastly complex model comprising millions of lines of code, developed and maintained by a large community of software engineers and scientists; any root cause analysis is correspondingly challenging. We propose a new capability for CESM-ECT: identifying the sections of code that cause statistical distinguishability. The first step is to discover the CESM variables that cause CESM-ECT to classify new runs as statistically distinct, which we achieve via Randomized Logistic Regression. Next we use a tool developed to identify the CESM components that define or compute the variables found in the first step. Finally, we employ the application Kernel GENerator (KGEN) created in Kim et al (2016) to detect fine-grained floating point differences. We demonstrate an example of the procedure and advance a plan to automate this process in our future work.
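    The first step, picking out the variables that drive a "statistically distinct" classification, uses randomized logistic regression. A minimal stability-selection-style sketch follows; the data, the plain gradient-descent fitter, and the randomization scheme (bootstrap resampling plus random coefficient rescaling) are assumptions standing in for the CESM-ECT pipeline.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy stand-in for CESM-ECT output: 300 runs x 8 variables, where only
    # variables 0 and 3 actually separate "consistent" from "distinct" runs.
    n, d = 300, 8
    X = rng.normal(size=(n, d))
    y = (0.8 * X[:, 0] - 1.1 * X[:, 3] + 0.3 * rng.normal(size=n) > 0).astype(float)

    def fit_logistic(X, y, l2=1.0, iters=400, lr=0.1):
        """Plain L2-regularized logistic regression by gradient descent."""
        w = np.zeros(X.shape[1])
        for _ in range(iters):
            p = 1.0 / (1.0 + np.exp(-X @ w))
            w -= lr * ((X.T @ (p - y)) / len(y) + l2 * w / len(y))
        return w

    def selection_frequency(X, y, n_rounds=50, keep=3):
        """Randomized-logistic-regression style importance: refit on bootstrap
        resamples with random coefficient rescaling and count how often each
        variable lands among the largest |coefficients|."""
        counts = np.zeros(X.shape[1])
        for _ in range(n_rounds):
            idx = rng.integers(0, len(y), len(y))       # bootstrap resample
            scale = rng.uniform(0.5, 1.0, X.shape[1])   # random rescaling
            w = fit_logistic(X[idx] * scale, y[idx])
            counts[np.argsort(-np.abs(w))[:keep]] += 1
        return counts / n_rounds

    freq = selection_frequency(X, y)
    print("selection frequencies:", np.round(freq, 2))
    ```

    Variables that are selected in nearly every randomized refit are the stable drivers of distinguishability; one-off selections are treated as noise.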

  8. 29 CFR 1625.30 - Administrative exemptions; procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ....30 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION AGE DISCRIMINATION IN EMPLOYMENT ACT Administrative Exemptions § 1625.30 Administrative exemptions; procedures. (a... discrimination in employment. Administrative action consistent with this statutory purpose may be taken under...

  9. 29 CFR 1625.30 - Administrative exemptions; procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ....30 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION AGE DISCRIMINATION IN EMPLOYMENT ACT Administrative Exemptions § 1625.30 Administrative exemptions; procedures. (a... discrimination in employment. Administrative action consistent with this statutory purpose may be taken under...

  10. 29 CFR 1625.30 - Administrative exemptions; procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ....30 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION AGE DISCRIMINATION IN EMPLOYMENT ACT Administrative Exemptions § 1625.30 Administrative exemptions; procedures. (a... discrimination in employment. Administrative action consistent with this statutory purpose may be taken under...

  11. 29 CFR 1625.30 - Administrative exemptions; procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ....30 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION AGE DISCRIMINATION IN EMPLOYMENT ACT Administrative Exemptions § 1625.30 Administrative exemptions; procedures. (a... discrimination in employment. Administrative action consistent with this statutory purpose may be taken under...

  12. 29 CFR 1625.30 - Administrative exemptions; procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ....30 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION AGE DISCRIMINATION IN EMPLOYMENT ACT Administrative Exemptions § 1625.30 Administrative exemptions; procedures. (a... discrimination in employment. Administrative action consistent with this statutory purpose may be taken under...

  13. 75 FR 53738 - Proposed Collection; Comment Request for Rev. Proc. 2007-35

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-09-01

    ... Revenue Procedure Revenue Procedure 2007-35, Statistical Sampling for purposes of Section 199. DATES... through the Internet, at [email protected] . SUPPLEMENTARY INFORMATION: Title: Statistical Sampling...: This revenue procedure provides for determining when statistical sampling may be used in purposes of...

  14. Developing a Novel Parameter Estimation Method for Agent-Based Model in Immune System Simulation under the Framework of History Matching: A Case Study on Influenza A Virus Infection

    PubMed Central

    Li, Tingting; Cheng, Zhengguo; Zhang, Le

    2017-01-01

    Since they can provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABMs) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain appropriate estimates of the model's key parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation, integrating the ABM and a regression method under the framework of history matching, is developed, and a novel parameter estimation method that incorporates the experimental data for the ABM simulator is proposed. First, we employ the ABM as a simulator of the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM and plays the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, estimates of the model parameters are obtained with the particle swarm optimization (PSO) algorithm by fitting the experimental data among the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and predictive accuracy but also favorable computational efficiency. PMID:29194393
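    The implausibility-based pruning step can be illustrated in a few lines: an emulator predicts the simulator output with some variance, and inputs whose predicted output sits too many standard deviations from the observation are discarded. The emulator, observation, and cutoff below are toy assumptions, not the paper's GAM or IAV data.

    ```python
    import numpy as np

    # Toy emulator for a simulator mapping a parameter x to one output.
    # All functions and values here are illustrative assumptions.
    def emulator_mean(x):
        return 2.0 * x + 1.0           # assumed emulator posterior mean

    def emulator_var(x):
        return 0.05 + 0.01 * x**2      # assumed emulator posterior variance

    z, obs_var = 5.0, 0.1              # observation and its error variance

    def implausibility(x):
        """I(x) = |E[f(x)] - z| / sqrt(Var_emulator + Var_obs); inputs with
        I(x) above a cutoff (commonly 3) are discarded as implausible."""
        return np.abs(emulator_mean(x) - z) / np.sqrt(emulator_var(x) + obs_var)

    # One wave of history matching: keep only non-implausible parameter values.
    candidates = np.linspace(-5.0, 5.0, 1001)
    keep = candidates[implausibility(candidates) < 3.0]
    print(f"non-implausible range: [{keep.min():.2f}, {keep.max():.2f}]")
    ```

    In the paper's pipeline, an optimizer such as PSO then searches only this surviving region for the best-fitting parameters.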

  15. Developing a Novel Parameter Estimation Method for Agent-Based Model in Immune System Simulation under the Framework of History Matching: A Case Study on Influenza A Virus Infection.

    PubMed

    Li, Tingting; Cheng, Zhengguo; Zhang, Le

    2017-12-01

    Since they can provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABMs) have been commonly used for immune system simulation. However, it is crucial for an ABM to obtain appropriate estimates of the model's key parameters by incorporating experimental data. In this paper, a systematic procedure for immune system simulation, integrating the ABM and a regression method under the framework of history matching, is developed, and a novel parameter estimation method that incorporates the experimental data for the ABM simulator is proposed. First, we employ the ABM as a simulator of the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM and plays the role of an emulator during history matching. Next, we reduce the input parameter space by introducing an implausibility measure to discard implausible input values. Finally, estimates of the model parameters are obtained with the particle swarm optimization (PSO) algorithm by fitting the experimental data among the non-implausible input values. A real Influenza A Virus (IAV) data set is employed to demonstrate the performance of the proposed method, and the results show that it not only has good fitting and predictive accuracy but also favorable computational efficiency.

  16. Disability on campus: a perspective from faculty and staff.

    PubMed

    Shigaki, Cheryl L; Anderson, Kim M; Howald, Carol L; Henson, Lee; Gregg, Bonnie E

    2012-01-01

    To identify employee perceptions regarding disability-related workplace issues in institutions of higher education (IHEs). Faculty and staff (N=1,144) at a large, Midwestern university. A voluntary online survey of disability-related employment issues was developed by the university's Chancellor's Committee of Persons with Disabilities. Item responses were analyzed using descriptive and Pearson chi-square statistical methods. Fifteen percent of faculty and staff respondents were found to have disabilities, with 26% reporting experience of job discrimination and 20% reporting harassment because of their disability. Results indicated significant differences by gender, employment standing (i.e., faculty or staff), and disability status (i.e., with or without a disability) in perceptions of disability acceptance, campus accessibility, disability awareness, ADA policy, and knowledge of work accommodation procedures. Recommendations for IHEs are provided to promote a welcoming and inclusive campus that ultimately supports work success for persons with a disability.
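    The Pearson chi-square analysis named above amounts to testing independence in a contingency table of group membership against response. A minimal sketch with `scipy.stats.chi2_contingency` follows; the counts are hypothetical, chosen only to mirror the survey's N=1,144 and the kind of comparison reported, and are not the study's data.

    ```python
    import numpy as np
    from scipy.stats import chi2_contingency

    # Hypothetical 2x2 table: disability status (rows) by reported
    # experience of job discrimination (columns: yes / no).
    table = np.array([[45, 127],    # with disability
                      [60, 912]])   # without disability

    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
    ```

    A small p-value indicates that the response distribution differs between groups, which is the form of evidence behind the "significant differences" reported in the abstract.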

  17. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed-model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to the usual mixed-model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965
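    The dependency structure the abstract describes can be made concrete with a toy model (this is not the paper's UP procedure, just an illustration of why dye-balanced designs need paired handling): each sample yields two log-ratios with a gene-specific dye bias of opposite sign, so averaging the pair cancels the bias, and tests must run on the per-sample averages rather than pooling the dependent measurements.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)

    # Toy dye-balanced design: 10 biological samples, each measured once with
    # Cy3 and once with Cy5 for one gene; a fixed gene-specific dye bias
    # enters with opposite sign in the two orientations (assumed model).
    true_effect, dye_bias = 0.5, 0.8
    samples = true_effect + 0.2 * rng.normal(size=10)
    log_ratio_cy3 = samples + dye_bias + 0.1 * rng.normal(size=10)
    log_ratio_cy5 = samples - dye_bias + 0.1 * rng.normal(size=10)

    # Averaging the two orientations per sample cancels the dye bias; the
    # two measurements of one sample are dependent, so the t-test is run on
    # the 10 per-sample averages, not on all 20 pooled values.
    per_sample = 0.5 * (log_ratio_cy3 + log_ratio_cy5)
    t, p = stats.ttest_1samp(per_sample, 0.0)
    print(f"mean log-ratio = {per_sample.mean():.3f}, p = {p:.4f}")
    ```

    Mixed models (ML/REML) handle the same dependency by modeling a per-sample random effect; the paper's point is that a simpler paired-style procedure can be both more efficient and faster.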

  18. 28 CFR 42.606 - General rules concerning EEOC action on complaints.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... complaints. 42.606 Section 42.606 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed... complaints. (a) A complaint of employment discrimination filed with an agency, which is transferred or...

  19. 28 CFR 42.611 - EEOC negotiated settlements and conciliation agreements.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal Financial Assistance § 42.611 EEOC negotiated settlements and... no further action on the complaint of employment discrimination thereafter except that the agency may...

  20. 20 CFR 655.102 - Special procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR TEMPORARY EMPLOYMENT OF FOREIGN WORKERS IN THE UNITED STATES Labor Certification Process for Temporary Agricultural Employment in the United States (H-2A Workers) § 655.102 Special procedures. To provide for a limited degree of...

  1. 29 CFR 1915.505 - Fire response.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... written standard operating procedures for each type of fire response at the employer's facility; (v) The... suppression operations established by written standard operating procedures for each particular type of fire...) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard Employment § 1915...

  2. 29 CFR 1915.505 - Fire response.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... written standard operating procedures for each type of fire response at the employer's facility; (v) The... suppression operations established by written standard operating procedures for each particular type of fire...) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard Employment § 1915...

  3. 29 CFR 1915.505 - Fire response.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... written standard operating procedures for each type of fire response at the employer's facility; (v) The... suppression operations established by written standard operating procedures for each particular type of fire...) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard Employment § 1915...

  4. 29 CFR 1915.505 - Fire response.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... written standard operating procedures for each type of fire response at the employer's facility; (v) The... suppression operations established by written standard operating procedures for each particular type of fire...) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard Employment § 1915...

  5. 29 CFR 1915.505 - Fire response.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... written standard operating procedures for each type of fire response at the employer's facility; (v) The... suppression operations established by written standard operating procedures for each particular type of fire...) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard Employment § 1915...

  6. 29 CFR 1601.25 - Failure of conciliation; notice.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Failure of conciliation; notice. 1601.25 Section 1601.25 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful...

  7. 29 CFR 1601.23 - Preliminary or temporary relief.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Preliminary or temporary relief. 1601.23 Section 1601.23 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful...

  8. 29 CFR 1601.25 - Failure of conciliation; notice.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Failure of conciliation; notice. 1601.25 Section 1601.25 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS Procedure for the Prevention of Unlawful Employment Practices Procedure to Rectify Unlawful...

  9. Working 9-5: Causal Relationships Between Singers' "Day Jobs" and Their Performance Work, With Implications for Vocal Health.

    PubMed

    Bartlett, Irene; Wilson, Pat H

    2017-03-01

It is acknowledged generally that professional contemporary commercial music (CCM) singers engage in supplementary employment ("the day job") to achieve and maintain a reliable living wage. In this paper, consideration is given to the impact of such nonperformance employment on CCM singers' sustainable vocal health. Collected data from a survey of 102 professional contemporary gig singers were analysed using descriptive statistical procedures from the Statistical Package for the Social Sciences. Although these data provided descriptions of the personal characteristics of individuals in the sample, the inclusion of open-format questions encouraged participants to report details of their "lived" experience. Additionally, a meta-analysis of a range of associated literature was undertaken. Sixty-five of the 102 participants reported that in addition to their heavy performance voice use, they were employed in "other" work (the "day job") where their speaking voice loads were high. In responding to open-ended questions, many offered unprompted written comments. The collected data from this element of the research study are reported here. We propose that at least some causal factors of singers' reported voice problems may lie in the misuse or overuse of their everyday speaking voice (as demanded by their "day job") rather than a misuse of their singing voice. These findings have practical application to all whose concern is care for the vocal or emotional health and performance longevity of professional singers. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  10. Psychological profiling of offender characteristics from crime behaviors in serial rape offences.

    PubMed

    Kocsis, Richard N; Cooksey, Ray W; Irwin, Harvey J

    2002-04-01

Criminal psychological profiling has progressively been incorporated into police procedures despite a dearth of empirical research. Indeed, in the study of serial violent crimes for the purpose of psychological profiling, very few original, quantitative, academically reviewed studies actually exist. This article reports on the analysis of 62 incidents of serial sexual assault. The statistical procedure of multidimensional scaling was employed in the analysis of these data, which in turn produced a five-cluster model of serial rapist behavior. First, a central cluster of behaviors was identified that represents behaviors common to all patterns of serial rape. Second, four distinct outlying patterns were identified as demonstrating distinct offence styles, and these were assigned the descriptive labels brutality, intercourse, chaotic, and ritual. Furthermore, analysis of these patterns also identified distinct offender characteristics that allow for the use of empirically robust offender profiles in future serial rape investigations.

  11. BIOREL: the benchmark resource to estimate the relevance of the gene networks.

    PubMed

    Antonov, Alexey V; Mewes, Hans W

    2006-02-06

The progress of high-throughput methodologies in functional genomics has led to the development of statistical procedures to infer gene networks from various types of high-throughput data. However, due to the lack of common standards, the biological significance of the results of the different studies is hard to compare. To overcome this problem we propose a benchmark procedure and have developed a web resource (BIOREL), which is useful for estimating the biological relevance of any genetic network by integrating different sources of biological information. The associations of each gene from the network are classified as biologically relevant or not. The proportion of genes in the network classified as "relevant" is used as the overall network relevance score. Employing synthetic data, we demonstrated that such a score ranks networks fairly with respect to their relevance level. Using BIOREL as the benchmark resource, we compared the quality of experimental and theoretically predicted protein interaction data.
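The overall relevance score described above is a simple proportion over per-gene classifications. A minimal sketch follows; the gene labels, annotation set, and stand-in classifier are hypothetical illustrations, not BIOREL's actual data or logic:

```python
def network_relevance_score(network_genes, is_relevant):
    """Fraction of genes in the network whose associations are
    classified as biologically relevant by the supplied classifier."""
    relevant = sum(1 for g in network_genes if is_relevant(g))
    return relevant / len(network_genes)

# Toy example: made-up gene labels and a stand-in classifier that
# checks membership in an (invented) set of annotated genes.
genes = ["g1", "g2", "g3", "g4"]
annotated = {"g1", "g3", "g4"}
score = network_relevance_score(genes, lambda g: g in annotated)
print(score)  # 0.75
```

Networks can then be ranked by this score, with a higher proportion of "relevant" genes indicating a more biologically plausible network.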

  12. Application of a faith-based integration tool to assess mental and physical health interventions.

    PubMed

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  13. Headspace screening: A novel approach for fast quality assessment of the essential oil from culinary sage.

    PubMed

    Cvetkovikj, Ivana; Stefkov, Gjoshe; Acevska, Jelena; Karapandzova, Marija; Dimitrovska, Aneta; Kulevanova, Svetlana

    2016-07-01

Quality assessment of essential oil (EO) from culinary sage (Salvia officinalis L., Lamiaceae) is limited by the long pharmacopoeial procedure. The aim of this study was to employ headspace (HS) sampling in the quality assessment of sage EO. Different populations (30) of culinary sage were assessed using GC/FID/MS analysis of the hydrodistilled EO (pharmacopoeial method) and HS sampling directly from leaves. Compound profiles from both procedures were evaluated according to ISO 9909 and GDC standards for sage EO quality, revealing compliance for only 10 populations. Factors converting the HS values of the target ISO and GDC components into theoretical EO values were calculated. Statistical analysis revealed a significant relationship between HS and EO values for seven target components. Consequently, HS sampling could be used as a complementary extraction technique for rapid screening in quality assessment of sage EOs. Copyright © 2016 Elsevier Ltd. All rights reserved.
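Deriving a conversion factor between HS and EO values amounts to fitting a simple regression. A minimal ordinary least-squares sketch is shown below; the HS/EO numbers are invented for illustration and are not the paper's measurements:

```python
def ols_fit(x, y):
    """Ordinary least-squares slope and intercept for y ~ a*x + b."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    a = sxy / sxx
    b = my - a * mx
    return a, b

# Invented headspace (HS) and essential-oil (EO) values for one
# hypothetical target component.
hs = [1.0, 2.0, 3.0, 4.0]
eo = [2.1, 3.9, 6.1, 7.9]
slope, intercept = ols_fit(hs, eo)
```

A significant, well-fitting slope of this kind is what would justify converting rapid HS screening values into theoretical EO values.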

  14. Gender and Employment. Current Statistics and Their Implications.

    ERIC Educational Resources Information Center

    Equity Issues, 1996

    1996-01-01

    This publication contains three fact sheets on gender and employment statistics and their implications. The fact sheets are divided into two sections--statistics and implications. The statistics present the current situation of men and women workers as they relate to occupations, education, and earnings. The implications express suggestions for…

  15. 41 CFR 60-3.2 - Scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.2 Scope. A. Application of... tests and other selection procedures which are used as a basis for any employment decision. Employment... certification may be covered by Federal equal employment opportunity law. Other selection decisions, such as...

  16. 41 CFR 60-3.2 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.2 Scope. A. Application of... tests and other selection procedures which are used as a basis for any employment decision. Employment... certification may be covered by Federal equal employment opportunity law. Other selection decisions, such as...

  17. 41 CFR 60-3.2 - Scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.2 Scope. A. Application of... tests and other selection procedures which are used as a basis for any employment decision. Employment... certification may be covered by Federal equal employment opportunity law. Other selection decisions, such as...

  18. 41 CFR 60-3.2 - Scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 60-3.2 Scope. A. Application of... tests and other selection procedures which are used as a basis for any employment decision. Employment... certification may be covered by Federal equal employment opportunity law. Other selection decisions, such as...

  19. 26 CFR 301.7701-12 - Employer identification number.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Employer identification number. 301.7701-12 Section 301.7701-12 Internal Revenue INTERNAL REVENUE SERVICE, DEPARTMENT OF THE TREASURY (CONTINUED) PROCEDURE AND ADMINISTRATION PROCEDURE AND ADMINISTRATION Definitions § 301.7701-12 Employer identification...

  20. Integrated HPTLC-based Methodology for the Tracing of Bioactive Compounds in Herbal Extracts Employing Multivariate Chemometrics. A Case Study on Morus alba.

    PubMed

    Chaita, Eliza; Gikas, Evagelos; Aligiannis, Nektarios

    2017-03-01

In drug discovery, bioassay-guided isolation is a well-established procedure and still the basic approach for the discovery of natural products with desired biological properties. However, in these procedures, the most laborious and time-consuming step is the isolation of the bioactive constituents. A prior identification of the compounds that contribute to the demonstrated activity of the fractions would enable the selection of proper chromatographic techniques and lead to targeted isolation. Objective - The development of an integrated HPTLC-based methodology for the rapid tracing of bioactive compounds during bioassay-guided processes, using multivariate statistics. Materials and Methods - The methanol extract of Morus alba was fractionated employing CPC. Subsequently, fractions were assayed for tyrosinase inhibition and analyzed with HPTLC. The PLS-R algorithm was performed in order to correlate the analytical data with the biological response of the fractions and identify the compounds with the highest contribution. Two methodologies were developed for the generation of the dataset: one based on manual peak picking and the second based on chromatogram binning. Results and Discussion - Both methodologies afforded comparable results and were able to trace the bioactive constituents (e.g., oxyresveratrol, trans-dihydromorin, 2,4,3'-trihydroxydihydrostilbene). The suggested compounds were compared in terms of Rf values and UV spectra with compounds isolated from M. alba using a typical bioassay-guided process. Chemometric tools supported the development of a novel HPTLC-based methodology for the tracing of tyrosinase inhibitors in M. alba extract. All steps of the experimental procedure implemented techniques that afford essential key elements for application in high-throughput screening procedures for drug discovery purposes. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Inference of missing data and chemical model parameters using experimental statistics

    NASA Astrophysics Data System (ADS)

    Casey, Tiernan; Najm, Habib

    2017-11-01

    A method for determining the joint parameter density of Arrhenius rate expressions through the inference of missing experimental data is presented. This approach proposes noisy hypothetical data sets from target experiments and accepts those which agree with the reported statistics, in the form of nominal parameter values and their associated uncertainties. The data exploration procedure is formalized using Bayesian inference, employing maximum entropy and approximate Bayesian computation methods to arrive at a joint density on data and parameters. The method is demonstrated in the context of reactions in the H2-O2 system for predictive modeling of combustion systems of interest. Work supported by the US DOE BES CSGB. Sandia National Labs is a multimission lab managed and operated by Nat. Technology and Eng'g Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell Intl, for the US DOE NCSA under contract DE-NA-0003525.
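The propose-and-accept idea described above, keeping only hypothetical data sets that agree with the reported statistics, is the core of rejection-style approximate Bayesian computation. A minimal sketch follows; the prior, noise model, target value, and tolerance are all illustrative assumptions, not the paper's actual H2-O2 setup:

```python
import random

def abc_rejection(simulate, summarize, target, tol, prior_sample, n_draws=20000):
    """Keep parameter draws whose simulated data reproduce the
    reported summary statistic to within a tolerance."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()          # propose a parameter value
        data = simulate(theta)          # propose a noisy hypothetical data set
        if abs(summarize(data) - target) < tol:
            accepted.append(theta)      # agrees with the reported statistic
    return accepted

# Toy target: a reported nominal value of 2.0. The uniform prior, the
# Gaussian noise level, and the tolerance are invented for illustration.
random.seed(0)
post = abc_rejection(
    simulate=lambda th: [th + random.gauss(0, 0.5) for _ in range(5)],
    summarize=lambda d: sum(d) / len(d),
    target=2.0,
    tol=0.1,
    prior_sample=lambda: random.uniform(0, 4),
)
```

The accepted draws approximate a posterior density concentrated near parameter values consistent with the reported statistic; richer summaries (nominal values plus uncertainties) tighten the approximation.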

  2. Statistical Characterization of Environmental Error Sources Affecting Electronically Scanned Pressure Transducers

    NASA Technical Reports Server (NTRS)

    Green, Del L.; Walker, Eric L.; Everhart, Joel L.

    2006-01-01

Minimization of uncertainty is essential to extend the usable range of the 15-psid Electronically Scanned Pressure (ESP) transducer measurements to the low free-stream static pressures found in hypersonic wind tunnels. Statistical characterization of environmental error sources inducing much of this uncertainty requires a well defined and controlled calibration method. Employing such a controlled calibration system, several studies were conducted that provide quantitative information detailing the required controls needed to minimize environmental and human induced error sources. Results of temperature, environmental pressure, over-pressurization, and set point randomization studies for the 15-psid transducers are presented along with a comparison of two regression methods using data acquired with both 0.36-psid and 15-psid transducers. Together these results provide insight into procedural and environmental controls required for long term high-accuracy pressure measurements near 0.01 psia in the hypersonic testing environment using 15-psid ESP transducers.

  4. Engineering Students Designing a Statistical Procedure for Quantifying Variability

    ERIC Educational Resources Information Center

    Hjalmarson, Margret A.

    2007-01-01

    The study examined first-year engineering students' responses to a statistics task that asked them to generate a procedure for quantifying variability in a data set from an engineering context. Teams used technological tools to perform computations, and their final product was a ranking procedure. The students could use any statistical measures,…

  5. 29 CFR 1621.3 - Procedure for requesting an opinion letter.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 1621.3 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-THE EQUAL PAY ACT § 1621.3 Procedure for requesting an opinion letter. (a) A request for an opinion letter should be submitted in writing to the Chairman, Equal Employment Opportunity Commission...

  6. A randomized controlled clinical trial to evaluate blood pressure changes in patients undergoing extraction under local anesthesia with vasopressor use.

    PubMed

    Uzeda, Marcelo José; Moura, Brenda; Louro, Rafael Seabra; da Silva, Licínio Esmeraldo; Calasans-Maia, Mônica Diuana

    2014-05-01

    The control of hypertensive patients' blood pressure and heart rate using vasoconstrictors during surgical procedures under anesthesia is still a major concern in everyday surgical practice. This clinical trial aimed to evaluate the variation of blood pressure and heart rate in nonhypertensive and controlled hypertensive voluntary subjects undergoing oral surgery under local anesthesia with lidocaine hydrochloride and epinephrine at 1:100,000 (Alphacaine; DFL, Brazil), performed in the Oral Surgery Department, Dentistry School, Fluminense Federal University. In total, 25 voluntary subjects were divided into 2 groups: nonhypertensive (n = 15) and controlled hypertensives (n = 10). Blood pressure and heart rate were measured at 4 different times: T0, in the waiting room; T1, after placement of the surgical drapes; T2, 10 minutes after anesthesia injection; and T3, at the end of the surgical procedure. A statistically significant difference (P < 0.05) between the groups was found at times T0 and T2 for the systolic pressure but only at time T0 for the diastolic pressure. The assessment of the heart rate of both groups showed a statistically significant difference (P < 0.05) at time T1. An analysis of the employed anesthetic volume indicated no statistically significant difference (P > 0.05) between the amount administered to nonhypertensive and hypertensive subjects. It was concluded that the local anesthetics studied could safely be used in controlled hypertensive and nonhypertensive patients in compliance with the maximum recommended doses.

  7. Fast maximum likelihood estimation using continuous-time neural point process models.

    PubMed

    Lepage, Kyle Q; MacDonald, Christopher J

    2015-06-01

A recent report estimates that the number of simultaneously recorded neurons is growing exponentially. A commonly employed statistical paradigm using discrete-time point process models of neural activity involves the computation of a maximum-likelihood estimate. The time to compute this estimate, per neuron, is proportional to the number of bins in a finely spaced discretization of time. By using continuous-time models of neural activity and optimally efficient Gaussian quadrature, memory requirements and computation times are dramatically decreased in the commonly encountered situation where the number of parameters p is much less than the number of time-bins n. In this regime, with q equal to the quadrature order, memory requirements are decreased from O(np) to O(qp), and the number of floating-point operations is decreased from O(np^2) to O(qp^2). Accuracy of the proposed estimates is assessed based upon physiological considerations, error bounds, and mathematical results describing the relation between numerical integration error and the numerical error affecting both parameter estimates and the observed Fisher information. A check is provided which is used to adapt the order of numerical integration. The procedure is verified in simulation and for hippocampal recordings. It is found that in 95% of hippocampal recordings a q of 60 yields numerical error negligible with respect to parameter estimate standard error. Statistical inference using the proposed methodology is a fast and convenient alternative to statistical inference performed using a discrete-time point process model of neural activity. It enables the employment of the statistical methodology available with discrete-time inference, but is faster, uses less memory, and avoids any error due to discretization.
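The savings come from evaluating the integral term of the continuous-time log-likelihood at a few quadrature nodes instead of summing over every time bin. A sketch using a hardcoded 5-point Gauss-Legendre rule and an illustrative exponential intensity (not the paper's actual models) is shown below:

```python
import math

# 5-point Gauss-Legendre nodes and weights on [-1, 1].
NODES = [-0.9061798459386640, -0.5384693101056831, 0.0,
         0.5384693101056831, 0.9061798459386640]
WEIGHTS = [0.2369268850561891, 0.4786286704993665, 0.5688888888888889,
           0.4786286704993665, 0.2369268850561891]

def gauss_legendre(f, a, b):
    """Approximate the integral of f over [a, b] with q = 5 evaluations."""
    mid, half = (a + b) / 2.0, (b - a) / 2.0
    return half * sum(w * f(mid + half * x) for x, w in zip(NODES, WEIGHTS))

# Illustrative intensity lambda(t) = exp(b0 + b1*t). The integral term
# of a point-process log-likelihood then costs q = 5 evaluations
# rather than one per time bin of a fine discretization.
b0, b1, T = 0.2, 0.3, 10.0
approx = gauss_legendre(lambda t: math.exp(b0 + b1 * t), 0.0, T)
exact = (math.exp(b0 + b1 * T) - math.exp(b0)) / b1
```

For a smooth intensity like this one, the 5-node rule already matches the closed-form integral to several decimal places, which is why small q can replace large n.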

  8. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  9. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 2 2011-01-01 2011-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  10. 75 FR 79320 - Animal Drugs, Feeds, and Related Products; Regulation of Carcinogenic Compounds in Food-Producing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-20

    ... is calculated from tumor data of the cancer bioassays using a statistical extrapolation procedure... carcinogenic concern currently set forth in Sec. 500.84 utilizes a statistical extrapolation procedure that... procedures did not rely on a statistical extrapolation of the data to a 1 in 1 million risk of cancer to test...

  11. 7 CFR 52.38b - Statistical sampling procedures for on-line inspection by attributes of processed fruits and...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for on-line inspection by attributes of processed fruits and vegetables. 52.38b Section 52.38b Agriculture Regulations of... Regulations Governing Inspection and Certification Sampling § 52.38b Statistical sampling procedures for on...

  12. 7 CFR 52.38c - Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 2 2010-01-01 2010-01-01 false Statistical sampling procedures for lot inspection of processed fruits and vegetables by attributes. 52.38c Section 52.38c Agriculture Regulations of the... Regulations Governing Inspection and Certification Sampling § 52.38c Statistical sampling procedures for lot...

  13. Implementation of false discovery rate for exploring novel paradigms and trait dimensions with ERPs.

    PubMed

    Crowley, Michael J; Wu, Jia; McCreary, Scott; Miller, Kelly; Mayes, Linda C

    2012-01-01

False discovery rate (FDR) is a multiple comparison procedure that targets the expected proportion of false discoveries among the discoveries. Employing FDR methods in event-related potential (ERP) research provides an approach to explore new ERP paradigms and ERP-psychological trait/behavior relations. In Study 1, we examined neural responses to escape behavior from an aversive noise. In Study 2, we correlated a relatively unexplored trait dimension, ostracism, with neural response. In both situations we focused on the frontal cortical region, applying channel-by-time plots to display statistically significant uncorrected and FDR-corrected data, controlling for multiple comparisons.
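FDR control in this setting is typically the Benjamini-Hochberg step-up procedure. A minimal sketch with toy p-values (not the study's data) follows:

```python
def benjamini_hochberg(pvals, q=0.05):
    """Return indices of hypotheses rejected at FDR level q using the
    Benjamini-Hochberg step-up procedure."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0  # largest rank whose p-value clears its step-up threshold
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= rank * q / m:
            k = rank
    return sorted(order[:k])  # reject the k smallest p-values

# Toy p-values, e.g. one per channel-by-time point.
p = [0.001, 0.008, 0.039, 0.041, 0.30, 0.74]
print(benjamini_hochberg(p, q=0.05))  # [0, 1]
```

Note that each p-value is compared against a rank-dependent threshold rank*q/m, which is less conservative than a Bonferroni cutoff of q/m for every test.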

  14. Heat balance statistics derived from four-dimensional assimilations with a global circulation model

    NASA Technical Reports Server (NTRS)

    Schubert, S. D.; Herman, G. F.

    1981-01-01

    The reported investigation was conducted to develop a reliable procedure for obtaining the diabatic and vertical terms required for atmospheric heat balance studies. The method developed employs a four-dimensional assimilation mode in connection with the general circulation model of NASA's Goddard Laboratory for Atmospheric Sciences. The initial analysis was conducted with data obtained in connection with the 1976 Data Systems Test. On the basis of the results of the investigation, it appears possible to use the model's observationally constrained diagnostics to provide estimates of the global distribution of virtually all of the quantities which are needed to compute the atmosphere's heat and energy balance.

  15. SSME/side loads analysis for flight configuration, revision A. [structural analysis of space shuttle main engine under side load excitation

    NASA Technical Reports Server (NTRS)

    Holland, W.

    1974-01-01

    This document describes the dynamic loads analysis accomplished for the Space Shuttle Main Engine (SSME) considering the side load excitation associated with transient flow separation on the engine bell during ground ignition. The results contained herein pertain only to the flight configuration. A Monte Carlo procedure was employed to select the input variables describing the side load excitation and the loads were statistically combined. This revision includes an active thrust vector control system representation and updated orbiter thrust structure stiffness characteristics. No future revisions are planned but may be necessary as system definition and input parameters change.
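The Monte Carlo selection-and-combination idea can be sketched generically: draw input variables from assumed distributions, compute the resulting load, and take a statistical summary of the ensemble. The distributions, sample size, and percentile criterion below are invented for illustration; the actual SSME side-load model is far more detailed:

```python
import random

random.seed(1)

def side_load_sample():
    """One Monte Carlo draw of illustrative input variables
    (the magnitude and direction distributions are invented)."""
    magnitude = random.gauss(100.0, 15.0)   # hypothetical load units
    angle = random.uniform(0.0, 6.283185307179586)  # direction in radians
    return magnitude, angle

def percentile(values, frac):
    """Empirical percentile by sorting (adequate for a sketch)."""
    s = sorted(values)
    return s[min(int(frac * len(s)), len(s) - 1)]

loads = [m for m, _ in (side_load_sample() for _ in range(10000))]
design_load = percentile(loads, 0.99)  # e.g. a 99th-percentile combined load
```

Statistically combining the sampled loads this way yields a design value tied to an exceedance probability rather than a single worst-case stack-up.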

  16. Gas detection by correlation spectroscopy employing a multimode diode laser.

    PubMed

    Lou, Xiutao; Somesfalean, Gabriel; Zhang, Zhiguo

    2008-05-01

A gas sensor based on the gas-correlation technique has been developed using a multimode diode laser (MDL) in a dual-beam detection scheme. Measurement of CO2 mixed with CO as an interfering gas is successfully demonstrated using a 1570 nm tunable MDL. Despite overlapping absorption spectra and occasional mode hops, the interfering signals can be effectively excluded by a statistical procedure including correlation analysis and outlier identification. The gas concentration is retrieved from several pair-correlated signals by a linear-regression scheme, yielding a reliable and accurate measurement. This demonstrates the utility of unsophisticated MDLs as novel light sources for gas detection applications.

  17. 78 FR 19098 - Wage Methodology for the Temporary Non-Agricultural Employment H-2B Program; Delay of Effective Date

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-29

    ... by dividing the Bureau of Labor Statistics Occupational Employment Statistics Survey (OES survey... DEPARTMENT OF LABOR Employment and Training Administration 20 CFR Part 655 RIN 1205-AB61 Wage Methodology for the Temporary Non-Agricultural Employment H- 2B Program; Delay of Effective Date AGENCY...

  18. 28 CFR 42.609 - EEOC reasonable cause determination and conciliation efforts.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal Financial Assistance § 42.609 EEOC reasonable cause...

  19. Effects of consumer motives on search behavior using internet advertising.

    PubMed

    Yang, Kenneth C C

    2004-08-01

Past studies on uses and gratifications theory suggested that consumer motives affect how they use media and media content. Recent advertising research has extended the theory to study the use of Internet advertising. The current study explores the effects of consumer motives on their search behavior using Internet advertising. The study employed a 2 × 2 between-subjects factorial design. A total of 120 subjects were assigned to experimental conditions containing an Internet advertisement that varied by advertising appeal (rational vs. emotional) and product involvement level (high vs. low). Consumer search behavior (measured by the depth, breadth, and total amount of search), demographics, and motives were collected by post-experiment questionnaires. Because all three dependent variables measuring search behavior were conceptually related to each other, MANCOVA procedures were employed to examine the moderating effects of consumer motives on the dependent variables in the four product involvement-advertising appeal conditions. Results indicated that the main effects of product involvement and advertising appeal were statistically significant. Univariate ANOVA also showed that advertising appeals and product involvement levels influenced the total amount of search. Three-way interactions among advertising appeals, product involvement levels, and information motive were also statistically significant. Implications and future research directions are discussed.

  20. Sequential Monte Carlo tracking of the marginal artery by multiple cue fusion and random forest regression.

    PubMed

    Cherry, Kevin M; Peplinski, Brandon; Kim, Lauren; Wang, Shijun; Lu, Le; Zhang, Weidong; Liu, Jianfei; Wei, Zhuoshi; Summers, Ronald M

    2015-01-01

    Given the potential importance of marginal artery localization in automated registration in computed tomography colonography (CTC), we have devised a semi-automated method of marginal vessel detection employing sequential Monte Carlo tracking (also known as particle filtering tracking) by multiple cue fusion based on intensity, vesselness, organ detection, and minimum spanning tree information for poorly enhanced vessel segments. We then employed a random forest algorithm for intelligent cue fusion and decision making which achieved high sensitivity and robustness. After applying a vessel pruning procedure to the tracking results, we achieved statistically significantly improved precision compared to a baseline Hessian detection method (2.7% versus 75.2%, p<0.001). This method also showed statistically significantly improved recall rate compared to a 2-cue baseline method using fewer vessel cues (30.7% versus 67.7%, p<0.001). These results demonstrate that marginal artery localization on CTC is feasible by combining a discriminative classifier (i.e., random forest) with a sequential Monte Carlo tracking mechanism. In so doing, we present the effective application of an anatomical probability map to vessel pruning as well as a supplementary spatial coordinate system for colonic segmentation and registration when this task has been confounded by colon lumen collapse. Published by Elsevier B.V.

  1. Logistic regression for risk factor modelling in stuttering research.

    PubMed

    Reed, Phil; Wu, Yaqionq

    2013-06-01

To outline the uses of logistic regression and other statistical methods for risk factor analysis in the context of research on stuttering. The principles underlying the application of a logistic regression are illustrated, and the types of questions to which such a technique has been applied in the stuttering field are outlined. The assumptions and limitations of the technique are discussed with respect to existing stuttering research, and with respect to formulating appropriate research strategies to accommodate these considerations. Finally, some alternatives to the approach are briefly discussed. The way the statistical procedures are employed is demonstrated with some hypothetical data. Research into several practical issues concerning stuttering could benefit if risk factor modelling were used. Important examples are early diagnosis, prognosis (whether a child will recover or persist) and assessment of treatment outcome. After reading this article you will: (a) Summarize the situations in which logistic regression can be applied to a range of issues about stuttering; (b) Follow the steps in performing a logistic regression analysis; (c) Describe the assumptions of the logistic regression technique and the precautions that need to be checked when it is employed; (d) Be able to summarize its advantages over other techniques like estimation of group differences and simple regression. Copyright © 2012 Elsevier Inc. All rights reserved.
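For readers new to the technique, a minimal single-predictor logistic regression fitted by gradient ascent is sketched below on invented persistence/recovery data; a real risk-factor analysis would use an established statistics package and multiple predictors:

```python
import math

def fit_logistic(xs, ys, lr=0.1, steps=5000):
    """Fit P(y=1|x) = sigmoid(w*x + b) by gradient ascent on the
    log-likelihood (single predictor for brevity)."""
    w = b = 0.0
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            gw += (y - p) * x   # gradient of log-likelihood w.r.t. w
            gb += (y - p)       # gradient w.r.t. b
        w += lr * gw / len(xs)
        b += lr * gb / len(xs)
    return w, b

# Invented data: x might stand for a severity score at onset,
# y = 1 for persistence, y = 0 for recovery.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0]
ys = [0,   0,   0,   1,   0,   1,   1,   1]
w, b = fit_logistic(xs, ys)
odds_ratio = math.exp(w)  # odds multiplier per unit increase in x
```

The exponentiated coefficient is the usual reporting quantity in risk-factor studies: the multiplicative change in the odds of persistence per unit increase in the predictor.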

  2. A statistical model for investigating binding probabilities of DNA nucleotide sequences using microarrays.

    PubMed

    Lee, Mei-Ling Ting; Bulyk, Martha L; Whitmore, G A; Church, George M

    2002-12-01

    There is considerable scientific interest in knowing the probability that a site-specific transcription factor will bind to a given DNA sequence. Microarray methods provide an effective means for assessing the binding affinities of a large number of DNA sequences as demonstrated by Bulyk et al. (2001, Proceedings of the National Academy of Sciences, USA 98, 7158-7163) in their study of the DNA-binding specificities of Zif268 zinc fingers using microarray technology. In a follow-up investigation, Bulyk, Johnson, and Church (2002, Nucleic Acids Research 30, 1255-1261) studied the interdependence of nucleotides on the binding affinities of transcription proteins. Our article is motivated by this pair of studies. We present a general statistical methodology for analyzing microarray intensity measurements reflecting DNA-protein interactions. The log probability of a protein binding to a DNA sequence on an array is modeled using a linear ANOVA model. This model is convenient because it employs familiar statistical concepts and procedures and also because it is effective for investigating the probability structure of the binding mechanism.
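
    A minimal sketch of the additive (ANOVA-style) model on the log scale: simulated 4-mer sequences with position-wise nucleotide effects, fitted by ordinary least squares with treatment coding. The weights and data are simulated, not Zif268 measurements.

```python
import numpy as np

rng = np.random.default_rng(0)
BASES = "ACGT"

# Hypothetical position weights (not from the article): additive contribution
# of each nucleotide at each of 4 positions to the log binding probability.
true_w = rng.normal(0.0, 0.5, size=(4, 4))

# Simulate microarray-style data: random 4-mers with noisy log intensities.
seqs = ["".join(rng.choice(list(BASES), 4)) for _ in range(300)]

def one_hot(seq):
    # Treatment coding: drop one level per position to keep X full rank.
    v = [1.0]  # intercept
    for b in seq:
        v += [1.0 if b == base else 0.0 for base in BASES[1:]]
    return v

X = np.array([one_hot(s) for s in seqs])
y = np.array([sum(true_w[i, BASES.index(b)] for i, b in enumerate(s))
              for s in seqs]) + rng.normal(0, 0.1, len(seqs))

# Ordinary least squares fit of the additive (ANOVA-style) model.
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ coef
print("residual SD:", resid.std())
```

    When the additive model holds, the residual spread matches the measurement noise; systematic excess residual variance is the signature of the nucleotide interdependence studied in the follow-up paper.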

  3. Design of experiments enhanced statistical process control for wind tunnel check standard testing

    NASA Astrophysics Data System (ADS)

    Phillips, Ben D.

    The current wind tunnel check standard testing program at NASA Langley Research Center is focused on increasing data quality, uncertainty quantification and overall control and improvement of wind tunnel measurement processes. The statistical process control (SPC) methodology employed in the check standard testing program allows for the tracking of variations in measurements over time as well as an overall assessment of facility health. While the SPC approach can and does provide researchers with valuable information, it has certain limitations in the areas of process improvement and uncertainty quantification. It is thought that by utilizing design of experiments methodology in conjunction with current SPC practices, one can more efficiently and robustly characterize uncertainties and develop enhanced process improvement procedures. In this research, methodologies were developed to generate regression models for wind tunnel calibration coefficients, balance force coefficients and wind tunnel flow angularities. The coefficients of these regression models were then tracked in statistical process control charts, giving a higher level of understanding of the processes. The methodology outlined is sufficiently generic such that this research can be applicable to any wind tunnel check standard testing program.
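
    Tracking a coefficient in a control chart, as described above, can be sketched with a Shewhart individuals chart whose limits come from the average moving range. The coefficient series below is simulated, with an artificial shift injected to show an out-of-control signal.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical sequence of one tracked regression coefficient (e.g., a tunnel
# calibration coefficient) from 30 successive check standard tests.
coef = rng.normal(1.250, 0.004, 30)
coef[25] += 0.04  # inject a shift to illustrate an out-of-control signal

# Individuals (X) chart: sigma estimated as MRbar/d2 with d2 = 1.128.
mr = np.abs(np.diff(coef))
center = coef.mean()
sigma_hat = mr.mean() / 1.128
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

out = np.where((coef > ucl) | (coef < lcl))[0]
print(f"center={center:.4f}  UCL={ucl:.4f}  LCL={lcl:.4f}")
print("out-of-control points:", out)
```

    A point beyond the limits flags a change in the measurement process, which is exactly the "facility health" signal the check standard program looks for.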

  4. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correcting for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
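
    A minimal sketch of a wild bootstrap test for a regression association under heteroscedasticity, assuming Rademacher multipliers and a simple one-covariate model; the data are simulated, not imaging measures.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical voxel-level data (not the article's): morphological measure y
# vs. a covariate x, with heteroscedastic noise (SD grows with x).
n = 80
x = rng.uniform(0, 1, n)
y = 1.0 * x + rng.normal(0, 0.2 + 0.5 * x, n)

def slope(x, y):
    X = np.column_stack([np.ones_like(x), x])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b[1]

b1 = slope(x, y)

# Wild bootstrap under H0: slope = 0. Multiply null-model residuals by
# Rademacher (+/-1) weights, which preserves each point's own variance.
resid0 = y - y.mean()
B = 2000
boot = np.empty(B)
for i in range(B):
    y_star = y.mean() + resid0 * rng.choice([-1.0, 1.0], n)
    boot[i] = slope(x, y_star)

p_value = np.mean(np.abs(boot) >= abs(b1))
print(f"slope={b1:.3f}  wild-bootstrap p={p_value:.4f}")
```

    Because each resampled residual keeps its original magnitude, the reference distribution honors the heteroscedasticity that would invalidate a plain residual bootstrap.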

  5. A Retrospective Audit of Dental Treatment Provided to Special Needs Patients under General Anesthesia During a Ten-Year Period.

    PubMed

    Mallineni, Sreekanth Kumar; Yiu, Cynthia Kar Y

    The purpose of this study was to perform a comprehensive audit of dental treatment provided to special needs patients (SNP) under general anesthesia (GA) over a ten-year period. Special needs patients who received dental treatment under GA as an in-patient at Queen Mary Hospital, Hong Kong SAR between January 2002 and December 2011 were included in the study. The study population was divided into three groups, based on age (<6 years, 6-12 years, >12 years). One-way ANOVA was used to evaluate the effect of "age group" on duration of treatment, post-recovery time, treatment procedures and utilization of different restorative materials. Kappa statistics were used for intra-examiner reliability. A total of 275 patients (174 males and 101 females) were included in the study. The mean age of the patients at the time they received GA was 12.37±10.18 years. Dental procedures performed were mostly restorative in nature (47%). The >12 years group had significantly shorter treatment duration (p<0.05). No significant difference in post-operative recovery time was observed among the three age groups (p>0.05). The <6 years group received significantly less preventive, but more restorative procedures (p<0.05). Significantly fewer extractions were performed in the 6-12 years group (p<0.05). The use of composite restorations was significantly higher in the <6 years group, while amalgam restorations were more frequently used in the >12 years group (p<0.05). Stainless steel crowns were more frequently employed in SNP under 12 years of age (p<0.05). Intra-examiner reliability was good (k=0.94). Most of the dental procedures performed under GA on SNP were restorative. Children less than 6 years of age had longer treatment times under GA. Composite restorations and stainless steel crowns were more frequently used in the primary dentition and amalgam restorations were more frequently employed in the permanent dentition.

  6. Could some procedures commonly used in bioassays with the copepod Acartia tonsa Dana 1849 distort results?

    PubMed

    Lopes, Laís Fernanda de Palma; Agostini, Vanessa Ochi; Muxagata, Erik

    2018-04-15

    Many organizations have suggested the use of the calanoid copepod Acartia tonsa in protocols for acute toxicity tests. Nevertheless, these protocols present some problems, such as using 60-180µm meshes to separate specific stages of A. tonsa or carrying out the tests using small volumes that reflect high densities of A. tonsa that do not occur in nature, which could lead to distorted results. In addition, ecotoxicological studies may use statistical approaches that are inadequate for the type of data being analysed. For these reasons, some methodological approaches for bioassays using A. tonsa need to be clarified and revised. In this study, we present information about (i) the retention of copepodite stages of A. tonsa on 180, 330 and 500µm net meshes; (ii) tested storage volumes of 1 organism per 5, 10 or 20mL in each test container (TC); and (iii) considerations about the statistics employed. The results demonstrated that a net mesh of 180µm is capable of retaining all copepodite stages (CI to CVI), contrasting with the recommendation of using a 180µm mesh to separate out adults only. Coarser meshes (330 and 500µm) can also retain different proportions of all copepodite stages, but cannot separate out one developmental stage only. Twenty-five millilitres of medium in an open TC, commonly employed in bioassays simulating densities of 1 organism 5mL⁻¹, completely evaporated, and the results showed that the TCs need to be covered (e.g., with PVC film) and filled with a minimum of 100mL of culture medium (simulating densities of 1 organism 20mL⁻¹) to avoid evaporation and increases in salinity. The current use of ANOVA in ecotoxicological studies with proportions of surviving organisms should also be reconsidered since the data are discrete and have a binomial distribution; generalized linear models (GLMs) are considered more adequate. The information presented here suggests some adjustments that will hopefully improve the procedures and methods employed in studies of acute toxicity using the copepod A. tonsa. Copyright © 2017 Elsevier Inc. All rights reserved.
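
    Replacing ANOVA on survival proportions with a binomial model, as the authors recommend, can be sketched with a likelihood-ratio (deviance) test, i.e., a binomial GLM with a group factor fitted at its MLE. The survival counts below are invented, not the paper's data.

```python
import math

# Hypothetical A. tonsa survival counts: (survivors, exposed) per group.
groups = {"control": (19, 20), "low": (17, 20), "mid": (12, 20), "high": (5, 20)}

def binom_loglik(k, n, p):
    # Binomial log-likelihood, constant term omitted (it cancels in the ratio).
    if p in (0.0, 1.0):
        return 0.0 if k in (0, n) else float("-inf")
    return k * math.log(p) + (n - k) * math.log(1 - p)

# Null model: one common survival probability for all groups.
K = sum(k for k, n in groups.values())
N = sum(n for k, n in groups.values())
ll_null = binom_loglik(K, N, K / N)

# Alternative: each group has its own probability (a binomial GLM with a
# group factor, evaluated at its MLE, i.e., the group proportions).
ll_alt = sum(binom_loglik(k, n, k / n) for k, n in groups.values())

# Likelihood-ratio (deviance) statistic, chi-square with 3 df under H0.
G2 = 2 * (ll_alt - ll_null)
print(f"deviance G2 = {G2:.2f} on {len(groups) - 1} df")
```

    The deviance here far exceeds the 3-df chi-square critical value of 7.81, the binomial analogue of a significant one-way ANOVA, without pretending the proportions are normally distributed.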

  7. 10 CFR 10.21 - Suspension of access authorization and/or employment clearance.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Suspension of access authorization and/or employment clearance. 10.21 Section 10.21 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING... Procedures § 10.21 Suspension of access authorization and/or employment clearance. In those cases where...

  8. 10 CFR 10.21 - Suspension of access authorization and/or employment clearance.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 10 Energy 1 2010-01-01 2010-01-01 false Suspension of access authorization and/or employment clearance. 10.21 Section 10.21 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING... Procedures § 10.21 Suspension of access authorization and/or employment clearance. In those cases where...

  9. 10 CFR 10.21 - Suspension of access authorization and/or employment clearance.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Suspension of access authorization and/or employment clearance. 10.21 Section 10.21 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING... Procedures § 10.21 Suspension of access authorization and/or employment clearance. In those cases where...

  10. To t-Test or Not to t-Test? A p-Values-Based Point of View in the Receiver Operating Characteristic Curve Framework.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2018-04-13

    A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points are anticipated to provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-values-based method, taking into account the stochastic nature of p-values. We focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we extend the EPV concept to be considered in terms of the ROC curve technique. This provides expressive evaluations and visualizations of a wide spectrum of testing mechanisms' properties. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope that this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-values-based applications.
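
    The expected p-value (EPV) of a decision rule under a given alternative can be estimated by simulation. The sketch below uses a normal approximation to the two-sample t-test and invented effect sizes; it illustrates the EPV idea only, not the authors' ROC-based derivations.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

def two_sample_p(x, y):
    # Welch z-test p-value (normal approximation to the t-test,
    # adequate at these sample sizes).
    se = math.sqrt(x.var(ddof=1) / len(x) + y.var(ddof=1) / len(y))
    z = (x.mean() - y.mean()) / se
    return math.erfc(abs(z) / math.sqrt(2))

def epv(delta, n=30, sims=4000):
    # Expected p-value under a shift alternative of size delta:
    # smaller EPV = better expected performance of the decision rule.
    ps = [two_sample_p(rng.normal(delta, 1, n), rng.normal(0, 1, n))
          for _ in range(sims)]
    return float(np.mean(ps))

for delta in (0.0, 0.5, 1.0):
    print(f"effect size {delta:.1f}: EPV ~ {epv(delta):.3f}")
```

    Under the null the p-value is uniform, so the EPV sits near 0.5; as the effect grows, the EPV shrinks toward 0, summarizing the whole power curve in a single stochastic quantity.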

  11. A demonstration of the application of the new paradigm for the evaluation of forensic evidence under conditions reflecting those of a real forensic-voice-comparison case.

    PubMed

    Enzinger, Ewald; Morrison, Geoffrey Stewart; Ochoa, Felipe

    2016-01-01

    The new paradigm for the evaluation of the strength of forensic evidence includes: The use of the likelihood-ratio framework. The use of relevant data, quantitative measurements, and statistical models. Empirical testing of validity and reliability under conditions reflecting those of the case under investigation. Transparency as to decisions made and procedures employed. The present paper illustrates the use of the new paradigm to evaluate strength of evidence under conditions reflecting those of a real forensic-voice-comparison case. The offender recording was from a landline telephone system, had background office noise, and was saved in a compressed format. The suspect recording included substantial reverberation and ventilation system noise, and was saved in a different compressed format. The present paper includes descriptions of the selection of the relevant hypotheses, sampling of data from the relevant population, simulation of suspect and offender recording conditions, and acoustic measurement and statistical modelling procedures. The present paper also explores the use of different techniques to compensate for the mismatch in recording conditions. It also examines how system performance would have differed had the suspect recording been of better quality. Copyright © 2015 The Chartered Society of Forensic Sciences. Published by Elsevier Ireland Ltd. All rights reserved.

  12. ToNER: A tool for identifying nucleotide enrichment signals in feature-enriched RNA-seq data

    PubMed Central

    Promworn, Yuttachon; Kaewprommal, Pavita; Shaw, Philip J.; Intarapanich, Apichart; Tongsima, Sissades

    2017-01-01

    Background Biochemical methods are available for enriching 5′ ends of RNAs in prokaryotes, which are employed in the differential RNA-seq (dRNA-seq) and the more recent Cappable-seq protocols. Computational methods are needed to locate RNA 5′ ends from these data by statistical analysis of the enrichment. Although statistics-based analysis methods have been developed for dRNA-seq, they may not be suitable for Cappable-seq data. The more efficient enrichment method employed in Cappable-seq compared with dRNA-seq could affect data distribution and thus algorithm performance. Results We present Transformation of Nucleotide Enrichment Ratios (ToNER), a tool for statistical modeling of enrichment from RNA-seq data obtained from enriched and unenriched libraries. The tool calculates nucleotide enrichment scores and determines the global transformation for fitting to the normal distribution using the Box-Cox procedure. From the transformed distribution, sites of significant enrichment are identified. To increase power of detection, meta-analysis across experimental replicates is offered. We tested the tool on Cappable-seq and dRNA-seq data for identifying Escherichia coli transcript 5′ ends and compared the results with those from the TSSAR tool, which is designed for analyzing dRNA-seq data. When combining results across Cappable-seq replicates, ToNER detects more known transcript 5′ ends than TSSAR. In general, the transcript 5′ ends detected by ToNER but not TSSAR occur in regions which cannot be locally modeled by TSSAR. Conclusion ToNER uses a simple yet robust statistical modeling approach, which can be used for detecting RNA 5′ ends from Cappable-seq data, in particular when combining information from experimental replicates. The ToNER tool could potentially be applied for analyzing other RNA-seq datasets in which enrichment for other structural features of RNA is employed.
The program is freely available for download at ToNER webpage (http://www4a.biotec.or.th/GI/tools/toner) and GitHub repository (https://github.com/PavitaKae/ToNER). PMID:28542466
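
    The Box-Cox step described above can be sketched as follows: a grid search maximizes the profile log-likelihood of the transformation, and transformed scores beyond a z threshold are flagged. The enrichment ratios are simulated, and the z > 3 cutoff is an arbitrary illustrative choice, not ToNER's actual procedure.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical enrichment ratios (enriched/unenriched read signal per
# nucleotide position); right-skewed, as ratio data typically are.
ratios = rng.lognormal(mean=0.0, sigma=0.6, size=5000)
ratios[:10] *= 8.0   # a few truly enriched positions

def boxcox(x, lam):
    return np.log(x) if abs(lam) < 1e-9 else (x ** lam - 1) / lam

def boxcox_loglik(x, lam):
    # Profile log-likelihood of the Box-Cox model with normal errors.
    z = boxcox(x, lam)
    return -0.5 * len(x) * np.log(z.var()) + (lam - 1) * np.log(x).sum()

# Grid search for the transformation that best normalizes the scores.
grid = np.linspace(-2, 2, 81)
lam_hat = grid[np.argmax([boxcox_loglik(ratios, l) for l in grid])]

# Flag sites whose standardized transformed score exceeds z > 3.
z = boxcox(ratios, lam_hat)
z = (z - z.mean()) / z.std()
print(f"lambda ~ {lam_hat:.2f}, flagged sites: {int(np.sum(z > 3))}")
```

    With log-normal input the fitted lambda lands near 0 (the log transform), and the inflated positions stand out in the normalized tail.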

  13. The Study on Mental Health at Work: Design and sampling.

    PubMed

    Rose, Uwe; Schiel, Stefan; Schröder, Helmut; Kleudgen, Martin; Tophoven, Silke; Rauch, Angela; Freude, Gabriele; Müller, Grit

    2017-08-01

    The Study on Mental Health at Work (S-MGA) generates the first nationwide representative survey enabling the exploration of the relationship between working conditions, mental health and functioning. This paper describes the study design, sampling procedures and data collection, and presents a summary of the sample characteristics. S-MGA is a representative study of German employees aged 31-60 years subject to social security contributions. The sample was drawn from the employment register based on a two-stage cluster sampling procedure. Firstly, 206 municipalities were randomly selected from a pool of 12,227 municipalities in Germany. Secondly, 13,590 addresses were drawn from the selected municipalities for the purpose of conducting 4500 face-to-face interviews. The questionnaire covers psychosocial working and employment conditions, measures of mental health, work ability and functioning. Data from personal interviews were combined with employment histories from register data. Descriptive statistics of socio-demographic characteristics and logistic regression analyses were used for comparing the population, gross sample and respondents. In total, 4511 face-to-face interviews were conducted. A test for sampling bias revealed that individuals in older cohorts participated more often, while individuals with an unknown educational level, residing in major cities or with a non-German ethnic background were slightly underrepresented. There is no indication of major deviations in characteristics between the basic population and the sample of respondents. Hence, S-MGA provides representative data for research on work and health, designed as a cohort study with plans to rerun the survey 5 years after the first assessment.
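
    The two-stage cluster draw can be sketched with synthetic identifiers standing in for the employment register. Simple random selection is used at both stages purely for illustration; the actual S-MGA design may have selected municipalities with unequal probabilities.

```python
import random

random.seed(5)

# Stage 0: synthetic municipality identifiers (the real study drew from
# 12,227 German municipalities).
municipalities = [f"M{i:05d}" for i in range(12227)]

# Stage 1: select 206 municipalities.
stage1 = random.sample(municipalities, 206)

# Stage 2: draw addresses within each selected municipality (about 66 per
# municipality) until the gross sample of 13,590 addresses is reached.
per_cluster = -(-13590 // len(stage1))   # ceiling division -> 66
addresses = []
for m in stage1:
    register_m = [f"{m}-addr{j:04d}" for j in range(500)]  # synthetic register
    addresses.extend(random.sample(register_m, per_cluster))
addresses = addresses[:13590]
print(len(stage1), "municipalities,", len(addresses), "addresses drawn")
```

    Clustering the draw this way keeps interviewer travel feasible at the cost of a design effect, which the study's bias checks against register data help quantify.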

  14. 40 CFR Appendix Xviii to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  15. 40 CFR Appendix Xviii to Part 86 - Statistical Outlier Identification Procedure for Light-Duty Vehicles and Light Light-Duty Trucks...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Statistical Outlier Identification... (CONTINUED) Pt. 86, App. XVIII Appendix XVIII to Part 86—Statistical Outlier Identification Procedure for..., but suffer theoretical deficiencies if statistical significance tests are required. Consequently, the...

  16. Applications of statistics to medical science, II overview of statistical procedures for general use.

    PubMed

    Watanabe, Hiroshi

    2012-01-01

    Procedures of statistical analysis are reviewed to provide an overview of applications of statistics for general use. Topics dealt with include inference on a population, comparison of two populations with respect to means and probabilities, and multiple comparisons. This study is the second part of a series in which we survey medical statistics. Arguments related to statistical associations and regressions will be made in subsequent papers.
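
    Among the multiple-comparison procedures such an overview covers, Holm's step-down adjustment is easy to state in code. The p-values below are illustrative, not from the paper.

```python
# Holm's step-down multiple-comparison adjustment: sort the raw p-values,
# multiply the k-th smallest by (m - k + 1), and enforce monotonicity.
def holm_adjust(pvals):
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    adjusted = [0.0] * m
    running_max = 0.0
    for rank, i in enumerate(order):
        running_max = max(running_max, (m - rank) * pvals[i])
        adjusted[i] = min(1.0, running_max)
    return adjusted

raw = [0.001, 0.012, 0.020, 0.045, 0.300]
adj = holm_adjust(raw)
for r, a in zip(raw, adj):
    print(f"raw p={r:.3f}  Holm-adjusted p={a:.3f}")
```

    Holm controls the family-wise error rate under the same conditions as Bonferroni but is uniformly more powerful, which is why overviews usually recommend it as the default step up from Bonferroni.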

  17. Assessment of statistic analysis in non-radioisotopic local lymph node assay (non-RI-LLNA) with alpha-hexylcinnamic aldehyde as an example.

    PubMed

    Takeyoshi, Masahiro; Sawaki, Masakuni; Yamasaki, Kanji; Kimber, Ian

    2003-09-30

    The murine local lymph node assay (LLNA) is used for the identification of chemicals that have the potential to cause skin sensitization. However, it requires specific facilities and handling procedures to accommodate a radioisotopic (RI) endpoint. We have developed a non-radioisotopic (non-RI) endpoint for the LLNA based on BrdU incorporation to avoid the use of RI. Although this alternative method appears viable in principle, it is somewhat less sensitive than the standard assay. In this study, we report investigations of the use of statistical analysis to improve the sensitivity of a non-RI LLNA procedure with alpha-hexylcinnamic aldehyde (HCA) in two separate experiments. The alternative non-RI method required HCA concentrations of greater than 25% to elicit a positive response based on the criterion for classification as a skin sensitizer in the standard LLNA. Nevertheless, dose responses to HCA in the alternative method were consistent in both experiments, and we examined whether an endpoint based upon the statistical significance of induced changes in LNC turnover, rather than a stimulation index (SI) of 3 or greater, might provide additional sensitivity. The results reported here demonstrate that, with HCA at least, significant responses were recorded in each of two experiments following exposure of mice to 25% HCA. These data suggest that this approach may be more satisfactory, at least when BrdU incorporation is measured. However, this modification of the LLNA remains less sensitive than the standard method even when employing a statistical endpoint. Taken together, the data reported here suggest that a modified LLNA in which BrdU is used in place of radioisotope incorporation shows some promise, but that in its present form, even with the use of a statistical endpoint, it lacks some of the sensitivity of the standard method. The challenge is to develop strategies for further refinement of this approach.

  18. 76 FR 44960 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Report on...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-27

    ... for OMB Review; Comment Request; Report on Current Employment Statistics ACTION: Notice. SUMMARY: The Department of Labor (DOL) is submitting the revised Bureau of Labor Statistics (BLS) sponsored information collection request (ICR) titled, ``Report on Current Employment Statistics,'' to the Office of Management and...

  19. Evaluation of Methods Used for Estimating Selected Streamflow Statistics, and Flood Frequency and Magnitude, for Small Basins in North Coastal California

    USGS Publications Warehouse

    Mann, Michael P.; Rizzardo, Jule; Satkowski, Richard

    2004-01-01

    Accurate streamflow statistics are essential to water resource agencies involved in both science and decision-making. When long-term streamflow data are lacking at a site, estimation techniques are often employed to generate streamflow statistics. However, procedures for accurately estimating streamflow statistics often are lacking. When estimation procedures are developed, they often are not evaluated properly before being applied. Use of unevaluated or underevaluated flow-statistic estimation techniques can result in improper water-resources decision-making. The California State Water Resources Control Board (SWRCB) uses two key techniques, a modified rational equation and drainage basin area-ratio transfer, to estimate streamflow statistics at ungaged locations. These techniques have been implemented to varying degrees, but have not been formally evaluated. For estimating peak flows at the 2-, 5-, 10-, 25-, 50-, and 100-year recurrence intervals, the SWRCB uses the U.S. Geological Survey's (USGS) regional peak-flow equations. In this study, done cooperatively by the USGS and SWRCB, the SWRCB estimated several flow statistics at 40 USGS streamflow gaging stations in the north coast region of California. The SWRCB estimates were made without reference to USGS flow data. The USGS used the streamflow data provided by the 40 stations to generate flow statistics that could be compared with SWRCB estimates for accuracy. While some SWRCB estimates compared favorably with USGS statistics, results were subject to varying degrees of error over the region. Flow-based estimation techniques generally performed better than rain-based methods, especially for estimation of December 15 to March 31 mean daily flows. The USGS peak-flow equations also performed well, but tended to underestimate peak flows. The USGS equations performed within reported error bounds, but will require updating in the future as peak-flow data sets grow larger.
Little correlation was discovered between estimation errors and geographic locations or various basin characteristics. However, for 25-percentile year mean-daily-flow estimates for December 15 to March 31, the greatest estimation errors were at east San Francisco Bay area stations with mean annual precipitation less than or equal to 30 inches, and estimated 2-year/24-hour rainfall intensity less than 3 inches.
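
    The drainage-basin area-ratio transfer evaluated in the study has a simple general form: flow at the ungaged site equals flow at the gaged site scaled by the area ratio raised to an exponent b (b = 1 is the plain proportional version). The numbers and the exponent below are illustrative, not the SWRCB's calibrated values.

```python
# Drainage-area-ratio transfer of a flow statistic from a gaged to an
# ungaged basin. Regional studies often fit an exponent b below 1.
def area_ratio_transfer(q_gaged, area_gaged, area_ungaged, b=1.0):
    return q_gaged * (area_ungaged / area_gaged) ** b

# Illustrative basins: gaged 120 mi^2 with mean daily flow 85 cfs;
# ungaged 45 mi^2; a hypothetical regional exponent b = 0.9.
est = area_ratio_transfer(85.0, 120.0, 45.0, b=0.9)
print(f"estimated flow at ungaged site: {est:.1f} cfs")
```

    Because the method assumes hydrologic similarity between basins, evaluations like this study's, comparing transferred estimates against actual gage records, are essential before the technique is applied regionally.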

  20. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    Technical Report BRL-TR-3245, by Malcolm S. Taylor and Barry A. Bodt. Subject terms: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. Surviving excerpt: "Any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on this data."
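
    The randomization (permutation) test at the heart of this report can be sketched directly: pool the two samples, repeatedly shuffle the group labels, and compare the observed mean difference with its permutation distribution. The values below are invented, not the report's ballistic data.

```python
import random
import statistics

random.seed(6)

# Illustrative two-group comparison (made-up measurements).
standard = [5.2, 4.8, 5.5, 5.1, 4.9, 5.3, 5.0, 5.4]
dynamic = [5.6, 5.8, 5.3, 5.9, 5.7, 5.5, 6.0, 5.4]
observed = statistics.mean(dynamic) - statistics.mean(standard)

# Randomization test: reshuffle group labels and recompute the difference.
pooled = standard + dynamic
B = 20000
count = 0
for _ in range(B):
    random.shuffle(pooled)
    diff = statistics.mean(pooled[8:]) - statistics.mean(pooled[:8])
    if abs(diff) >= abs(observed):
        count += 1
p_value = count / B
print(f"observed diff={observed:.3f}, permutation p={p_value:.4f}")
```

    The test makes no distributional assumption at all; its only premise is that under the null hypothesis the group labels are exchangeable.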

  1. 28 CFR 42.603 - Confidentiality.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Confidentiality. 42.603 Section 42.603 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal...

  2. The associations between perceived distributive, procedural, and interactional organizational justice, self-rated health and burnout.

    PubMed

    Liljegren, Mats; Ekberg, Kerstin

    2009-01-01

    The aim of the present study was to examine the cross-sectional and 2-year longitudinal associations between perceived organizational justice, self-rated health and burnout. The study used questionnaire data from 428 Swedish employment officers, analyzed with structural equation modelling (SEM). Two different models were tested: a global organizational justice model (with and without correlated measurement errors) and a differentiated (distributive, procedural and interactional organizational justice) justice model (with and without correlated measurement errors). The global justice model with autocorrelations had the most satisfactory goodness-of-fit indices. Global justice showed statistically significant (p < 0.01) positive associations between organizational justice and self-rated health, both cross-sectional (0.80 to 0.84) and longitudinal (0.76 to 0.82), and significant (p < 0.01) negative associations between organizational justice and burnout (cross-sectional: -0.85; longitudinal: -0.83 to -0.84). The global justice construct showed better goodness-of-fit indices than the threefold justice construct, but a differentiated organizational justice concept can give valuable information about health-related risk factors: whether they are structural (distributive justice), procedural (procedural justice) or inter-personal (interactional justice). The two approaches to studying organizational justice should therefore be regarded as complementary rather than exclusive.

  3. Using a Five-Step Procedure for Inferential Statistical Analyses

    ERIC Educational Resources Information Center

    Kamin, Lawrence F.

    2010-01-01

    Many statistics texts pose inferential statistical problems in a disjointed way. By using a simple five-step procedure as a template for statistical inference problems, the student can solve problems in an organized fashion. The problem and its solution will thus be a stand-by-itself organic whole and a single unit of thought and effort. The…

  4. Summary Statistics of Public TV Licensees, 1972.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Statistics in the areas of finance, employment, broadcast and production for public TV licensees in 1972 are given in this report. Tables in the area of finance are presented specifying total funds, income, direct operating costs, and capital expenditures. Employment is divided into all employment with subdivisions for full- and part-time employees…

  5. Some Aspects of Part-Time Work.

    ERIC Educational Resources Information Center

    Australian Dept. of Labour and National Service, Melbourne. Women's Bureau.

    Of major importance to many married women seeking employment in Australia is the availability of part-time work. To describe the economic aspects of part-time employment for women, a review was made of statistics published by the Commonwealth Bureau of Census and Statistics and of research on part-time employment in overseas countries, and a…

  6. Transition of Higher Education Graduates to the Labour Market: Are Employment Procedures More Meritocratic in the Public Sector?

    ERIC Educational Resources Information Center

    Berggren, Caroline

    2011-01-01

    As an employer, the public sector might be expected to be more meritocratic than the private sector, because of its democratic values and more transparent appointment procedures. In this context, meritocratic means that the employer considers only characteristics, such as degree and grades, relevant for the position in question. The individuals in…

  7. Comparison of Sample Size by Bootstrap and by Formulas Based on Normal Distribution Assumption.

    PubMed

    Wang, Zuozhen

    2018-01-01

    The bootstrapping technique is distribution-independent, which provides an indirect way to estimate the sample size for a clinical trial based on a relatively small sample. In this paper, sample size estimation for comparing two parallel-design arms on continuous data by a bootstrap procedure is presented for various test types (inequality, non-inferiority, superiority, and equivalence). Meanwhile, sample size calculation by mathematical formulas (under the normal distribution assumption) for the identical data is also carried out. The power difference between the two calculation methods is acceptably small for all the test types, showing that the bootstrap procedure is a credible technique for sample size estimation. After that, we compared the powers determined using the two methods on data that violate the normal distribution assumption. To accommodate the feature of the data, the nonparametric Wilcoxon test was applied to compare the two groups during bootstrap power estimation. As a result, the power estimated by the normal distribution-based formula is far larger than that estimated by bootstrap for each specific sample size per group. Hence, for this type of data, it is preferable that the bootstrap method be applied for sample size calculation at the outset, and that the same statistical method as will be used in the subsequent statistical analysis be employed for each bootstrap sample during bootstrap sample size estimation, provided historical data are available that are well representative of the population to which the proposed trial plans to extrapolate.
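
    The bootstrap power/sample-size procedure described can be sketched as: resample the pilot data at a candidate sample size, apply the planned test (here a Wilcoxon rank-sum with a normal approximation), and count rejections. The pilot values are invented, and ties are ignored for simplicity.

```python
import math
import random

random.seed(7)

# Made-up skewed pilot data standing in for historical trial outcomes.
pilot_a = [0.2, 0.3, 0.3, 0.5, 0.8, 1.1, 1.9, 3.2, 0.4, 0.6]
pilot_b = [0.6, 0.9, 1.2, 1.4, 2.2, 3.5, 4.8, 1.0, 0.7, 2.9]

def rank_sum_p(x, y):
    # Wilcoxon rank-sum via the Mann-Whitney U statistic with a normal
    # approximation (tie correction omitted in this sketch).
    n, m = len(x), len(y)
    u = sum(1 for xi in x for yj in y if xi < yj) + \
        0.5 * sum(1 for xi in x for yj in y if xi == yj)
    mu, sd = n * m / 2, math.sqrt(n * m * (n + m + 1) / 12)
    z = (u - mu) / sd
    return math.erfc(abs(z) / math.sqrt(2))

def bootstrap_power(n_per_arm, sims=1000, alpha=0.05):
    # Resample the pilot data at the candidate sample size and apply the
    # same test planned for the final analysis.
    hits = 0
    for _ in range(sims):
        x = random.choices(pilot_a, k=n_per_arm)
        y = random.choices(pilot_b, k=n_per_arm)
        if rank_sum_p(x, y) < alpha:
            hits += 1
    return hits / sims

for n in (20, 40, 60):
    print(f"n={n} per arm: estimated power ~ {bootstrap_power(n):.2f}")
```

    The smallest n whose estimated power clears the target (say 80%) is the bootstrap sample-size recommendation; because the test inside the loop matches the planned analysis, the estimate reflects the data's actual skewness.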

  8. 29 CFR 1621.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-THE EQUAL PAY ACT § 1621.1 Purpose. The regulations set forth in this part contain the procedures established by the Equal Employment Opportunity Commission for issuing opinion letters under the Equal Pay Act. ...

  9. A review of the application of nonattenuating frequency radars for estimating rain attenuation and space-diversity performance

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1979-01-01

    Cumulative rain fade statistics are used by space communications engineers to establish transmitter power and receiver sensitivities for systems operating under various geometries, climates, and radio frequencies. Space-diversity performance criteria are also of interest. This work reviews the many elements involved in employing single nonattenuating-frequency radars to arrive at the desired information. The elements examined include radar techniques and requirements, phenomenological assumptions, path-attenuation formulations and procedures, as well as error budgeting and calibration analysis. Included are the pertinent results of previous investigators who have used radar for rain-attenuation modeling. Suggestions are made for improving present methods.

  10. DynaMIT: the dynamic motif integration toolkit

    PubMed Central

    Dassi, Erik; Quattrone, Alessandro

    2016-01-01

    De-novo motif search is a frequently applied bioinformatics procedure to identify and prioritize recurrent elements in sequence sets for biological investigation, such as those derived from high-throughput differential expression experiments. Several algorithms have been developed to perform motif search, employing widely different approaches and often giving divergent results. To maximize the power of these investigations and ultimately be able to draft solid biological hypotheses, multiple tools need to be applied to the same sequences and the obtained results merged. However, motif reporting formats and statistical evaluation methods currently make such an integration task difficult to perform and mostly restricted to specific scenarios. We thus introduce here the Dynamic Motif Integration Toolkit (DynaMIT), an extremely flexible platform that identifies motifs using multiple algorithms, integrates them by means of a user-selected strategy, and visualizes the results in several ways; furthermore, the platform is user-extensible in all its aspects. DynaMIT is freely available at http://cibioltg.bitbucket.org. PMID:26253738
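    A toy example of one possible integration strategy, intersecting occurrence intervals reported by two hypothetical motif finders; DynaMIT's actual strategies and reporting formats are richer than this:

    ```python
    def intersect_hits(hits_a, hits_b):
        """Keep only regions where occurrences from two motif finders overlap.

        Each hit is a (start, end) half-open interval on the same sequence.
        Returns the overlapping sub-intervals, a simple "intersection"
        integration strategy.
        """
        merged = []
        for a_start, a_end in sorted(hits_a):
            for b_start, b_end in sorted(hits_b):
                start, end = max(a_start, b_start), min(a_end, b_end)
                if start < end:
                    merged.append((start, end))
        return merged

    # Hypothetical hit lists from two tools on the same sequence.
    tool_a = [(10, 18), (40, 48), (70, 78)]
    tool_b = [(12, 20), (44, 52)]
    print(intersect_hits(tool_a, tool_b))  # → [(12, 18), (44, 48)]
    ```
    
    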

  11. 40 CFR 1065.12 - Approval of alternate procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engine meets all applicable emission standards according to specified procedures. (iii) Use statistical.... (e) We may give you specific directions regarding methods for statistical analysis, or we may approve... statistical tests. Perform the tests as follows: (1) Repeat measurements for all applicable duty cycles at...

  12. Analysis of half diallel mating designs I: a practical analysis procedure for ANOVA approximation.

    Treesearch

    G.R. Johnson; J.N. King

    1998-01-01

    Procedures to analyze half-diallel mating designs using the SAS statistical package are presented. The procedure requires two runs of PROC VARCOMP and results in estimates of additive and non-additive genetic variation. The procedures described can be modified to work with most statistical software packages that can compute variance component estimates. The…

  13. Characterization of the olfactory impact around a wastewater treatment plant: optimization and validation of a hydrogen sulfide determination procedure based on passive diffusion sampling.

    PubMed

    Colomer, Fernando Llavador; Espinós-Morató, Héctor; Iglesias, Enrique Mantilla; Pérez, Tatiana Gómez; Campos-Candel, Andreu; Lozano, Caterina Coll

    2012-08-01

    A monitoring program based on an indirect method was conducted to assess the approximation of the olfactory impact in several wastewater treatment plants (in the present work, only one is shown). The method uses H2S passive sampling with Palmes-type diffusion tubes impregnated with silver nitrate and fluorometric analysis employing fluorescein mercuric acetate. The analytical procedure was validated in the exposure chamber. Exposure periods of at least 4 days are recommended. The quantification limit of the procedure is 0.61 ppb for a 5-day sampling, which allows the H2S immission (ground concentration) level to be measured within its low odor threshold, from 0.5 to 300 ppb. Experimental results suggest an exposure time greater than 4 days, while the recovery efficiency of the procedure, 93.0 ± 1.8%, seems not to depend on the amount of H2S collected by the samplers within their application range. The repeatability, expressed as relative standard deviation, is lower than 7%, which is within the limits normally accepted for this type of sampler. Statistical comparison showed that this procedure and the reference method provide analogous accuracy. The proposed procedure was applied in two experimental campaigns, one intensive and the other extensive, and concentrations within the H2S low odor threshold were quantified at each sampling point. From these results, it can be concluded that the procedure shows good potential for monitoring the olfactory impact around facilities where H2S emissions are dominant.

  15. Robustness of methods for blinded sample size re-estimation with overdispersed count data.

    PubMed

    Schneider, Simon; Schmidli, Heinz; Friede, Tim

    2013-09-20

    Counts of events are increasingly common as primary endpoints in randomized clinical trials. With between-patient heterogeneity leading to variances in excess of the mean (referred to as overdispersion), statistical models reflecting this heterogeneity by mixtures of Poisson distributions are frequently employed. Sample size calculation in the planning of such trials requires knowledge of the nuisance parameters, that is, the control (or overall) event rate and the overdispersion parameter. Usually, there is little prior knowledge regarding these parameters in the design phase, resulting in considerable uncertainty regarding the sample size. In this situation internal pilot studies have been found very useful, and recently several blinded procedures for sample size re-estimation have been proposed for overdispersed count data, one of which is based on an EM algorithm. In this paper we investigate aspects of the implementation of the EM-algorithm-based procedure by studying its dependence on the choice of convergence criterion, and find that the procedure is sensitive to the choice of the stopping criterion in scenarios relevant to clinical practice. We also compare the EM-based procedure to other competing procedures regarding operating characteristics such as sample size distribution and power. Furthermore, the robustness of these procedures to deviations from the model assumptions is explored. We find that some of the procedures are robust to at least moderate deviations. The results are illustrated using data from the US National Heart, Lung and Blood Institute sponsored Asymptomatic Cardiac Ischemia Pilot study. Copyright © 2013 John Wiley & Sons, Ltd.
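    To illustrate the kind of convergence-criterion sensitivity discussed, the following toy EM fit for a two-component Poisson mixture (not the paper's blinded re-estimation procedure) exposes the stopping tolerance as a parameter; loosening `tol` stops the algorithm earlier and changes the estimates:

    ```python
    import math
    import random

    def poisson_pmf(k, lam):
        return math.exp(-lam) * lam ** k / math.factorial(k)

    def em_poisson_mixture(data, lam1, lam2, w=0.5, tol=1e-6, max_iter=1000):
        """Fit a two-component Poisson mixture by EM.

        Iterates E- and M-steps until the component rates change by less
        than `tol`; the choice of this stopping criterion is exactly the
        implementation aspect whose sensitivity the paper studies.
        """
        for _ in range(max_iter):
            # E-step: responsibility of component 1 for each count.
            resp = []
            for k in data:
                p1 = w * poisson_pmf(k, lam1)
                p2 = (1 - w) * poisson_pmf(k, lam2)
                resp.append(p1 / (p1 + p2))
            # M-step: update mixing weight and rates.
            new_w = sum(resp) / len(data)
            new_lam1 = sum(r * k for r, k in zip(resp, data)) / sum(resp)
            new_lam2 = (sum((1 - r) * k for r, k in zip(resp, data))
                        / sum(1 - r for r in resp))
            if abs(new_lam1 - lam1) + abs(new_lam2 - lam2) < tol:
                return new_lam1, new_lam2, new_w
            lam1, lam2, w = new_lam1, new_lam2, new_w
        return lam1, lam2, w

    def sample_poisson(lam):
        # Knuth's method; adequate for small rates.
        l, k, p = math.exp(-lam), 0, 1.0
        while True:
            p *= random.random()
            if p <= l:
                return k
            k += 1

    # Synthetic overdispersed counts: 60/40 mixture of Poisson(2) and Poisson(8).
    random.seed(3)
    data = [sample_poisson(2.0 if random.random() < 0.6 else 8.0)
            for _ in range(400)]
    print(em_poisson_mixture(data, 1.0, 10.0))
    ```
    
    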

  16. Direct versus indirect revascularization procedures for moyamoya disease: a comparative effectiveness study.

    PubMed

    Macyszyn, Luke; Attiah, Mark; Ma, Tracy S; Ali, Zarina; Faught, Ryan; Hossain, Alisha; Man, Karen; Patel, Hiren; Sobota, Rosanna; Zager, Eric L; Stein, Sherman C

    2017-05-01

    OBJECTIVE Moyamoya disease (MMD) is a chronic cerebrovascular disease that can lead to devastating neurological outcomes. Surgical intervention is the definitive treatment, with direct, indirect, and combined revascularization procedures currently employed by surgeons. The optimal surgical approach, however, remains unclear. In this decision analysis, the authors compared the effectiveness of revascularization procedures in both adult and pediatric patients with MMD. METHODS A comprehensive literature search was performed for studies of MMD. Using complication and success rates from the literature, the authors constructed a decision analysis model for treatment using direct and indirect revascularization techniques. Utility values for the various outcomes and complications were extracted from the literature examining preferences in similar clinical conditions. Sensitivity analysis was performed. RESULTS A structured literature search yielded 33 studies involving 4197 cases. Cases were divided into adult and pediatric populations. These were further subdivided into 3 different treatment groups: indirect, direct, and combined revascularization procedures. In the pediatric population at 5- and 10-year follow-up, there was no significant difference between indirect and combination procedures, but both were superior to direct revascularization. In adults at 4-year follow-up, indirect was superior to direct revascularization. CONCLUSIONS In the absence of factors that dictate a specific approach, the present decision analysis suggests that direct revascularization procedures are inferior in terms of quality-adjusted life years in both adults at 4 years and children at 5 and 10 years postoperatively. These findings were statistically significant (p < 0.001 in all cases), suggesting that indirect and combination procedures may offer optimal results at long-term follow-up.
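    Decision-analysis models of this kind reduce each treatment branch to an expected utility; a minimal sketch with hypothetical probabilities and utilities (not the paper's values):

    ```python
    def expected_utility(outcomes):
        """Expected utility of a treatment branch in a simple decision tree.

        `outcomes` maps each outcome name to (probability, utility);
        probabilities must sum to 1.
        """
        total_p = sum(p for p, _ in outcomes.values())
        assert abs(total_p - 1.0) < 1e-9, "probabilities must sum to 1"
        return sum(p * u for p, u in outcomes.values())

    # Hypothetical branch probabilities and utilities (NOT the paper's data).
    indirect = {"good recovery": (0.85, 1.00),
                "stroke":        (0.10, 0.40),
                "death":         (0.05, 0.00)}
    direct   = {"good recovery": (0.80, 1.00),
                "stroke":        (0.13, 0.40),
                "death":         (0.07, 0.00)}

    print(round(expected_utility(indirect), 3))  # → 0.89
    print(round(expected_utility(direct), 3))    # → 0.852
    ```

    The branch with the higher expected utility is preferred; sensitivity analysis then varies the probabilities and utilities to see whether the preference flips.
    
    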

  17. 28 CFR 42.610 - Agency enforcement of unresolved complaints.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Agency enforcement of unresolved complaints. 42.610 Section 42.610 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed...

  18. 28 CFR 42.602 - Exchange of information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Exchange of information. 42.602 Section 42.602 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of...

  19. 28 CFR 42.607 - EEOC dismissals of complaints.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false EEOC dismissals of complaints. 42.607 Section 42.607 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against...

  20. 75 FR 9955 - Labor Surplus Area Classification Under Executive Orders 12073 and 10582

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-04

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Surplus Area Classification Under Executive Orders 12073 and 10582 AGENCY: Employment and Training Administration, Labor. ACTION: Notice... supplementary, eligibility, classification procedures and petition for exceptional circumstances procedure...

  1. Application of a faith-based integration tool to assess mental and physical health interventions

    PubMed Central

    Saunders, Donna M.; Leak, Jean; Carver, Monique E.; Smith, Selina A.

    2017-01-01

    Background To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Methods Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. Results The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Conclusions Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed. PMID:29354795

  2. Simulation-based hypothesis testing of high dimensional means under covariance heterogeneity.

    PubMed

    Chang, Jinyuan; Zheng, Chao; Zhou, Wen-Xin; Zhou, Wen

    2017-12-01

    In this article, we study the problem of testing the mean vectors of high dimensional data in both one-sample and two-sample cases. The proposed testing procedures employ maximum-type statistics and parametric bootstrap techniques to compute the critical values. Unlike existing tests that rely heavily on structural conditions on the unknown covariance matrices, the proposed tests allow general covariance structures of the data and therefore enjoy a wide scope of applicability in practice. To enhance the power of the tests against sparse alternatives, we further propose two-step procedures with a preliminary feature screening step. Theoretical properties of the proposed tests are investigated. Through extensive numerical experiments on synthetic data sets and a human acute lymphoblastic leukemia gene expression data set, we illustrate the performance of the new tests and how they may assist in detecting disease-associated gene sets. The proposed methods have been implemented in the R package HDtest and are available on CRAN. © 2017, The International Biometric Society.
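    The core of such a test can be sketched in a few lines: compute a maximum-type statistic over coordinates and compare it to a simulated critical value. For brevity this sketch simulates independent Gaussian coordinates under the null, rather than the paper's covariance-respecting parametric bootstrap:

    ```python
    import random
    import statistics

    def max_type_stat(sample):
        """T = max_j |sqrt(n) * mean_j / sd_j| over the p coordinates."""
        n, p = len(sample), len(sample[0])
        t = 0.0
        for j in range(p):
            col = [row[j] for row in sample]
            t = max(t, abs(n ** 0.5 * statistics.mean(col)
                           / statistics.stdev(col)))
        return t

    def simulated_critical_value(n, p, n_sim=300, alpha=0.05):
        """Approximate the null distribution of T with Gaussian noise.

        Independent N(0, 1) coordinates are a simplifying assumption; the
        paper's parametric bootstrap instead respects an estimated
        covariance structure.
        """
        stats = sorted(max_type_stat(
            [[random.gauss(0, 1) for _ in range(p)] for _ in range(n)])
            for _ in range(n_sim))
        return stats[int((1 - alpha) * n_sim)]

    random.seed(11)
    n, p = 50, 20
    # Data whose first coordinate has a shifted mean: H0 should be rejected.
    data = [[random.gauss(1.0 if j == 0 else 0.0, 1.0) for j in range(p)]
            for _ in range(n)]
    t_obs = max_type_stat(data)
    crit = simulated_critical_value(n, p)
    print(t_obs > crit)
    ```

    The maximum-type form is what gives power against sparse alternatives: a single strongly shifted coordinate is enough to push T past the critical value.
    
    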

  3. Auditory event perception: the source-perception loop for posture in human gait.

    PubMed

    Pastore, Richard E; Flint, Jesse D; Gaston, Jeremy R; Solomon, Matthew J

    2008-01-01

    There is a small but growing literature on the perception of natural acoustic events, but few attempts have been made to investigate complex sounds not systematically controlled within a laboratory setting. The present study investigates listeners' ability to make judgments about the posture (upright-stooped) of the walker who generated acoustic stimuli contrasted on each trial. We use a comprehensive three-stage approach to event perception, in which we develop a solid understanding of the source event and its sound properties, as well as the relationships between these two event stages. Developing this understanding helps both to identify the limitations of common statistical procedures and to develop effective new procedures for investigating not only the two information stages above, but also the decision strategies employed by listeners in making source judgments from sound. The result is a comprehensive, ultimately logical, but not necessarily expected picture of both the source-sound-perception loop and the utility of alternative research tools.

  4. AUGUSTO'S Sundial: Image-Based Modeling for Reverse Engineering Purposes

    NASA Astrophysics Data System (ADS)

    Baiocchi, V.; Barbarella, M.; Del Pizzo, S.; Giannone, F.; Troisi, S.; Piccaro, C.; Marcantonio, D.

    2017-02-01

    A photogrammetric survey of a unique archaeological site is reported in this paper. The survey was performed using both a panoramic image-based solution and a classical procedure. The panoramic image-based solution was carried out employing a commercial system: the Trimble V10 Imaging Rover (IR). This instrument is an integrated camera system that captures 360-degree digital panoramas, composed of 12 images, with a single push. The direct comparison of the point clouds obtained with the traditional photogrammetric procedure and with the V10 stations, using the same GCP coordinates, was carried out in CloudCompare, open-source software that can compare two point clouds and supply all the main statistical data. The site is a portion of the dial plate of the "Horologium Augusti", inaugurated in 9 B.C.E. in the area of Campo Marzio and still intact in the same position, in a cellar of a building in Rome, around 7 meters below the present ground level.
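    The cloud-to-cloud comparison described above reduces, at its core, to a nearest-neighbour distance statistic; a brute-force sketch on tiny synthetic clouds (real tools such as CloudCompare use spatial indexing, e.g. an octree, to make this fast):

    ```python
    import math

    def cloud_to_cloud_distance(reference, compared):
        """Mean nearest-neighbour distance from each compared point to the
        reference cloud, the basic statistic behind cloud comparison tools.

        Brute force O(n*m); production software uses a spatial index.
        """
        total = 0.0
        for p in compared:
            total += min(math.dist(p, q) for q in reference)
        return total / len(compared)

    # Two tiny synthetic clouds: the second is the first shifted 1 mm in z.
    ref = [(x * 0.1, y * 0.1, 0.0) for x in range(5) for y in range(5)]
    cmp_cloud = [(x, y, z + 0.001) for x, y, z in ref]
    print(cloud_to_cloud_distance(ref, cmp_cloud))  # → 0.001 (approximately)
    ```
    
    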

  5. Planning Risk-Based SQC Schedules for Bracketed Operation of Continuous Production Analyzers.

    PubMed

    Westgard, James O; Bayat, Hassan; Westgard, Sten A

    2018-02-01

    To minimize patient risk, "bracketed" statistical quality control (SQC) is recommended in the new CLSI guidelines for SQC (C24-Ed4). Bracketed SQC requires that a QC event both precedes and follows (brackets) a group of patient samples. In optimizing a QC schedule, the frequency of QC or run size becomes an important planning consideration to maintain quality and also facilitate responsive reporting of results from continuous operation of high production analytic systems. Different plans for optimizing a bracketed SQC schedule were investigated on the basis of Parvin's model for patient risk and CLSI C24-Ed4's recommendations for establishing QC schedules. A Sigma-metric run size nomogram was used to evaluate different QC schedules for processes of different sigma performance. For high Sigma performance, an effective SQC approach is to employ a multistage QC procedure utilizing a "startup" design at the beginning of production and a "monitor" design periodically throughout production. Example QC schedules are illustrated for applications with measurement procedures having 6-σ, 5-σ, and 4-σ performance. Continuous production analyzers that demonstrate high σ performance can be effectively controlled with multistage SQC designs that employ a startup QC event followed by periodic monitoring or bracketing QC events. Such designs can be optimized to minimize the risk of harm to patients. © 2017 American Association for Clinical Chemistry.
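    The Sigma-metric that underlies such QC planning is a simple ratio; a minimal sketch with hypothetical quality requirements (not values from the paper):

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma performance of a measurement procedure.

        Sigma = (TEa - |bias|) / CV, all in percent, where TEa is the
        allowable total error. Higher sigma means fewer QC events and
        smaller run sizes are needed to keep patient risk low.
        """
        return (tea_pct - abs(bias_pct)) / cv_pct

    # Hypothetical assay: allowable total error 10%, bias 1%, CV 1.5%.
    print(sigma_metric(10.0, 1.0, 1.5))  # → 6.0
    ```
    
    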

  6. 14 CFR 120.203 - General.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... applicable to employers by this subpart. (c) Employer responsibility. As an employer, you are responsible for... employees who perform safety-sensitive functions in aviation. (b) Alcohol testing procedures. Each employer...

  7. Monitoring Items in Real Time to Enhance CAT Security

    ERIC Educational Resources Information Center

    Zhang, Jinming; Li, Jie

    2016-01-01

    An IRT-based sequential procedure is developed to monitor items for enhancing test security. The procedure uses a series of statistical hypothesis tests to examine whether the statistical characteristics of each item under inspection have changed significantly during CAT administration. This procedure is compared with a previously developed…

  8. Optimization of the Hartmann-Shack microlens array

    NASA Astrophysics Data System (ADS)

    de Oliveira, Otávio Gomes; de Lima Monteiro, Davies William

    2011-04-01

    In this work we propose to optimize the microlens-array geometry for a Hartmann-Shack wavefront sensor. The optimization makes it possible to replace regular microlens arrays containing a larger number of microlenses with arrays of fewer microlenses located at optimal sampling positions, with no increase in the reconstruction error. The goal is to propose a straightforward and widely accessible numerical method to calculate an optimized microlens array for known aberration statistics. The optimization comprises the minimization of the wavefront reconstruction error and/or the number of necessary microlenses in the array. We numerically generate, sample, and reconstruct the wavefront, and use a genetic algorithm to discover the optimal array geometry. Within an ophthalmological context, as a case study, we demonstrate that an array with only 10 suitably located microlenses can produce reconstruction errors as small as those of a 36-microlens regular array. The same optimization procedure can be employed for any application where the wavefront statistics are known.
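    A minimal genetic-algorithm sketch in the same spirit: choose k sampling positions from a candidate grid, with a toy fitness that rewards even coverage of the interval instead of an actual wavefront-reconstruction error (the operators shown, tournament selection, one-point crossover, and bit-flip mutation, are generic GA ingredients, not the authors' specific design):

    ```python
    import random

    def fitness(mask, positions):
        """Toy objective: selected positions should cover [0, 1] evenly,
        so the largest gap between consecutive selections is penalised."""
        chosen = sorted(p for p, keep in zip(positions, mask) if keep)
        if len(chosen) < 2:
            return 0.0
        gaps = [b - a for a, b in zip([0.0] + chosen, chosen + [1.0])]
        return 1.0 - max(gaps)

    def genetic_search(positions, k, pop_size=40, generations=60):
        """Minimal GA with a repair step keeping exactly k positions on."""
        def random_mask():
            idx = set(random.sample(range(len(positions)), k))
            return [i in idx for i in range(len(positions))]

        def repair(mask):
            on = [i for i, b in enumerate(mask) if b]
            off = [i for i, b in enumerate(mask) if not b]
            random.shuffle(on)
            random.shuffle(off)
            for i in on[k:]:                      # too many: drop extras
                mask[i] = False
            for i in off[:max(0, k - len(on))]:   # too few: add some
                mask[i] = True
            return mask

        pop = [random_mask() for _ in range(pop_size)]
        for _ in range(generations):
            new_pop = []
            for _ in range(pop_size):
                # Tournament selection of two parents.
                a, b = (max(random.sample(pop, 3),
                            key=lambda m: fitness(m, positions))
                        for _ in range(2))
                cut = random.randrange(1, len(positions))
                child = a[:cut] + b[cut:]         # one-point crossover
                if random.random() < 0.2:         # bit-flip mutation
                    j = random.randrange(len(positions))
                    child[j] = not child[j]
                new_pop.append(repair(child))
            pop = new_pop
        return max(pop, key=lambda m: fitness(m, positions))

    random.seed(5)
    candidates = [i / 30 for i in range(30)]
    best = genetic_search(candidates, k=5)
    print(sorted(p for p, keep in zip(candidates, best) if keep))
    ```

    In the paper's setting, `fitness` would instead evaluate the wavefront-reconstruction error achieved with the candidate microlens layout.
    
    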

  9. Epidemiologic programs for computers and calculators. A microcomputer program for multiple logistic regression by unconditional and conditional maximum likelihood methods.

    PubMed

    Campos-Filho, N; Franco, E L

    1989-02-01

    A frequent procedure in matched case-control studies is to report results from the multivariate unmatched analyses if they do not differ substantially from the ones obtained after conditioning on the matching variables. Although conceptually simple, this rule requires that an extensive series of logistic regression models be evaluated by both the conditional and unconditional maximum likelihood methods. Most computer programs for logistic regression employ only one maximum likelihood method, which requires that the analyses be performed in separate steps. This paper describes a Pascal microcomputer (IBM PC) program that performs multiple logistic regression by both maximum likelihood estimation methods, which obviates the need for switching between programs to obtain relative risk estimates from both matched and unmatched analyses. The program calculates most standard statistics and allows factoring of categorical or continuous variables by two distinct methods of contrast. A built-in, descriptive statistics option allows the user to inspect the distribution of cases and controls across categories of any given variable.
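    A minimal sketch of the unconditional maximum likelihood fit mentioned above, for a single predictor with intercept, using hand-coded Newton-Raphson (illustrative only; the original program also implements the conditional method for matched data):

    ```python
    import math

    def fit_logistic(xs, ys, n_iter=25):
        """Unconditional ML for logit P(y=1) = b0 + b1*x, fitted by
        Newton-Raphson with a hand-coded 2x2 Hessian inverse."""
        b0 = b1 = 0.0
        for _ in range(n_iter):
            g0 = g1 = h00 = h01 = h11 = 0.0
            for x, y in zip(xs, ys):
                p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
                w = p * (1.0 - p)
                g0 += y - p          # gradient of the log-likelihood
                g1 += (y - p) * x
                h00 += w             # observed information entries
                h01 += w * x
                h11 += w * x * x
            det = h00 * h11 - h01 * h01
            b0 += (h11 * g0 - h01 * g1) / det
            b1 += (h00 * g1 - h01 * g0) / det
        return b0, b1

    # Deterministic toy data: risk increases with exposure level x.
    xs = [0, 0, 0, 1, 1, 1, 2, 2, 2, 3, 3, 3]
    ys = [0, 0, 1, 0, 1, 1, 1, 1, 0, 1, 1, 1]
    b0, b1 = fit_logistic(xs, ys)
    print("odds ratio per unit of x ≈", round(math.exp(b1), 2))
    ```
    
    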

  10. Simulation of parametric model towards the fixed covariate of right censored lung cancer data

    NASA Astrophysics Data System (ADS)

    Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila

    2017-09-01

    In this study, a simulation procedure was applied to measure the effect of a fixed covariate on right-censored data using a parametric survival model. The scale and shape parameters were varied to differentiate the analysis of the parametric regression survival model. Biases, mean biases, and coverage probabilities were used in this analysis. Different sample sizes of 50, 100, 150, and 200 were employed to distinguish the impact of the parametric regression model on right-censored data. The R statistical software was used to develop the simulation code for right-censored data. The final simulated right-censored model was then compared with right-censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right-censored data, and that the Weibull regression survival model provides a suitable fit for the survival data of lung cancer patients in Malaysia.
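    A sketch of the kind of simulation described: drawing Weibull survival times and right-censoring them at a fixed follow-up, across the same sequence of sample sizes (the shape, scale, and censoring values here are hypothetical, and the language is Python rather than R):

    ```python
    import random
    import statistics

    def simulate_censored_weibull(n, shape, scale, censor_time):
        """Draw n Weibull survival times and right-censor them at a fixed
        follow-up time; returns (observed_times, event_indicators)."""
        times, events = [], []
        for _ in range(n):
            t = random.weibullvariate(scale, shape)  # (alpha=scale, beta=shape)
            times.append(min(t, censor_time))
            events.append(t <= censor_time)
        return times, events

    random.seed(42)
    for n in (50, 100, 150, 200):
        times, events = simulate_censored_weibull(n, shape=1.5, scale=10.0,
                                                  censor_time=12.0)
        censored = 1 - sum(events) / n
        # The naive mean of observed times underestimates true survival
        # because censored subjects contribute only their follow-up time.
        print(n, round(censored, 2), round(statistics.mean(times), 2))
    ```

    A proper analysis would fit a Weibull regression to `(times, events)` and record the bias and coverage of the covariate estimate over repeated replications.
    
    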

  11. Statistical Cost Estimation in Higher Education: Some Alternatives.

    ERIC Educational Resources Information Center

    Brinkman, Paul T.; Niwa, Shelley

    Recent developments in econometrics that are relevant to the task of estimating costs in higher education are reviewed. The relative effectiveness of alternative statistical procedures for estimating costs are also tested. Statistical cost estimation involves three basic parts: a model, a data set, and an estimation procedure. Actual data are used…

  12. 28 CFR 42.604 - Standards for investigation, reviews and hearings.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Standards for investigation, reviews and hearings. 42.604 Section 42.604 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed...

  13. 12 CFR 352.10 - Compliance procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... NONDISCRIMINATION ON THE BASIS OF DISABILITY § 352.10 Compliance procedures. (a) Applicability. Paragraph (b) of... disability discrimination in FDIC programs or activities and denial of technology access. (b) Employment complaints. The FDIC shall process complaints alleging employment discrimination on the basis of disability...

  14. Implications of the New EEOC Guidelines.

    ERIC Educational Resources Information Center

    Dhanens, Thomas P.

    1979-01-01

    In the next few years employers will frequently be confronted with the fact that they cannot rely on undocumented, subjective selection procedures. As long as disparate impact exists in employee selection, employers will be required to validate whatever selection procedures they use. (Author/IRT)

  15. Employing Introductory Statistics Students at "Stats Dairy"

    ERIC Educational Resources Information Center

    Keeling, Kellie

    2011-01-01

    To combat students' fear of statistics I employ my students at a fictional company, Stats Dairy, run by cows. Almost all examples used in the class notes, exercises, humour and exams use data "collected" from this company.

  16. Accurate mass measurement: terminology and treatment of data.

    PubMed

    Brenton, A Gareth; Godfrey, A Ruth

    2010-11-01

    High-resolution mass spectrometry has become ever more accessible with improvements in instrumentation, such as modern FT-ICR and Orbitrap mass spectrometers. This has resulted in an increase in the number of articles submitted for publication quoting accurate mass data. There is a plethora of terms related to accurate mass analysis that are in current usage, many employed incorrectly or inconsistently. This article is based on a set of notes prepared by the authors for research students and staff in our laboratories as a guide to the correct terminology and basic statistical procedures to apply in relation to mass measurement, particularly for accurate mass measurement. It elaborates on the editorial by Gross in 1994 regarding the use of accurate masses for structure confirmation. We have presented and defined the main terms in use with reference to the International Union of Pure and Applied Chemistry (IUPAC) recommendations for nomenclature and symbolism for mass spectrometry. The correct use of statistics and treatment of data is illustrated as a guide to new and existing mass spectrometry users with a series of examples as well as statistical methods to compare different experimental methods and datasets. Copyright © 2010. Published by Elsevier Inc.
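    The central quantity, the relative mass error in parts per million, is straightforward to compute; a minimal worked example using the protonated caffeine ion:

    ```python
    def mass_error_ppm(measured, theoretical):
        """Relative mass measurement error in parts per million (ppm)."""
        return (measured - theoretical) / theoretical * 1e6

    # Protonated caffeine, [M+H]+ with theoretical m/z 195.0877.
    print(round(mass_error_ppm(195.0884, 195.0877), 2))  # → 3.59
    ```
    
    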

  17. QSAR models for thiophene and imidazopyridine derivatives inhibitors of the Polo-Like Kinase 1.

    PubMed

    Comelli, Nieves C; Duchowicz, Pablo R; Castro, Eduardo A

    2014-10-01

    The inhibitory activity of 103 thiophene and 33 imidazopyridine derivatives against Polo-Like Kinase 1 (PLK1), expressed as pIC50 (-log IC50), was predicted by QSAR modeling. Multivariate linear regression (MLR) was employed to model the relationship between 0D and 3D molecular descriptors and the biological activities of the molecules, using the replacement method (MR) as the variable selection tool. The 136 compounds were separated into several training and test sets. Two splitting approaches, based on the distribution of biological data and on structural diversity, and the statistical experimental design procedure D-optimal distance were applied to the dataset. The significance of the training set models was confirmed by statistically higher values of the internal leave-one-out cross-validated coefficient of determination (Q2) and the external predictive coefficient of determination for the test set (Rtest2). The model developed from a training set obtained with the D-optimal distance protocol, using the 3D descriptor space along with activity values, separated chemical features in a way that distinguished high and low pIC50 values reasonably well. We then verified that this model was sufficient to reliably and accurately predict the activity of external diverse structures. The model's robustness was characterized by means of standard procedures, and its applicability domain (AD) was analyzed by the leverage method. Copyright © 2014 Elsevier B.V. All rights reserved.

  18. 28 CFR 42.608 - Agency action on complaints dismissed by EEOC.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Agency action on complaints dismissed by EEOC. 42.608 Section 42.608 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed...

  19. Recording and reproduction of microwave holograms using a scanning procedure and their subsequent optical processing

    NASA Technical Reports Server (NTRS)

    Hetsch, J.

    1983-01-01

    Intensity distributions in nonoptical wave fields can be visualized and stored on photosensitive material. In the case of microwaves, temperature effects can be utilized with the aid of liquid crystals to visualize intensity distributions. A scanning procedure in which a microcomputer is employed to control a probe and store the measured data offers particular advantages for the study of intensity distributions in microwave fields. The present investigation is concerned with the employment of such a scanning procedure for the recording and reproduction of microwave holograms. The scanning procedure makes use of an approach discussed by Farhat et al. (1973). An eight-bit microprocessor with 64 kBytes of RAM is employed together with a diskette storage system.

  20. Introducing Statistical Inference to Biology Students through Bootstrapping and Randomization

    ERIC Educational Resources Information Center

    Lock, Robin H.; Lock, Patti Frazer

    2008-01-01

    Bootstrap methods and randomization tests are increasingly being used as alternatives to standard statistical procedures in biology. They also serve as an effective introduction to the key ideas of statistical inference in introductory courses for biology students. We discuss the use of such simulation-based procedures in an integrated curriculum…
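The core idea of the bootstrap, resampling the observed data with replacement to approximate a statistic's sampling distribution, fits in a few lines. A minimal percentile-bootstrap sketch with hypothetical measurements:

```python
import numpy as np

def bootstrap_ci(data, stat=np.mean, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for a sample statistic."""
    rng = np.random.default_rng(seed)
    boots = np.array([stat(rng.choice(data, size=len(data), replace=True))
                      for _ in range(n_boot)])
    lo, hi = np.percentile(boots, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return lo, hi

# hypothetical body-length measurements (cm)
sample = np.array([4.1, 3.8, 4.5, 4.0, 4.3, 3.9, 4.2, 4.4, 3.7, 4.6])
lo, hi = bootstrap_ci(sample)
print(f"95% CI for the mean: ({lo:.2f}, {hi:.2f})")
```

A randomization test follows the same pattern, except group labels are shuffled rather than observations resampled.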

  1. Efficiency Analysis: Enhancing the Statistical and Evaluative Power of the Regression-Discontinuity Design.

    ERIC Educational Resources Information Center

    Madhere, Serge

    An analytic procedure, efficiency analysis, is proposed for improving the utility of quantitative program evaluation for decision making. The three features of the procedure are explained: (1) for statistical control, it adopts and extends the regression-discontinuity design; (2) for statistical inferences, it de-emphasizes hypothesis testing in…

  2. 28 CFR 42.602 - Exchange of information.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Exchange of information. 42.602 Section 42.602 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Procedures for Complaints of Employment Discrimination Filed Against Recipients of Federal Financial Assistance § 42.602...

  3. 29 CFR 1607.1 - Statement of purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... incorporate a single set of principles which are designed to assist employers, labor organizations, employment... are designed to provide a framework for determining the proper use of tests and other selection procedures. These guidelines do not require a user to conduct validity studies of selection procedures where...

  4. 29 CFR 1607.1 - Statement of purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... incorporate a single set of principles which are designed to assist employers, labor organizations, employment... are designed to provide a framework for determining the proper use of tests and other selection procedures. These guidelines do not require a user to conduct validity studies of selection procedures where...

  5. 29 CFR 1607.1 - Statement of purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... incorporate a single set of principles which are designed to assist employers, labor organizations, employment... are designed to provide a framework for determining the proper use of tests and other selection procedures. These guidelines do not require a user to conduct validity studies of selection procedures where...

  6. 29 CFR 1607.1 - Statement of purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... incorporate a single set of principles which are designed to assist employers, labor organizations, employment... are designed to provide a framework for determining the proper use of tests and other selection procedures. These guidelines do not require a user to conduct validity studies of selection procedures where...

  7. 29 CFR 1607.1 - Statement of purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... incorporate a single set of principles which are designed to assist employers, labor organizations, employment... are designed to provide a framework for determining the proper use of tests and other selection procedures. These guidelines do not require a user to conduct validity studies of selection procedures where...

  8. Instructional Aides: Employment, Payroll Procedures, Supervision, Performance Appraisal, Legal Aspects.

    ERIC Educational Resources Information Center

    Nielsen, Earl T.

    Designed to assist school administrators in their efforts to secure, train, and retain the most qualified instructional aides available, the monograph discusses procedures for employment, payroll processing, aide supervision, performance appraisal, and legal aspects involved in the hiring of instructional aides. Specific topics include…

  9. 12 CFR 313.99 - Prohibited actions by employer.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 12 Banks and Banking 4 2010-01-01 2010-01-01 false Prohibited actions by employer. 313.99 Section 313.99 Banks and Banking FEDERAL DEPOSIT INSURANCE CORPORATION PROCEDURE AND RULES OF PRACTICE PROCEDURES FOR CORPORATE DEBT COLLECTION Administrative Wage Garnishment § 313.99 Prohibited actions by...

  10. Evaluating measurement models in clinical research: covariance structure analysis of latent variable models of self-conception.

    PubMed

    Hoyle, R H

    1991-02-01

    Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.

  11. Consequences of common data analysis inaccuracies in CNS trauma injury basic research.

    PubMed

    Burke, Darlene A; Whittemore, Scott R; Magnuson, David S K

    2013-05-15

    The development of successful treatments for humans after traumatic brain or spinal cord injuries (TBI and SCI, respectively) requires animal research. This effort can be hampered when promising experimental results cannot be replicated because of incorrect data analysis procedures. To identify and hopefully avoid these errors in future studies, the articles in seven journals with the highest number of basic science central nervous system TBI and SCI animal research studies published in 2010 (N=125 articles) were reviewed for their data analysis procedures. After identifying the most common statistical errors, the implications of those findings were demonstrated by reanalyzing previously published data from our laboratories using the identified inappropriate statistical procedures, then comparing the two sets of results. Overall, 70% of the articles contained at least one type of inappropriate statistical procedure. The highest percentage involved incorrect post hoc t-tests (56.4%), followed by inappropriate parametric statistics (analysis of variance and t-test; 37.6%). Repeated-measures analysis was inappropriately missing in 52.0% of all articles and, among those with behavioral assessments, 58% were analyzed incorrectly. Reanalysis of our published data using the most common inappropriate statistical procedures resulted in a 14.1% average increase in significant effects compared to the original results. Specifically, an increase of 15.5% occurred with independent t-tests and 11.1% after incorrect post hoc t-tests. Utilizing proper statistical procedures can allow more definitive conclusions, facilitate replicability of research results, and enable more accurate translation of those results to the clinic.
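The most common error identified above, unprotected post hoc t-tests, is avoided by running an omnibus test first and correcting the pairwise comparisons for multiplicity. A sketch with synthetic group data (the group names and values are hypothetical; Bonferroni is used here for simplicity, though other corrections exist):

```python
import numpy as np
from itertools import combinations
from scipy import stats

rng = np.random.default_rng(1)
# hypothetical behavioral scores for three groups of 12 animals each
groups = {
    "sham":    rng.normal(21, 2, 12),
    "injury":  rng.normal(8, 2, 12),
    "treated": rng.normal(12, 2, 12),
}

# Step 1: omnibus one-way ANOVA before any pairwise testing
f, p = stats.f_oneway(*groups.values())
print(f"ANOVA: F={f:.2f}, p={p:.4f}")

# Step 2: pairwise t-tests only if the omnibus test is significant,
# with Bonferroni correction for the number of comparisons
if p < 0.05:
    pairs = list(combinations(groups, 2))
    for a, b in pairs:
        t, p_pair = stats.ttest_ind(groups[a], groups[b])
        p_adj = min(1.0, p_pair * len(pairs))  # Bonferroni adjustment
        print(f"{a} vs {b}: adjusted p={p_adj:.4f}")
```

Running uncorrected t-tests directly on all pairs is exactly the procedure the review flags as inflating significant effects.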

  12. 4 CFR 28.13 - Special procedure for Workforce Restructuring Action.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ....13 Section 28.13 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures § 28.13 Special procedure for Workforce Restructuring...

  13. Correlation between the Beck Depression Inventory and bariatric surgical procedures.

    PubMed

    Ayloo, Subhashini; Thompson, Kara; Choudhury, Nabajit; Sheriffdeen, Raiyah

    2015-01-01

    The Beck Depression Inventory (BDI) is a psychosocial screen for depression in obese patients seeking bariatric surgery. Gastric bypass improves postsurgical BDI scores due to weight loss, which predicts future weight loss. The effect of different bariatric procedures with differences in weight loss on BDI scores is unknown. To evaluate the relationship between different bariatric procedures and changes in the BDI scores, adjusting for the initial BDI score, and to consider the impact of psychosocial variables. The secondary objective was to assess the relationship between changes in BDI scores and weight loss at 6 to 12 months. University Hospital, United States. Bariatric surgical patients were prospectively enrolled and retrospectively reviewed. We assessed changes in BDI after adjusting for the presurgical BDI and analyzed the relationship between patient demographic characteristics/psychological disorders and changes in BDI. We enrolled 137 patients who underwent a gastric band procedure, sleeve gastrectomy, or gastric bypass. We found a significant decrease in BMI and BDI scores across the full sample. Unlike BDI, change in BMI varied with procedure. Normalizing for baseline BDI, change in BDI did not significantly correlate with change in BMI. Patients who were employed and those without psychiatric history experienced even greater improvement in BDI scores. No statistically significant correlation was found between the change in BDI and weight loss at 6-12 months. BDI scores were independent of the type of bariatric procedure and the amount of weight loss. Advantageous psychosocial parameters were associated with greater improvement in BDI scores. Copyright © 2015 American Society for Bariatric Surgery. Published by Elsevier Inc. All rights reserved.

  14. Radial access for cerebrovascular procedures: Case report and technical note.

    PubMed

    Satti, Sudhakar R; Vance, Ansar Z; Sivapatham, Thinesh

    2016-04-01

    Advantages of radial access over brachial/axillary or femoral access have been well described for several decades and include decreased cost, patient preference, and fewer major access-site complications. Despite these advantages, radial access is rarely employed or even considered for neurointerventional procedures. This attitude should be reconsidered given several recent large, randomized, controlled trials from the cardiovascular literature demonstrating that radial access is associated with significantly lower costs and a decreased incidence of myocardial infarction, stroke, and even mortality. Radial access is now considered the standard of care for percutaneous coronary interventions in most US centers. Although radial access has been described for neurovascular procedures in the past, overall experience is limited. The two major challenges are the unique anatomy required to access the cerebral vasculature, given the very acute angles between the arm and craniocervical vessels, and limitations in available technology. We present a simplified approach to radial access for cerebrovascular procedures and provide a concise step-by-step guide for patient selection, ultrasound-guided single-wall access, recommended catheters/wires, and a review of patent hemostasis. Additionally, we present a complex cerebrovascular intervention in which standard femoral access was unsuccessful while radial access was quickly achieved, to highlight the importance of familiarity with the radial approach for all neurointerventionalists. We have found that the learning curve is not too steep and that the radial approach can be adopted smoothly for a large percentage of diagnostic and interventional neuroradiologic procedures. Radial access should be considered in all patients undergoing a cerebrovascular procedure. © The Author(s) 2015.

  15. Reduction in the incidence of shivering with perioperative dexmedetomidine: A randomized prospective study

    PubMed Central

    Bajwa, Sukhminder Jit Singh; Gupta, Sachin; Kaur, Jasbir; Singh, Amarjit; Parmar, SS

    2012-01-01

    Background and Aims: Shivering is distressing to the patient and discomforting to the attending anesthesiologist. Various drugs and regimens have been employed, with varying degrees of success, to abolish the occurrence of shivering. The present study aims to explore the effectiveness of dexmedetomidine in suppressing postanesthetic shivering in patients undergoing general anesthesia. Materials and Methods: The present study was carried out on 80 patients of American Society of Anesthesiologists (ASA) physical status I and II, aged 22–59 years, who underwent general anesthesia for laparoscopic surgical procedures. Patients were allocated randomly into two groups: group N (n = 40) and group D (n = 40). Group D was administered 1 μg/kg of dexmedetomidine intravenously, while group N received a similar volume of saline during the perioperative period. Cardiorespiratory parameters were observed and recorded during the preoperative, intraoperative, and postoperative periods. Any incidence of postoperative shivering was observed and recorded on a 4-point scale. Side effects were also observed, recorded, and treated symptomatically. Statistical analysis was carried out using the Statistical Package for the Social Sciences (SPSS) version 15.0 for Windows, employing ANOVA and the chi-square test with post-hoc comparisons with Bonferroni's correction. Results: The two groups were comparable regarding demographic profile (P > 0.05). The incidence of shivering in group N was 42.5%, which was statistically highly significant (P = 0.014). Heart rate and mean arterial pressure also showed clinically and statistically significant variation in group D patients during the postoperative period (P = 0.008 and 0.012). A high incidence of sedation (P = 0.000) and dry mouth (P = 0.000) was observed in group D, whereas the incidence of nausea and vomiting was higher in group N (P = 0.011 and 0.034). 
Conclusions: Dexmedetomidine seems to possess antishivering properties and was found to reduce the occurrence of shivering in patients undergoing general anesthesia. PMID:22345953
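The between-group comparison of shivering incidence described above is the kind of analysis a chi-square test on a 2x2 contingency table handles. A sketch using the reported group N incidence (42.5% of 40, i.e. 17 patients) and an assumed count for group D, which this abstract does not report:

```python
from scipy import stats

# 2x2 counts: [shivered, did not shiver] per group
# group N: 17/40 (the reported 42.5%); group D: 5/40 is an assumed figure
table = [[17, 23],   # group N (saline)
         [5, 35]]    # group D (dexmedetomidine), hypothetical counts
chi2, p, dof, expected = stats.chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.4f}")
```

`chi2_contingency` applies Yates' continuity correction by default for 2x2 tables, which is the conservative choice at these sample sizes.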

  16. Global aesthetic surgery statistics: a closer look.

    PubMed

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
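The two normalizations the study argues for, procedures per 100,000 population and procedures per surgeon, are simple ratios; the point is that rankings by absolute counts and by normalized demand can differ. A sketch with purely illustrative figures (not ISAPS data):

```python
# Hypothetical figures only: (annual procedures, population, surgeons)
countries = {
    "A": (1_400_000, 320_000_000, 6_900),
    "B": (1_300_000, 200_000_000, 5_500),
    "C": (980_000, 50_000_000, 2_300),
}

rates = {}
for name, (procs, pop, surgeons) in countries.items():
    rates[name] = (procs / pop * 100_000,  # procedures per 100,000 people
                   procs / surgeons)       # procedures per surgeon
    print(f"{name}: {rates[name][0]:.1f} per 100,000; "
          f"{rates[name][1]:.0f} per surgeon")

# ranking by absolute count vs. per-capita demand can differ substantially
by_total = max(countries, key=lambda k: countries[k][0])
by_capita = max(rates, key=lambda k: rates[k][0])
print(by_total, by_capita)  # here "A" leads in total but "C" per capita
```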

  17. 28 CFR 39.170 - Compliance procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... allegations of discrimination on the basis of handicap in programs or activities conducted by the agency. (b... employment according to the procedures established by the Equal Employment Opportunity Commission in 29 CFR...—(1) Who may file. (i) Any person who believes that he or she has been subjected to discrimination...

  18. 5 CFR 1208.13 - Content of appeal; request for hearing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Content of appeal; request for hearing... PROCEDURES PRACTICES AND PROCEDURES FOR APPEALS UNDER THE UNIFORMED SERVICES EMPLOYMENT AND REEMPLOYMENT RIGHTS ACT AND THE VETERANS EMPLOYMENT OPPORTUNITIES ACT USERRA Appeals § 1208.13 Content of appeal...

  19. 5 CFR 1208.23 - Content of appeal; request for hearing.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Content of appeal; request for hearing... PROCEDURES PRACTICES AND PROCEDURES FOR APPEALS UNDER THE UNIFORMED SERVICES EMPLOYMENT AND REEMPLOYMENT RIGHTS ACT AND THE VETERANS EMPLOYMENT OPPORTUNITIES ACT VEOA Appeals § 1208.23 Content of appeal...

  20. 20 CFR 422.105 - Presumption of authority of nonimmigrant alien to engage in employment.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 2 2010-04-01 2010-04-01 false Presumption of authority of nonimmigrant alien to engage in employment. 422.105 Section 422.105 Employees' Benefits SOCIAL SECURITY ADMINISTRATION ORGANIZATION AND PROCEDURES General Procedures § 422.105 Presumption of authority of nonimmigrant...

  1. 28 CFR 42.306 - Guidelines.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... denying equal employment opportunities to minority individuals and women. (b) Equal employment program... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines § 42.306 Guidelines. (a) Recipient agencies are...

  2. Space Shuttle Main Engine performance analysis

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    1993-01-01

    For a number of years, NASA has relied primarily upon periodically updated versions of Rocketdyne's power balance model (PBM) to provide space shuttle main engine (SSME) steady-state performance prediction. A recent computational study indicated that PBM predictions do not satisfy fundamental energy conservation principles. More recently, SSME test results provided by the Technology Test Bed (TTB) program have indicated significant discrepancies between PBM flow and temperature predictions and TTB observations. Results of these investigations have diminished confidence in the predictions provided by PBM, and motivated the development of new computational tools for supporting SSME performance analysis. A multivariate least squares regression algorithm was developed and implemented during this effort in order to efficiently characterize TTB data. This procedure, called the 'gains model,' was used to approximate the variation of SSME performance parameters such as flow rate, pressure, temperature, speed, and assorted hardware characteristics in terms of six assumed independent influences. These six influences were engine power level, mixture ratio, fuel inlet pressure and temperature, and oxidizer inlet pressure and temperature. A BFGS optimization algorithm provided the base procedure for determining regression coefficients for both linear and full quadratic approximations of parameter variation. Statistical information relative to data deviation from regression-derived relations was also computed. A new strategy for integrating test data with theoretical performance prediction was also investigated. The current integration procedure employed by PBM treats test data as pristine and adjusts hardware characteristics in a heuristic manner to achieve engine balance. Within PBM, this integration procedure is called 'data reduction.' 
By contrast, the new data integration procedure, termed 'reconciliation,' uses mathematical optimization techniques, and requires both measurement and balance uncertainty estimates. The reconciler attempts to select operational parameters that minimize the difference between theoretical prediction and observation. Selected values are further constrained to fall within measurement uncertainty limits and to satisfy fundamental physical relations (mass conservation, energy conservation, pressure drop relations, etc.) within uncertainty estimates for all SSME subsystems. The parameter selection problem described above is a traditional nonlinear programming problem. The reconciler employs a mixed penalty method to determine optimum values of SSME operating parameters associated with this problem formulation.
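The full quadratic regression underlying the 'gains model' can be illustrated with a direct least-squares solve. This is a sketch with two synthetic influences rather than the six named above, and a linear-algebra solve in place of the BFGS procedure the abstract describes:

```python
import numpy as np

def quadratic_design(X):
    """Build a full quadratic design matrix: 1, x_i, and x_i*x_j (i<=j)."""
    n, d = X.shape
    cols = [np.ones(n)] + [X[:, i] for i in range(d)]
    for i in range(d):
        for j in range(i, d):
            cols.append(X[:, i] * X[:, j])
    return np.column_stack(cols)

rng = np.random.default_rng(2)
# two stand-in influences (think: power level, mixture ratio), scaled to [-1, 1]
X = rng.uniform(-1, 1, size=(50, 2))
# a noise-free "performance parameter" with linear and interaction terms
y = 3.0 + 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 0] * X[:, 1]

A = quadratic_design(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
print("max abs error:", np.max(np.abs(pred - y)))  # ~0 for noise-free data
```

With real test data, the residuals of this fit supply exactly the kind of deviation statistics the abstract mentions.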

  3. Impact of digital impression techniques on the adaption of ceramic partial crowns in vitro.

    PubMed

    Schaefer, Oliver; Decker, Mike; Wittstock, Frank; Kuepper, Harald; Guentsch, Arndt

    2014-06-01

    To investigate the effects digital impression procedures can have on the three-dimensional fit of ceramic partial crowns in vitro. An acrylic model of a mandibular first molar was prepared to receive a partial-coverage all-ceramic crown (mesio-occlusal-distal inlay preparation with reduction of all cusps and a rounded shoulder finish line on the buccal wall). Digital impressions were taken using the iTero (ITE), cara TRIOS (TRI), CEREC AC with Bluecam (CBC), and Lava COS (COS) systems, before restorations were designed and machined from lithium disilicate blanks. Both the preparation and the restorations were digitised using an optical reference scanner. Data were entered into quality inspection software, which superimposed the records (best-fit algorithm), calculated fit discrepancies for every pixel, and colour-coded the results to aid visualisation. Furthermore, root-mean-square (RMS) deviations were computed and analysed statistically with a one-way ANOVA. Scheffé's procedure was applied for multiple comparisons (n=5, α=0.05). Mean marginal (internal) discrepancies were: ITE 90 (92) μm, TRI 128 (106) μm, CBC 146 (84) μm, and COS 109 (93) μm. Differences among impression systems were statistically significant at p<0.001 (p=0.039). Qualitatively, partial crowns were undersized especially around cusp tips and the occluso-approximal isthmus. By contrast, potential high spots could be detected along the preparation finish line and at central occlusal boxes. Marginal and internal fit of milled lithium disilicate partial crowns depended on the digital impression technique employed. The investigated digital impression procedures demonstrated significant fit discrepancies. However, all fabricated restorations showed acceptable marginal and internal gap sizes when considering clinically relevant thresholds reported in the literature. Copyright © 2014 Elsevier Ltd. All rights reserved.
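The RMS summary statistic and the one-way ANOVA across impression systems can be sketched as follows. The per-pixel deviation maps here are simulated around the reported group means; Scheffé's post hoc step is omitted:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

def rms(deviations):
    """Root-mean-square of per-pixel fit discrepancies (micrometres)."""
    return np.sqrt(np.mean(np.square(deviations)))

# five simulated specimens per system around the reported mean gaps (um);
# the spread (sd=10) and map size (2000 pixels) are assumptions
reported_means = {"ITE": 90, "TRI": 128, "CBC": 146, "COS": 109}
per_system = {
    name: [rms(rng.normal(mean_gap, 10, 2000)) for _ in range(5)]
    for name, mean_gap in reported_means.items()
}

f, p = stats.f_oneway(*per_system.values())
print(f"one-way ANOVA across systems: F={f:.1f}, p={p:.2e}")
```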

  4. Statistical analysis and digital processing of the Mössbauer spectra

    NASA Astrophysics Data System (ADS)

    Prochazka, Roman; Tucek, Pavel; Tucek, Jiri; Marek, Jaroslav; Mashlan, Miroslav; Pechousek, Jiri

    2010-02-01

    This work focuses on the use of statistical methods and the development of filtration procedures for signal processing in Mössbauer spectroscopy. Statistical tools for noise filtering in measured spectra are used in many scientific areas. The use of a purely statistical approach to the filtration of accumulated Mössbauer spectra is described. In Mössbauer spectroscopy, the noise can be considered a Poisson statistical process, well approximated by a Gaussian distribution for high numbers of observations. This noise is a superposition of non-resonant photon counting with electronic noise (from the γ-ray detection and discrimination units) and the quality of the velocity system, which can be characterized by its velocity nonlinearities. The possibility of a noise-reducing process using a new design of statistical filter procedure is described. This mathematical procedure improves the signal-to-noise ratio and thus makes it easier to determine the hyperfine parameters of the given Mössbauer spectra. The filter procedure is based on a periodogram method that makes it possible to identify the statistically important components in the spectral domain. The significance level for these components is then feedback-controlled using the correlation coefficient test results. The theoretical correlation coefficient level corresponding to the spectrum resolution is estimated. The correlation coefficient test is based on a comparison of the theoretical and experimental correlation coefficients given by the Spearman method. The correctness of this solution was analyzed by a series of statistical tests and confirmed by many spectra measured with increasing statistical quality for a given sample (absorber). The effect of this filter procedure depends on the signal-to-noise ratio, and the applicability of the method has binding conditions.
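The periodogram idea, keep only statistically significant spectral components and reconstruct the signal from them, can be sketched on synthetic data. A simple median-based power threshold stands in for the paper's feedback-controlled correlation-coefficient test:

```python
import numpy as np
from scipy.signal import periodogram

rng = np.random.default_rng(3)
n = 1024
t = np.arange(n)
# synthetic "spectrum": one slow component buried in counting-like noise
signal = np.sin(2 * np.pi * 3 * t / n)
noisy = signal + rng.normal(scale=0.8, size=n)

# periodogram flags the statistically important spectral components
freqs, power = periodogram(noisy)
keep = power > 10 * np.median(power)  # crude significance threshold

# zero out the insignificant components and invert the transform
spec = np.fft.rfft(noisy)
spec[~keep] = 0
filtered = np.fft.irfft(spec, n)

print(np.std(noisy - signal), np.std(filtered - signal))
```

Filtering should bring the reconstruction markedly closer to the clean signal, which is the signal-to-noise improvement the abstract describes.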

  5. Segmentation of bone and soft tissue regions in digital radiographic images of extremities

    NASA Astrophysics Data System (ADS)

    Pakin, S. Kubilay; Gaborski, Roger S.; Barski, Lori L.; Foos, David H.; Parker, Kevin J.

    2001-07-01

    This paper presents an algorithm for segmentation of computed radiography (CR) images of extremities into bone and soft tissue regions. The algorithm is region-based: regions are constructed using a growing procedure with two different statistical tests. Following the growing process, a tissue classification procedure is employed. The purpose of the classification is to label each region as either bone or soft tissue. This binary classification goal is achieved by using a voting procedure that clusters the regions in each neighborhood system into two classes. The voting procedure provides a crucial compromise between local and global analysis of the image, which is necessary due to strong exposure variations seen on the imaging plate. Also, the existence of regions large enough that exposure variations can be observed across them makes it necessary to use overlapping blocks during classification. After the classification step, the resulting bone and soft tissue regions are refined by fitting a 2nd-order surface to each tissue and reevaluating the label of each region according to the distance between the region and the surfaces. The performance of the algorithm was tested on a variety of extremity images using manually segmented images as the gold standard. The experiments showed that our algorithm provided a bone boundary with an average area overlap of 90% compared to the gold standard.

  6. Hybrid Data Assimilation without Ensemble Filtering

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflation, to compensate for sampling and modeling errors, respectively, and to keep the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at higher resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.
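The two operations at the heart of the filter-free recipe, re-centering the ensemble about the variational analysis and restoring spread with additive inflation, can be sketched with arrays standing in for model states. Plain Gaussian perturbations stand in here for whatever additive-inflation fields the real system draws:

```python
import numpy as np

rng = np.random.default_rng(4)
n_members, n_state = 8, 100

# stand-ins: a "variational analysis" state and a raw ensemble around it
analysis = rng.normal(size=n_state)
ensemble = analysis + rng.normal(scale=0.5, size=(n_members, n_state))

# Re-centering: shift the members so their mean equals the analysis
ensemble = ensemble - ensemble.mean(axis=0) + analysis

# Additive inflation: independent perturbations restore ensemble spread;
# subtracting their mean keeps the ensemble centered on the analysis
inflation = rng.normal(scale=0.2, size=(n_members, n_state))
ensemble = ensemble + inflation - inflation.mean(axis=0)

print(np.allclose(ensemble.mean(axis=0), analysis))  # True
```

In the filter-free variant, the "raw ensemble" above would itself just be copies of the (resolution-converted) analysis, so the perturbations supply all of the spread.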

  7. A comparative clinical study of the efficacy of subepithelial connective tissue graft and acellular dermal matrix graft in root coverage: 6-month follow-up observation

    PubMed Central

    Thomas, Libby John; Emmadi, Pamela; Thyagarajan, Ramakrishnan; Namasivayam, Ambalavanan

    2013-01-01

    Aims: The purpose of this study was to compare the clinical efficacy of subepithelial connective tissue graft and acellular dermal matrix graft associated with coronally repositioned flap in the treatment of Miller's class I and II gingival recession, 6 months postoperatively. Settings and Design: Ten patients with bilateral Miller's class I or class II gingival recession were randomly divided into two groups using a split-mouth study design. Materials and Methods: Group I (10 sites) was treated with subepithelial connective tissue graft along with coronally repositioned flap, and Group II (10 sites) was treated with acellular dermal matrix graft along with coronally repositioned flap. Clinical parameters such as recession height and width, probing pocket depth, clinical attachment level (CAL), and width of keratinized gingiva were evaluated at baseline, the 90th day, and the 180th day for both groups. The percentage of root coverage was calculated based on the change in recession height from baseline to the 180th day in both Groups I and II. Statistical Analysis Used: Intragroup parameters at different time points were compared using the Wilcoxon signed-rank test, and the Mann–Whitney U test was employed to analyze the differences between the test and control groups. Results: There was no statistically significant difference in recession height and width, gain in CAL, or increase in the width of keratinized gingiva between the two groups on the 180th day. Both procedures showed clinically and statistically significant root coverage (Group I 96%, Group II 89.1%) on the 180th day. Conclusions: The results indicate that coverage of denuded roots with both subepithelial connective tissue autograft and acellular dermal matrix allograft is very predictable, and results were stable for 6 months postoperatively. PMID:24174728
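Both nonparametric tests named in the statistical analysis are available in SciPy; the signed-rank test handles the paired within-group comparison over time, and the Mann–Whitney U test the unpaired between-group comparison. A sketch with hypothetical recession-height values (not the study's data):

```python
import numpy as np
from scipy import stats

# hypothetical recession heights (mm) at baseline and day 180
group1_base = np.array([3.1, 2.8, 3.5, 2.9, 3.2, 3.0, 3.4, 2.7, 3.3, 3.6])
group1_d180 = np.array([0.1, 0.0, 0.3, 0.1, 0.2, 0.0, 0.2, 0.1, 0.0, 0.3])
group2_d180 = np.array([0.3, 0.2, 0.4, 0.3, 0.5, 0.2, 0.4, 0.3, 0.2, 0.4])

# within-group change over time: Wilcoxon signed-rank test (paired)
stat_w, p_w = stats.wilcoxon(group1_base, group1_d180)

# between-group comparison at day 180: Mann-Whitney U test (unpaired)
stat_u, p_u = stats.mannwhitneyu(group1_d180, group2_d180,
                                 alternative="two-sided")

print(f"Wilcoxon p={p_w:.4f}, Mann-Whitney p={p_u:.4f}")
```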

  8. Modeling of fiber orientation in viscous fluid flow with application to self-compacting concrete

    NASA Astrophysics Data System (ADS)

    Kolařík, Filip; Patzák, Bořek

    2013-10-01

    In recent years, unconventional concrete reinforcement has grown in popularity. Fiber reinforcement in particular is widely used in high-performance concretes such as self-compacting concrete (SCC). The design of advanced tailor-made structures made of SCC can take advantage of anisotropic fiber orientation. Tools for predicting fiber orientation can contribute to the design of tailor-made structures and allow the development of casting procedures that achieve the desired fiber distribution and orientation. This paper deals with the development and implementation of a suitable tool for predicting fiber orientation in a fluid based on knowledge of the velocity field. A statistical approach is employed: fiber orientation is described by a probability distribution of the fiber angle.

  9. Automated optimization of water-water interaction parameters for a coarse-grained model.

    PubMed

    Fogarty, Joseph C; Chiu, See-Wing; Kirby, Peter; Jakobsson, Eric; Pandit, Sagar A

    2014-02-13

    We have developed an automated parameter optimization software framework (ParOpt) that implements the Nelder-Mead simplex algorithm and applied it to a coarse-grained polarizable water model. The model employs a tabulated, modified Morse potential with decoupled short- and long-range interactions incorporating four water molecules per interaction site. Polarizability is introduced by the addition of a harmonic angle term defined among three charged points within each bead. The target function for parameter optimization was based on the experimental density, surface tension, electric field permittivity, and diffusion coefficient. The model was validated by comparison of statistical quantities with experimental observation. We found very good performance of the optimization procedure and good agreement of the model with experiment.
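
ParOpt's own code is not shown in the abstract, but the Nelder-Mead simplex algorithm it implements can be sketched in a few dozen lines. The function names and quadratic target below are illustrative, not from ParOpt:

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimal Nelder-Mead simplex minimizer (reflection, expansion,
    contraction, shrink) for a function of an n-dimensional point."""
    n = len(x0)
    # Initial simplex: x0 plus one perturbed vertex per dimension.
    simplex = [list(x0)]
    for i in range(n):
        v = list(x0)
        v[i] += step
        simplex.append(v)
    for _ in range(max_iter):
        simplex.sort(key=f)
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all vertices except the worst.
        centroid = [sum(v[i] for v in simplex[:-1]) / n for i in range(n)]
        refl = [centroid[i] + (centroid[i] - worst[i]) for i in range(n)]
        if f(refl) < f(best):
            exp = [centroid[i] + 2 * (centroid[i] - worst[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        elif f(refl) < f(simplex[-2]):
            simplex[-1] = refl
        else:
            contr = [centroid[i] + 0.5 * (worst[i] - centroid[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:  # shrink every other vertex toward the best one
                simplex = [best] + [
                    [(v[i] + best[i]) / 2 for i in range(n)] for v in simplex[1:]
                ]
    return min(simplex, key=f)

# Simple quadratic target with its minimum at (1.0, 2.0).
target = lambda p: (p[0] - 1.0) ** 2 + (p[1] - 2.0) ** 2
xmin = nelder_mead(target, [0.0, 0.0])
```

In practice the target would be a weighted mismatch between simulated and experimental observables (density, surface tension, and so on), which is why a derivative-free method like Nelder-Mead is attractive here.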

  10. Privacy in confidential administrative micro data: implementing statistical disclosure control in a secure computing environment.

    PubMed

    Hochfellner, Daniela; Müller, Dana; Schmucker, Alexandra

    2014-12-01

    The demand for comprehensive and innovative data is constantly growing in social science. In particular, micro data from various social security agencies become more and more attractive. In contrast to survey data, administrative data offer a census with highly reliable information but are restricted in their usage. To make them accessible for researchers, data or research output either have to be anonymized or released after disclosure review procedures have been used. This article discusses the trade-off between maintaining a high capability of research potential while protecting private information, by exploiting the data disclosure portfolio and the adopted disclosure strategies of the Research Data Center of the German Federal Employment Agency. © The Author(s) 2014.

  11. Data survey on the effect of product features on competitive advantage of selected firms in Nigeria.

    PubMed

    Olokundun, Maxwell; Iyiola, Oladele; Ibidunni, Stephen; Falola, Hezekiah; Salau, Odunayo; Amaihian, Augusta; Peter, Fred; Borishade, Taiye

    2018-06-01

    The main objective of this study was to present a data article that investigates the effect of product features on a firm's competitive advantage. Few studies have examined how the features of a product could help drive the competitive advantage of a firm. A descriptive research method was used. The Statistical Package for the Social Sciences (SPSS 22) was employed to analyze one hundred and fifty (150) valid questionnaires completed by small business owners registered with the Small and Medium Enterprises Development Agency of Nigeria (SMEDAN). Stratified and simple random sampling techniques were employed; reliability and validity procedures were also confirmed. The field data set is made publicly available to enable critical or extended analysis.

  12. Application of non-attenuating frequency radars for prediction of rain attenuation and space diversity performance

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1979-01-01

    In order to establish transmitter power and receiver sensitivity levels at frequencies above 10 GHz, the designers of earth-satellite telecommunication systems are interested in cumulative rain fade statistics at variable path orientations, elevation angles, climatological regions, and frequencies. They are also interested in establishing optimum space diversity performance parameters. This work examines the many elements involved in the employment of single non-attenuating-frequency radars for arriving at the desired information. The elements examined include radar techniques and requirements, phenomenological assumptions, path attenuation formulations and procedures, as well as error budgeting and calibration analysis. Included are the pertinent results of previous investigators who have used radar for rain attenuation modeling. Suggestions are made for improving present methods.

  13. Fast iterative censoring CFAR algorithm for ship detection from SAR images

    NASA Astrophysics Data System (ADS)

    Gu, Dandan; Yue, Hui; Zhang, Yuan; Gao, Pengcheng

    2017-11-01

    Ship detection is one of the essential techniques for ship recognition from synthetic aperture radar (SAR) images. This paper presents a fast iterative detection procedure to eliminate the influence of target returns on the estimation of local sea clutter distributions for constant false alarm rate (CFAR) detectors. A fast block detector is first employed to extract potential target sub-images; an iterative censoring CFAR algorithm is then used to detect ship candidates from each target block adaptively and efficiently. Parallel detection is possible, and the statistical parameters of the G0 distribution, which fits local sea clutter well, can be estimated quickly using an integral image operator. Experimental results on TerraSAR-X images demonstrate the effectiveness of the proposed technique.
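
The integral image operator mentioned here is the standard summed-area-table trick: after one pass over the image, the sum (and hence the mean) over any rectangular clutter window costs four table lookups. A generic sketch, not the paper's implementation:

```python
def integral_image(img):
    """Summed-area table: S[i][j] = sum of img[0..i-1][0..j-1]."""
    h, w = len(img), len(img[0])
    S = [[0] * (w + 1) for _ in range(h + 1)]
    for i in range(h):
        row_sum = 0
        for j in range(w):
            row_sum += img[i][j]
            S[i + 1][j + 1] = S[i][j + 1] + row_sum
    return S

def window_mean(S, r0, c0, r1, c1):
    """Mean over img[r0:r1][c0:c1] in O(1) from the summed-area table."""
    total = S[r1][c1] - S[r0][c1] - S[r1][c0] + S[r0][c0]
    return total / ((r1 - r0) * (c1 - c0))

img = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
S = integral_image(img)
m = window_mean(S, 0, 0, 2, 2)  # mean of [[1, 2], [4, 5]] = 3.0
```

This is what makes re-estimating local clutter statistics cheap inside an iterative censoring loop: the table is built once, and every candidate window is then queried in constant time.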

  14. Improvement and extension of a radar forest backscattering model

    NASA Technical Reports Server (NTRS)

    Simonett, David S.; Wang, Yong

    1989-01-01

    Radar modeling of mangal forest stands in the Sundarbans area of southern Bangladesh was developed. The modeling employs radar system parameters together with forest data on tree height, spacing, biomass, species combinations, and water content (including slightly conductive water) in both the leaves and trunks of the mangal. For Sundri and Gewa tropical mangal forests, six model components are proposed, which are required to explain the contributions of various forest species combinations to the attenuation and scattering of mangal-vegetated nonflooded or flooded surfaces. Statistics of simulated images were compared with those of SIR-B images, both to refine the modeling procedures and to appropriately characterize the model output. The possibility of delineating flooded/nonflooded boundaries is discussed.

  15. Effect of laser and air abrasion pretreatment on the microleakage of a fissure sealant applied with conventional and self etch adhesives.

    PubMed

    Tirali, R E; Celik, C; Arhun, N; Berk, G; Cehreli, S B

    2013-01-01

    The purpose of this study was to investigate the effects of different pretreatment protocols, along with different bonding agents, on the microleakage of a fissure sealant material. A total of 144 freshly extracted noncarious human third molars were used. The teeth were randomly assigned to three groups with respect to the pretreatment protocol employed: A. air abrasion; B. Er,Cr:YSGG laser; C. no pretreatment (control). In each group, specimens were further subjected to one of the following procedures before application of the sealant: 1. 36% phosphoric acid etch (AE) (DeTrey Conditioner 36/Dentsply, UK); 2. AE + Prime&Bond NT (Dentsply, UK); 3. Clearfil S3 Bond (Kuraray, Japan); 4. Clearfil SE Bond (Kuraray, Japan). All teeth were sealed with the same fissure sealant material (Conseal F/SDI, Australia). Sealed teeth were further subjected to thermocycling, a dye penetration test, sectioning, and quantitative image analysis. Statistical evaluation of the microleakage data was performed with two-way independent ANOVA and multiple comparisons tests at p = 0.05. For qualitative evaluation, 2 samples from each group were examined under scanning electron microscopy. Microleakage was affected by both the type of pretreatment and the subsequent bonding protocol employed (p < 0.05). Overall, the highest (mean = 0.36 mm) and lowest (mean = 0.06 mm) microleakage values were observed in samples with unpretreated enamel sealed by S3 + Conseal F and in samples with laser-pretreated enamel sealed by acid etch + Prime&Bond + Conseal F, respectively (p < 0.05). In the acid-etch group, samples pretreated with laser yielded slightly lower microleakage scores than unpretreated samples and samples pretreated with air abrasion, but the difference was not statistically significant (p = 0.179). Similarly, when a bonding agent was applied after acid etching, microleakage scores were not affected by the pretreatment protocol (p = 0.615) (intact enamel, laser, or air abrasion).
For both all-in-one and two-step self-etch adhesive systems, unpretreated samples demonstrated the highest microleakage scores. For the groups in which a bonding agent was utilized, pretreatment did not affect microleakage. Both the tested pretreatment protocols and the adhesive procedures had different effects on the sealing properties of Conseal F on permanent tooth enamel.

  16. Assessing the significance of pedobarographic signals using random field theory.

    PubMed

    Pataky, Todd C

    2008-08-07

    Traditional pedobarographic statistical analyses are conducted over discrete regions. Recent studies have demonstrated that regionalization can corrupt pedobarographic field data through conflation when arbitrary dividing lines inappropriately delineate smooth field processes. An alternative is to register images such that homologous structures optimally overlap and then conduct statistical tests at each pixel to generate statistical parametric maps (SPMs). The significance of SPM processes may be assessed within the framework of random field theory (RFT). RFT is ideally suited to pedobarographic image analysis because its fundamental data unit is a lattice sampling of a smooth and continuous spatial field. To correct for the vast number of multiple comparisons inherent in such data, recent pedobarographic studies have employed a Bonferroni correction to retain a constant family-wise error rate. This approach unfortunately neglects the spatial correlation of neighbouring pixels, so provides an overly conservative (albeit valid) statistical threshold. RFT generally relaxes the threshold depending on field smoothness and on the geometry of the search area, but it also provides a framework for assigning p values to suprathreshold clusters based on their spatial extent. The current paper provides an overview of basic RFT concepts and uses simulated and experimental data to validate both RFT-relevant field smoothness estimations and RFT predictions regarding the topological characteristics of random pedobarographic fields. Finally, previously published experimental data are re-analysed using RFT inference procedures to demonstrate how RFT yields easily understandable statistical results that may be incorporated into routine clinical and laboratory analyses.
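
The Bonferroni correction discussed here is easy to state: divide the family-wise alpha by the number of pixels tested and threshold each pixel at the corresponding z value. A small illustration of why this is conservative for images (generic, not the paper's code):

```python
from statistics import NormalDist

def bonferroni_z_threshold(alpha, n_tests):
    """One-sided pixel-wise z threshold keeping the family-wise error
    rate at alpha under a Bonferroni correction over n_tests
    comparisons; spatial correlation between pixels is ignored."""
    return NormalDist().inv_cdf(1 - alpha / n_tests)

# The threshold grows with the number of pixels tested, which is why
# Bonferroni becomes overly conservative for smooth, correlated fields.
z_single = bonferroni_z_threshold(0.05, 1)       # roughly 1.645
z_image = bonferroni_z_threshold(0.05, 10_000)   # well above 4
```

RFT instead relaxes the threshold using the field's estimated smoothness and the search-region geometry, so highly correlated neighbouring pixels are not counted as fully independent tests.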

  17. Estimating multilevel logistic regression models when the number of clusters is low: a comparison of different statistical software procedures.

    PubMed

    Austin, Peter C

    2010-04-22

    Multilevel logistic regression models are increasingly being used to analyze clustered data in medical, public health, epidemiological, and educational research. Procedures for estimating the parameters of such models are available in many statistical software packages. There is currently little evidence on the minimum number of clusters necessary to reliably fit multilevel regression models. We conducted a Monte Carlo study to compare the performance of different statistical software procedures for estimating multilevel logistic regression models when the number of clusters was low. We examined procedures available in BUGS, HLM, R, SAS, and Stata. We found that there were qualitative differences in the performance of different software procedures for estimating multilevel logistic models when the number of clusters was low. Among the likelihood-based procedures, estimation methods based on adaptive Gauss-Hermite approximations to the likelihood (glmer in R and xtlogit in Stata) or adaptive Gaussian quadrature (Proc NLMIXED in SAS) tended to have superior performance for estimating variance components when the number of clusters was small, compared to software procedures based on penalized quasi-likelihood. However, only Bayesian estimation with BUGS allowed for accurate estimation of variance components when there were fewer than 10 clusters. For all statistical software procedures, estimation of variance components tended to be poor when there were only five subjects per cluster, regardless of the number of clusters.

  18. Applying a statistical PTB detection procedure to complement the gold standard.

    PubMed

    Noor, Norliza Mohd; Yunus, Ashari; Bakar, S A R Abu; Hussin, Amran; Rijal, Omar Mohd

    2011-04-01

    This paper investigates a novel statistical discrimination procedure for detecting PTB when the gold-standard requirement is taken into consideration. Archived data were used to establish two groups of patients: a control group and a test group. The control group was used to develop the statistical discrimination procedure, using four vectors of wavelet coefficients as feature vectors for the detection of pulmonary tuberculosis (PTB), lung cancer (LC), and normal lung (NL). This discrimination procedure was evaluated on the test group, where the numbers of sputum-positive and sputum-negative cases correctly classified as PTB were noted. The proposed statistical discrimination method is able to detect PTB and LC patients with a high true-positive fraction. The method is also able to detect PTB patients who are sputum negative and may therefore be used as a complement to the gold standard. Copyright © 2010 Elsevier Ltd. All rights reserved.
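
The feature vectors here are wavelet coefficients; the abstract does not name the wavelet family, so purely as an illustration, here is a single level of the Haar discrete wavelet transform, the simplest case:

```python
import math

def haar_step(signal):
    """One level of the Haar discrete wavelet transform: scaled
    pairwise sums (approximation) and differences (detail)."""
    s = math.sqrt(2.0)
    half = len(signal) // 2
    approx = [(signal[2 * i] + signal[2 * i + 1]) / s for i in range(half)]
    detail = [(signal[2 * i] - signal[2 * i + 1]) / s for i in range(half)]
    return approx, detail

def haar_inverse(approx, detail):
    """Exact inverse of haar_step."""
    s = math.sqrt(2.0)
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / s, (a - d) / s])
    return out

x = [4.0, 2.0, 5.0, 5.0, 1.0, 3.0, 0.0, 2.0]
approx, detail = haar_step(x)
rebuilt = haar_inverse(approx, detail)  # recovers x up to float error
```

In a detection pipeline like the one described, coefficients from transforms of this kind (applied to image rows or regions) would be collected into feature vectors for the discrimination procedure.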

  19. 28 CFR 42.303 - Evaluation of employment opportunities.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... opportunities for minorities and women. (b) In many cases an effective equal employment opportunity program may... 28 Judicial Administration 1 2011-07-01 2011-07-01 false Evaluation of employment opportunities... EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines § 42.303...

  20. 28 CFR 42.303 - Evaluation of employment opportunities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... opportunities for minorities and women. (b) In many cases an effective equal employment opportunity program may... 28 Judicial Administration 1 2010-07-01 2010-07-01 false Evaluation of employment opportunities... EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Equal Employment Opportunity Program Guidelines § 42.303...

  1. How Do Undergraduate Piano Students Memorize Their Repertoires?

    ERIC Educational Resources Information Center

    Gerling, Cristina C.; Dos Santos, Regina Antunes Teixeira

    2017-01-01

    This study investigated the routine procedures employed by nine undergraduate piano students at a Brazilian university while learning and performing memorized pieces and the procedures employed using Chaffin's performance cue (PC) protocols. The data were collected in two phases. In Phase I, each participant selected one piece that he or she had…

  2. Using a Simultaneous Prompting Procedure to Embed Core Content When Teaching a Potential Employment Skill

    ERIC Educational Resources Information Center

    Collins, Belva C.; Terrell, Misty; Test, David W.

    2017-01-01

    This investigation used a multiple-probe-across-participants design to examine the effects of using a simultaneous prompting procedure to teach four secondary students with mild intellectual disabilities the employment task of caring for plants in a greenhouse. The instructor also embedded photosynthesis science content as nontargeted information…

  3. 76 FR 47243 - Training and Employment Guidance (TEGL) Letter No. 15-06, Change 1, Special Procedures: Labor...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-04

    ... procedures, updated to reflect regulatory and organizational changes in the H-2A Program, for employers who... (the 2008 Final Rule). The 2008 Final Rule implemented an attestation-based application process and... reflect organizational changes, in addition to new regulatory and policy objectives. It replaces previous...

  4. Uncertainty Analysis for DAM Projects.

    DTIC Science & Technology

    1987-09-01

    overwhelming majority of articles published on the use of statistical methodology for geotechnical engineering focus on performance predictions and design ...Results of the present study do not support the adoption of more esoteric statistical procedures except on a special case basis or in research ...influence that recommended statistical procedures might have had on the Carters Project, had they been applied during planning and design phases

  5. Education and Employment Patterns of Bioscientists. A Statistical Report.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC.

    This report contains a compilation of manpower statistics describing the education and employment of bioscientists. The tables also include data from other major disciplines to allow for comparisons with other scientists and nonscientists. Bioscientists include those with degrees in anatomy, biochemistry, biophysics, genetics, microbiology,…

  6. OCCUPATIONS IN COLORADO. PART I, OUTLOOK BY INDUSTRIES.

    ERIC Educational Resources Information Center

    1966

    CURRENT AND PROJECTED EMPLOYMENT STATISTICS ARE GIVEN FOR THE STATE AND FOR THE DENVER STANDARD METROPOLITAN STATISTICAL AREA WHICH INCLUDES ADAMS, ARAPAHOE, BOULDER, DENVER, AND JEFFERSON COUNTIES. DATA WERE OBTAINED FROM THE COLORADO DEPARTMENT OF EMPLOYMENT, DENVER RESEARCH INSTITUTE, U.S. CENSUS, UNIVERSITY OF COLORADO, MOUNTAIN STATES…

  7. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal-correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on the score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least squares estimate and on the logarithmic transformation with the Mantel-Haenszel estimate are recommended, as they do not involve computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.
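
The paper's bootstrap procedures target stratified correlated bilateral data; the basic percentile-bootstrap idea for a ratio of two independent proportions can be sketched as follows (a simplified illustration, not the authors' procedure):

```python
import random

def bootstrap_ratio_ci(successes_a, n_a, successes_b, n_b,
                       n_boot=2000, alpha=0.05, seed=42):
    """Percentile bootstrap CI for the ratio of two proportions.
    Each replicate resamples both arms' binary outcomes with
    replacement and records the resampled ratio."""
    rng = random.Random(seed)
    arm_a = [1] * successes_a + [0] * (n_a - successes_a)
    arm_b = [1] * successes_b + [0] * (n_b - successes_b)
    ratios = []
    for _ in range(n_boot):
        pa = sum(rng.choice(arm_a) for _ in range(n_a)) / n_a
        pb = sum(rng.choice(arm_b) for _ in range(n_b)) / n_b
        if pb > 0:  # skip degenerate replicates
            ratios.append(pa / pb)
    ratios.sort()
    lo = ratios[int((alpha / 2) * len(ratios))]
    hi = ratios[int((1 - alpha / 2) * len(ratios)) - 1]
    return lo, hi

# 40/100 successes vs 20/100: observed ratio 2.0, which the
# percentile interval should contain.
lo, hi = bootstrap_ratio_ci(40, 100, 20, 100)
```

The stratified, correlated setting in the paper additionally requires resampling within strata and respecting the within-subject correlation of bilateral outcomes; this sketch keeps only the core resampling mechanism.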

  8. Statistical Reform in School Psychology Research: A Synthesis

    ERIC Educational Resources Information Center

    Swaminathan, Hariharan; Rogers, H. Jane

    2007-01-01

    Statistical reform in school psychology research is discussed in terms of research designs, measurement issues, statistical modeling and analysis procedures, interpretation and reporting of statistical results, and finally statistics education.

  9. 26 CFR 601.401 - Employment taxes.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 26 Internal Revenue 20 2011-04-01 2011-04-01 false Employment taxes. 601.401 Section 601.401... STATEMENT OF PROCEDURAL RULES Provisions Special to Certain Employment Taxes § 601.401 Employment taxes. (a) General—(1) Description of taxes. Federal employment taxes are imposed by Subtitle C of the Internal...

  10. 20 CFR 422.112 - Employer identification numbers.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Employer identification numbers. 422.112... Procedures § 422.112 Employer identification numbers. (a) General. Most employers are required by section...)-1 to obtain an employer identification number (EIN) and to include it on wage reports filed with SSA...

  11. 20 CFR 422.112 - Employer identification numbers.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Employer identification numbers. 422.112... Procedures § 422.112 Employer identification numbers. (a) General. Most employers are required by section...)-1 to obtain an employer identification number (EIN) and to include it on wage reports filed with SSA...

  12. 4 CFR 28.9 - Procedures; general.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Procedures; general. 28.9 Section 28.9 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE...

  13. Employment and Earnings. Volume 35, Number 3, March 1988.

    ERIC Educational Resources Information Center

    Employment and Earnings, 1988

    1988-01-01

    This document presents the following monthly statistical data for the population of United States: (1) employment status; (2) characteristics of the unemployed; (3) characteristics of the employed and their job categories; (4) seasonally adjusted employment and unemployment; (5) national employment; (6) employment in states and areas; (7) national…

  14. A hybrid downscaling procedure for estimating the vertical distribution of ambient temperature in local scale

    NASA Astrophysics Data System (ADS)

    Yiannikopoulou, I.; Philippopoulos, K.; Deligiorgi, D.

    2012-04-01

    The vertical thermal structure of the atmosphere is defined by a combination of dynamic and radiative transfer processes and plays an important role in describing meteorological conditions at local scales. The scope of this work is to develop, and quantify the predictive ability of, a hybrid dynamic-statistical downscaling procedure for estimating the vertical profile of ambient temperature at finer spatial scales. The study focuses on the warm period of the year (June - August), and the method is applied to an urban coastal site (Hellinikon) in the eastern Mediterranean. The two-step methodology first involves the dynamic downscaling of coarse-resolution climate data via the RegCM4.0 regional climate model and subsequently the statistical downscaling of the modeled outputs by developing and training site-specific artificial neural networks (ANNs). The 2.5° x 2.5° gridded NCEP-DOE Reanalysis 2 dataset is used as initial and boundary conditions for the dynamic downscaling element of the methodology, which enhances the regional representativeness of the dataset to 20 km and provides modeled fields at 18 vertical levels. The regional climate modeling results are compared against the upper-air Hellinikon radiosonde observations, and the mean absolute error (MAE) is calculated between the four grid-point values nearest to the station and the ambient temperature at the standard and significant pressure levels. The statistical downscaling element of the methodology consists of an ensemble of ANN models, one for each pressure level, which are trained separately and employ the regional-scale RegCM4.0 output. ANN models are theoretically capable of approximating any measurable input-output function to any desired degree of accuracy. In this study they are used as non-linear function approximators for identifying the relationship between a number of predictor variables and the ambient temperature at the various vertical levels.
Insight into the statistically derived input-output transfer functions is obtained by utilizing the ANN weights method, which quantifies the relative importance of the predictor variables in the estimation procedure. The overall downscaling performance evaluation incorporates a set of correlation and statistical measures along with appropriate statistical tests. The hybrid downscaling method presented in this work can be extended to various locations by training different site-specific ANN models, and the results, depending on the application, can be used to assist understanding of past, present, and future climatology. ____________________________ This research has been co-financed by the European Union and Greek national funds through the Operational Program "Education and Lifelong Learning" of the National Strategic Reference Framework (NSRF) - Research Funding Program: Heracleitus II: Investing in knowledge society through the European Social Fund.
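
The site-specific ANNs are not described in code; as a generic illustration of a one-hidden-layer network acting as a non-linear function approximator (all names and hyperparameters below are arbitrary choices, not the study's):

```python
import math
import random

def train_tiny_ann(xs, ys, hidden=8, lr=0.05, epochs=3000, seed=0):
    """One-hidden-layer network (tanh units, linear output) trained
    by plain per-sample gradient descent to fit y = f(x)."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
            pred = sum(w2[j] * h[j] for j in range(hidden)) + b2
            err = pred - y  # gradient of 0.5 * err**2 w.r.t. pred
            for j in range(hidden):
                grad_h = err * w2[j] * (1 - h[j] ** 2)
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_h * x
                b1[j] -= lr * grad_h
            b2 -= lr * err
    def predict(x):
        return sum(w2[j] * math.tanh(w1[j] * x + b1[j])
                   for j in range(hidden)) + b2
    return predict

# Fit a smooth nonlinear target; the fit error should drop far below
# the variance of the target (what a constant predictor would score).
xs = [i / 10 for i in range(-10, 11)]
ys = [x ** 2 for x in xs]
f = train_tiny_ann(xs, ys)
mse = sum((f(x) - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
```

The study's ensemble uses one such network per pressure level, with reanalysis-derived predictors rather than a single scalar input; this sketch only demonstrates the function-approximation principle the abstract appeals to.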

  15. Gis-Based Spatial Statistical Analysis of College Graduates Employment

    NASA Astrophysics Data System (ADS)

    Tang, R.

    2012-07-01

    It is urgently necessary to be aware of the distribution and employment status of college graduates for proper allocation of human resources and overall arrangement of strategic industry. This study provides empirical evidence regarding the use of geocoding and spatial analysis for the distribution and employment status of college graduates, based on data from the 2004-2008 Wuhan Municipal Human Resources and Social Security Bureau, China. The spatio-temporal distribution of employment units was analyzed with geocoding using ArcGIS software, and the stepwise multiple linear regression method via SPSS software was used to predict employment and to identify spatially associated enterprise and professional demand in the future. The results show that the enterprises in the Wuhan East Lake high and new technology development zone increased dramatically from 2004 to 2008 and tended to be distributed southeastward. Furthermore, the models built by statistical analysis suggest that a graduate's specialty has an important impact on the number employed and on the number of graduates engaging in pillar industries. In conclusion, the combination of GIS and statistical analysis, which helps to simulate the spatial distribution of employment status, is a potential tool for human resource development research.

  16. Three Empirical Strategies for Teaching Statistics

    ERIC Educational Resources Information Center

    Marson, Stephen M.

    2007-01-01

    This paper employs a three-step process to analyze three empirically supported strategies for teaching statistics to BSW students. The strategies included: repetition, immediate feedback, and use of original data. First, each strategy is addressed through the literature. Second, the application of employing each of the strategies over the period…

  17. Electrospinning of polyaniline/poly(lactic acid) ultrathin fibers: process and statistical modeling using a non-Gaussian approach

    USDA-ARS?s Scientific Manuscript database

    Cover: The electrospinning technique was employed to obtain conducting nanofibers based on polyaniline and poly(lactic acid). A statistical model was employed to describe how the process factors (solution concentration, applied voltage, and flow rate) govern the fiber dimensions. Nanofibers down to ...

  18. Regional frequency analysis of extreme rainfalls using partial L moments method

    NASA Astrophysics Data System (ADS)

    Zakaria, Zahrahtul Amani; Shabri, Ani

    2013-07-01

    An approach based on regional frequency analysis using L moments and LH moments is revisited in this study. Subsequently, an alternative regional frequency analysis using the partial L moments (PL moments) method is employed, and a new relationship for homogeneity analysis is developed. The results were then compared with those obtained using the methods of L moments and LH moments of order two. The Selangor catchment, consisting of 37 sites and located on the west coast of Peninsular Malaysia, is chosen as a case study. PL moments for the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto distributions were derived and used to develop the regional frequency analysis procedure. The PL moment ratio diagram and the Z test were employed to determine the best-fit distribution. Comparison between the three approaches showed that the GLO and GEV distributions were identified as suitable for representing the statistical properties of extreme rainfall in Selangor. Monte Carlo simulation used for performance evaluation shows that the method of PL moments would outperform the L and LH moments methods for estimation of large-return-period events.
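
The PL-moment derivations are in the paper itself; the sample L moments on which they build have a standard probability-weighted-moment estimator that can be sketched as follows (textbook form, not the paper's code):

```python
from math import comb

def sample_l_moments(data):
    """First three sample L-moments via probability-weighted moments
    b_r = n^-1 * sum_i [C(i, r) / C(n-1, r)] * x_(i), with x sorted
    ascending and i running over 0..n-1 (C(i, r) = 0 for i < r)."""
    x = sorted(data)
    n = len(x)
    b = [
        sum(comb(i, r) * x[i] for i in range(n)) / (n * comb(n - 1, r))
        for r in range(3)
    ]
    l1 = b[0]                       # sample mean
    l2 = 2 * b[1] - b[0]            # half of Gini's mean difference
    l3 = 6 * b[2] - 6 * b[1] + b[0]
    return l1, l2, l3

data = [4.0, 9.0, 1.0, 6.0, 2.0]
l1, l2, l3 = sample_l_moments(data)  # l1 equals the sample mean, 4.4
```

L-moment ratios such as the L-skewness t3 = l3 / l2 are what the moment-ratio diagrams mentioned above plot when screening candidate distributions.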

  19. Short alleles revealed by PCR demonstrate no heterozygote deficiency at minisatellite loci D1S7, D7S21, and D12S11

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alonso, S.; Castro, A.; Fernandez-Fernandez, I.

    1997-02-01

    Short VNTR alleles that go undetected after conventional Southern blot hybridization may constitute an alternative explanation for the heterozygosity deficiency observed at some minisatellite loci. To examine this hypothesis, we have employed a screening procedure based on PCR amplification of those individuals classified as homozygotes in our databases for the loci D1S7, D7S21, and D12S11. The results obtained indicate that the frequency of these short alleles is related to the heterozygosity deficiency observed. For the most polymorphic locus, D1S7, approximately 60% of those individuals previously classified as homozygotes were in fact heterozygotes for a short allele. After the inclusion of these new alleles, the agreement between observed and expected heterozygosity, along with other statistical tests employed, provides additional evidence for lack of population substructuring. Comparisons of allele frequency distributions reveal greater differences between racial groups than between closely related populations. 45 refs., 3 figs., 6 tabs.

  20. 5 CFR 1201.144 - Hearing procedures; referring the record.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... the record to the Special Counsel, the Office of Personnel Management, and the employing agency for... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Hearing procedures; referring the record... PROCEDURES PRACTICES AND PROCEDURES Procedures for Original Jurisdiction Cases Removal from the Senior...

  1. 5 CFR 1201.144 - Hearing procedures; referring the record.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... the record to the Special Counsel, the Office of Personnel Management, and the employing agency for... 5 Administrative Personnel 3 2012-01-01 2012-01-01 false Hearing procedures; referring the record... PROCEDURES PRACTICES AND PROCEDURES Procedures for Original Jurisdiction Cases Removal from the Senior...

  2. 5 CFR 1201.144 - Hearing procedures; referring the record.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... the record to the Special Counsel, the Office of Personnel Management, and the employing agency for... 5 Administrative Personnel 3 2011-01-01 2011-01-01 false Hearing procedures; referring the record... PROCEDURES PRACTICES AND PROCEDURES Procedures for Original Jurisdiction Cases Removal from the Senior...

  3. Essentials of the disclosure review process: a federal perspective.

    PubMed

    Zarate, Alvan O; Zayatz, Laura

    2006-09-01

    Many researchers need to make arrangements to share de-identified electronic data files. However, the ways in which respondent identity may be protected are not well understood or are assumed to be the special province of large statistical agencies or specialized statisticians. Approaches to data sharing and protecting respondent identity have been pioneered by federal agencies which gather data vital to political and economic decision making. These agencies are required by statutory law both to assure confidentiality and to share data in usable form with other governmental agencies and with scholars who perform needed analyses of those data. The basic principles of disclosure limitation developed by the Census Bureau, the National Center for Health Statistics, and other federal agencies are fundamental to meeting new funding requirements to share and de-identify data, and are often referred to in the literature on data sharing. We describe how these principles are employed by the Disclosure Review Boards (DRBs) of these two agencies, and then state these principles in more general terms that are applicable to any disclosure review process. The kinds of data that academic institutions share may call for less complex or stringent DRBs and specific nondisclosure procedures different from those employed by federal agencies, but the same general principles apply. Specific application of these six principles by non-government researchers will depend on the nature of their data, their own institutional resources, and the likely future usefulness of their data.

  4. The Current State of Pediatric Sports Medicine: A Workforce Analysis.

    PubMed

    Engelman, Glenn; Koutures, Chris; Provance, Aaron

    2016-01-01

    Pediatric sports medicine is an evolving pediatric subspecialty. No workforce data currently exist describing the current state of pediatric sports medicine. The goal of this survey is to contribute information to the practicing pediatric sports medicine specialist, employers and other stakeholders regarding the current state of pediatric sports medicine. The Workforce Survey was conducted by the American Academy of Pediatrics (AAP) Division of Workforce and Medical Education Policy (WMEP) and included a standard 44-item online questionnaire addressing training, clinical practice and demographic characteristics as well as the 24-item AAP Council on Sports Medicine and Fitness (COSMF) questionnaire. Descriptive statistics were used to summarize all survey responses. Bivariate relationships were tested for statistical significance using chi-square tests. A total of 145 surveys were returned, which represented a 52.7% response rate for eligible COSMF members and board certified non-council responders. The most common site of employment among respondents was university-based clinics. The respondents board certified in sports medicine were significantly more likely to perform fracture management, casting and splinting, neuropsychological testing and injections compared to those not board certified in sports medicine. A large proportion of respondents held an academic/medical school appointment. Increases were noted in both patient volume and the complexity of the injuries the specialists were treating. This pediatric sports medicine workforce study provides previously unappreciated insight into practice arrangements, weekly duties, procedures, number of patients seen, referral patterns, and potential future trends of the pediatric sports medicine specialist.

  5. Compendium of Methods for Applying Measured Data to Vibration and Acoustic Problems

    DTIC Science & Technology

    1985-10-01

    statistical energy analysis, finite element models, transfer function... Procedures for the Modal Analysis Method... Summary of the Procedures for the Statistical Energy Analysis Method... statistical energy analysis.

  6. Able-Bodied Wild Chimpanzees Imitate a Motor Procedure Used by a Disabled Individual to Overcome Handicap

    PubMed Central

    Hobaiter, Catherine; Byrne, Richard W.

    2010-01-01

    Chimpanzee culture has generated intense recent interest, fueled by the technical complexity of chimpanzee tool-using traditions; yet it is seriously doubted whether chimpanzees are able to learn motor procedures by imitation under natural conditions. Here we take advantage of an unusual chimpanzee population as a ‘natural experiment’ to identify evidence for imitative learning of this kind in wild chimpanzees. The Sonso chimpanzee community has suffered from high levels of snare injury and now has several manually disabled members. Adult male Tinka, with near-total paralysis of both hands, compensates for his inability to scratch his back manually by employing a distinctive technique of holding a growing liana taut while making side-to-side body movements against it. We found that seven able-bodied young chimpanzees also used this ‘liana-scratch’ technique, although they had no need to. The distribution of the liana-scratch technique was statistically associated with individuals' range overlap with Tinka and the extent of time they spent in parties with him, confirming that the technique is acquired by social learning. The motivation for able-bodied chimpanzees copying his variant is unknown, but the fact that they do is evidence that the imitative learning of motor procedures from others is a natural trait of wild chimpanzees. PMID:20700527

  7. An Examination of the True Reliability of Lower Limb Stiffness Measures During Overground Hopping.

    PubMed

    Diggin, David; Anderson, Ross; Harrison, Andrew J

    2016-06-01

    Evidence suggests reports describing the reliability of leg-spring (kleg) and joint stiffness (kjoint) measures are contaminated by artifacts originating from digital filtering procedures. In addition, the intraday reliability of kleg and kjoint requires investigation. This study examined the effects of experimental procedures on the inter- and intraday reliability of kleg and kjoint. Thirty-two participants completed 2 trials of single-legged hopping at 1.5, 2.2, and 3.0 Hz at the same time of day across 3 days. On the final test day a fourth experimental bout took place 6 hours before or after participants' typical testing time. Kinematic and kinetic data were collected throughout. Stiffness was calculated using models of kleg and kjoint. Classifications of measurement agreement were established using thresholds for absolute and relative reliability statistics. Results illustrated that kleg and kankle exhibited strong agreement. In contrast, kknee and khip demonstrated weak-to-moderate consistency. Results suggest limits in kjoint reliability persist despite employment of appropriate filtering procedures. Furthermore, diurnal fluctuations in lower-limb muscle-tendon stiffness exhibit little effect on intraday reliability. The present findings support the existence of kleg as an attractor state during hopping, achieved through fluctuations in kjoint variables. Limits to kjoint reliability appear to represent biological function rather than measurement artifact.
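The filtering issue raised above can be made concrete. A common choice for smoothing kinetic data before stiffness calculation is a zero-lag low-pass Butterworth filter; the sketch below (with an invented sampling rate, cutoff, and signal, not the authors' pipeline) shows the forward-backward filtering that removes phase shift:

```python
import numpy as np
from scipy.signal import butter, filtfilt

fs = 1000.0                      # Hz, hypothetical sampling rate
t = np.arange(0, 2, 1 / fs)
rng = np.random.default_rng(5)

# Hypothetical vertical ground-reaction force during 2.2 Hz hopping:
# a clean periodic component plus wideband measurement noise.
clean = 1500 + 800 * np.sin(2 * np.pi * 2.2 * t)
grf = clean + 20 * rng.normal(size=t.size)

# Zero-lag 4th-order low-pass Butterworth at 20 Hz; filtfilt runs the
# filter forwards and then backwards, doubling the effective order and
# cancelling the phase shift a single pass would introduce.
b, a = butter(4, 20 / (fs / 2), btype="low")
grf_smooth = filtfilt(b, a, grf)
```

Because the hopping frequency lies well below the cutoff, the periodic component passes almost unchanged while the noise is attenuated; applying mismatched cutoffs to kinetic and kinematic signals is one source of the stiffness artifacts the paper discusses.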

  8. Secondary school accident reporting in one education authority.

    PubMed

    Williams, W R; Latif, A H A; Sibert, J

    2002-01-01

    Secondary schools appear to have very different accident rates when they are compared on the basis of accident report returns. The variation may result from real differences in accident rates or from different reporting procedures. This study investigates accident reporting from secondary schools and, in particular, the role of the school nurse. Accident form returns covering a 2-year period were collected for statistical analysis from 13 comprehensive schools in one local education authority in Wales. School sites were visited in the following school year to obtain information about accident records held on site and accident reporting procedures. The main factors determining the number of school accident reports submitted to the education authority relate to differences in recording and reporting procedures, such as the employment of a nurse and the policy of the head teacher/safety officer on submitting accident returns. Accident and emergency department referrals from similar schools may show significant differences in specific injuries and their causes. The level of school accident activity cannot be gauged from reports submitted to the education authority. Lack of incentives for collecting good accident data, in conjunction with the degree of complacency in the current system, suggests that future accident rates and reporting activity are unlikely to change.

  9. 34 CFR 1100.25 - What are the procedures for payment of a fellowship award through the fellow's employer?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 3 2010-07-01 2010-07-01 false What are the procedures for payment of a fellowship award through the fellow's employer? 1100.25 Section 1100.25 Education Regulations of the Offices of the Department of Education (Continued) NATIONAL INSTITUTE FOR LITERACY NATIONAL INSTITUTE FOR LITERACY: LITERACY...

  10. 20 CFR 702.205 - Employer's report; effect of failure to report upon time limitations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 20 Employees' Benefits 4 2014-04-01 2014-04-01 false Employer's report; effect of failure to report upon time limitations. 702.205 Section 702.205 Employees' Benefits OFFICE OF WORKERS' COMPENSATION PROGRAMS, DEPARTMENT OF LABOR LONGSHOREMEN'S AND HARBOR WORKERS' COMPENSATION ACT AND RELATED STATUTES ADMINISTRATION AND PROCEDURE Claims Procedure...

  11. 20 CFR 702.205 - Employer's report; effect of failure to report upon time limitations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 20 Employees' Benefits 4 2012-04-01 2012-04-01 false Employer's report; effect of failure to report upon time limitations. 702.205 Section 702.205 Employees' Benefits OFFICE OF WORKERS' COMPENSATION PROGRAMS, DEPARTMENT OF LABOR LONGSHOREMEN'S AND HARBOR WORKERS' COMPENSATION ACT AND RELATED STATUTES ADMINISTRATION AND PROCEDURE Claims Procedure...

  12. 20 CFR 702.205 - Employer's report; effect of failure to report upon time limitations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 20 Employees' Benefits 4 2013-04-01 2013-04-01 false Employer's report; effect of failure to report upon time limitations. 702.205 Section 702.205 Employees' Benefits OFFICE OF WORKERS' COMPENSATION PROGRAMS, DEPARTMENT OF LABOR LONGSHOREMEN'S AND HARBOR WORKERS' COMPENSATION ACT AND RELATED STATUTES ADMINISTRATION AND PROCEDURE Claims Procedure...

  13. 20 CFR 702.205 - Employer's report; effect of failure to report upon time limitations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 20 Employees' Benefits 3 2011-04-01 2011-04-01 false Employer's report; effect of failure to report upon time limitations. 702.205 Section 702.205 Employees' Benefits OFFICE OF WORKERS' COMPENSATION PROGRAMS, DEPARTMENT OF LABOR LONGSHOREMEN'S AND HARBOR WORKERS' COMPENSATION ACT AND RELATED STATUTES ADMINISTRATION AND PROCEDURE Claims Procedure...

  14. The impact of the 2007-2009 recession on workers' health coverage.

    PubMed

    Fronstin, Paul

    2011-04-01

    IMPACT OF THE RECESSION: The 2007-2009 recession has taken its toll on the percentage of the population with employment-based health coverage. While, since 2000, there has been a slow erosion in the percentage of individuals under age 65 with employment-based health coverage, 2009 was the first year in which the percentage fell below 60 percent, and marked the largest one-year decline in coverage. FEWER WORKERS WITH COVERAGE: The percentage of workers with coverage through their own job fell from 53.2 percent in 2008 to 52 percent in 2009, a 2.4 percent decline in the likelihood that a worker has coverage through his or her own job. The percentage of workers with coverage as a dependent fell from 17 percent in 2008 to 16.3 percent in 2009, a 4.5 percent drop in the likelihood that a worker has coverage as a dependent. These declines occurred as the unemployment rate increased from an average of 5.8 percent in 2008 to 9.3 percent in 2009 (and reached a high of 10.1 percent during 2009). FIRM SIZE/INDUSTRY: The decline in the percentage of workers with coverage from their own job affected workers in private-sector firms of all sizes. Among public-sector workers, the decline from 73.4 percent to 73 percent was not statistically significant. Workers in all private-sector industries experienced a statistically significant decline in coverage between 2008 and 2009. HOURS WORKED: Full-time workers experienced a decline in coverage that was statistically significant while part-time workers did not. Among full-time workers, those employed full year experienced a statistically significant decline in coverage from their own job. Those employed full time but for only part of the year did not experience a statistically significant change in coverage. Among part-time workers, those employed full year experienced a statistically significant increase in the likelihood of having coverage in their own name, as did part-time workers employed for only part of the year. 
ANNUAL EARNINGS: The decline in the percentage of workers with coverage through their own job was limited to workers with lower annual earnings. Statistically significant declines were not found among any group of workers with annual earnings of at least $40,000. Workers with a high school education or less experienced a statistically significant decline in the likelihood of having coverage. Neither workers with a college degree nor those with a graduate degree experienced a statistically significant decline in coverage through their own job. Workers of all races experienced statistically significant declines in coverage between 2008 and 2009. Both men and women experienced a statistically significant decline in the percentage with health coverage through their own job. IMPACT OF STRUCTURAL CHANGES TO THE WORK FORCE: The movement of workers from the manufacturing industry to the service sector continued between 2008 and 2009. The percentage of workers employed on a full-time basis decreased while the percentage working part time increased. While there was an overall decline in the percentage of full-time workers, that decline was limited to workers employed full year. The percentage of workers employed on a full-time, part-year basis increased between 2008 and 2009. The distribution of workers by annual earnings shifted from middle-income workers to lower-income workers between 2008 and 2009.

  15. 5 CFR 1304.4605 - Post-employment restrictions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Post-employment restrictions. 1304.4605 Section 1304.4605 Administrative Personnel OFFICE OF MANAGEMENT AND BUDGET ADMINISTRATIVE PROCEDURES POST EMPLOYMENT CONFLICT OF INTEREST § 1304.4605 Post-employment restrictions. (a) General Restrictions Applicable...

  16. 28 CFR 42.733 - Enforcement procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Enforcement procedures. 42.733 Section 42.733 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY...; Implementation of the Age Discrimination Act of 1975 Compliance Procedures § 42.733 Enforcement procedures. (a...

  17. 4 CFR 28.12 - General Counsel procedures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false General Counsel procedures. 28.12 Section 28.12 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE...

  18. 4 CFR 28.8 - Informal procedural advice.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Informal procedural advice. 28.8 Section 28.8 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE...

  19. 20 CFR 633.316 - Closeout procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false Closeout procedures. 633.316 Section 633.316 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION, DEPARTMENT OF LABOR MIGRANT AND SEASONAL FARMWORKER PROGRAMS Program Design and Administrative Procedures § 633.316 Closeout procedures. Grant...

  20. Multivariate methods to visualise colour-space and colour discrimination data.

    PubMed

    Hastings, Gareth D; Rubin, Alan

    2015-01-01

    Despite most modern colour spaces treating colour as three-dimensional (3-D), colour data is usually not visualised in 3-D (and two-dimensional (2-D) projection-plane segments and multiple 2-D perspective views are used instead). The objectives of this article are firstly, to introduce a truly 3-D percept of colour space using stereo-pairs, secondly to view colour discrimination data using that platform, and thirdly to apply formal statistics and multivariate methods to analyse the data in 3-D. This is the first demonstration of the software that generated stereo-pairs of RGB colour space, as well as of a new computerised procedure that investigated colour discrimination by measuring colour just noticeable differences (JND). An initial pilot study and thorough investigation of instrument repeatability were performed. Thereafter, to demonstrate the capabilities of the software, five colour-normal and one colour-deficient subject were examined using the JND procedure and multivariate methods of data analysis. Scatter plots of responses were meaningfully examined in 3-D and were useful in evaluating multivariate normality as well as identifying outliers. The extent and direction of the difference between each JND response and the stimulus colour point was calculated and appreciated in 3-D. Ellipsoidal surfaces of constant probability density (distribution ellipsoids) were fitted to response data; the volumes of these ellipsoids appeared useful in differentiating the colour-deficient subject from the colour-normals. Hypothesis tests of variances and covariances showed many statistically significant differences between the results of the colour-deficient subject and those of the colour-normals, while far fewer differences were found when comparing within colour-normals. 
The 3-D visualisation of colour data using stereo-pairs, as well as the statistics and multivariate methods of analysis employed, were found to be unique and useful tools in the representation and study of colour. Many additional studies using these methods along with the JND and other procedures have been identified and will be reported in future publications. © 2014 The Authors Ophthalmic & Physiological Optics © 2014 The College of Optometrists.
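The ellipsoid volumes used above to differentiate subjects follow directly from the fitted covariance matrix: an ellipsoid of constant probability density at squared Mahalanobis radius c has volume (4/3)πc^(3/2)√det(Σ) in three dimensions. A minimal numpy sketch with hypothetical JND response data (the constant 7.815 is the 95% chi-square quantile with 3 degrees of freedom; the spreads are invented for illustration):

```python
import numpy as np

def ellipsoid_volume(points, chi2_crit=7.815):
    """Volume of the constant-density ellipsoid enclosing ~95% of a
    trivariate normal fitted to `points` (rows = responses in RGB space)."""
    cov = np.cov(points, rowvar=False)
    return 4.0 / 3.0 * np.pi * chi2_crit ** 1.5 * np.sqrt(np.linalg.det(cov))

rng = np.random.default_rng(3)
# Hypothetical JND responses: "colour-normal" observers tightly clustered,
# a "deficient" observer with much larger spread along one colour axis.
normal = rng.multivariate_normal([128, 128, 128], np.diag([4, 4, 4]), size=200)
deficient = rng.multivariate_normal([128, 128, 128], np.diag([4, 64, 4]), size=200)
```

A larger fitted ellipsoid volume for the deficient observer reflects coarser discrimination along the affected axis, which is the comparison the abstract describes.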

  1. Analyzing longitudinal data with the linear mixed models procedure in SPSS.

    PubMed

    West, Brady T

    2009-09-01

    Many applied researchers analyzing longitudinal data share a common misconception: that specialized statistical software is necessary to fit hierarchical linear models (also known as linear mixed models [LMMs], or multilevel models) to longitudinal data sets. Although several specialized statistical software programs of high quality are available that allow researchers to fit these models to longitudinal data sets (e.g., HLM), rapid advances in general purpose statistical software packages have recently enabled analysts to fit these same models when using preferred packages that also enable other more common analyses. One of these general purpose statistical packages is SPSS, which includes a very flexible and powerful procedure for fitting LMMs to longitudinal data sets with continuous outcomes. This article aims to present readers with a practical discussion of how to analyze longitudinal data using the LMMs procedure in the SPSS statistical software package.

  2. Detecting Non-Gaussian and Lognormal Characteristics of Temperature and Water Vapor Mixing Ratio

    NASA Astrophysics Data System (ADS)

    Kliewer, A.; Fletcher, S. J.; Jones, A. S.; Forsythe, J. M.

    2017-12-01

    Many operational data assimilation and retrieval systems assume that the errors and variables come from a Gaussian distribution. This study builds upon previous results showing that positive definite variables, specifically water vapor mixing ratio and temperature, can follow a non-Gaussian distribution and moreover a lognormal distribution. Previously, statistical testing procedures which included the Jarque-Bera test, the Shapiro-Wilk test, the Chi-squared goodness-of-fit test, and a composite test which incorporated the results of the former tests were employed to determine locations and time spans where atmospheric variables assume a non-Gaussian distribution. These tests are now investigated in a "sliding window" fashion in order to extend the testing procedure to near real-time. The analyzed 1-degree resolution data comes from the National Oceanic and Atmospheric Administration (NOAA) Global Forecast System (GFS) six hour forecast from the 0Z analysis. These results indicate the necessity of a Data Assimilation (DA) system to be able to properly use the lognormally-distributed variables in an appropriate Bayesian analysis that does not assume the variables are Gaussian.
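A sliding-window version of two of the cited normality tests can be sketched with scipy; the series, window length, and step below are invented for illustration (the study's GFS fields are gridded, not a single series):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical positive-definite series standing in for a mixing-ratio
# time series: lognormal draws, which should fail Gaussianity tests.
series = rng.lognormal(mean=0.0, sigma=0.6, size=500)

def sliding_normality(x, window=100, step=50, alpha=0.05):
    """Run Shapiro-Wilk and Jarque-Bera tests over sliding windows.

    Returns (start_index, shapiro_p, jarque_bera_p, non_gaussian) tuples,
    where non_gaussian is True if either test rejects at level alpha."""
    results = []
    for start in range(0, len(x) - window + 1, step):
        w = x[start:start + window]
        sw_stat, sw_p = stats.shapiro(w)
        jb_stat, jb_p = stats.jarque_bera(w)
        results.append((start, sw_p, jb_p, sw_p < alpha or jb_p < alpha))
    return results

flags = sliding_normality(series)
# After a log transform the same windows should look Gaussian again,
# which is the lognormal signature the paper exploits.
flags_log = sliding_normality(np.log(series))
```

Flagged windows mark the locations and time spans where a Gaussian error assumption is questionable and a lognormal treatment in the DA system would be more appropriate.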

  3. Using data mining to segment healthcare markets from patients' preference perspectives.

    PubMed

    Liu, Sandra S; Chen, Jie

    2009-01-01

    This paper aims to provide an example of how to use data mining techniques to identify patient segments regarding preferences for healthcare attributes and their demographic characteristics. Data were derived from a number of individuals who received in-patient care at a health network in 2006. Data mining and conventional hierarchical clustering with average linkage and Pearson correlation procedures are employed and compared to show how each procedure best determines segmentation variables. Data mining tools identified three differentiable segments by means of cluster analysis. These three clusters have significantly different demographic profiles. The study reveals, when compared with traditional statistical methods, that data mining provides an efficient and effective tool for market segmentation. When there are numerous cluster variables involved, researchers and practitioners need to incorporate factor analysis for reducing variables to clearly and meaningfully understand clusters. Interests and applications in data mining are increasing in many businesses. However, this technology is seldom applied to healthcare customer experience management. The paper shows that efficient and effective application of data mining methods can aid the understanding of patient healthcare preferences.
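The conventional arm of the comparison, hierarchical clustering with average linkage on Pearson-correlation distances, can be sketched with scipy. The patient preference ratings below are synthetic; the paper's actual variables and data are not public:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
# Hypothetical preference ratings (rows = patients, cols = attributes),
# drawn around two distinct preference profiles.
group_a = rng.normal([8, 2, 5, 7], 0.5, size=(20, 4))
group_b = rng.normal([2, 8, 6, 3], 0.5, size=(20, 4))
ratings = np.vstack([group_a, group_b])

# Pearson-correlation distance (1 - r) between patients, average linkage.
dist = pdist(ratings, metric="correlation")
tree = linkage(dist, method="average")

# Cut the dendrogram into two segments.
labels = fcluster(tree, t=2, criterion="maxclust")
```

Profiling each resulting segment against demographic variables is then the step at which, as the paper notes, factor analysis may help reduce numerous cluster variables to an interpretable set.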

  4. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potential harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. Firstly, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture the abrupt change along the wavelength. Principal component analysis is then employed to approximate the spectra based on capture and fusion features. The Hotelling's T² statistic is calculated and used to detect outliers. A contamination-event alarm is triggered by sequential Bayesian analysis when the outliers appear continuously in several observations. The effectiveness of the proposed procedure is tested on-line using a pilot-scale setup and experimental data.
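The detection chain (wavelet transform, PCA approximation, Hotelling's T² against a control limit) can be sketched with numpy alone. This sketch substitutes a one-level Haar transform for the paper's coiflet mother wavelet, an empirical percentile for the control limit, and simulated spectra with a deliberately large contaminant peak; it omits the sequential Bayesian alarm stage:

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_wave = 200, 64

# Simulated UV absorbance spectra: a smooth baseline plus sensor noise.
wave = np.linspace(0.0, 1.0, n_wave)
baseline = np.exp(-4.0 * wave)
train = baseline + 0.01 * rng.normal(size=(n_train, n_wave))

def haar_features(x):
    """One-level Haar transform (a stand-in for the paper's coiflet):
    concatenated approximation and detail coefficients, last axis."""
    even, odd = x[..., ::2], x[..., 1::2]
    return np.concatenate([(even + odd) / np.sqrt(2),
                           (even - odd) / np.sqrt(2)], axis=-1)

# PCA of the wavelet features of clean training spectra.
F = haar_features(train)
mu = F.mean(axis=0)
U, s, Vt = np.linalg.svd(F - mu, full_matrices=False)
k = 10                                   # retained principal components
eigvals = s[:k] ** 2 / (n_train - 1)     # eigenvalues of the sample covariance

def t2(x):
    """Hotelling's T-squared of a spectrum in the retained PC subspace."""
    scores = (haar_features(x) - mu) @ Vt[:k].T
    return np.sum(scores ** 2 / eigvals, axis=-1)

limit = np.percentile(t2(train), 99)     # empirical 99% control limit

clean = baseline + 0.01 * rng.normal(size=n_wave)
event = clean + 0.5 * np.exp(-((wave - 0.5) ** 2) / 0.002)  # contaminant peak
```

A new spectrum whose T² exceeds the control limit is treated as an outlier; the paper then requires several consecutive outliers, weighed by sequential Bayesian analysis, before raising an alarm.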

  5. [Prevention of post-operative pain and haemorrhage in PPH (Procedure for Prolapse and Hemorrhoids) and STARR (Stapled Trans-Anal Rectal Resection). Preliminary results in 261 cases].

    PubMed

    Mongardini, M; Custureri, F; Schillaci, F; Cola, A; Maturo, A; Fanello, G; Corelli, S; Pappalardo, G

    2005-04-01

    Intra- and early (first week) post-operative haemorrhages are the most common complications in stapled hemorrhoidectomy PPH (Procedure for Prolapse and Hemorrhoids) and in circumferential resection of the rectal prolapse STARR (Stapled Trans Anal Rectal Resection). Performing PPH and STARR we employed a gelatin-based haemostatic sealant with thrombin component (FloSeal) to control intra-operative bleeding and to reduce post-operative bleeding avoiding haemostatic stitches on the suture line. We report the preliminary results on 197 PPH and 64 STARR; 44 PPH (22.4%) and 27 STARR (42.2%) were treated by FloSeal. No major post-operative bleeding was observed in all patients treated by FloSeal, compared to 1.3% and 2.7% of hemorrhage respectively in PPH and STARR patients treated without sealant. Post-operative pain was less severe in patients treated by FloSeal, although the difference was not statistically significant. The data are preliminary and must be confirmed in prospective randomized trials in larger series.

  6. Divers-Operated Underwater Photogrammetry: Applications in the Study of Antarctic Benthos

    NASA Astrophysics Data System (ADS)

    Piazza, P.; Cummings, V.; Lohrer, D.; Marini, S.; Marriott, P.; Menna, F.; Nocerino, E.; Peirano, A.; Schiaparelli, S.

    2018-05-01

    Ecological studies about marine benthic communities received a major boost from the application of a variety of non-destructive sampling and mapping techniques based on underwater image and video recording. The well-established scientific diving practice consists of acquiring single-path or `round-trip' imagery over elongated transects, with the imaging device oriented in a nadir-looking direction. As may be expected, the application of automatic image processing procedures to data not specifically acquired for 3D modelling can be risky, especially if proper tools for assessing the quality of the produced results are not employed. This paper, born from an international cooperation, focuses on this topic, which is of great interest for ecological and monitoring benthic studies in Antarctica. Several video footages recorded by different scientific teams in different years are processed with an automatic photogrammetric procedure and salient statistical features are reported to critically analyse the derived results. As expected, the inclusion of oblique images from additional lateral strips may improve the expected accuracy in the object space, without altering current video-recording practices too much.

  7. Ultraclean air for prevention of postoperative infection after posterior spinal fusion with instrumentation: a comparison between surgeries performed with and without a vertical exponential filtered air-flow system.

    PubMed

    Gruenberg, Marcelo F; Campaner, Gustavo L; Sola, Carlos A; Ortolan, Eligio G

    2004-10-15

    This study retrospectively compared infection rates between adult patients after posterior spinal instrumentation procedures performed in a conventional versus an ultraclean air operating room. To evaluate if the use of ultraclean air technology could decrease the infection rate after posterior spinal arthrodesis with instrumentation. Postoperative wound infection after posterior arthrodesis remains a feared complication in spinal surgery. Although this frequent complication results in a significant problem, the employment of ultraclean air technology, as it is commonly used for arthroplasty, has not been reported as a possible alternative to reduce the infection rate after complex spine surgery. One hundred seventy-nine patients having posterior spinal fusion with instrumentation were divided into 2 groups: group I included 139 patients operated in a conventional operating room, and group II included 40 patients operated in a vertical laminar flow operating room. Patient selection was performed favoring ultraclean air technology for elective cases in which high infection risk was considered. A statistical analysis of the infection rate and its associated risk factors between both groups was assessed. We observed 18 wound infections in group I and 0 in group II. Comparison of infection rates using the chi-squared test showed a statistically significant difference (P <0.017). The use of ultraclean air technology reduced the infection rate after complex spinal procedures and appears to be an interesting alternative that still needs to be prospectively studied with a randomized protocol.
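The reported comparison (18/139 infections in the conventional theatre versus 0/40 under laminar flow) can be checked with a chi-squared test in scipy. Note that scipy applies Yates' continuity correction to 2x2 tables by default, and that with an expected count near 4 in one cell Fisher's exact test would also be a reasonable choice:

```python
from scipy.stats import chi2_contingency

# Counts reported in the abstract: rows are operating-room groups,
# columns are [infected, not infected].
table = [[18, 139 - 18],
         [0, 40]]

# Default call uses Yates' continuity correction for 2x2 tables;
# correction=False reproduces the plain (uncorrected) chi-squared test.
chi2_corr, p_corr, dof, expected = chi2_contingency(table)
chi2_plain, p_plain, _, _ = chi2_contingency(table, correction=False)
```

The uncorrected p-value is consistent with the P < 0.017 reported in the abstract; the continuity-corrected p-value is somewhat larger but still below 0.05.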

  8. [The register of exposed workers to carcinogens: legislative framework and data analysis].

    PubMed

    Scarselli, A; Di Marzio, D; Marinaccio, A; Iavicoli, S

    2010-01-01

    On the basis of the law which introduced the registration of occupational exposures to carcinogens (Legislative Decree 626/94), the National Institute for Occupational Safety and Prevention designed and implemented an information system for collecting and recording such information. The Ministry of Health Decree No 155/2007, which established the procedures for record keeping and transmission of registers of exposed workers, regulated the legislative framework in this field. The aim of the study was to illustrate some of the major legislative issues and to provide summary statistics, after one year of entry into force of this Decree. The main information to record is: the carcinogenic agents used, the type of occupational exposure and data on the environmental measurements. Descriptive statistical analyses were carried out, by sector of economic activity, carcinogen agent and worker's occupation. As at 31 December 2008 the information recorded, altogether, covered: 6000 firms, 79,000 workers, 164,000 exposures and 100,000 measurements. Most of the exposures occurred in the manufacturing and construction industries and in commercial activities. Such a surveillance system, established as a result of the institution of exposure registers, makes it possible to plan analytical studies, both for monitoring the effects of exposure, even at low doses, and for assessing the prevention and protection measures. It is hoped that the recent readjustment law (Legislative Decree 81/2008) will promote awareness of all subjects involved in the recording procedures (employers, physicians, local health units, research institutes, etc.), thus increasing the quality and coverage of data transmission.

  9. Omnibus Risk Assessment via Accelerated Failure Time Kernel Machine Modeling

    PubMed Central

    Sinnott, Jennifer A.; Cai, Tianxi

    2013-01-01

    Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai et al., 2011). In this paper, we derive testing and prediction methods for KM regression under the accelerated failure time model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. PMID:24328713
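A heavily simplified sketch of the ingredients, a kernel score statistic, a resampling null distribution, and a minimum-p omnibus combination across kernels, is below. It uses an intercept-only null and uncensored outcomes rather than the paper's accelerated failure time model, and all data are simulated:

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 80, 2
X = rng.normal(size=(n, p))
# Hypothetical outcome: a smooth nonlinear effect of the first marker plus
# noise. A linear kernel is blind to a pure quadratic effect, which is the
# kind of situation that motivates combining kernels in an omnibus test.
y = X[:, 0] ** 2 + 0.3 * rng.normal(size=n)

d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K_rbf = np.exp(-d2 / np.median(d2[d2 > 0]))   # median-heuristic bandwidth
K_lin = X @ X.T

def score_stat(y, K):
    r = y - y.mean()          # residuals under the intercept-only null
    return r @ K @ r

def perm_pvalue(y, K, n_perm=500, seed=0):
    """Permutation stand-in for the paper's resampling null distribution."""
    prng = np.random.default_rng(seed)
    obs = score_stat(y, K)
    null = [score_stat(prng.permutation(y), K) for _ in range(n_perm)]
    return (1 + sum(q >= obs for q in null)) / (n_perm + 1)

pvals = {"rbf": perm_pvalue(y, K_rbf), "linear": perm_pvalue(y, K_lin)}
# Crude omnibus: Bonferroni-adjusted minimum p-value across kernels.
p_omnibus = min(len(pvals) * min(pvals.values()), 1.0)
```

The paper's actual procedure fits the KM regression under the AFT model and uses a more refined resampling scheme and omnibus combination; the sketch only illustrates why testing several kernels and combining their evidence is attractive.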

  10. Skin Mast Cell Promotion in Random Skin Flaps in Rats using Bone Marrow Mesenchymal Stem Cells and Amniotic Membrane

    PubMed

    Chehelcheraghi, Farzaneh; Abbaszadeh, Abolfazl; Tavafi, Magid

    2018-03-06

    Skin flap procedures are employed in plastic surgery, but failure can lead to necrosis of the flap. Studies have used bone marrow mesenchymal stem cells (BM-MSCs) to improve flap viability. BM-MSCs and acellular amniotic membrane (AAM) have been introduced as alternatives. The objective of this study was to evaluate the effect of BM-MSCs and AAM on mast cells of random skin flaps (RSF) in rats. RSFs (80 × 30 mm) were created on 40 rats that were randomly assigned to one of four groups: (I) AAM, (II) BM-MSCs, (III) BM-MSCs/AAM, and (IV) saline (control). Transplantation was carried out during the procedure (day zero). Flap necrosis was observed on day 7, and skin samples were collected from the transition line of the flap to evaluate the total number and types of mast cells. The development and the total number of mast cells were related to the development of capillaries. The results of one-way ANOVA indicated no statistically significant difference between the mean numbers of mast cell types across the study groups. However, the difference in the total number of mast cells between the study groups was statistically significant (p = 0.001). The present study suggests that the use of AAM/BM-MSCs can increase the total number of mast cells and accelerate capillary growth at the transition site in RSFs in rats.
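
The one-way ANOVA reported here compares one continuous measure across the four treatment groups. A minimal sketch with SciPy follows; the counts below are hypothetical illustration data, not the study's measurements.

```python
import numpy as np
from scipy import stats

# Hypothetical mast-cell counts per group (NOT the study's data),
# four groups as in the design: AAM, BM-MSCs, BM-MSCs/AAM, saline.
rng = np.random.default_rng(1)
groups = {
    "AAM":         rng.normal(22, 4, 10),
    "BM-MSCs":     rng.normal(24, 4, 10),
    "BM-MSCs/AAM": rng.normal(30, 4, 10),
    "saline":      rng.normal(18, 4, 10),
}

# One-way ANOVA: does at least one group mean differ?
f_stat, p_value = stats.f_oneway(*groups.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```

A significant omnibus F (as with the study's p = 0.001 for total counts) only says the means are not all equal; pairwise post-hoc comparisons would be needed to say which groups differ.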

  11. Case-based statistical learning applied to SPECT image classification

    NASA Astrophysics Data System (ADS)

    Górriz, Juan M.; Ramírez, Javier; Illán, I. A.; Martínez-Murcia, Francisco J.; Segovia, Fermín.; Salas-Gonzalez, Diego; Ortiz, A.

    2017-03-01

    Statistical learning and decision theory play a key role in many areas of science and engineering. Some examples include time series regression and prediction, optical character recognition, signal detection in communications, and biomedical applications for diagnosis and prognosis. This paper deals with the topic of learning from biomedical image data in the classification problem. In a typical scenario we have a training set that is employed to fit a prediction model or learner, and a testing set to which the learner is applied in order to predict the outcome for new unseen patterns. Both processes are usually kept completely separate to avoid over-fitting and because, in practice, the unseen new objects (testing set) have unknown outcomes. However, the outcome takes one of a discrete set of values, e.g. in the binary diagnosis problem. Thus, assumptions on these outcome values can be established to obtain the most likely prediction model at the training stage, which could improve the overall classification accuracy on the testing set, or at least keep its performance at the level of the selected statistical classifier. In this sense, a novel case-based learning (c-learning) procedure is proposed which combines hypothesis testing from a discrete set of expected outcomes and a cross-validated classification stage.

  12. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 1: Methodology and applications

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.
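
The core idea of propagating parameter uncertainty through a failure model to a failure probability distribution can be illustrated with a toy Monte Carlo stress-strength calculation. This is a minimal sketch of the general technique only, assuming made-up lognormal parameter distributions; it is not the PFA software or its statistical updating procedures.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Toy stress-strength failure model (illustrative only, not the PFA code):
# failure occurs when applied stress exceeds material strength.
# Parameter uncertainty is expressed as probability distributions.
stress   = rng.lognormal(mean=np.log(400), sigma=0.10, size=n)  # MPa
strength = rng.lognormal(mean=np.log(600), sigma=0.08, size=n)  # MPa

# Monte Carlo estimate of the failure probability for this failure mode.
p_fail = np.mean(stress > strength)
print(f"Estimated failure probability: {p_fail:.2e}")
```

In the PFA framework such a prior failure probability distribution would then be updated with test and flight experience; that Bayesian-style updating step is not reproduced here.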

  13. An improved approach for flight readiness certification: Probabilistic models for flaw propagation and turbine blade failure. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with analytical modeling of failure phenomena to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in analytical modeling, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which analytical models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. State-of-the-art analytical models currently employed for design, failure prediction, or performance analysis are used in this methodology. The rationale for the statistical approach taken in the PFA methodology is discussed, the PFA methodology is described, and examples of its application to structural failure modes are presented. The engineering models and computer software used in fatigue crack growth and fatigue crack initiation applications are thoroughly documented.

  14. Predictors of surgeons' efficiency in the operating rooms.

    PubMed

    Nakata, Yoshinori; Watanabe, Yuichi; Narimatsu, Hiroto; Yoshimura, Tatsuya; Otake, Hiroshi; Sawa, Tomohiro

    2017-02-01

    The sustainability of the Japanese healthcare system is questionable because of a huge fiscal debt. One of the solutions is to improve the efficiency of healthcare. The purpose of this study is to determine what factors are predictive of surgeons' efficiency scores. The authors collected data from all the surgical procedures performed at Teikyo University Hospital from April 1 through September 30 in 2013-2015. The output-oriented Charnes-Cooper-Rhodes model of data envelopment analysis was employed to calculate each surgeon's efficiency score. Seven independent variables that may predict efficiency scores were selected: experience, medical school, surgical volume, gender, academic rank, surgical specialty, and the surgical fee schedule. Multiple regression analysis using a random-effects Tobit model was applied to the panel data. Data from a total of 8722 surgical cases were obtained over the 18-month study period. The authors analyzed 134 surgeons. The only statistically significant coefficients were surgical specialty and the surgical fee schedule (p = 0.000 and p = 0.016, respectively). Experience had some positive association with efficiency scores but did not reach statistical significance (p = 0.062). The other coefficients were not statistically significant. These results demonstrate that the surgical reimbursement system, not surgeons' personal characteristics, is a significant predictor of surgeons' efficiency.
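
The output-oriented CCR model can be solved as a linear program per decision-making unit (DMU). The sketch below uses the standard envelopment form with hypothetical DMUs and made-up inputs/outputs; it is not the study's specification (which also involved a second-stage Tobit regression on the scores).

```python
import numpy as np
from scipy.optimize import linprog

# Toy DEA example (hypothetical DMUs, not the hospital data): each
# "surgeon" (DMU) consumes inputs (e.g., OR minutes, staff) and
# produces outputs (e.g., surgical fee units).
X = np.array([[8.0, 4.0], [10.0, 5.0], [12.0, 6.0]])  # inputs, rows = DMUs
Y = np.array([[6.0], [8.0], [7.0]])                    # outputs

def ccr_output_oriented(X, Y, j):
    """Envelopment form: max phi s.t. sum lam*X <= x_j, sum lam*Y >= phi*y_j."""
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: [phi, lam_1..lam_n]; linprog minimizes, so use -phi.
    c = np.concatenate(([-1.0], np.zeros(n)))
    # Input constraints: sum_k lam_k X[k, i] <= X[j, i]
    A_in = np.hstack([np.zeros((m, 1)), X.T])
    b_in = X[j]
    # Output constraints: phi * Y[j, r] - sum_k lam_k Y[k, r] <= 0
    A_out = np.hstack([Y[j][:, None], -Y.T])
    b_out = np.zeros(s)
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + n))
    return res.x[0]  # phi >= 1; the efficiency score is 1/phi

for j in range(len(X)):
    phi = ccr_output_oriented(X, Y, j)
    print(f"DMU {j}: phi = {phi:.3f}, efficiency = {1/phi:.3f}")
```

Here phi is the largest proportional output expansion feasible within the observed technology; a DMU on the frontier gets phi = 1 (efficiency 1).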

  15. The Effects of Statistical Multiplicity of Infection on Virus Quantification and Infectivity Assays.

    PubMed

    Mistry, Bhaven A; D'Orsogna, Maria R; Chou, Tom

    2018-06-19

    Many biological assays are employed in virology to quantify parameters of interest. Two such classes of assays, virus quantification assays (VQAs) and infectivity assays (IAs), aim to estimate the number of viruses present in a solution and the ability of a viral strain to successfully infect a host cell, respectively. VQAs operate at extremely dilute concentrations, and results can be subject to stochastic variability in virus-cell interactions. At the other extreme, high viral-particle concentrations are used in IAs, resulting in large numbers of viruses infecting each cell, enough for measurable change in total transcription activity. Furthermore, host cells can be infected at any concentration regime by multiple particles, resulting in a statistical multiplicity of infection and yielding potentially significant variability in the assay signal and parameter estimates. We develop probabilistic models for statistical multiplicity of infection at low and high viral-particle-concentration limits and apply them to the plaque (VQA), endpoint dilution (VQA), and luciferase reporter (IA) assays. A web-based tool implementing our models and analysis is also developed and presented. We test our proposed new methods for inferring experimental parameters from data using numerical simulations and show improvement on existing procedures in all limits. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
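
The basic statistical-multiplicity-of-infection picture rests on the standard Poisson model: at average MOI m, the number of virions entering a given cell is approximately Poisson(m). A minimal numerical sketch of that textbook model follows; it does not reproduce the paper's full assay-specific machinery.

```python
import numpy as np
from scipy import stats

# Poisson model of statistical multiplicity of infection: with average
# MOI m, the number of virions entering a given cell is ~ Poisson(m).
m = 1.0

p_uninfected = stats.poisson.pmf(0, m)        # exp(-m)
p_infected   = 1 - p_uninfected               # at least one virion
p_multiple   = 1 - stats.poisson.cdf(1, m)    # two or more virions

print(f"P(no infection)     = {p_uninfected:.3f}")  # 0.368
print(f"P(infected)         = {p_infected:.3f}")    # 0.632
print(f"P(multiple virions) = {p_multiple:.3f}")    # 0.264
```

Even at MOI 1, roughly a quarter of cells receive multiple virions, which is why multiplicity matters for both endpoint-dilution estimates and reporter-signal interpretation.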

  16. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples. Volume 2: Software documentation

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  17. Bridging stylized facts in finance and data non-stationarities

    NASA Astrophysics Data System (ADS)

    Camargo, Sabrina; Duarte Queirós, Sílvio M.; Anteneodo, Celia

    2013-04-01

    Employing a recent technique which allows the representation of nonstationary data by means of a juxtaposition of locally stationary patches of different length, we introduce a comprehensive analysis of the key observables in a financial market: the trading volume and the price fluctuations. From the segmentation procedure we are able to introduce a quantitative description of statistical features of these two quantities, which are often named stylized facts, namely the tails of the distributions of trading volume and price fluctuations, a dynamics compatible with the U-shaped profile of the volume in a trading session, and the slow decay of the autocorrelation function. The segmentation of the trading volume series provides evidence of slow evolution of the fluctuating parameters of each patch, pointing to the mixing scenario. Assuming that long-term features are the outcome of a statistical mixture of simple local forms, we test and compare different probability density functions to provide the long-term distribution of the trading volume, concluding that the log-normal gives the best agreement with the empirical distribution. Moreover, the segmentation results for the magnitude of the price fluctuations are quite different from those for the trading volume, indicating that changes in the statistics of price fluctuations occur on a faster scale than in the case of trading volume.
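
The "test and compare different probability density functions" step can be sketched as fitting candidate long-term distributions and comparing their Kolmogorov-Smirnov distances to the empirical distribution. The sample below is synthetic (a mixture of lognormals mimicking slowly varying local parameters), not market data, and the candidate set is an illustrative assumption.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# Synthetic "trading volume": a mixture of lognormals, standing in for
# a juxtaposition of locally stationary patches with drifting parameters.
vol = np.concatenate([rng.lognormal(3.0, 0.4, 2000),
                      rng.lognormal(3.3, 0.5, 2000)])

# Fit candidate long-term distributions and compare via the KS distance.
candidates = {"lognorm": stats.lognorm, "gamma": stats.gamma}
ks_results = {}
for name, dist in candidates.items():
    params = dist.fit(vol, floc=0)              # fix location at zero
    ks_results[name] = stats.kstest(vol, name, args=params).statistic
    print(f"{name:8s} KS distance = {ks_results[name]:.4f}")
```

A smaller KS distance indicates better agreement with the empirical distribution; the paper's conclusion corresponds to the lognormal winning such a comparison on real volume data.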

  18. An improved approach for flight readiness certification: Methodology for failure risk assessment and application examples, volume 1

    NASA Technical Reports Server (NTRS)

    Moore, N. R.; Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Creager, M.

    1992-01-01

    An improved methodology for quantitatively evaluating failure risk of spaceflight systems to assess flight readiness and identify risk control measures is presented. This methodology, called Probabilistic Failure Assessment (PFA), combines operating experience from tests and flights with engineering analysis to estimate failure risk. The PFA methodology is of particular value when information on which to base an assessment of failure risk, including test experience and knowledge of parameters used in engineering analyses of failure phenomena, is expensive or difficult to acquire. The PFA methodology is a prescribed statistical structure in which engineering analysis models that characterize failure phenomena are used conjointly with uncertainties about analysis parameters and/or modeling accuracy to estimate failure probability distributions for specific failure modes. These distributions can then be modified, by means of statistical procedures of the PFA methodology, to reflect any test or flight experience. Conventional engineering analysis models currently employed for design or failure prediction are used in this methodology. The PFA methodology is described and examples of its application are presented. Conventional approaches to failure risk evaluation for spaceflight systems are discussed, and the rationale for the approach taken in the PFA methodology is presented. The statistical methods, engineering models, and computer software used in fatigue failure mode applications are thoroughly documented.

  19. 49 CFR 40.11 - What are the general responsibilities of employers under this regulation?

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 49 Transportation 1 2010-10-01 2010-10-01 false What are the general responsibilities of employers... PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Employer Responsibilities § 40.11 What are the general responsibilities of employers under this regulation? (a) As an employer, you are...

  20. Updated Intensity - Duration - Frequency Curves Under Different Future Climate Scenarios

    NASA Astrophysics Data System (ADS)

    Ragno, E.; AghaKouchak, A.

    2016-12-01

    Current infrastructure design procedures rely on the use of Intensity - Duration - Frequency (IDF) curves retrieved under the assumption of temporal stationarity, meaning that occurrences of extreme events are expected to be time invariant. However, numerous studies have observed more severe extreme events over time. Hence, the stationarity assumption for extreme value analysis may not be appropriate in a warming climate. This issue raises concerns regarding the safety and resilience of existing and future infrastructure. Here we employ historical and projected (RCP 8.5) CMIP5 runs to investigate IDF curves of 14 urban areas across the United States. We first statistically assess changes in precipitation extremes using an energy-based test for equal distributions. Then, through a Bayesian inference approach for stationary and non-stationary extreme value analysis, we provide updated IDF curves based on climate model projections. This presentation summarizes the projected changes in statistics of extremes. We show that, based on CMIP5 simulations, extreme precipitation events in some urban areas can be 20% more severe in the future, even when projected annual mean precipitation is expected to remain similar to the ground-based climatology.
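
The stationary building block of an IDF analysis is a GEV fit to annual maxima followed by return-level computation. A minimal sketch with SciPy follows, on synthetic data; it does not reproduce the study's Bayesian or non-stationary machinery, and all numbers are assumptions. Note that SciPy's shape parameter `c` is the negative of the usual GEV shape xi.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic annual-maximum daily precipitation (mm); illustrative only.
annual_max = stats.genextreme.rvs(c=-0.1, loc=50, scale=12,
                                  size=60, random_state=rng)

# Stationary GEV fit by maximum likelihood.
c, loc, scale = stats.genextreme.fit(annual_max)

# Return level: the intensity exceeded on average once every T years.
levels = {T: stats.genextreme.ppf(1 - 1 / T, c, loc, scale)
          for T in (10, 50, 100)}
for T, level in levels.items():
    print(f"{T:3d}-year return level: {level:.1f} mm")
```

Under non-stationarity the location (and possibly scale) parameters become functions of time or a covariate, so the return levels themselves change over the projection horizon.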

  1. Design of Neural Networks for Fast Convergence and Accuracy

    NASA Technical Reports Server (NTRS)

    Maghami, Peiman G.; Sparks, Dean W., Jr.

    1998-01-01

    A novel procedure for the design and training of artificial neural networks, used for rapid and efficient controls and dynamics design and analysis for flexible space systems, has been developed. Artificial neural networks are employed to provide a means of evaluating the impact of design changes rapidly. Specifically, two-layer feedforward neural networks are designed to approximate the functional relationship between the component spacecraft design changes and measures of its performance. A training algorithm, based on statistical sampling theory, is presented, which guarantees that the trained networks provide a designer-specified degree of accuracy in mapping the functional relationship. Within each iteration of this statistical-based algorithm, a sequential design algorithm is used for the design and training of the feedforward network to provide rapid convergence to the network goals. Here, at each sequence a new network is trained to minimize the error of the previous network. The design algorithm attempts to avoid the local minima phenomenon that hampers traditional network training. A numerical example is performed on a spacecraft application in order to demonstrate the feasibility of the proposed approach.
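
A two-layer feedforward network approximating a design-to-performance map can be sketched directly in NumPy. This is a generic illustration with a toy 1-D target function and plain gradient descent, not the paper's sequential/statistical training algorithm; the sampling-based accuracy check at the end only echoes the spirit of the statistical guarantee.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-layer feedforward net y = W2 @ tanh(W1 x + b1) + b2, trained by
# full-batch gradient descent on mean squared error.
x = rng.uniform(-1, 1, (256, 1))
t = np.sin(np.pi * x)                      # toy "performance" target

H = 16                                     # hidden units
W1 = rng.normal(0, 1.0, (1, H)); b1 = np.zeros(H)
W2 = rng.normal(0, 0.1, (H, 1)); b2 = np.zeros(1)

lr = 0.05
for _ in range(3000):
    h = np.tanh(x @ W1 + b1)               # forward pass
    y = h @ W2 + b2
    e = y - t                              # residual (dL/dy up to scale)
    gW2 = h.T @ e / len(x); gb2 = e.mean(0)
    dh = (e @ W2.T) * (1 - h**2)           # backprop through tanh
    gW1 = x.T @ dh / len(x); gb1 = dh.mean(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Accuracy check on fresh random samples, in the spirit of the paper's
# statistical-sampling-based guarantee.
xv = rng.uniform(-1, 1, (512, 1))
err = np.abs(np.tanh(xv @ W1 + b1) @ W2 + b2 - np.sin(np.pi * xv))
print(f"mean |error| on validation sample: {err.mean():.3f}")
```

In the paper's scheme the validation step is repeated and the network is refined until the designer-specified accuracy holds with the desired statistical confidence.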

  2. Evaluation of a computer-based prompting intervention to improve essay writing in undergraduates with cognitive impairment after acquired brain injury.

    PubMed

    Ledbetter, Alexander K; Sohlberg, McKay Moore; Fickas, Stephen F; Horney, Mark A; McIntosh, Kent

    2017-11-06

    This study evaluated a computer-based prompting intervention for improving expository essay writing after acquired brain injury (ABI). Four undergraduate participants aged 18-21 with mild-moderate ABI and impaired fluid cognition, at least 6 months post-injury, reported difficulty with the writing process after injury. The study employed a non-concurrent multiple-probe-across-participants single-case design. Outcome measures included essay quality scores and the number of revisions to writing, counted and then coded by type using a revision taxonomy. An inter-scorer agreement procedure was completed for quality scores for 50% of essays, with data indicating that agreement exceeded the goal of 85%. Visual analysis of results showed increased essay quality for all participants in the intervention phase compared with baseline, maintained 1 week later. Statistical analyses showed statistically significant results for two of the four participants. The authors discuss external cuing for self-monitoring and tapping of existing writing knowledge as possible explanations for improvement. The study provides preliminary evidence that computer-based prompting has potential to improve writing quality for undergraduates with ABI.

  3. BTS statistical standards manual

    DOT National Transportation Integrated Search

    2005-10-01

    The Bureau of Transportation Statistics (BTS), like other federal statistical agencies, establishes professional standards to guide the methods and procedures for the collection, processing, storage, and presentation of statistical data. Standards an...

  4. 5 CFR 300.504 - Prohibition on employer-employee relationship

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... REGULATIONS EMPLOYMENT (GENERAL) Use of Private Sector Temporaries § 300.504 Prohibition on employer-employee relationship No employer-employee relationship is created by an agency's use of private sector temporaries... appropriate procedures for interaction with private sector temporaries to assure that the supervisory...

  5. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
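
The CUSUM idea underlying such person-fit monitoring can be sketched with the standard two-sided tabular CUSUM. The data, reference value k, and decision threshold h below are conventional illustrative choices, not the six LSAC statistics or their CAT-specific residuals.

```python
import numpy as np

def cusum(x, target=0.0, k=0.5):
    """Two-sided tabular CUSUM: accumulate deviations beyond slack k."""
    s_hi = np.zeros(len(x)); s_lo = np.zeros(len(x))
    hi = lo = 0.0
    for i, xi in enumerate(x):
        hi = max(0.0, hi + (xi - target) - k)   # upward drift
        lo = max(0.0, lo + (target - xi) - k)   # downward drift
        s_hi[i], s_lo[i] = hi, lo
    return s_hi, s_lo

rng = np.random.default_rng(5)
# Simulated standardized item-score residuals: in-control for 30 items,
# then an aberrant upward shift (illustrative person-fit scenario).
x = np.concatenate([rng.normal(0, 1, 30), rng.normal(1.5, 1, 20)])
s_hi, s_lo = cusum(x)

h = 4.0                                          # decision threshold
idx = np.argmax(s_hi > h) if (s_hi > h).any() else -1
print(f"upward CUSUM first exceeds h={h} at observation {idx}")
```

In the CAT setting the monitored quantity is updated item by item as the test proceeds, so a threshold crossing can flag aberrant response behavior while the test is still running.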

  6. 77 FR 53889 - Statement of Organization, Functions, and Delegations of Authority

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-04

    ..., methods, and statistical procedures for assessing and monitoring the health of communities and measuring... methods and the Community Guide, and coordinates division responses to requests for technical assistance...-federal partners in developing indicators, methods, and statistical procedures for measuring and reporting...

  7. 10 CFR Appendix II to Part 504 - Fuel Price Computation

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 504—Fuel Price Computation (a) Introduction. This appendix provides the equations and parameters... inflation indices must follow standard statistical procedures and must be fully documented within the... the weighted average fuel price must follow standard statistical procedures and be fully documented...

  8. Employer Learning and the Signaling Value of Education. National Longitudinal Surveys Discussion Paper.

    ERIC Educational Resources Information Center

    Altonji, Joseph G.; Pierret, Charles R.

    A statistical analysis was performed to test the hypothesis that, if profit-maximizing firms have limited information about the general productivity of new workers, they may choose to use easily observable characteristics such as years of education to discriminate statistically among workers. Information about employer learning was obtained by…

  9. Summary Statistics of CPB-Qualified Public Radio Stations, Fiscal Year 1972.

    ERIC Educational Resources Information Center

    Lee, S. Young; Pedone, Ronald J.

    Statistics in the areas of finance, employment, and broadcast and production for CPB-qualified (Corporation for Public Broadcasting) public radio stations are given in this report. Tables in the area of finance are presented specifying total funds, income, direct operating costs, and capital expenditure. Employment is divided into all employment…

  10. A Study of Arizona Labor Market Demand Data for Vocational Education Planning.

    ERIC Educational Resources Information Center

    Gould, Albert W.; Manning, Doris E.

    A study examined the project methodology used by the Bureau of Labor Statistics and the related projections made by the state employment security agencies. Findings from a literature review indicated that the system has steadily improved since 1979. Projections made from the Occupational Employment Statistics Surveys were remarkably accurate.…

  11. Conference Report on Youth Unemployment: Its Measurements and Meaning.

    ERIC Educational Resources Information Center

    Employment and Training Administration (DOL), Washington, DC.

    Thirteen papers presented at a conference on employment statistics and youth are contained in this report. Reviewed are the problems of gathering, interpreting, and applying employment and unemployment data relating to youth. The titles of the papers are as follow: "Counting Youth: A Comparison of Youth Labor Force Statistics in the Current…

  12. Scientific procedures on living animals in Great Britain in 2003: the facts, figures and consequences.

    PubMed

    Hudson, Michelle; Bhogal, Nirmala

    2004-11-01

    The statistics for animal procedures performed in 2003 were recently released by the Home Office. They indicate that, for the second year running, there was a significant increase in the number of laboratory animal procedures undertaken in Great Britain. The species and genera used, the numbers of toxicology and non-toxicology procedures, and the overall trends, are described. The implications of these latest statistics are discussed with reference to key areas of interest and to the impact of existing regulations and pending legislative reforms.

  13. 29 CFR 1626.1 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN... the Equal Employment Opportunity Commission for carrying out its responsibilities in the administration and enforcement of the Age Discrimination in Employment Act of 1967, as amended. ...

  14. 29 CFR 1626.1 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN... the Equal Employment Opportunity Commission for carrying out its responsibilities in the administration and enforcement of the Age Discrimination in Employment Act of 1967, as amended. ...

  15. 29 CFR 1626.1 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN... the Equal Employment Opportunity Commission for carrying out its responsibilities in the administration and enforcement of the Age Discrimination in Employment Act of 1967, as amended. ...

  16. 29 CFR 1626.1 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN... the Equal Employment Opportunity Commission for carrying out its responsibilities in the administration and enforcement of the Age Discrimination in Employment Act of 1967, as amended. ...

  17. 29 CFR 1626.1 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN... the Equal Employment Opportunity Commission for carrying out its responsibilities in the administration and enforcement of the Age Discrimination in Employment Act of 1967, as amended. ...

  18. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology

    PubMed Central

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-01-01

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers. PMID:28793427
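
Fitting the second-order polynomial at the heart of this response surface methodology is ordinary least squares on a quadratic design matrix. The sketch below uses hypothetical coded factors and a synthetic response on a full factorial grid (a Box-Behnken design proper needs at least three factors); none of the numbers are the paper's measurements.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical coded factors (say, voltage and flow rate) and a noisy
# synthetic "mean fiber diameter" response; NOT the paper's data.
x1, x2 = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x1, x2 = x1.ravel(), x2.ravel()
y = 300 - 40 * x1 + 25 * x2 + 15 * x1 * x2 + 30 * x1**2 \
    + rng.normal(0, 5, x1.size)

# Second-order polynomial model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
D = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
print("coefficients:", np.round(beta, 1))
```

Once the coefficients are estimated, the fitted surface can be minimized (or maximized) over the coded factor region to obtain the optimized electrospinning conditions, and residual diagnostics check the model's adequacy.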

  19. Improvement and extension of a radar forest backscattering model

    NASA Technical Reports Server (NTRS)

    Simonett, David S.; Wang, Yong

    1989-01-01

    Radar modeling of mangal forest stands in the Sundarbans area of southern Bangladesh was developed. The modeling employs radar system parameters such as wavelength, polarization, and incidence angle, together with forest data on tree height, spacing, biomass, species combinations, and water content (including slightly conductive water) in both the leaves and trunks of the mangal. For Sundri and Gewa tropical mangal forests, five model components are proposed, which are required to explain the contributions of various forest species combinations to the attenuation and scattering from nonflooded or flooded mangal-vegetated surfaces. Statistical data from simulated images (HH components only) were compared with those of SIR-B images, both to refine the modeling procedures and to appropriately characterize the model output. The possibility of delineating flooded or nonflooded boundaries is discussed.

  20. Synthesis and Process Optimization of Electrospun PEEK-Sulfonated Nanofibers by Response Surface Methodology.

    PubMed

    Boaretti, Carlo; Roso, Martina; Lorenzetti, Alessandra; Modesti, Michele

    2015-07-07

    In this study electrospun nanofibers of partially sulfonated polyether ether ketone have been produced as a preliminary step for a possible development of composite proton exchange membranes for fuel cells. Response surface methodology has been employed for the modelling and optimization of the electrospinning process, using a Box-Behnken design. The investigation, based on a second order polynomial model, has been focused on the analysis of the effect of both process (voltage, tip-to-collector distance, flow rate) and material (sulfonation degree) variables on the mean fiber diameter. The final model has been verified by a series of statistical tests on the residuals and validated by a comparison procedure of samples at different sulfonation degrees, realized according to optimized conditions, for the production of homogeneous thin nanofibers.

  1. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
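
The "relative signal power change" step has a standard form: event-related de/synchronization (ERD/ERS) expressed as percent change from a pre-stimulus baseline, per frequency band. A minimal sketch with toy numbers follows (the band labels and values are illustrative, not SEEG data).

```python
import numpy as np

def relative_power_change(power, baseline_idx):
    """ERD/ERS as percent change from the pre-stimulus baseline:
    100 * (P(t) - mean P_baseline) / mean P_baseline, per band."""
    base = power[:, baseline_idx].mean(axis=1, keepdims=True)
    return 100.0 * (power - base) / base

# Toy time-frequency power: 2 bands x 10 time points (illustrative).
power = np.array([
    [4.0, 4.2, 3.8, 4.0, 2.0, 2.1, 1.9, 2.0, 3.9, 4.1],   # "alpha": ERD
    [1.0, 1.1, 0.9, 1.0, 1.8, 2.1, 2.0, 1.9, 1.1, 1.0],   # "gamma": ERS
])
rel = relative_power_change(power, baseline_idx=slice(0, 4))
print(np.round(rel[:, 4:8], 1))  # mid-trial: negative = ERD, positive = ERS
```

Statistical testing then asks, for each band and time bin, whether these relative changes differ reliably between stimulus types across trials.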

  2. Structural Indicators on Graduate Employability in Europe--2016. Eurydice Report

    ERIC Educational Resources Information Center

    Riiheläinen, Jari Matti

    2017-01-01

    This publication presents some structural indicators on graduate employability in 40 European education and training systems. It examines whether countries use regular labour market forecasting to improve the employability of graduates; moreover, other indicators include the involvement of employers in external quality assurance procedures,…

  3. 39 CFR 255.5 - Employment.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 39 Postal Service 1 2010-07-01 2010-07-01 false Employment. 255.5 Section 255.5 Postal Service... Employment. No qualified individual with a disability shall, on the basis of disability, be subjected to discrimination in employment with the Postal Service. The definitions, requirements, and procedures of section...

  4. Predicting juvenile recidivism: new method, old problems.

    PubMed

    Benda, B B

    1987-01-01

    This prediction study compared the accuracy of three statistical procedures using two assessment methods. The criterion is return to a juvenile prison after the first release, and the models tested are logit analysis, predictive attribute analysis, and a Burgess procedure. No significant differences in predictive accuracy are found among the three procedures.
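The Burgess procedure named above is a unit-weighting scheme: every risk factor present contributes one point, with no fitted coefficients. A minimal sketch (the factor names and cutoff are hypothetical, for illustration only):

```python
def burgess_score(case, risk_factors):
    """Burgess-style unit weighting: each risk factor present in the
    case contributes exactly one point."""
    return sum(1 for f in risk_factors if case.get(f))

def predict_recidivism(case, risk_factors, cutoff=2):
    """Classify as a likely recidivist when the score reaches the cutoff."""
    return burgess_score(case, risk_factors) >= cutoff

# Hypothetical risk factors and a hypothetical case record
factors = ["prior_offense", "early_onset", "school_dropout"]
case = {"prior_offense": True, "early_onset": True, "school_dropout": False}
```

Its appeal is transparency; the study's finding that it matched logit analysis echoes a long literature showing unit weights are hard to beat on small samples.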

  5. 28 CFR 42.530 - Procedures.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND... Section 504 of the Rehabilitation Act of 1973 Procedures § 42.530 Procedures. (a) The procedural... section 803(a) of title I of the Omnibus Crime Control and Safe Streets Act, as amended by the Justice...

  6. 28 CFR 42.530 - Procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND... Section 504 of the Rehabilitation Act of 1973 Procedures § 42.530 Procedures. (a) The procedural... section 803(a) of title I of the Omnibus Crime Control and Safe Streets Act, as amended by the Justice...

  7. 28 CFR 42.530 - Procedures.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND... Section 504 of the Rehabilitation Act of 1973 Procedures § 42.530 Procedures. (a) The procedural... section 803(a) of title I of the Omnibus Crime Control and Safe Streets Act, as amended by the Justice...

  8. 28 CFR 42.530 - Procedures.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND... Section 504 of the Rehabilitation Act of 1973 Procedures § 42.530 Procedures. (a) The procedural... section 803(a) of title I of the Omnibus Crime Control and Safe Streets Act, as amended by the Justice...

  9. 28 CFR 42.530 - Procedures.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND... Section 504 of the Rehabilitation Act of 1973 Procedures § 42.530 Procedures. (a) The procedural... section 803(a) of title I of the Omnibus Crime Control and Safe Streets Act, as amended by the Justice...

  10. 21 CFR 352.77 - Test modifications.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... modification of the testing procedures in this subpart. In addition, alternative methods (including automated or in vitro procedures) employing the same basic procedures as those described in this subpart may be used. Any proposed modification or alternative procedure shall be submitted as a petition in accord...

  11. 21 CFR 352.77 - Test modifications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... modification of the testing procedures in this subpart. In addition, alternative methods (including automated or in vitro procedures) employing the same basic procedures as those described in this subpart may be used. Any proposed modification or alternative procedure shall be submitted as a petition in accord...

  12. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, from which a control strategy can be set.
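One probabilistic statement commonly made in this setting is the probability that a future measurement falls within acceptance limits, given the procedure's estimated bias and precision. A minimal Monte Carlo sketch, assuming a normal error model (the bias, precision, and limit values below are illustrative, not from the article):

```python
import random

def prob_within_limits(bias, sd, limit, n=100_000, seed=42):
    """Monte Carlo estimate of P(|bias + random error| <= limit) for a
    procedure with systematic error `bias` and precision (SD) `sd`,
    assuming normally distributed random error."""
    rng = random.Random(seed)
    hits = sum(abs(bias + rng.gauss(0.0, sd)) <= limit for _ in range(n))
    return hits / n

# A procedure with 1% bias and 2% SD against +/-5% acceptance limits
p = prob_within_limits(bias=1.0, sd=2.0, limit=5.0)
```

A design space can then be framed as the region of operating conditions where this probability stays above a chosen threshold.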

  13. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
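The first two detectors listed above, correlation and rank correlation, illustrate why the procedures form a hierarchy: a monotonic but nonlinear relationship weakens the Pearson coefficient while leaving the rank (Spearman) coefficient at 1. A minimal pure-Python sketch (assumes no tied values, so ranks reduce to sort positions):

```python
import math

def pearson(x, y):
    """Pearson correlation: sensitive to linear relationships only."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def spearman(x, y):
    """Rank correlation: detects any monotonic relationship.
    Assumes no tied values, so a value's rank is its sort position."""
    def ranks(v):
        pos = {val: i + 1 for i, val in enumerate(sorted(v))}
        return [pos[val] for val in v]
    return pearson(ranks(x), ranks(y))

x = list(range(1, 11))
y = [v ** 3 for v in x]   # monotonic but strongly nonlinear
```

Here `spearman(x, y)` is exactly 1 while `pearson(x, y)` drops below 1, which is the kind of pattern the second-stage test is designed to catch.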

  14. Computer-Assisted Instruction in Statistics. Technical Report.

    ERIC Educational Resources Information Center

    Cooley, William W.

    A paper given at a conference on statistical computation discussed teaching statistics with computers. It concluded that computer-assisted instruction is most appropriately employed in the numerical demonstration of statistical concepts, and for statistical laboratory instruction. The student thus learns simultaneously about the use of computers…

  15. A GIS Procedure to Monitor PWV During Severe Meteorological Events

    NASA Astrophysics Data System (ADS)

    Ferrando, I.; Federici, B.; Sguerso, D.

    2016-12-01

    As widely known, the observation of GNSS signal delay can improve the knowledge of meteorological phenomena. The local Precipitable Water Vapour (PWV), which can be easily derived from Zenith Total Delay (ZTD), Pressure (P) and Temperature (T) (Bevis et al., 1994), is not by itself a satisfactory parameter to evaluate the occurrence of severe meteorological events. Hence, a GIS procedure, called G4M (GNSS for Meteorology), has been conceived to produce 2D PWV maps with high spatial and temporal resolution (1 km and 6 minutes, respectively). The input data are GNSS, P and T observations, not necessarily co-located, coming from existing infrastructures, combined with a simplified physical model owned by the research group. In spite of the low density and the differing configurations of the GNSS, P and T networks, the procedure is capable of detecting severe meteorological events with reliable results. The procedure has already been applied in a wide and orographically complex area covering approximately the north-west of Italy and the French-Italian border region, to study two severe meteorological events that occurred in Genoa (Italy) and other meteorological alert cases. The P, T and PWV 2D maps obtained by the procedure have been compared with those coming from meteorological re-analysis models, used as a reference to assess how well the procedure represents these fields. Additionally, the spatial variability of PWV was taken into account as an indicator of potentially critical situations; this index seems promising in highlighting remarkable features that precede intense precipitation. The strength and originality of the procedure lie in the use of existing infrastructures, the independence from meteorological models, the high adaptability to different network configurations, and the ability to produce high-resolution 2D PWV maps even from sparse input data. In the near future, the procedure could also be set up for near real-time applications.
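The Bevis et al. (1994) conversion from ZTD to PWV cited above can be sketched as follows. The Saastamoinen hydrostatic delay and the Bevis weighted mean temperature approximation are standard formulations, but this is a simplified sketch, and the sample station inputs are purely illustrative:

```python
import math

def pwv_mm(ztd_m, pressure_hpa, temp_k, lat_deg, height_km=0.0):
    """Convert GNSS Zenith Total Delay to Precipitable Water Vapour (mm).

    ZHD: Saastamoinen hydrostatic delay from surface pressure.
    ZWD = ZTD - ZHD is scaled by the dimensionless factor Pi (~0.15),
    computed with the Bevis et al. (1994) mean temperature Tm.
    """
    zhd = 0.0022768 * pressure_hpa / (
        1 - 0.00266 * math.cos(2 * math.radians(lat_deg)) - 0.00028 * height_km)
    zwd = ztd_m - zhd                        # wet delay, metres
    tm = 70.2 + 0.72 * temp_k                # weighted mean temperature, K
    k2_prime, k3 = 22.1, 3.739e5             # K/hPa, K^2/hPa
    rho_w, r_v = 1000.0, 461.5               # kg/m^3, J/(kg K)
    pi_factor = 1e6 / (rho_w * r_v * (k3 / tm + k2_prime) / 100.0)
    return 1000.0 * pi_factor * zwd          # metres -> millimetres

# Illustrative mid-latitude station values (not from the study)
pwv = pwv_mm(ztd_m=2.40, pressure_hpa=1013.0, temp_k=288.0, lat_deg=44.4)
```

With these inputs the PWV comes out in the mid-teens of millimetres, a typical mid-latitude value; in the G4M procedure such point values are interpolated with non-co-located P and T fields to build the 2D maps.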

  16. 32 CFR 104.5 - Responsibilities.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... procedures and provide information concerning civilian employment and reemployment rights, benefits and... employment and reemployment rights, benefits and obligations of Service members who are covered by the... service in exercising employment and reemployment rights. (6) Provide assistance, as appropriate, to...

  17. Public health workforce employment in US public and private sectors.

    PubMed

    Kennedy, Virginia C

    2009-01-01

    The purpose of this study was to describe the number and distribution of 26 administrative, professional, and technical public health occupations across the array of US governmental and nongovernmental industries. This study used data from the Occupational Employment Statistics program of the US Bureau of Labor Statistics. For each occupation of interest, the investigator determined the number of persons employed in 2006 in five industries and industry groups: government, nonprofit agencies, education, healthcare, and all other industries. Industry-specific employment profiles varied from one occupation to another. However, about three-fourths of all those engaged in these occupations worked in the private healthcare industry. Relatively few worked in nonprofit or educational settings, and less than 10 percent were employed in government agencies. The industry-specific distribution of public health personnel, particularly the proportion employed in the public sector, merits close monitoring. This study also highlights the need for a better understanding of the work performed by public health occupations in nongovernmental work settings. Finally, the Occupational Employment Statistics program has the potential to serve as an ongoing, national data collection system for public health workforce information. If this potential was realized, future workforce enumerations would not require primary data collection but rather could be accomplished using secondary data.

  18. 45 CFR 1616.6 - Equal employment opportunity.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 4 2010-10-01 2010-10-01 false Equal employment opportunity. 1616.6 Section 1616.6 Public Welfare Regulations Relating to Public Welfare (Continued) LEGAL SERVICES CORPORATION ATTORNEY HIRING § 1616.6 Equal employment opportunity. A recipient shall adopt employment qualifications, procedures, and policies that meet the...

  19. 78 FR 19019 - Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-28

    ... DEPARTMENT OF LABOR Employment and Training Administration Labor Certification Process for the Temporary Employment of Aliens in Agriculture in the United States: Prevailing Wage Rates for Certain Occupations Processed Under H-2A Special Procedures; Correction and Rescission AGENCY: Employment and Training...

  20. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  1. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  2. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  3. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  4. 20 CFR 645.270 - What procedures are there to ensure that currently employed workers may file grievances regarding...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... currently employed workers may file grievances regarding displacement and that Welfare-to-Work participants in employment activities may file grievances regarding displacement, health and safety standards and... regarding displacement and that Welfare-to-Work participants in employment activities may file grievances...

  5. 46 CFR 4.03-45 - Marine employer.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 1 2011-10-01 2011-10-01 false Marine employer. 4.03-45 Section 4.03-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC MARINE CASUALTIES AND INVESTIGATIONS Definitions § 4.03-45 Marine employer. Marine employer means the owner, managing operator...

  6. 46 CFR 4.03-45 - Marine employer.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 1 2013-10-01 2013-10-01 false Marine employer. 4.03-45 Section 4.03-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC MARINE CASUALTIES AND INVESTIGATIONS Definitions § 4.03-45 Marine employer. Marine employer means the owner, managing operator...

  7. 46 CFR 4.03-45 - Marine employer.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 1 2010-10-01 2010-10-01 false Marine employer. 4.03-45 Section 4.03-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC MARINE CASUALTIES AND INVESTIGATIONS Definitions § 4.03-45 Marine employer. Marine employer means the owner, managing operator...

  8. 46 CFR 4.03-45 - Marine employer.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 1 2014-10-01 2014-10-01 false Marine employer. 4.03-45 Section 4.03-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC MARINE CASUALTIES AND INVESTIGATIONS Definitions § 4.03-45 Marine employer. Marine employer means the owner, managing operator...

  9. 46 CFR 4.03-45 - Marine employer.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 1 2012-10-01 2012-10-01 false Marine employer. 4.03-45 Section 4.03-45 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY PROCEDURES APPLICABLE TO THE PUBLIC MARINE CASUALTIES AND INVESTIGATIONS Definitions § 4.03-45 Marine employer. Marine employer means the owner, managing operator...

  10. A Primer on Multivariate Analysis of Variance (MANOVA) for Behavioral Scientists

    ERIC Educational Resources Information Center

    Warne, Russell T.

    2014-01-01

    Reviews of statistical procedures (e.g., Bangert & Baumberger, 2005; Kieffer, Reese, & Thompson, 2001; Warne, Lazo, Ramos, & Ritter, 2012) show that one of the most common multivariate statistical methods in psychological research is multivariate analysis of variance (MANOVA). However, MANOVA and its associated procedures are often not…

  11. 29 CFR 1607.3 - Discrimination defined: Relationship between use of selection procedures and discrimination.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... selection procedures and discrimination. 1607.3 Section 1607.3 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION UNIFORM GUIDELINES ON EMPLOYEE SELECTION PROCEDURES (1978) General Principles § 1607.3 Discrimination defined: Relationship between use of selection procedures and...

  12. 29 CFR 1601.76 - Right of party to request review.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS FEP Agency Designation Procedures § 1601.76 Right of party to request review. The Commission shall... procedures set forth in the Substantial Weight Review Procedures (EEOC Order 916). [46 FR 50367, Oct. 13...

  13. 29 CFR 1601.76 - Right of party to request review.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS FEP Agency Designation Procedures § 1601.76 Right of party to request review. The Commission shall... procedures set forth in the Substantial Weight Review Procedures (EEOC Order 916). [46 FR 50367, Oct. 13...

  14. 29 CFR 1601.76 - Right of party to request review.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS FEP Agency Designation Procedures § 1601.76 Right of party to request review. The Commission shall... procedures set forth in the Substantial Weight Review Procedures (EEOC Order 916). [46 FR 50367, Oct. 13...

  15. 29 CFR 1601.76 - Right of party to request review.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS FEP Agency Designation Procedures § 1601.76 Right of party to request review. The Commission shall... procedures set forth in the Substantial Weight Review Procedures (EEOC Order 916). [46 FR 50367, Oct. 13...

  16. 29 CFR 1601.76 - Right of party to request review.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURAL REGULATIONS FEP Agency Designation Procedures § 1601.76 Right of party to request review. The Commission shall... procedures set forth in the Substantial Weight Review Procedures (EEOC Order 916). [46 FR 50367, Oct. 13...

  17. Knowledge dimensions in hypothesis test problems

    NASA Astrophysics Data System (ADS)

    Krishnan, Saras; Idris, Noraini

    2012-05-01

    The reform of statistics education over the past two decades has predominantly shifted the focus of statistical teaching and learning from procedural understanding to conceptual understanding. The emphasis of procedural understanding is on formulas and calculation procedures. Conceptual understanding, by contrast, emphasizes students knowing why they are using a particular formula or executing a specific procedure. In addition, the Revised Bloom's Taxonomy offers a two-dimensional framework for describing learning objectives, comprising the six revised cognitive levels of the original Bloom's taxonomy and four knowledge dimensions. Depending on the level of complexity, the four knowledge dimensions essentially distinguish basic understanding from more connected understanding. This study identifies the factual, procedural, and conceptual knowledge dimensions in hypothesis test problems. Hypothesis testing, an important tool for making inferences about a population from sample information, is taught in many introductory statistics courses. However, researchers find that students in these courses still have difficulty understanding the underlying concepts of a hypothesis test. Past studies also show that even though students can perform the hypothesis testing procedure, they may not understand the rationale for executing these steps or know how to apply them in novel contexts. Besides knowing the procedural steps in conducting a hypothesis test, students must have fundamental statistical knowledge and a deep understanding of the underlying inferential concepts, such as the sampling distribution and the central limit theorem. By identifying the knowledge dimensions of hypothesis test problems, this study supports the future development of instructional and assessment strategies that enhance students' learning of hypothesis testing as a valuable inferential tool.
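The procedural steps a hypothesis test comprises (state hypotheses, compute a statistic, obtain a p-value, compare to alpha) can be made concrete with a one-sample z-test. This is a generic textbook sketch, not an example from the study, and it assumes the population SD is known for simplicity:

```python
import math

def one_sample_z_test(sample_mean, mu0, sigma, n, alpha=0.05):
    """Two-sided z-test of H0: mu = mu0 when the population SD is known.

    Returns the test statistic, the p-value (normal CDF written with
    math.erf), and whether H0 is rejected at level alpha.
    """
    z = (sample_mean - mu0) / (sigma / math.sqrt(n))
    phi = lambda t: 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF
    p_value = 2.0 * (1.0 - phi(abs(z)))
    return z, p_value, p_value < alpha

# Sample of n=100 with mean 52 against H0: mu = 50, sigma = 10
z, p, reject = one_sample_z_test(52.0, 50.0, 10.0, 100)
```

Conceptual understanding, in the article's sense, is knowing why the denominator is sigma/sqrt(n) (the sampling distribution of the mean) rather than merely being able to execute these lines.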

  18. Manpower Resources for Scientific Activities at Universities and Colleges, January 1976. Detailed Statistical Tables, Appendix B.

    ERIC Educational Resources Information Center

    Loycano, Robert J.

    The data presented in these tabulations are based on the 1976 National Science Foundation survey of scientific and engineering personnel employed at universities and colleges. The data are contained in 60 statistical tables organized under the following broad headings: trends; type of institution; field, employment status, control, educational…

  19. Opticians Employed in Health Services; United States--1969. Vital and Health Statistics, Series 14, No. 3.

    ERIC Educational Resources Information Center

    National Center for Health Statistics (DHEW/PHS), Hyattsville, MD.

    First in a series of statistical reports on personnel providing vision and eye care assistance, the report presents data collected by the Bureau of Census (geographic location, age, sex, education, type and place of employment, training, specialties, activities, and time spent at work) concerning opticians actively engaged in their profession…

  20. Concurrent Data Elicitation Procedures, Processes, and the Early Stages of L2 Learning: A Critical Overview

    ERIC Educational Resources Information Center

    Leow, Ronald P.; Grey, Sarah; Marijuan, Silvia; Moorman, Colleen

    2014-01-01

    Given the current methodological interest in eliciting direct data on the cognitive processes L2 learners employ as they interact with L2 data during the early stages of the learning process, this article takes a critical and comparative look at three concurrent data elicitation procedures currently employed in the SLA literature: Think aloud (TA)…

  1. An Evaluation of One- and Three-Parameter Logistic Tailored Testing Procedures for Use with Small Item Pools.

    ERIC Educational Resources Information Center

    McKinley, Robert L.; Reckase, Mark D.

    A two-stage study was conducted to compare the ability estimates yielded by tailored testing procedures based on the one-parameter logistic (1PL) and three-parameter logistic (3PL) models. The first stage of the study employed real data, while the second stage employed simulated data. In the first stage, response data for 3,000 examinees were…

  2. 29 CFR 1640.12 - Standards.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR COORDINATING THE INVESTIGATION OF COMPLAINTS OR CHARGES OF EMPLOYMENT DISCRIMINATION BASED ON DISABILITY SUBJECT TO THE AMERICANS... 504 has been violated in a complaint alleging employment discrimination shall be the standards applied...

  3. 28 CFR 42.409 - Employment practices.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Employment practices. 42.409 Section 42.409 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Coordination of Enforcement of Non-discrimination in Federally Assisted Programs...

  4. 29 CFR 1607.2 - Scope.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION UNIFORM GUIDELINES ON EMPLOYEE SELECTION... them. B. Employment decisions. These guidelines apply to tests and other selection procedures which are... employment opportunity law. Other selection decisions, such as selection for training or transfer, may also...

  5. 29 CFR 1607.2 - Scope.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION UNIFORM GUIDELINES ON EMPLOYEE SELECTION... them. B. Employment decisions. These guidelines apply to tests and other selection procedures which are... employment opportunity law. Other selection decisions, such as selection for training or transfer, may also...

  6. 29 CFR 1607.2 - Scope.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION UNIFORM GUIDELINES ON EMPLOYEE SELECTION... them. B. Employment decisions. These guidelines apply to tests and other selection procedures which are... employment opportunity law. Other selection decisions, such as selection for training or transfer, may also...

  7. 29 CFR 1607.2 - Scope.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION UNIFORM GUIDELINES ON EMPLOYEE SELECTION... them. B. Employment decisions. These guidelines apply to tests and other selection procedures which are... employment opportunity law. Other selection decisions, such as selection for training or transfer, may also...

  8. 29 CFR 1915.501 - General provisions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... safety plan including hazards, controls, fire safety and health rules, and emergency procedures; (ii... (CONTINUED) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard... require employers to protect all employees from fire hazards in shipyard employment, including employees...

  9. 29 CFR 1915.501 - General provisions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... safety plan including hazards, controls, fire safety and health rules, and emergency procedures; (ii... (CONTINUED) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard... require employers to protect all employees from fire hazards in shipyard employment, including employees...

  10. 29 CFR 1915.501 - General provisions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... safety plan including hazards, controls, fire safety and health rules, and emergency procedures; (ii... (CONTINUED) OCCUPATIONAL SAFETY AND HEALTH STANDARDS FOR SHIPYARD EMPLOYMENT Fire Protection in Shipyard... require employers to protect all employees from fire hazards in shipyard employment, including employees...

  11. Florida's Workforce 2005.

    ERIC Educational Resources Information Center

    Florida State Dept. of Labor and Employment Security, Tallahassee.

    This report analyzes projected changes in population, labor force, and employment by industry and occupation for Florida between 1995 and 2005. More than 50 charts and graphs provide statistics on the following: Florida's population, labor force 1975-2005; employment 1975-2005; industry employment 1995-2005; occupational employment (general);…

  12. Avoid costly litigation: ten steps to implementing lawful hiring practices.

    PubMed

    Holmes, Judith H

    2004-01-01

    A malpractice claim or suit can have a devastating effect on a physician's practice and personal life. What is often overlooked is that an employment-related suit or EEOC charge can also exact a heavy toll, personally, professionally, and financially. The number of employment-related suits and claims has risen dramatically in the last few years. According to recent enforcement and litigation statistics released by the U.S. Equal Employment Opportunity Commission (EEOC) (1), the total discrimination charges filed by individuals against their employers increased last year to 80,840--the highest level since the mid-1990s. According to the EEOC data, in 2001, employers paid $248 million in connection with charges of discrimination filed with the EEOC by job applicants, employees, and former employees. Employers paid an additional $47 million to the EEOC in connection with lawsuits filed against employers by the EEOC (2). This does not include the millions of dollars employers were forced to pay in settlements, judgments, costs, and attorney's fees incurred in connection with employment-related lawsuits filed in state and federal courts during the same period of time. Employment-related litigation is on the rise, and the healthcare industry is not immune. Physicians as employers can be a target for a wide range of employment-related claims and suits, such as breach of contract, invasion of privacy, sex, race, religious, and age discrimination, and negligent hiring, just to name a few. The number of jury verdicts rendered against employers is increasing and the verdict awards are often staggering. In addition, defending these suits can be as expensive as defending a complicated malpractice suit. Even worse, employment discrimination suits and charges are generally not covered by malpractice, D & O, or general liability insurance policies, leaving the physician to cope with the financial burden of judgments, settlements, attorney's fees and litigation costs. 
Most employment-related disputes that lead to costly litigation would never have arisen if the employer had implemented more effective employment practices. Hiring mistakes in particular cause many costly legal battles. This article identifies legal issues that precipitate litigation and suggests ten steps physicians can take to implement lawful hiring practices that will reduce the risk of costly employment suits while improving office efficiency, morale, and productivity. NOTE: This article is intended as an overview of lawful hiring strategies, and is not a substitute for legal advice from experienced employment counsel. Applicable laws vary from state to state and appropriate procedures may depend on specific factual situations. This article is not, and should not be construed as, legal advice.

  13. 10 CFR 10.20 - Purpose of the procedures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Purpose of the procedures. 10.20 Section 10.20 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING ELIGIBILITY FOR ACCESS TO RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.20 Purpose of the...

  14. 10 CFR 10.20 - Purpose of the procedures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Purpose of the procedures. 10.20 Section 10.20 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING ELIGIBILITY FOR ACCESS TO RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.20 Purpose of the...

  15. 10 CFR 10.20 - Purpose of the procedures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Purpose of the procedures. 10.20 Section 10.20 Energy NUCLEAR REGULATORY COMMISSION CRITERIA AND PROCEDURES FOR DETERMINING ELIGIBILITY FOR ACCESS TO RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.20 Purpose of the...

  16. The Development of Statistical Models for Predicting Surgical Site Infections in Japan: Toward a Statistical Model-Based Standardized Infection Ratio.

    PubMed

    Fukuda, Haruhisa; Kuroki, Manabu

    2016-03-01

    To develop and internally validate a surgical site infection (SSI) prediction model for Japan. Retrospective observational cohort study. We analyzed surveillance data submitted to the Japan Nosocomial Infections Surveillance system for patients who had undergone target surgical procedures from January 1, 2010, through December 31, 2012. Logistic regression analyses were used to develop statistical models for predicting SSIs. An SSI prediction model was constructed for each of the procedure categories by statistically selecting the appropriate risk factors from among the collected surveillance data and determining their optimal categorization. Standard bootstrapping techniques were applied to assess potential overfitting. The C-index was used to compare the predictive performances of the new statistical models with those of models based on conventional risk index variables. The study sample comprised 349,987 cases from 428 participant hospitals throughout Japan, and the overall SSI incidence was 7.0%. The C-indices of the new statistical models were significantly higher than those of the conventional risk index models in 21 (67.7%) of the 31 procedure categories (P<.05). No significant overfitting was detected. Japan-specific SSI prediction models were shown to generally have higher accuracy than conventional risk index models. These new models may have applications in assessing hospital performance and identifying high-risk patients in specific procedure categories.
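
    The C-index used above to compare the new statistical models against the conventional risk index models is, for a binary outcome such as SSI occurrence, equivalent to the area under the ROC curve: the probability that a randomly chosen case receives a higher predicted risk than a randomly chosen non-case. A minimal pure-Python sketch of that computation; the scores and outcomes below are hypothetical, not data from the study:

    ```python
    from itertools import product

    def c_index(scores, outcomes):
        """Concordance index for binary outcomes: the fraction of
        (event, non-event) pairs in which the event case received the
        higher predicted score; ties count as 0.5."""
        events = [s for s, y in zip(scores, outcomes) if y == 1]
        non_events = [s for s, y in zip(scores, outcomes) if y == 0]
        pairs = concordant = 0.0
        for e, n in product(events, non_events):
            pairs += 1
            if e > n:
                concordant += 1
            elif e == n:
                concordant += 0.5
        return concordant / pairs

    # Hypothetical predicted SSI probabilities and observed infections (1 = SSI)
    scores = [0.9, 0.7, 0.6, 0.4, 0.2]
    outcomes = [1, 1, 0, 0, 0]
    print(c_index(scores, outcomes))  # -> 1.0 (every SSI case outranks every non-case)
    ```

    A C-index of 0.5 corresponds to a model no better than chance, which is why the comparison between model families in the study is made in terms of this quantity.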

  17. 28 CFR 42.408 - Complaint procedures.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Complaint procedures. 42.408 Section 42.408 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Coordination of Enforcement of Non-discrimination in Federally Assisted Programs...

  18. 4 CFR 28.60 - Briefs.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Briefs. 28.60 Section 28.60 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  19. 4 CFR 28.50 - Enforcement.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Enforcement. 28.50 Section 28.50 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  20. 4 CFR 28.58 - Transcript.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Transcript. 28.58 Section 28.58 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  1. 4 CFR 28.48 - Service.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Service. 28.48 Section 28.48 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  2. 29 CFR 1915.508 - Training.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... responsibilities at all times; (2) Keep written standard operating procedures that address anticipated emergency... the employer's standard operating procedures; (5) Train new fire response employees before they engage in emergency operations; (6) At least quarterly, provide training on the written operating procedures...

  3. 29 CFR 1915.508 - Training.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... responsibilities at all times; (2) Keep written standard operating procedures that address anticipated emergency... the employer's standard operating procedures; (5) Train new fire response employees before they engage in emergency operations; (6) At least quarterly, provide training on the written operating procedures...

  4. 29 CFR 1915.508 - Training.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... responsibilities at all times; (2) Keep written standard operating procedures that address anticipated emergency... the employer's standard operating procedures; (5) Train new fire response employees before they engage in emergency operations; (6) At least quarterly, provide training on the written operating procedures...

  5. 29 CFR 1915.508 - Training.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... responsibilities at all times; (2) Keep written standard operating procedures that address anticipated emergency... the employer's standard operating procedures; (5) Train new fire response employees before they engage in emergency operations; (6) At least quarterly, provide training on the written operating procedures...

  6. 29 CFR 1915.508 - Training.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... responsibilities at all times; (2) Keep written standard operating procedures that address anticipated emergency... the employer's standard operating procedures; (5) Train new fire response employees before they engage in emergency operations; (6) At least quarterly, provide training on the written operating procedures...

  7. 29 CFR 1603.302 - Filing an appeal.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR PREVIOUSLY EXEMPT STATE AND LOCAL GOVERNMENT EMPLOYEE COMPLAINTS OF EMPLOYMENT DISCRIMINATION UNDER SECTION 304 OF... Operations, Equal Employment Opportunity Commission, P.O. Box 77960, Washington, DC 20013, by mail or...

  8. 29 CFR 1603.302 - Filing an appeal.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR PREVIOUSLY EXEMPT STATE AND LOCAL GOVERNMENT EMPLOYEE COMPLAINTS OF EMPLOYMENT DISCRIMINATION UNDER SECTION 304 OF... Operations, Equal Employment Opportunity Commission, P.O. Box 77960, Washington, DC 20013, by mail or...

  9. 29 CFR 1603.302 - Filing an appeal.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR PREVIOUSLY EXEMPT STATE AND LOCAL GOVERNMENT EMPLOYEE COMPLAINTS OF EMPLOYMENT DISCRIMINATION UNDER SECTION 304 OF... Operations, Equal Employment Opportunity Commission, P.O. Box 77960, Washington, DC 20013, by mail or...

  10. 29 CFR 1603.302 - Filing an appeal.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR PREVIOUSLY EXEMPT STATE AND LOCAL GOVERNMENT EMPLOYEE COMPLAINTS OF EMPLOYMENT DISCRIMINATION UNDER SECTION 304 OF... Operations, Equal Employment Opportunity Commission, P.O. Box 77960, Washington, DC 20013, by mail or...

  11. 29 CFR 1603.302 - Filing an appeal.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES FOR PREVIOUSLY EXEMPT STATE AND LOCAL GOVERNMENT EMPLOYEE COMPLAINTS OF EMPLOYMENT DISCRIMINATION UNDER SECTION 304 OF... Operations, Equal Employment Opportunity Commission, P.O. Box 77960, Washington, DC 20013, by mail or...

  12. 29 CFR 1626.10 - Agreements with State or local fair employment practices agencies.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Agreements with State or local fair employment practices agencies. 1626.10 Section 1626.10 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.10 Agreements with State or...

  13. 29 CFR 1626.10 - Agreements with State or local fair employment practices agencies.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Agreements with State or local fair employment practices agencies. 1626.10 Section 1626.10 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.10 Agreements with State or...

  14. 29 CFR 1626.10 - Agreements with State or local fair employment practices agencies.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Agreements with State or local fair employment practices agencies. 1626.10 Section 1626.10 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.10 Agreements with State or...

  15. 29 CFR 1626.10 - Agreements with State or local fair employment practices agencies.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Agreements with State or local fair employment practices agencies. 1626.10 Section 1626.10 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.10 Agreements with State or...

  16. 29 CFR 1626.10 - Agreements with State or local fair employment practices agencies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Agreements with State or local fair employment practices agencies. 1626.10 Section 1626.10 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.10 Agreements with State or...

  17. School nursing for children with special needs: does number of schools make a difference?

    PubMed

    Kruger, Barbara J; Toker, Karen H; Radjenovic, Doreen; Comeaux, Judy M; Macha, Kiran

    2009-08-01

    Few recent studies have focused on the role of school nurses who predominantly care for children with special health care needs (CSHCN). The primary aim of this study was to explore differences related to (a) child health conditions covered, (b) direct care procedures, (c) care management functions, and (d) consultation sources used among nurses who spent the majority of their time caring for CSHCN compared to a mixed student population, and among nurses who covered a single school versus multiple schools. A community-based interdisciplinary team developed a 28-item survey, which was completed by 50 nurses (48.5% response) employed by health departments and school districts. Descriptive and comparative statistics and thematic coding were used to analyze data. Nurses who covered a single school (n = 23) or who were primarily assigned to CSHCN (n = 13) had a lower number of students, and more frequently (a) encountered complex child conditions, (b) performed direct care procedures, (c) participated in Individualized Education Plan (IEP) development, (d) collaborated with the Title V-CSHCN agency, and (e) communicated with physicians, compared to nurses who covered multiple schools or a general child population. Benefits centered on the children, scope of work, school environment, and family relationships. Challenges included high caseloads, school district priorities, and families who did not follow up. The number of schools that the nurses covered, percent of time caring for CSHCN, and employer type (school district or health department) affected the scope of school nurse practice. Recommendations are for lower student-to-nurse ratios, improved nursing supervision, and educational support.

  18. Hearing loss associated with repeated MRI acquisition procedure-related acoustic noise exposure: an occupational cohort study.

    PubMed

    Bongers, Suzan; Slottje, Pauline; Kromhout, Hans

    2017-11-01

    To study the effects of repeated exposure to MRI-related acoustic noise during image acquisition procedures (scans) on hearing. A retrospective occupational cohort study was performed among workers of an MRI manufacturing facility (n=474). Longitudinal audiometry data from the facility's medical surveillance scheme collected from 1973 to 2010 were analysed by studying the association of cumulative exposure to MRI-related acoustic noise from voluntary (multiple) MRI scans and the hearing threshold of the volunteer. Repeated acoustic noise exposure during volunteer MRI scans was found to be associated with a small exposure-dependent increased rate of change of hearing threshold level (dB/year), but the association was only found related to the number of voluntary MRI scans and not to modelled cumulative noise exposure (dB*hour) based on MRI-system type. The increased rate of change of hearing threshold level was found to be statistically significant for the frequencies 500, 1000, 2000, 3000 and 4000 Hz in the right ear. From our longitudinal cohort study, it appeared that exposure to noise from voluntary MRI scans may have resulted in a slight amount of hearing loss. Mandatory use of hearing protection might have prevented more severe hearing loss. Lack of consistency in findings between the left and right ears and between the two exposure measures prohibits definitive conclusions. Further research that addresses the study's methodological limitations is warranted to corroborate our findings.

  19. 7 CFR 800.86 - Inspection of shiplot, unit train, and lash barge grain in single lots.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... prescribed in the instructions. (b) Application procedure. Applications for the official inspection of... statistical acceptance sampling and inspection plan according to the provisions of this section and procedures... inspection as part of a single lot and accepted by a statistical acceptance sampling and inspection plan...

  20. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities ▿ †

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

    The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066

  1. Statistical Analysis and Time Series Modeling of Air Traffic Operations Data From Flight Service Stations and Terminal Radar Approach Control Facilities : Two Case Studies

    DOT National Transportation Integrated Search

    1981-10-01

    Two statistical procedures have been developed to estimate hourly or daily aircraft counts. These counts can then be transformed into estimates of instantaneous air counts. The first procedure estimates the stable (deterministic) mean level of hourly...

  2. Changes in Occupational Employment in the Food and Kindred Products Industry, 1977-1980. Technical Note No. 1.

    ERIC Educational Resources Information Center

    Lewis, Gary

    The extent to which occupational staffing patterns change over time was examined in a study focusing on the Food and Kindred Products industry--Standard Industrial Classification (SIC) 20. Data were taken from the 1977 and 1980 Occupational Employment Statistics program coordinated by the United States Department of Labor Statistics. Actual 1980…

  3. Minorities and Women in State and Local Governments. 1974. Volume V--Township Governments. Research Report No. 52-5.

    ERIC Educational Resources Information Center

    Reshad, Rosalind S.

    One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fifth volume details nationwide statistics on the employment status of minorities and women working in township governments. Data from 299 actual units of government in fourteen states were…

  4. Minorities and Women in State and Local Governments. 1974. Volume IV--Municipal Governments. Research Report No. 52-4.

    ERIC Educational Resources Information Center

    Skinner, Alice W.

    One of six volumes summarizing through narrative and statistical tables data collected by the Equal Employment Opportunity Commission in its 1974 survey, this fourth volume details the employment status of minorities and women in municipal governments. Based on reports filed by 2,230 municipalities, statistics in this study are designed to…

  5. Task-based data-acquisition optimization for sparse image reconstruction systems

    NASA Astrophysics Data System (ADS)

    Chen, Yujia; Lou, Yang; Kupinski, Matthew A.; Anastasio, Mark A.

    2017-03-01

    Conventional wisdom dictates that imaging hardware should be optimized by use of an ideal observer (IO) that exploits full statistical knowledge of the class of objects to be imaged, without consideration of the reconstruction method to be employed. However, accurate and tractable models of the complete object statistics are often difficult to determine in practice. Moreover, in imaging systems that employ compressive sensing concepts, imaging hardware and (sparse) image reconstruction are innately coupled technologies. We have previously proposed a sparsity-driven ideal observer (SDIO) that can be employed to optimize hardware by use of a stochastic object model that describes object sparsity. The SDIO and sparse reconstruction method can therefore be "matched" in the sense that they both utilize the same statistical information regarding the class of objects to be imaged. To efficiently compute SDIO performance, the posterior distribution is estimated by use of computational tools developed recently for variational Bayesian inference. Subsequently, the SDIO test statistic can be computed semi-analytically. The advantages of employing the SDIO instead of a Hotelling observer are systematically demonstrated in case studies in which magnetic resonance imaging (MRI) data acquisition schemes are optimized for signal detection tasks.

  6. 39 CFR 255.5 - Employment.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POSTAL SERVICE PROGRAMS, ACTIVITIES, FACILITIES, AND ELECTRONIC AND INFORMATION TECHNOLOGY § 255.5... discrimination in employment with the Postal Service. The definitions, requirements, and procedures of section...

  7. Implications of the new EEOC guidelines.

    PubMed

    Dhanens, T P

    1979-01-01

    How can employers exercise their right to select employees without running afoul of the new guidelines? Are interviews best? Pencil and paper tests? "Nonrandom selection procedures are inevitable for most jobs," says Dr. Thomas Dhanens, a management psychologist. "Therefore, employers will always be open to charges of discrimination or favoritism from some quarter. Organizations that avoid their responsibility for examining and validating their selection procedures will be forced into a costly catch-up effort before long." The author shows employers how to collect data systematically, analyze job functions, evaluate applicants, record data, handle performance appraisals, maintain records, and identify priorities. Since a lack of data is no defense in an EEOC action, Dhanens suggests that these are the minimum steps wise employers should follow.

  8. The use of analysis of variance procedures in biological studies

    USGS Publications Warehouse

    Williams, B.K.

    1987-01-01

    The analysis of variance (ANOVA) is widely used in biological studies, yet there remains considerable confusion among researchers about the interpretation of hypotheses being tested. Ambiguities arise when statistical designs are unbalanced, and in particular when not all combinations of design factors are represented in the data. This paper clarifies the relationship among hypothesis testing, statistical modelling and computing procedures in ANOVA for unbalanced data. A simple two-factor fixed effects design is used to illustrate three common parametrizations for ANOVA models, and some associations among these parametrizations are developed. Biologically meaningful hypotheses for main effects and interactions are given in terms of each parametrization, and procedures for testing the hypotheses are described. The standard statistical computing procedures in ANOVA are given along with their corresponding hypotheses. Throughout the development unbalanced designs are assumed and attention is given to problems that arise with missing cells.
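
    The interpretation problems above concern unbalanced multi-factor designs, where the tested hypotheses depend on the parametrization; the underlying mean-squares machinery is easiest to see in the balanced one-way case. A minimal Python sketch, illustrative only, which does not address the unbalanced-design issues the paper discusses:

    ```python
    from statistics import mean

    def one_way_anova_F(groups):
        """F statistic for a one-way fixed-effects ANOVA: the ratio of
        the between-group mean square to the within-group mean square."""
        k = len(groups)                              # number of groups
        n = sum(len(g) for g in groups)              # total observations
        grand = mean(x for g in groups for x in g)   # grand mean
        ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
        ss_within = sum((x - mean(g)) ** 2 for g in groups for x in g)
        return (ss_between / (k - 1)) / (ss_within / (n - k))

    # Hypothetical measurements for two treatment groups
    print(round(one_way_anova_F([[1, 2, 3], [4, 5, 6]]), 1))  # -> 13.5
    ```

    In the unbalanced two-factor setting the paper analyzes, the single sum of squares above splits into several non-equivalent decompositions, which is exactly where the ambiguity about which hypothesis is being tested arises.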

  9. 33 CFR 203.11 - Purpose.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Introduction § 203.11 Purpose. This part prescribes administrative policies, guidance, and operating procedures for natural disaster preparedness...

  10. 33 CFR 203.11 - Purpose.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Introduction § 203.11 Purpose. This part prescribes administrative policies, guidance, and operating procedures for natural disaster preparedness...

  11. 33 CFR 203.11 - Purpose.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Introduction § 203.11 Purpose. This part prescribes administrative policies, guidance, and operating procedures for natural disaster preparedness...

  12. 33 CFR 203.11 - Purpose.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Introduction § 203.11 Purpose. This part prescribes administrative policies, guidance, and operating procedures for natural disaster preparedness...

  13. 33 CFR 203.11 - Purpose.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... EMPLOYMENT OF ARMY AND OTHER RESOURCES, NATURAL DISASTER PROCEDURES Introduction § 203.11 Purpose. This part prescribes administrative policies, guidance, and operating procedures for natural disaster preparedness...

  14. 24 CFR 5.238 - Criminal and civil penalties.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Development GENERAL HUD PROGRAM REQUIREMENTS; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers; Procedures for Obtaining Income Information Procedures for...

  15. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures/y at 25/center. OLT procedures performed in a single center for a reasonably large period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric-donor OLTs in adult recipients. During 2007, there were another 28 procedures. The greatest numbers of OLTs/y were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using the R Statistical Software (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted OLT/mo for 2007, calculated with Holt-Winters exponential smoothing applied to the previous period 1987-2006, helped to identify the months with a major difference between predicted and performed procedures. The time series approach may be helpful to establish a minimum volume/y at a single-center level.
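
    The Holt-Winters smoothing mentioned above was run in R; to illustrate the underlying recursion, here is a minimal Python sketch of Holt's double exponential smoothing (level plus trend), which is the Holt-Winters method without its seasonal component. The yearly counts and smoothing parameters below are hypothetical, not the study's data:

    ```python
    def holt_forecast(series, alpha=0.5, beta=0.3, horizon=3):
        """Holt's double exponential smoothing: maintain a smoothed level
        and a smoothed trend, then extrapolate linearly for the forecast."""
        level, trend = series[0], series[1] - series[0]
        for x in series[1:]:
            last_level = level
            # Update the level toward the new observation, the trend toward
            # the latest level change; alpha and beta are the smoothing weights.
            level = alpha * x + (1 - alpha) * (level + trend)
            trend = beta * (level - last_level) + (1 - beta) * trend
        return [level + (h + 1) * trend for h in range(horizon)]

    # Hypothetical yearly OLT counts
    counts = [38, 42, 45, 49, 51, 50]
    print([round(f, 1) for f in holt_forecast(counts)])
    ```

    The full Holt-Winters method used in the study adds a third smoothing equation for a monthly seasonal component, which is what allows month-by-month predicted-versus-performed comparisons.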

  16. A close examination of double filtering with fold change and t test in microarray analysis

    PubMed Central

    2009-01-01

    Background Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
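
    The double filtering procedure the paper critiques is simple to state concretely: a gene is called differentially expressed only if both its fold change and its t statistic pass fixed cutoffs. A minimal Python sketch with hypothetical expression values; the cutoffs of 2.0 are illustrative, not from the paper:

    ```python
    from statistics import mean, stdev

    def t_stat(a, b):
        """Welch two-sample t statistic (unequal variances)."""
        va, vb = stdev(a) ** 2, stdev(b) ** 2
        return (mean(a) - mean(b)) / ((va / len(a) + vb / len(b)) ** 0.5)

    def double_filter(expr_a, expr_b, fc_cut=2.0, t_cut=2.0):
        """Flag genes whose fold change AND |t| both exceed the cutoffs --
        the double-filtering procedure the paper argues against."""
        selected = []
        for gene, (a, b) in enumerate(zip(expr_a, expr_b)):
            fc = mean(a) / mean(b)
            if max(fc, 1 / fc) >= fc_cut and abs(t_stat(a, b)) >= t_cut:
                selected.append(gene)
        return selected

    # Hypothetical expression values: three genes, four replicates per condition
    cond_a = [[10, 11, 9, 10], [5.0, 5.1, 4.9, 5.0], [2, 16, 1, 17]]
    cond_b = [[4, 5, 4, 5],    [4.9, 5.0, 5.1, 5.0], [4, 5, 3, 4]]
    print(double_filter(cond_a, cond_b))  # -> [0]
    ```

    The example shows the contradiction the paper identifies: gene 2 has a large fold change (a common-variance criterion) but fails the t cutoff because of its high gene-specific variance, so the two filters embody conflicting assumptions about how gene variances arise.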

  17. Optimization of Multilocus Sequence Analysis for Identification of Species in the Genus Vibrio

    PubMed Central

    Gabriel, Michael W.; Matsui, George Y.; Friedman, Robert

    2014-01-01

    Multilocus sequence analysis (MLSA) is an important method for identification of taxa that are not well differentiated by 16S rRNA gene sequences alone. In this procedure, concatenated sequences of selected genes are constructed and then analyzed. The effects that the number and the order of genes used in MLSA have on reconstruction of phylogenetic relationships were examined. The recA, rpoA, gapA, 16S rRNA gene, gyrB, and ftsZ sequences from 56 species of the genus Vibrio were used to construct molecular phylogenies, and these were evaluated individually and using various gene combinations. Phylogenies from two-gene sequences employing recA and rpoA in both possible gene orders were different. The addition of the gapA gene sequence, producing all six possible concatenated sequences, reduced the differences in phylogenies to degrees of statistical (bootstrap) support for some nodes. The overall statistical support for the phylogenetic tree, assayed on the basis of a reliability score (calculated from the number of nodes having bootstrap values of ≥80 divided by the total number of nodes) increased with increasing numbers of genes used, up to a maximum of four. No further improvement was observed from addition of the fifth gene sequence (ftsZ), and addition of the sixth gene (gyrB) resulted in lower proportions of strongly supported nodes. Reductions in the numbers of strongly supported nodes were also observed when maximum parsimony was employed for tree construction. Use of a small number of gene sequences in MLSA resulted in accurate identification of Vibrio species. PMID:24951781

  18. A comparison of the views of extension agents and farmers regarding extension education courses in Dezful, Iran

    NASA Astrophysics Data System (ADS)

    Nazarzadeh Zare, Mohsen; Dorrani, Kamal; Gholamali Lavasani, Masoud

    2012-11-01

    Background and purpose: This study examines the views of farmers and extension agents participating in extension education courses in Dezful, Iran, with regard to problems with these courses. It relies upon a descriptive methodology, using a survey as its instrument. Sample: The statistical population consisted of 5060 farmers and 50 extension agents; all extension agents were studied owing to their small population, and a sample of 466 farmers was selected based on the stratified ratio sampling method. For the data analysis, statistical procedures including the t-test and factor analysis were used. Results: The results of factor analysis on the views of farmers indicated that these courses have problems such as inadequate use of instructional materials by extension agents, insufficient employment of knowledgeable and experienced extension agents, bad and inconvenient timing of courses for farmers, lack of logical connection between one curriculum and prior ones, negligence in considering the opinions of farmers in arranging the courses, and lack of information about the time of courses. The findings of factor analysis on the views of extension agents indicated that these courses suffer from problems such as use of consistent methods of instruction for teaching curricula, and lack of continuity between courses and their levels and content. Conclusions: Recommendations include: listening to the views of farmers when planning extension courses; providing audiovisual aids, pamphlets and CDs; arranging courses based on convenient timing for farmers; using incentives to encourage participation; and employing extension agents with knowledge of the latest agricultural issues.

  19. 28 CFR 42.407 - Procedures to determine compliance.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 28 Judicial Administration 1 2014-07-01 2014-07-01 false Procedures to determine compliance. 42.407 Section 42.407 Judicial Administration DEPARTMENT OF JUSTICE NONDISCRIMINATION; EQUAL EMPLOYMENT OPPORTUNITY; POLICIES AND PROCEDURES Coordination of Enforcement of Non-discrimination in Federally Assisted...

  20. 4 CFR 28.57 - Public hearings.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Public hearings. 28.57 Section 28.57 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  1. 4 CFR 28.59 - Official record.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Official record. 28.59 Section 28.59 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  2. 41 CFR 60-3.3 - Discrimination defined: Relationship between use of selection procedures and discrimination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... employment or membership opportunities of members of any race, sex, or ethnic group will be considered to be... selection procedures and suitable alternative methods of using the selection procedure which have as little...

  3. 41 CFR 60-3.3 - Discrimination defined: Relationship between use of selection procedures and discrimination.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... employment or membership opportunities of members of any race, sex, or ethnic group will be considered to be... selection procedures and suitable alternative methods of using the selection procedure which have as little...

  4. 41 CFR 60-3.3 - Discrimination defined: Relationship between use of selection procedures and discrimination.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... employment or membership opportunities of members of any race, sex, or ethnic group will be considered to be... selection procedures and suitable alternative methods of using the selection procedure which have as little...

  5. 41 CFR 60-3.3 - Discrimination defined: Relationship between use of selection procedures and discrimination.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... employment or membership opportunities of members of any race, sex, or ethnic group will be considered to be... selection procedures and suitable alternative methods of using the selection procedure which have as little...

  6. 32 CFR 327.4 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., medical history, criminal or employment history and that contains his or her name, or the identifying... equipment employing automated technology, systematic procedures, and trained personnel for the primary...

  7. Statistics of Scientific Procedures on Living Animals Great Britain 2015 - highlighting an ongoing upward trend in animal use and missed opportunities.

    PubMed

    Hudson-Shore, Michelle

    2016-12-01

    The Annual Statistics of Scientific Procedures on Living Animals Great Britain 2015 indicate that the Home Office were correct in recommending that caution should be exercised when interpreting the 2014 data as an apparent decline in animal experiments. The 2015 report shows that, as the changes to the format of the annual statistics have become more familiar and less problematic, there has been a re-emergence of the upward trend in animal research and testing in Great Britain. The 2015 statistics report an increase in animal procedures (up to 4,142,631) and in the number of animals used (up to 4,069,349). This represents 1% more than the totals in 2013, and a 7% increase on the procedures reported in 2014. This paper details an analysis of these most recent statistics, providing information on overall animal use and highlighting specific issues associated with genetically-altered animals, dogs and primates. It also reflects on areas of the new format that have previously been highlighted as being problematic, and concludes with a discussion about the use of animals in regulatory research and testing, and how there are significant missed opportunities for replacing some of the animal-based tests in this area. 2016 FRAME.

  8. Self-Regulated Learning Strategies in Relation with Statistics Anxiety

    ERIC Educational Resources Information Center

    Kesici, Sahin; Baloglu, Mustafa; Deniz, M. Engin

    2011-01-01

Dealing with students' attitudinal problems related to statistics is an important aspect of statistics instruction. Employing appropriate learning strategies may be related to anxiety during the process of statistics learning. Thus, the present study investigated multivariate relationships between self-regulated learning strategies…

  9. Thyroid cancer and employment as a radiologic technologist.

    PubMed

    Zabel, Erik W; Alexander, Bruce H; Mongin, Steven J; Doody, Michele M; Sigurdson, Alice J; Linet, Martha S; Freedman, D Michal; Hauptmann, Michael; Mabuchi, Kiyohiko; Ron, Elaine

    2006-10-15

    The association between chronic occupational ionizing radiation exposure in the medical field and thyroid cancer is not well characterized. Thyroid cancer incidence was ascertained for 2 periods in a cohort of radiologic technologists certified for a minimum 2 years and enumerated in 1983: (i) cases identified prospectively in 73,080 radiologic technologists who were free of thyroid cancer at the baseline survey and completed a second questionnaire a decade later (N = 121), and (ii) cases occurring prior to cohort enumeration among 90,245 technologists who completed the baseline survey and were thyroid cancer free 2 years after certification (N = 148). Survival analyses estimated risks associated with employment as a radiologic technologist, including duration of employment, period of employment, types of procedures and work practices. The only occupational history characteristic associated with prospectively identified thyroid cancer was a history of holding patients for X-ray procedures at least 50 times (HR = 1.47, 95% CI = 1.01-2.15). Total years worked as a radiologic technologist, years performing diagnostic, therapeutic, and nuclear medicine procedures, employment under age 20 and calendar period of first employment were not associated with thyroid cancer risk. Risk of thyroid cancers diagnosed before the baseline questionnaire was inversely associated with decade first employed as a technologist, and was elevated, albeit imprecisely, among those working more than 5 years prior to 1950 (HR = 3.04, 95% CI = 1.01-10.78). These data provide modest evidence of an association between employment as a radiologic technologist and thyroid cancer risk; however, the findings require confirmation with more accurate exposure models. Copyright 2006 Wiley-Liss, Inc.

  10. Determining resistance to soft-rot fungi

    Treesearch

    C. G. Duncan

    1965-01-01

    A laboratory procedure is outlined that incorporates techniques found to promote soft rot by several fungi. This procedure employs either an agar or a soil substrate. Also presented are the principal findings of experiments underlying the procedure. Results of tests conducted according to the suggested procedure are illustrated. The overall decay resistance of the...

  11. Visual Contrast Sensitivity Functions Obtained from Untrained Observers Using Tracking and Staircase Procedures. Final Report.

    ERIC Educational Resources Information Center

    Geri, George A.; Hubbard, David C.

    Two adaptive psychophysical procedures (tracking and "yes-no" staircase) for obtaining human visual contrast sensitivity functions (CSF) were evaluated. The procedures were chosen based on their proven validity and the desire to evaluate the practical effects of stimulus transients, since tracking procedures traditionally employ gradual…

  12. 10 CFR 10.24 - Procedures for hearing and review.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Procedures for hearing and review. 10.24 Section 10.24... RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.24 Procedures for... requirements set forth in § 10.22, the Director, Office of Administration, shall forthwith appoint a Hearing...

  13. 10 CFR 10.24 - Procedures for hearing and review.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Procedures for hearing and review. 10.24 Section 10.24... RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.24 Procedures for... requirements set forth in § 10.22, the Director, Office of Administration, shall forthwith appoint a Hearing...

  14. 10 CFR 10.24 - Procedures for hearing and review.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Procedures for hearing and review. 10.24 Section 10.24... RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.24 Procedures for... requirements set forth in § 10.22, the Director, Office of Administration, shall forthwith appoint a Hearing...

  15. 10 CFR 10.24 - Procedures for hearing and review.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Procedures for hearing and review. 10.24 Section 10.24... RESTRICTED DATA OR NATIONAL SECURITY INFORMATION OR AN EMPLOYMENT CLEARANCE Procedures § 10.24 Procedures for... requirements set forth in § 10.22, the Director, Office of Administration, shall forthwith appoint a Hearing...

  16. Estimating times of surgeries with two component procedures: comparison of the lognormal and normal models.

    PubMed

    Strum, David P; May, Jerrold H; Sampson, Allan R; Vargas, Luis G; Spangler, William E

    2003-01-01

    Variability inherent in the duration of surgical procedures complicates surgical scheduling. Modeling the duration and variability of surgeries might improve time estimates. Accurate time estimates are important operationally to improve utilization, reduce costs, and identify surgeries that might be considered outliers. Surgeries with multiple procedures are difficult to model because they are difficult to segment into homogenous groups and because they are performed less frequently than single-procedure surgeries. The authors studied, retrospectively, 10,740 surgeries each with exactly two CPTs and 46,322 surgical cases with only one CPT from a large teaching hospital to determine if the distribution of dual-procedure surgery times fit more closely a lognormal or a normal model. The authors tested model goodness of fit to their data using Shapiro-Wilk tests, studied factors affecting the variability of time estimates, and examined the impact of coding permutations (ordered combinations) on modeling. The Shapiro-Wilk tests indicated that the lognormal model is statistically superior to the normal model for modeling dual-procedure surgeries. Permutations of component codes did not appear to differ significantly with respect to total procedure time and surgical time. To improve individual models for infrequent dual-procedure surgeries, permutations may be reduced and estimates may be based on the longest component procedure and type of anesthesia. The authors recommend use of the lognormal model for estimating surgical times for surgeries with two component procedures. Their results help legitimize the use of log transforms to normalize surgical procedure times prior to hypothesis testing using linear statistical models. Multiple-procedure surgeries may be modeled using the longest (statistically most important) component procedure and type of anesthesia.
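The lognormal-versus-normal comparison described above can be sketched with a Shapiro-Wilk test applied to raw and log-transformed durations. This is a minimal illustration on simulated data, not the authors' dataset or code; the distribution parameters below are purely illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical surgical durations (minutes), drawn from a lognormal model
durations = rng.lognormal(mean=np.log(120), sigma=0.5, size=200)

# Shapiro-Wilk goodness-of-fit on raw vs. log-transformed durations
w_raw, p_raw = stats.shapiro(durations)
w_log, p_log = stats.shapiro(np.log(durations))

# For lognormally distributed data, the log transform should look far
# more normal, so p_log is typically much larger than p_raw.
print(f"raw: W={w_raw:.3f} p={p_raw:.3g}")
print(f"log: W={w_log:.3f} p={p_log:.3g}")
```

A larger Shapiro-Wilk p-value after the log transform is the pattern that supports normalizing surgical times by log transformation before fitting linear models.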

  17. Applications of statistics to medical science (1) Fundamental concepts.

    PubMed

    Watanabe, Hiroshi

    2011-01-01

    The conceptual framework of statistical tests and statistical inferences are discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.

  18. Volume changes of autogenous bone after sinus lifting and grafting procedures: a 6-year computerized tomographic follow-up.

    PubMed

    Sbordone, Carolina; Toti, Paolo; Guidetti, Franco; Califano, Luigi; Bufo, Pantaleo; Sbordone, Ludovico

    2013-04-01

    To evaluate long-term bone remodelling of autografts over time (annually, for 6 years), comparing the block and particulate bone procedures for sinus floor elevation, as well as to evaluate the survival of positioned dental implants. Twenty-three sinus lift procedures with autogenous bone were performed: seven sinus lift procedures using particulate graft and 10 with block autogenous bone were performed in 17 patients. Employing a software program, pre- and post-surgical computerized tomography (CT) scans were used to compare the volume (V) and density (D) of inlay grafts over time (up to 6 years), and to determine the percentage of remaining bone (%R). All variable (V, D and %R) measurements were then compared statistically. At the 6-year survey for block form, a resorption of 21.5% was seen, whereas for particulate grafts there was a resorption of 39.2%. Both groups exhibited bone remodelling between the first and second follow-up which was significant regarding volume for the block form and regarding density for the particulate group. During the initial period of healing, the cortico-cancellous block bone grafted into the maxillary sinus underwent a negative remodelling of the volume, which is most probably due to graft cortex resorption, coupled with, primarily, an increase in density in the spongious area; for the particulate grafts, significant augmentations in density were obtained. The lack of significant differences among volumes was due to the wide degree of dispersion of the data. The rough data presented in this paper seem to support the use of a bone-block grafting procedure in maxillary sinus augmentation. Copyright © 2012 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  19. Implementation of the Rauch-Tung-Striebel Smoother for Sensor Compatibility Correction of a Fixed-Wing Unmanned Air Vehicle

    PubMed Central

    Chan, Woei-Leong; Hsiao, Fei-Bin

    2011-01-01

    This paper presents a complete procedure for sensor compatibility correction of a fixed-wing Unmanned Air Vehicle (UAV). The sensors consist of a differential air pressure transducer for airspeed measurement, two airdata vanes installed on an airdata probe for angle of attack (AoA) and angle of sideslip (AoS) measurement, and an Attitude and Heading Reference System (AHRS) that provides attitude angles, angular rates, and acceleration. The procedure is mainly based on a two pass algorithm called the Rauch-Tung-Striebel (RTS) smoother, which consists of a forward pass Extended Kalman Filter (EKF) and a backward recursion smoother. On top of that, this paper proposes the implementation of the Wiener Type Filter prior to the RTS in order to avoid the complicated process noise covariance matrix estimation. Furthermore, an easy to implement airdata measurement noise variance estimation method is introduced. The method estimates the airdata and subsequently the noise variances using the ground speed and ascent rate provided by the Global Positioning System (GPS). It incorporates the idea of data regionality by assuming that some sort of statistical relation exists between nearby data points. Root mean square deviation (RMSD) is being employed to justify the sensor compatibility. The result shows that the presented procedure is easy to implement and it improves the UAV sensor data compatibility significantly. PMID:22163819
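The forward-filter/backward-smoother structure described above can be sketched for the simplest case, a one-dimensional random-walk state observed in noise. This is an illustrative reduction, not the paper's multivariate airdata implementation; all parameter values are assumptions.

```python
import numpy as np

def rts_smoother(z, q, r, x0=0.0, p0=1.0):
    """Forward Kalman filter plus backward RTS pass for the 1-D model
    x_k = x_{k-1} + w_k (var q), z_k = x_k + v_k (var r)."""
    n = len(z)
    xf = np.empty(n); pf = np.empty(n)   # filtered mean / variance
    xp = np.empty(n); pp = np.empty(n)   # predicted mean / variance
    x, p = x0, p0
    for k in range(n):
        x_pred, p_pred = x, p + q        # predict (state transition F = 1)
        xp[k], pp[k] = x_pred, p_pred
        gain = p_pred / (p_pred + r)     # update with measurement z[k]
        x = x_pred + gain * (z[k] - x_pred)
        p = (1.0 - gain) * p_pred
        xf[k], pf[k] = x, p
    xs = xf.copy(); ps = pf.copy()       # backward RTS recursion
    for k in range(n - 2, -1, -1):
        c = pf[k] / pp[k + 1]
        xs[k] = xf[k] + c * (xs[k + 1] - xp[k + 1])
        ps[k] = pf[k] + c**2 * (ps[k + 1] - pp[k + 1])
    return xs, ps

rng = np.random.default_rng(1)
truth = np.cumsum(rng.normal(0, 0.1, 300))   # slowly drifting signal
meas = truth + rng.normal(0, 1.0, 300)       # noisy sensor readings
smoothed, _ = rts_smoother(meas, q=0.01, r=1.0)
print(np.std(meas - truth), np.std(smoothed - truth))
```

Because the smoother uses measurements both before and after each time step, its error is noticeably smaller than that of the raw measurements, which is the property exploited for off-line sensor compatibility correction.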

  20. Implementation of the Rauch-Tung-Striebel smoother for sensor compatibility correction of a fixed-wing unmanned air vehicle.

    PubMed

    Chan, Woei-Leong; Hsiao, Fei-Bin

    2011-01-01

    This paper presents a complete procedure for sensor compatibility correction of a fixed-wing Unmanned Air Vehicle (UAV). The sensors consist of a differential air pressure transducer for airspeed measurement, two airdata vanes installed on an airdata probe for angle of attack (AoA) and angle of sideslip (AoS) measurement, and an Attitude and Heading Reference System (AHRS) that provides attitude angles, angular rates, and acceleration. The procedure is mainly based on a two pass algorithm called the Rauch-Tung-Striebel (RTS) smoother, which consists of a forward pass Extended Kalman Filter (EKF) and a backward recursion smoother. On top of that, this paper proposes the implementation of the Wiener Type Filter prior to the RTS in order to avoid the complicated process noise covariance matrix estimation. Furthermore, an easy to implement airdata measurement noise variance estimation method is introduced. The method estimates the airdata and subsequently the noise variances using the ground speed and ascent rate provided by the Global Positioning System (GPS). It incorporates the idea of data regionality by assuming that some sort of statistical relation exists between nearby data points. Root mean square deviation (RMSD) is being employed to justify the sensor compatibility. The result shows that the presented procedure is easy to implement and it improves the UAV sensor data compatibility significantly.

  1. SPRT Calibration Uncertainties and Internal Quality Control at a Commercial SPRT Calibration Facility

    NASA Astrophysics Data System (ADS)

    Wiandt, T. J.

    2008-06-01

    The Hart Scientific Division of the Fluke Corporation operates two accredited standard platinum resistance thermometer (SPRT) calibration facilities, one at the Hart Scientific factory in Utah, USA, and the other at a service facility in Norwich, UK. The US facility is accredited through National Voluntary Laboratory Accreditation Program (NVLAP), and the UK facility is accredited through UKAS. Both provide SPRT calibrations using similar equipment and procedures, and at similar levels of uncertainty. These uncertainties are among the lowest available commercially. To achieve and maintain low uncertainties, it is required that the calibration procedures be thorough and optimized. However, to minimize customer downtime, it is also important that the instruments be calibrated in a timely manner and returned to the customer. Consequently, subjecting the instrument to repeated calibrations or extensive repeated measurements is not a viable approach. Additionally, these laboratories provide SPRT calibration services involving a wide variety of SPRT designs. These designs behave differently, yet predictably, when subjected to calibration measurements. To this end, an evaluation strategy involving both statistical process control and internal consistency measures is utilized to provide confidence in both the instrument calibration and the calibration process. This article describes the calibration facilities, procedure, uncertainty analysis, and internal quality assurance measures employed in the calibration of SPRTs. Data will be reviewed and generalities will be presented. Finally, challenges and considerations for future improvements will be discussed.
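A common statistical-process-control tool of the kind mentioned above is the Shewhart individuals (X-mR) chart, whose limits are set from the mean moving range. This sketch uses invented thermometer readings and is only an example of the general SPC technique, not the facility's actual procedure.

```python
import numpy as np

def individuals_chart_limits(x):
    """Shewhart individuals (X-mR) chart: centre line at the mean,
    control limits at mean +/- 2.66 * mean moving range
    (2.66 = 3 / d2 with d2 = 1.128 for subgroups of size 2)."""
    x = np.asarray(x, dtype=float)
    mr = np.abs(np.diff(x)).mean()       # mean moving range
    centre = x.mean()
    return centre - 2.66 * mr, centre, centre + 2.66 * mr

# Hypothetical repeated check-standard readings (ohms)
readings = [100.02, 99.98, 100.01, 100.03, 99.97, 100.00, 100.04, 99.99]
lcl, cl, ucl = individuals_chart_limits(readings)
out_of_control = [v for v in readings if not lcl <= v <= ucl]
print(lcl, cl, ucl, out_of_control)
```

Points falling outside the computed limits would flag a calibration process that has drifted and warrants investigation.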

  2. Sub-barrier radioactive ion beam investigations using a new methodology and analysis for the stacked target technique

    NASA Astrophysics Data System (ADS)

    Fisichella, M.; Shotter, A. C.; Di Pietro, A.; Figuera, P.; Lattuada, M.; Marchetta, C.; Privitera, V.; Romano, L.; Ruiz, C.; Zadro, M.

    2015-12-01

For low energy reaction studies involving radioactive ion beams, the experimental reaction yields are generally small due to the low intensity of the beams. For this reason, the stacked target technique has often been used to measure excitation functions. This technique offers considerable advantages since the reaction cross-section at several energies can be simultaneously measured. In a further effort to increase yields, thick targets are also employed. The main disadvantage of the method is the degradation of the beam quality as it passes through the stack due to the statistical nature of energy loss processes and any nonuniformity of the stacked targets. This degradation can lead to ambiguities in associating effective beam energies to reaction product yields for the targets within the stack and, as a consequence, to an error in the determination of the excitation function for the reaction under study. A thorough investigation of these ambiguities is reported, and a best-practice procedure for analyzing data obtained using the stacked target technique with radioactive ion beams is recommended. Using this procedure, a re-evaluation of some previously published sub-barrier fusion data is reported in order to demonstrate the possibility of misinterpretations of derived excitation functions. In addition, this best-practice procedure has been used to evaluate, from a new data set, the sub-barrier fusion excitation function for the reaction 6Li+120Sn.

  3. Omnibus risk assessment via accelerated failure time kernel machine modeling.

    PubMed

    Sinnott, Jennifer A; Cai, Tianxi

    2013-12-01

    Integrating genomic information with traditional clinical risk factors to improve the prediction of disease outcomes could profoundly change the practice of medicine. However, the large number of potential markers and possible complexity of the relationship between markers and disease make it difficult to construct accurate risk prediction models. Standard approaches for identifying important markers often rely on marginal associations or linearity assumptions and may not capture non-linear or interactive effects. In recent years, much work has been done to group genes into pathways and networks. Integrating such biological knowledge into statistical learning could potentially improve model interpretability and reliability. One effective approach is to employ a kernel machine (KM) framework, which can capture nonlinear effects if nonlinear kernels are used (Scholkopf and Smola, 2002; Liu et al., 2007, 2008). For survival outcomes, KM regression modeling and testing procedures have been derived under a proportional hazards (PH) assumption (Li and Luan, 2003; Cai, Tonini, and Lin, 2011). In this article, we derive testing and prediction methods for KM regression under the accelerated failure time (AFT) model, a useful alternative to the PH model. We approximate the null distribution of our test statistic using resampling procedures. When multiple kernels are of potential interest, it may be unclear in advance which kernel to use for testing and estimation. We propose a robust Omnibus Test that combines information across kernels, and an approach for selecting the best kernel for estimation. The methods are illustrated with an application in breast cancer. © 2013, The International Biometric Society.

  4. Distribution of guidance models for cardiac resynchronization therapy in the setting of multi-center clinical trials

    NASA Astrophysics Data System (ADS)

    Rajchl, Martin; Abhari, Kamyar; Stirrat, John; Ukwatta, Eranga; Cantor, Diego; Li, Feng P.; Peters, Terry M.; White, James A.

    2014-03-01

    Multi-center trials provide the unique ability to investigate novel techniques across a range of geographical sites with sufficient statistical power, the inclusion of multiple operators determining feasibility under a wider array of clinical environments and work-flows. For this purpose, we introduce a new means of distributing pre-procedural cardiac models for image-guided interventions across a large scale multi-center trial. In this method, a single core facility is responsible for image processing, employing a novel web-based interface for model visualization and distribution. The requirements for such an interface, being WebGL-based, are minimal and well within the realms of accessibility for participating centers. We then demonstrate the accuracy of our approach using a single-center pacemaker lead implantation trial with generic planning models.

  5. Stability assessment of QKD procedures in commercial quantum cryptography systems versus quality of dark channel

    NASA Astrophysics Data System (ADS)

    Jacak, Monika; Melniczuk, Damian; Jacak, Janusz; Jóźwiak, Ireneusz; Gruber, Jacek; Jóźwiak, Piotr

    2015-02-01

    In order to assess the susceptibility of the quantum key distribution (QKD) systems to the hacking attack including simultaneous and frequent system self-decalibrations, we analyze the stability of the QKD transmission organized in two commercially available systems. The first one employs non-entangled photons as flying qubits in the dark quantum channel for communication whereas the second one utilizes the entangled photon pairs to secretly share the cryptographic key. Applying standard methods of the statistical data analysis to the characteristic indicators of the quality of the QKD communication (the raw key exchange rate [RKER] and the quantum bit error rate [QBER]), we have estimated the pace of the self-decalibration of both systems and the repeatability rate in the case of controlled worsening of the dark channel quality.

  6. Premixing quality and flame stability: A theoretical and experimental study

    NASA Technical Reports Server (NTRS)

    Radhakrishnan, K.; Heywood, J. B.; Tabaczynski, R. J.

    1979-01-01

    Models for predicting flame ignition and blowout in a combustor primary zone are presented. A correlation for the blowoff velocity of premixed turbulent flames is developed using the basic quantities of turbulent flow, and the laminar flame speed. A statistical model employing a Monte Carlo calculation procedure is developed to account for nonuniformities in a combustor primary zone. An overall kinetic rate equation is used to describe the fuel oxidation process. The model is used to predict the lean ignition and blow out limits of premixed turbulent flames; the effects of mixture nonuniformity on the lean ignition limit are explored using an assumed distribution of fuel-air ratios. Data on the effects of variations in inlet temperature, reference velocity and mixture uniformity on the lean ignition and blowout limits of gaseous propane-air flames are presented.
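The Monte Carlo treatment of mixture nonuniformity sketched above amounts to sampling local fuel-air ratios from an assumed distribution and asking what fraction of fluid elements lies above a local ignition threshold. The sketch below is an illustrative simplification with invented parameter values, not the paper's combustor model.

```python
import numpy as np

# Monte Carlo sketch of mixture nonuniformity: sample local fuel-air
# equivalence ratios around a lean mean value and estimate the fraction
# of fluid elements above an assumed local flammability threshold.
rng = np.random.default_rng(3)
phi_mean, phi_sigma = 0.55, 0.08   # illustrative mean and spread
phi_lean_limit = 0.50              # assumed local ignition threshold
samples = rng.normal(phi_mean, phi_sigma, 100_000)
ignitable_fraction = (samples > phi_lean_limit).mean()
print(ignitable_fraction)
```

Sweeping the mean equivalence ratio downward and watching this fraction fall is one simple way a nonuniformity distribution shifts the predicted lean ignition limit.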

  7. Fuzzy Adaptive Control for Intelligent Autonomous Space Exploration Problems

    NASA Technical Reports Server (NTRS)

    Esogbue, Augustine O.

    1998-01-01

The principal objective of the research reported here is the re-design, analysis and optimization of our newly developed neural network fuzzy adaptive controller model for complex processes capable of learning fuzzy control rules using process data and improving its control through on-line adaptation. The learned improvement is according to a performance objective function that provides evaluative feedback; this performance objective is broadly defined to meet long-range goals over time. Although fuzzy control had proven effective for complex, nonlinear, imprecisely-defined processes for which standard models and controls are either inefficient, impractical or cannot be derived, the state of the art prior to our work showed that procedures for deriving fuzzy control were mostly ad hoc heuristics. The learning ability of neural networks was exploited to systematically derive fuzzy control, permit on-line adaptation and, in the process, optimize control. The operation of neural networks integrates very naturally with fuzzy logic. The neural networks, which were designed and tested using simulation software and simulated data, followed by realistic industrial data, were reconfigured for application on several platforms as well as for the employment of improved algorithms. The statistical procedures of the learning process were investigated and evaluated with standard statistical procedures (such as ANOVA, graphical analysis of residuals, etc.). The computational advantage of dynamic programming-like methods of optimal control was used to permit on-line fuzzy adaptive control. Tests for the consistency, completeness and interaction of the control rules were applied. Comparisons to other methods and controllers were made so as to identify the major advantages of the resulting controller model. Several specific modifications and extensions were made to the original controller. Additional modifications and explorations have been proposed for further study.
Some of these are in progress in our laboratory while others await additional support. All of these enhancements will improve the attractiveness of the controller as an effective tool for the on-line control of an array of complex process environments.

  8. Planetarium instructional efficacy: A research synthesis

    NASA Astrophysics Data System (ADS)

    Brazell, Bruce D.

    The purpose of the current study was to explore the instructional effectiveness of the planetarium in astronomy education using meta-analysis. A review of the literature revealed 46 studies related to planetarium efficacy. However, only 19 of the studies satisfied selection criteria for inclusion in the meta-analysis. Selected studies were then subjected to coding procedures, which extracted information such as subject characteristics, experimental design, and outcome measures. From these data, 24 effect sizes were calculated in the area of student achievement and five effect sizes were determined in the area of student attitudes using reported statistical information. Mean effect sizes were calculated for both the achievement and the attitude distributions. Additionally, each effect size distribution was subjected to homogeneity analysis. The attitude distribution was found to be homogeneous with a mean effect size of -0.09, which was not significant, p = .2535. The achievement distribution was found to be heterogeneous with a statistically significant mean effect size of +0.28, p < .05. Since the achievement distribution was heterogeneous, the analog to the ANOVA procedure was employed to explore variability in this distribution in terms of the coded variables. The analog to the ANOVA procedure revealed that the variability introduced by the coded variables did not fully explain the variability in the achievement distribution beyond subject-level sampling error under a fixed effects model. Therefore, a random effects model analysis was performed which resulted in a mean effect size of +0.18, which was not significant, p = .2363. However, a large random effect variance component was determined indicating that the differences between studies were systematic and yet to be revealed. The findings of this meta-analysis showed that the planetarium has been an effective instructional tool in astronomy education in terms of student achievement. 
However, the meta-analysis revealed that the planetarium has not been a very effective tool for improving student attitudes towards astronomy.
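The fixed-effects pooling and homogeneity analysis described above can be sketched with standard inverse-variance weighting and Cochran's Q. The effect sizes and variances below are invented for illustration; they are not the 24 achievement or 5 attitude effects from this synthesis.

```python
import numpy as np

def fixed_effect_meta(effects, variances):
    """Fixed-effects meta-analysis: inverse-variance weighted mean
    effect, its standard error, and Cochran's Q homogeneity statistic
    (compared against a chi-square with k - 1 df)."""
    d = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    mean = (w * d).sum() / w.sum()
    se = np.sqrt(1.0 / w.sum())
    q = (w * (d - mean) ** 2).sum()
    return mean, se, q

# Hypothetical study effect sizes (Cohen's d) and their variances
d = [0.35, 0.10, 0.42, 0.28, 0.15]
v = [0.02, 0.05, 0.03, 0.04, 0.06]
mean, se, q = fixed_effect_meta(d, v)
print(mean, se, q)
```

A Q value large relative to the chi-square critical value for k - 1 degrees of freedom indicates a heterogeneous distribution, the situation that prompted the moderator (analog-to-ANOVA) and random-effects analyses in the study above.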

  9. Accounting for model error in Bayesian solutions to hydrogeophysical inverse problems using a local basis approach

    NASA Astrophysics Data System (ADS)

    Irving, J.; Koepke, C.; Elsheikh, A. H.

    2017-12-01

    Bayesian solutions to geophysical and hydrological inverse problems are dependent upon a forward process model linking subsurface parameters to measured data, which is typically assumed to be known perfectly in the inversion procedure. However, in order to make the stochastic solution of the inverse problem computationally tractable using, for example, Markov-chain-Monte-Carlo (MCMC) methods, fast approximations of the forward model are commonly employed. This introduces model error into the problem, which has the potential to significantly bias posterior statistics and hamper data integration efforts if not properly accounted for. Here, we present a new methodology for addressing the issue of model error in Bayesian solutions to hydrogeophysical inverse problems that is geared towards the common case where these errors cannot be effectively characterized globally through some parametric statistical distribution or locally based on interpolation between a small number of computed realizations. Rather than focusing on the construction of a global or local error model, we instead work towards identification of the model-error component of the residual through a projection-based approach. In this regard, pairs of approximate and detailed model runs are stored in a dictionary that grows at a specified rate during the MCMC inversion procedure. At each iteration, a local model-error basis is constructed for the current test set of model parameters using the K-nearest neighbour entries in the dictionary, which is then used to separate the model error from the other error sources before computing the likelihood of the proposed set of model parameters. We demonstrate the performance of our technique on the inversion of synthetic crosshole ground-penetrating radar traveltime data for three different subsurface parameterizations of varying complexity. 
The synthetic data are generated using the eikonal equation, whereas a straight-ray forward model is assumed in the inversion procedure. In each case, the developed model-error approach enables us to remove posterior bias and obtain a more realistic characterization of uncertainty.
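
    The dictionary-and-projection idea in this abstract can be sketched in a few lines of numpy. The following is an illustrative reconstruction, not the authors' code: the function names, the use of an SVD to form the local basis, and the Gaussian likelihood are all assumptions, and a real implementation would also grow the dictionary during sampling.

    ```python
    import numpy as np

    def local_error_basis(theta, dict_params, dict_errors, k=20, n_basis=5):
        """Build a local model-error basis from the K nearest dictionary entries.

        dict_params : (N, p) parameter sets with stored approximate/detailed runs
        dict_errors : (N, d) corresponding model errors
                      (detailed-model data minus approximate-model data)
        """
        # K-nearest-neighbour lookup in parameter space
        dists = np.linalg.norm(dict_params - theta, axis=1)
        nearest = np.argsort(dists)[:k]
        # SVD of the neighbouring error realizations yields an orthonormal basis
        _, _, vt = np.linalg.svd(dict_errors[nearest], full_matrices=False)
        return vt[:n_basis].T                     # shape (d, n_basis)

    def log_likelihood(theta, d_obs, approx_model, basis, sigma=1.0):
        """Gaussian log-likelihood after projecting out the model-error subspace."""
        residual = d_obs - approx_model(theta)
        # remove the residual component explained by the local error basis
        coeffs = basis.T @ residual
        clean = residual - basis @ coeffs
        return -0.5 * np.sum(clean**2) / sigma**2
    ```

    At each MCMC iteration the proposal's likelihood would be evaluated with `log_likelihood`, so that structured model error captured by the basis no longer biases the posterior.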

  10. 29 CFR 1626.2 - Terms defined in the Age Discrimination in Employment Act of 1967, as amended.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 29 Labor 4 2010-07-01 2010-07-01 false Terms defined in the Age Discrimination in Employment Act of 1967, as amended. 1626.2 Section 1626.2 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.2 Terms defined in...

  11. 29 CFR 1626.2 - Terms defined in the Age Discrimination in Employment Act of 1967, as amended.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 29 Labor 4 2012-07-01 2012-07-01 false Terms defined in the Age Discrimination in Employment Act of 1967, as amended. 1626.2 Section 1626.2 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.2 Terms defined in...

  12. 29 CFR 1626.2 - Terms defined in the Age Discrimination in Employment Act of 1967, as amended.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 29 Labor 4 2013-07-01 2013-07-01 false Terms defined in the Age Discrimination in Employment Act of 1967, as amended. 1626.2 Section 1626.2 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.2 Terms defined in...

  13. 29 CFR 1626.2 - Terms defined in the Age Discrimination in Employment Act of 1967, as amended.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 29 Labor 4 2011-07-01 2011-07-01 false Terms defined in the Age Discrimination in Employment Act of 1967, as amended. 1626.2 Section 1626.2 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.2 Terms defined in...

  14. 29 CFR 1626.2 - Terms defined in the Age Discrimination in Employment Act of 1967, as amended.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 4 2014-07-01 2014-07-01 false Terms defined in the Age Discrimination in Employment Act of 1967, as amended. 1626.2 Section 1626.2 Labor Regulations Relating to Labor (Continued) EQUAL EMPLOYMENT OPPORTUNITY COMMISSION PROCEDURES-AGE DISCRIMINATION IN EMPLOYMENT ACT § 1626.2 Terms defined in...

  15. An Agency with a Mind of Its Own: The EEOC's Guidelines on Employment Testing.

    ERIC Educational Resources Information Center

    Lyons, Phil

    1985-01-01

    Analyzes the legislative and judicial background of the Equal Employment Opportunities Commission (EEOC)'s 1978 Uniform Guidelines on Employment Selection Procedures. Argues that the guidelines' apparent objectives are at odds with those of the legislators who created the EEOC in the 1960s: the guidelines force employers to discriminate by race.…

  16. 4 CFR 28.47 - Motion to quash.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 4 Accounts 1 2010-01-01 2010-01-01 false Motion to quash. 28.47 Section 28.47 Accounts GOVERNMENT ACCOUNTABILITY OFFICE GENERAL PROCEDURES GOVERNMENT ACCOUNTABILITY OFFICE PERSONNEL APPEALS BOARD; PROCEDURES APPLICABLE TO CLAIMS CONCERNING EMPLOYMENT PRACTICES AT THE GOVERNMENT ACCOUNTABILITY OFFICE Procedures...

  17. An Empirical Investigation of Methods for Assessing Item Fit for Mixed Format Tests

    ERIC Educational Resources Information Center

    Chon, Kyong Hee; Lee, Won-Chan; Ansley, Timothy N.

    2013-01-01

    Empirical information regarding performance of model-fit procedures has been a persistent need in measurement practice. Statistical procedures for evaluating item fit were applied to real test examples that consist of both dichotomously and polytomously scored items. The item fit statistics used in this study included the PARSCALE's G[squared],…

  18. An automated approach to the design of decision tree classifiers

    NASA Technical Reports Server (NTRS)

    Argentiero, P.; Chin, R.; Beaudet, P.

    1982-01-01

    An automated technique is presented for designing effective decision tree classifiers predicated only on a priori class statistics. The procedure relies on linear feature extractions and Bayes table look-up decision rules. Associated error matrices are computed and utilized to provide an optimal design of the decision tree at each so-called 'node'. A by-product of this procedure is a simple algorithm for computing the global probability of correct classification assuming the statistical independence of the decision rules. Attention is given to a more precise definition of decision tree classification, the mathematical details on the technique for automated decision tree design, and an example of a simple application of the procedure using class statistics acquired from an actual Landsat scene.
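
    The "by-product" algorithm mentioned here, computing the global probability of correct classification under statistical independence of the node decision rules, reduces to a prior-weighted product of per-node accuracies along each class's path through the tree. The sketch below is illustrative only; the path accuracies and priors are made-up numbers, not values from the paper.

    ```python
    import numpy as np

    # Hypothetical per-node accuracies along the path each class follows
    # through the decision tree (illustrative values).
    paths = {
        "water":  [0.98, 0.95],        # two decisions from root to leaf
        "forest": [0.98, 0.90, 0.93],  # three decisions from root to leaf
    }
    priors = {"water": 0.4, "forest": 0.6}

    def global_pcc(paths, priors):
        """Prior-weighted global probability of correct classification,
        treating each node's decision rule as statistically independent."""
        return sum(priors[c] * np.prod(p) for c, p in paths.items())
    ```

    With independence, a class is classified correctly only if every decision on its path is correct, hence the product; weighting by the priors gives the overall figure of merit for the tree design.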

  19. Biostatistical analysis of quantitative immunofluorescence microscopy images.

    PubMed

    Giles, C; Albrecht, M A; Lam, V; Takechi, R; Mamo, J C

    2016-12-01

Semiquantitative immunofluorescence microscopy has become a key methodology in biomedical research. Typical statistical workflows are considered in the context of avoiding pseudo-replication and marginalising experimental error. However, immunofluorescence microscopy naturally generates hierarchically structured data that can be leveraged to improve statistical power and enrich biological interpretation. Herein, we describe a robust distribution fitting procedure and compare several statistical tests, outlining their potential advantages/disadvantages in the context of biological interpretation. Further, we describe tractable procedures for power analysis that incorporate the underlying distribution, sample size and number of images captured per sample. The procedures outlined have significant potential for increasing understanding of biological processes and decreasing both ethical and financial burden through experimental optimization. © 2016 The Authors Journal of Microscopy © 2016 Royal Microscopical Society.
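
    A power analysis of the kind described, accounting for both the number of subjects and the number of images per subject, can be approximated by Monte-Carlo simulation. This sketch is an assumption-laden illustration (not the authors' procedure): it uses normal variance components, averages images within subject to avoid pseudo-replication, and applies a plain two-sample t-test with a hard-coded critical value.

    ```python
    import numpy as np

    def simulated_power(n_subjects=8, n_images=10, effect=1.0,
                        sd_subject=1.0, sd_image=0.5,
                        t_crit=2.145,   # two-sided 5% critical value for df = 2*8-2
                        n_sim=500, seed=0):
        """Monte-Carlo power for a two-group comparison with images nested
        in subjects: images are averaged within subject before testing."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(n_sim):
            def group(mu):
                subj = rng.normal(mu, sd_subject, size=n_subjects)
                imgs = rng.normal(subj[:, None], sd_image,
                                  size=(n_subjects, n_images))
                return imgs.mean(axis=1)       # one summary value per subject
            a, b = group(0.0), group(effect)
            sp = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
            t = (b.mean() - a.mean()) / (sp * np.sqrt(2 / n_subjects))
            hits += abs(t) > t_crit
        return hits / n_sim
    ```

    Varying `n_subjects` against `n_images` in such a simulation makes the trade-off explicit: because images within a subject are correlated, adding subjects usually buys more power than adding images per subject.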

  20. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
