Sample records for analysis procedures based

  1. The impact of functional analysis methodology on treatment choice for self-injurious and aggressive behavior.

    PubMed Central

    Pelios, L; Morren, J; Tesch, D; Axelrod, S

    1999-01-01

    Self-injurious behavior (SIB) and aggression have been the concern of researchers because of the serious impact these behaviors have on individuals' lives. Despite the plethora of research on the treatment of SIB and aggressive behavior, the reported findings have been inconsistent regarding the effectiveness of reinforcement-based versus punishment-based procedures. We conducted a literature review to determine whether a trend could be detected in researchers' selection of reinforcement-based procedures versus punishment-based procedures, particularly since the introduction of functional analysis to behavioral assessment. The data are consistent with predictions made in the past regarding the potential impact of functional analysis methodology. Specifically, the findings indicate that, once maintaining variables for problem behavior are identified, experimenters tend to choose reinforcement-based procedures rather than punishment-based procedures as treatment for both SIB and aggressive behavior. Results indicated an increased interest in studies on the treatment of SIB and aggressive behavior, particularly since 1988. PMID:10396771

  2. Effects of Computer-Based Training on Procedural Modifications to Standard Functional Analyses

    ERIC Educational Resources Information Center

    Schnell, Lauren K.; Sidener, Tina M.; DeBar, Ruth M.; Vladescu, Jason C.; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to…

  3. Cost Analysis of an Office-based Surgical Suite

    PubMed Central

    LaBove, Gabrielle

    2016-01-01

    Introduction: Operating costs are a significant part of delivering surgical care. Having a system to analyze these costs is imperative for decision making and efficiency. We present an analysis of surgical supply, labor and administrative costs, and remuneration of procedures as a means for a practice to analyze its cost effectiveness; this affects the quality of care based on the ability to provide services. The costs of surgical care cannot be estimated blindly, as reconstructive and cosmetic procedures have different percentages of overhead. Methods: A detailed financial analysis of office-based surgical suite costs for surgical procedures was determined based on company contract prices and average use of supplies. The average time spent on scheduling, prepping, and doing the surgery was factored using employee rates. Results: The most expensive minor-procedure supplies are suture needles. The 4 most common procedures from the most expensive to the least are abdominoplasty, breast augmentation, facelift, and lipectomy. Conclusions: Reconstructive procedures require a greater portion of collection to cover costs. Without the adjustment of both patient and insurance remuneration in the practice, it will become increasingly difficult to provide quality care. PMID:27536482

  4. Finding Groups Using Model-Based Cluster Analysis: Heterogeneous Emotional Self-Regulatory Processes and Heavy Alcohol Use Risk

    ERIC Educational Resources Information Center

    Mun, Eun Young; von Eye, Alexander; Bates, Marsha E.; Vaschillo, Evgeny G.

    2008-01-01

    Model-based cluster analysis is a new clustering procedure to investigate population heterogeneity utilizing finite mixture multivariate normal densities. It is an inferentially based, statistically principled procedure that allows comparison of nonnested models using the Bayesian information criterion to compare multiple models and identify the…
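
    The core procedure the abstract describes (fitting finite mixtures of multivariate normal densities and comparing candidate models by the Bayesian information criterion) can be sketched as follows. This is only an illustration on synthetic data, using scikit-learn's GaussianMixture as a stand-in for the authors' software.

    ```python
    # Minimal sketch: model-based clustering via Gaussian mixtures, with BIC
    # used to compare candidate models (number of components). Illustrative
    # only; the original study's data and software are not reproduced here.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic two-group data standing in for self-regulation measures
    data = np.vstack([
        rng.normal([0, 0], 1.0, size=(150, 2)),
        rng.normal([4, 3], 0.8, size=(100, 2)),
    ])

    bic_scores = {}
    for k in range(1, 6):
        gm = GaussianMixture(n_components=k, covariance_type="full",
                             random_state=0).fit(data)
        bic_scores[k] = gm.bic(data)

    best_k = min(bic_scores, key=bic_scores.get)   # lower BIC is better
    print(bic_scores, "-> selected number of clusters:", best_k)
    ```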

  5. An Examination of the Effects of a Video-Based Training Package on Professional Staff's Implementation of a Brief Functional Analysis and Data Analysis

    ERIC Educational Resources Information Center

    Fleming, Courtney V.

    2011-01-01

    Minimal research has investigated training packages used to teach professional staff how to implement functional analysis procedures and to interpret data gathered during functional analysis. The current investigation used video-based training with role-play and feedback to teach six professionals in a clinical setting to implement procedures of a…

  6. A spin column-free approach to sodium hydroxide-based glycan permethylation.

    PubMed

    Hu, Yueming; Borges, Chad R

    2017-07-24

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues-yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based "glycan node" analysis results. When applied to blood plasma samples from stage III-IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens.

  7. A spin column-free approach to sodium hydroxide-based glycan permethylation†

    PubMed Central

    Hu, Yueming; Borges, Chad R.

    2018-01-01

    Glycan permethylation was introduced as a tool to facilitate the study of glycans in 1903. Since that time, permethylation procedures have been continually modified to improve permethylation efficiency and qualitative applicability. Typically, however, either laborious preparation steps or cumbersome and uneconomical spin columns have been needed to obtain decent permethylation yields on small glycan samples. Here we describe a spin column-free (SCF) glycan permethylation procedure that is applicable to both O- and N-linked glycans and can be employed upstream to intact glycan analysis by MALDI-MS, ESI-MS, or glycan linkage analysis by GC-MS. The SCF procedure involves neutralization of NaOH beads by acidified phosphate buffer, which eliminates the risk of glycan oxidative degradation and avoids the use of spin columns. Optimization of the new permethylation procedure provided high permethylation efficiency for both hexose (>98%) and HexNAc (>99%) residues—yields which were comparable to (or better than) those of some widely-used spin column-based procedures. A light vs. heavy labelling approach was employed to compare intact glycan yields from a popular spin-column based approach to the SCF approach. Recovery of intact N-glycans was significantly better with the SCF procedure (p < 0.05), but overall yield of O-glycans was similar or slightly diminished (p < 0.05 for tetrasaccharides or smaller). When the SCF procedure was employed upstream to hydrolysis, reduction and acetylation for glycan linkage analysis of pooled glycans from unfractionated blood plasma, analytical reproducibility was on par with that from previous spin column-based “glycan node” analysis results. When applied to blood plasma samples from stage III–IV breast cancer patients (n = 20) and age-matched controls (n = 20), the SCF procedure facilitated identification of three glycan nodes with significantly different distributions between the cases and controls (ROC c-statistics > 0.75; p < 0.01). In summary, the SCF permethylation procedure expedites and economizes both intact glycan analysis and linkage analysis of glycans from whole biospecimens. PMID:28635997

  8. Ordered weighted averaging with fuzzy quantifiers: GIS-based multicriteria evaluation for land-use suitability analysis

    NASA Astrophysics Data System (ADS)

    Malczewski, Jacek

    2006-12-01

    The objective of this paper is to incorporate the concept of fuzzy (linguistic) quantifiers into the GIS-based land suitability analysis via ordered weighted averaging (OWA). OWA is a multicriteria evaluation procedure (or combination operator). The nature of the OWA procedure depends on some parameters, which can be specified by means of fuzzy (linguistic) quantifiers. By changing the parameters, OWA can generate a wide range of decision strategies or scenarios. The quantifier-guided OWA procedure is illustrated using land-use suitability analysis in a region of Mexico.
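
    A minimal sketch of quantifier-guided OWA for a single location is shown below, assuming a regular increasing monotone (RIM) quantifier Q(r) = r^alpha; the criterion scores and weights are hypothetical, not the paper's case-study data.

    ```python
    # Minimal sketch of quantifier-guided Ordered Weighted Averaging (OWA).
    # Order weights are derived from a RIM quantifier Q(r) = r**alpha; alpha is
    # the decision-strategy parameter (alpha < 1 ~ optimistic, alpha = 1 ~
    # conventional weighted linear combination, alpha > 1 ~ pessimistic).
    # Criterion scores and weights below are hypothetical.
    import numpy as np

    def owa(scores, criterion_weights, alpha):
        scores = np.asarray(scores, dtype=float)
        w = np.asarray(criterion_weights, dtype=float)
        # Reorder criterion weights by descending score, as in weighted OWA
        order = np.argsort(scores)[::-1]
        s_sorted, w_sorted = scores[order], w[order]
        cum = np.cumsum(w_sorted) / w_sorted.sum()
        Q = cum ** alpha
        order_weights = np.diff(np.concatenate(([0.0], Q)))
        return float(np.sum(order_weights * s_sorted))

    suitability_scores = [0.9, 0.4, 0.7]   # standardized criterion maps at one cell
    weights = [0.5, 0.3, 0.2]              # criterion importance weights
    for a in (0.5, 1.0, 2.0):
        print(f"alpha={a}: OWA score = {owa(suitability_scores, weights, a):.3f}")
    ```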

  9. Geometric Analysis of Wing Sections

    DOT National Transportation Integrated Search

    1995-04-01

    This paper describes a new geometric analysis procedure for wing sections. This procedure is based on the normal mode analysis for continuous functions. A set of special shape functions is introduced to represent the geometry of the wing section. The...

  10. Meta-Analysis of Criterion Validity for Curriculum-Based Measurement in Written Language

    ERIC Educational Resources Information Center

    Romig, John Elwood; Therrien, William J.; Lloyd, John W.

    2017-01-01

    We used meta-analysis to examine the criterion validity of four scoring procedures used in curriculum-based measurement of written language. A total of 22 articles representing 21 studies (N = 21) met the inclusion criteria. Results indicated that two scoring procedures, correct word sequences and correct minus incorrect sequences, have acceptable…

  11. The prediction of acoustical particle motion using an efficient polynomial curve fit procedure

    NASA Technical Reports Server (NTRS)

    Marshall, S. E.; Bernhard, R.

    1984-01-01

    A procedure is examined whereby the acoustic modal parameters, natural frequencies and mode shapes, in the cavities of transportation vehicles are determined experimentally. The acoustic mode shapes are described in terms of particle motion. The acoustic modal analysis procedure is tailored to existing minicomputer-based spectral analysis systems.

  12. Hierarchical Factoring Based On Image Analysis And Orthoblique Rotations.

    PubMed

    Stankov, L

    1979-07-01

    The procedure for hierarchical factoring suggested by Schmid and Leiman (1957) is applied within the framework of image analysis and orthoblique rotational procedures. It is shown that this approach necessarily leads to correlated higher order factors. Also, one can obtain a smaller number of factors than produced by typical hierarchical procedures.

  13. Randomization Procedures Applied to Analysis of Ballistic Data

    DTIC Science & Technology

    1991-06-01

    Technical Report BRL-TR-3245 (AD-A238 389), Ballistic Research Laboratory, by Malcolm S. Taylor and Barry A. Bodt, June 1991. Keywords: data analysis; computationally intensive statistics; randomization tests; permutation tests; nonparametric statistics. The report notes that any reasonable statistical procedure would fail to support the notion of improvement of dynamic over standard indexing based on these data.

  14. EFFECTS OF AQUATIC HUMIC SUBSTANCES ON ANALYSIS FOR HYDROGEN PEROXIDE USING PEROXIDASE-CATALYZED OXIDATIONS OF TRIARYLMETHANES OR P-HYDROXYPHENYLACETIC ACID (JOURNAL VERSION)

    EPA Science Inventory

    A sensitive procedure is described for trace analysis of hydrogen peroxide in water. The process involves the peroxidase-catalyzed oxidation of the leuco forms of two dyes, crystal violet and malachite green. The sensitivity of this procedure, as well as of another procedure based ...

  15. Using GOMS models and hypertext to create representations of medical procedures for online display

    NASA Technical Reports Server (NTRS)

    Gugerty, Leo; Halgren, Shannon; Gosbee, John; Rudisill, Marianne

    1991-01-01

    This study investigated two methods to improve organization and presentation of computer-based medical procedures. A literature review suggested that the GOMS (goals, operators, methods, and selection rules) model can assist in rigorous task analysis, which can then help generate initial design ideas for the human-computer interface. GOMS models are hierarchical in nature, so this study also investigated the effect of hierarchical, hypertext interfaces. We used a 2 x 2 between-subjects design, including the following independent variables: procedure organization - GOMS model based vs. medical-textbook based; navigation type - hierarchical vs. linear (booklike). After naive subjects studied the online procedures, measures were taken of their memory for the content and the organization of the procedures. This design was repeated for two medical procedures. For one procedure, subjects who studied GOMS-based and hierarchical procedures remembered more about the procedures than other subjects. The results for the other procedure were less clear. However, data for both procedures showed a 'GOMSification effect'. That is, when asked to do a free recall of a procedure, subjects who had studied a textbook procedure often recalled key information in a location inconsistent with the procedure they actually studied, but consistent with the GOMS-based procedure.

  16. Application of a trigonometric finite difference procedure to numerical analysis of compressive and shear buckling of orthotropic panels

    NASA Technical Reports Server (NTRS)

    Stein, M.; Housner, J. D.

    1978-01-01

    A numerical analysis developed for the buckling of rectangular orthotropic layered panels under combined shear and compression is described. This analysis uses a central finite difference procedure based on trigonometric functions instead of using the conventional finite differences which are based on polynomial functions. Inasmuch as the buckle mode shape is usually trigonometric in nature, the analysis using trigonometric finite differences can be made to exhibit a much faster convergence rate than that using conventional differences. Also, the trigonometric finite difference procedure leads to difference equations having the same form as conventional finite differences; thereby allowing available conventional finite difference formulations to be converted readily to trigonometric form. For two-dimensional problems, the procedure introduces two numerical parameters into the analysis. Engineering approaches for the selection of these parameters are presented and the analysis procedure is demonstrated by application to several isotropic and orthotropic panel buckling problems. Among these problems is the shear buckling of stiffened isotropic and filamentary composite panels in which the stiffener is broken. Results indicate that a break may degrade the effect of the stiffener to the extent that the panel will not carry much more load than if the stiffener were absent.
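
    A one-dimensional illustration of the trigonometric-difference idea (not the paper's panel-buckling formulation): tuning the denominator of the central second-difference to a mode sin(kx) makes the approximation exact for that mode, whereas the conventional polynomial-based denominator h^2 leaves an O(h^2) error.

    ```python
    # Illustrative 1-D comparison of conventional vs. trigonometric central
    # differences for f''(x), with the trigonometric stencil tuned to a mode
    # sin(k*x). Simplified stand-in for the panel-buckling application.
    import numpy as np

    k, h, x = 3.0, 0.1, 0.7
    f = lambda x: np.sin(k * x)
    exact = -k**2 * np.sin(k * x)

    stencil = f(x + h) - 2.0 * f(x) + f(x - h)
    d2_poly = stencil / h**2                                   # conventional denominator
    d2_trig = stencil / (2.0 * (1.0 - np.cos(k * h)) / k**2)   # trigonometric denominator

    print(f"exact                    : {exact:.6f}")
    print(f"polynomial difference    : {d2_poly:.6f}  (error {d2_poly - exact:+.2e})")
    print(f"trigonometric difference : {d2_trig:.6f}  (error {d2_trig - exact:+.2e})")
    ```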

  17. Effects of computer-based training on procedural modifications to standard functional analyses.

    PubMed

    Schnell, Lauren K; Sidener, Tina M; DeBar, Ruth M; Vladescu, Jason C; Kahng, SungWoo

    2018-01-01

    Few studies have evaluated methods for training decision-making when functional analysis data are undifferentiated. The current study evaluated computer-based training to teach 20 graduate students to arrange functional analysis conditions, analyze functional analysis data, and implement procedural modifications. Participants were exposed to training materials using interactive software during a 1-day session. Following the training, mean scores on the posttest, novel cases probe, and maintenance probe increased for all participants. These results replicate previous findings during a 1-day session and include a measure of participant acceptability of the training. Recommendations for future research on computer-based training and functional analysis are discussed. © 2017 Society for the Experimental Analysis of Behavior.

  18. Modal-pushover-based ground-motion scaling procedure

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2011-01-01

    Earthquake engineering is increasingly using nonlinear response history analysis (RHA) to demonstrate the performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. This paper presents a modal-pushover-based scaling (MPS) procedure to scale ground motions for use in a nonlinear RHA of buildings. In the MPS method, the ground motions are scaled to match, to a specified tolerance, a target value of the inelastic deformation of the first-mode inelastic single-degree-of-freedom (SDF) system whose properties are determined by the first-mode pushover analysis. Appropriate for first-mode dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-mode SDF systems in selecting a subset of the scaled ground motions. Based on results presented for three actual buildings (4, 6, and 13 stories), the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.

  19. Evaluation of modal pushover-based scaling of one component of ground motion: Tall buildings

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2012-01-01

    Nonlinear response history analysis (RHA) is now increasingly used for performance-based seismic design of tall buildings. Required for nonlinear RHAs is a set of ground motions selected and scaled appropriately so that analysis results will be accurate (unbiased) and efficient (having relatively small dispersion). This paper evaluates the accuracy and efficiency of the recently developed modal-pushover-based scaling (MPS) method to scale ground motions for tall buildings. The procedure presented explicitly considers structural strength and is based on the standard intensity measure (IM) of spectral acceleration in a form convenient for evaluating existing structures or proposed designs for new structures. Based on results presented for two actual buildings (19 and 52 stories, respectively), it is demonstrated that the MPS procedure provided a highly accurate estimate of the engineering demand parameters (EDPs), accompanied by significantly reduced record-to-record variability of the responses. In addition, the MPS procedure is shown to be superior to the scaling procedure specified in the ASCE/SEI 7-05 document.

  20. Multivariate Cluster Analysis.

    ERIC Educational Resources Information Center

    McRae, Douglas J.

    Procedures for grouping students into homogeneous subsets have long interested educational researchers. The research reported in this paper is an investigation of a set of objective grouping procedures based on multivariate analysis considerations. Four multivariate functions that might serve as criteria for adequate grouping are given and…

  1. [Evidence based medicine and cost-effectiveness analysis in ophthalmology].

    PubMed

    Nováková, D; Rozsíval, P

    2004-09-01

    The aims are to familiarize the reader with the term evidence-based medicine (EBM), to explain the principle of cost-effectiveness (cost-benefit) analysis, and to show its usefulness for comparing the effectiveness of different medical procedures. Using a few examples, the article explains the relevance and calculation of the key parameters of cost-effectiveness (CE) analysis, such as the utility value (UV) and quality-adjusted life years (QALY). In addition, the calculation of UV and QALY for cataract surgery, including its complications, is provided. By this method, highly effective procedures include laser photocoagulation and cryocoagulation for early stages of retinopathy of prematurity, treatment of amblyopia, cataract surgery of one or both eyes, and, among vitreoretinal procedures, early vitrectomy for hemophthalmus in proliferative diabetic retinopathy and grid laser photocoagulation for diabetic macular edema or for visual acuity loss due to branch retinal vein occlusion. On the other hand, procedures with low cost-effectiveness include treatment of central retinal artery occlusion with anterior chamber paracentesis or with CO2 inhalation, and photodynamic therapy for choroidal neovascularization in age-related macular degeneration when the visual acuity of the better eye is 20/200. Cost-effectiveness analysis is a promising new method for evaluating the success of a medical procedure by comparing its final effect with its financial costs. In evaluating the effectiveness of individual procedures, three main aspects are considered: the patient's subjective experience of the disease's influence on life, the objective results of clinical examination, and the financial costs of the procedure. By this method, cataract surgery, as well as procedures in pediatric ophthalmology, is among the most effective surgical methods.
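
    As a purely illustrative calculation of the quantities the abstract refers to (the values below are hypothetical, not those of the cited study): QALYs gained equal the utility-value improvement multiplied by the years over which it lasts, and cost-effectiveness is then expressed as cost per QALY.

    ```python
    # Hypothetical cost-utility calculation illustrating cost per QALY for a
    # generic procedure. All values are made up for illustration only.
    utility_before = 0.65     # utility value (0 = death, 1 = perfect health)
    utility_after = 0.80      # utility value after the procedure
    duration_years = 12       # years the benefit is assumed to last
    procedure_cost = 3000.0   # total cost of the procedure

    qaly_gained = (utility_after - utility_before) * duration_years
    cost_per_qaly = procedure_cost / qaly_gained
    print(f"QALYs gained: {qaly_gained:.2f}")
    print(f"Cost-effectiveness: {cost_per_qaly:,.0f} per QALY")
    ```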

  2. Factor Retention in Exploratory Factor Analysis: A Comparison of Alternative Methods.

    ERIC Educational Resources Information Center

    Mumford, Karen R.; Ferron, John M.; Hines, Constance V.; Hogarty, Kristine Y.; Kromrey, Jeffery D.

    This study compared the effectiveness of 10 methods of determining the number of factors to retain in exploratory common factor analysis. The 10 methods included the Kaiser rule and a modified Kaiser criterion, 3 variations of parallel analysis, 4 regression-based variations of the scree procedure, and the minimum average partial procedure. The…

  3. Nomarski differential interference contrast microscopy for surface slope measurements: an examination of techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hartman, J.S.; Gordon, R.L.; Lessor, D.L.

    1981-08-01

    Alternate measurement and data analysis procedures are discussed and compared for the application of reflective Nomarski differential interference contrast microscopy for the determination of surface slopes. The discussion includes the interpretation of a previously reported iterative procedure using the results of a detailed optical model and the presentation of a new procedure based on measured image intensity extrema. Surface slope determinations from these procedures are presented and compared with results from a previously reported curve fit analysis of image intensity data. The accuracy and advantages of the different procedures are discussed.

  4. An XML Representation for Crew Procedures

    NASA Technical Reports Server (NTRS)

    Simpson, Richard C.

    2005-01-01

    NASA ensures safe operation of complex systems through the use of formally-documented procedures, which encode the operational knowledge of the system as derived from system experts. Crew members use procedure documentation on the ground for training purposes and on-board space shuttle and space station to guide their activities. Investigators at JSC are developing a new representation for procedures that is content-based (as opposed to display-based). Instead of specifying how a procedure should look on the printed page, the content-based representation will identify the components of a procedure and (more importantly) how the components are related (e.g., how the activities within a procedure are sequenced; what resources need to be available for each activity). This approach will allow different sets of rules to be created for displaying procedures on a computer screen, on a hand-held personal digital assistant (PDA), verbally, or on a printed page, and will also allow intelligent reasoning processes to automatically interpret and use procedure definitions. During his NASA fellowship, Dr. Simpson examined how various industries represent procedures (also called business processes or workflows), in areas such as manufacturing, accounting, shipping, or customer service. A useful method for designing and evaluating workflow representation languages is by determining their ability to encode various workflow patterns, which depict abstract relationships between the components of a procedure removed from the context of a specific procedure or industry. Investigators have used this type of analysis to evaluate how well-suited existing workflow representation languages are for various industries based on the workflow patterns that commonly arise across industry-specific procedures. Based on this type of analysis, it is already clear that existing workflow representations capture discrete flow of control (i.e., when one activity should start and stop based on when other activities start and stop), but do not capture the flow of data, materials, resources or priorities. Existing workflow representation languages are also limited to representing sequences of discrete activities, and cannot encode procedures involving continuous flow of information or materials between activities.

  5. Evaluation of flaws in carbon steel piping. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zahoor, A.; Gamble, R.M.; Mehta, H.S.

    1986-10-01

    The objective of this program was to develop flaw evaluation procedures and allowable flaw sizes for ferritic piping used in light water reactor (LWR) power generation facilities. The program results provide relevant ASME Code groups with the information necessary to define flaw evaluation procedures, allowable flaw sizes, and their associated bases for Section XI of the code. Because there are several possible flaw-related failure modes for ferritic piping over the LWR operating temperature range, three analysis methods were employed to develop the evaluation procedures. These include limit load analysis for plastic collapse, elastic plastic fracture mechanics (EPFM) analysis for ductile tearing, and linear elastic fracture mechanics (LEFM) analysis for non ductile crack extension. To ensure the appropriate analysis method is used in an evaluation, a step by step procedure also is provided to identify the relevant acceptance standard or procedure on a case by case basis. The tensile strength and toughness properties required to complete the flaw evaluation for any of the three analysis methods are included in the evaluation procedure. The flaw evaluation standards are provided in tabular form for the plastic collapse and ductile tearing modes, where the allowable part through flaw depth is defined as a function of load and flaw length. For non ductile crack extension, linear elastic fracture mechanics analysis methods, similar to those in Appendix A of Section XI, are defined. Evaluation flaw sizes and procedures are developed for both longitudinal and circumferential flaw orientations and normal/upset and emergency/faulted operating conditions. The tables are based on margins on load of 2.77 and 1.39 for circumferential flaws and 3.0 and 1.5 for longitudinal flaws for normal/upset and emergency/faulted conditions, respectively.

  6. Quantitative analysis of crystalline pharmaceuticals in powders and tablets by a pattern-fitting procedure using X-ray powder diffraction data.

    PubMed

    Yamamura, S; Momose, Y

    2001-01-16

    A pattern-fitting procedure for quantitative analysis of crystalline pharmaceuticals in solid dosage forms using X-ray powder diffraction data is described. This method is based on a procedure for pattern-fitting in crystal structure refinement, and observed X-ray scattering intensities were fitted to analytical expressions including some fitting parameters, i.e. scale factor, peak positions, peak widths and degree of preferred orientation of the crystallites. All fitting parameters were optimized by the non-linear least-squares procedure. Then the weight fraction of each component was determined from the optimized scale factors. In the present study, well-crystallized binary systems, zinc oxide-zinc sulfide (ZnO-ZnS) and salicylic acid-benzoic acid (SA-BA), were used as the samples. In analysis of the ZnO-ZnS system, the weight fraction of ZnO or ZnS could be determined quantitatively in the range of 5-95% in the case of both powders and tablets. In analysis of the SA-BA systems, the weight fraction of SA or BA could be determined quantitatively in the range of 20-80% in the case of both powders and tablets. Quantitative analysis applying this pattern-fitting procedure showed better reproducibility than other X-ray methods based on the linear or integral intensities of particular diffraction peaks. Analysis using this pattern-fitting procedure also has the advantage that the preferred orientation of the crystallites in solid dosage forms can also be determined in the course of quantitative analysis.
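
    A stripped-down sketch of the pattern-fitting idea on synthetic data: model the observed powder pattern as a scaled sum of known single-component reference patterns, refine the scale factors by least squares, and take weight fractions from the optimized scale factors. Peak-shape refinement, preferred-orientation parameters and the calibration used in the cited procedure are omitted, and equal scattering power per unit mass is assumed.

    ```python
    # Minimal pattern-fitting sketch: an observed powder pattern is modelled as
    # a scaled sum of two known single-component reference patterns, the scale
    # factors are refined by least squares, and weight fractions are taken from
    # the optimized scale factors (assuming equal scattering power per unit
    # mass, a simplification of the cited procedure).
    import numpy as np
    from scipy.optimize import nnls

    two_theta = np.linspace(5, 60, 1100)

    def gaussian_peaks(centers, heights, width=0.25):
        return sum(h * np.exp(-0.5 * ((two_theta - c) / width) ** 2)
                   for c, h in zip(centers, heights))

    ref_a = gaussian_peaks([11.0, 17.2, 25.4], [1.0, 0.6, 0.8])   # component A
    ref_b = gaussian_peaks([14.1, 22.7, 31.9], [0.9, 1.0, 0.5])   # component B

    true_fractions = np.array([0.30, 0.70])
    rng = np.random.default_rng(1)
    observed = (true_fractions[0] * ref_a + true_fractions[1] * ref_b
                + rng.normal(0, 0.01, two_theta.size))

    design = np.column_stack([ref_a, ref_b])
    scales, _ = nnls(design, observed)            # non-negative least squares
    weights = scales / scales.sum()
    print(f"estimated weight fractions: A = {weights[0]:.2f}, B = {weights[1]:.2f}")
    ```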

  7. Effectiveness of internet-based affect induction procedures: A systematic review and meta-analysis.

    PubMed

    Ferrer, Rebecca A; Grenen, Emily G; Taber, Jennifer M

    2015-12-01

    Procedures used to induce affect in a laboratory are effective and well-validated. Given recent methodological and technological advances in Internet research, it is important to determine whether affect can be effectively induced using Internet methodology. We conducted a meta-analysis and systematic review of prior research that has used Internet-based affect induction procedures, and examined potential moderators of the effectiveness of affect induction procedures. Twenty-six studies were included in final analyses, with 89 independent effect sizes. Affect induction procedures effectively induced general positive affect, general negative affect, fear, disgust, anger, sadness, and guilt, but did not significantly induce happiness. Contamination of other nontarget affect did not appear to be a major concern. Video inductions resulted in greater effect sizes. Overall, results indicate that affect can be effectively induced in Internet studies, suggesting an important venue for the acceleration of affective science. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
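
    For readers unfamiliar with how such effect sizes are pooled, a generic DerSimonian-Laird random-effects sketch is shown below with hypothetical study effects; the published meta-analysis may use a different estimator and moderator models.

    ```python
    # Generic DerSimonian-Laird random-effects pooling of study effect sizes.
    # Effect sizes and variances below are hypothetical.
    import numpy as np

    d = np.array([0.45, 0.60, 0.30, 0.80, 0.55])    # study effect sizes
    v = np.array([0.04, 0.06, 0.03, 0.09, 0.05])    # within-study variances

    w_fixed = 1.0 / v
    mean_fixed = np.sum(w_fixed * d) / np.sum(w_fixed)
    Q = np.sum(w_fixed * (d - mean_fixed) ** 2)     # heterogeneity statistic
    df = len(d) - 1
    C = np.sum(w_fixed) - np.sum(w_fixed ** 2) / np.sum(w_fixed)
    tau2 = max(0.0, (Q - df) / C)                   # between-study variance

    w_rand = 1.0 / (v + tau2)
    mean_rand = np.sum(w_rand * d) / np.sum(w_rand)
    se_rand = np.sqrt(1.0 / np.sum(w_rand))
    print(f"tau^2 = {tau2:.3f}, pooled effect = {mean_rand:.3f} "
          f"(95% CI {mean_rand - 1.96*se_rand:.3f} to {mean_rand + 1.96*se_rand:.3f})")
    ```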

  8. Adaptive graph-based multiple testing procedures

    PubMed Central

    Klinglmueller, Florian; Posch, Martin; Koenig, Franz

    2016-01-01

    Multiple testing procedures defined by directed, weighted graphs have recently been proposed as an intuitive visual tool for constructing multiple testing strategies that reflect the often complex contextual relations between hypotheses in clinical trials. Many well-known sequentially rejective tests, such as (parallel) gatekeeping tests or hierarchical testing procedures, are special cases of the graph-based tests. We generalize these graph-based multiple testing procedures to adaptive trial designs with an interim analysis. These designs permit mid-trial design modifications based on unblinded interim data as well as external information, while providing strong familywise error rate control. To maintain the familywise error rate, it is not required to prespecify the adaptation rule in detail. Because the adaptive test does not require knowledge of the multivariate distribution of test statistics, it is applicable in a wide range of scenarios including trials with multiple treatment comparisons, endpoints or subgroups, or combinations thereof. Examples of adaptations are dropping of treatment arms, selection of subpopulations, and sample size reassessment. If, in the interim analysis, it is decided to continue the trial as planned, the adaptive test reduces to the originally planned multiple testing procedure. Only if adaptations are actually implemented does an adjusted test need to be applied. The procedure is illustrated with a case study and its operating characteristics are investigated by simulations. PMID:25319733
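
    A compact sketch of the non-adaptive, sequentially rejective graph-based test that the paper generalizes, using the usual weight and transition-matrix update rule; the graph, weights and p-values below are hypothetical.

    ```python
    # Sketch of a sequentially rejective graph-based multiple testing procedure:
    # hypotheses carry local weights summing to 1 and a transition matrix G that
    # redistributes the weight of a rejected hypothesis. Hypothetical example
    # with three hypotheses; the adaptive extension in the paper is not shown.
    import numpy as np

    alpha = 0.025
    w = np.array([0.5, 0.5, 0.0])                 # initial local weights
    G = np.array([[0.0, 1.0, 0.0],                # transition (propagation) matrix
                  [0.0, 0.0, 1.0],
                  [1.0, 0.0, 0.0]])
    p = np.array([0.010, 0.016, 0.020])           # observed p-values
    active = [0, 1, 2]
    rejected = []

    while True:
        candidates = [j for j in active if p[j] <= alpha * w[j]]
        if not candidates:
            break
        j = candidates[0]                         # reject one hypothesis, then update
        rejected.append(j)
        active.remove(j)
        w_new, G_new = w.copy(), G.copy()
        for l in active:
            w_new[l] = w[l] + w[j] * G[j, l]
            for k in active:
                if k == l:
                    continue
                denom = 1.0 - G[l, j] * G[j, l]
                G_new[l, k] = (G[l, k] + G[l, j] * G[j, k]) / denom if denom > 0 else 0.0
        w, G = w_new, G_new
        w[j] = 0.0

    print("rejected hypotheses:", rejected)
    ```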

  9. Characterizing the Experimental Procedure in Science Laboratories: A preliminary step towards students experimental design

    NASA Astrophysics Data System (ADS)

    Girault, Isabelle; d'Ham, Cedric; Ney, Muriel; Sanchez, Eric; Wajeman, Claire

    2012-04-01

    Many studies have stressed students' lack of understanding of experiments in laboratories. Some researchers suggest that if students design all or part of an entire experiment, as part of an inquiry-based approach, this would overcome certain difficulties. It requires that a procedure be written for experimental design. The aim of this paper is to describe the characteristics of a procedure in science laboratories, in an educational context. As a starting point, this paper proposes a model in the form of a hierarchical task diagram that gives the general structure of any procedure. This model allows both the analysis of existing procedures and the design of a new inquiry-based approach. The obtained characteristics are further organized into criteria that can help both teachers and students assess a procedure during and after its writing. These results are obtained through two different sets of data. First, the characteristics of procedures are established by analysing laboratory manuals. This allows the organization and type of information in procedures to be defined. This analysis reveals that students are seldom asked to write a full procedure, but sometimes have to specify tasks within a procedure. Secondly, iterative interviews are undertaken with teachers. This leads to the list of criteria to evaluate the procedure.

  10. Strengthening safety compliance in nuclear power operations: a role-based approach.

    PubMed

    Martínez-Córcoles, Mario; Gracia, Francisco J; Tomás, Inés; Peiró, José M

    2014-07-01

    Safety compliance is of paramount importance in guaranteeing the safe running of nuclear power plants. However, it depends mostly on procedures that do not always involve the safest outcomes. This article introduces an empirical model based on organizational role theory to analyze the influence of legitimate sources of expectations (procedure formalization and leadership) on workers' compliance behaviors. The sample was composed of 495 employees from two Spanish nuclear power plants. Structural equation analysis showed that, in spite of some problematic effects of proceduralization (such as role conflict and role ambiguity), procedure formalization along with an empowering leadership style leads to safety compliance by clarifying a worker's role in safety. Implications of these findings for safety research are outlined, as well as their practical implications. © 2014 Society for Risk Analysis.

  11. MATLAB-implemented estimation procedure for model-based assessment of hepatic insulin degradation from standard intravenous glucose tolerance test data.

    PubMed

    Di Nardo, Francesco; Mengoni, Michele; Morettini, Micaela

    2013-05-01

    The present study provides a novel MATLAB-based parameter estimation procedure for individual assessment of the hepatic insulin degradation (HID) process from standard frequently-sampled intravenous glucose tolerance test (FSIGTT) data. Direct access to the source code, offered by MATLAB, enabled us to design an optimization procedure based on the alternating use of the Gauss-Newton and Levenberg-Marquardt algorithms, which assures full convergence of the process and containment of computational time. Reliability was tested by direct comparison with the application, in eighteen non-diabetic subjects, of the well-known kinetic analysis software package SAAM II, and by application to different data. Agreement between MATLAB and SAAM II was supported by intraclass correlation coefficients ≥0.73; no significant differences between corresponding mean parameter estimates and prediction of HID rate; and consistent residual analysis. Moreover, the MATLAB optimization procedure resulted in a significant 51% reduction of CV% for the parameter estimated worst by SAAM II and in maintaining all model-parameter CV% <20%. In conclusion, our MATLAB-based procedure was suggested as a suitable tool for the individual assessment of the HID process. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  12. An analysis of ratings: A guide to RMRATE

    Treesearch

    Thomas C. Brown; Terry C. Daniel; Herbert W. Schroeder; Glen E. Brink

    1990-01-01

    This report describes RMRATE, a computer program for analyzing rating judgments. RMRATE scales ratings using several scaling procedures, and compares the resulting scale values. The scaling procedures include the median and simple mean, standardized values, scale values based on Thurstone's Law of Categorical Judgment, and regression-based values. RMRATE also...

  13. Numerical solution of quadratic matrix equations for free vibration analysis of structures

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1975-01-01

    This paper is concerned with the efficient and accurate solution of the eigenvalue problem represented by quadratic matrix equations. Such matrix forms are obtained in connection with the free vibration analysis of structures, discretized by finite 'dynamic' elements, resulting in frequency-dependent stiffness and inertia matrices. The paper presents a new numerical solution procedure of the quadratic matrix equations, based on a combined Sturm sequence and inverse iteration technique enabling economical and accurate determination of a few required eigenvalues and associated vectors. An alternative procedure based on a simultaneous iteration procedure is also described when only the first few modes are the usual requirement. The employment of finite dynamic elements in conjunction with the presently developed eigenvalue routines results in a most significant economy in the dynamic analysis of structures.
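
    The paper's Sturm-sequence/inverse-iteration solver is not reproduced here, but for small problems the same quadratic eigenvalue problem (lam^2 M + lam C + K)x = 0 can be solved by companion linearization to a generalized eigenproblem, which provides a simple reference solution; the matrices below are synthetic.

    ```python
    # Reference solution of a small quadratic eigenvalue problem
    # (lam^2 M + lam C + K) x = 0 by companion linearization, for comparison
    # with specialized solvers such as the Sturm-sequence procedure in the paper.
    import numpy as np
    from scipy.linalg import eig

    n = 3
    rng = np.random.default_rng(2)
    M = np.eye(n)                                               # mass matrix
    K = rng.normal(size=(n, n)); K = K @ K.T + n * np.eye(n)    # SPD stiffness
    C = 0.05 * K                                                # light damping

    # Companion linearization: A z = lam B z with z = [x, lam*x]
    A = np.block([[np.zeros((n, n)), np.eye(n)],
                  [-K, -C]])
    B = np.block([[np.eye(n), np.zeros((n, n))],
                  [np.zeros((n, n)), M]])
    lam, Z = eig(A, B)
    print("eigenvalues of smallest magnitude:")
    print(lam[np.argsort(np.abs(lam))][:4])
    ```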

  14. Preparation And Analysis Of Specimens Of Ablative Materials

    NASA Technical Reports Server (NTRS)

    Solomon, William C.

    1994-01-01

    The procedure for chemical analysis of specimens of the silicone-based ablative thermal-insulation materials SLA-561 and MA25 involves acid digestion of specimens to prepare them for analysis by inductively-coupled-plasma/atomic-emission spectroscopy (ICP/AES). ICP/AES is faster and more accurate than atomic-absorption spectroscopy (AAS). Results of analyses are stored in a database, used to trace variations in concentrations of chemical elements in materials during long-term storage, and used in a timely manner in investigations of failures. The acid-digestion portion of the procedure can be applied to other thermal-insulation materials containing room-temperature-vulcanizing silicones and enables instrumental analysis of these materials.

  15. Considering Horn's Parallel Analysis from a Random Matrix Theory Point of View.

    PubMed

    Saccenti, Edoardo; Timmerman, Marieke E

    2017-03-01

    Horn's parallel analysis is a widely used method for assessing the number of principal components and common factors. We discuss the theoretical foundations of parallel analysis for principal components based on a covariance matrix by making use of arguments from random matrix theory. In particular, we show that (i) for the first component, parallel analysis is an inferential method equivalent to the Tracy-Widom test, (ii) its use to test high-order eigenvalues is equivalent to the use of the joint distribution of the eigenvalues, and thus should be discouraged, and (iii) a formal test for higher-order components can be obtained based on a Tracy-Widom approximation. We illustrate the performance of the two testing procedures using simulated data generated under both a principal component model and a common factors model. For the principal component model, the Tracy-Widom test performs consistently in all conditions, while parallel analysis shows unpredictable behavior for higher-order components. For the common factor model, including major and minor factors, both procedures are heuristic approaches, with variable performance. We conclude that the Tracy-Widom procedure is preferred over parallel analysis for statistically testing the number of principal components based on a covariance matrix.
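
    A plain Monte-Carlo version of Horn's parallel analysis for principal components on a correlation matrix is sketched below on synthetic data (retain components whose observed eigenvalues exceed the 95th percentile of eigenvalues from random data of the same size); the Tracy-Widom test discussed in the paper is not implemented.

    ```python
    # Monte-Carlo Horn's parallel analysis for principal components on a
    # correlation matrix. Synthetic illustration only.
    import numpy as np

    rng = np.random.default_rng(3)
    n_obs, n_var, n_sims = 300, 8, 500

    # Synthetic data with two correlated blocks (so ~2 "real" components)
    latent = rng.normal(size=(n_obs, 2))
    loadings = np.zeros((2, n_var))
    loadings[0, :4] = 0.8
    loadings[1, 4:] = 0.8
    X = latent @ loadings + rng.normal(scale=0.6, size=(n_obs, n_var))

    def eigvals_of_corr(data):
        return np.sort(np.linalg.eigvalsh(np.corrcoef(data, rowvar=False)))[::-1]

    observed = eigvals_of_corr(X)
    random_eigs = np.array([eigvals_of_corr(rng.normal(size=(n_obs, n_var)))
                            for _ in range(n_sims)])
    threshold = np.percentile(random_eigs, 95, axis=0)

    n_retain = 0
    for obs_val, thr in zip(observed, threshold):   # count from the top down
        if obs_val > thr:
            n_retain += 1
        else:
            break
    print("observed eigenvalues:", np.round(observed, 2))
    print("95th pct thresholds :", np.round(threshold, 2))
    print("components retained :", n_retain)
    ```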

  16. Structural Analysis of Correlated Factors: Lessons from the Verbal-Performance Dichotomy of the Wechsler Scales.

    ERIC Educational Resources Information Center

    Macmann, Gregg M.; Barnett, David W.

    1994-01-01

    Describes exploratory and confirmatory analyses of verbal-performance procedures to illustrate concepts and procedures for analysis of correlated factors. Argues that, based on convergent and discriminant validity criteria, factors should have higher correlations with variables that they purport to measure than with other variables. Discusses…

  17. Applications of Nonlinear Principal Components Analysis to Behavioral Data.

    ERIC Educational Resources Information Center

    Hicks, Marilyn Maginley

    1981-01-01

    An empirical investigation of the statistical procedure entitled nonlinear principal components analysis was conducted on a known equation and on measurement data in order to demonstrate the procedure and examine its potential usefulness. This method was suggested by R. Gnanadesikan and based on an early paper of Karl Pearson. (Author/AL)

  18. Finite element mesh refinement criteria for stress analysis

    NASA Technical Reports Server (NTRS)

    Kittur, Madan G.; Huston, Ronald L.

    1990-01-01

    This paper discusses procedures for finite-element mesh selection and refinement. The objective is to improve accuracy. The procedures are based on (1) the minimization of the stiffness matrix trace (optimizing node location); (2) the use of h-version refinement (rezoning, element size reduction, and increasing the number of elements); and (3) the use of p-version refinement (increasing the order of polynomial approximation of the elements). A step-by-step procedure of mesh selection, improvement, and refinement is presented. The criteria for 'goodness' of a mesh are based on strain energy, displacement, and stress values at selected critical points of a structure. An analysis of an aircraft lug problem is presented as an example.

  19. Contact stresses in meshing spur gear teeth: Use of an incremental finite element procedure

    NASA Technical Reports Server (NTRS)

    Hsieh, Chih-Ming; Huston, Ronald L.; Oswald, Fred B.

    1992-01-01

    Contact stresses in meshing spur gear teeth are examined. The analysis is based upon an incremental finite element procedure that simultaneously determines the stresses in the contact region between the meshing teeth. The teeth themselves are modeled by two-dimensional plane strain elements. Friction effects are included, with the friction forces assumed to obey Coulomb's law. The analysis assumes that the displacements are small and that the tooth materials are linearly elastic. The analysis procedure is validated by comparing its results with those obtained from the classical Hertz solution for two contacting semicylinders. Agreement is excellent.
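
    For reference, the classical Hertz line-contact solution for two parallel cylinders, the kind of closed-form result the finite element analysis was validated against, can be evaluated directly; the material properties and load below are illustrative, not the study's gear data.

    ```python
    # Classical Hertz line-contact solution for two parallel cylinders pressed
    # together, used here as the analytical reference for contact stresses.
    # Dimensions and load are illustrative only.
    import math

    E1 = E2 = 207e9          # Young's moduli, Pa (steel)
    nu1 = nu2 = 0.30         # Poisson's ratios
    R1, R2 = 0.030, 0.045    # cylinder radii, m
    F, L = 2000.0, 0.020     # normal force (N) over contact length (m)

    E_star = 1.0 / ((1 - nu1**2) / E1 + (1 - nu2**2) / E2)   # effective modulus
    R_eff = 1.0 / (1.0 / R1 + 1.0 / R2)                       # effective radius
    P_line = F / L                                            # load per unit length

    b = math.sqrt(4.0 * P_line * R_eff / (math.pi * E_star))  # contact half-width
    p_max = 2.0 * P_line / (math.pi * b)                      # peak contact pressure

    print(f"contact half-width b = {b*1e6:.1f} um")
    print(f"peak pressure p_max  = {p_max/1e6:.0f} MPa")
    ```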

  20. Industrial Water Analysis Program: A Critical Study.

    DTIC Science & Technology

    1983-09-01

    Wright-Patterson Air Force Base, Ohio. Keywords: industrial water analysis; boiler water analysis; preservation procedures; Air Force industrial water; stabilization procedures.

  1. A Risk-Analysis Approach to Implementing Web-Based Assessment

    ERIC Educational Resources Information Center

    Ricketts, Chris; Zakrzewski, Stan

    2005-01-01

    Computer-Based Assessment is a risky business. This paper proposes the use of a model for web-based assessment systems that identifies pedagogic, operational, technical (non web-based), web-based and financial risks. The strategies and procedures for risk elimination or reduction arise from risk analysis and management and are the means by which…

  2. Reliability of Laparoscopic Compared With Hysteroscopic Sterilization at One Year: A Decision Analysis

    PubMed Central

    Gariepy, Aileen M.; Creinin, Mitchell D.; Schwarz, Eleanor B.; Smith, Kenneth J.

    2011-01-01

    OBJECTIVE To estimate the probability of successful sterilization after hysteroscopic or laparoscopic sterilization procedure. METHODS An evidence-based clinical decision analysis using a Markov model was performed to estimate the probability of a successful sterilization procedure using laparoscopic sterilization, hysteroscopic sterilization in the operating room, and hysteroscopic sterilization in the office. Procedure and follow-up testing probabilities for the model were estimated from published sources. RESULTS In the base case analysis, the proportion of women having a successful sterilization procedure on first attempt is 99% for laparoscopic, 88% for hysteroscopic in the operating room and 87% for hysteroscopic in the office. The probability of having a successful sterilization procedure within one year is 99% with laparoscopic, 95% for hysteroscopic in the operating room, and 94% for hysteroscopic in the office. These estimates for hysteroscopic success include approximately 6% of women who attempt hysteroscopically but are ultimately sterilized laparoscopically. Approximately 5% of women who have a failed hysteroscopic attempt decline further sterilization attempts. CONCLUSIONS Women choosing laparoscopic sterilization are more likely than those choosing hysteroscopic sterilization to have a successful sterilization procedure within one year. However, the risk of failed sterilization and subsequent pregnancy must be considered when choosing a method of sterilization. PMID:21775842
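
    The structure of such a calculation can be illustrated with a simple one-year decision tree for a first hysteroscopic attempt; the branch probabilities below are hypothetical values chosen only to roughly echo the reported proportions and are not the parameters of the authors' Markov model.

    ```python
    # Illustrative decision-tree arithmetic for "successful sterilization within
    # one year" after a first hysteroscopic attempt. Branch probabilities are
    # hypothetical and only show the structure of the calculation.
    p_first_attempt_success = 0.88   # hysteroscopic success on first attempt
    p_retry_hysteroscopic = 0.01     # failed first attempt, repeat hysteroscopic succeeds
    p_convert_laparoscopic = 0.06    # failed attempt(s), sterilized laparoscopically
    p_decline_further = 0.05         # failed attempt, declines further sterilization

    p_sterilized_within_year = (p_first_attempt_success
                                + p_retry_hysteroscopic
                                + p_convert_laparoscopic)
    print(f"probability sterilized within one year: {p_sterilized_within_year:.2f}")
    print(f"probability not sterilized: {1 - p_sterilized_within_year:.2f}")
    ```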

  3. Reliability of laparoscopic compared with hysteroscopic sterilization at 1 year: a decision analysis.

    PubMed

    Gariepy, Aileen M; Creinin, Mitchell D; Schwarz, Eleanor B; Smith, Kenneth J

    2011-08-01

    To estimate the probability of successful sterilization after an hysteroscopic or laparoscopic sterilization procedure. An evidence-based clinical decision analysis using a Markov model was performed to estimate the probability of a successful sterilization procedure using laparoscopic sterilization, hysteroscopic sterilization in the operating room, and hysteroscopic sterilization in the office. Procedure and follow-up testing probabilities for the model were estimated from published sources. In the base case analysis, the proportion of women having a successful sterilization procedure on the first attempt is 99% for laparoscopic sterilization, 88% for hysteroscopic sterilization in the operating room, and 87% for hysteroscopic sterilization in the office. The probability of having a successful sterilization procedure within 1 year is 99% with laparoscopic sterilization, 95% for hysteroscopic sterilization in the operating room, and 94% for hysteroscopic sterilization in the office. These estimates for hysteroscopic success include approximately 6% of women who attempt hysteroscopically but are ultimately sterilized laparoscopically. Approximately 5% of women who have a failed hysteroscopic attempt decline further sterilization attempts. Women choosing laparoscopic sterilization are more likely than those choosing hysteroscopic sterilization to have a successful sterilization procedure within 1 year. However, the risk of failed sterilization and subsequent pregnancy must be considered when choosing a method of sterilization.

  4. Surgical task analysis of simulated laparoscopic cholecystectomy with a navigation system.

    PubMed

    Sugino, T; Kawahira, H; Nakamura, R

    2014-09-01

    Advanced surgical procedures, which have become complex and difficult, increase the burden on surgeons. Quantitative analysis of surgical procedures can improve training, reduce variability, and enable optimization of surgical procedures. To this end, a surgical task analysis system was developed that uses only surgical navigation information. Division of the surgical procedure, task progress analysis, and task efficiency analysis were done. First, the procedure was divided into five stages. Second, the operating time and progress rate were recorded to document task progress during specific stages, including the dissecting task. Third, the speed of the surgical instrument motion (mean velocity and acceleration), as well as the size and overlap ratio of the approximate ellipse of the location log data distribution, was computed to estimate the task efficiency during each stage. These analysis methods were evaluated based on experimental validation with two groups of surgeons, i.e., skilled and "other" surgeons. The performance metrics and analytical parameters included incidents during the operation, the surgical environment, and the surgeon's skills or habits. Comparison of groups revealed that skilled surgeons tended to perform the procedure in less time and involved smaller regions; they also manipulated the surgical instruments more gently. Surgical task analysis developed for quantitative assessment of surgical procedures and surgical performance may provide practical methods and metrics for objective evaluation of surgical expertise.
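
    A small sketch of the task-efficiency metrics named in the abstract (mean instrument velocity and acceleration, and the area of an approximate ellipse summarizing the position-log spread), computed from a synthetic navigation log; the paper's exact definitions, coordinate conventions and stage segmentation may differ.

    ```python
    # Sketch of task-efficiency metrics from a surgical-navigation position log:
    # mean instrument speed, mean acceleration magnitude, and the area of a
    # 2-sigma covariance ellipse approximating the spread of tip positions.
    # The trajectory is synthetic; the paper's exact definitions may differ.
    import numpy as np

    rng = np.random.default_rng(4)
    dt = 0.05                                     # sampling interval, s
    t = np.arange(0, 30, dt)
    # Synthetic planar tip trajectory (mm): slow sweep plus jitter
    xy = np.column_stack([20 * np.sin(0.2 * t), 10 * np.cos(0.3 * t)])
    xy += rng.normal(scale=0.3, size=xy.shape)

    vel = np.diff(xy, axis=0) / dt                # velocity vectors, mm/s
    acc = np.diff(vel, axis=0) / dt               # acceleration vectors, mm/s^2
    mean_speed = np.linalg.norm(vel, axis=1).mean()
    mean_accel = np.linalg.norm(acc, axis=1).mean()

    cov = np.cov(xy, rowvar=False)                # 2x2 covariance of positions
    eigvals = np.linalg.eigvalsh(cov)
    ellipse_area = np.pi * np.prod(2.0 * np.sqrt(eigvals))   # 2-sigma ellipse, mm^2

    print(f"mean speed           : {mean_speed:.1f} mm/s")
    print(f"mean acceleration    : {mean_accel:.1f} mm/s^2")
    print(f"approx. ellipse area : {ellipse_area:.0f} mm^2")
    ```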

  5. [Costing nuclear medicine diagnostic procedures].

    PubMed

    Markou, Pavlos

    2005-01-01

    To the Editor: Referring to a recent special report about the cost analysis of twenty-nine nuclear medicine procedures, I would like to clarify some basic aspects of determining the costs of nuclear medicine procedures with various costing methodologies. The activity-based costing (ABC) method is a new approach to costing imaging services that can provide the most accurate cost data, but it is difficult to perform for nuclear medicine diagnostic procedures, because ABC requires determining and analyzing all direct and indirect costs of each procedure according to all of its activities. Traditional costing methods, such as estimating incomes and expenses per procedure or fixed and variable costs per procedure (widely used in break-even-point analysis), and the ratio-of-costs-to-charges method per procedure, may be performed easily in nuclear medicine departments to evaluate the variability of, and differences between, costs and reimbursement charges.

  6. What Do You Think You Are Measuring? A Mixed-Methods Procedure for Assessing the Content Validity of Test Items and Theory-Based Scaling

    PubMed Central

    Koller, Ingrid; Levenson, Michael R.; Glück, Judith

    2017-01-01

    The valid measurement of latent constructs is crucial for psychological research. Here, we present a mixed-methods procedure for improving the precision of construct definitions, determining the content validity of items, evaluating the representativeness of items for the target construct, generating test items, and analyzing items on a theoretical basis. To illustrate the mixed-methods content-scaling-structure (CSS) procedure, we analyze the Adult Self-Transcendence Inventory, a self-report measure of wisdom (ASTI, Levenson et al., 2005). A content-validity analysis of the ASTI items was used as the basis of psychometric analyses using multidimensional item response models (N = 1215). We found that the new procedure produced important suggestions concerning five subdimensions of the ASTI that were not identifiable using exploratory methods. The study shows that the application of the suggested procedure leads to a deeper understanding of latent constructs. It also demonstrates the advantages of theory-based item analysis. PMID:28270777

  7. Mixture-based gatekeeping procedures in adaptive clinical trials.

    PubMed

    Kordzakhia, George; Dmitrienko, Alex; Ishida, Eiji

    2018-01-01

    Clinical trials with data-driven decision rules often pursue multiple clinical objectives such as the evaluation of several endpoints or several doses of an experimental treatment. These complex analysis strategies give rise to "multivariate" multiplicity problems with several components or sources of multiplicity. A general framework for defining gatekeeping procedures in clinical trials with adaptive multistage designs is proposed in this paper. The mixture method is applied to build a gatekeeping procedure at each stage and inferences at each decision point (interim or final analysis) are performed using the combination function approach. An advantage of utilizing the mixture method is that it enables powerful gatekeeping procedures applicable to a broad class of settings with complex logical relationships among the hypotheses of interest. Further, the combination function approach supports flexible data-driven decisions such as a decision to increase the sample size or remove a treatment arm. The paper concludes with a clinical trial example that illustrates the methodology by applying it to develop an adaptive two-stage design with a mixture-based gatekeeping procedure.

  8. A GIS-based automated procedure for landslide susceptibility mapping by the Conditional Analysis method: the Baganza valley case study (Italian Northern Apennines)

    NASA Astrophysics Data System (ADS)

    Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo

    2006-08-01

    Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called “Conditional Analysis method” holds a special place for its conceptual simplicity. In this method, landslide susceptibility is simply expressed as landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows assessment of the reliability of the resulting model, while the simple mean deviation of the density values in the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km2) in the Italian Northern Apennines, considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation and slope/bedding relations. The analysis of the resulting 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
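
    The density computation at the core of the Conditional Analysis method can be sketched on synthetic rasters: cross the factor maps, then take landslide density (landslide cells over total cells) within each unique factor-class combination and class the result. The GRASS-based map preparation and validation steps described in the paper are not reproduced.

    ```python
    # Core of the Conditional Analysis method on synthetic rasters: landslide
    # susceptibility as landslide density within each unique combination of
    # instability-factor classes. Illustration only.
    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(5)
    n_cells = 100_000
    grid = pd.DataFrame({
        "lithology":   rng.integers(0, 4, n_cells),   # flattened factor rasters
        "slope_class": rng.integers(0, 5, n_cells),
        "aspect":      rng.integers(0, 4, n_cells),
        "landslide":   rng.random(n_cells) < 0.03,    # landslide inventory mask
    })

    density = (grid.groupby(["lithology", "slope_class", "aspect"])["landslide"]
                   .agg(cells="size", landslide_cells="sum"))
    density["density"] = density["landslide_cells"] / density["cells"]

    # Five susceptibility classes by quantiles of landslide density
    density["susceptibility"] = pd.qcut(density["density"].rank(method="first"), 5,
                                        labels=["very low", "low", "medium",
                                                "high", "very high"])
    print(density.sort_values("density", ascending=False).head())
    ```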

  9. Multicriteria decision analysis in ranking of analytical procedures for aldrin determination in water.

    PubMed

    Tobiszewski, Marek; Orłowski, Aleksander

    2015-03-27

    The study presents the possibility of applying multi-criteria decision analysis (MCDA) when choosing analytical procedures with low environmental impact. A type of MCDA, the Preference Ranking Organization Method for Enrichment Evaluations (PROMETHEE), was chosen as a versatile tool that meets all the requirements of analytical chemists as decision makers. Twenty-five analytical procedures for aldrin determination in water samples (as an example) were selected as input alternatives to the MCDA analysis. Nine different criteria describing the alternatives were chosen from different groups: metrological, economic, and, most importantly, environmental impact. The weights for each criterion were obtained from questionnaires sent to experts, giving three different scenarios for the MCDA results. The results of the analysis show that PROMETHEE is a very promising tool for choosing an analytical procedure with respect to its greenness. The rankings for all three scenarios placed solid-phase microextraction and liquid-phase microextraction based procedures high, while liquid-liquid extraction, solid-phase extraction and stir-bar sorptive extraction based procedures were placed low in the ranking. The results show that although some of the experts do not intentionally choose green analytical chemistry procedures, their MCDA choice is in accordance with green chemistry principles. The PROMETHEE ranking results were compared with more widely accepted green analytical chemistry tools, NEMI and Eco-Scale. As PROMETHEE involved more factors than NEMI, the assessment results were only weakly correlated. In contrast, the results of the Eco-Scale assessment were well correlated, as both methodologies involved similar assessment criteria. Copyright © 2015 Elsevier B.V. All rights reserved.
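
    A compact PROMETHEE II sketch (pairwise preferences with the "usual" criterion, weighted into positive, negative and net outranking flows) is shown below for a few hypothetical procedures and criteria; the actual decision matrix, weights and preference functions of the cited study are richer.

    ```python
    # Compact PROMETHEE II sketch: pairwise preferences with the "usual"
    # preference function, weighted into outranking flows, alternatives ranked
    # by net flow. The decision matrix, weights and criterion directions are
    # hypothetical, not those of the aldrin case study.
    import numpy as np

    alternatives = ["SPME-GC", "LLE-GC", "SPE-GC", "SBSE-GC"]
    # columns: LOD (lower better), recovery % (higher better),
    #          solvent use mL (lower better), cost (lower better)
    X = np.array([[0.01, 92.0,  0.0, 40.0],
                  [0.05, 95.0, 60.0, 25.0],
                  [0.02, 90.0, 15.0, 30.0],
                  [0.01, 88.0,  0.5, 55.0]])
    weights = np.array([0.3, 0.2, 0.3, 0.2])
    maximize = np.array([False, True, False, False])

    n = len(alternatives)
    pi = np.zeros((n, n))                       # aggregated preference indices
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            diff = np.where(maximize, X[a] - X[b], X[b] - X[a])
            pref = (diff > 0).astype(float)     # "usual" criterion: 1 if better
            pi[a, b] = np.sum(weights * pref)

    phi_plus = pi.sum(axis=1) / (n - 1)         # positive (leaving) flow
    phi_minus = pi.sum(axis=0) / (n - 1)        # negative (entering) flow
    net = phi_plus - phi_minus
    for i in np.argsort(net)[::-1]:
        print(f"{alternatives[i]:8s}  net flow = {net[i]:+.3f}")
    ```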

  10. Solution of quadratic matrix equations for free vibration analysis of structures.

    NASA Technical Reports Server (NTRS)

    Gupta, K. K.

    1973-01-01

    An efficient digital computer procedure and the related numerical algorithm are presented herein for the solution of quadratic matrix equations associated with free vibration analysis of structures. Such a procedure enables accurate and economical analysis of natural frequencies and associated modes of discretized structures. The numerically stable algorithm is based on the Sturm sequence method, which fully exploits the banded form of associated stiffness and mass matrices. The related computer program written in FORTRAN V for the JPL UNIVAC 1108 computer proves to be substantially more accurate and economical than other existing procedures of such analysis. Numerical examples are presented for two structures - a cantilever beam and a semicircular arch.
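
    The Sturm-sequence idea underlying the procedure can be sketched briefly. The following is a minimal, hedged illustration rather than the FORTRAN V program described above, and the small stiffness and mass matrices are made up: for a trial value λ, the number of negative pivots in an LDL^T factorization of K - λM equals the number of eigenvalues below λ (and hence the number of natural frequencies below √λ), which lets eigenvalues be bracketed by bisection.

      import numpy as np
      from scipy.linalg import ldl, eigh

      def count_below(K, M, lam):
          """Number of eigenvalues of K x = lambda M x that lie below lam (Sturm count)."""
          _, D, _ = ldl(K - lam * M)          # symmetric indefinite factorization
          return int(np.sum(np.linalg.eigvalsh(D) < 0.0))

      # Hypothetical 3-DOF stiffness and mass matrices (assumed example data).
      K = np.array([[ 2.0, -1.0,  0.0],
                    [-1.0,  2.0, -1.0],
                    [ 0.0, -1.0,  1.0]])
      M = np.eye(3)

      # Bisection for the lowest eigenvalue, bracketed so at least one eigenvalue lies below "hi".
      lo, hi = 0.0, 1.0
      while count_below(K, M, hi) < 1:
          hi *= 2.0
      for _ in range(60):
          mid = 0.5 * (lo + hi)
          lo, hi = (lo, mid) if count_below(K, M, mid) >= 1 else (mid, hi)

      print("Sturm/bisection estimate:", hi)
      print("reference (scipy eigh):  ", eigh(K, M, eigvals_only=True)[0])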

  11. Stormwater Characterization and Lagoon Sediment Analysis, Grand Forks Air Force Base, North Dakota

    DTIC Science & Technology

    1990-08-01

    tetrachloroethylene, and 0.0026 mg/l ethyl benzene. Analyses showed no pesticides. 4. Extraction Procedure (EP) Analysis. An AFOEHL contractor performed EP extraction ... runoff met North Dakota state stream standards. Lagoon sediment did not contain Extraction Procedure hazardous chemicals. Stormwater runoff exceeded ... Standards for Water Quality for the State of North Dakota (Extracts) ... Site/Analysis Summary ... Lift Station Flow Records ... Wastewater

  12. Office-based deep sedation for pediatric ophthalmologic procedures using a sedation service model.

    PubMed

    Lalwani, Kirk; Tomlinson, Matthew; Koh, Jeffrey; Wheeler, David

    2012-01-01

    Aims. (1) To assess the efficacy and safety of pediatric office-based sedation for ophthalmologic procedures using a pediatric sedation service model. (2) To assess the reduction in hospital charges of this model of care delivery compared to the operating room (OR) setting for similar procedures. Background. Sedation is used to facilitate pediatric procedures and to immobilize patients for imaging and examination. We believe that the pediatric sedation service model can be used to facilitate office-based deep sedation for brief ophthalmologic procedures and examinations. Methods. After IRB approval, all children who underwent office-based ophthalmologic procedures at our institution between January 1, 2000 and July 31, 2008 were identified using the sedation service database and the electronic health record. A comparison of hospital charges between similar procedures in the operating room was performed. Results. A total of 855 procedures were reviewed. Procedure completion rate was 100% (C.I. 99.62-100). There were no serious complications or unanticipated admissions. Our analysis showed a significant reduction in hospital charges (average of $1287 per patient) as a result of absent OR and recovery unit charges. Conclusions. Pediatric ophthalmologic minor procedures can be performed using a sedation service model with significant reductions in hospital charges.

  13. Cost Savings and Patient Experiences of a Clinic-Based, Wide-Awake Hand Surgery Program at a Military Medical Center: A Critical Analysis of the First 100 Procedures.

    PubMed

    Rhee, Peter C; Fischer, Michelle M; Rhee, Laura S; McMillan, Ha; Johnson, Anthony E

    2017-03-01

    Wide-awake, local anesthesia, no tourniquet (WALANT) hand surgery was developed to improve access to hand surgery care while optimizing medical resources. Hand surgery in the clinic setting may result in substantial cost savings for the United States Military Health Care System (MHS) and provide a safe alternative to performing similar procedures in the operating room. A prospective cohort study was performed on the first 100 consecutive clinic-based WALANT hand surgery procedures performed at a military medical center from January 2014 to September 2015 by a single hand surgeon. Cost savings analysis was performed by using the Medical Expense and Performance Reporting System, the standard cost accounting system for the MHS, to compare procedures performed in the clinic versus the operating room during the study period. A study specific questionnaire was obtained for 66 procedures to evaluate the patient's experience. For carpal tunnel release (n = 34) and A1 pulley release (n = 33), there were 85% and 70% cost savings by having the procedures performed in clinic under WALANT compared with the main operating room, respectively. During the study period, carpal tunnel release, A1 pulley release, and de Quervain release performed in the clinic instead of the operating room amounted to $393,100 in cost savings for the MHS. There were no adverse events during the WALANT procedure. A clinic-based WALANT hand surgery program at a military medical center results in considerable cost savings for the MHS. Economic/Decision Analysis IV. Copyright © 2017 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  14. Past and present work practices of European interventional cardiologists in the context of radiation protection of the eye lens-results of the EURALOC study.

    PubMed

    Domienik-Andrzejewska, Joanna; Ciraj-Bjelac, Olivera; Askounis, Panagiotis; Covens, Peter; Dragusin, Octavian; Jacob, Sophie; Farah, Jad; Gianicolo, Emilio; Padovani, Renato; Teles, Pedro; Widmark, Anders; Struelens, Lara

    2018-05-21

    This paper investigates over five decades of work practices in interventional cardiology, with an emphasis on radiation protection. The analysis is based on data from more than 400 cardiologists from various European countries recruited for the EURALOC study and collected in the period from 2014 to 2016. Information on the types of procedures performed and their annual mean number, fluoroscopy time, access site choice, x-ray units and radiation protection means used was collected using an occupational questionnaire. Based on the specific European data, changes in each parameter have been analysed over decades, while country-specific data analysis has allowed us to determine the differences in local practices. In particular, based on the collected data, the typical workload of a European cardiologist working in a haemodynamic room and an electrophysiology room was specified for various types of procedures. The results showed that when working in a haemodynamic room, a transparent ceiling-suspended lead shield or lead glasses are necessary in order to remain below the recommended eye lens dose limit of 20 mSv. Moreover, the analysis revealed that new, more complex cardiac procedures such as chronic total occlusion, valvuloplasty and pulmonary vein isolation for atrial fibrillation ablation might contribute substantially to annual doses, although they are relatively rarely performed. The results revealed that considerable progress has been made in the use of radiation protection tools. While their use in electrophysiology procedures is not yet universal, the situation in haemodynamic procedures is rather encouraging, as ceiling-suspended shields are used in 90% of cases, while the combination of ceiling shield and lead glasses is noted in more than 40% of the procedures. However, we find that 7% of haemodynamic procedures are still performed without any radiation protection tools.

  15. A Statistical Analysis of Brain Morphology Using Wild Bootstrapping

    PubMed Central

    Ibrahim, Joseph G.; Tang, Niansheng; Rowe, Daniel B.; Hao, Xuejun; Bansal, Ravi; Peterson, Bradley S.

    2008-01-01

    Methods for the analysis of brain morphology, including voxel-based morphometry and surface-based morphometry, have been used to detect associations between brain structure and covariates of interest, such as diagnosis, severity of disease, age, IQ, and genotype. The statistical analysis of morphometric measures usually involves two statistical procedures: 1) invoking a statistical model at each voxel (or point) on the surface of the brain or brain subregion, followed by mapping test statistics (e.g., t test) or their associated p values at each of those voxels; 2) correction for the multiple statistical tests conducted across all voxels on the surface of the brain region under investigation. We propose the use of new statistical methods for each of these procedures. We first use a heteroscedastic linear model to test the associations between the morphological measures at each voxel on the surface of the specified subregion (e.g., cortical or subcortical surfaces) and the covariates of interest. Moreover, we develop a robust test procedure that is based on a resampling method, called wild bootstrapping. This procedure assesses the statistical significance of the associations between a measure of given brain structure and the covariates of interest. The value of this robust test procedure lies in its computational simplicity and in its applicability to a wide range of imaging data, including data from both anatomical and functional magnetic resonance imaging (fMRI). Simulation studies demonstrate that this robust test procedure can accurately control the family-wise error rate. We demonstrate the application of this robust test procedure to the detection of statistically significant differences in the morphology of the hippocampus over time across gender groups in a large sample of healthy subjects. PMID:17649909
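
    To make the resampling step concrete, here is a hedged, minimal sketch of a wild bootstrap test for a single regression coefficient. The synthetic data, the intercept-only null model and the Rademacher weights are assumptions for illustration; the paper's heteroscedastic surface model is not reproduced.

      import numpy as np

      rng = np.random.default_rng(1)

      # Synthetic example: morphometric measure y at one voxel, covariate x (e.g. age), n subjects.
      n = 80
      x = rng.normal(size=n)
      y = rng.normal(size=n) * (1.0 + 0.5 * np.abs(x))   # heteroscedastic data generated under H0

      X = np.column_stack([np.ones(n), x])

      def t_stat(y_vec):
          """Ordinary least-squares t statistic for the slope coefficient."""
          beta, *_ = np.linalg.lstsq(X, y_vec, rcond=None)
          resid = y_vec - X @ beta
          sigma2 = resid @ resid / (n - X.shape[1])
          se = np.sqrt(sigma2 * np.linalg.inv(X.T @ X)[1, 1])
          return beta[1] / se

      t_obs = t_stat(y)

      # Wild bootstrap under H0 (slope = 0): keep the null fit, flip residual signs at random.
      y0_fit = np.full(n, y.mean())               # null model: intercept only
      r0 = y - y0_fit
      B = 2000
      t_boot = np.empty(B)
      for b in range(B):
          v = rng.choice([-1.0, 1.0], size=n)     # Rademacher multipliers
          t_boot[b] = t_stat(y0_fit + r0 * v)

      p_value = np.mean(np.abs(t_boot) >= abs(t_obs))
      print(f"observed t = {t_obs:.2f}, wild-bootstrap p = {p_value:.3f}")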

  16. You Learn by Your Mistakes. Effective Training Strategies Based on the Analysis of Video-Recorded Worked-Out Examples

    ERIC Educational Resources Information Center

    Cattaneo, Alberto A. P.; Boldrini, Elena

    2017-01-01

    This paper presents an empirical study on procedural learning from errors that was conducted within the field of vocational education. It examines whether, and to what extent, procedural learning can benefit more from the detection and written analysis of errors (experimental condition) than from the correct elements (control group). The study…

  17. Objective analysis of observational data from the FGGE observing systems

    NASA Technical Reports Server (NTRS)

    Baker, W.; Edelmann, D.; Iredell, M.; Han, D.; Jakkempudi, S.

    1981-01-01

    An objective analysis procedure for updating the GLAS second and fourth order general atmospheric circulation models using observational data from the first GARP global experiment is described. The objective analysis procedure is based on a successive corrections method and the model is updated in a data assimilation cycle. Preparation of the observational data for analysis and the objective analysis scheme are described. The organization of the program and description of the required data sets are presented. The program logic and detailed descriptions of each subroutine are given.
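
    The successive corrections idea mentioned above can be illustrated with a short, hedged sketch: a generic Cressman-style scheme on made-up 1-D data. The influence radii, weights and observations are assumptions; the GLAS system's actual analysis and assimilation details are not reproduced.

      import numpy as np

      # Hypothetical 1-D grid, first-guess field and scattered observations.
      grid_x = np.linspace(0.0, 10.0, 101)
      background = np.sin(grid_x)                      # first-guess (model) field
      obs_x = np.array([1.3, 4.7, 8.2])
      obs_val = np.array([1.2, -1.1, 0.8])             # assumed observed values

      analysis = background.copy()
      for R in [4.0, 2.0, 1.0]:                        # successively smaller influence radii
          # Innovation: observation minus analysis interpolated to the observation point.
          innov = obs_val - np.interp(obs_x, grid_x, analysis)
          # Cressman weights w = (R^2 - r^2) / (R^2 + r^2), zero outside the radius.
          r2 = (grid_x[:, None] - obs_x[None, :]) ** 2
          w = np.where(r2 < R**2, (R**2 - r2) / (R**2 + r2), 0.0)
          # Correct each grid point by the weighted mean innovation of nearby observations.
          wsum = w.sum(axis=1)
          correction = np.where(wsum > 0, (w * innov).sum(axis=1) / np.maximum(wsum, 1e-12), 0.0)
          analysis += correction

      print("max correction applied:", np.max(np.abs(analysis - background)))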

  18. Evidence-based value of subcutaneous surgical wound drainage: the largest systematic review and meta-analysis.

    PubMed

    Kosins, Aaron M; Scholz, Thomas; Cetinkaya, Mine; Evans, Gregory R D

    2013-08-01

    The purpose of this study was to determine the evidence-based value of prophylactic drainage of subcutaneous wounds in surgery. An electronic search was performed. Articles comparing subcutaneous prophylactic drainage with no drainage were identified and classified by level of evidence. If sufficient randomized controlled trials were included, a meta-analysis was performed using the random-effects model. Fifty-two randomized controlled trials were included in the meta-analysis, and subgroups were determined by specific surgical procedures or characteristics (cesarean delivery, abdominal wound, breast reduction, breast biopsy, femoral wound, axillary lymph node dissection, hip and knee arthroplasty, obesity, and clean-contaminated wound). Studies were compared for the following endpoints: hematoma, wound healing issues, seroma, abscess, and infection. Fifty-two studies with a total of 6930 operations were identified as suitable for this analysis. There were 3495 operations in the drain group and 3435 in the no-drain group. Prophylactic subcutaneous drainage offered a statistically significant advantage only for (1) prevention of hematomas in breast biopsy procedures and (2) prevention of seromas in axillary node dissections. In all other procedures studied, drainage did not offer an advantage. Many surgical operations can be performed safely without prophylactic drainage. Surgeons can consider omitting drains after cesarean section, breast reduction, abdominal wounds, femoral wounds, and hip and knee joint replacement. Furthermore, surgeons should consider not placing drains prophylactically in obese patients. However, drain placement following a surgical procedure is the surgeon's choice and can be based on multiple factors beyond the type of procedure being performed or the patient's body habitus. Therapeutic, II.

  19. A Framework for Creating a Function-based Design Tool for Failure Mode Identification

    NASA Technical Reports Server (NTRS)

    Arunajadai, Srikesh G.; Stone, Robert B.; Tumer, Irem Y.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Knowledge of potential failure modes during design is critical for prevention of failures. Currently, industries use procedures such as Failure Modes and Effects Analysis (FMEA), Fault Tree analysis, or Failure Modes, Effects and Criticality analysis (FMECA), as well as knowledge and experience, to determine potential failure modes. When new products are being developed there is often a lack of sufficient knowledge of potential failure modes and/or a lack of sufficient experience to identify all failure modes. This gives rise to a situation in which engineers are unable to extract maximum benefits from the above procedures. This work describes a function-based failure identification methodology, which would act as a storehouse of information and experience, providing useful information about the potential failure modes for the design under consideration, as well as enhancing the usefulness of procedures like FMEA. As an example, the method is applied to fifteen products and the benefits are illustrated.

  20. Classifying breast cancer surgery: a novel, complexity-based system for oncological, oncoplastic and reconstructive procedures, and proof of principle by analysis of 1225 operations in 1166 patients.

    PubMed

    Hoffmann, Jürgen; Wallwiener, Diethelm

    2009-04-08

    One of the basic prerequisites for generating evidence-based data is the availability of classification systems. Attempts to date to classify breast cancer operations have focussed on specific problems, e.g. the avoidance of secondary corrective surgery for surgical defects, rather than taking a generic approach. Starting from an existing, simpler empirical scheme based on the complexity of breast surgical procedures, which was used in-house primarily in operative report-writing, a novel classification of ablative and breast-conserving procedures initially needed to be developed and elaborated systematically. To obtain proof of principle, a prospectively planned analysis of patient records for all major breast cancer-related operations performed at our breast centre in 2005 and 2006 was conducted using the new classification. Data were analysed using basic descriptive statistics such as frequency tables. A novel two-type, six-tier classification system comprising 12 main categories, 13 subcategories and 39 sub-subcategories of oncological, oncoplastic and reconstructive breast cancer-related surgery was successfully developed. Our system permitted unequivocal classification, without exception, of all 1225 procedures performed in 1166 breast cancer patients in 2005 and 2006. Breast cancer-related surgical procedures can be generically classified according to their surgical complexity. Analysis of all major procedures performed at our breast centre during the study period provides proof of principle for this novel classification system. We envisage various applications for this classification, including uses in randomised clinical trials, guideline development, specialist surgical training, continuing professional development as well as quality of care and public health research.

  1. A vibration-based health monitoring program for a large and seismically vulnerable masonry dome

    NASA Astrophysics Data System (ADS)

    Pecorelli, M. L.; Ceravolo, R.; De Lucia, G.; Epicoco, R.

    2017-05-01

    Vibration-based health monitoring of monumental structures must rely on efficient and, as far as possible, automatic modal analysis procedures. Relatively low excitation energy provided by traffic, wind and other sources is usually sufficient to detect structural changes, as those produced by earthquakes and extreme events. Above all, in-operation modal analysis is a non-invasive diagnostic technique that can support optimal strategies for the preservation of architectural heritage, especially if complemented by model-driven procedures. In this paper, the preliminary steps towards a fully automated vibration-based monitoring of the world’s largest masonry oval dome (internal axes of 37.23 by 24.89 m) are presented. More specifically, the paper reports on signal treatment operations conducted to set up the permanent dynamic monitoring system of the dome and to realise a robust automatic identification procedure. Preliminary considerations on the effects of temperature on dynamic parameters are finally reported.

  2. Probability and Visual Aids for Assessing Intervention Effectiveness in Single-Case Designs: A Field Test.

    PubMed

    Manolov, Rumen; Jamieson, Matthew; Evans, Jonathan J; Sierra, Vicenta

    2015-09-01

    Single-case data analysis still relies heavily on visual inspection, and, at the same time, it is not clear to what extent the results of different quantitative procedures converge in identifying an intervention effect and its magnitude when applied to the same data; this is the type of evidence provided here for two procedures. One of the procedures, included due to the importance of providing objective criteria to visual analysts, is a visual aid fitting and projecting a split-middle trend while taking into account data variability. The other procedure converts several different metrics into probabilities, making their results comparable. In the present study, we explore to what extent these two procedures coincide on the magnitude of the intervention effect in a set of studies stemming from a recent meta-analysis. The procedures concur to a greater extent with the values of the indices computed and with each other and, to a lesser extent, with our own visual analysis. For distinguishing smaller from larger effects, the probability-based approach seems somewhat better suited. Moreover, the results of the field test suggest that the latter is a reasonably good mechanism for translating different metrics into similar labels. User-friendly R code is provided for promoting the use of the visual aid, together with a quantification based on nonoverlap and the label provided by the probability approach. © The Author(s) 2015.

  3. Cost Utility Analysis of Cervical Therapeutic Medial Branch Blocks in Managing Chronic Neck Pain

    PubMed Central

    Manchikanti, Laxmaiah; Pampati, Vidyasagar; Kaye, Alan D.; Hirsch, Joshua A.

    2017-01-01

    Background: Controlled diagnostic studies have established the prevalence of cervical facet joint pain to range from 36% to 67% based on the criterion standard of ≥ 80% pain relief. Treatment of cervical facet joint pain has been described with Level II evidence of effectiveness for therapeutic facet joint nerve blocks and radiofrequency neurotomy and with no significant evidence for intraarticular injections. However, there have not been any cost effectiveness or cost utility analysis studies performed in managing chronic neck pain with or without headaches with cervical facet joint interventions. Study Design: Cost utility analysis based on the results of a double-blind, randomized, controlled trial of cervical therapeutic medial branch blocks in managing chronic neck pain. Objectives: To assess cost utility of therapeutic cervical medial branch blocks in managing chronic neck pain. Methods: A randomized trial was conducted in a specialty referral private practice interventional pain management center in the United States. This trial assessed the clinical effectiveness of therapeutic cervical medial branch blocks with or without steroids for an established diagnosis of cervical facet joint pain by means of controlled diagnostic blocks. Cost utility analysis was performed with direct payment data for the procedures for a total of 120 patients over a period of 2 years from this trial based on reimbursement rates of 2016. The payment data provided direct procedural costs without inclusion of drug treatments. An additional 40% was added to procedural costs with multiplication of a factor of 1.67 to provide estimated total costs including direct and indirect costs, based on highly regarded surgical literature. Outcome measures included significant improvement defined as at least a 50% improvement with reduction in pain and disability status with a combined 50% or more reduction in pain in Neck Disability Index (NDI) scores. Results: The results showed direct procedural costs per one-year improvement in quality adjusted life year (QALY) of United States Dollar (USD) of $2,552, and overall costs of USD $4,261. Overall, each patient on average received 5.7 ± 2.2 procedures over a period of 2 years. Average significant improvement per procedure was 15.6 ± 12.3 weeks and average significant improvement in 2 years per patient was 86.0 ± 24.6 weeks. Limitations: The limitations of this cost utility analysis are that data are based on a single center evaluation. Only costs of therapeutic interventional procedures and physician visits were included, with extrapolation of indirect costs. Conclusion: The cost utility analysis of therapeutic cervical medial branch blocks in the treatment of chronic neck pain non-responsive to conservative management demonstrated clinical effectiveness and cost utility at USD $4,261 per one year of QALY. PMID:29200944

  4. Cost Utility Analysis of Cervical Therapeutic Medial Branch Blocks in Managing Chronic Neck Pain.

    PubMed

    Manchikanti, Laxmaiah; Pampati, Vidyasagar; Kaye, Alan D; Hirsch, Joshua A

    2017-01-01

    Background: Controlled diagnostic studies have established the prevalence of cervical facet joint pain to range from 36% to 67% based on the criterion standard of ≥ 80% pain relief. Treatment of cervical facet joint pain has been described with Level II evidence of effectiveness for therapeutic facet joint nerve blocks and radiofrequency neurotomy and with no significant evidence for intraarticular injections. However, there have not been any cost effectiveness or cost utility analysis studies performed in managing chronic neck pain with or without headaches with cervical facet joint interventions. Study Design: Cost utility analysis based on the results of a double-blind, randomized, controlled trial of cervical therapeutic medial branch blocks in managing chronic neck pain. Objectives: To assess cost utility of therapeutic cervical medial branch blocks in managing chronic neck pain. Methods: A randomized trial was conducted in a specialty referral private practice interventional pain management center in the United States. This trial assessed the clinical effectiveness of therapeutic cervical medial branch blocks with or without steroids for an established diagnosis of cervical facet joint pain by means of controlled diagnostic blocks. Cost utility analysis was performed with direct payment data for the procedures for a total of 120 patients over a period of 2 years from this trial based on reimbursement rates of 2016. The payment data provided direct procedural costs without inclusion of drug treatments. An additional 40% was added to procedural costs with multiplication of a factor of 1.67 to provide estimated total costs including direct and indirect costs, based on highly regarded surgical literature. Outcome measures included significant improvement defined as at least a 50% improvement with reduction in pain and disability status with a combined 50% or more reduction in pain in Neck Disability Index (NDI) scores. Results: The results showed direct procedural costs per one-year improvement in quality adjusted life year (QALY) of United States Dollar (USD) of $2,552, and overall costs of USD $4,261. Overall, each patient on average received 5.7 ± 2.2 procedures over a period of 2 years. Average significant improvement per procedure was 15.6 ± 12.3 weeks and average significant improvement in 2 years per patient was 86.0 ± 24.6 weeks. Limitations: The limitations of this cost utility analysis are that data are based on a single center evaluation. Only costs of therapeutic interventional procedures and physician visits were included, with extrapolation of indirect costs. Conclusion: The cost utility analysis of therapeutic cervical medial branch blocks in the treatment of chronic neck pain non-responsive to conservative management demonstrated clinical effectiveness and cost utility at USD $4,261 per one year of QALY.

  5. Development of Multiobjective Optimization Techniques for Sonic Boom Minimization

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John Narayan; Pagaldipti, Naryanan S.

    1996-01-01

    A discrete, semi-analytical sensitivity analysis procedure has been developed for calculating aerodynamic design sensitivities. The sensitivities of the flow variables and the grid coordinates are numerically calculated using direct differentiation of the respective discretized governing equations. The sensitivity analysis techniques are adapted within a parabolized Navier-Stokes equations solver. Aerodynamic design sensitivities for high speed wing-body configurations are calculated using the semi-analytical sensitivity analysis procedures. Representative results obtained compare well with those obtained using the finite difference approach and establish the computational efficiency and accuracy of the semi-analytical procedures. Multidisciplinary design optimization procedures have been developed for aerospace applications, namely gas turbine blades and high speed wing-body configurations. In complex applications, the coupled optimization problems are decomposed into sublevels using multilevel decomposition techniques. In cases with multiple objective functions, formal multiobjective formulations such as the Kreisselmeier-Steinhauser function approach and the modified global criteria approach have been used. Nonlinear programming techniques for continuous design variables and a hybrid optimization technique, based on a simulated annealing algorithm, for discrete design variables have been used for solving the optimization problems. The optimization procedure for gas turbine blades improves the aerodynamic and heat transfer characteristics of the blades. The two-dimensional, blade-to-blade aerodynamic analysis is performed using a panel code. The blade heat transfer analysis is performed using an in-house developed finite element procedure. The optimization procedure yields blade shapes with significantly improved velocity and temperature distributions. The multidisciplinary design optimization procedures for high speed wing-body configurations simultaneously improve the aerodynamic, the sonic boom and the structural characteristics of the aircraft. The flow solution is obtained using a comprehensive parabolized Navier-Stokes solver. Sonic boom analysis is performed using an extrapolation procedure. The aircraft wing load carrying member is modeled as either an isotropic or a composite box beam. The isotropic box beam is analyzed using thin wall theory. The composite box beam is analyzed using a finite element procedure. The developed optimization procedures yield significant improvements in all the performance criteria and provide interesting design trade-offs. The semi-analytical sensitivity analysis techniques offer significant computational savings and allow the use of comprehensive analysis procedures within design optimization studies.
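
    As a hedged illustration of the Kreisselmeier-Steinhauser aggregation named above (a minimal sketch with arbitrary objective values and draw-down factor; not the study's actual multiobjective formulation), the KS function collapses several objectives or constraints into one smooth envelope function.

      import numpy as np

      def ks_aggregate(f, rho=50.0):
          """Kreisselmeier-Steinhauser envelope of the component functions f_i.

          KS(f) = f_max + (1/rho) * ln( sum_i exp(rho * (f_i - f_max)) )
          Larger rho tracks max(f) more closely while staying differentiable.
          """
          f = np.asarray(f, dtype=float)
          f_max = f.max()
          return f_max + np.log(np.exp(rho * (f - f_max)).sum()) / rho

      # Hypothetical normalized objectives (e.g. drag, boom loudness, structural weight).
      objectives = [0.82, 0.65, 0.90]
      for rho in (5.0, 50.0, 500.0):
          print(f"rho = {rho:6.1f}  KS = {ks_aggregate(objectives, rho):.4f}  max = {max(objectives):.4f}")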

  6. Assessment of Matrix Multiplication Learning with a Rule-Based Analytical Model--"A Bayesian Network Representation"

    ERIC Educational Resources Information Center

    Zhang, Zhidong

    2016-01-01

    This study explored an alternative assessment procedure to examine learning trajectories of matrix multiplication. It took rule-based analytical and cognitive task analysis methods specifically to break down operation rules for a given matrix multiplication. Based on the analysis results, a hierarchical Bayesian network, an assessment model,…

  7. Analysis of the Accuracy of a Proposed Target Motion Analysis Procedure

    DTIC Science & Technology

    1989-09-01


  8. Equilibrium paths analysis of materials with rheological properties by using the chaos theory

    NASA Astrophysics Data System (ADS)

    Bednarek, Paweł; Rządkowski, Jan

    2018-01-01

    Numerical equilibrium-path analysis of a material with random rheological properties using standard procedures and specialist computer programs was not successful. A proper solution for the analysed heuristic model of the material was obtained on the basis of elements of chaos theory and neural networks. The paper discusses the mathematical background of the computer programs used and elaborates the properties of the attractor employed in the analysis. Results of the numerical analysis are presented for the procedures used, in both numerical and graphical form.

  9. Physics faculty beliefs and values about the teaching and learning of problem solving. II. Procedures for measurement and analysis

    NASA Astrophysics Data System (ADS)

    Henderson, Charles; Yerushalmi, Edit; Kuo, Vince H.; Heller, Kenneth; Heller, Patricia

    2007-12-01

    To identify and describe the basis upon which instructors make curricular and pedagogical decisions, we have developed an artifact-based interview and an analysis technique based on multilayered concept maps. The policy capturing technique used in the interview asks instructors to make judgments about concrete instructional artifacts similar to those they likely encounter in their teaching environment. The analysis procedure alternatively employs both an a priori systems view analysis and an emergent categorization to construct a multilayered concept map, which is a hierarchically arranged set of concept maps where child maps include more details than parent maps. Although our goal was to develop a model of physics faculty beliefs about the teaching and learning of problem solving in the context of an introductory calculus-based physics course, the techniques described here are applicable to a variety of situations in which instructors make decisions that influence teaching and learning.

  10. Designing simulator-based training: an approach integrating cognitive task analysis and four-component instructional design.

    PubMed

    Tjiam, Irene M; Schout, Barbara M A; Hendrikx, Ad J M; Scherpbier, Albert J J M; Witjes, J Alfred; van Merriënboer, Jeroen J G

    2012-01-01

    Most studies of simulator-based surgical skills training have focused on the acquisition of psychomotor skills, but surgical procedures are complex tasks requiring both psychomotor and cognitive skills. As skills training is modelled on expert performance consisting partly of unconscious automatic processes that experts are not always able to explicate, simulator developers should collaborate with educational experts and physicians in developing efficient and effective training programmes. This article presents an approach to designing simulator-based skill training comprising cognitive task analysis integrated with instructional design according to the four-component instructional design model. This theory-driven approach is illustrated by a description of how it was used in the development of simulator-based training for the nephrostomy procedure.

  11. Forensic analysis of Salvia divinorum using multivariate statistical procedures. Part I: discrimination from related Salvia species.

    PubMed

    Willard, Melissa A Bodnar; McGuffin, Victoria L; Smith, Ruth Waddell

    2012-01-01

    Salvia divinorum is a hallucinogenic herb that is internationally regulated. In this study, salvinorin A, the active compound in S. divinorum, was extracted from S. divinorum plant leaves using a 5-min extraction with dichloromethane. Four additional Salvia species (Salvia officinalis, Salvia guaranitica, Salvia splendens, and Salvia nemorosa) were extracted using this procedure, and all extracts were analyzed by gas chromatography-mass spectrometry. Differentiation of S. divinorum from other Salvia species was successful based on visual assessment of the resulting chromatograms. To provide a more objective comparison, the total ion chromatograms (TICs) were subjected to principal components analysis (PCA). Prior to PCA, the TICs were subjected to a series of data pretreatment procedures to minimize non-chemical sources of variance in the data set. Successful discrimination of S. divinorum from the other four Salvia species was possible based on visual assessment of the PCA scores plot. To provide a numerical assessment of the discrimination, a series of statistical procedures such as Euclidean distance measurement, hierarchical cluster analysis, Student's t tests, Wilcoxon rank-sum tests, and Pearson product moment correlation were also applied to the PCA scores. The statistical procedures were then compared to determine the advantages and disadvantages for forensic applications.
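
    The chemometric core of the workflow, pretreating total ion chromatograms and projecting them with PCA, can be sketched briefly. This is a hedged, minimal example on synthetic chromatograms; the normalization choices, array shapes and marker peak are assumptions, not the published pretreatment protocol.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(2)

      # Synthetic stand-in for TICs: 25 extracts x 500 retention-time points.
      n_samples, n_points = 25, 500
      tics = rng.random((n_samples, n_points))
      tics[:5, 120:130] += 5.0          # pretend the first five samples share a marker peak

      # Simple pretreatment: total-area normalization followed by mean centering.
      tics = tics / tics.sum(axis=1, keepdims=True)
      tics = tics - tics.mean(axis=0)

      # Project onto the first two principal components and inspect the scores.
      pca = PCA(n_components=2)
      scores = pca.fit_transform(tics)
      print("explained variance ratio:", np.round(pca.explained_variance_ratio_, 3))
      print("PC1 scores of first five samples:", np.round(scores[:5, 0], 3))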

  12. Completely automated modal analysis procedure based on the combination of different OMA methods

    NASA Astrophysics Data System (ADS)

    Ripamonti, Francesco; Bussini, Alberto; Resta, Ferruccio

    2018-03-01

    In this work a completely automated output-only Modal Analysis procedure is presented and all its benefits are listed. Based on the merging of different Operational Modal Analysis methods and a statistical approach, the identification process has been improved becoming more robust and giving as results only the real natural frequencies, damping ratios and mode shapes of the system. The effect of the temperature can be taken into account as well, leading to the creation of a better tool for automated Structural Health Monitoring. The algorithm has been developed and tested on a numerical model of a scaled three-story steel building present in the laboratories of Politecnico di Milano.

  13. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    USGS Publications Warehouse

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies the current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response-based rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be directly included in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
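
    A hedged, minimal sketch of the Monte Carlo step described above follows. The lognormal fragility parameters, the demand model and the correlation value are all assumptions chosen for illustration, not the paper's calibrated methodology: correlated component demands are sampled and compared with sampled component capacities to tally damage probabilities.

      import numpy as np

      rng = np.random.default_rng(3)

      # Assumed lognormal fragility (capacity) parameters for two components: median, log-std.
      cap_median = np.array([1.2, 0.9])     # e.g. floor spectral acceleration capacity (g)
      cap_beta = np.array([0.4, 0.5])

      # Assumed lognormal demand model from response-history analysis, with correlated demands.
      dem_median = np.array([0.8, 0.7])
      dem_beta = np.array([0.3, 0.3])
      rho = 0.6
      cov = np.array([[dem_beta[0]**2, rho * dem_beta[0] * dem_beta[1]],
                      [rho * dem_beta[0] * dem_beta[1], dem_beta[1]**2]])

      n_sim = 100_000
      ln_dem = rng.multivariate_normal(np.log(dem_median), cov, size=n_sim)
      ln_cap = rng.normal(np.log(cap_median), cap_beta, size=(n_sim, 2))

      failed = ln_dem > ln_cap              # demand exceeds capacity -> component damaged
      print("P(component 1 damaged) ~", failed[:, 0].mean())
      print("P(component 2 damaged) ~", failed[:, 1].mean())
      print("P(both damaged)        ~", failed.all(axis=1).mean())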

  14. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    NASA Technical Reports Server (NTRS)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of probability finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show the PFEM is a very powerful tool in determining the second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties and crack length, orientation, and location.

  15. Quantitative Analysis of Science and Chemistry Textbooks for Indicators of Reform: A complementary perspective

    NASA Astrophysics Data System (ADS)

    Kahveci, Ajda

    2010-07-01

    In this study, multiple thematically based and quantitative analysis procedures were utilized to explore the effectiveness of Turkish chemistry and science textbooks in terms of their reflection of reform. The themes gender equity, questioning level, science vocabulary load, and readability level provided the conceptual framework for the analyses. An unobtrusive research method, content analysis, was used by coding the manifest content and counting the frequency of words, photographs, drawings, and questions by cognitive level. The context was an undergraduate chemistry teacher preparation program at a large public university in a metropolitan area in northwestern Turkey. Forty preservice chemistry teachers were guided to analyze 10 middle school science and 10 high school chemistry textbooks. Overall, the textbooks included unfair gender representations, a considerably higher number of input and processing than output level questions, and a high load of science terminology. The textbooks failed to provide sufficient empirical evidence to be considered gender equitable and inquiry-based. The quantitative approach employed for evaluation contrasts with a more interpretive approach, and has the potential to depict textbook profiles more reliably, complementing the commonly employed qualitative procedures. Implications suggest that further work in this line is needed on calibrating the analysis procedures with science textbooks used in different international settings. The procedures could be modified and improved to meet specific evaluation needs. In the Turkish context, next step research may concern the analysis of science textbooks being rewritten for the reform-based curricula to make cross-comparisons and evaluate a possible progression.

  16. Using Computation Curriculum-Based Measurement Probes for Error Pattern Analysis

    ERIC Educational Resources Information Center

    Dennis, Minyi Shih; Calhoon, Mary Beth; Olson, Christopher L.; Williams, Cara

    2014-01-01

    This article describes how "curriculum-based measurement--computation" (CBM-C) mathematics probes can be used in combination with "error pattern analysis" (EPA) to pinpoint difficulties in basic computation skills for students who struggle with learning mathematics. Both assessment procedures provide ongoing assessment data…

  17. An Integrated Analysis-Test Approach

    NASA Technical Reports Server (NTRS)

    Kaufman, Daniel

    2003-01-01

    This viewgraph presentation provides an overview of a project to develop a computer program which integrates data analysis and test procedures. The software application aims to propose a new perspective on traditional mechanical analysis and test procedures and to integrate pre-test and test analysis calculation methods. The program should also be usable on portable devices and allow 'quasi-real-time' analysis of data sent by electronic means. Test methods reviewed during this presentation include: shaker swept sine and random tests, shaker shock mode tests, shaker base driven model survey tests and acoustic tests.

  18. CFD Analysis of Turbo Expander for Cryogenic Refrigeration and Liquefaction Cycles

    NASA Astrophysics Data System (ADS)

    Verma, Rahul; Sam, Ashish Alex; Ghosh, Parthasarathi

    Computational Fluid Dynamics (CFD) analysis has emerged as a necessary tool for the design of turbomachinery. It helps in understanding the various sources of inefficiency through investigation of the flow physics of the turbine. In this paper, a 3D turbulent flow analysis of a cryogenic turboexpander for small-scale air separation was performed using Ansys CFX®. The turboexpander was designed following assumptions based on the mean-line blade generation procedure provided in the open literature and on good engineering judgement. Through analysis of the flow field, modifications and further analyses required to evolve a more robust design procedure have been suggested.

  19. Recent developments in nickel electrode analysis

    NASA Technical Reports Server (NTRS)

    Whiteley, Richard V.; Daman, M. E.; Kaiser, E. Q.

    1991-01-01

    Three aspects of nickel electrode analysis for Nickel-Hydrogen and Nickel-Cadmium battery cell applications are addressed: (1) the determination of active material; (2) charged state nickel (as NiOOH + CoOOH); and (3) potassium ion content in the electrode. Four deloading procedures are compared for completeness of active material removal, and deloading conditions for efficient active material analyses are established. Two methods for charged state nickel analysis are compared: the current NASA procedure and a new procedure based on the oxidation of sodium oxalate by the charged material. Finally, a method for determining potassium content in an electrode sample by flame photometry is presented along with analytical results illustrating differences in potassium levels from vendor to vendor and the effects of stress testing on potassium content in the electrode. The relevance of these analytical procedures to electrode performance is reviewed.

  20. Influence of Freezing and Storage Procedure on Human Urine Samples in NMR-Based Metabolomics

    PubMed Central

    Rist, Manuela J.; Muhle-Goll, Claudia; Görling, Benjamin; Bub, Achim; Heissler, Stefan; Watzl, Bernhard; Luy, Burkhard

    2013-01-01

    It is consensus in the metabolomics community that standardized protocols should be followed for sample handling, storage and analysis, as it is of utmost importance to maintain constant measurement conditions to identify subtle biological differences. The aim of this work, therefore, was to systematically investigate the influence of freezing procedures and storage temperatures and their effect on NMR spectra as a potentially disturbing aspect for NMR-based metabolomics studies. Urine samples were collected from two healthy volunteers, centrifuged and divided into aliquots. Urine aliquots were frozen either at −20 °C, on dry ice, at −80 °C or in liquid nitrogen and then stored at −20 °C, −80 °C or in liquid nitrogen vapor phase for 1–5 weeks before NMR analysis. Results show spectral changes depending on the freezing procedure, with samples frozen on dry ice showing the largest deviations. The effect was found to be based on pH differences, which were caused by variations in CO2 concentrations introduced by the freezing procedure. Thus, we recommend that urine samples should be frozen at −20 °C and transferred to lower storage temperatures within one week and that freezing procedures should be part of the publication protocol. PMID:24957990

  1. Influence of Freezing and Storage Procedure on Human Urine Samples in NMR-Based Metabolomics.

    PubMed

    Rist, Manuela J; Muhle-Goll, Claudia; Görling, Benjamin; Bub, Achim; Heissler, Stefan; Watzl, Bernhard; Luy, Burkhard

    2013-04-09

    It is consensus in the metabolomics community that standardized protocols should be followed for sample handling, storage and analysis, as it is of utmost importance to maintain constant measurement conditions to identify subtle biological differences. The aim of this work, therefore, was to systematically investigate the influence of freezing procedures and storage temperatures and their effect on NMR spectra as a potentially disturbing aspect for NMR-based metabolomics studies. Urine samples were collected from two healthy volunteers, centrifuged and divided into aliquots. Urine aliquots were frozen either at -20 °C, on dry ice, at -80 °C or in liquid nitrogen and then stored at -20 °C, -80 °C or in liquid nitrogen vapor phase for 1-5 weeks before NMR analysis. Results show spectral changes depending on the freezing procedure, with samples frozen on dry ice showing the largest deviations. The effect was found to be based on pH differences, which were caused by variations in CO2 concentrations introduced by the freezing procedure. Thus, we recommend that urine samples should be frozen at -20 °C and transferred to lower storage temperatures within one week and that freezing procedures should be part of the publication protocol.

  2. [Short-term and long-term fetal heart rate variability after amnioinfusion treatment of oligohydramnios complicated pregnancy].

    PubMed

    Machalski, T; Sikora, J; Bakon, I; Magnucki, J; Grzesiak-Kubica, E; Szkodny, E

    2001-12-01

    Results of computerised analysis of cardiotocograms obtained in a group of 21 pregnancies complicated by idiopathic oligohydramnios are presented in the study. Amnioinfusion procedures were performed serially under local anesthesia with ultrasound and colour Doppler guidance, based on Phelan's oligohydramnios criteria. The analysis used the KOMPOR software created by ITAM Zabrze, running on a PC connected to a Hewlett-Packard Series 50A cardiotocograph. A significant increase in short-term variability was found immediately after the amnioinfusion procedure, from 5.55 ms to 8.24 ms, with 7.25 ms after 24 hours, while no significant changes in long-term variability values were observed.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Townsend, D.W.; Linnhoff, B.

    In Part I, criteria for heat engine and heat pump placement in chemical process networks were derived, based on the "temperature interval" (T.I.) analysis of the heat exchanger network problem. Using these criteria, this paper gives a method for identifying the best outline design for any combined system of chemical process, heat engines, and heat pumps. The method eliminates inferior alternatives early, and positively leads on to the most appropriate solution. A graphical procedure based on the T.I. analysis forms the heart of the approach, and the calculations involved are simple enough to be carried out on, say, a programmable calculator. Application to a case study is demonstrated. Optimization methods based on this procedure are currently under research.

  4. Resource analysis applications in Michigan. [NASA remote sensing

    NASA Technical Reports Server (NTRS)

    Schar, S. W.; Enslin, W. R.; Sattinger, I. J.; Robinson, J. G.; Hosford, K. R.; Fellows, R. S.; Raad, J. H.

    1974-01-01

    During the past two years, available NASA imagery has been applied to a broad spectrum of problems of concern to Michigan-based agencies. These demonstrations include the testing of remote sensing for the purposes of (1) highway corridor planning and impact assessments, (2) game management-area information bases, (3) multi-agency river basin planning, (4) timber resource management information systems, (5) agricultural land reservation policies, and (6) shoreline flooding damage assessment. In addition, cost accounting procedures have been developed for evaluating the relative costs of utilizing remote sensing in land cover and land use analysis data collection procedures.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zdarek, J.; Pecinka, L.

    Leak-before-break (LBB) analysis of WWER type reactors in the Czech and Slovak Republics is summarized in this paper. Legislative bases, required procedures, and validation and verification of procedures are discussed. A list of significant issues identified during the application of LBB analysis is presented. The results of statistical evaluation of crack length characteristics are presented and compared for the WWER 440 Type 230 and 213 reactors and for the WWER 1000 Type 302, 320 and 338 reactors.

  6. TU-FG-201-12: Designing a Risk-Based Quality Assurance Program for a Newly Implemented Y-90 Microspheres Procedure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vile, D; Zhang, L; Cuttino, L

    2016-06-15

    Purpose: To create a quality assurance program based upon a risk-based assessment of a newly implemented SirSpheres Y-90 procedure. Methods: A process map was created for a newly implemented SirSpheres procedure at a community hospital. The process map documented each step of this collaborative procedure, as well as the roles and responsibilities of each member. From the process map, different potential failure modes were determined as well as any current controls in place. From this list, a full failure mode and effects analysis (FMEA) was performed by grading each failure mode’s likelihood of occurrence, likelihood of detection, and potential severity. These numbers were then multiplied to compute the risk priority number (RPN) for each potential failure mode. Failure modes were then ranked based on their RPN. Additional controls were then added, with failure modes corresponding to the highest RPNs taking priority. Results: A process map was created that succinctly outlined each step in the SirSpheres procedure in its current implementation. From this, 72 potential failure modes were identified and ranked according to their associated RPN. Quality assurance controls and safety barriers were then added, with failure modes associated with the highest risk being addressed first. Conclusion: A quality assurance program was created from a risk-based assessment of the SirSpheres process. Process mapping and FMEA were effective in identifying potential high-risk failure modes for this new procedure, which were prioritized for new quality assurance controls. TG 100 recommends the fault tree analysis methodology to design a comprehensive and effective QC/QM program, yet we found that simply introducing additional safety barriers to address high-RPN failure modes makes the whole process simpler and safer.
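
    The RPN arithmetic at the heart of the FMEA grading can be shown in a short, hedged sketch. The failure modes and the 1-10 scores below are invented examples, not the 72 modes identified in the study: each mode's occurrence, detectability and severity scores are multiplied and the modes are ranked by the product.

      # Minimal FMEA risk-priority-number ranking (hypothetical failure modes and scores).
      failure_modes = [
          # (description, occurrence, detectability, severity) on assumed 1-10 scales
          ("Wrong activity drawn into delivery syringe", 3, 5, 9),
          ("Catheter position not re-verified before infusion", 2, 6, 10),
          ("Dose calculation uses outdated shunt fraction", 4, 4, 8),
      ]

      ranked = sorted(
          ((desc, o * d * s) for desc, o, d, s in failure_modes),
          key=lambda item: item[1],
          reverse=True,
      )

      for desc, rpn in ranked:
          print(f"RPN {rpn:4d}  {desc}")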

  7. Testing homogeneity of proportion ratios for stratified correlated bilateral data in two-arm randomized clinical trials.

    PubMed

    Pei, Yanbo; Tian, Guo-Liang; Tang, Man-Lai

    2014-11-10

    Stratified data analysis is an important research topic in many biomedical studies and clinical trials. In this article, we develop five test statistics for testing the homogeneity of proportion ratios for stratified correlated bilateral binary data based on an equal correlation model assumption. Bootstrap procedures based on these test statistics are also considered. To evaluate the performance of these statistics and procedures, we conduct Monte Carlo simulations to study their empirical sizes and powers under various scenarios. Our results suggest that the procedure based on score statistic performs well generally and is highly recommended. When the sample size is large, procedures based on the commonly used weighted least square estimate and logarithmic transformation with Mantel-Haenszel estimate are recommended as they do not involve any computation of maximum likelihood estimates requiring iterative algorithms. We also derive approximate sample size formulas based on the recommended test procedures. Finally, we apply the proposed methods to analyze a multi-center randomized clinical trial for scleroderma patients. Copyright © 2014 John Wiley & Sons, Ltd.

  8. Collecting Wipe Samples for VX Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Koester, C; Hoppes, W G

    2010-02-11

    This standard operating procedure (SOP) provides uniform procedures for the collection of wipe samples of VX residues from surfaces. Personnel may use this procedure to collect and handle wipe samples in the field. Various surfaces, including building materials (wood, metal, tile, vinyl, etc.) and equipment, may be sampled based on this procedure. The purpose of such sampling is to determine whether or not the relevant surfaces are contaminated, to determine the extent of their contamination, to evaluate the effectiveness of decontamination procedures, and to determine the amount of contaminant that might be present as a contact hazard.

  9. A Comparison of Measurement Equivalence Methods Based on Confirmatory Factor Analysis and Item Response Theory.

    ERIC Educational Resources Information Center

    Flowers, Claudia P.; Raju, Nambury S.; Oshima, T. C.

    Current interest in the assessment of measurement equivalence emphasizes two methods of analysis: linear and nonlinear procedures. This study simulated data using the graded response model to examine the performance of linear (confirmatory factor analysis or CFA) and nonlinear (item-response-theory-based differential item functioning or IRT-Based…

  10. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... PM2.5 violations”) must be based on quantitative analysis using the applicable air quality models... either: (i) Quantitative methods that represent reasonable and common professional practice; or (ii) A...) The hot-spot demonstration required by § 93.116 must be based on quantitative analysis methods for the...

  11. Performance-based, cost- and time-effective pcb analytical methodology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alvarado, J. S.

    1998-06-11

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  12. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    PubMed

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
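
    A hedged, minimal sketch of the voxelization idea described above follows. The synthetic point cloud, the uniform voxel size and the occupancy threshold are assumptions; the published semi-automatic slicing and solver export steps are not reproduced: points are binned into a regular 3-D grid, and occupied voxels become candidate solid elements.

      import numpy as np

      rng = np.random.default_rng(4)

      # Synthetic stand-in for a laser-scanner point cloud (metres).
      points = rng.uniform(low=[0, 0, 0], high=[10.0, 6.0, 8.0], size=(50_000, 3))

      voxel = 0.25                                   # assumed voxel edge length (m)
      origin = points.min(axis=0)
      idx = np.floor((points - origin) / voxel).astype(int)

      # Occupancy grid: a voxel becomes a solid element if it contains at least `min_pts` points.
      shape = idx.max(axis=0) + 1
      counts = np.zeros(shape, dtype=int)
      np.add.at(counts, tuple(idx.T), 1)
      min_pts = 3
      solid = counts >= min_pts

      print("grid shape:", solid.shape)
      print("solid voxels:", int(solid.sum()), "of", solid.size)

      # Each solid voxel could then be exported as an eight-node brick element whose
      # corner coordinates are origin + voxel * (i, j, k) offsets.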

  13. The J3 SCR model applied to resonant converter simulation

    NASA Technical Reports Server (NTRS)

    Avant, R. L.; Lee, F. C. Y.

    1985-01-01

    The J3 SCR model is a continuous topology computer model for the SCR. Its circuit analog and parameter estimation procedure are uniformly applicable to popular computer-aided design and analysis programs such as SPICE2 and SCEPTRE. The circuit analog is based on the intrinsic three pn junction structure of the SCR. The parameter estimation procedure requires only manufacturer's specification sheet quantities as a data base.

  14. Comparison of Traditional and Trial-Based Methodologies for Conducting Functional Analyses

    ERIC Educational Resources Information Center

    LaRue, Robert H.; Lenard, Karen; Weiss, Mary Jane; Bamond, Meredith; Palmieri, Mark; Kelley, Michael E.

    2010-01-01

    Functional analysis represents a sophisticated and empirically supported functional assessment procedure. While these procedures have garnered considerable empirical support, they are often underused in clinical practice. Safety risks resulting from the evocation of maladaptive behavior and the length of time required to conduct functional…

  15. A Geometry Based Infra-structure for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    1997-01-01

    The computational steps traditionally taken for most engineering analysis (CFD, structural analysis, etc.) are: Surface Generation - usually by employing a CAD system; Grid Generation - preparing the volume for the simulation; Flow Solver - producing the results at the specified operational point; and Post-processing Visualization - interactively attempting to understand the results. For structural analysis, integrated systems can be obtained from a number of commercial vendors. For CFD, these steps have worked well in the past for simple steady-state simulations, at the expense of much user interaction. The data was transmitted between phases via files. Specifically, the problems with this procedure are: (1) File based. Information flows from one step to the next via data files with formats specified for that procedure. (2) 'Good' geometry. A bottleneck in getting results from a solver is the construction of proper geometry to be fed to the grid generator. With 'good' geometry a grid can be constructed in tens of minutes (even with a complex configuration) using unstructured techniques. (3) One-way communication. All information travels on from one phase to the next. Until this process can be automated, more complex problems such as multidisciplinary analysis, or using the above procedure for design, remain prohibitive.

  16. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis computer program user's manual

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The computer program user's manual for the ADPACAPES (Advanced Ducted Propfan Analysis Code-Average Passage Engine Simulation) program is included. The objective of the computer program is development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes meeting the requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. The efficiency of the solution procedure was shown to be the same as the original analysis.
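
    The multistage time-marching update mentioned in the abstract can be illustrated on a toy problem. The sketch below applies a four-stage Runge-Kutta scheme of the kind commonly used to drive a semi-discrete finite volume residual toward steady state; the coefficients and the scalar residual are illustrative assumptions and do not come from the ADPACAPES code.

```python
import numpy as np

# Standard four-stage coefficients often used in Jameson-style schemes
ALPHAS = (0.25, 1.0 / 3.0, 0.5, 1.0)

def residual(u):
    """Toy semi-discrete residual: linear relaxation toward a target state."""
    return -(u - 1.0)

def rk4_march(u0, dt, n_steps):
    u = np.array(u0, dtype=float)
    for _ in range(n_steps):
        u_stage = u.copy()
        for alpha in ALPHAS:
            # Multistage update: u^(k) = u^n + alpha_k * dt * R(u^(k-1))
            u_stage = u + alpha * dt * residual(u_stage)
        u = u_stage
    return u

u = rk4_march(u0=[0.0, 2.0, 5.0], dt=0.5, n_steps=40)
print(u)  # should approach the steady state [1, 1, 1]
```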

  17. A procedure to estimate proximate analysis of mixed organic wastes.

    PubMed

    Zaher, U; Buffiere, P; Steyer, J P; Chen, S

    2009-04-01

    In waste materials, proximate analysis measuring the total concentration of carbohydrate, protein, and lipid contents from solid wastes is challenging, as a result of the heterogeneous and solid nature of wastes. This paper presents a new procedure that was developed to estimate such complex chemical composition of the waste using conventional practical measurements, such as chemical oxygen demand (COD) and total organic carbon. The procedure is based on mass balance of macronutrient elements (carbon, hydrogen, nitrogen, oxygen, and phosphorus [CHNOP]) (i.e., elemental continuity), in addition to the balance of COD and charge intensity that are applied in mathematical modeling of biological processes. Knowing the composition of such a complex substrate is crucial to study solid waste anaerobic degradation. The procedure was formulated to generate the detailed input required for the International Water Association (London, United Kingdom) Anaerobic Digestion Model number 1 (IWA-ADM1). The complex particulate composition estimated by the procedure was validated with several types of food wastes and animal manures. To make proximate analysis feasible for validation, the wastes were classified into 19 types to allow accurate extraction and proximate analysis. The estimated carbohydrates, proteins, lipids, and inerts concentrations were highly correlated to the proximate analysis; correlation coefficients were 0.94, 0.88, 0.99, and 0.96, respectively. For most of the wastes, carbohydrate was the highest fraction and was estimated accurately by the procedure over an extended range with high linearity. For wastes that are rich in protein and fiber, the procedure was even more consistent compared with the proximate analysis. The new procedure can be used for waste characterization in solid waste treatment design and optimization.
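
    The heart of such a procedure is a small mass balance that maps bulk measurements to macronutrient fractions. The following is a minimal sketch of that idea with illustrative, textbook-style coefficients; it is not the published estimation procedure or its calibration.

```python
import numpy as np

# Approximate elemental / COD coefficients per gram of macronutrient
# (illustrative values, not the paper's calibration)
#                carbs   protein  lipid
A = np.array([
    [0.44,  0.53,  0.76],   # g organic carbon per g
    [0.00,  0.16,  0.00],   # g nitrogen per g
    [1.07,  1.42,  2.87],   # g COD per g
])

# Hypothetical measurements for 1 kg of mixed waste (dry basis)
b = np.array([
    320.0,   # total organic carbon, g
    20.0,    # total nitrogen, g
    1050.0,  # total COD, g O2
])

# Solve the elemental/COD balance for the macronutrient masses
carb, prot, lip = np.linalg.solve(A, b)
print(f"carbohydrate ~ {carb:.0f} g, protein ~ {prot:.0f} g, lipid ~ {lip:.0f} g")
```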

  18. Two Paradoxes in Linear Regression Analysis.

    PubMed

    Feng, Ge; Peng, Jing; Tu, Dongke; Zheng, Julia Z; Feng, Changyong

    2016-12-25

    Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection.

  19. A General Procedure to Assess the Internal Structure of a Noncognitive Measure--The Student360 Insight Program (S360) Time Management Scale. Research Report. ETS RR-11-42

    ERIC Educational Resources Information Center

    Ling, Guangming; Rijmen, Frank

    2011-01-01

    The factorial structure of the Time Management (TM) scale of the Student 360: Insight Program (S360) was evaluated based on a national sample. A general procedure with a variety of methods was introduced and implemented, including the computation of descriptive statistics, exploratory factor analysis (EFA), and confirmatory factor analysis (CFA).…

  20. Procedure-related risk of miscarriage following amniocentesis and chorionic villus sampling: a systematic review and meta-analysis.

    PubMed

    Akolekar, R; Beta, J; Picciarelli, G; Ogilvie, C; D'Antonio, F

    2015-01-01

    To estimate procedure-related risks of miscarriage following amniocentesis and chorionic villus sampling (CVS) based on a systematic review of the literature and a meta-analysis. A search of MEDLINE, EMBASE, CINAHL and The Cochrane Library (2000-2014) was performed to review relevant citations reporting procedure-related complications of amniocentesis and CVS. Only studies reporting data on more than 1000 procedures were included in this review to minimize the effect of bias from smaller studies. Heterogeneity between studies was estimated using Cochran's Q, the I² statistic and Egger bias. Meta-analysis of proportions was used to derive weighted pooled estimates for the risk of miscarriage before 24 weeks' gestation. Incidence-rate difference meta-analysis was used to estimate pooled procedure-related risks. The weighted pooled risks of miscarriage following invasive procedures were estimated from analysis of controlled studies including 324 losses in 42 716 women who underwent amniocentesis and 207 losses in 8899 women who underwent CVS. The risk of miscarriage prior to 24 weeks in women who underwent amniocentesis and CVS was 0.81% (95% CI, 0.58-1.08%) and 2.18% (95% CI, 1.61-2.82%), respectively. The background rates of miscarriage in women from the control group that did not undergo any procedures were 0.67% (95% CI, 0.46-0.91%) for amniocentesis and 1.79% (95% CI, 0.61-3.58%) for CVS. The weighted pooled procedure-related risks of miscarriage for amniocentesis and CVS were 0.11% (95% CI, -0.04 to 0.26%) and 0.22% (95% CI, -0.71 to 1.16%), respectively. The procedure-related risks of miscarriage following amniocentesis and CVS are much lower than currently quoted. Copyright © 2014 ISUOG. Published by John Wiley & Sons Ltd.
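
    For readers unfamiliar with pooled-proportion meta-analysis, the following is a minimal fixed-effect, inverse-variance sketch with hypothetical per-study counts; the published analysis used weighted pooling with heterogeneity statistics and incidence-rate-difference meta-analysis, which this simplified example does not reproduce.

```python
import numpy as np

def pooled_proportion(events, totals, z=1.96):
    """Fixed-effect, inverse-variance pooled proportion with a 95% CI."""
    events = np.asarray(events, dtype=float)
    totals = np.asarray(totals, dtype=float)
    p = events / totals
    var = p * (1 - p) / totals          # per-study variance of the proportion
    w = 1.0 / var                        # inverse-variance weights
    p_pooled = np.sum(w * p) / np.sum(w)
    se = np.sqrt(1.0 / np.sum(w))
    return p_pooled, (p_pooled - z * se, p_pooled + z * se)

# Hypothetical per-study counts of miscarriage after a procedure
events = [12, 30, 8, 41]
totals = [1500, 4200, 1100, 5300]
p, (lo, hi) = pooled_proportion(events, totals)
print(f"pooled risk = {100 * p:.2f}%  (95% CI {100 * lo:.2f}-{100 * hi:.2f}%)")
```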

  1. Preprocessing and Analysis of LC-MS-Based Proteomic Data

    PubMed Central

    Tsai, Tsung-Heng; Wang, Minkun; Ressom, Habtom W.

    2016-01-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) has been widely used for profiling protein expression levels. This chapter is focused on LC-MS data preprocessing, which is a crucial step in the analysis of LC-MS based proteomics. We provide a high-level overview, highlight associated challenges, and present a step-by-step example for analysis of data from LC-MS based untargeted proteomic study. Furthermore, key procedures and relevant issues with the subsequent analysis by multiple reaction monitoring (MRM) are discussed. PMID:26519169

  2. Safety and Performance Analysis of the Non-Radar Oceanic/Remote Airspace In-Trail Procedure

    NASA Technical Reports Server (NTRS)

    Carreno, Victor A.; Munoz, Cesar A.

    2007-01-01

    This document presents a safety and performance analysis of the nominal case for the In-Trail Procedure (ITP) in a non-radar oceanic/remote airspace. The analysis estimates the risk of collision between the aircraft performing the ITP and a reference aircraft. The risk of collision is only estimated for the ITP maneuver and it is based on nominal operating conditions. The analysis does not consider human error, communication error conditions, or the normal risk of flight present in current operations. The hazards associated with human error and communication errors are evaluated in an Operational Hazards Analysis presented elsewhere.

  3. Implementation of electronic logbook for trainees of general surgery in Thailand.

    PubMed

    Aphinives, Potchavit

    2013-01-01

    All trainees are required to keep a record of their surgical skills and experiences throughout the training period in a logbook format. A paper-based logbook has several limitations; therefore, an electronic logbook was introduced to replace it. An electronic logbook program was developed in November 2005. This program was designed as a web-based application based upon PHP scripts beneath an Apache web server and a MySQL database implementation. Only simplified and essential data, such as hospital number, diagnosis, surgical procedure, and pathological findings, are recorded. The electronic logbook databases between academic years 2006 and 2011 were analyzed. The annual recorded surgical procedures gradually increased from 41,214 procedures in 2006 to 66,643 procedures in 2011. Around one-third of all records were not verified by attending staff, i.e. 27.59% (2006), 31.69% (2007), 18.06% (2008), 28.42% (2009), 30.18% (2010), and 31.41% (2011). In the academic year 2011, the three most common procedural groups were the colon, rectum and anus group, the appendix group, and the vascular group, respectively. Advantages of the electronic logbook included more efficient data access, an increased ability to monitor trainees and trainers, and analysis of procedural varieties among the training institutes.

  4. Modeling Training Site Vegetation Coverage Probability with a Random Optimizing Procedure: An Artificial Neural Network Approach.

    DTIC Science & Technology

    1998-05-01

    Modeling Training Site Vegetation Coverage Probability with a Random Optimizing Procedure: An Artificial Neural Network Approach, by Biing T. Guan, George Z. Gertner, and Alan B... The report models training site vegetation coverage based on past coverage; a literature survey was conducted to identify artificial neural network analysis techniques applicable for…

  5. User's manual for the Shuttle Electric Power System analysis computer program (SEPS), volume 2 of program documentation

    NASA Technical Reports Server (NTRS)

    Bains, R. W.; Herwig, H. A.; Luedeman, J. K.; Torina, E. M.

    1974-01-01

    The Shuttle Electric Power System Analysis (SEPS) computer program is described; it performs detailed load analysis, including prediction of the energy demands and consumables requirements of the shuttle electric power system, along with parametric and special case studies on the shuttle electric power system. The functional flow diagram of the SEPS program is presented along with data base requirements and formats, procedure and activity definitions, and mission timeline input formats. Distribution circuit input and fixed data requirements are included. Run procedures and deck setups are described.

  6. Effect of an Automated Training Presentation on Pre-Service Behavior Analysts' Implementation of Trial-Based Functional Analysis

    ERIC Educational Resources Information Center

    Lambert, Joseph M.; Lloyd, Blair P.; Staubitz, Johanna L.; Weaver, Emily S.; Jennings, Chelsea M.

    2014-01-01

    The trial-based functional analysis (FA) is a useful alternative to the traditional FA in contexts in which it is challenging to establish environmental control for extended periods of time. Previous researchers have demonstrated that others can be trained to conduct trial-based FAs with high procedural fidelity by providing a didactic…

  7. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  8. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  9. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  10. 40 CFR 63.1104 - Process vents from continuous unit operations: applicability assessment procedures and methods.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... permit limit applicable to the process vent. (iv) Design analysis based on accepted chemical engineering... rates, halogenated process vent determinations, process vent TRE index values, and engineering... corrected to 2.3 percent moisture; or (2) The engineering assessment procedures in paragraph (k) of this...

  11. Validated Test Method 1315: Mass Transfer Rates of Constituents in Monolithic or Compacted Granular Materials Using a Semi-Dynamic Tank Leaching Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  12. Inferential Procedures for Correlation Coefficients Corrected for Attenuation.

    ERIC Educational Resources Information Center

    Hakstian, A. Ralph; And Others

    1988-01-01

    A model and computation procedure based on classical test score theory are presented for determination of a correlation coefficient corrected for attenuation due to unreliability. Delta and Monte Carlo method applications are discussed. A power analysis revealed no serious loss in efficiency resulting from correction for attenuation. (TJH)
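
    The classical correction for attenuation divides the observed correlation by the square root of the product of the two reliabilities. A minimal sketch with hypothetical values follows; it illustrates only the point estimate, not the inferential procedures discussed in the report.

```python
import math

def corrected_correlation(r_xy, r_xx, r_yy):
    """Classical correction for attenuation: the correlation the two
    measures would show if both were perfectly reliable."""
    return r_xy / math.sqrt(r_xx * r_yy)

# Hypothetical observed correlation and reliabilities
r_corrected = corrected_correlation(r_xy=0.42, r_xx=0.80, r_yy=0.70)
print(round(r_corrected, 3))  # 0.42 / sqrt(0.56) is about 0.561
```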

  13. Validated Test Method 1316: Liquid-Solid Partitioning as a Function of Liquid-to-Solid Ratio in Solid Materials Using a Parallel Batch Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  14. Element-by-element Solution Procedures for Nonlinear Structural Analysis

    NASA Technical Reports Server (NTRS)

    Hughes, T. J. R.; Winget, J. M.; Levit, I.

    1984-01-01

    Element-by-element approximate factorization procedures are proposed for solving the large finite element equation systems which arise in nonlinear structural mechanics. Architectural and data base advantages of the present algorithms over traditional direct elimination schemes are noted. Results of calculations suggest considerable potential for the methods described.

  15. Capsule Endoscopy in the Assessment of Obscure Gastrointestinal Bleeding: An Economic Analysis

    PubMed Central

    Palimaka, S; Blackhouse, Gord; Goeree, Ron

    2015-01-01

    Background Small-bowel capsule endoscopy is a tool used to visualize the small bowel to identify the location of bleeds in obscure gastrointestinal bleeding (OGIB). Capsule endoscopy is currently funded in Ontario in cases where there has been a failure to identify a source of bleeding via conventional diagnostic procedures. In Ontario, capsule endoscopy is a diagnostic option for patients whose findings on esophagogastroduodenoscopy, colonoscopy, and push enteroscopy have been negative (i.e., the source of bleeding was not found). Objectives This economic analysis aims to estimate the budget impact of different rates of capsule endoscopy use as a complement to push enteroscopy procedures in patients aged 18 years and older. Data Sources Population-based administrative databases for Ontario were used to identify patients receiving push enteroscopy and small-bowel capsule endoscopy in the fiscal years 2008 to 2012. Review Methods A systematic literature search was performed to identify economic evaluations of capsule endoscopy for the investigation of OGIB. Studies were assessed for their methodological quality and their applicability to the Ontarian setting. An original budget impact analysis was performed using data from Ontarian administrative sources and published literature. The budget impact was estimated for different levels of use of capsule endoscopy as a complement to push enteroscopy due to the uncertain clinical utility of the capsule based on current clinical evidence. The analysis was conducted from the provincial public payer perspective. Results With varying rates of capsule endoscopy use, the budgetary impact spans from savings of $510,000, when no (0%) push enteroscopy procedures are complemented with capsule endoscopy, to $2,036,000, when all (100%) push enteroscopy procedures are complemented with capsule endoscopy. A scenario where 50% of push enteroscopy procedures are complemented with capsule endoscopy (expected use based on expert opinion) would result in additional expenditure of about $763,000. Limitations In the literature on OGIB, estimates of rebleeding rates after endoscopic procedures or spontaneous cessation rates are unreliable, with a lack of data. Rough estimates from expert consultation can provide an indication of expected additional use of capsule endoscopy; however, a wide range of capsule uses was explored. Conclusions The budgetary impact in the first year in Ontario of capsule endoscopy use to complement push enteroscopy procedures ranges from $510,000 in savings to an additional expenditure of $2,036,000 (at 0% and 100% push enteroscopy procedures complemented, respectively). The expected scenario of 50% of push enteroscopy procedures likely to benefit from the use of capsule endoscopy, based on expert opinion, would result in additional expenditures of $763,000 in the first year. PMID:26355732
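
    The reported scenario figures are consistent with a simple linear interpolation between the 0% and 100% uptake cases. The sketch below reproduces that arithmetic; treating the budget impact as linear in uptake is an assumption made here for illustration, since the underlying model has more structure.

```python
def budget_impact(uptake, impact_at_0=-510_000, impact_at_100=2_036_000):
    """Budget impact at a given capsule endoscopy uptake rate, by linear
    interpolation between the reported 0% and 100% scenarios.
    Negative values are savings. Illustrative sketch only."""
    return impact_at_0 + uptake * (impact_at_100 - impact_at_0)

for uptake in (0.0, 0.5, 1.0):
    print(f"uptake {uptake:.0%}: budget impact ${budget_impact(uptake):,.0f}")
# 0% gives -510,000 (a saving), 50% gives 763,000, 100% gives 2,036,000
```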

  16. Sensory description of sweet wines obtained by the winemaking procedures of raisining, botrytisation and fortification.

    PubMed

    González-Álvarez, Mariana; Noguerol-Pato, Raquel; González-Barreiro, Carmen; Cancho-Grande, Beatriz; Simal-Gándara, Jesús

    2014-02-15

    The effect of winemaking procedures on the sensory modification of sweet wines was investigated. Garnacha Tintorera-based sweet wines were obtained by two different processes: by using raisins for vinification to obtain a naturally sweet wine and by using freshly harvested grapes with the stoppage of the fermentation by the addition of alcohol. Eight international sweet wines were also subjected to sensory analysis for comparative description purposes. Wines were described with a sensory profile by 12 trained panellists on 70 sensory attributes by employing the frequency of citation method. Analysis of variance of the descriptive data confirmed the existence of subtle sensory differences among Garnacha Tintorera-based sweet wines depending on the procedure used for their production. Cluster analysis emphasised discriminated attributes between the Garnacha Tintorera-based and the commercial groups of sweet wines for both those obtained by raisining and by fortification. Several kinds of discriminant functions were used to separate groups of sweet wines--obtained by botrytisation, raisining and fortification--to show the key descriptors that contribute to their separation and define the sensory perception of each type of wine. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Soil Conservation Service Curve Number method: How to mend a wrong soil moisture accounting procedure?

    NASA Astrophysics Data System (ADS)

    Michel, Claude; Andréassian, Vazken; Perrin, Charles

    2005-02-01

    This paper unveils major inconsistencies in the age-old and yet efficient Soil Conservation Service Curve Number (SCS-CN) procedure. Our findings are based on an analysis of the continuous soil moisture accounting procedure implied by the SCS-CN equation. It is shown that several flaws plague the original SCS-CN procedure, the most important one being a confusion between intrinsic parameter and initial condition. A change of parameterization and a more complete assessment of the initial condition lead to a renewed SCS-CN procedure, while keeping the acknowledged efficiency of the original method.
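
    For context, the standard SCS-CN runoff relation that the paper re-examines can be written in a few lines. The values below are illustrative, and the renewed parameterization proposed by the authors is not reproduced here.

```python
def scs_cn_runoff(p_mm, curve_number, ia_ratio=0.2):
    """Standard SCS-CN direct runoff (mm) for a storm depth p_mm.

    S  = 25400 / CN - 254            (potential retention, mm)
    Ia = ia_ratio * S                (conventional initial abstraction)
    Q  = (P - Ia)^2 / (P - Ia + S)   for P > Ia, else 0
    """
    s = 25400.0 / curve_number - 254.0
    ia = ia_ratio * s
    if p_mm <= ia:
        return 0.0
    return (p_mm - ia) ** 2 / (p_mm - ia + s)

# Illustrative values only
print(round(scs_cn_runoff(p_mm=60.0, curve_number=75), 1), "mm of runoff")
```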

  18. Laparoscopic sentinel node procedure using a combination of patent blue and radiocolloid in women with endometrial cancer.

    PubMed

    Barranger, Emmanuel; Cortez, Annie; Grahek, Dany; Callard, Patrice; Uzan, Serge; Darai, Emile

    2004-03-01

    We assessed the feasibility of a laparoscopic sentinel node (SN) procedure based on the combined use of radiocolloid and patent blue labeling in patients with endometrial cancer. Seventeen patients (median age, 69 years) with endometrial cancer of stage I (16 patients) or stage II (1 patient) underwent a laparoscopic SN procedure based on combined radiocolloid and patent blue injected pericervically. After the SN procedure, all patients underwent complete laparoscopic pelvic lymphadenectomy and either laparoscopically assisted vaginal hysterectomy (16 patients) or laparoscopic radical hysterectomy (1 patient). SNs (mean number per patient, 2.6; range, 1-4) were identified in 16 (94.1%) of the 17 patients. Macrometastases were detected in three SNs from two patients by hematoxylin and eosin staining. In three other patients, immunohistochemical analysis identified six micrometastatic SNs and one SN containing isolated tumor cells. No false-negative SN results were observed. An SN procedure based on a combination of radiocolloid and patent blue is feasible in patients with early endometrial cancer. Combined use of laparoscopy and this SN procedure permits minimally invasive management of endometrial cancer.

  19. Advanced composites structural concepts and materials technologies for primary aircraft structures: Structural response and failure analysis

    NASA Technical Reports Server (NTRS)

    Dorris, William J.; Hairr, John W.; Huang, Jui-Tien; Ingram, J. Edward; Shah, Bharat M.

    1992-01-01

    Non-linear analysis methods were adapted and incorporated in a finite element based DIAL code. These methods are necessary to evaluate the global response of a stiffened structure under combined in-plane and out-of-plane loading. These methods include the Arc Length method and target point analysis procedure. A new interface material model was implemented that can model elastic-plastic behavior of the bond adhesive. Direct application of this method is in skin/stiffener interface failure assessment. Addition of the AML (angle minus longitudinal or load) failure procedure and Hashin's failure criteria provides added capability in the failure predictions. Interactive Stiffened Panel Analysis modules were developed as interactive pre- and post-processors. Each module provides the means of performing self-initiated finite element based analysis of primary structures such as a flat or curved stiffened panel; a corrugated flat sandwich panel; and a curved geodesic fuselage panel. This module brings finite element analysis into the design of composite structures without the requirement for the user to know much about the techniques and procedures needed to actually perform a finite element analysis from scratch. An interactive finite element code was developed to predict bolted joint strength considering material and geometrical non-linearity. The developed method conducts an ultimate strength failure analysis using a set of material degradation models.

  20. Trust, confidence, procedural fairness, outcome fairness, moral conviction, and the acceptance of GM field experiments.

    PubMed

    Siegrist, Michael; Connor, Melanie; Keller, Carmen

    2012-08-01

    In 2005, Swiss citizens endorsed a moratorium on gene technology, resulting in the prohibition of the commercial cultivation of genetically modified crops and the growth of genetically modified animals until 2013. However, scientific research was not affected by this moratorium, and in 2008, GMO field experiments were conducted that allowed us to examine the factors that influence their acceptance by the public. In this study, trust and confidence items were analyzed using principal component analysis. The analysis revealed the following three factors: "economy/health and environment" (value similarity based trust), "trust and honesty of industry and scientists" (value similarity based trust), and "competence" (confidence). The results of a regression analysis showed that all the three factors significantly influenced the acceptance of GM field experiments. Furthermore, risk communication scholars have suggested that fairness also plays an important role in the acceptance of environmental hazards. We, therefore, included measures for outcome fairness and procedural fairness in our model. However, the impact of fairness may be moderated by moral conviction. That is, fairness may be significant for people for whom GMO is not an important issue, but not for people for whom GMO is an important issue. The regression analysis showed that, in addition to the trust and confidence factors, moral conviction, outcome fairness, and procedural fairness were significant predictors. The results suggest that the influence of procedural fairness is even stronger for persons having high moral convictions compared with persons having low moral convictions. © 2012 Society for Risk Analysis.

  1. Applied behavior analysis: behavior management of children with autism spectrum disorders in dental environments.

    PubMed

    Hernandez, Purnima; Ikkanda, Zachary

    2011-03-01

    There are a limited number of studies addressing behavior management techniques and procedural modifications that dentists can use to treat people with an autism spectrum disorder (ASD). The authors conducted a search of the dental and behavioral analytic literature to identify management techniques that address problem behaviors exhibited by children with ASDs in dental and other health-related environments. Applied behavior analysis (ABA) is a science in which procedures are based on the principles of behavior through systematic experimentation. Clinicians have used ABA procedures successfully to modify socially significant behaviors of people with ASD. Basic behavior management techniques currently used in dentistry may not encourage people with cognitive and behavioral disabilities, such as ASD, to tolerate simple in-office dental procedures consistently. Instead, dental care providers often are required to use advanced behavior management techniques to complete simple in-office procedures such as prophylaxis, sealant placement and obtaining radiographs. ABA procedures can be integrated in the dental environment to manage problem behaviors often exhibited by children with an ASD. The authors found no evidence-based procedural modifications that address the behavioral characteristics and problematic behaviors of children with an ASD in a dental environment. Further research in this area should be conducted. Knowledge and in-depth understanding of behavioral principles is essential when a dentist is concerned with modifying behaviors. Using ABA procedures can help dentists manage problem behaviors effectively and systematically when performing routine dental treatment. Being knowledgeable about each patient's behavioral characteristics and the parents' level of involvement is important in the successful integration of the procedures and reduction of in-office time.

  2. Two Paradoxes in Linear Regression Analysis

    PubMed Central

    FENG, Ge; PENG, Jing; TU, Dongke; ZHENG, Julia Z.; FENG, Changyong

    2016-01-01

    Summary Regression is one of the favorite tools in applied statistics. However, misuse and misinterpretation of results from regression analysis are common in biomedical research. In this paper we use statistical theory and simulation studies to clarify some paradoxes around this popular statistical method. In particular, we show that a widely used model selection procedure employed in many publications in top medical journals is wrong. Formal procedures based on solid statistical theory should be used in model selection. PMID:28638214

  3. Human-Centered Systems Analysis of Aircraft Separation from Adverse Weather: Implications for Icing Remote Sensing

    NASA Technical Reports Server (NTRS)

    Vigeant-Langlois, Laurence; Hansman, R. John, Jr.

    2003-01-01

    The objective of this project was to propose a means to improve aviation weather information and training procedures, based on a human-centered systems approach. Methodology: cognitive analysis of pilots' tasks; trajectory-based approach to weather information; contingency planning support; and implications for improving weather information.

  4. Effects of Instructional Design with Mental Model Analysis on Learning.

    ERIC Educational Resources Information Center

    Hong, Eunsook

    This paper presents a model for systematic instructional design that includes mental model analysis together with the procedures used in developing computer-based instructional materials in the area of statistical hypothesis testing. The instructional design model is based on the premise that the objective for learning is to achieve expert-like…

  5. Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory

    ERIC Educational Resources Information Center

    Fiester, Herbert R.

    2010-01-01

    The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…

  6. A Study about Placement Support Using Semantic Similarity

    ERIC Educational Resources Information Center

    Katz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob

    2014-01-01

    This paper discusses Latent Semantic Analysis (LSA) as a method for the assessment of prior learning. The Accreditation of Prior Learning (APL) is a procedure to offer learners an individualized curriculum based on their prior experiences and knowledge. The placement decisions in this process are based on the analysis of student material by domain…
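
    A minimal sketch of the LSA step, assuming a scikit-learn implementation and a tiny hypothetical corpus: course-unit descriptions and a piece of student evidence are projected into a low-rank latent semantic space and compared by cosine similarity. This illustrates the general technique, not the assessment pipeline used in the study.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

course_units = [
    "introduction to research methods and experimental design",
    "statistical inference, hypothesis testing and regression",
    "qualitative interviewing and thematic analysis",
]
student_evidence = ["report applying t-tests and linear regression to survey data"]

# LSA: TF-IDF term space reduced to a low-rank latent semantic space
vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(course_units + student_evidence)
lsa = TruncatedSVD(n_components=2, random_state=0)
Z = lsa.fit_transform(X)

# Similarity of the student's evidence to each course unit
sims = cosine_similarity(Z[-1:], Z[:-1])[0]
for unit, s in zip(course_units, sims):
    print(f"{s:5.2f}  {unit}")
```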

  7. Quality evaluation of LC-MS/MS-based E. coli H antigen typing (MS-H) through label-free quantitative data analysis in a clinical sample setup.

    PubMed

    Cheng, Keding; Sloan, Angela; McCorrister, Stuart; Peterson, Lorea; Chui, Huixia; Drebot, Mike; Nadon, Celine; Knox, J David; Wang, Gehua

    2014-12-01

    The need for rapid and accurate H typing is evident during Escherichia coli outbreak situations. This study explores the transition of MS-H, a method originally developed for rapid H antigen typing of E. coli using LC-MS/MS of flagella digests of reference strains and some clinical strains, to E. coli isolates in a clinical scenario through quantitative analysis and method validation. Motile and nonmotile strains were examined in batches to simulate a clinical sample scenario. Various LC-MS/MS batch run procedures and MS-H typing rules were compared and summarized through quantitative analysis of the MS-H data output for standard method development. Label-free quantitative data analysis of MS-H typing proved very useful for examining the quality of MS-H results and the effects of sample carryover from motile E. coli isolates. Based on this, a refined procedure and protein identification rule specific to clinical MS-H typing was established and validated. With an LC-MS/MS batch run procedure and database search parameters unique to E. coli MS-H typing, the standard procedure maintained high accuracy and specificity in clinical situations, and its potential to be used in a clinical setting was clearly established. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Simulation center training as a means to improve resident performance in percutaneous noncontinuous CT-guided fluoroscopic procedures with dose reduction.

    PubMed

    Mendiratta-Lala, Mishal; Williams, Todd R; Mendiratta, Vivek; Ahmed, Hafeez; Bonnett, John W

    2015-04-01

    The purpose of this study was to evaluate the effectiveness of a multifaceted simulation-based resident training for CT-guided fluoroscopic procedures by measuring procedural and technical skills, radiation dose, and procedure times before and after simulation training. A prospective analysis included 40 radiology residents and eight staff radiologists. Residents took an online pretest to assess baseline procedural knowledge. Second- through fourth-year residents' baseline technical skills with a procedural phantom were evaluated. First- through third-year residents then underwent formal didactic and simulation-based procedural and technical training with one of two interventional radiologists and followed the training with 1 month of supervised phantom-based practice. Thereafter, residents underwent final written and practical examinations. The practical examination included essential items from a 20-point checklist, including site and side marking, consent, time-out, and sterile technique, along with a technical skills portion assessing pedal steps, radiation dose, needle redirects, and procedure time. The results indicated statistically significant improvement in procedural and technical skills after simulation training. For residents, the median number of pedal steps decreased by three (p=0.001), median dose decreased by 15.4 mGy (p<0.001), median procedure time decreased by 4.0 minutes (p<0.001), median number of needle redirects decreased by 1.0 (p=0.005), and median number of 20-point checklist items successfully completed increased by three (p<0.001). The results suggest that procedural skills can be acquired and improved by simulation-based training of residents, regardless of experience. CT simulation training decreases procedural time, decreases radiation dose, and improves resident efficiency and confidence, which may transfer to clinical practice with improved patient care and safety.

  9. Investigation of advanced counterrotation blade configuration concepts for high speed turboprop systems. Task 4: Advanced fan section aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Crook, Andrew J.; Delaney, Robert A.

    1992-01-01

    The purpose of this study is the development of a three-dimensional Euler/Navier-Stokes flow analysis for fan section/engine geometries containing multiple blade rows and multiple spanwise flow splitters. An existing procedure developed by Dr. J. J. Adamczyk and associates at the NASA Lewis Research Center was modified to accept multiple spanwise splitter geometries and simulate engine core conditions. The procedure was also modified to allow coarse parallelization of the solution algorithm. This document is a final report outlining the development and techniques used in the procedure. The numerical solution is based upon a finite volume technique with a four stage Runge-Kutta time marching procedure. Numerical dissipation is used to gain solution stability but is reduced in viscous dominated flow regions. Local time stepping and implicit residual smoothing are used to increase the rate of convergence. Multiple blade row solutions are based upon the average-passage system of equations. The numerical solutions are performed on an H-type grid system, with meshes being generated by the system (TIGG3D) developed earlier under this contract. The grid generation scheme meets the average-passage requirement of maintaining a common axisymmetric mesh for each blade row grid. The analysis was run on several geometry configurations ranging from one to five blade rows and from one to four radial flow splitters. Pure internal flow solutions were obtained as well as solutions with flow about the cowl/nacelle and various engine core flow conditions. The efficiency of the solution procedure was shown to be the same as the original analysis.

  10. Contact stresses in gear teeth: A new method of analysis

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.; Oswald, Fred B.

    1991-01-01

    A new, innovative procedure called point load superposition is presented for determining the contact stresses in mating gear teeth. It is believed that this procedure will greatly extend both the range of applicability and the accuracy of gear contact stress analysis. Point load superposition is based upon fundamental solutions from the theory of elasticity. It is an iterative numerical procedure which has distinct advantages over the classical Hertz method, the finite element method, and over existing applications with the boundary element method. Specifically, friction and sliding effects, which are either excluded from or difficult to study with the classical methods, are routinely handled with the new procedure. Presented here are the basic theory and the algorithms. Several examples are given. Results are consistent with those of the classical theories. Applications to spur gears are discussed.

  11. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    PubMed Central

    Castellazzi, Giovanni; D’Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-01-01

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a fine discretized geometry with a reduced amount of time and ready to be used with structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made by voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation. PMID:26225978

  12. SU-E-T-635: Process Mapping of Eye Plaque Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huynh, J; Kim, Y

    Purpose: To apply a risk-based assessment and analysis technique (AAPM TG 100) to eye plaque brachytherapy treatment of ocular melanoma. Methods: The roles and responsibilities of the personnel involved in eye plaque brachytherapy are defined for the retinal specialist, radiation oncologist, nurse, and medical physicist. The entire procedure was examined carefully: major processes were identified first, and then the details of each major process were followed. Results: Seventy-one total potential failure modes were identified. The eight major processes (with corresponding numbers of modes) are patient consultation (2 modes), pretreatment tumor localization (11), treatment planning (13), seed ordering and calibration (10), eye plaque assembly (10), implantation (11), removal (11), and deconstruction (3). Half of the total modes (36) are related to the physicist, although the physicist is not involved in steps such as the actual procedures of suturing and removing the plaque. Conclusion: Failure modes can arise not only from physicist-related procedures such as treatment planning and source activity calibration, but also from more clinical procedures performed by other medical staff. Improving the accuracy of communication for non-physicist-related clinical procedures could be an approach to preventing human errors, while a more rigorous physics double check would reduce errors in physicist-related procedures. Eventually, based on this detailed process map, failure mode and effects analysis (FMEA) will identify the top tiers of modes by ranking all possible modes with a risk priority number (RPN). For the high-risk modes, fault tree analysis (FTA) will provide possible preventive action plans.
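
    The FMEA step mentioned in the conclusion ranks failure modes by a risk priority number, RPN = occurrence x severity x detectability. The sketch below shows that ranking on a few hypothetical modes and scores; the modes and 1-10 ratings are placeholders, not values from the study.

```python
# Failure mode and effects analysis (FMEA): rank modes by
# RPN = occurrence x severity x detectability (all scored 1-10).
# The modes and scores below are hypothetical placeholders.
failure_modes = [
    ("wrong seed strength ordered",      2, 9, 6),
    ("plaque assembled with seed gap",   4, 7, 5),
    ("treatment plan uses wrong eye",    1, 10, 3),
    ("source calibration not verified",  3, 8, 4),
]

ranked = sorted(
    ((name, o * s * d) for name, o, s, d in failure_modes),
    key=lambda item: item[1],
    reverse=True,
)
for name, rpn in ranked:
    print(f"RPN {rpn:4d}  {name}")
```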

  13. A Permutation Approach for Selecting the Penalty Parameter in Penalized Model Selection

    PubMed Central

    Sabourin, Jeremy A; Valdar, William; Nobel, Andrew B

    2015-01-01

    Summary We describe a simple, computationally efficient, permutation-based procedure for selecting the penalty parameter in LASSO penalized regression. The procedure, permutation selection, is intended for applications where variable selection is the primary focus, and can be applied in a variety of structural settings, including that of generalized linear models. We briefly discuss connections between permutation selection and existing theory for the LASSO. In addition, we present a simulation study and an analysis of real biomedical data sets in which permutation selection is compared with selection based on the following: cross-validation (CV), the Bayesian information criterion (BIC), Scaled Sparse Linear Regression, and a selection method based on recently developed testing procedures for the LASSO. PMID:26243050
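
    A simplified reading of the permutation idea can be sketched as follows: permute the response, record the smallest penalty that produces an all-zero LASSO solution, and take a quantile of those penalties. The code below is only a sketch of the general approach under that reading, using scikit-learn's Lasso and synthetic data; it is not the authors' exact algorithm.

```python
import numpy as np
from sklearn.linear_model import Lasso

def permutation_alpha(X, y, n_perm=100, quantile=0.95, seed=0):
    """Pick the LASSO penalty as a quantile of the smallest penalty that
    zeroes out every coefficient when the response is permuted."""
    rng = np.random.default_rng(seed)
    Xc = X - X.mean(axis=0)
    n = X.shape[0]
    alphas = []
    for _ in range(n_perm):
        y_perm = rng.permutation(y)
        yc = y_perm - y_perm.mean()
        # Smallest alpha giving an all-zero solution in sklearn's Lasso objective
        alphas.append(np.max(np.abs(Xc.T @ yc)) / n)
    return float(np.quantile(alphas, quantile))

# Hypothetical data: 5 true signals among 50 predictors
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 50))
beta = np.zeros(50)
beta[:5] = 1.0
y = X @ beta + rng.normal(scale=1.0, size=200)

alpha = permutation_alpha(X, y)
model = Lasso(alpha=alpha).fit(X, y)
print(f"alpha = {alpha:.3f}, selected {np.sum(model.coef_ != 0)} predictors")
```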

  14. Shrunken head (tsantsa): a complete forensic analysis procedure.

    PubMed

    Charlier, P; Huynh-Charlier, I; Brun, L; Hervé, C; de la Grandmaison, G Lorin

    2012-10-10

    Based on the analysis of shrunken heads referred to our forensic laboratory for anthropological expertise, and data from both the anthropological and medical literature, we propose a complete forensic procedure for the analysis of such pieces. A list of 14 original morphological criteria has been developed, based on the global aspect, color, physical deformation, anatomical details, and any associated material (wood, vegetal fibers, sand, charcoals, etc.). These criteria have been tested on a control sample of 20 tsantsa (i.e. shrunken heads from the Jivaro or Shuar tribes of South America). Further complementary analyses are described, such as CT scanning and microscopic examination. Such expertise is increasingly requested of forensic anthropologists and practitioners in the context of the global repatriation of human artifacts to native communities. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  15. Analysis of D-penicillamine by gas chromatography utilizing nitrogen-phosphorus detection.

    PubMed

    Rushing, L G; Hansen, E B; Thompson, H C

    1985-01-11

    A method is presented for the analysis of the "orphan" drug D-penicillamine (D-Pa), which is used for the treatment of the inherited rare copper metabolism dysfunction known as Wilson's disease, by assaying a derivative of the compound by gas chromatography employing a rubidium-sensitized nitrogen-phosphorus detector. Analytical procedures are described for the analyses of residues of the D-Pa·HCl salt in animal feed and for the analyses of the salt or free base from aqueous solutions by utilizing a single-step double derivatization with diazomethane-acetone. Stability data for D-Pa·HCl in animal feed and for the free base in water are presented. An ancillary fluorescence derivatization procedure for the analysis of D-Pa in water is also reported.

  16. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  17. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  18. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  19. 40 CFR 93.123 - Procedures for determining localized CO, PM10, and PM2.5 concentrations (hot-spot analysis).

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... (“Localized CO, PM10, and PM2.5 violations”) must be based on quantitative analysis using the applicable air... § 93.116 may be based on either: (i) Quantitative methods that represent reasonable and common... hot-spot analyses. (1) The hot-spot demonstration required by § 93.116 must be based on quantitative...

  20. Analysis of Slug Tests in Formations of High Hydraulic Conductivity

    USGS Publications Warehouse

    Butler, J.J.; Garnett, E.J.; Healey, J.M.

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.
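
    Tests in highly permeable formations often show an underdamped, oscillatory head response. The sketch below fits a simple damped-oscillator form to synthetic normalized head data with SciPy; the functional form, parameter values, and noise level are illustrative assumptions, and converting the fitted parameters to hydraulic conductivity (which requires the well geometry) is not shown.

```python
import numpy as np
from scipy.optimize import curve_fit

def underdamped_response(t, gamma, omega):
    """Normalized head for an underdamped slug test (linear oscillator
    released from rest), a common form in high-conductivity settings."""
    return np.exp(-gamma * t) * (np.cos(omega * t)
                                 + (gamma / omega) * np.sin(omega * t))

# Hypothetical normalized head record (synthetic data with noise)
rng = np.random.default_rng(2)
t = np.linspace(0.0, 10.0, 80)
h_obs = underdamped_response(t, gamma=0.4, omega=2.0) \
        + rng.normal(scale=0.02, size=t.size)

(gamma_fit, omega_fit), _ = curve_fit(underdamped_response, t, h_obs, p0=(0.5, 1.5))
print(f"gamma = {gamma_fit:.2f} 1/s, omega = {omega_fit:.2f} rad/s")
```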

  1. Evaluation of solution procedures for material and/or geometrically nonlinear structural analysis by the direct stiffness method.

    NASA Technical Reports Server (NTRS)

    Stricklin, J. A.; Haisler, W. E.; Von Riesemann, W. A.

    1972-01-01

    This paper presents an assessment of the solution procedures available for the analysis of inelastic and/or large deflection structural behavior. A literature survey is given which summarizes the contributions of other researchers in the analysis of structural problems exhibiting material nonlinearities and combined geometric-material nonlinearities. Attention is focused on evaluating the available computation and solution techniques. Each of the solution techniques is developed from a common equation of equilibrium in terms of pseudo forces. The solution procedures are applied to circular plates and shells of revolution in an attempt to compare and evaluate each with respect to computational accuracy, economy, and efficiency. Based on the numerical studies, observations and comments are made with regard to the accuracy and economy of each solution technique.

  2. Simultaneous grouping and ranking with combination of SOM and TOPSIS for selection of preferable analytical procedure for furan determination in food.

    PubMed

    Jędrkiewicz, Renata; Tsakovski, Stefan; Lavenu, Aurore; Namieśnik, Jacek; Tobiszewski, Marek

    2018-02-01

    A novel methodology for grouping and ranking with application of self-organizing maps and multicriteria decision analysis is presented. The dataset consists of 22 objects, which are analytical procedures applied to furan determination in food samples. They are described by 10 variables referring to their analytical performance and environmental and economic aspects. Multivariate statistical analysis makes it possible to limit the amount of input data for the ranking analysis. The assessment results show that the most beneficial procedures are based on microextraction techniques with GC-MS final determination. It is shown how the information obtained from the two tools complements each other. The applicability of combining grouping and ranking is also discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
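
    The TOPSIS half of the methodology can be illustrated compactly: alternatives are ranked by their closeness to an ideal solution relative to an anti-ideal one. The criteria, weights, and scores below are hypothetical, and the SOM grouping step is not shown.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS.

    matrix  : (m, n) decision matrix, rows = alternatives, cols = criteria
    weights : (n,) criterion weights summing to 1
    benefit : (n,) booleans, True if larger is better for that criterion
    Returns closeness scores in [0, 1]; higher is better.
    """
    m = np.asarray(matrix, dtype=float)
    norm = m / np.linalg.norm(m, axis=0)          # vector normalization
    v = norm * np.asarray(weights)                # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)

# Hypothetical procedures scored on recovery (%), LOD (ng/g), solvent use (mL)
matrix = [[95, 0.5, 2.0],
          [88, 0.2, 10.0],
          [91, 1.0, 0.5]]
scores = topsis(matrix, weights=[0.4, 0.3, 0.3], benefit=[True, False, False])
print(np.argsort(scores)[::-1], scores.round(3))
```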

  3. Analysis of slug tests in formations of high hydraulic conductivity.

    PubMed

    Butler, James J; Garnett, Elizabeth J; Healey, John M

    2003-01-01

    A new procedure is presented for the analysis of slug tests performed in partially penetrating wells in formations of high hydraulic conductivity. This approach is a simple, spreadsheet-based implementation of existing models that can be used for analysis of tests from confined or unconfined aquifers. Field examples of tests exhibiting oscillatory and nonoscillatory behavior are used to illustrate the procedure and to compare results with estimates obtained using alternative approaches. The procedure is considerably simpler than recently proposed methods for this hydrogeologic setting. Although the simplifications required by the approach can introduce error into hydraulic-conductivity estimates, this additional error becomes negligible when appropriate measures are taken in the field. These measures are summarized in a set of practical field guidelines for slug tests in highly permeable aquifers.

  4. Validated Test Method 1314: Liquid-Solid Partitioning as a Function of Liquid-Solid Ratio for Constituents in Solid Materials Using An Up-Flow Percolation Column Procedure

    EPA Pesticide Factsheets

    Describes procedures written based on the assumption that they will be performed by analysts who are formally trained in at least the basic principles of chemical analysis and in the use of the subject technology.

  5. Robotic-assisted laparoendoscopic single-site surgery (R-LESS) in urology: an evidence-based analysis.

    PubMed

    Barret, E; Sanchez-Salas, R; Ercolani, M; Forgues, A; Rozet, F; Galiano, M; Cathelineau, X

    2011-06-01

    The objective of this manuscript is to provide an evidence-based analysis of the current status and future perspectives of robotic laparoendoscopic single-site surgery (R-LESS). A PubMed search has been performed for all relevant urological literature regarding natural orifice transluminal endoscopic surgery (NOTES) and laparoendoscopic single-site surgery (LESS). All clinical and investigative reports for robotic LESS and NOTES procedures in the urological literature have been considered. A significant number of clinical urological procedures have been successfully completed utilizing R-LESS procedures. The available experience is limited to referral centers, where the case volume is sufficient to help overcome the challenges and learning curve of LESS surgery. The robotic interface remains the best fit for LESS procedures but its mode of use continues to evolve in attempts to improve surgical technique. We stand today at the dawn of R-LESS surgery, but this approach may well become the standard of care in the near future. Further technological development is needed to allow widespread adoption of the technique.

  6. Inverse Thermal Analysis of Alloy 690 Laser and Hybrid Laser-GMA Welds Using Solidification-Boundary Constraints

    NASA Astrophysics Data System (ADS)

    Lambrakos, S. G.

    2017-08-01

    An inverse thermal analysis of Alloy 690 laser and hybrid laser-GMA welds is presented that uses numerical-analytical basis functions and boundary constraints based on measured solidification cross sections. In particular, the inverse analysis procedure uses three-dimensional constraint conditions such that two-dimensional projections of calculated solidification boundaries are constrained to map within experimentally measured solidification cross sections. Temperature histories calculated by this analysis are input data for computational procedures that predict solid-state phase transformations and mechanical response. These temperature histories can be used for inverse thermal analysis of welds corresponding to other welding processes whose process conditions are within similar regimes.

  7. Interdisciplinary analysis procedures in the modeling and control of large space-based structures

    NASA Technical Reports Server (NTRS)

    Cooper, Paul A.; Stockwell, Alan E.; Kim, Zeen C.

    1987-01-01

    The paper describes a computer software system called the Integrated Multidisciplinary Analysis Tool, IMAT, that has been developed at NASA Langley Research Center. IMAT provides researchers and analysts with an efficient capability to analyze satellite control systems influenced by structural dynamics. Using a menu-driven interactive executive program, IMAT links a relational database to commercial structural and controls analysis codes. The paper describes the procedures followed to analyze a complex satellite structure and control system. The codes used to accomplish the analysis are described, and an example is provided of an application of IMAT to the analysis of a reference space station subject to a rectangular pulse loading at its docking port.

  8. Use of market segmentation to identify untapped consumer needs in vision correction surgery for future growth.

    PubMed

    Loarie, Thomas M; Applegate, David; Kuenne, Christopher B; Choi, Lawrence J; Horowitz, Diane P

    2003-01-01

    Market segmentation analysis identifies discrete segments of the population whose beliefs are consistent with exhibited behaviors such as purchase choice. This study applies market segmentation analysis to low myopes (-1 to -3 D with less than 1 D cylinder) in their consideration and choice of a refractive surgery procedure to discover opportunities within the market. A quantitative survey based on focus group research was sent to a demographically balanced sample of myopes using contact lenses and/or glasses. A variable reduction process followed by a clustering analysis was used to discover discrete belief-based segments. The resulting segments were validated both analytically and through in-market testing. Discontented individuals who wear contact lenses are the primary target for vision correction surgery. However, 81% of the target group is apprehensive about laser in situ keratomileusis (LASIK). They are nervous about the procedure and strongly desire reversibility and exchangeability. There exists a large untapped opportunity for vision correction surgery within the low myope population. Market segmentation analysis helped determine how to best meet this opportunity through repositioning existing procedures or developing new vision correction technology, and could also be applied to identify opportunities in other vision correction populations.

  9. Risk analysis procedure for post-wildfire natural hazards in British Columbia

    NASA Astrophysics Data System (ADS)

    Jordan, Peter

    2010-05-01

    Following a severe wildfire season in 2003, and several subsequent damaging debris flow and flood events, the British Columbia Forest Service developed a procedure for analysing risks to public safety and infrastructure from such events. At the same time, the Forest Service undertook a research program to determine the extent of post-wildfire hazards, and examine the hydrologic and geomorphic processes contributing to the hazards. The risk analysis procedure follows the Canadian Standards Association decision-making framework for risk management (which in turn is based on international standards). This has several steps: identification of risk, risk analysis and estimation, evaluation of risk tolerability, developing control or mitigation strategies, and acting on these strategies. The Forest Service procedure deals only with the first two steps. The results are passed on to authorities such as the Provincial Emergency Program and local government, who are responsible for evaluating risks, warning residents, and applying mitigation strategies if appropriate. The objective of the procedure is to identify and analyse risks to public safety and infrastructure. The procedure is loosely based on the BAER (burned area emergency response) program in the USA, with some important differences. Our procedure focuses on identifying risks and warning affected parties, not on mitigation activities such as broadcast erosion control measures. Partly this is due to limited staff and financial resources. Also, our procedure is not multi-agency, but is limited to wildfires on provincial forest land; in British Columbia about 95% of forest land is in the publicly-owned provincial forest. Each fire season, wildfires are screened by size and proximity to values at risk such as populated areas. For selected fires, when the fire is largely contained, the procedure begins with an aerial reconnaissance of the fire, and photography with a hand-held camera, which can be used to make a preliminary map of vegetation burn severity if desired. The next steps include mapping catchment boundaries, field traverses to collect data on soil burn severity and water repellency, identification of unstable hillslopes and channels, and inspection of values at risk from hazards such as debris flows or flooding. BARC (burned area reflectance classification) maps based on satellite imagery are prepared for some fires, although these are typically not available for several weeks. Our objective is to make a preliminary risk analysis report available about two weeks after the fire is contained. If high risks to public safety or infrastructure are identified, the risk analysis reports may make recommendations for mitigation measures to be considered; however, acting on these recommendations is the responsibility of local land managers, local government, or landowners. Mitigation measures for some fires have included engineering treatments to reduce the hydrologic impact of logging roads, protective structures such as dykes or berms, and straw mulching to reduce runoff and erosion on severely burned areas. The Terrace Mountain Fire, which burned 9000 hectares in the Okanagan Valley in 2009, is used as an example of the application of the procedure.

  10. "They Have to Adapt to Learn": Surgeons' Perspectives on the Role of Procedural Variation in Surgical Education.

    PubMed

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2016-01-01

    Clinical research increasingly acknowledges the existence of significant procedural variation in surgical practice. This study explored surgeons' perspectives regarding the influence of intersurgeon procedural variation on the teaching and learning of surgical residents. This qualitative study used a grounded theory-based analysis of observational and interview data. Observational data were collected in 3 tertiary care teaching hospitals in Ontario, Canada. Semistructured interviews explored potential procedural variations arising during the observations and prompts from an iteratively refined guide. Ongoing data analysis refined the theoretical framework and informed data collection strategies, as prescribed by the iterative nature of grounded theory research. Our sample included 99 hours of observation across 45 cases with 14 surgeons. Semistructured, audio-recorded interviews (n = 14) occurred immediately following observational periods. Surgeons endorsed the use of intersurgeon procedural variations to teach residents about adapting to the complexity of surgical practice and the norms of surgical culture. Surgeons suggested that residents' efforts to identify thresholds of principle and preference are crucial to professional development. Principles that emerged from the study included the following: (1) knowing what comes next, (2) choosing the right plane, (3) handling tissue appropriately, (4) recognizing the abnormal, and (5) making safe progress. Surgeons suggested that learning to follow these principles while maintaining key aspects of surgical culture, like autonomy and individuality, are important social processes in surgical education. Acknowledging intersurgeon variation has important implications for curriculum development and workplace-based assessment in surgical education. Adapting to intersurgeon procedural variations may foster versatility in surgical residents. However, the existence of procedural variations and their active use in surgeons' teaching raises questions about the lack of attention to this form of complexity in current workplace-based assessment strategies. Failure to recognize the role of such variations may threaten the implementation of competency-based medical education in surgery. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  11. “They Have to Adapt to Learn”: Surgeons’ Perspectives on the Role of Procedural Variation in Surgical Education

    PubMed Central

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2017-01-01

    OBJECTIVE Clinical research increasingly acknowledges the existence of significant procedural variation in surgical practice. This study explored surgeons’ perspectives regarding the influence of intersurgeon procedural variation on the teaching and learning of surgical residents. DESIGN AND SETTING This qualitative study used a grounded theory-based analysis of observational and interview data. Observational data were collected in 3 tertiary care teaching hospitals in Ontario, Canada. Semistructured interviews explored potential procedural variations arising during the observations and prompts from an iteratively refined guide. Ongoing data analysis refined the theoretical framework and informed data collection strategies, as prescribed by the iterative nature of grounded theory research. PARTICIPANTS Our sample included 99 hours of observation across 45 cases with 14 surgeons. Semistructured, audio-recorded interviews (n = 14) occurred immediately following observational periods. RESULTS Surgeons endorsed the use of intersurgeon procedural variations to teach residents about adapting to the complexity of surgical practice and the norms of surgical culture. Surgeons suggested that residents’ efforts to identify thresholds of principle and preference are crucial to professional development. Principles that emerged from the study included the following: (1) knowing what comes next, (2) choosing the right plane, (3) handling tissue appropriately, (4) recognizing the abnormal, and (5) making safe progress. Surgeons suggested that learning to follow these principles while maintaining key aspects of surgical culture, like autonomy and individuality, are important social processes in surgical education. CONCLUSIONS Acknowledging intersurgeon variation has important implications for curriculum development and workplace-based assessment in surgical education. Adapting to intersurgeon procedural variations may foster versatility in surgical residents. However, the existence of procedural variations and their active use in surgeons’ teaching raises questions about the lack of attention to this form of complexity in current workplace-based assessment strategies. Failure to recognize the role of such variations may threaten the implementation of competency-based medical education in surgery. PMID:26705062

  12. Uncommon combinations of ICD10-PCS or ICD-9-CM operative procedure codes account for most inpatient surgery at half of Texas hospitals.

    PubMed

    O'Neill, Liam; Dexter, Franklin; Park, Sae-Hwan; Epstein, Richard H

    2017-09-01

    Recently, there has been interest in activity-based cost accounting for inpatient surgical procedures to facilitate "value based" analyses. Research 10-20 years ago, performed using data from 3 large teaching hospitals, found that activity-based cost accounting was practical and useful for modeling surgeons and subspecialties, but inaccurate for individual procedures. We hypothesized that these older results would apply to hundreds of hospitals, currently evaluable using administrative databases. Observational study. State of Texas hospital discharge abstract data for 1st quarter of 2016, 4th quarter of 2015, 1st quarter of 2015, and 4th quarter of 2014. Discharged from an acute care hospital in Texas with at least 1 major therapeutic ("operative") procedure. Counts of discharges for each procedure or combination of procedures, classified by ICD-10-PCS or ICD-9-CM. At the average hospital, most surgical discharges were for procedures performed at most once a month at the hospital (54%, 95% confidence interval [CI] 51% to 55%). At the average hospital, approximately 90% of procedures were performed at most once a month at the hospital (93%, CI 93% to 94%). The percentages were insensitive to the quarter of the year. The percentages were 3% to 6% greater with ICD-10-PCS than for the superseded ICD-9-CM. There are many different procedure codes, and many different combinations of codes, relative to the number of different hospital discharges. Since most procedures at most hospitals are performed no more than once a month, activity-based cost accounting with a sample size sufficient to be useful is impractical for the vast majority of procedures, in contrast to analysis by surgeon and/or subspecialty. Copyright © 2017 Elsevier Inc. All rights reserved.
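
    The core computation described here, counting how often each procedure code combination appears in a hospital's discharge records and asking what share of discharges involve rarely performed combinations, is simple to reproduce. A minimal sketch, with hypothetical hospital IDs, illustrative codes, and a threshold of three occurrences per quarter (roughly once a month):

    ```python
    from collections import Counter

    # Hypothetical discharge records: (hospital_id, tuple of procedure codes for that discharge).
    discharges = [
        ("H1", ("0DTJ0ZZ",)),
        ("H1", ("0DTJ0ZZ",)),
        ("H1", ("027034Z", "B211YZZ")),  # a rare code combination
        ("H2", ("0SRC0J9",)),
    ]

    def share_of_rare_discharges(records, max_per_quarter=3):
        """Fraction of each hospital's discharges whose exact code combination
        occurs at most `max_per_quarter` times in the quarter."""
        by_hospital = {}
        for hospital, codes in records:
            by_hospital.setdefault(hospital, []).append(codes)
        shares = {}
        for hospital, combos in by_hospital.items():
            counts = Counter(combos)
            rare = sum(1 for c in combos if counts[c] <= max_per_quarter)
            shares[hospital] = rare / len(combos)
        return shares

    print(share_of_rare_discharges(discharges))
    ```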

  13. When is carotid angioplasty and stenting the cost-effective alternative for revascularization of symptomatic carotid stenosis? A Canadian health system perspective.

    PubMed

    Almekhlafi, M A; Hill, M D; Wiebe, S; Goyal, M; Yavin, D; Wong, J H; Clement, F M

    2014-02-01

    Carotid revascularization procedures can be complicated by stroke. Additional disability adds to the already high costs of the procedure. To weigh the cost and benefit, we estimated the cost-utility of carotid angioplasty and stenting compared with carotid endarterectomy among patients with symptomatic carotid stenosis, with special emphasis on scenario analyses that would yield carotid angioplasty and stenting as the cost-effective alternative relative to carotid endarterectomy. A cost-utility analysis from the perspective of the health system payer was performed by using a Markov analytic model. Clinical estimates were based on a meta-analysis. The procedural costs were derived from a microcosting data base. The costs for hospitalization and rehabilitation of patients with stroke were based on a Canadian multicenter study. Utilities were based on a randomized controlled trial. In the base case analysis, carotid angioplasty and stenting were more expensive (incremental cost of $6107) and had a lower utility (-0.12 quality-adjusted life years) than carotid endarterectomy. The results are sensitive to changes in the risk of clinical events and the relative risk of death and stroke. Carotid angioplasty and stenting were more economically attractive among high-risk surgical patients. For carotid angioplasty and stenting to become the preferred option, their costs would need to fall from more than $7300 to $4350 or less and the risks of the periprocedural and annual minor strokes would have to be equivalent to that of carotid endarterectomy. In the base case analysis, carotid angioplasty and stenting were associated with higher costs and lower utility compared with carotid endarterectomy for patients with symptomatic carotid stenosis. Carotid angioplasty and stenting were cost-effective for patients with high surgical risk.
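
    The base case comparison reduces to an incremental cost-effectiveness calculation. A minimal sketch using the figures quoted in the abstract (an incremental cost of $6107 and -0.12 quality-adjusted life years) as inputs; the dominance check is an assumption about how such results are typically reported, not the study's Markov model itself:

    ```python
    def icer(delta_cost, delta_qaly):
        """Incremental cost-effectiveness ratio: extra cost per QALY gained."""
        if delta_qaly == 0:
            raise ValueError("QALY difference is zero; ICER undefined")
        return delta_cost / delta_qaly

    # Figures from the abstract: stenting costs $6107 more and yields 0.12 fewer QALYs,
    # so stenting is dominated (more costly and less effective) in the base case.
    delta_cost, delta_qaly = 6107.0, -0.12
    if delta_cost > 0 and delta_qaly < 0:
        print("Stenting is dominated by endarterectomy")
    else:
        print(f"ICER = ${icer(delta_cost, delta_qaly):,.0f} per QALY")
    ```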

  14. Vibration Signature Analysis of a Faulted Gear Transmission System

    NASA Technical Reports Server (NTRS)

    Choy, F. K.; Huang, S.; Zakrajsek, J. J.; Handschuh, R. F.; Townsend, D. P.

    1994-01-01

    A comprehensive procedure for predicting faults in gear transmission systems under normal operating conditions is presented. Experimental data was obtained from a spiral bevel gear fatigue test rig at NASA Lewis Research Center. Time synchronous averaged vibration data was recorded throughout the test as the fault progressed from a small single pit to severe pitting over several teeth, and finally tooth fracture. A numerical procedure based on the Wigner-Ville distribution was used to examine the time averaged vibration data. Results from the Wigner-Ville procedure are compared to results from a variety of signal analysis techniques which include time domain analysis methods and frequency analysis methods. Using photographs of the gear tooth at various stages of damage, the limitations and accuracy of the various techniques are compared and discussed. Conclusions are drawn from the comparison of the different approaches as well as the applicability of the Wigner-Ville method in predicting gear faults.
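
    The Wigner-Ville analysis used to track fault progression can be approximated with a discrete pseudo Wigner-Ville distribution. A minimal numpy/scipy sketch (not the authors' implementation) applied to a synthetic chirp standing in for a gear vibration signature:

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def pseudo_wigner_ville(x, half_window=64):
        """Discrete pseudo Wigner-Ville distribution of a real signal.
        Returns a (time, frequency-bin) array of real energy values."""
        z = hilbert(x)                      # analytic signal suppresses negative-frequency cross-terms
        n = len(z)
        nfft = 2 * half_window
        wvd = np.zeros((n, nfft))
        for t in range(n):
            kmax = min(half_window - 1, t, n - 1 - t)
            r = np.zeros(nfft, dtype=complex)
            ks = np.arange(-kmax, kmax + 1)
            r[ks % nfft] = z[t + ks] * np.conj(z[t - ks])   # instantaneous autocorrelation
            wvd[t] = np.real(np.fft.fft(r))
        return wvd

    # Synthetic example: a tone whose frequency rises, mimicking a developing fault signature.
    fs = 1000.0
    t = np.arange(0, 1.0, 1 / fs)
    x = np.cos(2 * np.pi * (50 * t + 40 * t ** 2))   # chirp from 50 Hz upward
    tfr = pseudo_wigner_ville(x)
    print(tfr.shape)   # (1000, 128): time samples x frequency bins
    ```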

  15. General Staining and Segmentation Procedures for High Content Imaging and Analysis.

    PubMed

    Chambers, Kevin M; Mandavilli, Bhaskar S; Dolman, Nick J; Janes, Michael S

    2018-01-01

    Automated quantitative fluorescence microscopy, also known as high content imaging (HCI), is a rapidly growing analytical approach in cell biology. Because automated image analysis relies heavily on robust demarcation of cells and subcellular regions, reliable methods for labeling cells is a critical component of the HCI workflow. Labeling of cells for image segmentation is typically performed with fluorescent probes that bind DNA for nuclear-based cell demarcation or with those which react with proteins for image analysis based on whole cell staining. These reagents, along with instrument and software settings, play an important role in the successful segmentation of cells in a population for automated and quantitative image analysis. In this chapter, we describe standard procedures for labeling and image segmentation in both live and fixed cell samples. The chapter will also provide troubleshooting guidelines for some of the common problems associated with these aspects of HCI.
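
    The nuclear-based demarcation step described here reduces, in its simplest form, to thresholding a DNA-stain channel and labeling connected components. A minimal scipy sketch of that segmentation step on a synthetic image, with a fixed statistical threshold rather than a production HCI pipeline:

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(0)

    # Synthetic "DNA stain" channel: a few bright nuclei on a noisy background.
    image = rng.normal(100, 5, (256, 256))
    yy, xx = np.mgrid[0:256, 0:256]
    for cy, cx in [(60, 60), (128, 180), (200, 90)]:
        image += 150 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 12.0 ** 2))

    # Segment: smooth, threshold, then label connected components as nuclei.
    smoothed = ndimage.gaussian_filter(image, sigma=2)
    mask = smoothed > smoothed.mean() + 3 * smoothed.std()
    labels, n_nuclei = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=range(1, n_nuclei + 1))

    print(n_nuclei, "nuclei; areas (px):", areas.astype(int))
    ```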

  16. Comparing preference assessments: selection- versus duration-based preference assessment procedures.

    PubMed

    Kodak, Tiffany; Fisher, Wayne W; Kelley, Michael E; Kisamore, April

    2009-01-01

    In the current investigation, the results of a selection- and a duration-based preference assessment procedure were compared. A Multiple Stimulus With Replacement (MSW) preference assessment [Windsor, J., Piché, L. M., & Locke, P. A. (1994). Preference testing: A comparison of two presentation methods. Research in Developmental Disabilities, 15, 439-455] and a variation of a Free-Operant (FO) preference assessment procedure [Roane, H. S., Vollmer, T. R., Ringdahl, J. E., & Marcus, B. A. (1998). Evaluation of a brief stimulus preference assessment. Journal of Applied Behavior Analysis, 31, 605-620] were conducted with four participants. A reinforcer assessment was conducted to determine which preference assessment procedure identified the item that produced the highest rates of responding. The items identified as most highly preferred were different across preference assessment procedures for all participants. Results of the reinforcer assessment showed that the MSW identified the item that functioned as the most effective reinforcer for two participants.

  17. Patient Preferences Regarding Surgical Interventions for Knee Osteoarthritis

    PubMed Central

    Moorman, Claude T; Kirwan, Tom; Share, Jennifer; Vannabouathong, Christopher

    2017-01-01

    Surgical interventions for knee osteoarthritis (OA) have markedly different procedure attributes and may have dramatic differences in patient desirability. A total of 323 patients with knee OA were included in a dual response, choice-based conjoint analysis to identify the relative preference of 9 different procedure attributes. A model was also developed to simulate how patients might respond if presented with the real-world knee OA procedures, based on conservative assumptions regarding their attributes. The “amount of cutting and removal of the existing bone” required for a procedure had the highest preference score, indicating that these patients considered it the most important attribute. More specifically, a procedure that requires the least amount of bone cutting or removal would be expected to be the most preferred surgical alternative. The model also suggested that patients who are younger and report the highest pain levels and greatest functional limitations would be more likely to opt for surgical intervention. PMID:28974919

  18. Symbolic dynamic filtering and language measure for behavior identification of mobile robots.

    PubMed

    Mallapragada, Goutham; Ray, Asok; Jin, Xin

    2012-06-01

    This paper presents a procedure for behavior identification of mobile robots, which requires limited or no domain knowledge of the underlying process. While the features of robot behavior are extracted by symbolic dynamic filtering of the observed time series, the behavior patterns are classified based on language measure theory. The behavior identification procedure has been experimentally validated on a networked robotic test bed by comparison with commonly used tools, namely, principal component analysis for feature extraction and Bayesian risk analysis for pattern classification.
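
    Symbolic dynamic filtering, in its basic form, partitions a time series into a small alphabet and uses the symbol-transition statistics as a feature vector; behavior patterns can then be compared with any distance or, as in the paper, a language measure. A stripped-down sketch of the symbolization and transition-matrix steps only (not the authors' language-measure formulation), on synthetic "robot behavior" signals:

    ```python
    import numpy as np

    def symbolize(series, alphabet_size=4):
        """Map a time series to symbols 0..alphabet_size-1 using equal-frequency bins."""
        edges = np.quantile(series, np.linspace(0, 1, alphabet_size + 1)[1:-1])
        return np.digitize(series, edges)

    def transition_matrix(symbols, alphabet_size=4):
        """Row-normalized symbol transition probabilities; rows sum to 1."""
        counts = np.zeros((alphabet_size, alphabet_size))
        for a, b in zip(symbols[:-1], symbols[1:]):
            counts[a, b] += 1
        counts += 1e-9                                   # avoid division by zero for unseen symbols
        return counts / counts.sum(axis=1, keepdims=True)

    rng = np.random.default_rng(0)
    t = np.linspace(0, 20, 2000)
    nominal = np.sin(t) + 0.1 * rng.normal(size=t.size)             # "nominal" behavior signal
    anomalous = np.sin(t) + 0.4 * np.sin(7 * t) + 0.1 * rng.normal(size=t.size)

    P_nom = transition_matrix(symbolize(nominal))
    P_anom = transition_matrix(symbolize(anomalous))
    print("behavior distance:", np.linalg.norm(P_nom - P_anom))     # simple anomaly score
    ```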

  19. The environmental analysis of helicopter operations by Federal agencies: Current procedures and research needs

    NASA Technical Reports Server (NTRS)

    Smith, C. C.; Warner, D. B.; Dajani, J. S.

    1977-01-01

    The technical, economic, and environmental problems restricting commercial helicopter passenger operations are reviewed. The key considerations for effective assessment procedures are outlined and a preliminary model for the environmental analysis of helicopters is developed. It is recommended that this model, or some similar approach, be used as a common base for the development of comprehensive environmental assessment methods for each of the federal agencies concerned with helicopters. A description of the critical environmental research issues applicable to helicopters is also presented.

  20. Doing More for More: Unintended Consequences of Financial Incentives for Oncology Specialty Care.

    PubMed

    O'Neil, Brock; Graves, Amy J; Barocas, Daniel A; Chang, Sam S; Penson, David F; Resnick, Matthew J

    2016-02-01

    Specialty care remains a significant contributor to health care spending but largely unaddressed in novel payment models aimed at promoting value-based delivery. Bladder cancer, chiefly managed by subspecialists, is among the most costly. In 2005, Centers for Medicare and Medicaid Services (CMS) dramatically increased physician payment for office-based interventions for bladder cancer to shift care from higher cost facilities, but the impact is unknown. This study evaluated the effect of financial incentives on patterns of fee-for-service (FFS) bladder cancer care. Data from a 5% sample of Medicare beneficiaries from 2001-2013 were evaluated using interrupted time-series analysis with segmented regression. Primary outcomes were the effects of CMS fee modifications on utilization and site of service for procedures associated with the diagnosis and treatment of bladder cancer. Rates of related bladder cancer procedures that were not affected by the fee change were concurrent controls. Finally, the effect of payment changes on both diagnostic yield and need for redundant procedures were studied. All statistical tests were two-sided. Utilization of clinic-based procedures increased by 644% (95% confidence interval [CI] = 584% to 704%) after the fee change, but without reciprocal decline in facility-based procedures. Procedures unaffected by the fee incentive remained unchanged throughout the study period. Diagnostic yield decreased by 17.0% (95% CI = 12.7% to 21.3%), and use of redundant office-based procedures increased by 76.0% (95% CI = 59% to 93%). Financial incentives in bladder cancer care have unintended and costly consequences in the current FFS environment. The observed price sensitivity is likely to remain a major issue in novel payment models failing to incorporate procedure-based specialty physicians. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
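
    The segmented-regression form of interrupted time-series analysis fits a baseline trend, a level shift at the policy date, and a trend change after it. A minimal statsmodels sketch on simulated quarterly utilization counts (hypothetical data, not the Medicare sample):

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Simulated quarterly utilization, with a jump and a steeper trend after a fee change.
    rng = np.random.default_rng(0)
    quarters = np.arange(40)
    post = (quarters >= 16).astype(float)          # hypothetical fee change at quarter 16
    time_after = np.where(post == 1, quarters - 16, 0)
    y = 100 + 0.5 * quarters + 60 * post + 4 * time_after + rng.normal(0, 5, 40)

    # Segmented regression: baseline trend, level change at the break, trend change after it.
    X = sm.add_constant(np.column_stack([quarters, post, time_after]))
    fit = sm.OLS(y, X).fit()
    print(fit.params)   # [intercept, pre-trend, level shift, slope change]
    ```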

  1. Neural networks for structural design - An integrated system implementation

    NASA Technical Reports Server (NTRS)

    Berke, Laszlo; Hafez, Wassim; Pao, Yoh-Han

    1992-01-01

    The development of powerful automated procedures to aid the creative designer is becoming increasingly critical for complex design tasks. In the work described here, Artificial Neural Nets are applied to acquire structural analysis and optimization domain expertise. Based on initial instructions from the user, an automated procedure generates random instances of structural analysis and/or optimization 'experiences' that cover a desired domain. It extracts training patterns from the created instances, constructs and trains an appropriate network architecture, and checks the accuracy of net predictions. The final product is a trained neural net that can estimate analysis and/or optimization results instantaneously.
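
    The workflow described, generating random analysis "experiences" over a design domain, training a net on them, and checking prediction accuracy, maps onto a standard surrogate-model loop. A minimal scikit-learn sketch using a toy cantilever-deflection formula as a stand-in for the structural analysis code (illustrative only, not the authors' system):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(1)

    # Stand-in "analysis code": cantilever tip deflection, delta = P*L^3 / (3*E*I).
    def analysis(samples):
        P, L, E, I = samples.T
        return P * L**3 / (3.0 * E * I)

    # Random "experiences" covering the design domain of interest.
    X = rng.uniform([1e3, 1.0, 180e9, 1e-6], [5e3, 3.0, 220e9, 5e-6], size=(2000, 4))
    y = analysis(X)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    net = make_pipeline(StandardScaler(),
                        MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0))
    net.fit(X_train, y_train)
    print("surrogate R^2 on held-out instances:", round(net.score(X_test, y_test), 3))
    ```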

  2. Stepwise Analysis of Differential Item Functioning Based on Multiple-Group Partial Credit Model.

    ERIC Educational Resources Information Center

    Muraki, Eiji

    1999-01-01

    Extended an Item Response Theory (IRT) method for detection of differential item functioning to the partial credit model and applied the method to simulated data using a stepwise procedure. Then applied the stepwise DIF analysis based on the multiple-group partial credit model to writing trend data from the National Assessment of Educational…

  3. Ecological Fallacy in Reading Acquisition Research: Masking Constructive Processes of the Learner.

    ERIC Educational Resources Information Center

    Berninger, Virginia W.; Abbott, Robert D.

    A study examined whether conclusions about constructive processes in reading based on analysis of group data were consistent with those based on an analysis of individual data. Subjects, selected from a larger sample of 45 first grade students who had participated in a longitudinal study on acquisition of linguistic procedures for printed words,…

  4. Training Head Start Teachers to Conduct Trial-Based Functional Analysis of Challenging Behavior

    ERIC Educational Resources Information Center

    Rispoli, Mandy; Burke, Mack D.; Hatton, Heather; Ninci, Jennifer; Zaini, Samar; Sanchez, Lisa

    2015-01-01

    Trial-based functional analysis (TBFA) is a procedure for experimentally identifying the function of challenging behavior within applied settings. The purpose of this study was to examine the effects of a TBFA teacher-training package in the context of two Head Start centers implementing programwide positive behavior support (PWPBS). Four Head…

  5. A Robust Kalman Framework with Resampling and Optimal Smoothing

    PubMed Central

    Kautz, Thomas; Eskofier, Bjoern M.

    2015-01-01

    The Kalman filter (KF) is an extremely powerful and versatile tool for signal processing that has been applied extensively in various fields. We introduce a novel Kalman-based analysis procedure that encompasses robustness towards outliers, Kalman smoothing and real-time conversion from non-uniformly sampled inputs to a constant output rate. These features have been mostly treated independently, so that not all of their benefits could be exploited at the same time. Here, we present a coherent analysis procedure that combines the aforementioned features and their benefits. To facilitate utilization of the proposed methodology and to ensure optimal performance, we also introduce a procedure to calculate all necessary parameters. Thereby, we substantially expand the versatility of one of the most widely-used filtering approaches, taking full advantage of its most prevalent extensions. The applicability and superior performance of the proposed methods are demonstrated using simulated and real data. The possible areas of applications for the presented analysis procedure range from movement analysis over medical imaging, brain-computer interfaces to robot navigation or meteorological studies. PMID:25734647
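
    The features combined here (outlier robustness, smoothing, conversion of non-uniformly sampled inputs) all build on the standard Kalman recursion. A minimal 1-D constant-velocity filter with a simple innovation-gating step for outliers, shown as an illustration of the building blocks rather than the authors' method:

    ```python
    import numpy as np

    def kalman_constant_velocity(times, measurements, q=1.0, r=0.5, gate=3.0):
        """1-D constant-velocity Kalman filter over non-uniformly sampled data.
        Measurements whose innovation exceeds `gate` standard deviations are skipped."""
        x = np.array([measurements[0], 0.0])        # state: [position, velocity]
        P = np.eye(2)
        out = []
        for k in range(1, len(times)):
            dt = times[k] - times[k - 1]
            F = np.array([[1.0, dt], [0.0, 1.0]])
            Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
            x, P = F @ x, F @ P @ F.T + Q           # predict
            H = np.array([[1.0, 0.0]])
            S = (H @ P @ H.T).item() + r            # innovation variance
            innov = measurements[k] - (H @ x).item()
            if abs(innov) <= gate * np.sqrt(S):     # gate out gross outliers
                K = (P @ H.T / S).ravel()
                x = x + K * innov
                P = P - np.outer(K, H @ P)
            out.append(x[0])
        return np.array(out)

    t = np.cumsum(np.random.default_rng(2).uniform(0.05, 0.2, 100))   # non-uniform sample times
    z = np.sin(t) + np.random.default_rng(3).normal(0, 0.1, 100)
    z[40] += 5.0                                                      # inject one outlier
    print(kalman_constant_velocity(t, z)[:5])
    ```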

  6. Using the Entrustable Professional Activities Framework in the Assessment of Procedural Skills.

    PubMed

    Pugh, Debra; Cavalcanti, Rodrigo B; Halman, Samantha; Ma, Irene W Y; Mylopoulos, Maria; Shanks, David; Stroud, Lynfa

    2017-04-01

    The entrustable professional activity (EPA) framework has been identified as a useful approach to assessment in competency-based education. To apply an EPA framework for assessment, essential skills necessary for entrustment to occur must first be identified. Using an EPA framework, our study sought to (1) define the essential skills required for entrustment for 7 bedside procedures expected of graduates of Canadian internal medicine (IM) residency programs, and (2) develop rubrics for the assessment of these procedural skills. An initial list of essential skills was defined for each procedural EPA by focus groups of experts at 4 academic centers using the nominal group technique. These lists were subsequently vetted by representatives from all Canadian IM training programs through a web-based survey. Consensus (more than 80% agreement) about inclusion of each item was sought using a modified Delphi exercise. Qualitative survey data were analyzed using a framework approach to inform final assessment rubrics for each procedure. Initial lists of essential skills for procedural EPAs ranged from 10 to 24 items. A total of 111 experts completed the national survey. After 2 iterations, consensus was reached on all items. Following qualitative analysis, final rubrics were created, which included 6 to 10 items per procedure. These EPA-based assessment rubrics represent a national consensus by Canadian IM clinician educators. They provide a practical guide for the assessment of procedural skills in a competency-based education model, and a robust foundation for future research on their implementation and evaluation.

  7. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    PubMed Central

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have a profound impact on the subsequent gene differential expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performance of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample size. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and multiple testing procedures (MTPs) under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114

  8. Access to Care Under Physician Payment Reform: A Physician-Based Analysis

    PubMed Central

    Meadow, Ann

    1995-01-01

    This article reports physician-based measures of access to care during the 3 years surrounding the 1989 physician payment reforms. Analysis was facilitated by a new system of physician identifiers in Medicare claims. Access measures include caseload per physician and related measures of the demographic composition of physicians' clientele, the proportion of physicians performing surgical and other procedures, and the assignment rate. The caseload and assignment measures were stable or improving over time, suggesting that reforms did not harm access. Procedure performance rates tended to decline between 1992 and 1993, but reductions were inversely related to the estimated fee changes, and several may be explainable by other factors. PMID:10172615

  9. Computation of laminar heat transfer from gaseous plasmas in electromagnetic fields

    NASA Technical Reports Server (NTRS)

    Bose, T. K.

    1972-01-01

    A heat transfer analysis procedure is presented for a two-temperature gaseous plasma. The analysis is based on laminar flow of a singly ionized, quasineutral plasma with variable properties. A sheath analysis is described for species in an accelerating field, in a decelerating field, emitted from the wall, and recombining at the wall.

  10. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    PubMed

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  11. Estimation of the behavior factor of existing RC-MRF buildings

    NASA Astrophysics Data System (ADS)

    Vona, Marco; Mastroberti, Monica

    2018-01-01

    In recent years, several research groups have studied a new generation of analysis methods for seismic response assessment of existing buildings. Nevertheless, many important developments are still needed in order to define more reliable and effective assessment procedures. Moreover, regarding existing buildings, it should be highlighted that, due to the low knowledge level, linear elastic analysis is the only analysis method allowed. The same codes (such as NTC2008, EC8) consider the linear dynamic analysis with behavior factor as the reference method for the evaluation of seismic demand. This type of analysis is based on a linear-elastic structural model subject to a design spectrum, obtained by reducing the elastic spectrum through a behavior factor. The behavior factor (reduction factor or q factor in some codes) is used to reduce the elastic spectrum ordinate or the forces obtained from a linear analysis in order to take into account the non-linear structural capacities. The behavior factors should be defined based on several parameters that influence the seismic nonlinear capacity, such as mechanical material characteristics, structural system, irregularity and design procedures. In practical applications, there is still an evident lack of detailed rules and accurate behavior factor values adequate for existing buildings. In this work, some investigations of the seismic capacity of the main existing RC-MRF building types have been carried out. In order to make a correct evaluation of the seismic force demand, actual behavior factor values coherent with a force-based seismic safety assessment procedure have been proposed and compared with the values reported in the Italian seismic code, NTC08.
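
    The force-reduction step itself is simple: the design spectral ordinate is the elastic ordinate divided by the behavior factor, usually subject to a code-specific lower bound. A minimal sketch of that reduction; the lower-bound fraction, ground acceleration, spectral ordinates, and q value below are illustrative assumptions, not values from the study:

    ```python
    def design_spectrum(elastic_ordinates, q, floor_fraction=0.2, ag=0.25):
        """Reduce elastic spectral accelerations Se(T) by a behavior factor q.
        A lower bound of floor_fraction*ag mirrors the kind of minimum design
        ordinate found in codes such as EC8 (values here are illustrative)."""
        return [max(se / q, floor_fraction * ag) for se in elastic_ordinates]

    Se = [0.62, 0.75, 0.75, 0.50, 0.31]      # elastic ordinates in g at a few periods (illustrative)
    print(design_spectrum(Se, q=3.0))        # q = 3.0: a plausible value for an existing RC-MRF frame
    ```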

  12. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images.

    PubMed

    van der Laak, Jeroen A W M; Dijkman, Henry B P M; Pahlplatz, Martin M M

    2006-03-01

    The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000 x to 200,000 x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy.
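
    The automated step, finding the dominant spatial frequency of the replica's line pattern from the image power spectrum, can be sketched in a few lines of numpy. The detector pixel size and replica spacing below are hypothetical values used only to show how the measured spacing converts to a magnification:

    ```python
    import numpy as np

    def line_spacing_pixels(image):
        """Dominant line spacing (in pixels) from the power spectrum of the
        column-averaged profile of a line-replica image."""
        profile = image.mean(axis=0) - image.mean()          # remove the mean before the FFT
        power = np.abs(np.fft.rfft(profile)) ** 2
        freqs = np.fft.rfftfreq(profile.size)                # cycles per pixel
        peak = np.argmax(power[1:]) + 1                      # skip the residual zero-frequency bin
        return 1.0 / freqs[peak]

    # Synthetic replica image: vertical lines every 24 pixels, plus noise.
    rng = np.random.default_rng(0)
    x = np.arange(1024)
    img = np.tile(np.sin(2 * np.pi * x / 24.0), (256, 1)) + rng.normal(0, 0.3, (256, 1024))

    spacing_px = line_spacing_pixels(img)
    camera_pixel_size_um = 15.0          # hypothetical detector pixel size
    replica_spacing_nm = 462.9           # assumed known line spacing of the replica
    magnification = spacing_px * camera_pixel_size_um * 1e3 / replica_spacing_nm
    print(round(spacing_px, 1), round(magnification))
    ```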

  13. New microfluidic-based sampling procedure for overcoming the hematocrit problem associated with dried blood spot analysis.

    PubMed

    Leuthold, Luc Alexis; Heudi, Olivier; Déglon, Julien; Raccuglia, Marc; Augsburger, Marc; Picard, Franck; Kretz, Olivier; Thomas, Aurélien

    2015-02-17

    Hematocrit (Hct) is one of the most critical issues associated with the bioanalytical methods used for dried blood spot (DBS) sample analysis. Because Hct determines the viscosity of blood, it may affect the spreading of blood onto the filter paper. Hence, accurate quantitative data can only be obtained if the size of the paper filter extracted contains a fixed blood volume. We describe for the first time a microfluidic-based sampling procedure to enable accurate blood volume collection on commercially available DBS cards. The system allows the collection of a controlled volume of blood (e.g., 5 or 10 μL) within several seconds. Reproducibility of the sampling volume was examined in vivo on capillary blood by quantifying caffeine and paraxanthine on 5 different extracted DBS spots at two different time points and in vitro with a test compound, Mavoglurant, on 10 different spots at two Hct levels. Entire spots were extracted. In addition, the accuracy and precision (n = 3) data for the Mavoglurant quantitation in blood with Hct levels between 26% and 62% were evaluated. The interspot precision data were below 9.0%, which was equivalent to that of a manually spotted volume with a pipet. No Hct effect was observed in the quantitative results obtained for Hct levels from 26% to 62%. These data indicate that our microfluidic-based sampling procedure is accurate and precise and that the analysis of Mavoglurant is not affected by the Hct values. This provides a simple procedure for DBS sampling with a fixed volume of capillary blood, which could eliminate the recurrent Hct issue linked to DBS sample analysis.

  14. Behavior analytic approaches to problem behavior in intellectual disabilities.

    PubMed

    Hagopian, Louis P; Gregory, Meagan K

    2016-03-01

    The purpose of the current review is to summarize recent behavior analytic research on problem behavior in individuals with intellectual disabilities. We have focused our review on studies published from 2013 to 2015, but also included earlier studies that were relevant. Behavior analytic research on problem behavior continues to focus on the use and refinement of functional behavioral assessment procedures and function-based interventions. During the review period, a number of studies reported on procedures aimed at making functional analysis procedures more time efficient. Behavioral interventions continue to evolve, and there were several larger scale clinical studies reporting on multiple individuals. There was increased attention on the part of behavioral researchers to develop statistical methods for analysis of within subject data and continued efforts to aggregate findings across studies through evaluative reviews and meta-analyses. Findings support continued utility of functional analysis for guiding individualized interventions and for classifying problem behavior. Modifications designed to make functional analysis more efficient relative to the standard method of functional analysis were reported; however, these require further validation. Larger scale studies on behavioral assessment and treatment procedures provided additional empirical support for effectiveness of these approaches and their sustainability outside controlled clinical settings.

  15. Cognition and procedure representational requirements for predictive human performance models

    NASA Technical Reports Server (NTRS)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that described procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulative and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods including procedural backtracking with concurrent search, temporal reasoning, and constraint checking for partial ordering of procedures. Finally, the representation is being linked to models of human decision making processes that include heuristic, propositional and prescriptive judgement models that are sensitive to the procedural content in which the valuative functions are being performed.

  16. Design of Energy Storage Reactors for Dc-To-Dc Converters. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Chen, D. Y.

    1975-01-01

    Two methodical approaches to the design of energy-storage reactors for a group of widely used dc-to-dc converters are presented. One of these approaches is based on a steady-state time-domain analysis of piecewise-linearized circuit models of the converters, while the other approach is based on an analysis of the same circuit models, but from an energy point of view. The design procedure developed from the first approach includes a search through a stored data file of magnetic core characteristics and results in a list of usable reactor designs which meet a particular converter's requirements. Because of the complexity of this procedure, a digital computer is usually used to implement the design algorithm. The second approach, based on a study of the storage and transfer of energy in the magnetic reactors, leads to a straightforward design procedure which can be implemented with hand calculations. An equation to determine the lower-bound volume of workable cores for given converter design specifications is derived. Using this computed lower-bound volume, a comparative evaluation of various converter configurations is presented.

  17. Availability Analysis of Dual Mode Systems

    DOT National Transportation Integrated Search

    1974-04-01

    The analytical procedures presented define a method of evaluating the effects of failures in a complex dual-mode system based on a worst case steady-state analysis. The computed result is an availability figure of merit and not an absolute prediction...

  18. An analysis of transportation planning effectiveness

    DOT National Transportation Integrated Search

    1977-07-01

    The report documents a novel methodology and analysis procedure for measuring a program's effect, and it is based on data from case studies of a representative group of twenty urban areas, conducted during 1976, which are reported in a companion repo...

  19. Application of a faith-based integration tool to assess mental and physical health interventions.

    PubMed

    Saunders, Donna M; Leak, Jean; Carver, Monique E; Smith, Selina A

    2017-01-01

    To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed.

  20. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    NASA Astrophysics Data System (ADS)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different management decisions. Our research results indicate that the (often large) observed differences between MPN and CFU values for the same water body are well within the ranges predicted by our probabilistic model. Our research also indicates that the probability of violating current water quality guidelines at specified true fecal coliform concentrations depends on the laboratory procedure used. As a result, quality-based management decisions, such as opening or closing a shellfishing area, may also depend on the laboratory procedure used.
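
    The MPN is the maximum likelihood estimate of concentration given the pattern of positive tubes across serial dilutions, assuming Poisson-distributed organisms so that P(tube positive) = 1 - exp(-c*v). A minimal sketch of that likelihood and its maximization; the 3-dilution, 5-tube design below is illustrative, not the referenced shellfish protocol:

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def mpn_mle(volumes_ml, n_tubes, n_positive):
        """Most probable number (organisms/mL) for a serial-dilution tube test."""
        def neg_log_lik(log_c):
            c = np.exp(log_c)
            p = np.clip(1.0 - np.exp(-c * volumes_ml), 1e-12, 1 - 1e-12)
            return -np.sum(n_positive * np.log(p) + (n_tubes - n_positive) * np.log(1 - p))
        res = minimize_scalar(neg_log_lik, bounds=(-10, 10), method="bounded")
        return np.exp(res.x)

    # Illustrative design: 5 tubes each at 10, 1 and 0.1 mL aliquots.
    volumes = np.array([10.0, 1.0, 0.1])
    tubes = np.array([5, 5, 5])
    positive = np.array([5, 3, 0])
    print(round(mpn_mle(volumes, tubes, positive), 2), "organisms/mL")
    ```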

  1. Brief surgical procedure code lists for outcomes measurement and quality improvement in resource-limited settings.

    PubMed

    Liu, Charles; Kayima, Peter; Riesel, Johanna; Situma, Martin; Chang, David; Firth, Paul

    2017-11-01

    The lack of a classification system for surgical procedures in resource-limited settings hinders outcomes measurement and reporting. Existing procedure coding systems are prohibitively large and expensive to implement. We describe the creation and prospective validation of 3 brief procedure code lists applicable in low-resource settings, based on analysis of surgical procedures performed at Mbarara Regional Referral Hospital, Uganda's second largest public hospital. We reviewed operating room logbooks to identify all surgical operations performed at Mbarara Regional Referral Hospital during 2014. Based on the documented indication for surgery and procedure(s) performed, we assigned each operation up to 4 procedure codes from the International Classification of Diseases, 9th Revision, Clinical Modification. Coding of procedures was performed by 2 investigators, and a random 20% of procedures were coded by both investigators. These codes were aggregated to generate procedure code lists. During 2014, 6,464 surgical procedures were performed at Mbarara Regional Referral Hospital, to which we assigned 435 unique procedure codes. Substantial inter-rater reliability was achieved (κ = 0.7037). The 111 most common procedure codes accounted for 90% of all codes assigned, 180 accounted for 95%, and 278 accounted for 98%. We considered these sets of codes as 3 procedure code lists. In a prospective validation, we found that these lists described 83.2%, 89.2%, and 92.6% of surgical procedures performed at Mbarara Regional Referral Hospital during August to September of 2015, respectively. Empirically generated brief procedure code lists based on International Classification of Diseases, 9th Revision, Clinical Modification can be used to classify almost all surgical procedures performed at a Ugandan referral hospital. Such a standardized procedure coding system may enable better surgical data collection for administration, research, and quality improvement in resource-limited settings. Copyright © 2017 Elsevier Inc. All rights reserved.
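
    Constructing the brief lists is essentially a cumulative-coverage calculation over the code frequency distribution: sort codes by how often they were assigned and count how many are needed to reach each target share. A short sketch of that step with made-up counts (not the Mbarara data):

    ```python
    from collections import Counter

    def codes_for_coverage(code_counts, targets=(0.90, 0.95, 0.98)):
        """Smallest number of most-frequent codes whose assignments cover each target share."""
        counts = sorted(code_counts.values(), reverse=True)
        total = sum(counts)
        results, running = {}, 0
        needed = iter(sorted(targets))
        target = next(needed)
        for i, c in enumerate(counts, start=1):
            running += c
            while running / total >= target:
                results[target] = i
                target = next(needed, None)
                if target is None:
                    return results
        return results

    # Hypothetical frequency distribution over 435 assigned codes.
    example = Counter({f"code{i:03d}": max(1, 500 // i) for i in range(1, 436)})
    print(codes_for_coverage(example))
    ```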

  2. An improved procedure for the validation of satellite-based precipitation estimates

    NASA Astrophysics Data System (ADS)

    Tang, Ling; Tian, Yudong; Yan, Fang; Habib, Emad

    2015-09-01

    The objective of this study is to propose and test a new procedure to improve the validation of remote-sensing, high-resolution precipitation estimates. Our recent studies show that many conventional validation measures do not accurately capture the unique error characteristics in precipitation estimates, and thus do not adequately inform data producers and users. The proposed new validation procedure has two steps: 1) an error decomposition approach to separate the total retrieval error into three independent components: hit error, false precipitation and missed precipitation; and 2) the hit error is further analyzed based on a multiplicative error model. In the multiplicative error model, the error features are captured by three model parameters. In this way, the multiplicative error model separates systematic and random errors, leading to more accurate quantification of the uncertainties. The proposed procedure is used to quantitatively evaluate the recent two versions (Version 6 and 7) of TRMM's Multi-sensor Precipitation Analysis (TMPA) real-time and research product suite (3B42 and 3B42RT) for seven years (2005-2011) over the continental United States (CONUS). The gauge-based National Centers for Environmental Prediction (NCEP) Climate Prediction Center (CPC) near-real-time daily precipitation analysis is used as the reference. In addition, the radar-based NCEP Stage IV precipitation data are also model-fitted to verify the effectiveness of the multiplicative error model. The results show that winter total bias is dominated by the missed precipitation over the west coastal areas and the Rocky Mountains, and the false precipitation over large areas in the Midwest. The summer total bias is largely coming from the hit bias in the Central US. Meanwhile, the new version (V7) tends to produce more rainfall at higher rain rates, which moderates the significant underestimation exhibited in the previous V6 products. Moreover, the error analysis from the multiplicative error model provides a clear and concise picture of the systematic and random errors, with both versions of 3B42RT showing higher errors, to varying degrees, than their research (post-real-time) counterparts. The new V7 algorithm shows obvious improvements in reducing random errors in both winter and summer seasons, compared to its predecessor V6. Stage IV, as expected, surpasses the satellite-based datasets in all the metrics over CONUS. Based on the results, we recommend the new procedure be adopted for routine validation of satellite-based precipitation datasets, and we expect the procedure will work effectively for higher resolution data to be produced in the Global Precipitation Measurement (GPM) era.
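
    Step 1 of the procedure splits the total error into hit, missed, and false components using a rain/no-rain threshold, and step 2 fits a multiplicative (log-linear) model to the hits. A compact sketch of both steps on synthetic data; the threshold, error parameters, and rainfall distribution below are assumptions for illustration, not the TMPA/CPC analysis itself:

    ```python
    import numpy as np

    def decompose_errors(sat, ref, threshold=0.1):
        """Split total bias (mm) into hit bias, missed and false precipitation."""
        hit = (sat >= threshold) & (ref >= threshold)
        miss = (sat < threshold) & (ref >= threshold)
        false = (sat >= threshold) & (ref < threshold)
        return {
            "hit_bias": np.sum(sat[hit] - ref[hit]),
            "missed": -np.sum(ref[miss]),        # negative contribution to total bias
            "false": np.sum(sat[false]),
        }

    def fit_multiplicative(sat, ref, threshold=0.1):
        """Fit ln(sat) = ln(alpha) + beta*ln(ref) + eps on hit pixels;
        returns (alpha, beta, residual standard deviation)."""
        hit = (sat >= threshold) & (ref >= threshold)
        X = np.column_stack([np.ones(hit.sum()), np.log(ref[hit])])
        coeffs, *_ = np.linalg.lstsq(X, np.log(sat[hit]), rcond=None)
        resid = np.log(sat[hit]) - X @ coeffs
        return np.exp(coeffs[0]), coeffs[1], resid.std()

    rng = np.random.default_rng(0)
    ref = rng.gamma(0.5, 4.0, 5000)                                   # synthetic "gauge" rainfall (mm)
    sat = 0.9 * ref ** 1.05 * np.exp(rng.normal(0, 0.3, ref.size))    # multiplicative retrieval error
    sat[rng.random(ref.size) < 0.05] = 0.0                            # some misses
    print(decompose_errors(sat, ref))
    print(fit_multiplicative(sat, ref))
    ```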

  3. A close examination of double filtering with fold change and t test in microarray analysis

    PubMed Central

    2009-01-01

    Background: Many researchers use the double filtering procedure with fold change and t test to identify differentially expressed genes, in the hope that the double filtering will provide extra confidence in the results. Due to its simplicity, the double filtering procedure has been popular with applied researchers despite the development of more sophisticated methods. Results: This paper, for the first time to our knowledge, provides theoretical insight on the drawback of the double filtering procedure. We show that fold change assumes all genes to have a common variance while t statistic assumes gene-specific variances. The two statistics are based on contradicting assumptions. Under the assumption that gene variances arise from a mixture of a common variance and gene-specific variances, we develop the theoretically most powerful likelihood ratio test statistic. We further demonstrate that the posterior inference based on a Bayesian mixture model and the widely used significance analysis of microarrays (SAM) statistic are better approximations to the likelihood ratio test than the double filtering procedure. Conclusion: We demonstrate through hypothesis testing theory, simulation studies and real data examples, that well constructed shrinkage testing methods, which can be united under the mixture gene variance assumption, can considerably outperform the double filtering procedure. PMID:19995439
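
    The double filtering rule being examined is easy to state: keep genes whose absolute log2 fold change exceeds one cutoff and whose t-test p-value falls below another. A minimal sketch of that procedure on a synthetic expression matrix, with illustrative cutoffs (this is the method under critique, not the paper's likelihood ratio alternative):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_genes, n_per_group = 1000, 6
    # Synthetic log2 expression: the first 50 genes are truly up-regulated in group B.
    a = rng.normal(0.0, 1.0, (n_genes, n_per_group))
    b = rng.normal(0.0, 1.0, (n_genes, n_per_group))
    b[:50] += 1.5

    def double_filter(a, b, fc_cutoff=1.0, p_cutoff=0.05):
        """Genes passing both an absolute log2 fold-change cutoff and a two-sample t-test."""
        log_fc = b.mean(axis=1) - a.mean(axis=1)          # difference of means on the log2 scale
        t, p = stats.ttest_ind(b, a, axis=1)
        return np.flatnonzero((np.abs(log_fc) >= fc_cutoff) & (p <= p_cutoff))

    selected = double_filter(a, b)
    print(len(selected), "genes pass both filters;", np.sum(selected < 50), "are true positives")
    ```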

  4. Quest: The Interactive Test Analysis System.

    ERIC Educational Resources Information Center

    Adams, Raymond J.; Khoo, Siek-Toon

    The Quest program offers a comprehensive test and questionnaire analysis environment by providing a data analyst (a computer program) with access to the most recent developments in Rasch measurement theory, as well as a range of traditional analysis procedures. This manual helps the user use Quest to construct and validate variables based on…

  5. Parallel-vector computation for linear structural analysis and non-linear unconstrained optimization problems

    NASA Technical Reports Server (NTRS)

    Nguyen, D. T.; Al-Nasra, M.; Zhang, Y.; Baddourah, M. A.; Agarwal, T. K.; Storaasli, O. O.; Carmona, E. A.

    1991-01-01

    Several parallel-vector computational improvements to the unconstrained optimization procedure are described which speed up the structural analysis-synthesis process. A fast parallel-vector Choleski-based equation solver, pvsolve, is incorporated into the well-known SAP-4 general-purpose finite-element code. The new code, denoted PV-SAP, is tested for static structural analysis. Initial results on a four processor CRAY 2 show that using pvsolve reduces the equation solution time by a factor of 14-16 over the original SAP-4 code. In addition, parallel-vector procedures for the Golden Block Search technique and the BFGS method are developed and tested for nonlinear unconstrained optimization. A parallel version of an iterative solver and the pvsolve direct solver are incorporated into the BFGS method. Preliminary results on nonlinear unconstrained optimization test problems, using pvsolve in the analysis, show excellent parallel-vector performance indicating that these parallel-vector algorithms can be used in a new generation of finite-element based structural design/analysis-synthesis codes.
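
    The core operation of a Choleski-based equation solver, factoring a symmetric positive-definite stiffness matrix once and back-substituting for displacements, corresponds conceptually to the standard Cholesky solve. A small scipy illustration of that step on a dense toy system (not the parallel-vector pvsolve implementation):

    ```python
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    rng = np.random.default_rng(0)

    # Toy symmetric positive-definite "stiffness" matrix K and load vector f.
    A = rng.normal(size=(200, 200))
    K = A @ A.T + 200 * np.eye(200)
    f = rng.normal(size=200)

    # Factor once, then reuse the factorization for repeated load cases.
    factor = cho_factor(K)
    u = cho_solve(factor, f)
    print("residual norm:", np.linalg.norm(K @ u - f))
    ```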

  6. Prenatal nutrition services: a cost analysis.

    PubMed

    Splett, P L; Caldwell, H M; Holey, E S; Alton, I R

    1987-02-01

    The scarcity of information about program costs in relation to quality care prompted a cost analysis of prenatal nutrition services in two urban settings. This study examined prenatal nutrition services in terms of total costs, per client costs, per visit costs, and cost per successful outcome. Standard cost-accounting principles were used. Outcome measures, based on written quality assurance criteria, were audited using standard procedures. In the studied programs, nutrition services were delivered for a per client cost of $72 in a health department setting and $121 in a hospital-based prenatal care program. Further analysis illustrates that total and per client costs can be misleading and that costs related to successful outcomes are much higher. The three levels of cost analysis reported provide baseline data for quantifying the costs of providing prenatal nutrition services to healthy pregnant women. Cost information from these cost analysis procedures can be used to guide adjustments in service delivery to assure successful outcomes of nutrition care. Accurate cost and outcome data are necessary prerequisites to cost-effectiveness and cost-benefit studies.

  7. Design and evaluation of Continuous Descent Approach as a fuel-saving procedure

    NASA Astrophysics Data System (ADS)

    Jin, Li

    Continuous Descent Approach (CDA), which is among the key concepts of the Next Generation Air Transportation System (NextGen), is a fuel-economical procedure, but it requires increased separation to accommodate spacing uncertainties among arriving aircraft. This negative impact is often overlooked when benefits are estimated. Although a considerable amount of research has been devoted to estimating the potential fuel savings of CDA, few studies have attempted to explain the fuel savings observed in field tests from an analytical point of view. This research gives insight into the reasons why CDA saves fuel, and a number of design guidelines for CDA procedures are derived. The analytical relationship between speed, altitude, and time-cumulative fuel consumption is derived from the Base of Aircraft Data (BADA) Total Energy Model. Theoretical analysis implies that the speed profile has an impact on fuel consumption in the terminal area as substantial as, if not greater than, the vertical profile. In addition, CDA is not intrinsically a fuel-saving procedure: whether CDA saves fuel is contingent upon whether the speed profile is properly designed. Based on this model, the potential fuel savings due to CDA at San Francisco International Airport were estimated, and the accuracy of this estimation was analyzed. Possible uncertainties in this fuel estimation primarily result from the modeled CDA procedure and the inaccuracy of BADA. This thesis also investigates the fuel savings due to CDAs under high traffic conditions, counting not only the savings from optimal vertical profiles but also the extra fuel burn resulting from the increased separations. The simulated CDA traffic is based on radar track data and is deconflicted by a scheduling algorithm that targets minimized delays. The delays are absorbed by speed changes and path stretching, accounting for the air traffic control interventions entailed by CDAs. Fuel burn statistics calculated with the BADA Total Energy Model reveal that the CDAs save on average 171.87 kg per arrival, although this figure is reduced by delay absorption. The savings diminish as the arrival demand increases and can even become negative under large delays. The throughput analysis demonstrated that the impact of CDA on airport capacity is insignificant and tolerable. Atlanta International Airport was used as the testbed for sensitivity analysis, and the New York Metroplex was used as the testbed for throughput analysis.
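
    The Total Energy Model reasoning can be sketched numerically. The toy Python example below integrates fuel burn along two descent profiles from the energy balance (T - D)*V = m*g*hdot + m*V*vdot; the aircraft mass, drag polar, wing area, and thrust-specific fuel consumption are invented placeholders rather than BADA coefficients, so only the qualitative comparison between a continuous and a stepped descent is meaningful.

      import numpy as np

      g, m, wing_area = 9.81, 60000.0, 122.0        # mass and wing area are assumed

      def drag(v, h):
          """Toy parabolic drag polar -- a placeholder, not a BADA model."""
          rho = 1.225 * np.exp(-h / 8500.0)
          cl = 2.0 * m * g / (rho * wing_area * v**2)
          cd = 0.025 + 0.045 * cl**2
          return 0.5 * rho * wing_area * v**2 * cd

      def fuel_burn(time_s, v_tas, alt_m):
          """Integrate fuel from the energy balance (T - D)*V = m*g*hdot + m*V*vdot."""
          fuel = 0.0
          for i in range(1, len(time_s)):
              dt = time_s[i] - time_s[i - 1]
              v, h = v_tas[i], alt_m[i]
              vdot = (v_tas[i] - v_tas[i - 1]) / dt
              hdot = (alt_m[i] - alt_m[i - 1]) / dt
              thrust = max(drag(v, h) + (m * g * hdot + m * v * vdot) / v, 0.0)  # idle floor
              tsfc = 0.6e-5 * (1.0 + v / 300.0)     # toy thrust-specific fuel consumption
              fuel += tsfc * thrust * dt
          return fuel

      # Same speed profile, two vertical profiles: continuous vs. stepped descent.
      t = np.arange(0.0, 1200.0, 10.0)
      v = np.linspace(220.0, 130.0, t.size)
      cda_alt = np.linspace(10000.0, 300.0, t.size)
      stepped_alt = np.interp(t, [0.0, 500.0, 800.0, t[-1]], [10000.0, 5000.0, 5000.0, 300.0])
      print("CDA fuel (kg):", round(fuel_burn(t, v, cda_alt), 1),
            "stepped fuel (kg):", round(fuel_burn(t, v, stepped_alt), 1))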

  8. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not. Second, one proceeds via inference based on parametric assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with a minimal cluster size. Based on a simulation study and real data, we find that the two-step procedure with a minimal cluster size yields the most stable results, followed by familywise error rate correction. FDR correction yields the most variable results, for both permutation-based and parametric inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
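
    A small Python sketch of the three multiplicity-handling strategies compared above (Bonferroni familywise error control, Benjamini-Hochberg FDR, and an uncorrected threshold with a minimal cluster size), applied to a synthetic vector of voxel-level p-values; the 1-D cluster definition and all thresholds are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(3)
      p = rng.uniform(size=5000)
      p[1000:1040] = rng.uniform(0.0, 0.001, 40)   # a block of truly active voxels
      alpha = 0.05

      # 1) Familywise error rate control (Bonferroni).
      fwer_mask = p < alpha / p.size

      # 2) Benjamini-Hochberg FDR control.
      order = np.argsort(p)
      ranked = p[order]
      thresh = alpha * (np.arange(1, p.size + 1) / p.size)
      below = np.nonzero(ranked <= thresh)[0]
      fdr_mask = np.zeros(p.size, dtype=bool)
      if below.size:
          fdr_mask[order[: below[-1] + 1]] = True

      # 3) Uncorrected threshold plus a minimal cluster size (runs of length >= 10).
      raw_mask = p < 0.001
      cluster_mask = np.zeros_like(raw_mask)
      start = None
      for i, active in enumerate(list(raw_mask) + [False]):
          if active and start is None:
              start = i
          elif not active and start is not None:
              if i - start >= 10:
                  cluster_mask[start:i] = True
              start = None

      print(fwer_mask.sum(), fdr_mask.sum(), cluster_mask.sum())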

  9. Estimation procedures to measure and monitor failure rates of components during thermal-vacuum testing

    NASA Technical Reports Server (NTRS)

    Williams, R. E.; Kruger, R.

    1980-01-01

    Estimation procedures are described for measuring component failure rates, for comparing the failure rates of two different groups of components, and for formulating confidence intervals for testing hypotheses (based on failure rates) that the two groups perform similarly or differently. Appendix A contains an example of an analysis in which these methods are applied to investigate the characteristics of two groups of spacecraft components. The estimation procedures are adaptable to system level testing and to monitoring failure characteristics in orbit.
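
    The flavor of these estimation procedures can be illustrated with a short Python sketch: exact chi-square confidence intervals for Poisson failure rates and a conditional binomial test of whether two component groups fail at the same rate. The failure counts and test hours are invented, and the specific interval and test choices are assumptions rather than the report's exact formulas.

      import numpy as np
      from scipy import stats

      failures_a, hours_a = 12, 4800.0   # group A: observed failures, component-hours
      failures_b, hours_b = 5, 5200.0    # group B

      def poisson_rate_ci(k, exposure, conf=0.95):
          """Exact (chi-square based) confidence interval for a Poisson rate."""
          a = 1.0 - conf
          lo = 0.0 if k == 0 else stats.chi2.ppf(a / 2, 2 * k) / (2 * exposure)
          hi = stats.chi2.ppf(1 - a / 2, 2 * (k + 1)) / (2 * exposure)
          return lo, hi

      # Conditional test of equal rates: given the total failure count, the count
      # in group A is binomial with p = hours_a / (hours_a + hours_b) under H0.
      p_null = hours_a / (hours_a + hours_b)
      p_value = stats.binomtest(failures_a, failures_a + failures_b, p_null).pvalue

      print(failures_a / hours_a, poisson_rate_ci(failures_a, hours_a))
      print(failures_b / hours_b, poisson_rate_ci(failures_b, hours_b))
      print("two-sided p-value for equal failure rates:", p_value)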

  10. Automation of scour analysis at Louisiana bridge sites : final report.

    DOT National Transportation Integrated Search

    1988-12-01

    The computerized system for the organization, analysis, and display of field collected scour data is described. This system will enhance the current manual procedure of accomplishing these tasks. The system accepts input from the user, and based on u...

  11. Inverse finite-size scaling for high-dimensional significance analysis

    NASA Astrophysics Data System (ADS)

    Xu, Yingying; Puranen, Santeri; Corander, Jukka; Kabashima, Yoshiyuki

    2018-06-01

    We propose an efficient procedure for significance determination in high-dimensional dependence learning based on surrogate data testing, termed inverse finite-size scaling (IFSS). The IFSS method is based on our discovery of a universal scaling property of random matrices which enables inference about signal behavior from much smaller scale surrogate data than the dimensionality of the original data. As a motivating example, we demonstrate the procedure for ultra-high-dimensional Potts models with on the order of 10^10 parameters. IFSS reduces the computational effort of the data-testing procedure by several orders of magnitude, making it very efficient for practical purposes. This approach thus holds considerable potential for generalization to other types of complex models.

  12. The Social Construction of "Evidence-Based" Drug Prevention Programs: A Reanalysis of Data from the Drug Abuse Resistance Education (DARE) Program

    ERIC Educational Resources Information Center

    Gorman, Dennis M.; Huber, J. Charles, Jr.

    2009-01-01

    This study explores the possibility that any drug prevention program might be considered "evidence-based" given the use of data analysis procedures that optimize the chance of producing statistically significant results by reanalyzing data from a Drug Abuse Resistance Education (DARE) program evaluation. The analysis produced a number of…

  13. Using FIESTA , an R-based tool for analysts, to look at temporal trends in forest estimates

    Treesearch

    Tracey S. Frescino; Paul L. Patterson; Elizabeth A. Freeman; Gretchen G. Moisen

    2012-01-01

    FIESTA (Forest Inventory Estimation for Analysis) is a user-friendly R package that supports the production of estimates for forest resources based on procedures from Bechtold and Patterson (2005). The package produces output consistent with current tools available for the Forest Inventory and Analysis National Program, such as FIDO (Forest Inventory Data Online) and...

  14. SU-G-IeP3-05: Effects of Image Receptor Technology and Dose Reduction Software On Radiation Dose Estimates for Fluoroscopically-Guided Interventional (FGI) Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merritt, Z; Dave, J; Eschelman, D

    Purpose: To investigate the effects of image receptor technology and dose reduction software on radiation dose estimates for the most frequently performed fluoroscopically-guided interventional (FGI) procedures at a tertiary health care center. Methods: IRB approval was obtained for retrospective analysis of FGI procedures performed in the interventional radiology suites between January-2011 and December-2015. This included procedures performed using image-intensifier (II) based systems which were subsequently replaced, flat-panel-detector (FPD) based systems which were later upgraded with ClarityIQ dose reduction software (Philips Healthcare), and a relatively new FPD system already equipped with ClarityIQ. Post procedure, technologists entered the system-reported cumulative air kerma (CAK) and kerma-area product (KAP; only KAP for II-based systems) in the RIS; these values were analyzed. Data pre-processing included correcting typographical errors and cross-verifying CAK and KAP. The most frequent high and low dose FGI procedures were identified and the corresponding CAK and KAP values were compared. Results: Out of 27,251 procedures within this time period, the most frequent high and low dose procedures were chemo/immuno-embolization (n=1967) and abscess drainage (n=1821). Mean KAP for embolization and abscess drainage procedures were 260,657, 310,304 and 94,908 mGy·cm², and 14,497, 15,040 and 6307 mGy·cm², using II-, FPD- and FPD with ClarityIQ-based systems, respectively. Statistically significant differences were observed in KAP values for embolization procedures with respect to the different systems, but for abscess drainage procedures significant differences were only noted between systems with FPD and FPD with ClarityIQ (p<0.05). Mean CAK was reduced significantly from 823 to 308 mGy and from 43 to 21 mGy for embolization and abscess drainage procedures, respectively, in transitioning to FPD systems with ClarityIQ (p<0.05). Conclusion: While transitioning from II- to FPD-based systems was not associated with dose reduction for the most frequently performed FGI procedures, substantial dose reduction was noted with the relatively newer systems and dose reduction software.

  15. Dispersive Solid Phase Extraction for the Analysis of Veterinary Drugs Applied to Food Samples: A Review

    PubMed Central

    Islas, Gabriela; Hernandez, Prisciliano

    2017-01-01

    To achieve analytical success, it is necessary to develop thorough clean-up procedures to extract analytes from the matrix. Dispersive solid phase extraction (DSPE) has been used as a pretreatment technique for the analysis of several compounds. The technique is based on the dispersion of a solid sorbent in liquid samples for the extraction, isolation, and clean-up of different analytes from complex matrices. DSPE has found a wide range of applications in several fields, and it is considered to be a selective, robust, and versatile technique. The applications of dispersive techniques in the analysis of veterinary drugs in different matrices involve magnetic sorbents, molecularly imprinted polymers, carbon-based nanomaterials, and the Quick, Easy, Cheap, Effective, Rugged, and Safe (QuEChERS) method. Techniques based on DSPE permit the minimization of additional steps such as precipitation, centrifugation, and filtration, which decreases manipulation of the sample. In this review, we describe the main procedures used for the synthesis, characterization, and application of this pretreatment technique and how it has been applied to food analysis. PMID:29181027

  16. A decision-support system for the analysis of clinical practice patterns.

    PubMed

    Balas, E A; Li, Z R; Mitchell, J A; Spencer, D C; Brent, E; Ewigman, B G

    1994-01-01

    Several studies documented substantial variation in medical practice patterns, but physicians often do not have adequate information on the cumulative clinical and financial effects of their decisions. The purpose of developing an expert system for the analysis of clinical practice patterns was to assist providers in analyzing and improving the process and outcome of patient care. The developed QFES (Quality Feedback Expert System) helps users in the definition and evaluation of measurable quality improvement objectives. Based on objectives and actual clinical data, several measures can be calculated (utilization of procedures, annualized cost effect of using a particular procedure, and expected utilization based on peer-comparison and case-mix adjustment). The quality management rules help to detect important discrepancies among members of the selected provider group and compare performance with objectives. The system incorporates a variety of data and knowledge bases: (i) clinical data on actual practice patterns, (ii) frames of quality parameters derived from clinical practice guidelines, and (iii) rules of quality management for data analysis. An analysis of practice patterns of 12 family physicians in the management of urinary tract infections illustrates the use of the system.

  17. Real-time detection of organic contamination events in water distribution systems by principal components analysis of ultraviolet spectral data.

    PubMed

    Zhang, Jian; Hou, Dibo; Wang, Ke; Huang, Pingjie; Zhang, Guangxin; Loáiciga, Hugo

    2017-05-01

    The detection of organic contaminants in water distribution systems is essential to protect public health from potentially harmful compounds resulting from accidental spills or intentional releases. Existing methods for detecting organic contaminants are based on quantitative analyses such as chemical testing and gas/liquid chromatography, which are time- and reagent-consuming and involve costly maintenance. This study proposes a novel procedure based on the discrete wavelet transform and principal component analysis for detecting organic contamination events from ultraviolet spectral data. First, the spectrum of each observation is transformed using a discrete wavelet transform with a coiflet mother wavelet to capture abrupt changes along the wavelength axis. Principal component analysis is then employed to approximate the spectra based on the captured and fused features. Hotelling's T² statistic is calculated and its significance is used to detect outliers. A contamination event alarm is triggered by sequential Bayesian analysis when outliers appear continuously over several observations. The effectiveness of the proposed procedure was tested on-line using a pilot-scale setup and experimental data.
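
    A compact Python sketch of the detection chain (coiflet wavelet features, PCA fitted on baseline spectra, Hotelling's T² with an empirical control limit, and an alarm after consecutive outliers), using the PyWavelets package for the wavelet step; the spectra, number of retained components, control limit, and run length are illustrative assumptions, and the sequential Bayesian step is simplified to a consecutive-outlier rule.

      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      wavelengths = 256
      baseline = rng.normal(1.0, 0.02, (200, wavelengths))        # clean UV spectra
      event = baseline[:20].copy()
      event[:, 120:140] += 0.15                                    # absorbance bump = contaminant

      def features(spectra):
          # Concatenate coiflet detail coefficients to capture abrupt spectral changes.
          coeffs = [np.hstack(pywt.wavedec(s, "coif1", level=3)[1:]) for s in spectra]
          return np.asarray(coeffs)

      X = features(baseline)
      mu, sd = X.mean(axis=0), X.std(axis=0) + 1e-12
      Xz = (X - mu) / sd
      U, S, Vt = np.linalg.svd(Xz, full_matrices=False)
      k = 5                                                        # retained components
      P, lam = Vt[:k].T, (S[:k] ** 2) / (len(X) - 1)

      def t2(spectra):
          scores = ((features(spectra) - mu) / sd) @ P
          return np.sum(scores**2 / lam, axis=1)

      limit = np.quantile(t2(baseline), 0.99)                      # empirical control limit
      flags = t2(event) > limit
      alarm = any(flags[i : i + 3].all() for i in range(len(flags) - 2))  # 3 outliers in a row
      print(alarm)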

  18. The Cluster Analysis of Jobs Based on Data from the Position Analysis Questionnaire (PAQ). Report No. 7.

    ERIC Educational Resources Information Center

    DeNisi, Angelo S.; McCormick, Ernest J.

    The Position Analysis Questionnaire (PAQ) is a structured job analysis procedure that provides for the analysis of jobs in terms of each of 187 job elements, these job elements being grouped into six divisions: information input, mental processes, work output, relationships with other persons, job context, and other job characteristics. Two…

  19. Probability of pregnancy after sterilization: a comparison of hysteroscopic versus laparoscopic sterilization.

    PubMed

    Gariepy, Aileen M; Creinin, Mitchell D; Smith, Kenneth J; Xu, Xiao

    2014-08-01

    To compare the expected probability of pregnancy after hysteroscopic versus laparoscopic sterilization based on available data using decision analysis. We developed an evidence-based Markov model to estimate the probability of pregnancy over 10 years after three different female sterilization procedures: hysteroscopic, laparoscopic silicone rubber band application, and laparoscopic bipolar coagulation. Parameter estimates for procedure success, probability of completing follow-up testing, and risk of pregnancy after the different sterilization procedures were obtained from published sources. In the base case analysis, at all points in time after the sterilization procedure, the initial and cumulative risk of pregnancy after sterilization is higher in women opting for hysteroscopic than either laparoscopic band or bipolar sterilization. The expected pregnancy rates per 1000 women at 1 year are 57, 7 and 3 for hysteroscopic sterilization, laparoscopic silicone rubber band application and laparoscopic bipolar coagulation, respectively. At 10 years, the cumulative pregnancy rates per 1000 women are 96, 24 and 30, respectively. Sensitivity analyses suggest that the three procedures would have an equivalent pregnancy risk of approximately 80 per 1000 women at 10 years if the probability of successful laparoscopic (band or bipolar) sterilization drops below 90% and successful coil placement on first hysteroscopic attempt increases to 98%, or if the probability of undergoing a hysterosalpingogram increases to 100%. Based on available data, the expected population risk of pregnancy is higher after hysteroscopic than laparoscopic sterilization. Consistent with existing contraceptive classification, future characterization of hysteroscopic sterilization should distinguish "perfect" and "typical" use failure rates. Pregnancy probability at 1 year and over 10 years is expected to be higher in women having hysteroscopic as compared to laparoscopic sterilization. Copyright © 2014 Elsevier Inc. All rights reserved.
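
    The cumulative-risk arithmetic behind such a Markov model can be sketched in a few lines of Python; the annual pregnancy probabilities below are invented placeholders, not the published parameter estimates, so only the mechanics of the calculation are meaningful.

      import numpy as np

      # Each year, a woman who has not yet conceived faces a method-specific
      # probability of pregnancy; the 'no pregnancy' state survives with 1 - p.
      annual_preg_prob = {
          "hysteroscopic": [0.020] + [0.005] * 9,       # assumed higher first-year risk
          "laparoscopic_band": [0.004] + [0.002] * 9,
          "laparoscopic_bipolar": [0.003] + [0.003] * 9,
      }

      def cumulative_pregnancies_per_1000(yearly_probs):
          """Cumulative expected pregnancies per 1000 women after each year."""
          not_pregnant = 1.0
          out = []
          for p in yearly_probs:
              not_pregnant *= (1.0 - p)
              out.append(1000.0 * (1.0 - not_pregnant))
          return np.array(out)

      for method, probs in annual_preg_prob.items():
          curve = cumulative_pregnancies_per_1000(probs)
          print(f"{method:22s} 1-yr {curve[0]:6.1f}  10-yr {curve[-1]:6.1f}")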

  20. Refinement procedure for the image alignment in high-resolution electron tomography.

    PubMed

    Houben, L; Bar Sadan, M

    2011-01-01

    High-resolution electron tomography from a tilt series of transmission electron microscopy images requires an accurate image alignment procedure in order to maximise the resolution of the tomogram. This is the case in particular for ultra-high resolution, where even very small misalignments between individual images can dramatically reduce the fidelity of the resultant reconstruction. A tomographic-reconstruction-based, marker-free method is proposed, which uses an iterative optimisation of the tomogram resolution. The method utilises a search algorithm that maximises the contrast in tomogram sub-volumes. Unlike conventional cross-correlation analysis, it provides the required correlation over a large tilt-angle separation and guarantees a consistent alignment of images for the full range of object tilt angles. An assessment based on experimental reconstructions shows that the marker-free procedure is competitive with reference marker-based procedures at lower resolution and yields sub-pixel accuracy even for simulated high-resolution data. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. How many records should be used in ASCE/SEI-7 ground motion scaling procedure?

    USGS Publications Warehouse

    Reyes, Juan C.; Kalkan, Erol

    2012-01-01

    U.S. national building codes refer to the ASCE/SEI-7 provisions for selecting and scaling ground motions for use in nonlinear response history analysis of structures. Because the limiting values for the number of records in ASCE/SEI-7 are based on engineering experience, this study examines the required number of records statistically, such that the scaled records provide accurate, efficient, and consistent estimates of “true” structural responses. Based on elastic–perfectly plastic and bilinear single-degree-of-freedom systems, the ASCE/SEI-7 scaling procedure is applied to 480 sets of ground motions; the number of records in these sets varies from three to ten. As compared to benchmark responses, it is demonstrated that the ASCE/SEI-7 scaling procedure is conservative if fewer than seven ground motions are employed. Utilizing seven or more randomly selected records provides a more accurate estimate of the responses. Selecting records based on their spectral shape and design spectral acceleration increases the accuracy and efficiency of the procedure.
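
    As a rough illustration of spectrum-based scaling, the Python sketch below anchors each record's response spectrum to a target spectrum at the fundamental period and then applies a common suite factor so the mean scaled spectrum stays at or above the target over 0.2*T1 to 1.5*T1. This is a simplified schematic reading for illustration only, not the ASCE/SEI-7 code text, and the spectra are synthetic.

      import numpy as np

      rng = np.random.default_rng(5)
      periods = np.linspace(0.05, 3.0, 60)
      target = 0.9 * np.exp(-periods) + 0.2                          # toy design spectrum (g)
      records = target * rng.lognormal(0.0, 0.3, (7, periods.size))  # 7 record spectra

      T1 = 1.0
      i1 = np.argmin(np.abs(periods - T1))
      per_record_factor = target[i1] / records[:, i1]                # anchor each record at T1
      scaled = records * per_record_factor[:, None]

      band = (periods >= 0.2 * T1) & (periods <= 1.5 * T1)
      suite_factor = max(np.max(target[band] / scaled[:, band].mean(axis=0)), 1.0)

      final = scaled * suite_factor
      print(per_record_factor * suite_factor)                        # final scale factors
      print(np.all(final[:, band].mean(axis=0) >= target[band] - 1e-12))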

  2. Analysis of pain and satisfaction with office-based hysteroscopic sterilization.

    PubMed

    Levie, Mark; Weiss, Gil; Kaiser, Bente; Daif, Jennifer; Chudnoff, Scott G

    2010-09-01

    To assess pain and patient satisfaction with office-based hysteroscopic sterilization. This prospective, observational study was designed to assess patient pain perception and satisfaction with office-based hysteroscopic sterilization using the Essure device (Conceptus, Mountain View, CA). Faculty practice office at an inner-city urban medical center. Women seeking hysteroscopic sterilization. Office hysteroscopic sterilization under local anesthesia. Pain assessed at the time of the procedure by a 0-10 visual scale and satisfaction by a 1-5 scale. From June 2003 to June 2006, 209 patients were recruited. The mean scores for average procedural pain, most procedural pain, and average menstrual pain were 2.6 ± 2.1, 3.3 ± 2.5, and 3.6 ± 2.6, respectively. Standardized pain scores revealed that 149 subjects (70%) experienced average pain that was less than or equal to the pain experienced with their menses. Mean satisfaction rating for the procedure was 4.7 ± 0.71. Office-based hysteroscopic sterilization performed with local anesthesia alone is well tolerated, and patients are satisfied with this method for permanent sterilization. Copyright (c) 2010 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  3. Exploratory Analysis of Survey Data for Understanding Adoption of Novel Aerospace Systems

    NASA Astrophysics Data System (ADS)

    Reddy, Lauren M.

    In order to meet the increasing demand for manned and unmanned flight, the air transportation system must constantly evolve. As new technologies or operational procedures are conceived, we must determine their effect on humans in the system. In this research, we introduce a strategy to assess how individuals or organizations would respond to a novel aerospace system. We employ the most appropriate and sophisticated exploratory analysis techniques on the survey data to generate insight and identify significant variables. We employ three different methods for eliciting views from individuals or organizations who are affected by a system: an opinion survey, a stated preference survey, and structured interviews. We conduct an opinion survey of both the general public and stakeholders in the unmanned aircraft industry to assess their knowledge, attitude, and practices regarding unmanned aircraft. We complete a statistical analysis of the multiple-choice questions using multinomial logit and multivariate probit models and conduct qualitative analysis on free-text questions. We next present a stated preference survey of the general public on the use of an unmanned aircraft package delivery service. We complete a statistical analysis of the questions using multinomial logit, ordered probit, linear regression, and negative binomial models. Finally, we discuss structured interviews conducted on stakeholders from ANSPs and airlines operating in the North Atlantic. We describe how these groups may choose to adopt a new technology (space-based ADS-B) or operational procedure (in-trail procedures). We discuss similarities and differences between the stakeholders groups, the benefits and costs of in-trail procedures and space-based ADS-B as reported by the stakeholders, and interdependencies between the groups interviewed. To demonstrate the value of the data we generated, we explore how the findings from the surveys can be used to better characterize uncertainty in the cost-benefit analysis of aerospace systems. We demonstrate how the findings from the opinion and stated preference surveys can be infused into the cost-benefit analysis of an unmanned aircraft delivery system. We also demonstrate how to apply the findings from the interviews to characterize uncertainty in the estimation of the benefits of space-based ADS-B.

  4. Patent Network Analysis and Quadratic Assignment Procedures to Identify the Convergence of Robot Technologies

    PubMed Central

    Lee, Woo Jin; Lee, Won Kyung

    2016-01-01

    Because of the remarkable developments in robotics in recent years, technological convergence has been active in this area. We focused on finding patterns of convergence within robot technology using network analysis of patents in both the USPTO and KIPO. To identify the variables that affect convergence, we used quadratic assignment procedures (QAP). From our analysis, we observed the patent network ecology related to convergence and found technologies that have great potential to converge with other robotics technologies. The results of our study are expected to contribute to setting up convergence-based R&D policies for robotics, which can lead to new innovation. PMID:27764196
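
    A minimal Python sketch of a quadratic assignment procedure test: the observed correlation between two relationship matrices is compared against correlations obtained after jointly permuting the rows and columns of one matrix. The matrices, permutation count, and two-sided p-value convention are illustrative assumptions rather than the authors' data or settings.

      import numpy as np

      rng = np.random.default_rng(6)
      n = 40
      A = rng.random((n, n)); A = (A + A.T) / 2; np.fill_diagonal(A, 0)  # e.g. co-citation ties
      B = 0.5 * A + 0.5 * rng.random((n, n)); B = (B + B.T) / 2; np.fill_diagonal(B, 0)

      mask = ~np.eye(n, dtype=bool)                    # off-diagonal entries only

      def mat_corr(x, y):
          return np.corrcoef(x[mask], y[mask])[0, 1]

      observed = mat_corr(A, B)
      perm_stats = []
      for _ in range(2000):
          idx = rng.permutation(n)
          perm_stats.append(mat_corr(A, B[np.ix_(idx, idx)]))  # permute rows & columns together
      perm_stats = np.array(perm_stats)
      p_value = np.mean(np.abs(perm_stats) >= abs(observed))

      print(observed, p_value)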

  5. Hybrid Data Assimilation without Ensemble Filtering

    NASA Technical Reports Server (NTRS)

    Todling, Ricardo; Akkraoui, Amal El

    2014-01-01

    The Global Modeling and Assimilation Office is preparing to upgrade its three-dimensional variational system to a hybrid approach in which the ensemble is generated using a square-root ensemble Kalman filter (EnKF) and the variational problem is solved using the Grid-point Statistical Interpolation system. As in most EnKF applications, we found it necessary to employ a combination of multiplicative and additive inflations to compensate for sampling and modeling errors, respectively, and to maintain the small-member ensemble solution close to the variational solution; we also found it necessary to re-center the members of the ensemble about the variational analysis. During tuning of the filter we found re-centering and additive inflation to play a considerably larger role than expected, particularly in a dual-resolution context when the variational analysis is run at a larger resolution than the ensemble. This led us to consider a hybrid strategy in which the members of the ensemble are generated by simply converting the variational analysis to the resolution of the ensemble and applying additive inflation, thus bypassing the EnKF. Comparisons of this so-called filter-free hybrid procedure with an EnKF-based hybrid procedure and a control non-hybrid, traditional scheme show both hybrid strategies to provide equally significant improvement over the control; more interestingly, the filter-free procedure was found to give qualitatively similar results to the EnKF-based procedure.
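
    The filter-free idea lends itself to a very small sketch: coarsen the variational analysis to the ensemble resolution, add additive-inflation perturbations, and re-center the members about the analysis. The 1-D state, block-average coarsening, and perturbation magnitude in the Python below are invented for illustration and do not represent the GSI or EnKF machinery.

      import numpy as np

      rng = np.random.default_rng(7)
      hi_res_analysis = np.sin(np.linspace(0, 2 * np.pi, 400))   # variational analysis (high res)

      def coarsen(x, factor):
          return x.reshape(-1, factor).mean(axis=1)               # simple block average

      n_members, factor, infl_sd = 32, 4, 0.05
      center = coarsen(hi_res_analysis, factor)

      members = center + rng.normal(0.0, infl_sd, (n_members, center.size))  # additive inflation
      members += center - members.mean(axis=0)                    # re-center about the analysis

      print(np.allclose(members.mean(axis=0), center))
      print(members.std(axis=0).mean())                           # ensemble spread ~ infl_sd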

  6. A flexible method for residual stress measurement of spray coated layers by laser made hole drilling and SLM based beam steering

    NASA Astrophysics Data System (ADS)

    Osten, W.; Pedrini, G.; Weidmann, P.; Gadow, R.

    2015-08-01

    A minimally invasive but high-resolution method for residual stress analysis of ceramic coatings made by thermal spray coating, using a pulsed laser for flexible hole drilling, is described. The residual stresses are retrieved by applying the measured surface data to a model-based reconstruction procedure. While the 3D deformations and the profile of the machined area are measured with digital holography, the residual stresses are calculated by FE analysis. To improve the sensitivity of the method, an SLM is applied to control the distribution and the shape of the holes. The paper presents the complete measurement and reconstruction procedure and discusses the advantages and challenges of the new technology.

  7. Function Allocation in Complex Socio-Technical Systems: Procedure usage in nuclear power and the Context Analysis Method for Identifying Design Solutions (CAMIDS) Model

    NASA Astrophysics Data System (ADS)

    Schmitt, Kara Anne

    This research aims to prove that strict adherence to procedures and rigid compliance with process in the US nuclear industry may not prevent incidents or increase safety. According to the Institute of Nuclear Power Operations, the nuclear power industry has seen a recent rise in events, and this research claims that a contributing factor to this rise is organizational and cultural, and is based on people's overreliance on procedures and policy. Understanding the proper balance of function allocation, automation and human decision-making is imperative to creating a nuclear power plant that is safe, efficient, and reliable. This research claims that new generations of operators are less engaged and think less for themselves because they have been instructed to follow procedures to a fault. According to operators, they were once expected to know the plant and its interrelations, but organizationally more importance is now put on following procedure and policy. Literature reviews were performed, experts were questioned, and a model for context analysis was developed. The Context Analysis Method for Identifying Design Solutions (CAMIDS) Model was created, verified and validated through both peer review and application in real-world scenarios in active nuclear power plant simulators. These experiments supported the claim that strict adherence and rigid compliance to procedures may not increase safety, by studying the industry's propensity for following incorrect procedures and when this directly affects the outcome of safety or security of the plant. The findings of this research indicate that the younger generations of operators rely heavily on procedures, and the organizational pressure of required compliance with procedures may lead to incidents within the plant because operators feel pressured into following the rules and policy above performing the correct actions in a timely manner. The findings support computer-based procedures, efficient alarm systems, and skill-of-the-craft matrices. The solution to the problems facing the industry includes in-depth, multiple-fault failure training which tests the operator's knowledge of the situation. This builds operator collaboration, competence and confidence to know what to do, and when to do it, in response to an emergency situation. Strict adherence to procedures and rigid compliance to process may not prevent incidents or increase safety; building operators' fundamental skills of collaboration, competence and confidence will.

  8. Statistical significance of task related deep brain EEG dynamic changes in the time-frequency domain.

    PubMed

    Chládek, J; Brázdil, M; Halámek, J; Plešinger, F; Jurák, P

    2013-01-01

    We present an off-line analysis procedure for exploring brain activity recorded from intra-cerebral electroencephalographic data (SEEG). The objective is to determine the statistical differences between different types of stimulations in the time-frequency domain. The procedure is based on computing relative signal power change and subsequent statistical analysis. An example of characteristic statistically significant event-related de/synchronization (ERD/ERS) detected across different frequency bands following different oddball stimuli is presented. The method is used for off-line functional classification of different brain areas.
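
    A short Python sketch of the relative-power-change (ERD/ERS) computation and a between-condition comparison; the synthetic epochs, frequency band, window lengths, and use of a simple t test are illustrative assumptions rather than the authors' exact statistics.

      import numpy as np
      from scipy import signal, stats

      rng = np.random.default_rng(8)
      fs, n_trials, n_samp = 512, 60, 1024                # 2-s epochs, stimulus at 1 s

      def band_power(x, lo, hi):
          f, pxx = signal.welch(x, fs=fs, nperseg=256, axis=-1)
          sel = (f >= lo) & (f <= hi)
          return pxx[..., sel].mean(axis=-1)

      def make_trials(sync_gain):
          trials = rng.normal(0, 1.0, (n_trials, n_samp))
          trials[:, fs:] *= sync_gain                     # scale post-stimulus amplitude
          return trials

      cond_a = make_trials(0.7)                           # desynchronization (ERD)
      cond_b = make_trials(1.2)                           # synchronization (ERS)

      def erd_ers(trials, lo=8.0, hi=12.0):
          baseline = band_power(trials[:, :fs], lo, hi)
          active = band_power(trials[:, fs:], lo, hi)
          return 100.0 * (active - baseline) / baseline   # percent power change

      t_stat, p_val = stats.ttest_ind(erd_ers(cond_a), erd_ers(cond_b))
      print(erd_ers(cond_a).mean(), erd_ers(cond_b).mean(), p_val)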

  9. Thermo-viscoelastic analysis of composite materials

    NASA Technical Reports Server (NTRS)

    Lin, Kuen Y.; Hwang, I. H.

    1989-01-01

    The thermo-viscoelastic boundary value problem for anisotropic materials is formulated and a numerical procedure is developed for the efficient analysis of stress and deformation histories in composites. The procedure is based on the finite element method and therefore it is applicable to composite laminates containing geometric discontinuities and complicated boundary conditions. Using the present formulation, the time-dependent stress and strain distributions in both notched and unnotched graphite/epoxy composites have been obtained. The effect of temperature and ply orientation on the creep and relaxation response is also studied.

  10. Automated acid and base number determination of mineral-based lubricants by fourier transform infrared spectroscopy: commercial laboratory evaluation.

    PubMed

    Winterfield, Craig; van de Voort, F R

    2014-12-01

    The Fluid Life Corporation assessed and implemented Fourier transform infrared spectroscopy (FTIR)-based methods using American Society for Testing and Materials (ASTM)-like stoichiometric reactions for determination of acid and base number for in-service mineral-based oils. The basic protocols, quality control procedures, calibration, validation, and performance of these new quantitative methods are assessed. ASTM correspondence is attained using a mixed-mode calibration, using primary reference standards to anchor the calibration, supplemented by representative sample lubricants analyzed by ASTM procedures. A partial least squares calibration is devised by combining primary acid/base reference standards and representative samples, focusing on the main spectral stoichiometric response with chemometrics assisting in accounting for matrix variability. FTIR(AN/BN) methodology is precise, accurate, and free of most interference that affects ASTM D664 and D4739 results. Extensive side-by-side operational runs produced normally distributed differences with mean differences close to zero and standard deviations of 0.18 and 0.26 mg KOH/g, respectively. Statistically, the FTIR methods are a direct match to the ASTM methods, with superior performance in terms of analytical throughput, preparation time, and solvent use. FTIR(AN/BN) analysis is a viable, significant advance for in-service lubricant analysis, providing an economic means of trending samples instead of tedious and expensive conventional ASTM(AN/BN) procedures. © 2014 Society for Laboratory Automation and Screening.
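
    A minimal scikit-learn sketch of a mixed-mode partial least squares calibration in this spirit: spectra of primary standards and of representative oils are pooled and a PLS model is cross-validated. All spectra, acid numbers, and the number of latent variables are synthetic assumptions, not the laboratory's calibration data.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(9)
      n_wavenumbers = 300
      peak = np.exp(-0.5 * ((np.arange(n_wavenumbers) - 150) / 8.0) ** 2)  # stoichiometric band

      def synth(n, an_range, matrix_noise):
          an = rng.uniform(*an_range, n)                   # acid number, mg KOH/g
          X = np.outer(an, peak) + matrix_noise * rng.normal(size=(n, n_wavenumbers))
          return X, an

      X_std, y_std = synth(20, (0.0, 4.0), 0.02)           # primary reference standards
      X_oil, y_oil = synth(60, (0.2, 3.0), 0.10)           # representative in-service oils

      X = np.vstack([X_std, X_oil])
      y = np.concatenate([y_std, y_oil])

      pls = PLSRegression(n_components=4)
      y_cv = cross_val_predict(pls, X, y, cv=5).ravel()
      rmse = np.sqrt(np.mean((y_cv - y) ** 2))
      print("cross-validated RMSE (mg KOH/g):", round(rmse, 3))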

  11. Comprehension and retrieval of failure cases in airborne observatories

    NASA Technical Reports Server (NTRS)

    Alvarado, Sergio J.; Mock, Kenrick J.

    1995-01-01

    This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.

  12. Comprehension and retrieval of failure cases in airborne observatories

    NASA Astrophysics Data System (ADS)

    Alvarado, Sergio J.; Mock, Kenrick J.

    1995-05-01

    This paper describes research dealing with the computational problem of analyzing and repairing failures of electronic and mechanical systems of telescopes in NASA's airborne observatories, such as KAO (Kuiper Airborne Observatory) and SOFIA (Stratospheric Observatory for Infrared Astronomy). The research has resulted in the development of an experimental system that acquires knowledge of failure analysis from input text, and answers questions regarding failure detection and correction. The system's design builds upon previous work on text comprehension and question answering, including: knowledge representation for conceptual analysis of failure descriptions, strategies for mapping natural language into conceptual representations, case-based reasoning strategies for memory organization and indexing, and strategies for memory search and retrieval. These techniques have been combined into a model that accounts for: (a) how to build a knowledge base of system failures and repair procedures from descriptions that appear in telescope-operators' logbooks and FMEA (failure modes and effects analysis) manuals; and (b) how to use that knowledge base to search and retrieve answers to questions about causes and effects of failures, as well as diagnosis and repair procedures. This model has been implemented in FANSYS (Failure ANalysis SYStem), a prototype text comprehension and question answering program for failure analysis.

  13. Task analysis method for procedural training curriculum development.

    PubMed

    Riggle, Jakeb D; Wadman, Michael C; McCrory, Bernadette; Lowndes, Bethany R; Heald, Elizabeth A; Carstens, Patricia K; Hallbeck, M Susan

    2014-06-01

    A central venous catheter (CVC) is an important medical tool used in critical care and emergent situations. Integral to proper care in many circumstances, insertion of a CVC introduces the risk of central line-associated blood stream infections and mechanical adverse events; proper training is important for safe CVC insertion. Cognitive task analysis (CTA) methods have been successfully implemented in the medical field to improve the training of postgraduate medical trainees, but can be very time-consuming to complete and require a significant time commitment from many subject matter experts (SMEs). Many medical procedures such as CVC insertion are linear processes with well-documented procedural steps. These linear procedures may not require a traditional CTA to gather the information necessary to create a training curriculum. Accordingly, a novel, streamlined CTA method designed primarily to collect cognitive cues for linear procedures was developed to be used by medical professionals with minimal CTA training. This new CTA methodology required fewer trained personnel, fewer interview sessions, and less time commitment from SMEs than a traditional CTA. Based on this study, a streamlined CTA methodology can be used to efficiently gather cognitive information on linear medical procedures for the creation of resident training curricula and procedural skills assessments.

  14. Analysis of Short Ramps for Dual-Mode and PRT Stations

    DOT National Transportation Integrated Search

    1977-07-01

    The report documents a novel methodology and analysis procedure for measuring a program's effect, and it is based on data from case studies of a representative group of twenty urban areas, conducted during 1976, which are reported in a companion repo...

  15. Attribution Theory and Crisis Intervention Therapy.

    ERIC Educational Resources Information Center

    Skilbeck, William M.

    It was proposed that existing therapeutic procedures may influence attributions about emotional states. Therefore an attributional analysis of crisis intervention, a model of community-based, short-term consultation, was presented. This analysis suggested that crisis intervention provides attributionally-relevant information about both the source…

  16. Determination of Reaction Stoichiometries by Flow Injection Analysis.

    ERIC Educational Resources Information Center

    Rios, Angel; And Others

    1986-01-01

    Describes a method of flow injection analysis intended for calculation of complex-formation and redox reaction stoichiometries based on a closed-loop configuration. The technique is suitable for use in undergraduate laboratories. Information is provided for equipment, materials, procedures, and sample results. (JM)

  17. Evidence-Based Language Practice

    ERIC Educational Resources Information Center

    Pollock, Eric J.

    2005-01-01

    The purpose of this paper was to examine evidence-based procedures in medicine and to demonstrate that the same protocols can be used in English language instruction. In the evidence-based methodology, studies are divided into those that address specific language problems. Integrated studies are presented as a systematic overview, meta-analysis,…

  18. Soil Studies: Applying Acid-Base Chemistry to Environmental Analysis.

    ERIC Educational Resources Information Center

    West, Donna M.; Sterling, Donna R.

    2001-01-01

    Laboratory activities for chemistry students focus attention on the use of acid-base chemistry to examine environmental conditions. After using standard laboratory procedures to analyze soil and rainwater samples, students use web-based resources to interpret their findings. Uses CBL probes and graphing calculators to gather and analyze data and…

  19. Application of systems and control theory-based hazard analysis to radiation oncology.

    PubMed

    Pawlicki, Todd; Samost, Aubrey; Brown, Derek W; Manger, Ryan P; Kim, Gwe-Ya; Leveson, Nancy G

    2016-03-01

    Both humans and software are notoriously challenging to account for in traditional hazard analysis models. The purpose of this work is to investigate and demonstrate the application of a new, extended accident causality model, called the systems theoretic accident model and processes (STAMP), to radiation oncology. Specifically, a hazard analysis technique based on STAMP, system-theoretic process analysis (STPA), is used to perform a hazard analysis. The STPA procedure starts with the definition of high-level accidents for radiation oncology at the medical center and the hazards leading to those accidents. From there, the hierarchical safety control structure of the radiation oncology clinic is modeled, i.e., the controls that are used to prevent accidents and provide effective treatment. Using STPA, unsafe control actions (behaviors) that can lead to the hazards are identified, as well as causal scenarios that can lead to the identified unsafe control actions. This information can be used to eliminate or mitigate potential hazards. The STPA procedure is demonstrated on a new online adaptive cranial radiosurgery procedure that omits the CT simulation step and uses CBCT for localization and planning and a surface imaging system during treatment. The STPA procedure generated a comprehensive set of causal scenarios that are traced back to system hazards and accidents. Ten control loops were created for the new SRS procedure, covering the areas of hospital and department management, treatment design and delivery, and vendor service. Eighty-three unsafe control actions were identified, as well as 472 causal scenarios that could lead to those unsafe control actions. STPA provides a method for understanding the role of management decisions and hospital operations in system safety and for generating process design requirements to prevent hazards and accidents. The interaction of people, hardware, and software is highlighted. The STPA method produces results that can be used to improve safety and prevent accidents, and it warrants further investigation.

  20. Communications network design and costing model users manual

    NASA Technical Reports Server (NTRS)

    Logan, K. P.; Somes, S. S.; Clark, C. A.

    1983-01-01

    The information and procedures needed to exercise the communications network design and costing model for performing network analysis are presented. Specific procedures are included for executing the model on the NASA Lewis Research Center IBM 3033 computer. The concepts, functions, and data bases relating to the model are described. Model parameters and their format specifications for running the model are detailed.

  1. Evaluating the Effectiveness of Two Commonly Used Discrete Trial Procedures for Teaching Receptive Discrimination to Young Children with Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Gutierrez, Anibal, Jr.; Hale, Melissa N.; O'Brien, Heather A.; Fischer, Aaron J.; Durocher, Jennifer S.; Alessandri, Michael

    2009-01-01

    Discrete trial teaching procedures have been demonstrated to be effective in teaching a variety of important skills for children with autism spectrum disorders (ASD). Although all discrete trial programs are based in the principles of applied behavior analysis, some variability exists between programs with regards to the precise teaching…

  2. Advanced 13C NMR Analysis of the Light Fraction, Particulate Organic Matter, and Humic Acid Fractions From a Corn-Soybean Soil

    USDA-ARS?s Scientific Manuscript database

    Fractions of soil organic matter (SOM) are usually extracted from soil by either physical (e.g., size, density) or chemical (e.g., base, acid) procedures. Integrated procedures that combine both of these types promise greater insights into SOM chemistry and function. For a corn-soybean soil in Iowa,...

  3. A SAS(®) macro implementation of a multiple comparison post hoc test for a Kruskal-Wallis analysis.

    PubMed

    Elliott, Alan C; Hynan, Linda S

    2011-04-01

    The Kruskal-Wallis (KW) nonparametric analysis of variance is often used instead of a standard one-way ANOVA when data are from a suspected non-normal population. The KW omnibus procedure tests for some differences between groups, but provides no specific post hoc pair wise comparisons. This paper provides a SAS(®) macro implementation of a multiple comparison test based on significant Kruskal-Wallis results from the SAS NPAR1WAY procedure. The implementation is designed for up to 20 groups at a user-specified alpha significance level. A Monte-Carlo simulation compared this nonparametric procedure to commonly used parametric multiple comparison tests. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
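
    A Python sketch of the analysis such a macro automates: an omnibus Kruskal-Wallis test followed, when significant, by Dunn-type rank-based pairwise comparisons with a simple Bonferroni adjustment; the data, the omission of a tie correction, and the adjustment choice are illustrative assumptions rather than the macro's exact implementation.

      import itertools
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(10)
      groups = {
          "A": rng.normal(0.0, 1.0, 15),
          "B": rng.normal(0.0, 1.0, 15),
          "C": rng.normal(1.2, 1.0, 15),
      }
      alpha = 0.05

      h_stat, p_omnibus = stats.kruskal(*groups.values())
      print("Kruskal-Wallis:", h_stat, p_omnibus)

      if p_omnibus < alpha:
          data = np.concatenate(list(groups.values()))
          labels = np.concatenate([[k] * len(v) for k, v in groups.items()])
          ranks = stats.rankdata(data)
          n = len(data)
          n_pairs = len(groups) * (len(groups) - 1) // 2
          for a, b in itertools.combinations(groups, 2):
              ra, rb = ranks[labels == a].mean(), ranks[labels == b].mean()
              na, nb = (labels == a).sum(), (labels == b).sum()
              se = np.sqrt(n * (n + 1) / 12.0 * (1.0 / na + 1.0 / nb))  # no tie correction
              z = (ra - rb) / se
              p = 2 * stats.norm.sf(abs(z)) * n_pairs                   # Bonferroni adjusted
              print(f"{a} vs {b}: z = {z:.2f}, adjusted p = {min(p, 1.0):.4f}")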

  4. Performance analysis of a film dosimetric quality assurance procedure for IMRT with regard to the employment of quantitative evaluation methods.

    PubMed

    Winkler, Peter; Zurl, Brigitte; Guss, Helmuth; Kindl, Peter; Stuecklschweiger, Georg

    2005-02-21

    A system for dosimetric verification of intensity-modulated radiotherapy (IMRT) treatment plans using absolutely calibrated radiographic films is presented. At our institution this verification procedure is performed for all IMRT treatment plans prior to patient irradiation. For this purpose, clinical treatment plans are transferred to a phantom and recalculated. Composite treatment plans are irradiated onto a single film. Film density to absolute dose conversion is performed automatically based on a single calibration film. A software application encompassing film calibration, 2D registration of measured and calculated distributions, image fusion, and a number of visual and quantitative evaluation utilities was developed. The main topic of this paper is a performance analysis of this quality assurance procedure, with regard to the specification of tolerance levels for quantitative evaluations. Spatial and dosimetric precision and accuracy were determined for the entire procedure, comprising all possible sources of error. The overall dosimetric and spatial measurement uncertainties obtained thereby were 1.9% and 0.8 mm, respectively. Based on these results, we specified a 5% dose difference and a 3 mm distance-to-agreement as our tolerance levels for patient-specific quality assurance of IMRT treatments.
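
    A toy Python sketch of a combined dose-difference / distance-to-agreement check consistent with the quoted tolerance levels (5% dose difference, 3 mm DTA): a point passes if either criterion is met. The synthetic dose grids, the global normalization, and the simplified nearest-match DTA search (with edge wrap-around ignored) are assumptions, not the authors' implementation.

      import numpy as np

      rng = np.random.default_rng(11)
      spacing_mm = 1.0
      yy, xx = np.mgrid[0:80, 0:80]
      calc = np.exp(-((xx - 40) ** 2 + (yy - 40) ** 2) / (2 * 15.0**2))     # calculated dose
      meas = calc * (1.0 + rng.normal(0.0, 0.01, calc.shape))               # measured (film) dose

      dose_tol, dta_tol_mm = 0.05, 3.0
      dose_ok = np.abs(meas - calc) <= dose_tol * calc.max()                # global 5% criterion

      # DTA: pass if the calculated dose comes close to the measured value within 3 mm.
      r = int(np.ceil(dta_tol_mm / spacing_mm))
      dta_ok = np.zeros_like(dose_ok)
      for dy in range(-r, r + 1):
          for dx in range(-r, r + 1):
              if np.hypot(dy, dx) * spacing_mm > dta_tol_mm:
                  continue
              shifted = np.roll(np.roll(calc, dy, axis=0), dx, axis=1)
              dta_ok |= np.abs(shifted - meas) <= 0.01 * calc.max()         # 1% match nearby
      print("pass rate:", np.mean(dose_ok | dta_ok))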

  5. Partitioning strategy for efficient nonlinear finite element dynamic analysis on multiprocessor computers

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Peters, Jeanne M.

    1989-01-01

    A computational procedure is presented for the nonlinear dynamic analysis of unsymmetric structures on vector multiprocessor systems. The procedure is based on a novel hierarchical partitioning strategy in which the response of the unsymmetric structure is approximated by a combination of symmetric and antisymmetric response vectors (modes), each obtained by using only a fraction of the degrees of freedom of the original finite element model. The three key elements of the procedure, which result in a high degree of concurrency throughout the solution process, are: (1) a mixed (or primitive variable) formulation with independent shape functions for the different fields; (2) operator splitting, or restructuring of the discrete equations at each time step, to delineate the symmetric and antisymmetric vectors constituting the response; and (3) a two-level iterative process for generating the response of the structure. An assessment is made of the effectiveness of the procedure on the CRAY X-MP/4 computers.

  6. Cost minimisation analysis of using acellular dermal matrix (Strattice™) for breast reconstruction compared with standard techniques.

    PubMed

    Johnson, R K; Wright, C K; Gandhi, A; Charny, M C; Barr, L

    2013-03-01

    We performed a cost analysis (using UK 2011/12 NHS tariffs as a proxy for cost) comparing immediate breast reconstruction using the new one-stage technique of acellular dermal matrix (Strattice™) with implant versus the standard alternative techniques of tissue expander (TE)/implant as a two-stage procedure and latissimus dorsi (LD) flap reconstruction. Clinical report data were collected for operative time, length of stay, outpatient procedures, and number of elective and emergency admissions in our first consecutive 24 patients undergoing one-stage Strattice reconstruction. Total cost to the NHS based on tariff, assuming top-up payments to cover Strattice acquisition costs, was assessed and compared to the two historical control groups matched on key variables. Eleven patients having unilateral Strattice reconstruction were compared to 10 having TE/implant reconstruction and 10 having LD flap and implant reconstruction. Thirteen patients having bilateral Strattice reconstruction were compared to 12 having bilateral TE/implant reconstruction. Total costs were: unilateral Strattice, £3685; unilateral TE, £4985; unilateral LD and implant, £6321; bilateral TE, £5478; and bilateral Strattice, £6771. The cost analysis shows a financial advantage of using acellular dermal matrix (Strattice) in unilateral breast reconstruction versus alternative procedures. The reimbursement system in England (Payment by Results) is based on disease-related groups similar to that of many countries across Europe and tariffs are based on reported hospital costs, making this analysis of relevance in other countries. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Motion estimation using point cluster method and Kalman filter.

    PubMed

    Senesh, M; Wolf, A

    2009-05-01

    The most frequently used method in three-dimensional human gait analysis involves placing markers on the skin of the analyzed segment. This introduces a significant artifact, which strongly influences the bone position and orientation and the joint kinematic estimates. In this study, we tested and evaluated the effect of adding a Kalman filter procedure to the previously reported point cluster technique (PCT) in the estimation of rigid body motion. We demonstrated the procedures by motion analysis of a compound planar pendulum from indirect opto-electronic measurements of markers attached to an elastic appendage that is restrained to slide along the rigid body's long axis. The elastic frequency is close to the pendulum frequency, as in the biomechanical problem, where the soft tissue frequency content is similar to the actual movement of the bones. Comparison of the real pendulum angle to that obtained by several estimation procedures--PCT, Kalman filter followed by PCT, and low pass filter followed by PCT--enables evaluation of the accuracy of the procedures. When comparing the maximal amplitude, no effect was noted from adding the Kalman filter; however, a closer look at the signal revealed that the estimated angle based only on the PCT method was very noisy, with fluctuations, while the estimated angle based on the Kalman filter followed by the PCT was a smooth signal. It was also noted that the instantaneous frequencies obtained from the estimated angle based on the PCT method are more dispersed than those obtained from the estimated angle based on the Kalman filter followed by the PCT method. The addition of a Kalman filter to the PCT method in the estimation procedure of rigid body motion results in a smoother signal that better represents the real motion, with less signal distortion than when using a digital low pass filter. Furthermore, it can be concluded that adding a Kalman filter to the PCT procedure substantially reduces the dispersion of the maximal and minimal instantaneous frequencies.
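
    The effect of such prefiltering can be illustrated with a small constant-velocity Kalman filter applied in Python to a noisy, pendulum-like angle signal before any rigid-body estimation; the signal, noise levels, and filter tuning below are invented, and the PCT step itself is not reproduced.

      import numpy as np

      rng = np.random.default_rng(12)
      dt, n = 0.01, 1000
      t = np.arange(n) * dt
      true_angle = 0.4 * np.cos(2 * np.pi * 1.0 * t)                          # pendulum motion
      artifact = 0.05 * np.sin(2 * np.pi * 1.3 * t) + rng.normal(0, 0.02, n)  # soft-tissue-like noise
      measured = true_angle + artifact

      F = np.array([[1.0, dt], [0.0, 1.0]])       # state: [angle, angular velocity]
      H = np.array([[1.0, 0.0]])
      Q = np.diag([1e-6, 1e-3])                   # process noise (tuning assumption)
      R = np.array([[0.02**2]])                   # measurement noise (tuning assumption)

      x = np.array([measured[0], 0.0])
      P = np.eye(2)
      estimate = np.empty(n)
      for k in range(n):
          x = F @ x                               # predict
          P = F @ P @ F.T + Q
          innov = measured[k] - H @ x             # update
          S = H @ P @ H.T + R
          K = P @ H.T @ np.linalg.inv(S)
          x = x + (K @ innov)
          P = (np.eye(2) - K @ H) @ P
          estimate[k] = x[0]

      print(np.sqrt(np.mean((measured - true_angle) ** 2)),
            np.sqrt(np.mean((estimate - true_angle) ** 2)))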

  8. Improved Holistic Analysis of Rayleigh Waves for Single- and Multi-Offset Data: Joint Inversion of Rayleigh-Wave Particle Motion and Vertical- and Radial-Component Velocity Spectra

    NASA Astrophysics Data System (ADS)

    Dal Moro, Giancarlo; Moustafa, Sayed S. R.; Al-Arifi, Nassir S.

    2018-01-01

    Rayleigh waves often propagate according to complex mode excitation so that the proper identification and separation of specific modes can be quite difficult or, in some cases, just impossible. Furthermore, the analysis of a single component (i.e., an inversion procedure based on just one objective function) necessarily prevents solving the problems related to the non-uniqueness of the solution. To overcome these issues and define a holistic analysis of Rayleigh waves, we implemented a procedure to acquire data that are useful to define and efficiently invert the three objective functions defined from the three following "objects": the velocity spectra of the vertical- and radial-components and the Rayleigh-wave particle motion (RPM) frequency-offset data. Two possible implementations are presented. In the first case we consider classical multi-offset (and multi-component) data, while in a second possible approach we exploit the data recorded by a single three-component geophone at a fixed offset from the source. Given the simple field procedures, the method could be particularly useful for the unambiguous geotechnical exploration of large areas, where more complex acquisition procedures, based on the joint acquisition of Rayleigh and Love waves, would not be economically viable. After illustrating the different kinds of data acquisition and the data processing, the results of the proposed methodology are illustrated in a case study. Finally, a series of theoretical and practical aspects are discussed to clarify some issues involved in the overall procedure (data acquisition and processing).

  9. A Standard Procedure for Conducting Cognitive Task Analysis.

    ERIC Educational Resources Information Center

    Redding, Richard E.

    Traditional methods for task analysis have been largely based on the Instructional Systems Development (ISD) model, which is widely used throughout industry and the military. The first part of this document gives an overview of cognitive task analysis, which is conducted within the first phase of ISD. The following steps of cognitive task analysis…

  10. Meta-Analysis of Mathematic Basic-Fact Fluency Interventions: A Component Analysis

    ERIC Educational Resources Information Center

    Codding, Robin S.; Burns, Matthew K.; Lukito, Gracia

    2011-01-01

    Mathematics fluency is a critical component of mathematics learning yet few attempts have been made to synthesize this research base. Seventeen single-case design studies with 55 participants were reviewed using meta-analytic procedures. A component analysis of practice elements was conducted and treatment intensity and feasibility were examined.…

  11. Content Analysis Coding Schemes for Online Asynchronous Discussion

    ERIC Educational Resources Information Center

    Weltzer-Ward, Lisa

    2011-01-01

    Purpose: Researchers commonly utilize coding-based analysis of classroom asynchronous discussion contributions as part of studies of online learning and instruction. However, this analysis is inconsistent from study to study with over 50 coding schemes and procedures applied in the last eight years. The aim of this article is to provide a basis…

  12. Monitoring the use and outcomes of new devices and procedures: how does coding affect what Hospital Episode Statistics contribute? Lessons from 12 emerging procedures 2006-10.

    PubMed

    Patrick, Hannah; Sims, Andrew; Burn, Julie; Bousfield, Derek; Colechin, Elaine; Reay, Christopher; Alderson, Neil; Goode, Stephen; Cunningham, David; Campbell, Bruce

    2013-03-01

    New devices and procedures are often introduced into health services when the evidence base for their efficacy and safety is limited. The authors sought to assess the availability and accuracy of routinely collected Hospital Episodes Statistics (HES) data in the UK and their potential contribution to the monitoring of new procedures. Four years of HES data (April 2006-March 2010) were analysed to identify episodes of hospital care involving a sample of 12 new interventional procedures. HES data were cross checked against other relevant sources including national or local registers and manufacturers' information. HES records were available for all 12 procedures during the entire study period. Comparative data sources were available from national (5), local (2) and manufacturer (2) registers. Factors found to affect comparisons were miscoding, alternative coding and inconsistent use of subsidiary codes. The analysis of provider coverage showed that HES is sensitive at detecting centres which carry out procedures, but specificity is poor in some cases. Routinely collected HES data have the potential to support quality improvements and evidence-based commissioning of devices and procedures in health services but achievement of this potential depends upon the accurate coding of procedures.

  13. The Effect of Geographic Units of Analysis on Measuring Geographic Variation in Medical Services Utilization.

    PubMed

    Kim, Agnus M; Park, Jong Heon; Kang, Sungchan; Hwang, Kyosang; Lee, Taesik; Kim, Yoon

    2016-07-01

    We aimed to evaluate the effect of geographic units of analysis on measuring geographic variation in medical services utilization. For this purpose, we compared geographic variations in the rates of eight major procedures in administrative units (districts) and new areal units organized based on the actual health care use of the population in Korea. To compare geographic variation in geographic units of analysis, we calculated the age-sex standardized rates of eight major procedures (coronary artery bypass graft surgery, percutaneous transluminal coronary angioplasty, surgery after hip fracture, knee-replacement surgery, caesarean section, hysterectomy, computed tomography scan, and magnetic resonance imaging scan) from the National Health Insurance database in Korea for the 2013 period. Using the coefficient of variation, the extremal quotient, and the systematic component of variation, we measured geographic variation for these eight procedures in districts and new areal units. Compared with districts, new areal units showed a reduction in geographic variation. Extremal quotients and inter-decile ratios for the eight procedures were lower in new areal units. While the coefficient of variation was lower for most procedures in new areal units, the pattern of change of the systematic component of variation between districts and new areal units differed among procedures. Geographic variation in medical service utilization could vary according to the geographic unit of analysis. To determine how geographic characteristics such as population size and number of geographic units affect geographic variation, further studies are needed.
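
    The simpler variation statistics named in the abstract (coefficient of variation, extremal quotient, inter-decile ratio) can be computed directly from the standardized rates; the systematic component of variation additionally requires observed and expected counts per unit and is omitted here. The sketch below uses hypothetical rates, not the Korean data.

        import numpy as np

        def variation_measures(rates):
            # Small-area variation statistics for age-sex standardized procedure rates.
            rates = np.asarray(rates, dtype=float)
            cv = rates.std(ddof=1) / rates.mean()        # coefficient of variation
            eq = rates.max() / rates.min()               # extremal quotient
            p10, p90 = np.percentile(rates, [10, 90])
            idr = p90 / p10                              # inter-decile ratio
            return {"CV": cv, "EQ": eq, "IDR": idr}

        # hypothetical standardized rates per 100,000 across geographic units
        district_rates = [85, 120, 60, 200, 95, 150, 70, 110]
        print(variation_measures(district_rates))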

  14. Comparison of methods applied in photoinduced transient spectroscopy to determining the defect center parameters: The correlation procedure and the signal analysis based on inverse Laplace transformation

    NASA Astrophysics Data System (ADS)

    Suproniuk, M.; Pawłowski, M.; Wierzbowski, M.; Majda-Zdancewicz, E.; Pawłowski, Ma.

    2018-04-01

    The procedure for determination of trap parameters by photo-induced transient spectroscopy is based on the Arrhenius plot that illustrates a thermal dependence of the emission rate. In this paper, we show that the Arrhenius plot obtained by the correlation method is shifted toward lower temperatures as compared to the one obtained with the inverse Laplace transformation. This shift is caused by the model adequacy error of the correlation method and introduces errors to a calculation procedure of defect center parameters. The effect is exemplified by comparing the results of the determination of trap parameters with both methods based on photocurrent transients for defect centers observed in tin-doped neutron-irradiated silicon crystals and in gallium arsenide grown with the Vertical Gradient Freeze method.
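
    The Arrhenius analysis referred to above rests on the standard relation e(T) = A * T^2 * exp(-Ea / (kB * T)), so plotting ln(e/T^2) against 1/(kB * T) gives a line whose slope is -Ea. A minimal fitting sketch follows, using synthetic emission rates for an assumed defect centre rather than the paper's measurements.

        import numpy as np

        K_B = 8.617333e-5   # Boltzmann constant, eV/K

        def trap_parameters(T, e_rate):
            # Arrhenius analysis: ln(e/T^2) = ln(A) - Ea / (kB * T),
            # so a straight-line fit gives the activation energy Ea and prefactor A.
            T = np.asarray(T, float)
            x = 1.0 / (K_B * T)
            y = np.log(np.asarray(e_rate, float) / T**2)
            slope, intercept = np.polyfit(x, y, 1)
            return -slope, np.exp(intercept)             # Ea in eV, prefactor A

        # synthetic emission rates for an assumed 0.40 eV defect centre (illustrative only)
        T = np.linspace(180.0, 260.0, 9)
        e_rate = 1e7 * T**2 * np.exp(-0.40 / (K_B * T))
        Ea, A = trap_parameters(T, e_rate)               # Ea should come back near 0.40 eV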

  15. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    PubMed

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Determination of Aspartame and Caffeine in Carbonated Beverages Utilizing Electrospray Ionization-Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Bergen, H. Robert, III; Benson, Linda M.; Naylor, Stephen

    2000-10-01

    Mass spectrometry has undergone considerable changes in the past decade. The advent of "soft ionization" techniques such as electrospray ionization (ESI) affords the direct analysis of very polar molecules without need for the complex inefficient derivatization procedures often required in GC-MS. These ionization techniques make possible the direct mass spectral analysis of polar nonvolatile molecules such as DNA and proteins, which previously were difficult or impossible to analyze by MS. Compounds that readily take on a charge (acids and bases) lend themselves to ESI-MS analysis, whereas compounds that do not readily accept a charge (e.g. sugars) are often not seen or are seen only as inefficient adducts (e.g., M+Na+). To gain exposure to this state-of-the-art analytical procedure, high school students utilize ESI-MS in an analysis of aspartame and caffeine. They dilute a beverage sample and inject the diluted sample into the ESI-MS. The lab is procedurally simple and the results clearly demonstrate the potential and limitations of ESI-coupled mass spectrometry. Depending upon the instructional goals, the outlined procedures can be used to quantify the content of caffeine and aspartame in beverages or to understand the capabilities of electrospray ionization.

  17. Analysis of the effect of mobile phone base station antenna loading on localized SAR and its consequences for measurements.

    PubMed

    Hansson, Björn; Thors, Björn; Törnevik, Christer

    2011-12-01

    In this work, the effect of antenna element loading on the localized specific absorption rate (SAR) has been analyzed for base station antennas. The analysis was conducted in order to determine whether localized SAR measurements of large multi-element base station antennas can be conducted using standardized procedures and commercially available equipment. More specifically, it was investigated if the antenna shifting measurement procedure, specified in the European base station exposure assessment standard EN 50383, will produce accurate localized SAR results for base station antennas larger than the specified measurement phantom. The obtained results show that SAR accuracy is affected by the presence of lossy material within distances of one wavelength from the tested antennas as a consequence of coupling and redistribution of transmitted power among the antenna elements. It was also found that the existing standardized phantom is not optimal for SAR measurements of large base station antennas. A new methodology is instead proposed based on a larger, box-shaped, whole-body phantom. Copyright © 2011 Wiley Periodicals, Inc.

  18. A double-labeling procedure for sequence analysis of picomole amounts of nonradioactive RNA fragments.

    PubMed Central

    Gupta, R C; Randerath, E; Randerath, K

    1976-01-01

    A double-labeling procedure for sequence analysis of nonradioactive polyribonucleotides is detailed, which is based on controlled endonucleolytic degradation of 3'-terminally (3H)-labeled oligonucleotide-(3') dialcohols and 5'-terminal analysis of the partial (3H)-labeled fragments following their separation according to chain length by polyethyleneimine- (PEI-)cellulose TLC and detection by fluorography. Undesired nonradioactive partial digestion products are eliminated by periodate oxidation. The 5'-termini are assayed by enzymic incorporation of (32P)-label into the isolated fragments, enzymic release of (32P)-labeled nucleoside-(5') monophosphates, two-dimensional PEI-cellulose chromatography, and autoradiography. Using this procedure, as little as 0.1 - 0.3 A260 unit of tRNA is needed to sequence all fragments in complete ribonuclease T1 and A digests, whereas radioactive derivative methods previously described by us [1-4] required 4 - 6 A260 units. PMID:826884

  19. 2D Flood Modelling Using Advanced Terrain Analysis Techniques And A Fully Continuous DEM-Based Rainfall-Runoff Algorithm

    NASA Astrophysics Data System (ADS)

    Nardi, F.; Grimaldi, S.; Petroselli, A.

    2012-12-01

    Remotely sensed Digital Elevation Models (DEMs), largely available at high resolution, and advanced terrain analysis techniques built into Geographic Information Systems (GIS) provide unique opportunities for DEM-based hydrologic and hydraulic modelling in data-scarce river basins, paving the way for flood mapping at the global scale. This research is based on the implementation of a fully continuous hydrologic-hydraulic modelling framework optimized for ungauged basins with limited river flow measurements. The proposed procedure is characterized by a rainfall generator that feeds a continuous rainfall-runoff model producing flow time series that are routed along the channel using a bidimensional hydraulic model for the detailed representation of the inundation process. The main advantage of the proposed approach is the characterization of the entire physical process during extreme hydrologic events: channel runoff generation, propagation, and overland flow within the floodplain domain. This physically based model removes the need for synthetic design hyetograph and hydrograph estimation, which constitute the main source of subjective analysis and uncertainty in standard methods for flood mapping. Selected case studies show the results and performance of the proposed procedure with respect to standard event-based approaches.

  20. Conjoint Analysis for New Service Development on Electricity Distribution in Indonesia

    NASA Astrophysics Data System (ADS)

    Widaningrum, D. L.; Chynthia; Astuti, L. D.; Seran, M. A. B.

    2017-07-01

    Illegal use of electricity is still rampant in Indonesia, especially for activities where no power source is available, such as at street vendors' locations. This is not only detrimental to the state but also harms the perpetrators of electricity theft and the surrounding communities. The purpose of this study is to develop a new service (New Service Development, NSD) that provides an electricity source for street vendors' activities based on their preferences. The methods applied in the NSD are Conjoint Analysis, Cluster Analysis, Quality Function Deployment (QFD), Service Blueprint, Process Flow Diagrams, and a Quality Control Plan. The results of this study are the service attributes and their importance for the new electricity service based on street vendors' preferences as customers, a customer segmentation, the design of the new service, the technical response, the operational procedures, and the quality control plan for those operational procedures.

  1. Analysis of eccentric annular incompressible seals. II - Effects of eccentricity on rotordynamic coefficients

    NASA Technical Reports Server (NTRS)

    Nelson, C. C.; Nguyen, D. T.

    1987-01-01

    A new analysis procedure has been presented which solves for the flow variables of an annular pressure seal in which the rotor has a large static displacement (eccentricity) from the centered position. The present paper incorporates the solutions to investigate the effect of eccentricity on the rotordynamic coefficients. The analysis begins with a set of governing equations based on a turbulent bulk-flow model and Moody's friction factor equation. Perturbation of the flow variables yields a set of zeroth- and first-order equations. After integration of the zeroth-order equations, the resulting zeroth-order flow variables are used as input in the solution of the first-order equations. Further integration of the first-order pressures yields the eccentric rotordynamic coefficients. The results from this procedure compare well with available experimental and theoretical data, with accuracy as good as or slightly better than the predictions based on a finite-element model.

  2. The Kjeldahl method as a primary reference procedure for total protein in certified reference materials used in clinical chemistry. I. A review of Kjeldahl methods adopted by laboratory medicine.

    PubMed

    Chromý, Vratislav; Vinklárková, Bára; Šprongl, Luděk; Bittová, Miroslava

    2015-01-01

    We found previously that albumin-calibrated total protein in certified reference materials causes unacceptable positive bias in the analysis of human sera. The simplest way to cure this defect is the use of human-based serum/plasma standards calibrated by the Kjeldahl method. Such standards, commutable with serum samples, will compensate for bias caused by lipids and bilirubin in most human sera. To find a suitable primary reference procedure for total protein in reference materials, we reviewed Kjeldahl methods adopted by laboratory medicine. We found two methods recommended for total protein in human samples: an indirect analysis based on total Kjeldahl nitrogen corrected for its nonprotein nitrogen and a direct analysis made on isolated protein precipitates. The methods found will be assessed in a subsequent article.

  3. Financial analysis of community-based forest enterprises with the Green Value tool

    Treesearch

    S. Humphries; Tom Holmes

    2016-01-01

    The Green Value tool was developed in response to the need for simplified procedures that could be used in the field to conduct financial analysis for community-based forest enterprises (CFEs). Initially our efforts focused on a set of worksheets that could be used by both researchers and CFEs to monitor and analyze costs and income for one production period. The...

  4. Upscaling Cement Paste Microstructure to Obtain the Fracture, Shear, and Elastic Concrete Mechanical LDPM Parameters.

    PubMed

    Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez

    2017-02-28

    Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10^-10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale.

  5. Upscaling Cement Paste Microstructure to Obtain the Fracture, Shear, and Elastic Concrete Mechanical LDPM Parameters

    PubMed Central

    Sherzer, Gili; Gao, Peng; Schlangen, Erik; Ye, Guang; Gal, Erez

    2017-01-01

    Modeling the complex behavior of concrete for a specific mixture is a challenging task, as it requires bridging the cement scale and the concrete scale. We describe a multiscale analysis procedure for the modeling of concrete structures, in which material properties at the macro scale are evaluated based on lower scales. Concrete may be viewed over a range of scale sizes, from the atomic scale (10^-10 m), which is characterized by the behavior of crystalline particles of hydrated Portland cement, to the macroscopic scale (10 m). The proposed multiscale framework is based on several models, including chemical analysis at the cement paste scale, a mechanical lattice model at the cement and mortar scales, geometrical aggregate distribution models at the mortar scale, and the Lattice Discrete Particle Model (LDPM) at the concrete scale. The analysis procedure starts from a known chemical and mechanical set of parameters of the cement paste, which are then used to evaluate the mechanical properties of the LDPM concrete parameters for the fracture, shear, and elastic responses of the concrete. Although a macroscopic validation study of this procedure is presented, future research should include a comparison to additional experiments in each scale. PMID:28772605

  6. Validation of Antibody-Based Strategies for Diagnosis of Pediatric Celiac Disease Without Biopsy.

    PubMed

    Wolf, Johannes; Petroff, David; Richter, Thomas; Auth, Marcus K H; Uhlig, Holm H; Laass, Martin W; Lauenstein, Peter; Krahl, Andreas; Händel, Norman; de Laffolie, Jan; Hauer, Almuthe C; Kehler, Thomas; Flemming, Gunter; Schmidt, Frank; Rodrigues, Astor; Hasenclever, Dirk; Mothes, Thomas

    2017-08-01

    A diagnosis of celiac disease is made based on clinical, genetic, serologic, and duodenal morphology features. Recent pediatric guidelines, based largely on retrospective data, propose omitting biopsy analysis for patients with concentrations of IgA against tissue transglutaminase (IgA-TTG) >10-fold the upper limit of normal (ULN) and if further criteria are met. A retrospective study concluded that measurements of IgA-TTG and total IgA, or IgA-TTG and IgG against deamidated gliadin (IgG-DGL) could identify patients with and without celiac disease. Patients were assigned to categories of no celiac disease, celiac disease, or biopsy required, based entirely on antibody assays. We aimed to validate the positive and negative predictive values (PPV and NPV) of these diagnostic procedures. We performed a prospective study of 898 children undergoing duodenal biopsy analysis to confirm or rule out celiac disease at 13 centers in Europe. We compared findings from serologic analysis with findings from biopsy analyses, follow-up data, and diagnoses made by the pediatric gastroenterologists (celiac disease, no celiac disease, or no final diagnosis). Assays to measure IgA-TTG, IgG-DGL, and endomysium antibodies were performed by blinded researchers, and tissue sections were analyzed by local and blinded reference pathologists. We validated 2 procedures for diagnosis: total-IgA and IgA-TTG (the TTG-IgA procedure), as well as IgG-DGL with IgA-TTG (TTG-DGL procedure). Patients were assigned to categories of no celiac disease if all assays found antibody concentrations <1-fold the ULN, or celiac disease if at least 1 assay measured antibody concentrations >10-fold the ULN. All other cases were considered to require biopsy analysis. ULN values were calculated using the cutoff levels suggested by the test kit manufacturers. HLA typing was performed for 449 participants. We used models that considered how specificity values change with prevalence to extrapolate the PPV and NPV to populations with lower prevalence of celiac disease. Of the participants, 592 were found to have celiac disease, 345 were found not to have celiac disease, and 24 had no final diagnosis. The TTG-IgA procedure identified patients with celiac disease with a PPV of 0.988 and an NPV of 0.934; the TTG-DGL procedure identified patients with celiac disease with a PPV of 0.988 and an NPV of 0.958. Based on our extrapolation model, we estimated that the PPV and NPV would remain >0.95 even at a disease prevalence as low as 4%. Tests for endomysium antibodies and HLA type did not increase the PPV of samples with levels of IgA-TTG ≥10-fold the ULN. Notably, 4.2% of pathologists disagreed in their analyses of duodenal morphology-a rate comparable to the error rate for serologic assays. In a prospective study, we validated the TTG-IgA procedure and the TTG-DGL procedure in identification of pediatric patients with or without celiac disease, without biopsy. German Clinical Trials Registry no.: DRKS00003854. Copyright © 2017 AGA Institute. Published by Elsevier Inc. All rights reserved.
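
    The prevalence extrapolation mentioned in the abstract follows directly from Bayes' theorem once a sensitivity and specificity are fixed. The sketch below uses hypothetical sensitivity and specificity values (assumptions for illustration, not the study's estimates) to show how PPV and NPV change as prevalence drops towards 4%.

        def ppv_npv(sensitivity, specificity, prevalence):
            # Predictive values as a function of disease prevalence (Bayes' theorem).
            tp = sensitivity * prevalence
            fp = (1.0 - specificity) * (1.0 - prevalence)
            tn = specificity * (1.0 - prevalence)
            fn = (1.0 - sensitivity) * prevalence
            return tp / (tp + fp), tn / (tn + fn)

        # illustrative sensitivity/specificity for a ">=10-fold ULN" rule (assumed values)
        for prevalence in (0.60, 0.20, 0.04):
            ppv, npv = ppv_npv(sensitivity=0.98, specificity=0.99, prevalence=prevalence)
            print(f"prevalence={prevalence:.2f}  PPV={ppv:.3f}  NPV={npv:.3f}")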

  7. Is There Room for Prevention? Examining the Effect of Outpatient Facility Type on the Risk of Surgical Site Infection.

    PubMed

    Parikh, Rishi; Pollock, Daniel; Sharma, Jyotirmay; Edwards, Jonathan

    2016-10-01

    OBJECTIVE We compared risk for surgical site infection (SSI) following surgical breast procedures among 2 patient groups: those whose procedures were performed in ambulatory surgery centers (ASCs) and those whose procedures were performed in hospital-based outpatient facilities. DESIGN Cohort study using National Healthcare Safety Network (NHSN) SSI data for breast procedures performed from 2010 to 2014. METHODS Unconditional multivariate logistic regression was used to examine the association between facility type and breast SSI, adjusting for American Society of Anesthesiologists (ASA) Physical Status Classification, patient age, and duration of procedure. Other potential adjustment factors examined were wound classification, anesthesia use, and gender. RESULTS Among 124,021 total outpatient breast procedures performed between 2010 and 2014, 110,987 procedure reports submitted to the NHSN provided complete covariate data and were included in the analysis. Breast procedures performed in ASCs carried a lower risk of SSI compared with those performed in hospital-based outpatient settings. For patients aged ≤51 years, the adjusted risk ratio was 0.36 (95% CI, 0.25-0.50) and for patients >51 years old, the adjusted risk ratio was 0.32 (95% CI, 0.21-0.49). CONCLUSIONS SSI risk following breast procedures was significantly lower among ASC patients than among hospital-based outpatients. These findings should be placed in the context of study limitations, including the possibility of incomplete ascertainment of SSIs and shortcomings in the data available to control for differences in patient case mix. Additional studies are needed to better understand the role of procedural settings in SSI risk following breast procedures and to identify prevention opportunities. Infect Control Hosp Epidemiol 2016;1-7.

  8. Reliability of sensor-based real-time workflow recognition in laparoscopic cholecystectomy.

    PubMed

    Kranzfelder, Michael; Schneider, Armin; Fiolka, Adam; Koller, Sebastian; Reiser, Silvano; Vogel, Thomas; Wilhelm, Dirk; Feussner, Hubertus

    2014-11-01

    Laparoscopic cholecystectomy is a very common minimally invasive surgical procedure that may be improved by autonomous or cooperative assistance support systems. Model-based surgery with a precise definition of distinct procedural tasks (PT) of the operation was implemented and tested to depict and analyze the process of this procedure. Reliability of real-time workflow recognition in laparoscopic cholecystectomy ([Formula: see text] cases) was evaluated by continuous sensor-based data acquisition. Ten PTs were defined including begin/end preparation of Calot's triangle, clipping/cutting cystic artery and duct, begin/end gallbladder dissection, begin/end hemostasis, gallbladder removal, and end of operation. Data acquisition was achieved with continuous instrument detection, room/table light status, intra-abdominal pressure, table tilt, irrigation/aspiration volume and coagulation/cutting current application. Two independent observers recorded the start and endpoint of each step by analysis of the sensor data. The data were cross-checked with laparoscopic video recordings serving as gold standard for PT identification. Bland-Altman analysis revealed for 95% of cases a difference of annotation results within the limits of agreement ranging from -309 s (PT 7) to +368 s (PT 5). Laparoscopic video and sensor data matched to a greater or lesser extent within the different procedural tasks. In the majority of cases, the observer results exceeded those obtained from the laparoscopic video. Empirical knowledge was required to detect phase transitions. A set of sensors used to monitor laparoscopic cholecystectomy procedures was sufficient to enable expert observers to reliably identify each PT. In the future, computer systems may automate the task identification process provided a more robust data inflow is available.
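
    The Bland-Altman comparison of the two observers' annotations uses the usual bias and 95% limits of agreement. A minimal sketch with hypothetical annotation times, not the study data, is given below.

        import numpy as np

        def bland_altman(a, b):
            # Bias and 95% limits of agreement between two raters' annotations (seconds).
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd

        # hypothetical start-time annotations for one procedural task
        sensor_based = [612, 1250, 980, 1510, 770]
        video_based = [598, 1275, 1002, 1488, 760]
        bias, lower, upper = bland_altman(sensor_based, video_based)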

  9. Value based care and bundled payments: Anesthesia care costs for outpatient oncology surgery using time-driven activity-based costing.

    PubMed

    French, Katy E; Guzman, Alexis B; Rubio, Augustin C; Frenzel, John C; Feeley, Thomas W

    2016-09-01

    With the movement towards bundled payments, stakeholders should know the true cost of the care they deliver. Time-driven activity-based costing (TDABC) can be used to estimate costs for each episode of care. In this analysis, TDABC is used to both estimate the costs of anesthesia care and identify the primary drivers of those costs for 11 common oncologic outpatient surgical procedures. Personnel costs were calculated by determining the hourly cost of each provider and the associated process time of the 11 surgical procedures. Using the anesthesia record, drugs, supplies and equipment costs were identified and calculated. The current staffing model was used to determine baseline personnel costs for each procedure. Using the costs identified through TDABC analysis, the effect of different staffing ratios on anesthesia costs could be predicted. Costs for each of the procedures were determined. Process time and costs are linearly related. Personnel represented 79% of overall cost while drugs, supplies and equipment represented the remaining 21%. Changing staffing ratios shows potential savings between 13% and 28% across the 11 procedures. TDABC can be used to estimate the costs of anesthesia care. This costing information is critical to assessing the anesthesiology component in a bundled payment. It can also be used to identify areas of cost savings and model costs of anesthesia care. CRNA to anesthesiologist staffing ratios profoundly influence the cost of care. This methodology could be applied to other medical specialties to help determine costs in the setting of bundled payments. Copyright © 2015 Elsevier Inc. All rights reserved.
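
    The TDABC calculation itself is a simple sum of provider time multiplied by a per-minute personnel cost, plus consumables. The sketch below uses made-up rates, minutes, and supply costs purely to illustrate the arithmetic; none of the figures come from the paper.

        def tdabc_episode_cost(personnel_minutes, rate_per_minute, drugs_supplies_equipment):
            # Time-driven activity-based cost of one anesthesia episode:
            # sum over providers of (cost per minute x process minutes), plus consumables.
            personnel = sum(rate_per_minute[p] * m for p, m in personnel_minutes.items())
            return personnel, personnel + drugs_supplies_equipment

        # purely illustrative figures (not the paper's data)
        personnel_cost, total_cost = tdabc_episode_cost(
            personnel_minutes={"anesthesiologist": 30, "CRNA": 95},
            rate_per_minute={"anesthesiologist": 3.5, "CRNA": 1.6},
            drugs_supplies_equipment=85.0,
        )
        personnel_share = personnel_cost / total_cost    # the paper reports roughly 79% for personnel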

  10. Value Based Care and Bundled Payments: Anesthesia Care Costs for Outpatient Oncology Surgery Using Time-Driven Activity-Based Costing

    PubMed Central

    French, Katy E.; Guzman, Alexis B.; Rubio, Augustin C.; Frenzel, John C.; Feeley, Thomas W

    2015-01-01

    Background With the movement towards bundled payments, stakeholders should know the true cost of the care they deliver. Time-driven activity-based costing (TDABC) can be used to estimate costs for each episode of care. In this analysis, TDABC is used to both estimate the costs of anesthesia care and identify the primary drivers of those costs of 11 common oncologic outpatient surgical procedures. Methods Personnel cost were calculated by determining the hourly cost of each provider and the associated process time of the 11 surgical procedures. Using the anesthesia record, drugs, supplies and equipment costs were identified and calculated. The current staffing model was used to determine baseline personnel costs for each procedure. Using the costs identified through TDABC analysis, the effect of different staffing ratios on anesthesia costs could be predicted. Results Costs for each of the procedures were determined. Process time and costs are linearly related. Personnel represented 79% of overall cost while drugs, supplies and equipment represented the remaining 21%. Changing staffing ratios shows potential savings between 13-28% across the 11 procedures. Conclusions TDABC can be used to estimate the costs of anesthesia care. This costing information is critical to assessing the anesthesiology component in a bundled payment. It can also be used to identify areas of cost savings and model costs of anesthesia care. CRNA to anesthesiologist staffing ratios profoundly influence the cost of care. This methodology could be applied to other medical specialties to help determine costs in the setting of bundled payments. PMID:27637823

  11. Use of power analysis to develop detectable significance criteria for sea urchin toxicity tests

    USGS Publications Warehouse

    Carr, R.S.; Biedenbach, J.M.

    1999-01-01

    When sufficient data are available, the statistical power of a test can be determined using power analysis procedures. The term “detectable significance” has been coined to refer to this criterion based on power analysis and past performance of a test. This power analysis procedure has been performed with sea urchin (Arbacia punctulata) fertilization and embryological development data from sediment porewater toxicity tests. Data from 3100 and 2295 tests for the fertilization and embryological development tests, respectively, were used to calculate the criteria and regression equations describing the power curves. Using Dunnett's test, minimum significant differences (MSDs) (β = 0.05) of 15.5% and 19% for the fertilization test, and 16.4% and 20.6% for the embryological development test, were determined for α ≤ 0.05 and α ≤ 0.01, respectively. The use of this second criterion reduces type I (false positive) errors and helps to establish a critical level of difference based on the past performance of the test.
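
    The paper's MSD values come from Dunnett's multiple-comparison test applied to the historical test database; a simplified two-group approximation of a minimum detectable difference, MDD = (t_alpha + t_beta) * sqrt(2/n) * sd, conveys the idea. The sketch below uses hypothetical variability and replication and omits the Dunnett adjustment, so it is an approximation, not the authors' procedure.

        import numpy as np
        from scipy import stats

        def minimum_detectable_difference(sd, n_per_group, alpha=0.05, power=0.95, one_sided=True):
            # Two-group approximation: MDD = (t_alpha + t_beta) * sqrt(2/n) * sd.
            # Dunnett's multiple-comparison adjustment (used in the paper) would enlarge t_alpha.
            df = 2 * (n_per_group - 1)
            a = alpha if one_sided else alpha / 2
            t_alpha = stats.t.ppf(1 - a, df)
            t_beta = stats.t.ppf(power, df)
            return (t_alpha + t_beta) * np.sqrt(2.0 / n_per_group) * sd

        # hypothetical control variability (percent fertilization) and replication
        mdd = minimum_detectable_difference(sd=8.0, n_per_group=5)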

  12. Upper Kalamazoo watershed land cover inventory. [based on remote sensing

    NASA Technical Reports Server (NTRS)

    Richason, B., III; Enslin, W.

    1973-01-01

    Approximately 1000 square miles of the eastern portion of the watershed were inventoried based on remote sensing imagery. The classification scheme, imagery and interpretation procedures, and a cost analysis are discussed. The distributions of land cover within the area are tabulated.

  13. Mathematics Career Simulations: An Invitation

    ERIC Educational Resources Information Center

    Sinn, Robb; Phipps, Marnie

    2013-01-01

    A simulated academic career was combined with inquiry-based learning in an upper-division undergraduate mathematics course. Concepts such as tenure, professional conferences and journals were simulated. Simulation procedures were combined with student-led, inquiry-based classroom formats. A qualitative analysis (ethnography) describes the culture…

  14. Control of large flexible structures - An experiment on the NASA Mini-Mast facility

    NASA Technical Reports Server (NTRS)

    Hsieh, Chen; Kim, Jae H.; Liu, Ketao; Zhu, Guoming; Skelton, Robert E.

    1991-01-01

    The output variance constraint controller design procedure is integrated with model reduction by modal cost analysis. A procedure is given for tuning MIMO controller designs to find the maximal rms performance of the actual system. Controller designs based on a finite-element model of the system are compared with controller designs based on an identified model (obtained using the Q-Markov Cover algorithm). The identified model and the finite-element model led to similar closed-loop performance, when tested in the Mini-Mast facility at NASA Langley.

  15. Feature Screening for Ultrahigh Dimensional Categorical Data with Applications.

    PubMed

    Huang, Danyang; Li, Runze; Wang, Hansheng

    2014-01-01

    Ultrahigh dimensional data with both categorical responses and categorical covariates are frequently encountered in the analysis of big data, for which feature screening has become an indispensable statistical tool. We propose a Pearson chi-square based feature screening procedure for categorical response with ultrahigh dimensional categorical covariates. The proposed procedure can be directly applied for detection of important interaction effects. We further show that the proposed procedure possesses screening consistency property in the terminology of Fan and Lv (2008). We investigate the finite sample performance of the proposed procedure by Monte Carlo simulation studies, and illustrate the proposed method by two empirical datasets.
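
    Marginal Pearson chi-square screening reduces to computing a chi-square statistic between each categorical covariate and the categorical response and keeping the highest-ranking covariates. A toy sketch on simulated data with arbitrary dimensions is shown below; it is not the authors' implementation and omits the interaction-detection extension.

        import numpy as np
        from scipy.stats import chi2_contingency

        def chi2_screen(X, y, top_k=10):
            # Rank categorical covariates by the Pearson chi-square statistic against
            # a categorical response and keep the top_k (marginal screening only).
            scores = np.empty(X.shape[1])
            y_levels = np.unique(y)
            for j in range(X.shape[1]):
                x_levels = np.unique(X[:, j])
                table = np.array([[np.sum((X[:, j] == a) & (y == b)) for b in y_levels]
                                  for a in x_levels])
                scores[j], _, _, _ = chi2_contingency(table)
            return np.argsort(scores)[::-1][:top_k], scores

        # toy data: 2000 binary covariates, binary response driven (noisily) by column 3
        rng = np.random.default_rng(0)
        X = rng.integers(0, 2, size=(200, 2000))
        flip = rng.random(200) < 0.15
        y = np.where(flip, 1 - X[:, 3], X[:, 3])
        selected, scores = chi2_screen(X, y, top_k=20)   # column 3 should rank highly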

  16. Optimization for minimum sensitivity to uncertain parameters

    NASA Technical Reports Server (NTRS)

    Pritchard, Jocelyn I.; Adelman, Howard M.; Sobieszczanski-Sobieski, Jaroslaw

    1994-01-01

    A procedure to design a structure for minimum sensitivity to uncertainties in problem parameters is described. The approach is to minimize directly the sensitivity derivatives of the optimum design with respect to fixed design parameters using a nested optimization procedure. The procedure is demonstrated for the design of a bimetallic beam for minimum weight with insensitivity to uncertainties in structural properties. The beam is modeled with finite elements based on two dimensional beam analysis. A sequential quadratic programming procedure used as the optimizer supplies the Lagrange multipliers that are used to calculate the optimum sensitivity derivatives. The method was perceived to be successful from comparisons of the optimization results with parametric studies.

  17. Text-Content-Analysis based on the Syntactic Correlations between Ontologies

    NASA Astrophysics Data System (ADS)

    Tenschert, Axel; Kotsiopoulos, Ioannis; Koller, Bastian

    The work presented in this chapter is concerned with the analysis of semantic knowledge structures, represented in the form of Ontologies, through which Service Level Agreements (SLAs) are enriched with new semantic data. The objective of the enrichment process is to enable SLA negotiation in a way that is much more convenient for Service Users. For this purpose, the deployment of an SLA-Management-System as well as the development of an analysis procedure for Ontologies is required. This chapter will refer to the BREIN, the FinGrid and the LarKC projects. The analysis procedure examines the syntactic correlations of several Ontologies whose focus lies in the field of mechanical engineering. A method of analyzing text and content is developed as part of this procedure. In order to do so, we introduce a formalism as well as a method for understanding content. The analysis and methods are integrated into an SLA Management System which enables a Service User to interact with the system as a service by negotiating the user requests and including the semantic knowledge. Through negotiation between the Service User and the Service Provider, the analysis procedure considers the user requests by extending the SLAs with semantic knowledge. In this way, the economic value of an SLA-Management-System is increased by the enhancement of SLAs with semantic knowledge structures. The main focus of this chapter is the analysis procedure, namely the Text-Content-Analysis, which provides the mentioned semantic knowledge structures.

  18. Medical Tourism for CCSVI Procedures in People with Multiple Sclerosis: An Observational Study.

    PubMed

    Metz, Luanne M; Greenfield, Jamie; Marrie, Ruth Ann; Jette, Nathalie; Blevins, Gregg; Svenson, Lawrence W; Alikhani, Katayoun; Wall, Winona; Dhaliwal, Raveena; Suchowersky, Oksana

    2016-05-01

    Many Canadians with multiple sclerosis (MS) have recently travelled internationally to have procedures for a putative condition called chronic cerebrospinal venous insufficiency (CCSVI). Here, we describe where and when they went and describe the baseline characteristics of persons with MS who participated in this non-evidence-based medical tourism for CCSVI procedures. We conducted a longitudinal observational study that used online questionnaires to collect patient-reported information about the safety, experiences, and outcomes following procedures for CCSVI. A convenience sample of all Albertans with MS was recruited between July 2011 and March 2013. In total, 868 individuals enrolled; 704 were included in this cross-sectional, baseline analysis. Of these, 128 (18.2%) participants retrospectively reported having procedures for CCSVI between April 2010 and September 2012. The proportion of participants reporting CCSVI procedures declined from 80 (62.5%) in 2010, to 40 (31.1%) in 2011, and 8 (6.3%) in 2012. In multivariable logistic regression analysis, CCSVI procedures were independently associated with longer disease duration, secondary progressive clinical course, and greater disability status. Although all types of people with MS pursued procedures for CCSVI, a major driver of participation was greater disability. This highlights that those with the greatest disability are the most vulnerable to unproven experimental procedures. Participation in CCSVI procedures waned over time possibly reflecting unmet expectations of treated patients, decreased media attention, or that individuals who wanted procedures had them soon after the CCSVI hypothesis was widely publicized.

  19. Model-Free Feature Screening for Ultrahigh Dimensional Discriminant Analysis

    PubMed Central

    Cui, Hengjian; Li, Runze

    2014-01-01

    This work is concerned with marginal sure independence feature screening for ultrahigh dimensional discriminant analysis. The response variable is categorical in discriminant analysis. This enables us to use the conditional distribution function to construct a new index for feature screening. In this paper, we propose a marginal feature screening procedure based on the empirical conditional distribution function. We establish the sure screening and ranking consistency properties for the proposed procedure without assuming any moment condition on the predictors. The proposed procedure enjoys several appealing merits. First, it is model-free in that its implementation does not require specification of a regression model. Second, it is robust to heavy-tailed distributions of predictors and the presence of potential outliers. Third, it allows the categorical response to have a diverging number of classes in the order of O(n^κ) with some κ ≥ 0. We assess the finite-sample properties of the proposed procedure by Monte Carlo simulation studies and numerical comparison. We further illustrate the proposed methodology by empirical analyses of two real-life data sets. PMID:26392643

  20. Air-Gapped Structures as Magnetic Elements for Use in Power Processing Systems. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Ohri, A. K.

    1977-01-01

    Methodical approaches to the design of inductors for use in LC filters and dc-to-dc converters using air gapped magnetic structures are presented. Methods for the analysis and design of full wave rectifier LC filter circuits operating with the inductor current in both the continuous conduction and the discontinuous conduction modes are also described. In the continuous conduction mode, linear circuit analysis techniques are employed, while in the case of the discontinuous mode, the method of analysis requires computer solutions of the piecewise linear differential equations which describe the filter in the time domain. Procedures for designing filter inductors using air gapped cores are presented. The first procedure requires digital computation to yield a design which is optimized in the sense of minimum core volume and minimum number of turns. The second procedure does not yield an optimized design as defined above, but the design can be obtained by hand calculations or with a small calculator. The third procedure is based on the use of specially prepared magnetic core data and provides an easy way to quickly reach a workable design.

  1. Spectral analysis based on fast Fourier transformation (FFT) of surveillance data: the case of scarlet fever in China.

    PubMed

    Zhang, T; Yang, M; Xiao, X; Feng, Z; Li, C; Zhou, Z; Ren, Q; Li, X

    2014-03-01

    Many infectious diseases exhibit repetitive or regular behaviour over time. Time-domain approaches, such as the seasonal autoregressive integrated moving average model, are often utilized to examine the cyclical behaviour of such diseases. The limitations for time-domain approaches include over-differencing and over-fitting; furthermore, the use of these approaches is inappropriate when the assumption of linearity may not hold. In this study, we implemented a simple and efficient procedure based on the fast Fourier transformation (FFT) approach to evaluate the epidemic dynamic of scarlet fever incidence (2004-2010) in China. This method demonstrated good internal and external validities and overcame some shortcomings of time-domain approaches. The procedure also elucidated the cycling behaviour in terms of environmental factors. We concluded that, under appropriate circumstances of data structure, spectral analysis based on the FFT approach may be applicable for the study of oscillating diseases.
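
    The core of an FFT-based spectral analysis of a surveillance series is a periodogram from which the dominant periods are read off. The sketch below applies this to a synthetic monthly series with an annual cycle; the series and parameters are illustrative assumptions, not the scarlet fever data.

        import numpy as np

        def dominant_periods(series, dt=1.0, n_peaks=3):
            # Periodogram of a mean-removed series; return the periods with the largest power.
            x = np.asarray(series, float)
            x = x - x.mean()
            power = np.abs(np.fft.rfft(x)) ** 2
            freqs = np.fft.rfftfreq(x.size, d=dt)
            idx = np.argsort(power[1:])[::-1][:n_peaks] + 1   # skip the zero-frequency bin
            return 1.0 / freqs[idx], power[idx]

        # synthetic monthly counts with an annual cycle, standing in for surveillance data
        months = np.arange(84)                                # seven years of monthly data
        counts = 100 + 40 * np.sin(2 * np.pi * months / 12) + np.random.poisson(5, months.size)
        periods, power = dominant_periods(counts, dt=1.0)     # a ~12-month period should rank first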

  2. Improvements in soft gelatin capsule sample preparation for USP-based simethicone FTIR analysis.

    PubMed

    Hargis, Amy D; Whittall, Linda B

    2013-02-23

    Due to the absence of a significant chromophore, Simethicone raw material and finished product analysis is achieved using a FTIR-based method that quantifies the polydimethylsiloxane (PDMS) component of the active ingredient. The method can be found in the USP monographs for several dosage forms of Simethicone-containing pharmaceutical products. For soft gelatin capsules, the PDMS assay values determined using the procedure described in the USP method were variable (%RSDs from 2 to 9%) and often lower than expected based on raw material values. After investigation, it was determined that the extraction procedure used for sample preparation was causing loss of material to the container walls due to the hydrophobic nature of PDMS. Evaluation revealed that a simple dissolution of the gelatin capsule fill in toluene provided improved assay results (%RSDs≤0.5%) as well as a simplified and rapid sample preparation. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Strain-Based Damage Determination Using Finite Element Analysis for Structural Health Management

    NASA Technical Reports Server (NTRS)

    Hochhalter, Jacob D.; Krishnamurthy, Thiagaraja; Aguilo, Miguel A.

    2016-01-01

    A damage determination method is presented that relies on in-service strain sensor measurements. The method employs a gradient-based optimization procedure combined with the finite element method for solution to the forward problem. It is demonstrated that strains, measured at a limited number of sensors, can be used to accurately determine the location, size, and orientation of damage. Numerical examples are presented to demonstrate the general procedure. This work is motivated by the need to provide structural health management systems with a real-time damage characterization. The damage cases investigated herein are characteristic of point-source damage, which can attain critical size during flight. The procedure described can be used to provide prognosis tools with the current damage configuration.

  4. A posteriori noise estimation in variable data sets. With applications to spectra and light curves

    NASA Astrophysics Data System (ADS)

    Czesla, S.; Molle, T.; Schmitt, J. H. M. M.

    2018-01-01

    Most physical data sets contain a stochastic contribution produced by measurement noise or other random sources along with the signal. Usually, neither the signal nor the noise is accurately known prior to the measurement, so that both have to be estimated a posteriori. We have studied a procedure to estimate the standard deviation of the stochastic contribution assuming normality and independence, requiring a sufficiently well-sampled data set to yield reliable results. This procedure is based on estimating the standard deviation in a sample of weighted sums of arbitrarily sampled data points and is identical to the so-called DER_SNR algorithm for specific parameter settings. To demonstrate the applicability of our procedure, we present applications to synthetic data, high-resolution spectra, and a large sample of space-based light curves and, finally, give guidelines to apply the procedure in situations not explicitly considered here to promote its adoption in data analysis.
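
    For the specific parameter settings mentioned in the abstract the procedure coincides with the DER_SNR estimator, which derives the noise standard deviation from the median of weighted sums of neighbouring samples. A minimal sketch on synthetic data follows; the constant and index offsets are those of the standard DER_SNR formula, assumed here rather than taken from the paper.

        import numpy as np

        def der_snr_noise(flux):
            # DER_SNR estimate of the noise standard deviation of a well-sampled series:
            # sigma = 1.482602 / sqrt(6) * median(|2*f_i - f_(i-2) - f_(i+2)|).
            f = np.asarray(flux, float)
            f = f[np.isfinite(f)]
            return 1.482602 / np.sqrt(6.0) * np.median(np.abs(2.0 * f[2:-2] - f[:-4] - f[4:]))

        # smooth synthetic "spectrum" plus Gaussian noise of known sigma = 0.05
        x = np.linspace(0.0, 10.0, 2000)
        flux = 1.0 + 0.3 * np.sin(x) + np.random.normal(0.0, 0.05, x.size)
        sigma_hat = der_snr_noise(flux)      # should recover a value close to 0.05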

  5. FDI based on Artificial Neural Network for Low-Voltage-Ride-Through in DFIG-based Wind Turbine.

    PubMed

    Adouni, Amel; Chariag, Dhia; Diallo, Demba; Ben Hamed, Mouna; Sbita, Lassaâd

    2016-09-01

    Under modern electrical grid codes, wind turbines need to operate continuously even in the presence of severe grid faults such as those covered by Low Voltage Ride Through (LVRT) requirements. Hence, a new LVRT Fault Detection and Identification (FDI) procedure has been developed to support the appropriate decision and, in turn, the suitable control strategy. To obtain a better decision and enhanced FDI during grid faults, the proposed procedure is based on the analysis of voltage indicators using a new Artificial Neural Network (ANN) architecture. Two features are extracted (the amplitude and the phase angle). The procedure is divided into two steps: the first is fault indicator generation and the second is indicator analysis for fault diagnosis. The first step is composed of six ANNs dedicated to describing the three phases of the grid (three amplitudes and three phase angles). The second step is composed of a single ANN which analyses the indicators and generates a decision signal that describes the operating mode (healthy or faulty). In addition, the decision signal identifies the fault type, allowing the four fault types to be distinguished. The diagnosis procedure is tested in simulation and on an experimental prototype. The obtained results confirm its efficiency, rapidity, robustness, and immunity to noise and unknown inputs. Copyright © 2016 ISA. Published by Elsevier Ltd. All rights reserved.

  6. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    PubMed

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  7. Integrated flight/propulsion control - Subsystem specifications for performance

    NASA Technical Reports Server (NTRS)

    Neighbors, W. K.; Rock, Stephen M.

    1993-01-01

    A procedure is presented for calculating multiple subsystem specifications given a number of performance requirements on the integrated system. This procedure applies to problems where the control design must be performed in a partitioned manner. It is based on a structured singular value analysis, and generates specifications as magnitude bounds on subsystem uncertainties. The performance requirements should be provided in the form of bounds on transfer functions of the integrated system. This form allows the expression of model following, command tracking, and disturbance rejection requirements. The procedure is demonstrated on a STOVL aircraft design.

  8. A new procedure for calculating contact stresses in gear teeth

    NASA Technical Reports Server (NTRS)

    Somprakit, Paisan; Huston, Ronald L.

    1991-01-01

    A numerical procedure for evaluating and monitoring contact stresses in meshing gear teeth is discussed. The procedure is intended to extend the range of applicability and to improve the accuracy of gear contact stress analysis. The procedure is based upon fundamental solution from the theory of elasticity. It is an iterative numerical procedure. The method is believed to have distinct advantages over the classical Hertz method, the finite-element method, and over existing approaches with the boundary element method. Unlike many classical contact stress analyses, friction effects and sliding are included. Slipping and sticking in the contact region are studied. Several examples are discussed. The results are in agreement with classical results. Applications are presented for spur gears.

  9. Estimating equations estimates of trends

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1994-01-01

    The North American Breeding Bird Survey monitors changes in bird populations through time using annual counts at fixed survey sites. The usual method of estimating trends has been to use the logarithm of the counts in a regression analysis. It is contended that this procedure is reasonably satisfactory for more abundant species, but produces biased estimates for less abundant species. An alternative estimation procedure based on estimating equations is presented.
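
    The "usual method" discussed in the abstract, regressing the logarithm of the counts on year, is easy to state in code; the estimating-equations estimator proposed by the authors is not reproduced here. The sketch below uses hypothetical route counts and a common +0.5 offset (an assumption, not the survey's convention) to handle zero counts.

        import numpy as np

        def log_linear_trend(years, counts):
            # The "usual" method: regress log counts on year and report the
            # exponentiated slope as an annual rate of change.
            years = np.asarray(years, float)
            y = np.log(np.asarray(counts, float) + 0.5)   # +0.5 guards against zero counts
            slope, _ = np.polyfit(years - years.mean(), y, 1)
            return np.exp(slope)                          # e.g. 1.03 means about 3% growth per year

        # hypothetical annual counts for one survey route
        years = np.arange(1980, 1994)
        counts = np.random.poisson(20 * 1.02 ** (years - 1980))
        annual_change = log_linear_trend(years, counts)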

  10. A TOTP-based enhanced route optimization procedure for mobile IPv6 to reduce handover delay and signalling overhead.

    PubMed

    Shah, Peer Azmat; Hasbullah, Halabi B; Lawal, Ibrahim A; Aminu Mu'azu, Abubakar; Tang Jung, Low

    2014-01-01

    Due to the proliferation of handheld mobile devices, multimedia applications like Voice over IP (VoIP), video conferencing, network music, and online gaming are gaining popularity in recent years. These applications are well known to be delay sensitive and resource demanding. The mobility of mobile devices, running these applications, across different networks causes delay and service disruption. Mobile IPv6 was proposed to provide mobility support to IPv6-based mobile nodes for continuous communication when they roam across different networks. However, the Route Optimization procedure in Mobile IPv6 involves the verification of mobile node's reachability at the home address and at the care-of address (home test and care-of test) that results in higher handover delays and signalling overhead. This paper presents an enhanced procedure, time-based one-time password Route Optimization (TOTP-RO), for Mobile IPv6 Route Optimization that uses the concepts of shared secret Token, time based one-time password (TOTP) along with verification of the mobile node via direct communication and maintaining the status of correspondent node's compatibility. The TOTP-RO was implemented in network simulator (NS-2) and an analytical analysis was also made. Analysis showed that TOTP-RO has lower handover delays, packet loss, and signalling overhead with an increased level of security as compared to the standard Mobile IPv6's Return-Routability-based Route Optimization (RR-RO).
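
    The abstract does not spell out the full TOTP-RO message exchange, but the time-based one-time password primitive it builds on is standard (RFC 6238 style). A minimal sketch of that primitive is given below; the shared secret and parameters are placeholders, and the surrounding binding-update protocol between mobile node and correspondent node is not modelled.

        import hmac
        import hashlib
        import struct
        import time

        def totp(secret, time_step=30, digits=6, at_time=None):
            # RFC 6238-style time-based one-time password (HMAC-SHA1, dynamic truncation).
            counter = int((time.time() if at_time is None else at_time) // time_step)
            digest = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
            offset = digest[-1] & 0x0F
            code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
            return str(code).zfill(digits)

        # both ends derive the same short-lived token from a pre-shared secret (placeholder value)
        shared_secret = b"example-shared-secret"
        now = time.time()
        token_mobile_node = totp(shared_secret, at_time=now)       # computed by the mobile node
        token_correspondent = totp(shared_secret, at_time=now)     # recomputed by the correspondent node
        assert token_mobile_node == token_correspondent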

  11. Mindfulness-Based Approaches in the Treatment of Disordered Gambling: A Systematic Review and Meta-Analysis

    ERIC Educational Resources Information Center

    Maynard, Brandy R.; Wilson, Alyssa N.; Labuzienski, Elizabeth; Whiting, Seth W.

    2018-01-01

    Background and Aims: To examine the effects of mindfulness-based interventions on gambling behavior and symptoms, urges, and financial outcomes. Method: Systematic review and meta-analytic procedures were employed to search, select, code, and analyze studies conducted between 1980 and 2014, assessing the effects of mindfulness-based interventions…

  12. Abdominoplasty: Risk Factors, Complication Rates, and Safety of Combined Procedures.

    PubMed

    Winocour, Julian; Gupta, Varun; Ramirez, J Roberto; Shack, R Bruce; Grotting, James C; Higdon, K Kye

    2015-11-01

    Among aesthetic surgery procedures, abdominoplasty is associated with a higher complication rate, but previous studies are limited by small sample sizes or single-institution experience. A cohort of patients who underwent abdominoplasty between 2008 and 2013 was identified from the CosmetAssure database. Major complications were recorded. Univariate and multivariate analysis was performed evaluating risk factors, including age, smoking, body mass index, sex, diabetes, type of surgical facility, and combined procedures. The authors identified 25,478 abdominoplasties from 183,914 procedures in the database. Of these, 8,975 patients had abdominoplasty alone and 16,503 underwent additional procedures. The number of complications recorded was 1,012 (4.0 percent overall rate versus 1.4 percent in other aesthetic surgery procedures). Of these, 31.5 percent were hematomas, 27.2 percent were infections and 20.2 percent were suspected or confirmed venous thromboembolism. On multivariate analysis, significant risk factors (p < 0.05) included male sex (relative risk, 1.8), age 55 years or older (1.4), body mass index greater than or equal to 30 (1.3), multiple procedures (1.5), and procedure performance in a hospital or surgical center versus office-based surgical suite (1.6). Combined procedures increased the risk of complication (abdominoplasty alone, 3.1 percent; with liposuction, 3.8 percent; breast procedure, 4.3 percent; liposuction and breast procedure, 4.6 percent; body-contouring procedure, 6.8 percent; liposuction and body-contouring procedure, 10.4 percent). Abdominoplasty is associated with a higher complication rate compared with other aesthetic procedures. Combined procedures can significantly increase complication rates and should be considered carefully in higher risk patients. Risk, II.

  13. [Scientific monitoring of the visitation procedure in inpatient rehabilitation centres of the German statutory pension insurance fund--the "Visit II" Project].

    PubMed

    Neuderth, S; Saupe-Heide, M; Brückner, U; Gross, B; Wenderoth, N; Vogel, H

    2012-06-01

    Visitation procedures are an established method of external quality assurance. They have been conducted for many years in the German statutory pension insurance's medical rehabilitation centres and have continuously been refined and standardized. The overall goal of the visitation procedure implemented by the German statutory pension fund is to ensure compliance with defined quality standards as well as information exchange and counselling of rehabilitation centres. In the context of advancing the visitation procedure in the German statutory pension funds' medical rehabilitation centres, the "Visit II" Project was initiated to evaluate the perspectives and expectations of the various professional groups involved in the visitations and to modify the materials used during visitations (documentation form and manual). Evaluation data from the rehabilitation centres visited in 2008 were gathered using both written surveys (utilization analysis) and telephone-based interviews with administration managers and chief physicians. The utilization analysis procedure was evaluated with regard to its methodological quality. In addition, the pension insurance physicians in charge of patient allocation during socio-medical assessment were surveyed with regard to potential needs for revision of the visitation procedure. Data collection was complemented by expert panels with auditors. Interviews with users as part of the formative evaluation of the visitation procedure showed positive results regarding acceptance and applicability of the visitations as well as of the utilization analysis procedures. Various suggestions were made with regard to modification and revision of the visitation materials, that could be implemented in many cases. Documentation forms were supplemented by current scientifically-based topics in rehabilitation (e. g., vocationally oriented measures), whereas items with minor relevance were skipped. The manual (for somatic indications) was thoroughly revised. The transparent presentation of visitation processes and visitation criteria has proven to be a useful basis for strengthening the cooperation between the statutory pension insurance funds and the rehabilitation centres. Moreover, it is a helpful tool for the systematic and continuous advancement of this complex method by including all parties involved. © Georg Thieme Verlag KG Stuttgart · New York.

  14. 3D spherical-cap fitting procedure for (truncated) sessile nano- and micro-droplets & -bubbles.

    PubMed

    Tan, Huanshu; Peng, Shuhua; Sun, Chao; Zhang, Xuehua; Lohse, Detlef

    2016-11-01

    In the study of nanobubbles, nanodroplets or nanolenses immobilised on a substrate, a cross-section of a spherical cap is widely applied to extract geometrical information from atomic force microscopy (AFM) topographic images. In this paper, we have developed a comprehensive 3D spherical-cap fitting procedure (3D-SCFP) to extract morphologic characteristics of complete or truncated spherical caps from AFM images. Our procedure integrates several advanced digital image analysis techniques to construct a 3D spherical-cap model, from which the geometrical parameters of the nanostructures are extracted automatically by a simple algorithm. The procedure takes into account all valid data points in the construction of the 3D spherical-cap model to achieve high fidelity in morphology analysis. We compare our 3D fitting procedure with the commonly used 2D cross-sectional profile fitting method to determine the contact angle of a complete spherical cap and a truncated spherical cap. The results from 3D-SCFP are consistent and accurate, while 2D fitting is unavoidably arbitrary in the selection of the cross-section and has a much lower number of data points on which the fitting can be based, which in addition is biased to the top of the spherical cap. We expect that the developed 3D spherical-cap fitting procedure will find many applications in imaging analysis.
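
    The paper's full 3D-SCFP pipeline (image segmentation, baseline correction, truncation handling) is not reproduced here. As a minimal sketch of the core step under simplifying assumptions, an algebraic least-squares sphere fit to AFM height data recovers the cap centre, radius and contact angle; the function name, the synthetic data, and the assumption that the substrate lies in the plane z = 0 are illustrative only.

```python
import numpy as np

def fit_spherical_cap(points):
    """Least-squares sphere fit to (N, 3) surface points of a sessile cap.

    Solves |x - c|^2 = R^2, rewritten as a linear system in (cx, cy, cz, d):
        2*cx*x + 2*cy*y + 2*cz*z + d = x^2 + y^2 + z^2,   d = R^2 - |c|^2.
    Assumes the substrate plane is z = 0 (illustrative assumption).
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    A = np.column_stack((2 * x, 2 * y, 2 * z, np.ones_like(x)))
    b = x**2 + y**2 + z**2
    (cx, cy, cz, d), *_ = np.linalg.lstsq(A, b, rcond=None)
    R = np.sqrt(d + cx**2 + cy**2 + cz**2)
    # Contact angle of the cap cut by the plane z = 0:
    # cos(theta) = -cz / R (theta < 90 deg when the centre lies below the substrate).
    theta = np.degrees(np.arccos(np.clip(-cz / R, -1.0, 1.0)))
    return (cx, cy, cz), R, theta

# Synthetic test: a unit-radius cap with a 60-degree contact angle.
rng = np.random.default_rng(0)
theta_true = np.radians(60.0)
cz_true = -np.cos(theta_true)                      # sphere centre below the substrate
phi = rng.uniform(0, 2 * np.pi, 2000)
mu = rng.uniform(np.cos(theta_true), 1.0, 2000)    # cos(polar angle), restricted to the cap
pts = np.column_stack((np.sqrt(1 - mu**2) * np.cos(phi),
                       np.sqrt(1 - mu**2) * np.sin(phi),
                       mu + cz_true))
centre, R, theta = fit_spherical_cap(pts)
print(round(R, 3), round(theta, 1))                # ~1.0 and ~60.0
```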

  15. How Multiple Interventions Influenced Employee Turnover: A Case Study.

    ERIC Educational Resources Information Center

    Hatcher, Timothy

    1999-01-01

    A 3-year study of 46 textile industry workers identified causes of employee turnover (supervision, training, organizational communication) using performance analysis. A study of multiple interventions based on the analysis resulted in changes in orientation procedures, organizational leadership, and climate, reducing turnover by 24%. (SK)

  16. New Software for Market Segmentation Analysis: A Chi-Square Interaction Detector. AIR 1983 Annual Forum Paper.

    ERIC Educational Resources Information Center

    Lay, Robert S.

    The advantages and disadvantages of new software for market segmentation analysis are discussed, and the application of this new chi-square-based procedure (CHAID) is illustrated. A comparison is presented with an earlier binary segmentation technique (THAID) and with multiple discriminant analysis. It is suggested that CHAID is superior to earlier…

  17. Alternative Strategies in Assessing Special Education Needs

    ERIC Educational Resources Information Center

    Dykeman, Bruce F.

    2006-01-01

    The conventional use of standardized testing within a discrepancy analysis model is reviewed. The Response-to-Intervention (RTI) process is explained, along with descriptions of assessment procedures within RTI: functional assessment, authentic assessment, curriculum-based measurement, and play-based assessment. Psychometric issues relevant to RTI…

  18. Application of a faith-based integration tool to assess mental and physical health interventions

    PubMed Central

    Saunders, Donna M.; Leak, Jean; Carver, Monique E.; Smith, Selina A.

    2017-01-01

    Background To build on current research involving faith-based interventions (FBIs) for addressing mental and physical health, this study a) reviewed the extent to which relevant publications integrate faith concepts with health and b) initiated analysis of the degree of FBI integration with intervention outcomes. Methods Derived from a systematic search of articles published between 2007 and 2017, 36 studies were assessed with a Faith-Based Integration Assessment Tool (FIAT) to quantify faith-health integration. Basic statistical procedures were employed to determine the association of faith-based integration with intervention outcomes. Results The assessed studies possessed (on average) moderate, inconsistent integration because of poor use of faith measures, and moderate, inconsistent use of faith practices. Analysis procedures for determining the effect of FBI integration on intervention outcomes were inadequate for formulating practical conclusions. Conclusions Regardless of integration, interventions were associated with beneficial outcomes. To determine the link between FBI integration and intervention outcomes, additional analyses are needed. PMID:29354795

  19. Real-time motion compensated patient positioning and non-rigid deformation estimation using 4-D shape priors.

    PubMed

    Wasza, Jakob; Bauer, Sebastian; Hornegger, Joachim

    2012-01-01

    Over the last years, range imaging (RI) techniques have been proposed for patient positioning and respiration analysis in motion compensation. Yet, current RI-based approaches for patient positioning employ rigid-body transformations, thus neglecting free-form deformations induced by respiratory motion. Furthermore, RI-based respiration analysis relies on non-rigid registration techniques with run-times of several seconds. In this paper we propose a real-time framework based on RI to perform respiratory motion compensated positioning and non-rigid surface deformation estimation in a joint manner. The core of our method is a set of pre-procedurally obtained 4-D shape priors that drive the intra-procedural alignment of the patient to the reference state, simultaneously yielding a rigid-body table transformation and a free-form deformation accounting for respiratory motion. We show that our method outperforms conventional alignment strategies by a factor of 3.0 and 2.3 in rotation and translation accuracy, respectively. Using a GPU-based implementation, we achieve run-times of 40 ms.
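
    The joint rigid/non-rigid framework and the 4-D shape priors described above are not reproduced here; the sketch below illustrates only the rigid-body component of such an alignment, i.e. a least-squares (Kabsch) fit between corresponding surface points, under the simplifying assumption that point correspondences are already known. Function names and test data are illustrative.

```python
import numpy as np

def rigid_align(source, target):
    """Least-squares rigid-body transform (Kabsch) mapping source -> target.

    Both arrays are (N, 3) with row-wise correspondence assumed (in practice
    correspondences come from the registration itself). Returns rotation R (3x3)
    and translation t (3,) such that target ~ source @ R.T + t.
    """
    mu_s, mu_t = source.mean(axis=0), target.mean(axis=0)
    H = (source - mu_s).T @ (target - mu_t)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # guard against reflections
    R = Vt.T @ D @ U.T
    t = mu_t - R @ mu_s
    return R, t

# Synthetic check: recover a known table rotation and translation.
rng = np.random.default_rng(1)
src = rng.normal(size=(500, 3))
angle = np.radians(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
tgt = src @ R_true.T + np.array([5.0, -2.0, 0.5])
R, t = rigid_align(src, tgt)
print(np.allclose(R, R_true), np.allclose(t, [5.0, -2.0, 0.5]))  # True True
```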

  20. A Procedure for Structural Weight Estimation of Single Stage to Orbit Launch Vehicles (Interim User's Manual)

    NASA Technical Reports Server (NTRS)

    Martinovic, Zoran N.; Cerro, Jeffrey A.

    2002-01-01

    This is an interim user's manual for current procedures used in the Vehicle Analysis Branch at NASA Langley Research Center, Hampton, Virginia, for launch vehicle structural subsystem weight estimation based on finite element modeling and structural analysis. The process is intended to complement traditional methods of conceptual and early preliminary structural design such as the application of empirical weight estimation or application of classical engineering design equations and criteria on one dimensional "line" models. Functions of two commercially available software codes are coupled together. Vehicle modeling and analysis are done using SDRC/I-DEAS, and structural sizing is performed with the Collier Research Corp. HyperSizer program.

  1. Modal analysis of the thermal conductivity of nanowires: examining unique thermal transport features.

    PubMed

    Samaraweera, Nalaka; Larkin, Jason M; Chan, Kin L; Mithraratne, Kumar

    2018-06-06

    In this study, unique thermal transport features of nanowires over bulk materials are investigated using a combined analysis based on lattice dynamics and equilibrium molecular dynamics (EMD). The evaluation of the thermal conductivity (TC) of Lennard-Jones nanowires becomes feasible due to the multi-step normal mode decomposition (NMD) procedure implemented in the study. A convergence issue of the TC of nanowires is addressed by the NMD implementation for two case studies, which employ pristine nanowires (PNW) and superlattice nanowires. Interestingly, mode relaxation times at low frequencies of acoustic branches exhibit signs of approaching constant values, thus indicating the convergence of TC. The TC evaluation procedure is further verified by implementing EMD-based Green-Kubo analysis, which is based on a fundamentally different physical perspective. Having verified the NMD procedure, the non-monotonic trend of the TC of nanowires is addressed. It is shown that the principal cause of the observed trend is the competing effects of long wavelength phonons and phonon-surface scattering as the nanowire's cross-sectional width is changed. A computational procedure is developed to decompose the different modal contributions to the TC of shell alloy nanowires (SANWs) using virtual crystal NMD and the Allen-Feldman theory. Several important conclusions can be drawn from the results. A propagons to non-propagons boundary appears, resulting in a cut-off frequency (ω_cut); moreover, as alloy atomic mass is increased, ω_cut shifts to lower frequencies. The existence of non-propagons partly causes the low TC of SANWs. It can be seen that modes with low frequencies demonstrate a similar behavior to corresponding modes of PNWs. Moreover, lower group velocities associated with higher alloy atomic mass result in a lower TC of SANWs.
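
    Neither the NMD decomposition nor the nanowire simulations themselves are reproduced here. As a minimal illustration of the Green-Kubo route mentioned in the abstract, the thermal conductivity can be written as κ = V / (3 k_B T²) ∫ ⟨J(0)·J(t)⟩ dt and estimated from a recorded heat-flux time series; the flux data, units and array names below are placeholders, not output from the study.

```python
import numpy as np

kB = 1.380649e-23  # J/K

def green_kubo_kappa(J, dt, volume, temperature):
    """Green-Kubo thermal conductivity from a heat-flux time series.

    J           : (n_steps, 3) instantaneous heat-flux vector (SI units assumed)
    dt          : sampling interval in seconds
    volume      : simulation-cell volume in m^3
    temperature : temperature in K
    Returns the running integral kappa(t); its plateau estimates the TC.
    """
    n = len(J)
    max_lag = n // 2
    acf = np.zeros(max_lag)
    for lag in range(max_lag):          # <J(0) . J(t)> averaged over time origins
        acf[lag] = np.mean(np.sum(J[: n - lag] * J[lag:], axis=1))
    prefactor = volume / (3.0 * kB * temperature**2)
    return prefactor * np.cumsum(acf) * dt   # a trapezoidal rule would also do

# Placeholder usage with synthetic flux data (real input would come from EMD output).
rng = np.random.default_rng(2)
J = rng.normal(scale=1e-10, size=(10_000, 3))
kappa_t = green_kubo_kappa(J, dt=1e-15, volume=1e-25, temperature=300.0)
print(kappa_t[-1])
```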

  2. Modal analysis of the thermal conductivity of nanowires: examining unique thermal transport features

    NASA Astrophysics Data System (ADS)

    Samaraweera, Nalaka; Larkin, Jason M.; Chan, Kin L.; Mithraratne, Kumar

    2018-06-01

    In this study, unique thermal transport features of nanowires over bulk materials are investigated using a combined analysis based on lattice dynamics and equilibrium molecular dynamics (EMD). The evaluation of the thermal conductivity (TC) of Lennard–Jones nanowires becomes feasible due to the multi-step normal mode decomposition (NMD) procedure implemented in the study. A convergence issue of the TC of nanowires is addressed by the NMD implementation for two case studies, which employ pristine nanowires (PNW) and superlattice nanowires. Interestingly, mode relaxation times at low frequencies of acoustic branches exhibit signs of approaching constant values, thus indicating the convergence of TC. The TC evaluation procedure is further verified by implementing EMD-based Green–Kubo analysis, which is based on a fundamentally different physical perspective. Having verified the NMD procedure, the non-monotonic trend of the TC of nanowires is addressed. It is shown that the principal cause of the observed trend is the competing effects of long wavelength phonons and phonon–surface scattering as the nanowire's cross-sectional width is changed. A computational procedure is developed to decompose the different modal contributions to the TC of shell alloy nanowires (SANWs) using virtual crystal NMD and the Allen–Feldman theory. Several important conclusions can be drawn from the results. A propagons to non-propagons boundary appears, resulting in a cut-off frequency (ω_cut); moreover, as alloy atomic mass is increased, ω_cut shifts to lower frequencies. The existence of non-propagons partly causes the low TC of SANWs. It can be seen that modes with low frequencies demonstrate a similar behavior to corresponding modes of PNWs. Moreover, lower group velocities associated with higher alloy atomic mass result in a lower TC of SANWs.

  3. Seismic Structural Considerations for the Stem and Base of Retaining Walls Subjected to Earthquake Ground Motions

    DTIC Science & Technology

    2005-05-01

    Report-form fragment (the abstract is truncated in the source record): the report concerns a response spectrum/modal analysis procedure for the stem and base of retaining walls subjected to earthquake ground motions. Structural demands represented by response spectra are determined, several modes of vibration are included in the analysis, and particular attention is given to the number of excursions beyond acceptable displacement.

  4. Modified procedure to determine acid-insoluble lignin in wood and pulp

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Effland, M.J.

    1977-10-01

    If wood is treated with strong acid, carbohydrates are hydrolyzed and solubilized. The insoluble residue is by definition lignin and can be measured gravimetrically. The standard method of analysis requires samples of 1 or 2 g of wood or pulp. In research at this laboratory these amounts of sample are often not available for analytical determinations. Thus we developed a modification of the standard procedure suitable for much smaller sample amounts. The modification is based on the procedure of Saeman. Wood samples require extraction prior to lignin analysis to remove acid-insoluble extractives that will be measured as lignin. Usually this involves only a standard extraction with ethanol-benzene. However, woods high in tannin must also be subjected to extraction with alcohol. Pulps seldom require extraction.

  5. [From Science to Law: Findings of Reha XI Project on Ascertaining the Need for Rehabilitation in Medical Service Assessments].

    PubMed

    Kalwitzki, T; Huter, K; Runte, R; Breuninger, K; Janatzek, S; Gronemeyer, S; Gansweid, B; Rothgang, H

    2017-03-01

    Introduction: In the broad-based consortium project "Reha XI - Identifying rehabilitative requirements in medical service assessments: evaluation and implementation", a comprehensive analysis of the corresponding procedures was carried out by the medical services of the German Health Insurance Funds (MDK). On the basis of this analysis, a Good Practice Standard (GPS) for assessments was drawn up and scientifically evaluated. This article discusses the findings and applicability of the GPS as the basis for a nationwide standardized procedure in Germany as required by the Second Act to Strengthen Long-Term Care (PSG II) under Vol. XI Para. 18 (6) of the German Social Welfare Code. Method: The consortium project comprised four project phases: 1. Qualitative and quantitative situation analysis of the procedures for ascertaining rehabilitative needs in care assessments carried out by the MDK; 2. Development of a Good Practice Standard (GPS) in a structured, consensus-based procedure; 3. Scientific evaluation of the validity, reliability and practicability of the assessment procedure according to the GPS in the MDK's operational practice; 4. Survey of long-term care insurance funds with respect to the appropriateness of the rehabilitation recommendations drawn up by care assessors in line with the GPS for providing a qualified recommendation for the applicant. The evaluation carried out in the third project phase was subject to methodological limitations that may have given rise to distortions in the findings. Findings: On the basis of the situation analysis, 7 major thematic areas were identified in which improvements were implemented by applying the GPS. For the evaluation of the GPS, a total of 3 247 applicants were assessed in line with the GPS; in 6.3% of the applicants, an indication for medical rehabilitation was determined. The GPS procedure showed a high degree of reliability and practicability, but the values for the validity of the assessment procedure were highly unsatisfactory. The degree of acceptance by the long-term care insurance funds with respect to the recommendations for rehabilitation following the GPS procedure was high. Conclusion: The application of a general standard across all MDKs shows marked improvements in the quality of the assessment procedure and leads more frequently to the ascertainment of an indication for medical rehabilitation. The methodological problems and the unsatisfactory findings with respect to the validity of the assessors' decisions require further scientific scrutiny. © Georg Thieme Verlag KG Stuttgart · New York.

  6. Improved method for roadside barrier length of need modeling using real-world trajectories.

    PubMed

    Johnson, Nicholas S; Thomson, Robert; Gabler, Hampton C

    2015-07-01

    The 2011 AASHTO Roadside Design Guide (RDG) contains perhaps the most widely used procedure for choosing an appropriate length of need (LON) for roadside barriers. However, this procedure has several limitations. The procedure uses a highly simplified model of vehicle departure, and the procedure does not allow designers to specify an explicit level of protection. A new procedure for choosing LON that addresses these limitations is presented in this paper. This new procedure is based on recent, real-world road departure trajectories and uses this departure data in a more realistic way. The new procedure also allows LON to be specified for a precisely known level of protection - a level which can be based on number of crashes, injury outcomes or even estimated crash cost - while still remaining straightforward and quick to use like the 2011 RDG procedure. In this analysis, the improved procedure was used to explore the effects of the RDG procedure's assumptions. LON recommendations given by the 2011 RDG procedure were compared with recommendations given by this improved procedure. For 55 mph roads, the 2011 RDG procedure appears to lead to a LON sufficient to intercept between 80% and 90% of right-side departures that would otherwise strike a hazard located 10 m from the roadway. For hazards closer than 10 m, the 2011 RDG procedure intercepts progressively higher percentages of real-world departures. This suggests the protection level provided by the 2011 RDG procedure varies with the hazard offset, becoming more conservative as the hazard moves closer to the roadway. The improved procedure, by comparison, gives a consistent protection level regardless of hazard location. Copyright © 2015. Published by Elsevier Ltd.
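
    The trajectory database and the authors' exact procedure are not available here; the sketch below only illustrates the general idea of grading a candidate length of need by the fraction of simulated hazard-bound encroachments it intercepts. The straight-path departure model, the angle and lateral-extent distributions, the geometry, and all parameter values are simplifying assumptions, not the method of either the RDG or the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 200_000

# Toy straight-path departure model (illustrative distributions only).
phi = np.radians(np.minimum(rng.lognormal(mean=np.log(8.0), sigma=0.5, size=N), 45.0))
y_max = rng.exponential(scale=8.0, size=N)        # lateral extent if nothing is struck
x0 = rng.uniform(-300.0, 0.0, size=N)             # departure point, upstream of the hazard

d_hazard, hazard_len, y_barrier = 10.0, 5.0, 2.0  # hazard offset/length, barrier offset (m)

x_at_hazard = x0 + d_hazard / np.tan(phi)         # where each path reaches the hazard face
strikes = (y_max >= d_hazard) & (x_at_hazard >= 0.0) & (x_at_hazard <= hazard_len)
x_at_barrier = x0 + y_barrier / np.tan(phi)       # where each path crosses the barrier line

def protection_level(L):
    """Fraction of hazard-bound departures intercepted by a barrier at offset
    y_barrier that covers the hazard face and extends L metres upstream of it."""
    intercepted = strikes & (x_at_barrier >= -L)
    return intercepted.sum() / strikes.sum()

for L in (20, 40, 60, 80, 100):
    print(L, round(float(protection_level(L)), 3))   # protection grows with L
```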

  7. In vitro biofilm formation on resin-based composites after different finishing and polishing procedures.

    PubMed

    Cazzaniga, Gloria; Ottobelli, Marco; Ionescu, Andrei C; Paolone, Gaetano; Gherlone, Enrico; Ferracane, Jack L; Brambilla, Eugenio

    2017-12-01

    To evaluate the influence of surface treatments of different resin-based composites (RBCs) on S. mutans biofilm formation. 4 RBCs (microhybrid, nanohybrid, nanofilled, bulk-filled) and 6 finishing-polishing (F/P) procedures (open-air light-curing, light-curing against Mylar strip, aluminum oxide discs, one-step rubber point, diamond bur, multi-blade carbide bur) were evaluated. Surface roughness (SR) (n=5/group), gloss (n=5/group), scanning electron microscopy morphological analysis (SEM), energy-dispersive X-ray spectrometry (EDS) (n=3/group), and S. mutans biofilm formation (n=16/group) were assessed. EDS analysis was repeated after the biofilm assay. A morphological evaluation of S. mutans biofilm was also performed using confocal laser-scanning microscopy (CLSM) (n=2/group). The data were analyzed using Wilcoxon (SR, gloss) and two-way ANOVA with Tukey as post-hoc tests (EDS, biofilm formation). F/P procedures as well as RBCs significantly influenced SR and gloss. While F/P procedures did not significantly influence S. mutans biofilm formation, a significant influence of RBCs on the same parameter was found. Different RBCs showed different surface elemental composition. Both F/P procedures and S. mutans biofilm formation significantly modified this parameter. The tested F/P procedures significantly influenced RBCs surface properties but did not significantly affect S. mutans biofilm formation. The significant influence of the different RBCs tested on S. mutans biofilm formation suggests that material characteristics and composition play a greater role than SR. F/P procedures of RBCs may unexpectedly play a minor role compared to that of the restoration material itself in bacterial colonization. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Blurry-frame detection and shot segmentation in colonoscopy videos

    NASA Astrophysics Data System (ADS)

    Oh, JungHwan; Hwang, Sae; Tavanapong, Wallapak; de Groen, Piet C.; Wong, Johnny

    2003-12-01

    Colonoscopy is an important screening procedure for colorectal cancer. During this procedure, the endoscopist visually inspects the colon. Human inspection, however, is not without error. We hypothesize that colonoscopy videos may contain additional valuable information missed by the endoscopist. Video segmentation is the first necessary step for the content-based video analysis and retrieval to provide efficient access to the important images and video segments from a large colonoscopy video database. Based on the unique characteristics of colonoscopy videos, we introduce a new scheme to detect and remove blurry frames, and segment the videos into shots based on the contents. Our experimental results show that the average precision and recall of the proposed scheme are over 90% for the detection of non-blurry images. The proposed method of blurry frame detection and shot segmentation is extensible to the videos captured from other endoscopic procedures such as upper gastrointestinal endoscopy, enteroscopy, cystoscopy, and laparoscopy.
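
    The paper's colonoscopy-specific detection scheme is not reproduced here. A common, minimal proxy for flagging blurry frames is the variance of the Laplacian of each grayscale frame, thresholded against a tuned cutoff, as sketched below; the threshold value and file name are placeholders.

```python
import cv2

def blurry_frame_indices(video_path, threshold=60.0):
    """Return indices of frames whose Laplacian variance falls below `threshold`.

    Low variance of the Laplacian indicates few sharp edges, a standard
    (if crude) proxy for out-of-focus or motion-blurred frames.
    """
    cap = cv2.VideoCapture(video_path)
    blurry, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        if cv2.Laplacian(gray, cv2.CV_64F).var() < threshold:
            blurry.append(idx)
        idx += 1
    cap.release()
    return blurry

# Placeholder usage; shot boundaries could then be placed where long runs of
# informative frames are separated by removed blurry segments.
print(len(blurry_frame_indices("colonoscopy_clip.avi")))
```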

  9. Office-based narrow band imaging-guided flexible laryngoscopy tissue sampling: A cost-effectiveness analysis evaluating its impact on Taiwanese health insurance program.

    PubMed

    Fang, Tuan-Jen; Li, Hsueh-Yu; Liao, Chun-Ta; Chiang, Hui-Chen; Chen, I-How

    2015-07-01

    Narrow band imaging (NBI)-guided flexible laryngoscopy tissue sampling for laryngopharyngeal lesions is a novel technique. Patients underwent the procedure in an office-based setting without being sedated, which is different from the conventional technique performed using direct laryngoscopy. Although the feasibility and effects of this procedure were established, its financial impact on the institution and Taiwanese National Health Insurance program was not determined. This is a retrospective case-control study. From May 2010 to April 2011, 20 consecutive patients who underwent NBI flexible laryngoscopy tissue sampling were recruited. During the same period, another 20 age-, sex-, and lesion-matched cases were enrolled in the control group. The courses for procedures and financial status were analyzed and compared between groups. Office-based NBI flexible laryngoscopy tissue sampling procedure took 27 minutes to be completed, while 191 minutes were required for the conventional technique. Average reimbursement for each case was New Taiwan Dollar (NT$)1264 for patients undergoing office-based NBI flexible laryngoscopy tissue sampling, while NT$10,913 for those undergoing conventional direct laryngoscopy in the operation room (p < 0.001). The institution suffered a loss of at least NT$690 when performing NBI flexible laryngoscopy tissue sampling. Office-based NBI flexible laryngoscopy tissue sampling is a cost-saving procedure for patients and the Taiwanese National Health Insurance program. It also saves the procedure time. However, the net financial loss for the institution and physician would limit its popularization unless reimbursement patterns are changed. Copyright © 2013. Published by Elsevier B.V.

  10. Thermal-stress analysis for a wood composite blade

    NASA Technical Reports Server (NTRS)

    Fu, K. C.; Harb, A.

    1984-01-01

    A thermal-stress analysis of a wind turbine blade made of wood composite material is reported. First, the governing partial differential equation for heat conduction is derived; then, a finite element procedure using a variational approach is developed for the solution of the governing equation. Thus, the temperature distribution throughout the blade is determined. Next, based on the temperature distribution, a finite element procedure using a potential energy approach is applied to determine the thermal-stress distribution. A set of results, considered satisfactory, was obtained by computer. All computer programs are contained in the report.
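
    The report's derivation is not reproduced here; for orientation, the heat-conduction equation that such a finite element procedure discretizes has the standard form

    \[
    \rho c_p \frac{\partial T}{\partial t} = \nabla \cdot \left( k \, \nabla T \right) + q ,
    \]

    where \(T\) is temperature, \(k\) the (for a wood composite, generally orthotropic) thermal conductivity, \(\rho c_p\) the volumetric heat capacity, and \(q\) an internal heat source; for a steady-state temperature field the time-derivative term vanishes. The computed temperatures then enter the stress step through a thermal strain of the form \(\varepsilon_{\mathrm{th}} = \alpha\,(T - T_{\mathrm{ref}})\), with \(\alpha\) the coefficient of thermal expansion. The specific boundary conditions and material model are those assumed in the report and are not restated here.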

  11. Comparing Eye Tracking with Electrooculography for Measuring Individual Sentence Comprehension Duration

    PubMed Central

    Müller, Jana Annina; Wendt, Dorothea; Kollmeier, Birger; Brand, Thomas

    2016-01-01

    The aim of this study was to validate a procedure for performing the audio-visual paradigm introduced by Wendt et al. (2015) with reduced practical challenges. The original paradigm records eye fixations using an eye tracker and calculates the duration of sentence comprehension based on a bootstrap procedure. In order to reduce practical challenges, we first reduced the measurement time by evaluating a smaller measurement set with fewer trials. The results of 16 listeners showed effects comparable to those obtained when testing the original full measurement set on a different collective of listeners. Secondly, we introduced electrooculography as an alternative technique for recording eye movements. The correlation between the results of the two recording techniques (eye tracker and electrooculography) was r = 0.97, indicating that both methods are suitable for estimating the processing duration of individual participants. Similar changes in processing duration arising from sentence complexity were found using the eye tracker and the electrooculography procedure. Thirdly, the time course of eye fixations was estimated with an alternative procedure, growth curve analysis, which is more commonly used in recent studies analyzing eye tracking data. The results of the growth curve analysis were compared with the results of the bootstrap procedure. Both analysis methods show similar processing durations. PMID:27764125
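
    The specific bootstrap used by Wendt et al. to derive fixation-based comprehension durations is not reproduced here; the sketch below shows only the generic bootstrap step, i.e. resampling per-trial duration estimates with replacement to obtain a participant-level mean and percentile confidence interval. The function name and trial values are placeholders.

```python
import numpy as np

def bootstrap_mean_ci(durations_ms, n_boot=10_000, alpha=0.05, seed=0):
    """Bootstrap mean and percentile CI for per-trial processing durations (ms)."""
    rng = np.random.default_rng(seed)
    durations_ms = np.asarray(durations_ms, dtype=float)
    resampled_means = np.array([
        rng.choice(durations_ms, size=durations_ms.size, replace=True).mean()
        for _ in range(n_boot)
    ])
    lo, hi = np.percentile(resampled_means, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return durations_ms.mean(), (lo, hi)

# Placeholder per-trial durations for one listener and one sentence complexity.
trials = [612, 655, 701, 580, 634, 690, 720, 598, 660, 645]
mean, ci = bootstrap_mean_ci(trials)
print(round(mean, 1), tuple(round(v, 1) for v in ci))
```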

  12. Quantitative architectural analysis: a new approach to cortical mapping.

    PubMed

    Schleicher, A; Palomero-Gallagher, N; Morosan, P; Eickhoff, S B; Kowalski, T; de Vos, K; Amunts, K; Zilles, K

    2005-12-01

    Recent progress in anatomical and functional MRI has revived the demand for a reliable, topographic map of the human cerebral cortex. To date, interpretations of specific activations found in functional imaging studies and their topographical analysis in a spatial reference system are often still based on classical architectonic maps. The most commonly used reference atlas is that of Brodmann and his successors, despite its severe inherent drawbacks. One obvious weakness in traditional architectural mapping is the subjective nature of localising borders between cortical areas by means of a purely visual, microscopical examination of histological specimens. To overcome this limitation, more objective, quantitative mapping procedures have been established in the past years. The quantification of the neocortical laminar pattern by defining intensity line profiles across the cortical layers has a long tradition. During the last years, this method has been extended to enable a reliable, reproducible mapping of the cortex based on image analysis and multivariate statistics. Methodological approaches to such algorithm-based cortical mapping were published for various architectural modalities. In our contribution, principles of algorithm-based mapping are described for cyto- and receptorarchitecture. In a cytoarchitectural parcellation of the human auditory cortex using a sliding window procedure, the classical areal pattern of the human superior temporal gyrus was modified by replacing Brodmann's areas 41, 42, 22, and parts of area 21 with a novel, more detailed map. An extension and optimisation of the sliding window procedure to the specific requirements of receptorarchitectonic mapping is also described, using the macaque central sulcus and adjacent superior parietal lobule as a second, biologically independent example. Algorithm-based mapping procedures, however, are not limited to these two architectural modalities, but can be applied to all images in which a laminar cortical pattern can be detected and quantified, e.g. myeloarchitectonic and in vivo high-resolution MR imaging. Defining cortical borders based on changes in cortical lamination in high-resolution, in vivo structural MR images will result in a rapid increase of our knowledge on the structural parcellation of the human cerebral cortex.
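
    The complete observer-independent mapping pipeline (profile extraction, feature selection, significance testing) is not reproduced here. The sketch below illustrates only the core sliding-window idea under simplifying assumptions: a candidate border position is slid along a sequence of cortical intensity-profile feature vectors, and the Mahalanobis distance between the mean feature vectors of the two adjacent blocks is computed, with local maxima marking candidate areal borders. Block size, feature dimension, and the synthetic data are illustrative.

```python
import numpy as np

def sliding_window_distance(features, block=20):
    """Mahalanobis distance between adjacent blocks of profile feature vectors.

    features : (n_profiles, n_features) array, one row per cortical intensity
               profile (e.g. central moments of the laminar GLI profile).
    Returns an array of distances indexed by candidate border position.
    """
    n = len(features)
    dists = np.full(n, np.nan)
    for i in range(block, n - block):
        left, right = features[i - block:i], features[i:i + block]
        pooled_cov = np.cov(np.vstack((left, right)).T) + 1e-6 * np.eye(features.shape[1])
        diff = left.mean(axis=0) - right.mean(axis=0)
        dists[i] = float(np.sqrt(diff @ np.linalg.solve(pooled_cov, diff)))
    return dists

# Synthetic example: a change in laminar pattern half-way along the profile sequence.
rng = np.random.default_rng(4)
feats = np.vstack((rng.normal(0.0, 1.0, size=(100, 5)),
                   rng.normal(0.8, 1.0, size=(100, 5))))
d = sliding_window_distance(feats)
print(int(np.nanargmax(d)))   # close to 100, the simulated border position
```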

  13. Cost analysis of single-use (Ambu® aScope™) and reusable bronchoscopes in the ICU.

    PubMed

    Perbet, S; Blanquet, M; Mourgues, C; Delmas, J; Bertran, S; Longère, B; Boïko-Alaux, V; Chennell, P; Bazin, J-E; Constantin, J-M

    2017-12-01

    Flexible optical bronchoscopes are essential for management of airways in the ICU, but the conventional reusable flexible scopes have three major drawbacks: high cost of repairs, need for decontamination, and possible transmission of infectious agents. The main objective of this study was to measure the cost of bronchoalveolar lavage (BAL) and percutaneous tracheostomy (PT) using reusable bronchoscopes and single-use bronchoscopes in the ICU of a university hospital. The secondary objective was to compare the satisfaction of healthcare professionals with reusable and single-use bronchoscopes. The study was performed between August 2009 and July 2014 in a 16-bed ICU. All BAL and PT procedures were performed by experienced healthcare professionals. Cost analysis was performed considering ICU and hospital organization. Healthcare professional satisfaction with single-use and reusable scopes was determined based on eight factors. Sensitivity analysis was performed by applying discount rates (0, 3, and 5%) and by simulation of six situations based on different assumptions. At a discount rate of 3%, the costs per BAL for the two reusable scopes were 188.86€ (scope 1) and 185.94€ (scope 2), and the costs per PT for reusable scope 1, reusable scope 2, and single-use scopes were 1613.84€, 410.24€, and 204.49€, respectively. The cost per procedure for the reusable scopes depended on the number of procedures performed, maintenance costs, and decontamination costs. Healthcare professionals were more satisfied with the third-generation single-use Ambu® aScope™. The cost per procedure for the single-use scope was not superior to that for reusable scopes. The choice of single-use or reusable bronchoscopes in an ICU should consider the frequency of procedures and the number of bronchoscopes needed.

  14. Developing robust recurrence plot analysis techniques for investigating infant respiratory patterns.

    PubMed

    Terrill, Philip I; Wilson, Stephen; Suresh, Sadasivam; Cooper, David M

    2007-01-01

    Recurrence plot analysis is a useful non-linear analysis tool. There are still no well-formalised procedures for carrying out this analysis on measured physiological data, and systemising analysis is often difficult. In this paper, recurrence-based embedding is compared to radius-based embedding by studying a logistic attractor and measured breathing data collected from sleeping human infants. Recurrence-based embedding appears to be a more robust method of carrying out a recurrence analysis when attractor size is likely to differ between datasets. In the infant breathing data, the radius measure calculated at a fixed recurrence, scaled by the average respiratory period, allows accurate discrimination of active sleep from quiet sleep states (AUC=0.975, Sn=0.98, Sp=0.94).
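
    The infant respiratory analysis itself is not reproduced here. As a minimal sketch of the two embedding conventions contrasted above, the code below time-delay embeds a signal and then computes (a) the recurrence rate obtained at a fixed radius and (b) the radius required to reach a fixed recurrence rate, the latter being the quantity the authors scale by the average respiratory period. Embedding parameters and the synthetic signal are illustrative.

```python
import numpy as np
from scipy.spatial.distance import pdist

def delay_embed(x, dim=3, tau=5):
    """Time-delay embedding of a 1-D signal into dim-dimensional state vectors."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])

def recurrence_rate(emb, radius):
    """Fraction of state-vector pairs closer than `radius` (radius-based view)."""
    return float(np.mean(pdist(emb) < radius))

def radius_at_recurrence(emb, target_rr=0.05):
    """Radius that yields a fixed recurrence rate (recurrence-based view)."""
    d = np.sort(pdist(emb))
    return float(d[int(target_rr * (len(d) - 1))])

# Illustrative quasi-periodic "breathing-like" signal.
t = np.arange(3000) * 0.05
x = np.sin(2 * np.pi * 0.3 * t) + 0.1 * np.random.default_rng(5).normal(size=t.size)
emb = delay_embed(x)
print(round(recurrence_rate(emb, radius=0.5), 3))
print(round(radius_at_recurrence(emb, target_rr=0.05), 3))
```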

  15. An integrated portfolio optimisation procedure based on data envelopment analysis, artificial bee colony algorithm and genetic programming

    NASA Astrophysics Data System (ADS)

    Hsu, Chih-Ming

    2014-12-01

    Portfolio optimisation is an important issue in the field of investment/financial decision-making and has received considerable attention from both researchers and practitioners. However, besides portfolio optimisation, a complete investment procedure should also include the selection of profitable investment targets and determine the optimal timing for buying/selling the investment targets. In this study, an integrated procedure using data envelopment analysis (DEA), artificial bee colony (ABC) and genetic programming (GP) is proposed to resolve a portfolio optimisation problem. The proposed procedure is evaluated through a case study on investing in stocks in the semiconductor sub-section of the Taiwan stock market for 4 years. The potential average 6-month return on investment of 9.31% from 1 November 2007 to 31 October 2011 indicates that the proposed procedure can be considered a feasible and effective tool for making outstanding investment plans, and thus making profits in the Taiwan stock market. Moreover, it is a strategy that can help investors to make profits even when the overall stock market suffers a loss.

  16. Dangers resulting from DNA profiling of biological materials derived from patients after allogeneic hematopoietic stem cell transplantation (allo-HSCT) with regard to forensic genetic analysis.

    PubMed

    Jacewicz, R; Lewandowski, K; Rupa-Matysek, J; Jędrzejczyk, M; Berent, J

    The study documents the risk that comes with DNA analysis of materials derived from patients after allogeneic hematopoietic stem cell transplantation (allo-HSCT) in forensic genetics. DNA chimerism was studied in 30 patients after allo-HSCT, based on techniques applied in contemporary forensic genetics, i.e. real-time PCR and multiplex PCR-STR with the use of autosomal DNA as well as Y-DNA markers. The results revealed that the DNA profile of the recipient's blood was identical with the donor's in the majority of cases. Therefore, blood analysis can lead to false conclusions in personal identification as well as kinship analysis. An investigation of buccal swabs revealed a mixture of DNA in the majority of recipients. Consequently, personal identification on the basis of stain analysis of the same origin may be impossible. The safest (but not ideal) material turned out to be the hair root. Its analysis based on autosomal DNA revealed 100% of the recipient's profile. However, an analysis based on Y-chromosome markers performed in female allo-HSCT recipients with male donors demonstrated the presence of donor DNA in hair cells - similarly to the blood and buccal swabs. In the light of potential risks arising from DNA profiling of biological materials derived from persons after allotransplantation in judicial aspects, certain procedures were proposed to eliminate such dangers. The basic procedures include abandoning the approach based exclusively on blood collection, both for kinship analysis and personal identification; asking persons who are to be tested about their history of allo-HSCT before sample collection and profile entry in the DNA database, and verification of DNA profiling based on hair follicles in uncertain cases.

  17. The Influence of Study-Level Inference Models and Study Set Size on Coordinate-Based fMRI Meta-Analyses

    PubMed Central

    Bossier, Han; Seurinck, Ruth; Kühn, Simone; Banaschewski, Tobias; Barker, Gareth J.; Bokde, Arun L. W.; Martinot, Jean-Luc; Lemaitre, Herve; Paus, Tomáš; Millenet, Sabina; Moerkerke, Beatrijs

    2018-01-01

    Given the increasing amount of neuroimaging studies, there is a growing need to summarize published results. Coordinate-based meta-analyses use the locations of statistically significant local maxima with possibly the associated effect sizes to aggregate studies. In this paper, we investigate the influence of key characteristics of a coordinate-based meta-analysis on (1) the balance between false and true positives and (2) the activation reliability of the outcome from a coordinate-based meta-analysis. More particularly, we consider the influence of the chosen group level model at the study level [fixed effects, ordinary least squares (OLS), or mixed effects models], the type of coordinate-based meta-analysis [Activation Likelihood Estimation (ALE) that only uses peak locations, fixed effects, and random effects meta-analysis that take into account both peak location and height] and the amount of studies included in the analysis (from 10 to 35). To do this, we apply a resampling scheme on a large dataset (N = 1,400) to create a test condition and compare this with an independent evaluation condition. The test condition corresponds to subsampling participants into studies and combine these using meta-analyses. The evaluation condition corresponds to a high-powered group analysis. We observe the best performance when using mixed effects models in individual studies combined with a random effects meta-analysis. Moreover the performance increases with the number of studies included in the meta-analysis. When peak height is not taken into consideration, we show that the popular ALE procedure is a good alternative in terms of the balance between type I and II errors. However, it requires more studies compared to other procedures in terms of activation reliability. Finally, we discuss the differences, interpretations, and limitations of our results. PMID:29403344

  18. A Two-Step Approach to Analyze Satisfaction Data

    ERIC Educational Resources Information Center

    Ferrari, Pier Alda; Pagani, Laura; Fiorio, Carlo V.

    2011-01-01

    In this paper a two-step procedure based on Nonlinear Principal Component Analysis (NLPCA) and Multilevel models (MLM) for the analysis of satisfaction data is proposed. The basic hypothesis is that observed ordinal variables describe different aspects of a latent continuous variable, which depends on covariates connected with individual and…

  19. METHOD FOR THE DETERMINATION OF PERCHLORATE ANION IN PLANT AND SOLID MATRICES BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A standardized method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The procedure greatly reduced the ionic interferences in water extracts of plant materials. Ion chro...

  20. Primary Trait Analysis to Assess a Learner-Centered, Upper-Level Mathematics Course

    ERIC Educational Resources Information Center

    Alsardary, Salar; Pontiggia, Laura; Hamid, Mohammed; Blumberg, Phyllis

    2011-01-01

    This study presents a primary trait analysis of a learner-centered, discrete mathematics course based on student-to-student instruction. The authors developed a scoring rubric for the primary traits: conceptual knowledge, procedural knowledge, application of understanding, and mathematical communication skills. Eleven students took an exam…

  1. Applied Behavior Analysis: Current Myths in Public Education

    ERIC Educational Resources Information Center

    Fielding, Cheryl; Lowdermilk, John; Lanier, Lauren L.; Fannin, Abigail G.; Schkade, Jennifer L.; Rose, Chad A.; Simpson, Cynthia G.

    2013-01-01

    The effective use of behavior management strategies and related policies continues to be a debated issue in public education. Despite overwhelming evidence espousing the benefits of the implementation of procedures derived from principles based on the science of applied behavior analysis (ABA), educators often indicate many common misconceptions…

  2. A Noncentral "t" Regression Model for Meta-Analysis

    ERIC Educational Resources Information Center

    Camilli, Gregory; de la Torre, Jimmy; Chiu, Chia-Yi

    2010-01-01

    In this article, three multilevel models for meta-analysis are examined. Hedges and Olkin suggested that effect sizes follow a noncentral "t" distribution and proposed several approximate methods. Raudenbush and Bryk further refined this model; however, this procedure is based on a normal approximation. In the current research literature, this…

  3. A Preliminary Analysis of a Behavioral Classrooms Needs Assessment

    ERIC Educational Resources Information Center

    Leaf, Justin B.; Leaf, Ronald; McCray, Cynthia; Lamkins, Carol; Taubman, Mitchell; McEachin, John; Cihon, Joseph H.

    2016-01-01

    Today many special education classrooms implement procedures based upon the principles of Applied Behavior Analysis (ABA) to establish educationally relevant skills and decrease aberrant behaviors. However, it is difficult for school staff and consultants to evaluate the implementation of various components of ABA and general classroom set up. In…

  4. Meta-Analysis of Scale Reliability Using Latent Variable Modeling

    ERIC Educational Resources Information Center

    Raykov, Tenko; Marcoulides, George A.

    2013-01-01

    A latent variable modeling approach is outlined that can be used for meta-analysis of reliability coefficients of multicomponent measuring instruments. Important limitations of efforts to combine composite reliability findings across multiple studies are initially pointed out. A reliability synthesis procedure is discussed that is based on…

  5. DETERMINATION OF PERCHLORATE AT PARTS-PER-BILLION LEVELS IN PLANTS BY ION CHROMATOGRAPHY

    EPA Science Inventory

    A standardized method for the analysis of perchlorate in plants was developed, based on dry weight, and applied to the analysis of plant organs, foodstuffs, and plant products. The procedure greatly reduced the ionic interferences in water extracts of plant materials. The high ba...

  6. What Is Evidence-Based Behavior Analysis?

    PubMed Central

    Smith, Tristram

    2013-01-01

    Although applied behavior analysts often say they engage in evidence-based practice, they express differing views on what constitutes “evidence” and “practice.” This article describes a practice as a service offered by a provider to help solve a problem presented by a consumer. Solving most problems (e.g., increasing or decreasing a behavior and maintaining this change) requires multiple intervention procedures (i.e., a package). Single-subject studies are invaluable in investigating individual procedures, but researchers still need to integrate the procedures into a package. The package must be standardized enough for independent providers to replicate yet flexible enough to allow individualization; intervention manuals are the primary technology for achieving this balance. To test whether the package is effective in solving consumers' problems, researchers must evaluate outcomes of the package as a whole, usually in group studies such as randomized controlled trials. From this perspective, establishing an evidence-based practice involves more than analyzing the effects of discrete intervention procedures on behavior; it requires synthesizing information so as to offer thorough solutions to problems. Recognizing the need for synthesis offers behavior analysts many promising opportunities to build on their existing research to increase the quality and quantity of evidence-based practices. PMID:25729130

  7. Mechanism of degradation of 2'-deoxycytidine by formamide: implications for chemical DNA sequencing procedures.

    PubMed

    Saladino, R; Crestini, C; Mincione, E; Costanzo, G; Di Mauro, E; Negri, R

    1997-11-01

    We describe the reaction of formamide with 2'-deoxycytidine to give pyrimidine ring opening by nucleophilic addition on the electrophilic C(6) and C(4) positions. This information is confirmed by the analysis of the products of formamide attack on 2'-deoxycytidine, 5-methyl-2'-deoxycytidine, and 5-bromo-2'-deoxycytidine, residues when the latter are incorporated into oligonucleotides by DNA polymerase-driven polymerization and solid-phase phosphoramidite procedure. The increased sensitivity of 5-bromo-2'-deoxycytidine relative to that of 2'-deoxycytidine is pivotal for the improvement of the one-lane chemical DNA sequencing procedure based on the base-selective reaction of formamide with DNA. In many DNA sequencing cases it will in fact be possible to incorporate this base analogue into the DNA to be sequenced, thus providing a complete discrimination between its UV absorption signal and that of the thymidine residues. The wide spectrum of different sensitivities to formamide displayed by the 2'-deoxycytidine analogues solves, in the DNA single-lane chemical sequencing procedure, the possible source of errors due to low discrimination between C and T residues.

  8. Classifying the Indication for Colonoscopy Procedures: A Comparison of NLP Approaches in a Diverse National Healthcare System.

    PubMed

    Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L

    2015-01-01

    In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening, but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. Performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the rule-based system's performance stays fairly constant across training and testing sets, whereas the machine-learning model without feature selection showed a significant decrease in performance. Therefore, the rule-based classification system appears to be more robust than a machine-learning system when no feature selection is performed.
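
    The VA system's actual rules and machine-learning features are not available here; the fragment below only illustrates the flavour of a rule-based indication classifier, i.e. ordered keyword/regular-expression rules applied to the indication text of a colonoscopy report. The patterns, labels and rule ordering are invented for illustration.

```python
import re

# Ordered, first-match-wins rules (illustrative patterns only).
RULES = [
    ("diagnostic",   re.compile(r"\b(anemia|hematochezia|melena|abdominal pain|diarrhea)\b", re.I)),
    ("surveillance", re.compile(r"\b(personal history of (polyps|adenoma|colon cancer)|surveillance)\b", re.I)),
    ("screening",    re.compile(r"\b(screen(ing)?|average risk|family history of colon cancer)\b", re.I)),
]

def classify_indication(indication_text):
    """Return the first matching indication label, or 'unknown'."""
    for label, pattern in RULES:
        if pattern.search(indication_text):
            return label
    return "unknown"

print(classify_indication("Indication: average risk screening, no symptoms."))   # screening
print(classify_indication("Indication: iron deficiency anemia."))                # diagnostic
```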

  9. Iterative refinement of structure-based sequence alignments by Seed Extension

    PubMed Central

    Kim, Changhoon; Tai, Chin-Hsien; Lee, Byungkook

    2009-01-01

    Background Accurate sequence alignment is required in many bioinformatics applications but, when sequence similarity is low, it is difficult to obtain accurate alignments based on sequence similarity alone. The accuracy improves when the structures are available, but current structure-based sequence alignment procedures still mis-align substantial numbers of residues. In order to correct such errors, we previously explored the possibility of replacing the residue-based dynamic programming algorithm in structure alignment procedures with the Seed Extension algorithm, which does not use a gap penalty. Here, we describe a new procedure called RSE (Refinement with Seed Extension) that iteratively refines a structure-based sequence alignment. Results RSE uses SE (Seed Extension) in its core, which is an algorithm that we reported recently for obtaining a sequence alignment from two superimposed structures. The RSE procedure was evaluated by comparing the correctly aligned fractions of residues before and after the refinement of the structure-based sequence alignments produced by popular programs. CE, DaliLite, FAST, LOCK2, MATRAS, MATT, TM-align, SHEBA and VAST were included in this analysis and the NCBI's CDD root node set was used as the reference alignments. RSE improved the average accuracy of sequence alignments for all programs tested when no shift error was allowed. The amount of improvement varied depending on the program. The average improvements were small for DaliLite and MATRAS but about 5% for CE and VAST. More substantial improvements have been seen in many individual cases. The additional computation times required for the refinements were negligible compared to the times taken by the structure alignment programs. Conclusion RSE is a computationally inexpensive way of improving the accuracy of a structure-based sequence alignment. It can be used as a standalone procedure following a regular structure-based sequence alignment or to replace the traditional iterative refinement procedures based on residue-level dynamic programming algorithm in many structure alignment programs. PMID:19589133

  10. Recommendations for a Standardized Program Management Office (PMO) Time Compliance Network Order (TCNO) Patching Process

    DTIC Science & Technology

    2007-03-01

    Report fragment (the abstract is truncated in the source record): the thesis notes that the interview process and the resulting data analysis may be affected by researcher bias, since both were conducted by the same individual. The remaining text consists of interview-guide excerpts for MAJCOM and base-level NCC contacts responsible for general TCNO procedures (interviewee information and job-description questions).

  11. Enhanced Multiobjective Optimization Technique for Comprehensive Aerospace Design. Part A

    NASA Technical Reports Server (NTRS)

    Chattopadhyay, Aditi; Rajadas, John N.

    1997-01-01

    A multidisciplinary design optimization procedure has been developed which couples formal multiobjective techniques with complex analysis procedures, such as computational fluid dynamics (CFD) codes. The procedure has been demonstrated on a specific high speed flow application involving aerodynamics and acoustics (sonic boom minimization). In order to account for multiple design objectives arising from complex performance requirements, multiobjective formulation techniques are used to formulate the optimization problem. Techniques to enhance the existing Kreisselmeier-Steinhauser (K-S) function multiobjective formulation approach have been developed. The K-S function procedure used in the proposed work transforms a constrained multiple objective function problem into an unconstrained problem, which is then solved using the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm. Weight factors are introduced for each objective function during the transformation process. This enhanced procedure will provide the designer with the capability to emphasize specific design objectives during the optimization process. The demonstration of the procedure utilizes a computational fluid dynamics (CFD) code which solves the three-dimensional parabolized Navier-Stokes (PNS) equations for the flow field, along with an appropriate sonic boom evaluation procedure, thus introducing both aerodynamic performance and sonic boom as design objectives to be optimized simultaneously. Sensitivity analysis is performed using a discrete differentiation approach. An approximation technique has been used within the optimizer to improve the overall computational efficiency of the procedure in order to make it suitable for design applications in an industrial setting.
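
    The CFD-coupled optimization itself is out of scope here; the snippet below illustrates only the Kreisselmeier-Steinhauser aggregation step described above, i.e. collapsing several weighted objective values into one smooth envelope function that a gradient-based optimizer such as BFGS can minimize. The two analytic objectives stand in for the aerodynamic and sonic-boom metrics, and SciPy's BFGS is used in place of the authors' implementation.

```python
import numpy as np
from scipy.optimize import minimize

def ks_aggregate(values, rho=50.0):
    """Kreisselmeier-Steinhauser envelope of a set of function values.

    A smooth, conservative approximation of max(values); larger rho tracks the
    maximum more closely. Shifting by the max keeps the exponentials well scaled.
    """
    values = np.asarray(values, dtype=float)
    vmax = values.max()
    return vmax + np.log(np.sum(np.exp(rho * (values - vmax)))) / rho

def combined_objective(x, weights=(1.0, 1.0)):
    """Placeholder two-objective problem (stand-ins for drag and sonic-boom metrics)."""
    f1 = (x[0] - 1.0) ** 2 + x[1] ** 2          # "aerodynamic" objective
    f2 = x[0] ** 2 + (x[1] - 2.0) ** 2          # "acoustic" objective
    return ks_aggregate([weights[0] * f1, weights[1] * f2])

result = minimize(combined_objective, x0=np.array([5.0, 5.0]), method="BFGS")
print(result.x.round(3))   # a compromise between the two individual minima
```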

  12. [Guideline development for rehabilitation of breast cancer patients - phase 2: findings from the classification of therapeutic procedures, KTL-data-analysis].

    PubMed

    Domann, U; Brüggemann, S; Klosterhuis, H; Weis, J

    2007-08-01

    The aim of this project is the development of an evidence-based guideline for the rehabilitation of breast cancer patients, funded by the German Pension Insurance scheme. The project consists of four phases. This paper is focused on the 2nd phase, i.e., the analysis of procedures in rehabilitation based on evidence-based therapeutic modules. As a result of a systematic literature review, 14 therapeutic modules were defined. From a total of 840 possible KTL codes (Klassifikation Therapeutischer Leistungen, Classification of Therapeutic Procedures), 229 could be assigned to these modules. These analyses are based on 24,685 patients in 57 rehabilitation clinics who had been treated in 2003. For these modules, the number of patients having received those interventions as well as the duration of the modules were calculated. The data were analysed with respect to the influence of age and comorbidity. Moreover, differences between rehabilitation clinics were investigated according to the category of interventions. Our findings show great variability in the use of the therapeutic modules. Therapeutic modules like Physiotherapy (91.6%), Training Therapy (85.2%) and Information (97.8%) are provided to most of the patients. Younger patients receive more treatments than older patients, and patients with higher comorbidity receive more Physiotherapy, Lymphoedema Therapy and Psychological Interventions than patients without comorbidities. Data analysis shows wide interindividual variability with regard to the therapeutic modules. This variability is related to age and comorbidity of the patients. Furthermore, great differences were found between the rehabilitation clinics concerning the use of the various interventions. This variability supports the necessity of developing and implementing an evidence-based guideline for the rehabilitation of breast cancer patients. The next step will be to discuss these findings with experts from science and clinical practice.

  13. 3D force/torque characterization of emergency cricothyroidotomy procedure using an instrumented scalpel.

    PubMed

    Ryason, Adam; Sankaranarayanan, Ganesh; Butler, Kathryn L; DeMoya, Marc; De, Suvranu

    2016-08-01

    Emergency Cricothyroidotomy (CCT) is a surgical procedure performed to secure a patient's airway. This high-stakes, but seldom-performed procedure is an ideal candidate for a virtual reality simulator to enhance physician training. For the first time, this study characterizes the force/torque characteristics of the cricothyroidotomy procedure, to guide development of a virtual reality CCT simulator for use in medical training. We analyze the upper force and torque thresholds experienced at the human-scalpel interface. We then group individual surgical cuts based on style of cut and cut medium and perform a regression analysis to create two models that allow us to predict the style of cut performed and the cut medium.

  14. Examining Residents' Strategic Mindfulness During Self-Regulated Learning of a Simulated Procedural Skill.

    PubMed

    Brydges, Ryan; Hatala, Rose; Mylopoulos, Maria

    2016-07-01

    Simulation-based training is currently embedded in most health professions education curricula. Without evidence for how trainees think about their simulation-based learning, some training techniques may not support trainees' learning strategies. This study explored how residents think about and self-regulate learning during a lumbar puncture (LP) training session using a simulator. In 2010, 20 of 45 postgraduate year 1 internal medicine residents attended a mandatory procedural skills training boot camp. Independently, residents practiced the entire LP skill on a part-task trainer using a clinical LP tray and proper sterile technique. We interviewed participants regarding how they thought about and monitored their learning processes, and then we conducted a thematic analysis of the interview data. The analysis suggested that participants considered what they could and could not learn from the simulator; they developed their self-confidence by familiarizing themselves with the LP equipment and repeating the LP algorithmic steps. Participants articulated an idiosyncratic model of learning they used to interpret the challenges and successes they experienced. Participants reported focusing on obtaining cerebrospinal fluid and memorizing the "routine" version of the LP procedure. They did not report much thinking about their learning strategies (eg, self-questioning). During simulation-based training, residents described assigning greater weight to achieving procedural outcomes and tended to think that the simulated task provided them with routine, generalizable skills. Over this typical 1-hour session, trainees did not appear to consider their strategic mindfulness (ie, awareness and use of learning strategies).

  15. An unsupervised classification approach for analysis of Landsat data to monitor land reclamation in Belmont county, Ohio

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O.; Bloemer, H. H. L.; Campbell, W. J.

    1981-01-01

    Two unsupervised classification procedures for analyzing Landsat data used to monitor land reclamation in a surface mining area in east central Ohio are compared for agreement with data collected from the corresponding locations on the ground. One procedure is based on a traditional unsupervised-clustering/maximum-likelihood algorithm sequence that assumes spectral groupings in the Landsat data in n-dimensional space; the other is based on a nontraditional unsupervised-clustering/canonical-transformation/clustering algorithm sequence that not only assumes spectral groupings in n-dimensional space but also includes an additional feature-extraction technique. It is found that the nontraditional procedure provides an appreciable improvement in spectral groupings and apparently increases the level of accuracy in the classification of land cover categories.

  16. Empirical likelihood based detection procedure for change point in mean residual life functions under random censorship.

    PubMed

    Chen, Ying-Ju; Ning, Wei; Gupta, Arjun K

    2016-05-01

    The mean residual life (MRL) function is one of the basic parameters of interest in survival analysis; it describes the expected remaining lifetime of an individual after a certain age. The study of changes in the MRL function is practical and interesting because it may help us to identify factors, such as age and gender, that may influence the remaining lifetimes of patients after a certain surgery. In this paper, we propose a detection procedure based on the empirical likelihood for changes in MRL functions with right-censored data. Two real examples, the Veterans' Administration lung cancer study and the Stanford heart transplant data, are given to illustrate the detection procedure. Copyright © 2016 John Wiley & Sons, Ltd.
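
    For reference, the mean residual life function on which the change point is defined is

    \[
    m(t) \;=\; E\,[\,T - t \mid T > t\,] \;=\; \frac{\int_t^{\infty} S(u)\,du}{S(t)},
    \]

    where \(S(t)\) is the survival function of the lifetime \(T\). A single change-point formulation (stated here only schematically, not as the authors' exact model) posits \(m(t) = m_1(t)\) for \(t \le \tau\) and \(m(t) = m_2(t)\) for \(t > \tau\), and the empirical-likelihood procedure tests for the existence and location of such a \(\tau\) under right censoring.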

  17. A Unique Procedure to Identify Cell Surface Markers Through a Spherical Self-Organizing Map Applied to DNA Microarray Analysis.

    PubMed

    Sugii, Yuh; Kasai, Tomonari; Ikeda, Masashi; Vaidyanath, Arun; Kumon, Kazuki; Mizutani, Akifumi; Seno, Akimasa; Tokutaka, Heizo; Kudoh, Takayuki; Seno, Masaharu

    2016-01-01

    To identify cell-specific markers, we designed a DNA microarray platform with oligonucleotide probes for human membrane-anchored proteins. Human glioma cell lines were analyzed using microarray and compared with normal and fetal brain tissues. For the microarray analysis, we employed a spherical self-organizing map, which is a clustering method suitable for the conversion of multidimensional data into two-dimensional data and displays the relationship on a spherical surface. Based on the gene expression profile, the cell surface characteristics were successfully mirrored onto the spherical surface, thereby distinguishing normal brain tissue from the disease model based on the strength of gene expression. The clustered glioma-specific genes were further analyzed by polymerase chain reaction procedure and immunocytochemical staining of glioma cells. Our platform and the following procedure were successfully demonstrated to categorize the genes coding for cell surface proteins that are specific to glioma cells. Our assessment demonstrates that a spherical self-organizing map is a valuable tool for distinguishing cell surface markers and can be employed in marker discovery studies for the treatment of cancer.

  18. NASA DOE POD NDE Capabilities Data Book

    NASA Technical Reports Server (NTRS)

    Generazio, Edward R.

    2015-01-01

    This data book contains the Directed Design of Experiments for Validating Probability of Detection (POD) Capability of NDE Systems (DOEPOD) analyses of the nondestructive inspection data presented in the NTIAC, Nondestructive Evaluation (NDE) Capabilities Data Book, 3rd ed., NTIAC DB-97-02. DOEPOD is designed as a decision support system to validate inspection system, personnel, and protocol demonstrating 0.90 POD with 95% confidence at critical flaw sizes, a90/95. The test methodology used in DOEPOD is based on the field of statistical sequential analysis founded by Abraham Wald. Sequential analysis is a method of statistical inference whose characteristic feature is that the number of observations required by the procedure is not determined in advance of the experiment. The decision to terminate the experiment depends, at each stage, on the results of the observations previously made. A merit of the sequential method, as applied to testing statistical hypotheses, is that test procedures can be constructed which require, on average, a substantially smaller number of observations than equally reliable test procedures based on a predetermined number of observations.
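
    The sequential-analysis idea credited to Wald can be illustrated with a generic sequential probability ratio test (SPRT) on hit/miss inspection data, as sketched below. The hypotheses, error rates, and decision thresholds are illustrative assumptions, not the DOEPOD protocol or its a90/95 criteria.

```python
# Generic Wald sequential probability ratio test (SPRT) for a detection
# probability -- an illustration of the sequential-analysis idea, not DOEPOD.
# p0, p1, alpha, and beta are illustrative choices.
import math

def sprt(hits, p0=0.90, p1=0.70, alpha=0.05, beta=0.05):
    """hits: iterable of 1 (flaw detected) / 0 (missed), in inspection order."""
    upper = math.log((1 - beta) / alpha)   # cross above -> conclude POD below target
    lower = math.log(beta / (1 - alpha))   # cross below -> conclude POD meets target
    llr = 0.0
    for n, x in enumerate(hits, start=1):
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return n, "stop: POD below target"
        if llr <= lower:
            return n, "stop: POD meets 0.90 target"
    return len(hits), "continue testing"

print(sprt([1] * 29))   # 29 consecutive detections stop the test early
```

    The number of observations is not fixed in advance: the test stops as soon as the cumulative log-likelihood ratio crosses either boundary, which is the property the abstract highlights.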

  19. 13C-based metabolic flux analysis: fundamentals and practice.

    PubMed

    Yang, Tae Hoon

    2013-01-01

    Isotope-based metabolic flux analysis is one of the emerging technologies applied to system level metabolic phenotype characterization in metabolic engineering. Among the developed approaches, (13)C-based metabolic flux analysis has been established as a standard tool and has been widely applied to quantitative pathway characterization of diverse biological systems. To implement (13)C-based metabolic flux analysis in practice, comprehending the underlying mathematical and computational modeling fundamentals is of importance along with carefully conducted experiments and analytical measurements. Such knowledge is also crucial when designing (13)C-labeling experiments and properly acquiring key data sets essential for in vivo flux analysis implementation. In this regard, the modeling fundamentals of (13)C-labeling systems and analytical data processing are the main topics we will deal with in this chapter. Along with this, the relevant numerical optimization techniques are addressed to help implementation of the entire computational procedures aiming at (13)C-based metabolic flux analysis in vivo.

  20. Bio-Oil Analysis Laboratory Procedures | Bioenergy | NREL

    Science.gov Websites

    Bio-Oil Analysis Laboratory Procedures. NREL develops standard procedures that have been validated and allow for reliable bio-oil analysis, including determination of different hydroxyl groups (-OH) in pyrolysis bio-oil: aliphatic-OH, phenolic-OH, and carboxylic-OH.

  1. Global aesthetic surgery statistics: a closer look.

    PubMed

    Heidekrueger, Paul I; Juran, S; Ehrl, D; Aung, T; Tanna, N; Broer, P Niclas

    2017-08-01

    Obtaining quality global statistics about surgical procedures remains an important yet challenging task. The International Society of Aesthetic Plastic Surgery (ISAPS) reports the total number of surgical and non-surgical procedures performed worldwide on a yearly basis. While providing valuable insight, ISAPS' statistics leave two important factors unaccounted for: (1) the underlying base population, and (2) the number of surgeons performing the procedures. Statistics of the published ISAPS' 'International Survey on Aesthetic/Cosmetic Surgery' were analysed by country, taking into account the underlying national base population according to the official United Nations population estimates. Further, the number of surgeons per country was used to calculate the number of surgeries performed per surgeon. In 2014, based on ISAPS statistics, national surgical procedures ranked in the following order: 1st USA, 2nd Brazil, 3rd South Korea, 4th Mexico, 5th Japan, 6th Germany, 7th Colombia, and 8th France. When considering the size of the underlying national populations, the demand for surgical procedures per 100,000 people changes the overall ranking substantially. It was also found that the rate of surgical procedures per surgeon shows great variation between the responding countries. While the US and Brazil are often quoted as the countries with the highest demand for plastic surgery, according to the presented analysis, other countries surpass these countries in surgical procedures per capita. While data acquisition and quality should be improved in the future, valuable insight regarding the demand for surgical procedures can be gained by taking specific demographic and geographic factors into consideration.
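
    A minimal sketch of the normalization the authors describe, expressing procedure counts per 100,000 population and per surgeon, is shown below; the country figures are placeholders, not ISAPS data.

```python
# Minimal sketch of per-capita and per-surgeon normalization of raw procedure
# counts. All figures below are placeholders, not ISAPS statistics.
countries = {
    # name: (total surgical procedures, population, plastic surgeons)
    "Country A": (1_500_000, 320_000_000, 6_500),
    "Country B": (1_200_000, 200_000_000, 5_500),
    "Country C": (650_000, 50_000_000, 2_300),
}

for name, (procedures, population, surgeons) in countries.items():
    per_100k = procedures / population * 100_000
    per_surgeon = procedures / surgeons
    print(f"{name}: {per_100k:.1f} per 100,000 people, {per_surgeon:.0f} per surgeon")

# Ranking by per-capita demand can differ substantially from ranking by raw counts.
```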

  2. What the Tweets Say: A Critical Analysis of Twitter Research in Language Learning from 2009 to 2016

    ERIC Educational Resources Information Center

    Hattem, David; Lomicka, Lara

    2016-01-01

    This study presents an overview and critical analysis of the literature related to Twitter and language learning published from 2009 to 2016. Seventeen studies were selected for inclusion based on a four-phase identification procedure, which helped us to identify published studies that resulted in a content analysis of themes in the articles…

  3. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations.

    PubMed

    Zhang, Han; Wheeler, William; Hyland, Paula L; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-06-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs.
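
    The adaptive rank truncated product idea can be sketched as follows: combine the k smallest p-values for several truncation points and calibrate both the per-k statistics and their minimum by Monte Carlo replicates. This simplified sketch assumes independent null p-values purely for illustration; it omits sARTP's central feature of working from summary statistics with an external genotype-correlation panel.

```python
# Simplified sketch of the adaptive rank truncated product (ARTP) idea.
# Independent null p-values are assumed purely for illustration; sARTP's
# summary-statistic machinery and correlation panel are NOT reproduced here.
import numpy as np

def rtp_stats(pvals, ks):
    """Negative log-product of the k smallest p-values, for each truncation k."""
    s = np.sort(pvals)
    return np.array([-np.log(s[:k]).sum() for k in ks])

def artp_pvalue(pvals, ks=(1, 5, 10, 25), n_null=2000, seed=0):
    rng = np.random.default_rng(seed)
    obs = rtp_stats(pvals, ks)
    null = np.array([rtp_stats(rng.uniform(size=len(pvals)), ks)
                     for _ in range(n_null)])
    # Per-k Monte Carlo p-values for the observed data and for each null replicate...
    per_k_obs = (null >= obs).mean(axis=0)
    per_k_null = (null[:, None, :] >= null[None, :, :]).mean(axis=0)
    # ...then calibrate the minimum over k (the "adaptive" step) against its own null.
    return (per_k_null.min(axis=1) <= per_k_obs.min()).mean()

print(artp_pvalue(np.random.default_rng(1).uniform(size=100)))
```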

  4. A Powerful Procedure for Pathway-Based Meta-analysis Using Summary Statistics Identifies 43 Pathways Associated with Type II Diabetes in European Populations

    PubMed Central

    Zhang, Han; Wheeler, William; Hyland, Paula L.; Yang, Yifan; Shi, Jianxin; Chatterjee, Nilanjan; Yu, Kai

    2016-01-01

    Meta-analysis of multiple genome-wide association studies (GWAS) has become an effective approach for detecting single nucleotide polymorphism (SNP) associations with complex traits. However, it is difficult to integrate the readily accessible SNP-level summary statistics from a meta-analysis into more powerful multi-marker testing procedures, which generally require individual-level genetic data. We developed a general procedure called Summary based Adaptive Rank Truncated Product (sARTP) for conducting gene and pathway meta-analysis that uses only SNP-level summary statistics in combination with genotype correlation estimated from a panel of individual-level genetic data. We demonstrated the validity and power advantage of sARTP through empirical and simulated data. We conducted a comprehensive pathway-based meta-analysis with sARTP on type 2 diabetes (T2D) by integrating SNP-level summary statistics from two large studies consisting of 19,809 T2D cases and 111,181 controls with European ancestry. Among 4,713 candidate pathways from which genes in neighborhoods of 170 GWAS established T2D loci were excluded, we detected 43 T2D globally significant pathways (with Bonferroni corrected p-values < 0.05), which included the insulin signaling pathway and T2D pathway defined by KEGG, as well as the pathways defined according to specific gene expression patterns on pancreatic adenocarcinoma, hepatocellular carcinoma, and bladder carcinoma. Using summary data from 8 eastern Asian T2D GWAS with 6,952 cases and 11,865 controls, we showed 7 out of the 43 pathways identified in European populations remained to be significant in eastern Asians at the false discovery rate of 0.1. We created an R package and a web-based tool for sARTP with the capability to analyze pathways with thousands of genes and tens of thousands of SNPs. PMID:27362418

  5. Aerodynamic design optimization using sensitivity analysis and computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.

    1991-01-01

    A new and efficient method is presented for aerodynamic design optimization, which is based on a computational fluid dynamics (CFD)-sensitivity analysis algorithm. The method is applied to design a scramjet-afterbody configuration for an optimized axial thrust. The Euler equations are solved for the inviscid analysis of the flow, which in turn provides the objective function and the constraints. The CFD analysis is then coupled with the optimization procedure that uses a constrained minimization method. The sensitivity coefficients, i.e. gradients of the objective function and the constraints, needed for the optimization are obtained using a quasi-analytical method rather than the traditional brute-force method of finite difference approximations. During the one-dimensional search of the optimization procedure, an approximate flow analysis (predicted flow) based on a first-order Taylor series expansion is used to reduce the computational cost. Finally, the sensitivity of the optimum objective function to various design parameters, which are kept constant during the optimization, is computed to predict new optimum solutions. The flow analysis of the demonstrative example is compared with the experimental data. It is shown that the method is more efficient than the traditional methods.
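
    The general pattern, gradients ("sensitivity coefficients") driving the search direction while a first-order Taylor expansion stands in for the expensive analysis during the one-dimensional step search, can be sketched on a toy constrained problem. The objective and constraint below are illustrative, not the scramjet-afterbody model.

```python
# Generic sketch: sensitivity-driven descent with a cheap first-order Taylor
# model used in place of the expensive analysis during the 1-D step search.
# Toy objective/constraint only; not the paper's CFD problem.
import numpy as np

def f(x):       return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2    # "expensive" analysis
def grad_f(x):  return np.array([2 * (x[0] - 3.0), 2 * (x[1] + 1.0)])
def c(x):       return x[0] + x[1] - 2.0                        # constraint c(x) <= 0
grad_c = np.array([1.0, 1.0])

x = np.array([0.0, 0.0])
for _ in range(50):
    d = -grad_f(x)                        # steepest-descent direction from sensitivities
    if np.linalg.norm(d) < 1e-8:
        break
    # Step-size search on the cheap Taylor models f(x) + a*g.d and c(x) + a*gc.d
    # instead of re-running the full analysis for every trial step.
    a = 0.1
    while c(x) + a * grad_c @ d > 0.0 and a > 1e-6:
        a *= 0.5                          # shrink step until the predicted constraint holds
    x = x + a * d

print(x, f(x), c(x))                      # converges toward the constrained optimum
```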

  6. Training at a faith-based institution matters for obstetrics and gynecology residents: results from a regional survey.

    PubMed

    Guiahi, Maryam; Westhoff, Carolyn L; Summers, Sondra; Kenton, Kimberly

    2013-06-01

    Prior data suggest that opportunities in family planning training may be limited during obstetrics and gynecology (Ob-Gyn) residency training, particularly at faith-based institutions with moral and ethical constraints, although this aspect of the Ob-Gyn curriculum has not been formally studied to date. We compared the self-rated competency and intentions to provide family planning procedures of Ob-Gyn residents at faith-based programs with those of residents at non-faith-based programs. We surveyed residents at all 20 Ob-Gyn programs in Illinois, Indiana, Iowa, and Wisconsin from 2008 to 2009. Residents were queried about current skills and future plans to perform family planning procedures. We examined associations based on program and residents' personal characteristics and performed multivariable logistic regression analysis. A total of 232 of 340 residents (68%) from 17 programs (85%) returned surveys. Seven programs were faith-based. Residents from non-faith-based programs were more likely to be completely satisfied with family planning training (odds ratio [OR] = 3.4; 95% confidence interval [CI], 1.9-6.2) and to report they "understand and can perform on own" most procedures. Most residents, regardless of program type, planned to provide all surveyed family planning services. Despite similar intentions to provide family planning procedures after graduation, residents at faith-based training programs were less satisfied with their family planning training and rated their ability to perform family planning services lower than residents at non-faith-based training programs.

  7. The role of simulation in the design of a neural network chip

    NASA Technical Reports Server (NTRS)

    Desai, Utpal; Roppel, Thaddeus A.; Padgett, Mary L.

    1993-01-01

    An iterative, simulation-based design procedure for a neural network chip is introduced. For this design procedure, the goal is to produce a chip layout for a neural network in which the weights are determined by transistor gate width-to-length ratios. In a given iteration, the current layout is simulated using the circuit simulator SPICE, and layout adjustments are made based on conventional gradient-descent methods. After the iteration converges, the chip is fabricated. Monte Carlo analysis is used to predict the effect of statistical fabrication process variations on the overall performance of the neural network chip.

  8. An algorithm to diagnose ball bearing faults in servomotors running arbitrary motion profiles

    NASA Astrophysics Data System (ADS)

    Cocconcelli, Marco; Bassi, Luca; Secchi, Cristian; Fantuzzi, Cesare; Rubini, Riccardo

    2012-02-01

    This paper describes a procedure to extend the scope of classical methods to detect ball bearing faults (based on envelope analysis and fault frequencies identification) beyond their usual area of application. The objective of this procedure is to allow condition-based monitoring of such bearings in servomotor applications, where typically the motor in its normal mode of operation has to follow a non-constant angular velocity profile that may contain motion inversions. After describing and analyzing the algorithm from a theoretical point of view, experimental results obtained on a real industrial application are presented and commented.
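
    For reference, the classical constant-speed envelope-analysis baseline that the paper extends can be sketched as follows; the synthetic signal, fault frequency, and resonance are invented, and the paper's handling of arbitrary velocity profiles and motion inversions is not reproduced.

```python
# Minimal sketch of classical envelope analysis for bearing-fault detection
# (constant-speed case only). The signal and frequencies below are synthetic.
import numpy as np
from scipy.signal import hilbert

rng = np.random.default_rng(0)
fs = 20_000                                   # sampling rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
fault_freq, resonance = 87.0, 3_000.0         # hypothetical fault rate and resonance
bursts = (np.sin(2 * np.pi * fault_freq * t) > 0.9).astype(float)
signal = bursts * np.sin(2 * np.pi * resonance * t) + 0.1 * rng.standard_normal(t.size)

envelope = np.abs(hilbert(signal))            # amplitude demodulation
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print("dominant envelope frequency:", freqs[spectrum.argmax()], "Hz")  # ~ fault rate
```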

  9. Markov models of genome segmentation

    NASA Astrophysics Data System (ADS)

    Thakur, Vivek; Azad, Rajeev K.; Ramaswamy, Ram

    2007-01-01

    We introduce Markov models for segmentation of symbolic sequences, extending a segmentation procedure based on the Jensen-Shannon divergence that has been introduced earlier. Higher-order Markov models are more sensitive to the details of local patterns and in application to genome analysis, this makes it possible to segment a sequence at positions that are biologically meaningful. We show the advantage of higher-order Markov-model-based segmentation procedures in detecting compositional inhomogeneity in chimeric DNA sequences constructed from genomes of diverse species, and in application to the E. coli K12 genome, boundaries of genomic islands, cryptic prophages, and horizontally acquired regions are accurately identified.
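
    A minimal order-0 version of divergence-based segmentation picks the split point that maximizes the Jensen-Shannon divergence between the nucleotide compositions of the two resulting segments; the higher-order Markov extension described above would condition these counts on preceding symbols. The toy sequence below is invented.

```python
# Order-0 sketch of Jensen-Shannon segmentation: choose the split that maximizes
# the divergence between left and right nucleotide compositions.
import numpy as np

def entropy(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def js_split_score(seq, i, alphabet="ACGT"):
    left = np.array([seq[:i].count(a) for a in alphabet], float)
    right = np.array([seq[i:].count(a) for a in alphabet], float)
    w_l, w_r = i / len(seq), 1 - i / len(seq)
    return entropy(left + right) - w_l * entropy(left) - w_r * entropy(right)

seq = "AT" * 500 + "GC" * 500                      # toy chimeric sequence
best = max(range(50, len(seq) - 50), key=lambda i: js_split_score(seq, i))
print("best split at", best)                       # ~1000, the true composition boundary
```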

  10. Improved Variable Selection Algorithm Using a LASSO-Type Penalty, with an Application to Assessing Hepatitis B Infection Relevant Factors in Community Residents

    PubMed Central

    Guo, Pi; Zeng, Fangfang; Hu, Xiaomin; Zhang, Dingmei; Zhu, Shuming; Deng, Yu; Hao, Yuantao

    2015-01-01

    Objectives: In epidemiological studies, it is important to identify independent associations between collective exposures and a health outcome. The current stepwise selection technique ignores stochastic errors and suffers from a lack of stability. The alternative LASSO-penalized regression model can be applied to detect significant predictors from a pool of candidate variables. However, this technique is prone to false positives and tends to create excessive biases. It remains challenging to develop robust variable selection methods and enhance predictability. Material and methods: Two improved algorithms denoted the two-stage hybrid and bootstrap ranking procedures, both using a LASSO-type penalty, were developed for epidemiological association analysis. The performance of the proposed procedures and other methods including conventional LASSO, Bolasso, stepwise and stability selection models were evaluated using intensive simulation. In addition, methods were compared by using an empirical analysis based on large-scale survey data of hepatitis B infection-relevant factors among Guangdong residents. Results: The proposed procedures produced comparable or less biased selection results when compared to conventional variable selection models. In total, the two newly proposed procedures were stable with respect to various scenarios of simulation, demonstrating a higher power and a lower false positive rate during variable selection than the compared methods. In empirical analysis, the proposed procedures yielding a sparse set of hepatitis B infection-relevant factors gave the best predictive performance and showed that the procedures were able to select a more stringent set of factors. The individual history of hepatitis B vaccination, family and individual history of hepatitis B infection were associated with hepatitis B infection in the studied residents according to the proposed procedures. Conclusions: The newly proposed procedures improve the identification of significant variables and enable us to derive a new insight into epidemiological association analysis. PMID:26214802
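
    A sketch of the bootstrap-ranking idea with a LASSO-type penalty is shown below: refit an L1-penalized model on bootstrap resamples and rank predictors by how often they receive a nonzero coefficient. This illustrates the general approach only, on synthetic data; it is not the authors' exact two-stage hybrid or bootstrap ranking algorithm.

```python
# Sketch of bootstrap selection-frequency ranking with an L1 (LASSO-type) penalty.
# Synthetic data; not the authors' exact algorithm or the Guangdong survey data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n, p = 400, 20
X = rng.standard_normal((n, p))
logit = 1.2 * X[:, 0] - 0.9 * X[:, 1] + 0.7 * X[:, 2]        # 3 true risk factors
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

n_boot, freq = 200, np.zeros(p)
for _ in range(n_boot):
    idx = rng.integers(0, n, n)                               # bootstrap resample
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.3)
    model.fit(X[idx], y[idx])
    freq += (model.coef_.ravel() != 0)

ranking = np.argsort(-freq / n_boot)                          # most stably selected first
print(list(ranking[:5]), (freq / n_boot)[ranking[:5]])
```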

  11. Analysis of large space structures assembly: Man/machine assembly analysis

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Procedures for analyzing large space structures assembly via three primary modes: manual, remote and automated are outlined. Data bases on each of the assembly modes and a general data base on the shuttle capabilities to support structures assembly are presented. Task element times and structure assembly component costs are given to provide a basis for determining the comparative economics of assembly alternatives. The lessons learned from simulations of space structures assembly are detailed.

  12. Descriptive analysis and comparison of strategic incremental rehearsal to "Business as Usual" sight-word instruction for an adult nonreader with intellectual disability.

    PubMed

    Richman, David M; Grubb, Laura; Thompson, Samuel

    2018-01-01

    Strategic Incremental Rehearsal (SIR) is an effective method for teaching sight-word acquisition, but has neither been evaluated for use in adults with an intellectual disability, nor directly compared to the ongoing instruction in the natural environment. Experimental analysis of sight word acquisition via an alternating treatment design was conducted with a 23-year-old woman with Down syndrome. SIR was compared to the current reading instruction (CRI) in a classroom for young adults with intellectual disabilities. CRI procedures included non-contingent praise, receptive touch prompts ("touch the word bat"), echoic prompts ("say bat"), textual prompts ("read the word"), and pre-determined introduction of new words. SIR procedures included textual prompts on flash cards, contingent praise, corrective feedback, and mastery-based introduction of new words. The results indicated that SIR was associated with more rapid acquisition of sight words than CRI. Directions for future research could include systematic comparisons to other procedures, and evaluations of procedural permutations of SIR.

  13. Combining principles of Cognitive Load Theory and diagnostic error analysis for designing job aids: Effects on motivation and diagnostic performance in a process control task.

    PubMed

    Kluge, Annette; Grauel, Britta; Burkolter, Dina

    2013-03-01

    Two studies are presented in which the design of a procedural aid and the impact of an additional decision aid for process control were assessed. In Study 1, a procedural aid was developed that avoids imposing unnecessary extraneous cognitive load on novices when controlling a complex technical system. This newly designed procedural aid positively affected germane load, attention, satisfaction, motivation, knowledge acquisition and diagnostic speed for novel faults. In Study 2, the effect of a decision aid for use before the procedural aid was investigated, which was developed based on an analysis of diagnostic errors committed in Study 1. Results showed that novices were able to diagnose both novel faults and practised faults, and were even faster at diagnosing novel faults. This research contributes to the question of how to optimally support novices in dealing with technical faults in process control. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  14. Notes on testing equality and interval estimation in Poisson frequency data under a three-treatment three-period crossover trial.

    PubMed

    Lui, Kung-Jong; Chang, Kuang-Chao

    2016-10-01

    When the frequency of event occurrences follows a Poisson distribution, we develop procedures for testing equality of treatments and interval estimators for the ratio of mean frequencies between treatments under a three-treatment three-period crossover design. Using Monte Carlo simulations, we evaluate the performance of these test procedures and interval estimators in various situations. We note that all test procedures developed here can perform well with respect to Type I error even when the number of patients per group is moderate. We further note that the two weighted-least-squares (WLS) test procedures derived here are generally preferable to the other two commonly used test procedures in the contingency table analysis. We also demonstrate that both interval estimators based on the WLS method and interval estimators based on Mantel-Haenszel (MH) approach can perform well, and are essentially of equal precision with respect to the average length. We use a double-blind randomized three-treatment three-period crossover trial comparing salbutamol and salmeterol with a placebo with respect to the number of exacerbations of asthma to illustrate the use of these test procedures and estimators. © The Author(s) 2014.

  15. Antecedent-Based Interventions for Young Children at Risk for Emotional and Behavioral Disorders

    ERIC Educational Resources Information Center

    Park, Kristy L.; Scott, Terrance M.

    2009-01-01

    Following descriptive functional assessment procedures, a brief structural analysis was used to confirm the hypothesized antecedent conditions that preceded problem behavior across three children enrolled in Head Start classrooms. A withdrawal design investigated the effectiveness of antecedent-based interventions to reduce disruptive behaviors…

  16. Advancements in LiDAR-based registration of FIA field plots

    Treesearch

    Demetrios Gatziolis

    2012-01-01

    Meaningful integration of National Forest Inventory field plot information with spectral imagery acquired from satellite or airborne platforms requires precise plot registration. Global positioning system-based plot registration procedures, such as the one employed by the Forest Inventory and Analysis (FIA) Program, yield plot coordinates that, although adequate for...

  17. Evidence-Based Kernels: Fundamental Units of Behavioral Influence

    ERIC Educational Resources Information Center

    Embry, Dennis D.; Biglan, Anthony

    2008-01-01

    This paper describes evidence-based kernels, fundamental units of behavioral influence that appear to underlie effective prevention and treatment for children, adults, and families. A kernel is a behavior-influence procedure shown through experimental analysis to affect a specific behavior and that is indivisible in the sense that removing any of…

  18. An Analysis of the Effectiveness of University Counselling Services

    ERIC Educational Resources Information Center

    Murray, Aja L.; McKenzie, Karen; Murray, Kara R.; Richelieu, Marc

    2016-01-01

    It is important to demonstrate replicable evidence of the effectiveness of counselling procedures. The study aimed to contribute to the currently limited evidence base examining the effectiveness of university student counselling in the UK. Information on therapeutic outcome [based on Clinical Outcomes in Routine Evaluation-Outcome Measure…

  19. Kepler AutoRegressive Planet Search

    NASA Astrophysics Data System (ADS)

    Feigelson, Eric

    NASA's Kepler mission is the source of more exoplanets than any other instrument, but the discovery depends on complex statistical analysis procedures embedded in the Kepler pipeline. A particular challenge is mitigating irregular stellar variability without loss of sensitivity to faint periodic planetary transits. This proposal presents a two-stage alternative analysis procedure. First, parametric autoregressive ARFIMA models, commonly used in econometrics, remove most of the stellar variations. Second, a novel matched filter is used to create a periodogram from which transit-like periodicities are identified. This analysis procedure, the Kepler AutoRegressive Planet Search (KARPS), is confirming most of the Kepler Objects of Interest and is expected to identify additional planetary candidates. The proposed research will complete application of the KARPS methodology to the prime Kepler mission light curves of 200,000 stars, and compare the results with Kepler Objects of Interest obtained with the Kepler pipeline. We will then conduct a variety of astronomical studies based on the KARPS results. Important subsamples will be extracted including Habitable Zone planets, hot super-Earths, grazing-transit hot Jupiters, and multi-planet systems. Ground-based spectroscopy of poorly studied candidates will be performed to better characterize the host stars. Studies of stellar variability will then be pursued based on KARPS analysis. The autocorrelation function and nonstationarity measures will be used to identify spotted stars at different stages of autoregressive modeling. Periodic variables with folded light curves inconsistent with planetary transits will be identified; they may be eclipsing or mutually-illuminating binary star systems. Classification of stellar variables with KARPS-derived statistical properties will be attempted. KARPS procedures will then be applied to archived K2 data to identify planetary transits and characterize stellar variability.

  20. A value-based medicine cost-utility analysis of idiopathic epiretinal membrane surgery.

    PubMed

    Gupta, Omesh P; Brown, Gary C; Brown, Melissa M

    2008-05-01

    To perform a reference case, cost-utility analysis of epiretinal membrane (ERM) surgery using current literature on outcomes and complications. Computer-based, value-based medicine analysis. Decision analyses were performed under two scenarios: ERM surgery in better-seeing eye and ERM surgery in worse-seeing eye. The models applied long-term published data primarily from the Blue Mountains Eye Study and the Beaver Dam Eye Study. Visual acuity and major complications were derived from 25-gauge pars plana vitrectomy studies. Patient-based, time trade-off utility values, Markov modeling, sensitivity analysis, and net present value adjustments were used in the design and calculation of results. Main outcome measures included the number of discounted quality-adjusted-life-years (QALYs) gained and dollars spent per QALY gained. ERM surgery in the better-seeing eye compared with observation resulted in a mean gain of 0.755 discounted QALYs (3% annual rate) per patient treated. This model resulted in $4,680 per QALY for this procedure. When sensitivity analysis was performed, utility values varied from $6,245 to $3,746/QALY gained, medical costs varied from $3,510 to $5,850/QALY gained, and ERM recurrence rate increased to $5,524/QALY. ERM surgery in the worse-seeing eye compared with observation resulted in a mean gain of 0.27 discounted QALYs per patient treated. The $/QALY was $16,146 with a range of $20,183 to $12,110 based on sensitivity analyses. Utility values ranged from $21,520 to $12,916/QALY and ERM recurrence rate increased to $16,846/QALY based on sensitivity analysis. ERM surgery is a very cost-effective procedure when compared with other interventions across medical subspecialties.
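
    The discounted cost-utility arithmetic behind figures such as dollars per QALY can be sketched as below; the utility gain, time horizon, and cost are placeholders, not the study's inputs.

```python
# Minimal sketch of discounted QALY and $/QALY arithmetic (3% annual discount
# rate, as in the abstract). All numbers below are placeholders, not study data.
def discounted_qalys(annual_utility_gain, years, rate=0.03):
    return sum(annual_utility_gain / (1 + rate) ** t for t in range(1, years + 1))

utility_gain = 0.06          # hypothetical per-year utility improvement from surgery
horizon = 15                 # hypothetical remaining life expectancy in years
cost = 3_500.0               # hypothetical incremental cost of the procedure ($)

qalys = discounted_qalys(utility_gain, horizon)
print(f"{qalys:.3f} QALYs gained, ${cost / qalys:,.0f} per QALY gained")
```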

  1. Development of an ELISA for evaluation of swab recovery efficiencies of bovine serum albumin.

    PubMed

    Sparding, Nadja; Slotved, Hans-Christian; Nicolaisen, Gert M; Giese, Steen B; Elmlund, Jón; Steenhard, Nina R

    2014-01-01

    After a potential biological incident the sampling strategy and sample analysis are crucial for the outcome of the investigation and identification. In this study, we have developed a simple sandwich ELISA based on commercial components to quantify BSA (used as a surrogate for ricin) with a detection range of 1.32-80 ng/mL. We used the ELISA to evaluate different protein swabbing procedures (swabbing techniques and after-swabbing treatments) for two swab types: a cotton gauze swab and a flocked nylon swab. The optimal swabbing procedure for each swab type was used to obtain recovery efficiencies from different surface materials. The surface recoveries using the optimal swabbing procedure ranged from 0-60% and were significantly higher from nonporous surfaces compared to porous surfaces. In conclusion, this study presents a swabbing procedure evaluation and a simple BSA ELISA based on commercial components, which are easy to perform in a laboratory with basic facilities. The data indicate that different swabbing procedures were optimal for each of the tested swab types, and the particular swab preference depends on the surface material to be swabbed.
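
    One way such recovery efficiencies can be computed is by fitting a four-parameter logistic (4PL) standard curve to the ELISA standards, back-calculating the swab-extract concentration, and dividing by the spiked amount; the sketch below follows that pattern. The standard-curve readings and swab values are made up for illustration and are not the study's data.

```python
# Sketch of recovery-efficiency calculation from a sandwich-ELISA standard curve.
# Standard-curve and swab readings below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """Four-parameter logistic: a = zero-dose response, d = saturating response."""
    return d + (a - d) / (1 + (x / c) ** b)

conc = np.array([1.32, 2.5, 5, 10, 20, 40, 80])              # ng/mL standards
od = np.array([0.08, 0.15, 0.27, 0.48, 0.83, 1.30, 1.85])    # hypothetical optical density
popt, _ = curve_fit(four_pl, conc, od, p0=[0.05, 1.0, 10.0, 2.5], maxfev=10_000)

def back_calc(od_value, a, b, c, d):
    """Invert the 4PL to recover concentration from a measured response."""
    return c * ((a - d) / (od_value - d) - 1) ** (1 / b)

spiked = 50.0                                                # ng/mL applied to the surface
swab_od = 0.65                                               # hypothetical swab-extract OD
recovered = back_calc(swab_od, *popt)
print(f"recovery efficiency: {recovered / spiked:.0%}")
```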

  2. Comparison of design strategies for a three-arm clinical trial with time-to-event endpoint: Power, time-to-analysis, and operational aspects.

    PubMed

    Asikanius, Elina; Rufibach, Kaspar; Bahlo, Jasmin; Bieska, Gabriele; Burger, Hans Ulrich

    2016-11-01

    To optimize resources, randomized clinical trials with multiple arms can be an attractive option to simultaneously test various treatment regimens in pharmaceutical drug development. The motivation for this work was the successful conduct and positive final outcome of a three-arm randomized clinical trial primarily assessing whether obinutuzumab plus chlorambucil in patients with chronic lymphocytic leukaemia and coexisting conditions is superior to chlorambucil alone based on a time-to-event endpoint. The inference strategy of this trial was based on a closed testing procedure. We compare this strategy to three potential alternatives to run a three-arm clinical trial with a time-to-event endpoint. The primary goal is to quantify the differences between these strategies in terms of the time it takes until the first analysis and thus potential approval of a new drug, number of required events, and power. Operational aspects of implementing the various strategies are discussed. In conclusion, using a closed testing procedure results in the shortest time to the first analysis with a minimal loss in power. Therefore, closed testing procedures should be part of the statistician's standard clinical trials toolbox when planning multiarm clinical trials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
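
    The closed-testing logic for two primary comparisons against control can be sketched as follows: an elementary hypothesis is rejected only if both the intersection hypothesis and the elementary hypothesis itself are rejected at level alpha. The Bonferroni intersection test and the p-values below are illustrative choices, not necessarily the trial's exact specification.

```python
# Sketch of a closed testing procedure for two comparisons against control.
# Bonferroni is used for the intersection test as one common choice; the
# p-values are placeholders (the actual trial used time-to-event analyses).
def closed_test(p_combo_vs_ctrl, p_mono_vs_ctrl, alpha=0.05):
    # Intersection hypothesis: neither comparison shows an effect.
    p_intersection = min(1.0, 2 * min(p_combo_vs_ctrl, p_mono_vs_ctrl))
    reject_intersection = p_intersection <= alpha
    return {
        "combination vs control": reject_intersection and p_combo_vs_ctrl <= alpha,
        "monotherapy vs control": reject_intersection and p_mono_vs_ctrl <= alpha,
    }

print(closed_test(0.001, 0.03))   # both elementary hypotheses rejected here
```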

  3. Effects of Reusing Gel Electrophoresis and Electrotransfer Buffers on Western Blotting.

    PubMed

    Heda, Ghanshyam D; Omotola, Oluwabukola B; Heda, Rajiv P; Avery, Jamie

    2016-09-01

    SDS-PAGE and Western blotting are 2 of the most commonly used biochemical methods for protein analysis. Proteins are electrophoretically separated based on their MWs by SDS-PAGE and then electrotransferred to a solid membrane surface for subsequent protein-specific analysis by immunoblotting, a procedure commonly known as Western blotting. Both of these procedures use a salt-based buffer, with the latter procedure consisting of methanol as an additive known for its toxicity. Previous reports present a contradictory view in favor or against reusing electrotransfer buffer, also known as Towbin's transfer buffer (TTB), with an aim to reduce the toxic waste. In this report, we present a detailed analysis of not only reusing TTB but also gel electrophoresis buffer (EB) on proteins of low to high MW range. Our results suggest that EB can be reused for at least 5 times without compromising the electrophoretic separation of mixture of proteins in an MW standard, BSA, and crude cell lysates. Additionally, reuse of EB did not affect the quality of subsequent Western blots. Successive reuse of TTB, on the other hand, diminished the signal of proteins of different MWs in a protein standard and a high MW membrane protein cystic fibrosis transmembrane-conductance regulator (CFTR) in Western blotting.

  4. Procedural instruction in invasive bedside procedures: a systematic review and meta-analysis of effective teaching approaches.

    PubMed

    Huang, Grace C; McSparron, Jakob I; Balk, Ethan M; Richards, Jeremy B; Smith, C Christopher; Whelan, Julia S; Newman, Lori R; Smetana, Gerald W

    2016-04-01

    Optimal approaches to teaching bedside procedures are unknown. To identify effective instructional approaches in procedural training. We searched PubMed, EMBASE, Web of Science and Cochrane Library through December 2014. We included research articles that addressed procedural training among physicians or physician trainees for 12 bedside procedures. Two independent reviewers screened 9312 citations and identified 344 articles for full-text review. Two independent reviewers extracted data from full-text articles. We included measurements as classified by translational science outcomes T1 (testing settings), T2 (patient care practices) and T3 (patient/public health outcomes). Due to incomplete reporting, we post hoc classified study outcomes as 'negative' or 'positive' based on statistical significance. We performed meta-analyses of outcomes on the subset of studies sharing similar outcomes. We found 161 eligible studies (44 randomised controlled trials (RCTs), 34 non-RCTs and 83 uncontrolled trials). Simulation was the most frequently published educational mode (78%). Our post hoc classification showed that studies involving simulation, competency-based approaches and RCTs had higher frequencies of T2/T3 outcomes. Meta-analyses showed that simulation (risk ratio (RR) 1.54 vs 0.55 for studies with vs without simulation, p=0.013) and competency-based approaches (RR 3.17 vs 0.89, p<0.001) were effective forms of training. This systematic review of bedside procedural skills demonstrates that the current literature is heterogeneous and of varying quality and rigour. Evidence is strongest for the use of simulation and competency-based paradigms in teaching procedures, and these approaches should be the mainstay of programmes that train physicians to perform procedures. Further research should clarify differences among instructional methods (eg, forms of hands-on training) rather than among educational modes (eg, lecture vs simulation). Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
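
    For orientation, the kind of calculation behind a pooled risk ratio can be sketched with inverse-variance weighting of study-level log risk ratios; the study counts below are invented and this is not the review's actual data or meta-analytic model.

```python
# Sketch of inverse-variance (fixed-effect) pooling of study-level risk ratios.
# Study counts are invented placeholders, not data from the review.
import numpy as np

# (events_treated, n_treated, events_control, n_control) per study
studies = [(18, 25, 9, 24), (30, 40, 15, 38), (12, 20, 10, 22)]

log_rr, weights = [], []
for a, n1, c, n2 in studies:
    rr = (a / n1) / (c / n2)
    var = 1 / a - 1 / n1 + 1 / c - 1 / n2          # variance of log risk ratio
    log_rr.append(np.log(rr))
    weights.append(1 / var)

pooled = np.average(log_rr, weights=weights)
se = 1 / np.sqrt(sum(weights))
print(f"pooled RR {np.exp(pooled):.2f} "
      f"(95% CI {np.exp(pooled - 1.96 * se):.2f}-{np.exp(pooled + 1.96 * se):.2f})")
```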

  5. Applications of remote sensing, volume 3

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A. (Principal Investigator)

    1977-01-01

    The author has identified the following significant results. Of the four change detection techniques (post classification comparison, delta data, spectral/temporal, and layered spectral temporal), the post classification comparison was selected for further development. This was based upon test performances of the four change detection methods, straightforwardness of the procedures, and the output products desired. A standardized, modified supervised classification procedure for analyzing the Texas coastal zone data was compiled. This procedure was developed so that all quadrangles in the study area would be classified using similar analysis techniques to allow for meaningful comparisons and evaluations of the classifications.

  6. Analysis of vibrational load influence upon passengers in trains with a compulsory body tilt

    NASA Astrophysics Data System (ADS)

    Antipin, D. Ya; Kobishchanov, V. V.; Lapshin, V. F.; Mitrakov, A. S.; Shorokhov, S. G.

    2017-02-01

    A procedure for forecasting the vibrational load influence upon passengers of trains equipped with a system of compulsory body tilt on railroad curves is offered. The procedure is based on the use of computer simulation methods and the application of solid-state models of anthropometric mannequins. As a result of the investigations carried out, criteria for estimating the comfort level of passengers in the rolling stock under consideration are substantiated. The procedure is validated using the example of a promising domestic rolling stock with compulsory body tilt on railroad curves.

  7. Alternative Modal Basis Selection Procedures For Reduced-Order Nonlinear Random Response Simulation

    NASA Technical Reports Server (NTRS)

    Przekop, Adam; Guo, Xinyun; Rizi, Stephen A.

    2012-01-01

    Three procedures to guide selection of an efficient modal basis in a nonlinear random response analysis are examined. One method is based only on proper orthogonal decomposition, while the other two additionally involve smooth orthogonal decomposition. Acoustic random response problems are employed to assess the performance of the three modal basis selection approaches. A thermally post-buckled beam exhibiting snap-through behavior, a shallowly curved arch in the auto-parametric response regime and a plate structure are used as numerical test articles. The results of a computationally taxing full-order analysis in physical degrees of freedom are taken as the benchmark for comparison with the results from the three reduced-order analyses. For the cases considered, all three methods are shown to produce modal bases resulting in accurate and computationally efficient reduced-order nonlinear simulations.

  8. Blood gas testing and related measurements: National recommendations on behalf of the Croatian Society of Medical Biochemistry and Laboratory Medicine.

    PubMed

    Dukić, Lora; Kopčinović, Lara Milevoj; Dorotić, Adrijana; Baršić, Ivana

    2016-10-15

    Blood gas analysis (BGA) is exposed to risks of errors caused by improper sampling, transport and storage conditions. The Clinical and Laboratory Standards Institute (CLSI) generated documents with recommendations for avoidance of potential errors caused by sample mishandling. Two main documents related to BGA issued by the CLSI are GP43-A4 (former H11-A4) Procedures for the collection of arterial blood specimens; approved standard - fourth edition, and C46-A2 Blood gas and pH analysis and related measurements; approved guideline - second edition. Practices related to processing of blood gas samples are not standardized in the Republic of Croatia. Each institution has its own protocol for ordering, collection and analysis of blood gases. Although many laboratories use state of the art analyzers, still many preanalytical procedures remain unchanged. The objective of the Croatian Society of Medical Biochemistry and Laboratory Medicine (CSMBLM) is to standardize the procedures for BGA based on CLSI recommendations. The Working Group for Blood Gas Testing as part of the Committee for the Scientific Professional Development of the CSMBLM prepared a set of recommended protocols for sampling, transport, storage and processing of blood gas samples based on relevant CLSI documents, relevant literature search and on the results of Croatian survey study on practices and policies in acid-base testing. Recommendations are intended for laboratory professionals and all healthcare workers involved in blood gas processing.

  9. Blood gas testing and related measurements: National recommendations on behalf of the Croatian Society of Medical Biochemistry and Laboratory Medicine

    PubMed Central

    Dukić, Lora; Kopčinović, Lara Milevoj; Dorotić, Adrijana; Baršić, Ivana

    2016-01-01

    Blood gas analysis (BGA) is exposed to risks of errors caused by improper sampling, transport and storage conditions. The Clinical and Laboratory Standards Institute (CLSI) generated documents with recommendations for avoidance of potential errors caused by sample mishandling. Two main documents related to BGA issued by the CLSI are GP43-A4 (former H11-A4) Procedures for the collection of arterial blood specimens; approved standard – fourth edition, and C46-A2 Blood gas and pH analysis and related measurements; approved guideline – second edition. Practices related to processing of blood gas samples are not standardized in the Republic of Croatia. Each institution has its own protocol for ordering, collection and analysis of blood gases. Although many laboratories use state of the art analyzers, still many preanalytical procedures remain unchanged. The objective of the Croatian Society of Medical Biochemistry and Laboratory Medicine (CSMBLM) is to standardize the procedures for BGA based on CLSI recommendations. The Working Group for Blood Gas Testing as part of the Committee for the Scientific Professional Development of the CSMBLM prepared a set of recommended protocols for sampling, transport, storage and processing of blood gas samples based on relevant CLSI documents, relevant literature search and on the results of Croatian survey study on practices and policies in acid-base testing. Recommendations are intended for laboratory professionals and all healthcare workers involved in blood gas processing. PMID:27812301

  10. Formalizing the Austrian Procedure Catalogue: A 4-step methodological analysis approach.

    PubMed

    Neururer, Sabrina Barbara; Lasierra, Nelia; Peiffer, Karl Peter; Fensel, Dieter

    2016-04-01

    Due to the lack of an internationally accepted and adopted standard for coding health interventions, Austria has established its own country-specific procedure classification system - the Austrian Procedure Catalogue (APC). Even though the APC is an elaborate coding standard for medical procedures, it has shortcomings that limit its usability. In order to enhance usability and usefulness, especially for research purposes and e-health applications, we developed an ontologized version of the APC. In this paper we present a novel four-step approach for the ontology engineering process, which enables accurate extraction of relevant concepts for medical ontologies from written text. The proposed approach for formalizing the APC consists of the following four steps: (1) comparative pre-analysis, (2) definition analysis, (3) typological analysis, and (4) ontology implementation. The first step contained a comparison of the APC to other well-established or elaborate health intervention coding systems in order to identify strengths and weaknesses of the APC. In the second step, a list of definitions of medical terminology used in the APC was obtained. This list of definitions was used as input for Step 3, in which we identified the most important concepts to describe medical procedures using the qualitative typological analysis approach. The definition analysis as well as the typological analysis are well-known and effective methods used in social sciences, but not commonly employed in the computer science or ontology engineering domain. Finally, this list of concepts was used in Step 4 to formalize the APC. The pre-analysis highlighted the major shortcomings of the APC, such as the lack of formal definition, leading to implicitly available, but not directly accessible information (hidden data), or the poor procedural type classification. After performing the definition and subsequent typological analyses, we were able to identify the following main characteristics of health interventions: (1) Procedural type, (2) Anatomical site, (3) Medical device, (4) Pathology, (5) Access, (6) Body system, (7) Population, (8) Aim, (9) Discipline, (10) Technique, and (11) Body Function. These main characteristics were taken as input of classes for the formalization of the APC. We were also able to identify relevant relations between classes. The proposed four-step approach for formalizing the APC provides a novel, systematically developed, strong framework to semantically enrich procedure classifications. Although this methodology was designed to address the particularities of the APC, the included methods are based on generic analysis tasks, and therefore can be re-used to provide a systematic representation of other procedure catalogs or classification systems and hence contribute towards a universal alignment of such representations, if desired. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. Multidisciplinary tailoring of hot composite structures

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.; Chamis, Christos C.

    1993-01-01

    A computational simulation procedure is described for multidisciplinary analysis and tailoring of layered multi-material hot composite engine structural components subjected to simultaneous multiple discipline-specific thermal, structural, vibration, and acoustic loads. The effect of aggressive environments is also simulated. The simulation is based on a three-dimensional finite element analysis technique in conjunction with structural mechanics codes, thermal/acoustic analysis methods, and tailoring procedures. The integrated multidisciplinary simulation procedure is general-purpose including the coupled effects of nonlinearities in structure geometry, material, loading, and environmental complexities. The composite material behavior is assessed at all composite scales, i.e., laminate/ply/constituents (fiber/matrix), via a nonlinear material characterization hygro-thermo-mechanical model. Sample tailoring cases exhibiting nonlinear material/loading/environmental behavior of aircraft engine fan blades, are presented. The various multidisciplinary loads lead to different tailored designs, even those competing with each other, as in the case of minimum material cost versus minimum structure weight and in the case of minimum vibration frequency versus minimum acoustic noise.

  12. An experimental protocol for the definition of upper limb anatomical frames on children using magneto-inertial sensors.

    PubMed

    Ricci, L; Formica, D; Tamilia, E; Taffoni, F; Sparaci, L; Capirci, O; Guglielmelli, E

    2013-01-01

    Motion capture based on magneto-inertial sensors is a technology enabling data collection in unstructured environments, allowing "out of the lab" motion analysis. This technology is a good candidate for motion analysis of children thanks to the reduced weight and size as well as the use of wireless communication, which has improved its wearability and reduced its obtrusiveness. A key issue in the application of such technology for motion analysis is its calibration, i.e. a process that allows mapping orientation information from each sensor to a physiological reference frame. To date, even though there are several calibration procedures available for adults, no specific calibration procedures have been developed for children. This work addresses this specific issue by presenting a calibration procedure for motion capture of the thorax and upper limbs on healthy children. Reported results suggest comparable performance with similar studies on adults and emphasize some critical issues, opening the way to further improvements.

  13. Effect of Music on Outpatient Urological Procedures: A Systematic Review and Meta-Analysis from the European Association of Urology Section of Uro-Technology.

    PubMed

    Kyriakides, Rena; Jones, Patrick; Geraghty, Robert; Skolarikos, Andreas; Liatsikos, Evangellos; Traxer, Olivier; Pietropaolo, Amelia; Somani, Bhaskar K

    2018-05-01

    Music is a practical, inexpensive and harmless analgesic and anxiolytic. An increasing number of original studies have been performed to investigate its potential application in urology. Our aim was to identify the effect of music on outpatient based urological procedures. We systematically reviewed the effect of using music during all reported outpatient urology procedures, including transrectal ultrasound guided prostate biopsy, shock wave lithotripsy, urodynamic studies, percutaneous nephrostomy tube placement and cystoscopy. Data were included on all randomized trials from 1980 to 2017 and no language restrictions were applied. Included in analysis were 16 randomized studies in which 972 of 1,950 patients (49.8%) were exposed to music during an outpatient procedure. The procedures included transrectal ultrasound guided prostate biopsy in 4 studies in a total of 286 patients, shock wave lithotripsy in 6 studies in a total of 1,023, cystoscopy in 3 studies in a total of 331, urodynamics in 2 studies in a total of 210 and percutaneous nephrostomy in 1 study in a total of 100. All studies incorporated a visual analog score to measure pain. Anxiety was measured by STAI (State-Trait Anxiety Inventory) in 13 studies and by a visual analog scale in 2. While 14 of the 16 studies showed a reduction in self-reported pain, a reduction in anxiety was seen in 14. When using music, overall procedural satisfaction was better in 9 studies and patient willingness to repeat the procedure was also higher in 7. Our meta-analysis revealed a significant reduction in visual analog scale and STAI findings across all studies (p <0.001). Our systematic review demonstrated a beneficial effect of music on urological outpatient procedures. Music seemed to decrease anxiety and pain. It might serve as a useful adjunct to increase procedural satisfaction and patient willingness to undergo the procedure again. Copyright © 2018 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  14. AGR-1 Thermocouple Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeff Einerson

    2012-05-01

    This report documents an effort to analyze measured and simulated data obtained in the Advanced Gas Reactor (AGR) fuel irradiation test program conducted in the INL's Advanced Test Reactor (ATR) to support the Next Generation Nuclear Plant (NGNP) R&D program. The work follows up on a previous study (Pham and Einerson, 2010), in which statistical analysis methods were applied for AGR-1 thermocouple data qualification. The present work exercises the idea that, while recognizing uncertainties inherent in physics and thermal simulations of the AGR-1 test, results of the numerical simulations can be used in combination with the statistical analysis methods to further improve qualification of measured data. Additionally, the combined analysis of measured and simulation data can generate insights about simulation model uncertainty that can be useful for model improvement. This report also describes an experimental control procedure to maintain fuel target temperature in the future AGR tests using regression relationships that include simulation results. The report is organized into four chapters. Chapter 1 introduces the AGR Fuel Development and Qualification program, AGR-1 test configuration and test procedure, overview of AGR-1 measured data, and overview of physics and thermal simulation, including modeling assumptions and uncertainties. A brief summary of statistical analysis methods developed in (Pham and Einerson 2010) for AGR-1 measured data qualification within the NGNP Data Management and Analysis System (NDMAS) is also included for completeness. Chapters 2-3 describe and discuss cases in which the combined use of experimental and simulation data is realized. A set of issues associated with measurement and modeling uncertainties resulting from the combined analysis is identified. This includes demonstration that such a combined analysis led to important insights for reducing uncertainty in the presentation of AGR-1 measured data (Chapter 2) and the interpretation of simulation results (Chapter 3). The statistics-based, simulation-aided experimental control procedure described for the future AGR tests is developed and demonstrated in Chapter 4. The procedure for controlling the target fuel temperature (capsule peak or average) is based on regression functions of thermocouple readings and other relevant parameters, accounting for possible changes in both physical and thermal conditions and in instrument performance.

  15. Factors that influence length of stay for in-patient gynaecology surgery: is the Case Mix Group (CMG) or type of procedure more important?

    PubMed

    Carey, Mark S; Victory, Rahi; Stitt, Larry; Tsang, Nicole

    2006-02-01

    To compare the association between the Case Mix Group (CMG) code and length of stay (LOS) with the association between the type of procedure and LOS in patients admitted for gynaecology surgery. We examined the records of women admitted for surgery in CMG 579 (major uterine/adnexal procedure, no malignancy) or 577 (major surgery ovary/adnexa with malignancy) between April 1997 and March 1999. Factors thought to influence LOS included age, weight, American Society of Anesthesiologists (ASA) score, physician, day of the week on which surgery was performed, and procedure type. Procedures were divided into six categories, four for CMG 579 and two for CMG 577. Data were abstracted from the hospital information costing system (T2 system) and by retrospective chart review. Multivariable analysis was performed using linear regression with backwards elimination. There were 606 patients in CMG 579 and 101 patients in CMG 577, and the corresponding median LOS was four days (range 1-19) for CMG 579 and nine days (range 3-30) for CMG 577. Combined analysis of both CMGs 577 and 579 revealed the following factors as highly significant determinants of LOS: procedure, age, physician, and ASA score. Although confounded by procedure type, the CMG did not significantly account for differences in LOS in the model once procedure was considered. Pairwise comparisons of procedure categories were all found to be statistically significant, even when controlled for other important variables. The type of procedure better accounts for differences in LOS by describing six statistically distinct procedure groups rather than the traditional two CMGs. It is reasonable, therefore, to consider changing the current CMG codes for gynaecology to a classification based on the type of procedure.
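
    A sketch of multivariable linear regression with backwards elimination, the modelling strategy described above, is shown below; the data frame, coefficients, and column names are synthetic placeholders rather than the study's records.

```python
# Sketch of backwards elimination in multivariable linear regression.
# Synthetic data and hypothetical covariate names; not the study's records.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "age": rng.normal(45, 12, n),
    "asa": rng.integers(1, 4, n),
    "weight": rng.normal(70, 15, n),
    "proc_malignancy": rng.integers(0, 2, n),
})
df["los"] = 2 + 0.03 * df.age + 0.8 * df.asa + 4 * df.proc_malignancy + rng.normal(0, 1.5, n)

predictors = ["age", "asa", "weight", "proc_malignancy"]
while predictors:
    model = sm.OLS(df["los"], sm.add_constant(df[predictors])).fit()
    worst = model.pvalues.drop("const").idxmax()
    if model.pvalues[worst] < 0.05:
        break                                   # all remaining predictors are significant
    predictors.remove(worst)                    # eliminate the weakest predictor and refit

print(model.summary().tables[1])
```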

  16. Analysis of free and bound chlorophenoxy acids in cereals.

    PubMed

    Lokke, H

    1975-06-01

    Extraction of the chlorophenoxy acids 2,4-D and dichlorprop in cereals has been examined by analyzing barley from spraying experiments. A procedure has been set up by combination of acid hydrolysis and enzymatic degradation followed by extraction and clean-up on either silica gel or basic aluminum oxide. The final determination is based on reaction with diazomethane and subsequent GLC with ECD. This procedure was compared with two different extraction procedures previously described in the literature. The first comparative procedure uses a mixture of 50% diethyl ether/hexane in the presence of sulphuric acid and resulted in residues up to ten times lower than found after the combined acid hydrolysis/enzymatic degradation procedure. In the second comparison, a direct extraction was made with a mixture of 65% (v/v) acetonitrile in water. No differences were found between this and the combined acid hydrolysis/enzymatic degradation procedure.

  17. A TOTP-Based Enhanced Route Optimization Procedure for Mobile IPv6 to Reduce Handover Delay and Signalling Overhead

    PubMed Central

    Shah, Peer Azmat; Hasbullah, Halabi B.; Lawal, Ibrahim A.; Aminu Mu'azu, Abubakar; Tang Jung, Low

    2014-01-01

    Due to the proliferation of handheld mobile devices, multimedia applications like Voice over IP (VoIP), video conferencing, network music, and online gaming have been gaining popularity in recent years. These applications are well known to be delay sensitive and resource demanding. The mobility of mobile devices, running these applications, across different networks causes delay and service disruption. Mobile IPv6 was proposed to provide mobility support to IPv6-based mobile nodes for continuous communication when they roam across different networks. However, the Route Optimization procedure in Mobile IPv6 involves the verification of the mobile node's reachability at the home address and at the care-of address (home test and care-of test), which results in higher handover delays and signalling overhead. This paper presents an enhanced procedure, time-based one-time password Route Optimization (TOTP-RO), for Mobile IPv6 Route Optimization that uses the concepts of a shared secret Token and a time-based one-time password (TOTP), along with verification of the mobile node via direct communication and maintenance of the status of the correspondent node's compatibility. TOTP-RO was implemented in the network simulator NS-2, and an analytical evaluation was also performed. Analysis showed that TOTP-RO has lower handover delays, packet loss, and signalling overhead with an increased level of security as compared to the standard Mobile IPv6's Return-Routability-based Route Optimization (RR-RO). PMID:24688398
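
    For readers unfamiliar with TOTP, the sketch below generates an RFC 6238-style time-based one-time password with the Python standard library; the shared secret and time step are placeholders, and the snippet only illustrates the kind of token TOTP-RO exchanges, not the paper's protocol messages.

      # Minimal RFC 6238-style TOTP generator (standard library only).
      import hmac, hashlib, struct, time

      def totp(secret, time_step=30, digits=6, t=None):
          counter = int((time.time() if t is None else t) // time_step)
          msg = struct.pack(">Q", counter)                      # 8-byte big-endian counter
          digest = hmac.new(secret, msg, hashlib.sha1).digest()
          offset = digest[-1] & 0x0F                            # dynamic truncation (RFC 4226)
          code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
          return str(code).zfill(digits)

      shared_secret = b"mobile-node-shared-token"   # placeholder shared Token
      print(totp(shared_secret))                    # mobile node side
      print(totp(shared_secret))                    # correspondent node computes the same window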

  18. Ranking of predictor variables based on effect size criterion provides an accurate means of automatically classifying opinion column articles

    NASA Astrophysics Data System (ADS)

    Legara, Erika Fille; Monterola, Christopher; Abundo, Cheryl

    2011-01-01

    We demonstrate an accurate procedure based on linear discriminant analysis that allows automatic authorship classification of opinion column articles. First, we extract the following stylometric features of 157 column articles from four authors: statistics on high frequency words, number of words per sentence, and number of sentences per paragraph. Then, by systematically ranking these features based on an effect size criterion, we show that we can achieve an average classification accuracy of 93% for the test set. In comparison, frequency-based ranking has an average accuracy of 80%. The highest possible average classification accuracy of our data merely relying on chance is ∼31%. By carrying out sensitivity analysis, we show that the effect size criterion is superior to frequency ranking because there exist low frequency words that significantly contribute to successful author discrimination. Consistent results are seen when the procedure is applied in classifying the undisputed Federalist papers of Alexander Hamilton and James Madison. To the best of our knowledge, this work is the first attempt at classifying opinion column articles, which, by virtue of being shorter in length (as compared to novels or short stories), are more prone to over-fitting issues. The near perfect classification for the longer papers supports this claim. Our results provide an important insight on authorship attribution that has been overlooked in previous studies: that ranking discriminant variables based on word frequency counts is not necessarily an optimal procedure.
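
    A rough sketch of the ranking-plus-classification idea, using Cohen's d as a stand-in effect-size measure (the paper's exact criterion is not reproduced) and scikit-learn's linear discriminant analysis on synthetic stylometric features for two authors.

      # Effect-size feature ranking followed by LDA classification (synthetic data).
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      # Rows = articles, columns = stylometric features; two synthetic "authors"
      X = np.vstack([rng.normal(0.0, 1.0, (80, 20)), rng.normal(0.4, 1.0, (80, 20))])
      y = np.array([0] * 80 + [1] * 80)

      def cohens_d(x0, x1):
          pooled = np.sqrt((x0.var(ddof=1) + x1.var(ddof=1)) / 2)
          return abs(x0.mean() - x1.mean()) / pooled

      d = np.array([cohens_d(X[y == 0, j], X[y == 1, j]) for j in range(X.shape[1])])
      top = np.argsort(d)[::-1][:5]          # keep the five highest effect-size features

      acc = cross_val_score(LinearDiscriminantAnalysis(), X[:, top], y, cv=5).mean()
      print(f"cross-validated accuracy with effect-size-ranked features: {acc:.2f}")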

  19. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery.

    PubMed

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules.

  20. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery

    PubMed Central

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules. PMID:24623959

  1. Quantitative evaluation of variations in rule-based classifications of land cover in urban neighbourhoods using WorldView-2 imagery

    NASA Astrophysics Data System (ADS)

    Belgiu, Mariana; Drǎguţ, Lucian; Strobl, Josef

    2014-01-01

    The increasing availability of high resolution imagery has triggered the need for automated image analysis techniques, with reduced human intervention and reproducible analysis procedures. The knowledge gained in the past might be of use to achieving this goal, if systematically organized into libraries which would guide the image analysis procedure. In this study we aimed at evaluating the variability of digital classifications carried out by three experts who were all assigned the same interpretation task. Besides the three classifications performed by independent operators, we developed an additional rule-based classification that relied on the image classifications best practices found in the literature, and used it as a surrogate for libraries of object characteristics. The results showed statistically significant differences among all operators who classified the same reference imagery. The classifications carried out by the experts achieved satisfactory results when transferred to another area for extracting the same classes of interest, without modification of the developed rules.

  2. Two Decades of Literature on Self-Directed Learning: A Content Analysis.

    ERIC Educational Resources Information Center

    Brockett, Ralph G.; Stockdale, Susan L.; Fogerson, Dewey L.; Cox, Barry F.; Canipe, James B.; Chuprina, Larissa A.; Donaghy, Robert C.; Chadwell, Nancy E.

    Using a quantitative content analysis approach, a study examined the literature on self direction, or self-directed learning (SDL), that appeared in 14 mainstream adult education journals between 1980-98. The procedure involved classifying, entering, and tallying information on each article through use of an Internet-based program. Results…

  3. Image processing and classification procedures for analysis of sub-decimeter imagery acquired with an unmanned aircraft over arid rangelands

    USDA-ARS?s Scientific Manuscript database

    Using five centimeter resolution images acquired with an unmanned aircraft system (UAS), we developed and evaluated an image processing workflow that included the integration of resolution-appropriate field sampling, feature selection, object-based image analysis, and processing approaches for UAS i...

  4. Method 365.5 Determination of Orthophosphate in Estuarine and Coastal Waters by Automated Colorimetric Analysis

    EPA Science Inventory

    This method provides a procedure for the determination of low-level orthophosphate concentrations normally found in estuarine and/or coastal waters. It is based upon the method of Murphy and Riley [1] adapted for automated segmented flow analysis [2] in which the two reagent solutions ...

  5. Quantitative 13C NMR characterization of fast pyrolysis oils

    DOE PAGES

    Happs, Renee M.; Lisa, Kristina; Ferrell, III, Jack R.

    2016-10-20

    Quantitative 13C NMR analysis of model catalytic fast pyrolysis (CFP) oils following literature procedures showed poor agreement for aromatic hydrocarbons between NMR measured concentrations and actual composition. Furthermore, modifying integration regions based on DEPT analysis for aromatic carbons resulted in better agreement. Solvent effects were also investigated for hydrotreated CFP oil.

  6. Optical disk processing of solar images.

    NASA Astrophysics Data System (ADS)

    Title, A.; Tarbell, T.

    The current generation of space and ground-based experiments in solar physics produces many megabyte-sized image data arrays. Optical disk technology is the leading candidate for convenient analysis, distribution, and archiving of these data. The authors have been developing data analysis procedures which use both analog and digital optical disks for the study of solar phenomena.

  7. Quantitative 13C NMR characterization of fast pyrolysis oils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Happs, Renee M.; Lisa, Kristina; Ferrell, III, Jack R.

    Quantitative 13C NMR analysis of model catalytic fast pyrolysis (CFP) oils following literature procedures showed poor agreement for aromatic hydrocarbons between NMR measured concentrations and actual composition. Furthermore, modifying integration regions based on DEPT analysis for aromatic carbons resulted in better agreement. Solvent effects were also investigated for hydrotreated CFP oil.

  8. A Comparison of Mean Phase Difference and Generalized Least Squares for Analyzing Single-Case Data

    ERIC Educational Resources Information Center

    Manolov, Rumen; Solanas, Antonio

    2013-01-01

    The present study focuses on single-case data analysis, specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which allows obtaining similar information. The…

  9. Failure mode analysis in adrenal vein sampling: a single-center experience.

    PubMed

    Trerotola, Scott O; Asmar, Melissa; Yan, Yan; Fraker, Douglas L; Cohen, Debbie L

    2014-10-01

    To analyze failure modes in a high-volume adrenal vein sampling (AVS) practice in an effort to identify preventable causes of nondiagnostic sampling. A retrospective database was constructed containing 343 AVS procedures performed over a 10-year period. Each nondiagnostic AVS procedure was reviewed for failure mode and correlated with results of any repeat AVS. Data collected included selectivity index, lateralization index, adrenalectomy outcomes if performed, and details of AVS procedure. All AVS procedures were performed after cosyntropin stimulation, using sequential technique. AVS was nondiagnostic in 12 of 343 (3.5%) primary procedures and 2 secondary procedures. Failure was right-sided in 8 (57%) procedures, left-sided in 4 (29%) procedures, bilateral in 1 procedure, and neither in 1 procedure (laboratory error). Failure modes included diluted sample from correctly identified vein (n = 7 [50%]; 3 right and 4 left), vessel misidentified as adrenal vein (n = 3 [21%]; all right), failure to locate an adrenal vein (n = 2 [14%]; both right), cosyntropin stimulation failure (n = 1 [7%]; diagnostic by nonstimulated criteria), and laboratory error (n = 1 [7%]; specimen loss). A second AVS procedure was diagnostic in three of five cases (60%), and a third AVS procedure was diagnostic in one of one case (100%). Among the eight patients in whom AVS ultimately was not diagnostic, four underwent adrenalectomy based on diluted AVS samples, and one underwent adrenalectomy based on imaging; all five experienced improvement in aldosteronism. A substantial percentage of AVS failures occur on the left, all related to dilution. Even when technically nondiagnostic per strict criteria, some "failed" AVS procedures may be sufficient to guide therapy. Repeat AVS has a good yield. Copyright © 2014 SIR. Published by Elsevier Inc. All rights reserved.
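
    For orientation, the snippet below computes the selectivity and lateralization indices as they are commonly defined in AVS interpretation (adrenal-to-peripheral cortisol ratio, and the ratio of cortisol-corrected aldosterone between the two sides); the definitions and numbers are illustrative assumptions, not values or cutoffs taken from this paper.

      # Commonly used AVS indices (assumed definitions; hypothetical post-cosyntropin values).
      def selectivity_index(adrenal_cortisol, peripheral_cortisol):
          """Adrenal-vein cortisol relative to peripheral (IVC) cortisol."""
          return adrenal_cortisol / peripheral_cortisol

      def lateralization_index(a_aldo, a_cort, b_aldo, b_cort):
          """Ratio of cortisol-corrected aldosterone between the two adrenal veins."""
          ratio_a = a_aldo / a_cort
          ratio_b = b_aldo / b_cort
          high, low = max(ratio_a, ratio_b), min(ratio_a, ratio_b)
          return high / low

      print(selectivity_index(adrenal_cortisol=620.0, peripheral_cortisol=38.0))
      print(lateralization_index(a_aldo=9800.0, a_cort=610.0, b_aldo=310.0, b_cort=590.0))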

  10. LC-MS metabolic profiling of Arabidopsis thaliana plant leaves and cell cultures: optimization of pre-LC-MS procedure parameters.

    PubMed

    t'Kindt, Ruben; De Veylder, Lieven; Storme, Michael; Deforce, Dieter; Van Bocxlaer, Jan

    2008-08-01

    This study treats the optimization of methods for homogenizing Arabidopsis thaliana plant leaves as well as cell cultures, and extracting their metabolites for metabolomics analysis by conventional liquid chromatography electrospray ionization mass spectrometry (LC-ESI/MS). Absolute recovery, process efficiency and procedure repeatability have been compared between different pre-LC-MS homogenization/extraction procedures through the use of samples fortified before extraction with a range of representative metabolites. Hereby, the magnitude of the matrix effect observed in the ensuing LC-MS based metabolomics analysis was evaluated. Based on relative recovery and repeatability of key metabolites, comprehensiveness of extraction (number of m/z-retention time pairs) and clean-up potential of the approach (minimum matrix effects), the most appropriate sample pre-treatment was adopted. It combines liquid nitrogen homogenization for plant leaves with thermomixer based extraction using MeOH/H(2)O 80/20. As such, an efficient and highly reproducible LC-MS plant metabolomics set-up is achieved, as illustrated by the obtained results for both LC-MS (8.88%+/-5.16 versus 7.05%+/-4.45) and technical variability (12.53%+/-11.21 versus 9.31%+/-6.65) data in a comparative investigation of A. thaliana plant leaves and cell cultures, respectively.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yuan, Chris, E-mail: cyuan@uwm.edu; Wang, Endong; Zhai, Qiang

    Temporal homogeneity of inventory data is one of the major problems in life cycle assessment (LCA). Addressing temporal homogeneity of life cycle inventory data is important in reducing the uncertainties and improving the reliability of LCA results. This paper attempts to present a critical review and discussion on the fundamental issues of temporal homogeneity in conventional LCA and propose a theoretical framework for temporal discounting in LCA. Theoretical perspectives for temporal discounting in life cycle inventory analysis are discussed first based on the key elements of a scientific mechanism for temporal discounting. Then generic procedures for performing temporal discounting in LCA are derived and proposed based on the nature of the LCA method and the identified key elements of a scientific temporal discounting method. A five-step framework is proposed and reported in detail based on the technical methods and procedures needed to perform a temporal discounting in life cycle inventory analysis. Challenges and possible solutions are also identified and discussed for the technical procedure and scientific accomplishment of each step within the framework. - Highlights: • A critical review of the temporal homogeneity problem of life cycle inventory data • A theoretical framework for performing temporal discounting on inventory data • Methods provided to accomplish each step of the temporal discounting framework.
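
    The paper's framework is not reproduced here, but as a toy illustration of what temporal discounting of inventory data could look like, the sketch below applies a simple exponential discount to emission flows occurring at different times; the discount form and rate are assumptions for illustration only.

      # Toy exponential temporal discounting of life cycle inventory flows (assumed form).
      def discounted_inventory(flows, rate=0.03):
          """flows: list of (years_from_reference, amount). Returns the discounted total."""
          return sum(amount / (1.0 + rate) ** t for t, amount in flows)

      # e.g. CO2 emissions (kg) occurring 0, 5 and 20 years after the reference year
      co2_flows = [(0, 100.0), (5, 40.0), (20, 60.0)]
      print(discounted_inventory(co2_flows, rate=0.03))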

  12. Feature Screening in Ultrahigh Dimensional Cox's Model.

    PubMed

    Yang, Guangren; Yu, Ye; Li, Runze; Buu, Anne

    Survival data with ultrahigh dimensional covariates such as genetic markers have been collected in medical studies and other fields. In this work, we propose a feature screening procedure for the Cox model with ultrahigh dimensional covariates. The proposed procedure is distinguished from the existing sure independence screening (SIS) procedures (Fan, Feng and Wu, 2010, Zhao and Li, 2012) in that the proposed procedure is based on joint likelihood of potential active predictors, and therefore is not a marginal screening procedure. The proposed procedure can effectively identify active predictors that are jointly dependent but marginally independent of the response without performing an iterative procedure. We develop a computationally effective algorithm to carry out the proposed procedure and establish the ascent property of the proposed algorithm. We further prove that the proposed procedure possesses the sure screening property. That is, with the probability tending to one, the selected variable set includes the actual active predictors. We conduct Monte Carlo simulation to evaluate the finite sample performance of the proposed procedure and further compare the proposed procedure and existing SIS procedures. The proposed methodology is also demonstrated through an empirical analysis of a real data example.

  13. 32 CFR 989.37 - Procedures for analysis abroad.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 6 2011-07-01 2011-07-01 false Procedures for analysis abroad. 989.37 Section... PROTECTION ENVIRONMENTAL IMPACT ANALYSIS PROCESS (EIAP) § 989.37 Procedures for analysis abroad. Procedures for analysis of environmental actions abroad are contained in 32 CFR part 187. That directive provides...

  14. Evaluation and "PACE": A Study of Procedures and Effectiveness of Evaluation Sections in Approved PACE Project, with Recommendations for Improvement.

    ERIC Educational Resources Information Center

    Miller, Richard I.

    This report is based upon an analysis of the evaluation procedures outlined in 21 funded Title III proposals. The proposals were analyzed from a number of points of view. (1) Using the Stufflebeam model as a guide, they were examined to determine what, if any, provisions had been made for each of the four classes of evaluation: context, input,…

  15. Barnes Maze Procedure for Spatial Learning and Memory in Mice.

    PubMed

    Pitts, Matthew W

    2018-03-05

    The Barnes maze is a dry-land based rodent behavioral paradigm for assessing spatial learning and memory that was originally developed by its namesake, Carol Barnes. It represents a well-established alternative to the more popular Morris Water maze and offers the advantage of being free from the potentially confounding influence of swimming behavior. Herein, the Barnes maze experimental setup and corresponding procedures for testing and analysis in mice are described in detail.

  16. Compositional Analysis of Lignocellulosic Feedstocks. 1. Review and Description of Methods

    PubMed Central

    2010-01-01

    As interest in lignocellulosic biomass feedstocks for conversion into transportation fuels grows, the summative compositional analysis of biomass, or plant-derived material, becomes ever more important. The sulfuric acid hydrolysis of biomass has been used to measure lignin and structural carbohydrate content for more than 100 years. Researchers have applied these methods to measure the lignin and structural carbohydrate contents of woody materials, estimate the nutritional value of animal feed, analyze the dietary fiber content of human food, compare potential biofuels feedstocks, and measure the efficiency of biomass-to-biofuels processes. The purpose of this paper is to review the history and lineage of biomass compositional analysis methods based on a sulfuric acid hydrolysis. These methods have become the de facto procedure for biomass compositional analysis. The paper traces changes to the biomass compositional analysis methods through time to the biomass methods currently used at the National Renewable Energy Laboratory (NREL). The current suite of laboratory analytical procedures (LAPs) offered by NREL is described, including an overview of the procedures and methodologies and some common pitfalls. Suggestions are made for continuing improvement to the suite of analyses. PMID:20669951

  17. Efficient sensitivity analysis and optimization of a helicopter rotor

    NASA Technical Reports Server (NTRS)

    Lim, Joon W.; Chopra, Inderjit

    1989-01-01

    Aeroelastic optimization of a system essentially consists of the determination of the optimum values of design variables which minimize the objective function and satisfy certain aeroelastic and geometric constraints. The process of aeroelastic optimization analysis is illustrated. To carry out aeroelastic optimization effectively, one needs a reliable analysis procedure to determine steady response and stability of a rotor system in forward flight. The rotor dynamic analysis used in the present study, developed in-house at the University of Maryland, is based on finite elements in space and time. The analysis consists of two major phases: vehicle trim and rotor steady response (coupled trim analysis), and aeroelastic stability of the blade. For a reduction of helicopter vibration, the optimization process requires the sensitivity derivatives of the objective function and aeroelastic stability constraints. For this, the derivatives of steady response, hub loads and blade stability roots are calculated using a direct analytical approach. An automated optimization procedure is developed by coupling the rotor dynamic analysis, design sensitivity analysis and the constrained optimization code CONMIN.

  18. A novel procedure on next generation sequencing data analysis using text mining algorithm.

    PubMed

    Zhao, Weizhong; Chen, James J; Perkins, Roger; Wang, Yuping; Liu, Zhichao; Hong, Huixiao; Tong, Weida; Zou, Wen

    2016-05-13

    Next-generation sequencing (NGS) technologies have provided researchers with vast possibilities in various biological and biomedical research areas. Efficient data mining strategies are in high demand for large scale comparative and evolutional studies to be performed on the large amounts of data derived from NGS projects. Topic modeling is an active research field in machine learning and has been mainly used as an analytical tool to structure large textual corpora for data mining. We report a novel procedure to analyse NGS data using topic modeling. It consists of four major procedures: NGS data retrieval, preprocessing, topic modeling, and data mining using Latent Dirichlet Allocation (LDA) topic outputs. The NGS data set of the Salmonella enterica strains were used as a case study to show the workflow of this procedure. The perplexity measurement of the topic numbers and the convergence efficiencies of Gibbs sampling were calculated and discussed for achieving the best result from the proposed procedure. The output topics by LDA algorithms could be treated as features of Salmonella strains to accurately describe the genetic diversity of fliC gene in various serotypes. The results of a two-way hierarchical clustering and data matrix analysis on LDA-derived matrices successfully classified Salmonella serotypes based on the NGS data. The implementation of topic modeling in NGS data analysis provides a new way to elucidate genetic information from NGS data, and identify the gene-phenotype relationships and biomarkers, especially in the era of biological and medical big data.
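
    A compact sketch of the topic-modeling step, using scikit-learn's LatentDirichletAllocation on a toy term-count matrix; the tokenized "documents" stand in for per-strain sequence features and are not the Salmonella fliC data, and the paper's Gibbs-sampling implementation is replaced here by scikit-learn's variational fit.

      # Topic modeling on a toy term-count matrix; rows of the output can then be clustered.
      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      # Each "document" stands in for one strain's tokenized sequence features
      docs = [
          "atgg tggc ggca gcat catt",
          "atgg tggc ggca gcaa caat",
          "ttga tgac gact actg ctga",
          "ttga tgac gacc accg ccga",
      ]
      counts = CountVectorizer().fit_transform(docs)
      lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(counts)

      # Document-topic matrix used as per-strain features downstream
      print(lda.transform(counts))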

  19. Improving Small Signal Stability through Operating Point Adjustment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Zhenyu; Zhou, Ning; Tuffner, Francis K.

    2010-09-30

    ModeMeter techniques for real-time small signal stability monitoring continue to mature, and more and more phasor measurements are available in power systems. It has come to the stage to bring modal information into real-time power system operation. This paper proposes to establish a procedure for Modal Analysis for Grid Operations (MANGO). Complementary to PSSs and other traditional modulation-based controls, MANGO aims to provide suggestions such as increasing generation or decreasing load for operators to mitigate low-frequency oscillations. Different from modulation-based control, the MANGO procedure proactively maintains adequate damping at all times, instead of reacting to disturbances when they occur. The effect of operating points on small signal stability is presented in this paper. Implementation with existing operating procedures is discussed. Several approaches for modal sensitivity estimation are investigated to associate modal damping and operating parameters. The effectiveness of the MANGO procedure is confirmed through simulation studies of several test systems.

  20. Coupled rotor/airframe vibration analysis program manual. Volume 1: User's and programmer's instructions

    NASA Technical Reports Server (NTRS)

    Cassarino, S.; Sopher, R.

    1982-01-01

    User instructions and software descriptions for the base program of the coupled rotor/airframe vibration analysis are provided. The functional capabilities and procedures for running the program are described. Interfaces with external programs are discussed. The procedure of synthesizing a dynamic system and the various solution methods are described. Input data and output results are presented. Detailed information is provided on the program structure. Sample test case results for five representative dynamic configurations are provided and discussed. System responses are plotted to demonstrate the plotting capabilities available. Instructions to install and execute SIMVIB on the CDC computer system are provided.

  1. The use of optimization techniques to design controlled diffusion compressor blading

    NASA Technical Reports Server (NTRS)

    Sanger, N. L.

    1982-01-01

    A method for automating compressor blade design using numerical optimization, applied to the design of a controlled diffusion stator blade row, is presented. A general purpose optimization procedure is employed, based on conjugate directions for locally unconstrained problems and on feasible directions for locally constrained problems. Coupled to the optimizer is an analysis package consisting of three analysis programs which calculate blade geometry, inviscid flow, and blade surface boundary layers. The optimizing concepts and the selection of the design objective and constraints are described. The procedure for automating the design of a two-dimensional blade section is discussed, and design results are presented.

  2. Data on DNA gel sample load, gel electrophoresis, PCR and cost analysis.

    PubMed

    Kuhn, Ramona; Böllmann, Jörg; Krahl, Kathrin; Bryant, Isaac Mbir; Martienssen, Marion

    2018-02-01

    The data presented in this article provide supporting information to the related research article "Comparison of ten different DNA extraction procedures with respect to their suitability for environmental samples" (Kuhn et al., 2017) [1]. In that article, we compared the suitability of ten selected DNA extraction methods based on DNA quality, purity, quantity and applicability to universal PCR. Here we provide the data on the specific DNA gel sample load, all unreported gel images of crude DNA and PCR results, and the complete cost analysis for all tested extraction procedures and in addition two commercial DNA extraction kits for soil and water.

  3. Simulating soil moisture change in a semiarid rangeland watershed with a process-based water-balance model

    Treesearch

    Howard Evan Canfield; Vicente L. Lopes

    2000-01-01

    A process-based, simulation model for evaporation, soil water and streamflow (BROOK903) was used to estimate soil moisture change on a semiarid rangeland watershed in southeastern Arizona. A sensitivity analysis was performed to select parameters affecting ET and soil moisture for calibration. Automatic parameter calibration was performed using a procedure based on a...

  4. 41 CFR 102-80.115 - Is there more than one option for establishing that an equivalent level of safety exists?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... equivalent level of safety. (c) As a third option, other technical analysis procedures, as approved by the... Equivalent Level of Safety Analysis § 102-80.115 Is there more than one option for establishing that an... areas of safety. Available safe egress times would be developed based on analysis of a number of assumed...

  5. Solar energy system economic evaluation for IBM System 3, Glendo, Wyoming

    NASA Technical Reports Server (NTRS)

    1980-01-01

    This analysis was based on the technical and economic models in the f-chart design procedure, with inputs based on the characteristics of the installed system. The parameters evaluated were the present worth of system cost over a projected twenty-year life, life cycle savings, year of positive savings, and year of payback for the optimized solar energy system at each of the analysis sites. The sensitivity of the economic evaluation to uncertainties in constituent system and economic variables was also investigated.

  6. Performance of Panfungal- and Specific-PCR-Based Procedures for Etiological Diagnosis of Invasive Fungal Diseases on Tissue Biopsy Specimens with Proven Infection: a 7-Year Retrospective Analysis from a Reference Laboratory

    PubMed Central

    Bernal-Martinez, L.; Castelli, M. V.; Rodriguez-Tudela, J. L.; Cuenca-Estrella, M.

    2014-01-01

    A retrospective analysis of real-time PCR (RT-PCR) results for 151 biopsy samples obtained from 132 patients with proven invasive fungal diseases was performed. PCR-based techniques proved to be fast and sensitive and enabled definitive diagnosis in all cases studied, with detection of a total of 28 fungal species. PMID:24574295

  7. Identification of feline polycystic kidney disease mutation using fret probes and melting curve analysis.

    PubMed

    Criado-Fornelio, A; Buling, A; Barba-Carretero, J C

    2009-02-01

    We developed and validated a real-time polymerase chain reaction (PCR) assay using fluorescent hybridization probes and melting curve analysis to identify the PKD1 exon 29 (C-->A) mutation, which is implicated in polycystic kidney disease of cats. DNA was isolated from peripheral blood of 20 Persian cats. The use of the new real-time PCR and melting curve analysis on these samples indicated that 13 cats (65%) were wild-type homozygotes and seven cats (35%) were heterozygotes. Both PCR-RFLP and sequencing procedures were in full agreement with the real-time PCR test results. Sequence analysis showed that the mutant gene had the expected base change compared to the wild-type gene. The new procedure is not only very reliable but also faster than the techniques currently applied for diagnosis of the mutation.

  8. Development of hospital data warehouse for cost analysis of DPC based on medical costs.

    PubMed

    Muranaga, F; Kumamoto, I; Uto, Y

    2007-01-01

    To develop a data warehouse system for cost analysis, based on the categories of the diagnosis procedure combination (DPC) system, in which medical costs were estimated by DPC category and factors influencing the balance between costs and fees. We developed a data warehouse system for cost analysis using data from the hospital central data warehouse system. The balance data of patients who were discharged from Kagoshima University Hospital from April 2003 to March 2005 were determined in terms of medical procedure, cost per day and patient admission in order to conduct a drill-down analysis. To evaluate this system, we analyzed cash flow by DPC category of patients who were categorized as having malignant tumors and whose DPC category was reevaluated in 2004. The percentages of medical expenses were highest in patients with acute leukemia, non-Hodgkin's lymphoma, and particularly in patients with malignant tumors of the liver and intrahepatic bile duct. Imaging tests degraded the percentages of medical expenses in Kagoshima University Hospital. These results suggested that cost analysis by patient is important for hospital administration in the inclusive evaluation system using a case-mix index such as DPC.

  9. An efficient numerical procedure for thermohydrodynamic analysis of cavitating bearings

    NASA Technical Reports Server (NTRS)

    Vijayaraghavan, D.

    1995-01-01

    An efficient and accurate numerical procedure to determine the thermo-hydrodynamic performance of cavitating bearings is described. This procedure is based on the earlier development of Elrod for lubricating films, in which the properties across the film thickness are determined at Lobatto points and their distributions are expressed by collocated polynomials. The cavitated regions and their boundaries are rigorously treated. Thermal boundary conditions at the surfaces, including heat dissipation through the metal to the ambient, are incorporated. Numerical examples are presented comparing the predictions using this procedure with earlier theoretical predictions and experimental data. With a few points across the film thickness and across the journal and the bearing in the radial direction, the temperature profile is very well predicted.

  10. Simultaneous determination of selected biogenic amines in alcoholic beverage samples by isotachophoretic and chromatographic methods.

    PubMed

    Jastrzębska, Aneta; Piasta, Anna; Szłyk, Edward

    2014-01-01

    A simple and useful method for the determination of biogenic amines in beverage samples based on isotachophoretic separation is described. The proposed procedure permitted simultaneous analysis of histamine, tyramine, cadaverine, putrescine, tryptamine, 2-phenylethylamine, spermine and spermidine. The data presented demonstrate the utility, simplicity, flexibility, sensitivity and environmentally friendly character of the proposed method. The precision of the method, expressed as coefficients of variation, varied from 0.1% to 5.9% for beverage samples, whereas recoveries varied from 91% to 101%. The results for the determination of biogenic amines were compared with an HPLC procedure based on a pre-column derivatisation reaction of biogenic amines with dansyl chloride. Furthermore, the derivatisation procedure was optimised by verification of the concentration and pH of the buffer, the addition of organic solvents, and the reaction time and temperature.

  11. GeneBee-net: Internet-based server for analyzing biopolymers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brodsky, L.I.; Ivanov, V.V.; Nikolaev, V.K.

    This work describes a network server for searching databanks of biopolymer structures and performing other biocomputing procedures; it is available via direct Internet connection. Basic server procedures are dedicated to homology (similarity) search of sequence and 3D structure of proteins. The homologies found could be used to build multiple alignments, predict protein and RNA secondary structure, and construct phylogenetic trees. In addition to traditional methods of sequence similarity search, the authors propose "non-matrix" (correlational) search. An analogous approach is used to identify regions of similar tertiary structure of proteins. Algorithm concepts and usage examples are presented for new methods. Service logic is based upon interaction of a client program and server procedures. The client program allows the compilation of queries and the processing of results of an analysis.

  12. Biology Procedural Knowledge at Eleventh Grade of Senior High School in West Lampung Based on Curriculum

    NASA Astrophysics Data System (ADS)

    Sari, T. M.; Paidi; Mercuriani, I. S.

    2018-03-01

    This study aimed to determine the Biology procedural knowledge of senior high school students in West Lampung, based on the curriculum, at 11th grade in the even semester. This was a descriptive study. The population was all students of senior high schools in West Lampung. The sampling technique was purposive sampling, so the researcher obtained 3 schools using K13 and 3 schools using KTSP. Data were collected with a test instrument and analysed with the Mann-Whitney U test. The result showed that p=0.028 (p<0.05), so there was a significant difference between the schools using K13 and those using KTSP. The procedural knowledge of the schools using K13 was higher than that of the schools using KTSP, with mean scores of 4.35 for K13 and 4.00 for KTSP.
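
    The group comparison reported above can be reproduced in outline with SciPy's Mann-Whitney U test; the scores below are synthetic stand-ins for the study's data.

      # Mann-Whitney U test on procedural-knowledge scores from two curricula (synthetic scores).
      from scipy.stats import mannwhitneyu

      k13_scores  = [5, 4, 5, 4, 4, 5, 3, 5, 4, 4]
      ktsp_scores = [4, 3, 4, 4, 3, 4, 4, 3, 4, 3]

      u, p = mannwhitneyu(k13_scores, ktsp_scores, alternative="two-sided")
      print(f"U = {u}, p = {p:.3f}")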

  13. The Utility of Job Dimensions Based on Form B of the Position Analysis Questionnaire (PAQ) in a Job Component Validation Model. Report No. 5.

    ERIC Educational Resources Information Center

    Marquardt, Lloyd D.; McCormick, Ernest J.

    The study involved the use of a structured job analysis instrument called the Position Analysis Questionnaire (PAQ) as the direct basis for the establishment of the job component validity of aptitude tests (that is, a procedure for estimating the aptitude requirements for jobs strictly on the basis of job analysis data). The sample of jobs used…

  14. Automatic differentiation for design sensitivity analysis of structural systems using multiple processors

    NASA Technical Reports Server (NTRS)

    Nguyen, Duc T.; Storaasli, Olaf O.; Qin, Jiangning; Qamar, Ramzi

    1994-01-01

    An automatic differentiation tool (ADIFOR) is incorporated into a finite element based structural analysis program for shape and non-shape design sensitivity analysis of structural systems. The entire analysis and sensitivity procedures are parallelized and vectorized for high performance computation. Small scale examples to verify the accuracy of the proposed program and a medium scale example to demonstrate the parallel vector performance on multiple CRAY C90 processors are included.

  15. Learning the facts in medical school is not enough: which factors predict successful application of procedural knowledge in a laboratory setting?

    PubMed

    Schmidmaier, Ralf; Eiber, Stephan; Ebersbach, Rene; Schiller, Miriam; Hege, Inga; Holzer, Matthias; Fischer, Martin R

    2013-02-22

    Medical knowledge encompasses both conceptual (facts or "what" information) and procedural knowledge ("how" and "why" information). Conceptual knowledge is known to be an essential prerequisite for clinical problem solving. Primarily, medical students learn from textbooks and often struggle with the process of applying their conceptual knowledge to clinical problems. Recent studies address the question of how to foster the acquisition of procedural knowledge and its application in medical education. However, little is known about the factors which predict performance in procedural knowledge tasks. Which additional factors of the learner predict performance in procedural knowledge? Domain specific conceptual knowledge (facts) in clinical nephrology was provided to 80 medical students (3rd to 5th year) using electronic flashcards in a laboratory setting. Learner characteristics were obtained by questionnaires. Procedural knowledge in clinical nephrology was assessed by key feature problems (KFP) and problem solving tasks (PST) reflecting strategic and conditional knowledge, respectively. Results in procedural knowledge tests (KFP and PST) correlated significantly with each other. In univariate analysis, performance in procedural knowledge (sum of KFP+PST) was significantly correlated with the results in (1) the conceptual knowledge test (CKT), (2) the intended future career as hospital based doctor, (3) the duration of clinical clerkships, and (4) the results in the written German National Medical Examination Part I on preclinical subjects (NME-I). After multiple regression analysis only clinical clerkship experience and NME-I performance remained independent influencing factors. Performance in procedural knowledge tests seems independent from the degree of domain specific conceptual knowledge above a certain level. Procedural knowledge may be fostered by clinical experience. More attention should be paid to the interplay of individual clinical clerkship experiences and structured teaching of procedural knowledge and its assessment in medical education curricula.

  16. Utilization of Public System for Gastric Bands Placed by Private Providers: a 4-Year Population-Based Analysis in Ontario, Canada.

    PubMed

    Prabhu, Kristel Lobo; Okrainec, Allan; Maeda, Azusa; Saskin, Refik; Urbach, David; Bell, Chaim M; Jackson, Timothy D

    2018-06-16

    Laparoscopic adjustable gastric band (LAGB) placement remains a common bariatric procedure. While LAGB procedure is performed within private clinics in most Canadian provinces, public health care is often utilized for LAGB-related reoperations. We identified 642 gastric band removal procedures performed in Ontario from 2011 to 2014 using population-level administrative data. The number of procedures performed increased annually from 101 in 2011 to 220 in 2014. Notably, 54.7% of the patients required laparotomy, and 17.6% of patients underwent a subsequent bariatric surgery. Our findings demonstrated that LAGB placement in private clinics resulted in a large number of band removal procedures performed within the public system. This represents a significant public health concern that may result in significant health care utilization and patient morbidity.

  17. A Customizable Language Learning Support System Using Ontology-Driven Engine

    ERIC Educational Resources Information Center

    Wang, Jingyun; Mendori, Takahiko; Xiong, Juan

    2013-01-01

    This paper proposes a framework for web-based language learning support systems designed to provide customizable pedagogical procedures based on the analysis of characteristics of both learner and course. This framework employs a course-centered ontology and a teaching method ontology as the foundation for the student model, which includes learner…

  18. The Theoretical Basis of Experience-Based Career Education.

    ERIC Educational Resources Information Center

    Jenks, C. Lynn

    This study analyzes the extent to which the assumptions and procedures of the Experience-Based Career Education model (EBCE) as developed by the Far West Laboratory (FWL) are supported by empirical data and by recognized scholars in educational theory. The analysis is presented as relevant to the more general problem: the limited availability of…

  19. Implementation and Validation of Trial-Based Functional Analyses in Public Elementary School Settings

    ERIC Educational Resources Information Center

    Lloyd, Blair P.; Wehby, Joseph H.; Weaver, Emily S.; Goldman, Samantha E.; Harvey, Michelle N.; Sherlock, Daniel R.

    2015-01-01

    Although functional analysis (FA) remains the standard for identifying the function of problem behavior for students with developmental disabilities, traditional FA procedures are typically costly in terms of time, resources, and perceived risks. Preliminary research suggests that trial-based FA may be a less costly alternative. The purpose of…

  20. The Influence of National Culture toward Learner Interaction: Shanghai TV University and Wawasan Open University

    ERIC Educational Resources Information Center

    Bing, Wu; Ai-Ping, Teoh

    2008-01-01

    The authors conducted a comparative analysis to examine learners' interaction in the Web-based learning environment of 2 distance education institutions. The interaction was critically analyzed based on social, procedural, expository, explanatory, and cognitive dimensions, across 7 categories of exchanges between course coordinator to groups,…

  1. A generalized procedure for analyzing sustained and dynamic vocal fold vibrations from laryngeal high-speed videos using phonovibrograms.

    PubMed

    Unger, Jakob; Schuster, Maria; Hecker, Dietmar J; Schick, Bernhard; Lohscheller, Jörg

    2016-01-01

    This work presents a computer-based approach to analyze the two-dimensional vocal fold dynamics of endoscopic high-speed videos, and constitutes an extension and generalization of a previously proposed wavelet-based procedure. While most approaches aim for analyzing sustained phonation conditions, the proposed method allows for a clinically adequate analysis of both dynamic as well as sustained phonation paradigms. The analysis procedure is based on a spatio-temporal visualization technique, the phonovibrogram, that facilitates the documentation of the visible laryngeal dynamics. From the phonovibrogram, a low-dimensional set of features is computed using a principal component analysis strategy that quantifies the type of vibration patterns, irregularity, lateral symmetry and synchronicity, as a function of time. Two different test bench data sets are used to validate the approach: (I) 150 healthy and pathologic subjects examined during sustained phonation. (II) 20 healthy and pathologic subjects that were examined twice: during sustained phonation and a glissando from a low to a higher fundamental frequency. In order to assess the discriminative power of the extracted features, a Support Vector Machine is trained to distinguish between physiologic and pathologic vibrations. The results for sustained phonation sequences are compared to the previous approach. Finally, the classification performance of the stationary analyzing procedure is compared to the transient analysis of the glissando maneuver. For the first test bench the proposed procedure outperformed the previous approach (proposed feature set: accuracy: 91.3%, sensitivity: 80%, specificity: 97%, previous approach: accuracy: 89.3%, sensitivity: 76%, specificity: 96%). Comparing the classification performance of the second test bench further corroborates that analyzing transient paradigms provides clear additional diagnostic value (glissando maneuver: accuracy: 90%, sensitivity: 100%, specificity: 80%, sustained phonation: accuracy: 75%, sensitivity: 80%, specificity: 70%). The incorporation of parameters describing the temporal evolution of vocal fold vibration clearly improves the automatic identification of pathologic vibration patterns. Furthermore, incorporating a dynamic phonation paradigm provides additional valuable information about the underlying laryngeal dynamics that cannot be derived from sustained conditions. The proposed generalized approach provides a better overall classification performance than the previous approach, and hence constitutes a new advantageous tool for an improved clinical diagnosis of voice disorders. Copyright © 2015 Elsevier B.V. All rights reserved.
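
    A schematic of the classification stage, assuming standardized features reduced by principal component analysis and fed to an RBF-kernel Support Vector Machine with cross-validation; the feature matrix is synthetic and the pipeline details (number of components, kernel) are guesses, not the authors' settings.

      # PCA-reduced features classified with an SVM, evaluated by cross-validation (synthetic data).
      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(2)
      X = np.vstack([rng.normal(0, 1, (75, 40)), rng.normal(0.5, 1.2, (75, 40))])
      y = np.array([0] * 75 + [1] * 75)     # 0 = physiologic, 1 = pathologic (synthetic labels)

      clf = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
      scores = cross_val_score(clf, X, y, cv=5)
      print(f"accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")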

  2. DHLAS: A web-based information system for statistical genetic analysis of HLA population data.

    PubMed

    Thriskos, P; Zintzaras, E; Germenis, A

    2007-03-01

    DHLAS (database HLA system) is a user-friendly, web-based information system for the analysis of human leukocyte antigen (HLA) data from population studies. DHLAS has been developed using JAVA and the R system; it runs on a Java Virtual Machine and its user interface is web-based, powered by the servlet engine TOMCAT. It utilizes STRUTS, a Model-View-Controller framework, and uses several GNU packages to perform several of its tasks. The database engine it relies upon for fast access is MySQL, but others can be used as well. The system estimates metrics, performs statistical testing and produces graphs required for HLA population studies: (i) Hardy-Weinberg equilibrium (calculated using both asymptotic and exact tests), (ii) genetic distances (Euclidian or Nei), (iii) phylogenetic trees using the unweighted pair group method with averages and the neighbor-joining method, (iv) linkage disequilibrium (pairwise and overall, including variance estimations), (v) haplotype frequencies (estimated using the expectation-maximization algorithm) and (vi) discriminant analysis. The main merit of DHLAS is the incorporation of a database; thus, the data can be stored and manipulated along with integrated genetic data analysis procedures. In addition, it has an open architecture allowing the inclusion of other functions and procedures.
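
    As an example of one of the listed computations, the sketch below runs an asymptotic (chi-square) Hardy-Weinberg equilibrium test for a single biallelic marker; the genotype counts are made up and the snippet is independent of the DHLAS code base.

      # Asymptotic Hardy-Weinberg equilibrium test for one biallelic marker (synthetic counts).
      from scipy.stats import chi2

      def hwe_chi_square(n_AA, n_Aa, n_aa):
          n = n_AA + n_Aa + n_aa
          p = (2 * n_AA + n_Aa) / (2 * n)          # allele frequency of A
          q = 1 - p
          expected = [p * p * n, 2 * p * q * n, q * q * n]
          observed = [n_AA, n_Aa, n_aa]
          stat = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
          return stat, chi2.sf(stat, df=1)          # statistic and p-value

      print(hwe_chi_square(n_AA=45, n_Aa=40, n_aa=15))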

  3. Rural-urban differences in dental service use among children enrolled in a private dental insurance plan in Wisconsin: analysis of administrative data.

    PubMed

    Bhagavatula, Pradeep; Xiang, Qun; Szabo, Aniko; Eichmiller, Fredrick; Kuthy, Raymond A; Okunseri, Christopher E

    2012-12-21

    Studies on rural-urban differences in dental care have primarily focused on differences in utilization rates and preventive dental services. Little is known about rural-urban differences in the use of a wider range of dental procedures. This study examined patterns of preventive, restorative, endodontic, and extraction procedures provided to children enrolled in Delta Dental of Wisconsin (DDWI). We analyzed DDWI enrollment and claims data for children aged 0-18 years from 2002 to 2008. We modified and used a rural and urban classification based on ZIP codes developed by the Wisconsin Area Health Education Center (AHEC). We categorized the ZIP codes into 6 AHEC categories (3 rural and 3 urban). Descriptive and multivariable analyses using generalized linear mixed models (GLMM) were used to examine the patterns of dental procedures provided to children. Tukey-Kramer adjustment was used to control for multiple comparisons. Approximately 50%, 67%, and 68% of enrollees in inner-city Milwaukee, Rural 1 (less than 2500 people), and suburban Milwaukee had at least one annual dental visit, respectively. Children in inner-city Milwaukee had the lowest utilization rates for all procedures examined, except for endodontic procedures. Compared to children from inner-city Milwaukee, children in other locations had significantly more preventive procedures. Children in Rural 1 ZIP codes had more restorative, endodontic and extraction procedures, compared to children from all other regions. We found significant geographic variation in dental procedures received by children enrolled in DDWI.

  4. Beyond volume: hospital-based healthcare technology as a predictor of mortality for cardiovascular patients in Korea.

    PubMed

    Kim, Jae-Hyun; Lee, Yunhwan; Park, Eun-Cheol

    2016-06-01

    To examine whether hospital-based healthcare technology is related to 30-day postoperative mortality rates after adjusting for hospital volume of cardiovascular surgical procedures. This study used the National Health Insurance Service-Cohort Sample Database from 2002 to 2013, which was released by the Korean National Health Insurance Service. A total of 11,109 cardiovascular surgical procedure patients were analyzed. The primary analysis was based on logistic regression models to examine our hypothesis. After adjusting for hospital volume of cardiovascular surgical procedures as well as for all other confounders, the odds ratio (OR) of 30-day mortality in low healthcare technology hospitals was 1.567 times higher (95% confidence interval [CI] = 1.069-2.297) than in those with high healthcare technology. We also found that, overall, cardiovascular surgical patients treated in low healthcare technology hospitals, regardless of the extent of cardiovascular surgical procedures, had the highest 30-day mortality rate. Although the results of our study provide scientific evidence for a hospital volume-mortality relationship in cardiovascular surgical patients, the independent effect of hospital-based healthcare technology is strong, resulting in a lower mortality rate. As hospital characteristics such as clinical pathways and protocols are likely to also play an important role in mortality, further research is required to explore their respective contributions.

  5. Bootstrap-based procedures for inference in nonparametric receiver-operating characteristic curve regression analysis.

    PubMed

    Rodríguez-Álvarez, María Xosé; Roca-Pardiñas, Javier; Cadarso-Suárez, Carmen; Tahoces, Pablo G

    2018-03-01

    Prior to using a diagnostic test in a routine clinical setting, the rigorous evaluation of its diagnostic accuracy is essential. The receiver-operating characteristic curve is the measure of accuracy most widely used for continuous diagnostic tests. However, the possible impact of extra information about the patient (or even the environment) on diagnostic accuracy also needs to be assessed. In this paper, we focus on an estimator for the covariate-specific receiver-operating characteristic curve based on direct regression modelling and nonparametric smoothing techniques. This approach defines the class of generalised additive models for the receiver-operating characteristic curve. The main aim of the paper is to offer new inferential procedures for testing the effect of covariates on the conditional receiver-operating characteristic curve within the above-mentioned class. Specifically, two different bootstrap-based tests are suggested to check (a) the possible effect of continuous covariates on the receiver-operating characteristic curve and (b) the presence of factor-by-curve interaction terms. The validity of the proposed bootstrap-based procedures is supported by simulations. To facilitate the application of these new procedures in practice, an R-package, known as npROCRegression, is provided and briefly described. Finally, data derived from a computer-aided diagnostic system for the automatic detection of tumour masses in breast cancer is analysed.
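
    A greatly simplified illustration of the bootstrap idea: resample within two covariate strata and form a bootstrap interval for the difference in AUC. The actual method tests smooth covariate effects within a generalised-additive ROC regression (see the npROCRegression package); that machinery is not reproduced here, and the data below are simulated.

      # Bootstrap interval for an AUC difference between two covariate strata (simulated data).
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(3)

      def simulate_stratum(n, shift):
          y = rng.integers(0, 2, n)                 # disease status
          score = rng.normal(0, 1, n) + shift * y   # diagnostic test result
          return y, score

      y1, s1 = simulate_stratum(200, shift=1.0)     # e.g. one covariate stratum
      y2, s2 = simulate_stratum(200, shift=0.6)     # e.g. another covariate stratum
      observed = roc_auc_score(y1, s1) - roc_auc_score(y2, s2)

      diffs = []
      for _ in range(2000):                         # resample within each stratum
          i1 = rng.integers(0, len(y1), len(y1))
          i2 = rng.integers(0, len(y2), len(y2))
          diffs.append(roc_auc_score(y1[i1], s1[i1]) - roc_auc_score(y2[i2], s2[i2]))
      ci = np.percentile(diffs, [2.5, 97.5])
      print(f"AUC difference = {observed:.3f}, 95% bootstrap CI = [{ci[0]:.3f}, {ci[1]:.3f}]")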

  6. Evaluation of Operational Procedures for Using a Time-Based Airborne Inter-arrival Spacing Tool

    NASA Technical Reports Server (NTRS)

    Oseguera-Lohr, Rosa M.; Lohr, Gary W.; Abbott, Terence S.; Eischeid, Todd M.

    2002-01-01

    An airborne tool has been developed based on the concept of an aircraft maintaining a time-based spacing interval from the preceding aircraft. The Advanced Terminal Area Approach Spacing (ATAAS) tool uses Automatic Dependent Surveillance-Broadcast (ADS-B) aircraft state data to compute a speed command for the ATAAS-equipped aircraft to obtain a required time interval behind another aircraft. The tool and candidate operational procedures were tested in a high-fidelity, full mission simulator with active airline subject pilots flying an arrival scenario using three different modes for speed control. The objectives of this study were to validate the results of a prior Monte Carlo analysis of the ATAAS algorithm and to evaluate the concept from the standpoint of pilot acceptability and workload. Results showed that the aircraft was able to consistently achieve the target spacing interval within one second (the equivalent of approximately 220 ft at a final approach speed of 130 kt) when the ATAAS speed guidance was autothrottle-coupled, and a slightly greater (4-5 seconds), but consistent interval with the pilot-controlled speed modes. The subject pilots generally rated the workload level with the ATAAS procedure as similar to that with standard procedures, and also rated most aspects of the procedure high in terms of acceptability. Although pilots indicated that the head-down time was higher with ATAAS, the acceptability of head-down time was rated high. Oculometer data indicated slight changes in instrument scan patterns, but no significant change in the amount of time spent looking out the window between the ATAAS procedure versus standard procedures.
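
    As a toy illustration of time-based spacing, the sketch below nudges the commanded speed in proportion to the spacing-time error behind the lead aircraft; the gain, speed limits, and target interval are invented and are not the ATAAS control law.

      # Toy time-based spacing law: proportional speed adjustment (invented gain and limits).
      def speed_command(own_speed_kt, spacing_time_s, target_interval_s=90.0,
                        gain_kt_per_s=0.5, min_kt=120.0, max_kt=180.0):
          error = spacing_time_s - target_interval_s      # positive = too far behind the lead
          cmd = own_speed_kt + gain_kt_per_s * error
          return max(min_kt, min(max_kt, cmd))

      print(speed_command(own_speed_kt=140.0, spacing_time_s=97.0))   # too far behind: speeds up
      print(speed_command(own_speed_kt=140.0, spacing_time_s=84.0))   # too close: slows down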

  7. Is Office-Based Surgery Safe? Comparing Outcomes of 183,914 Aesthetic Surgical Procedures Across Different Types of Accredited Facilities.

    PubMed

    Gupta, Varun; Parikh, Rikesh; Nguyen, Lyly; Afshari, Ashkan; Shack, R Bruce; Grotting, James C; Higdon, K Kye

    2017-02-01

    There has been a dramatic rise in office-based surgery. However, due to wide variations in regulatory standards, the safety of office-based aesthetic surgery has been questioned. This study compares complication rates of cosmetic surgery performed at office-based surgical suites (OBSS) to ambulatory surgery centers (ASCs) and hospitals. A prospective cohort of patients undergoing cosmetic surgery between 2008 and 2013 were identified from the CosmetAssure database (Birmingham, AL). Patients were grouped by type of accredited facility where the surgery was performed: OBSS, ASC, or hospital. The primary outcome was the incidence of major complication(s) requiring emergency room visit, hospital admission, or reoperation within 30 days postoperatively. Potential risk factors including age, gender, body mass index (BMI), smoking, diabetes, type of procedure, and combined procedures were reviewed. Of the 129,007 patients (183,914 procedures) in the dataset, the majority underwent the procedure at ASCs (57.4%), followed by hospitals (26.7%) and OBSS (15.9%). Patients operated in OBSS were less likely to undergo combined procedures (30.3%) compared to ASCs (31.8%) and hospitals (35.3%, P < .01). Complication rates in OBSS, ASCs, and hospitals were 1.3%, 1.9%, and 2.4%, respectively. On multivariate analysis, there was a lower risk of developing a complication in an OBSS compared to an ASC (RR 0.67, 95% CI 0.59-0.77, P < .01) or a hospital (RR 0.59, 95% CI 0.52-0.68, P < .01). Accredited OBSS appear to be a safe alternative to ASCs and hospitals for cosmetic procedures. Plastic surgeons should continue to triage their patients carefully based on other significant comorbidities that were not measured in this present study. LEVEL OF EVIDENCE 3. © 2016 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com.

  8. Statistical Analyses of Scatterplots to Identify Important Factors in Large-Scale Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kleijnen, J.P.C.; Helton, J.C.

    1999-04-01

    The robustness of procedures for identifying patterns in scatterplots generated in Monte Carlo sensitivity analyses is investigated. These procedures are based on attempts to detect increasingly complex patterns in the scatterplots under consideration and involve the identification of (1) linear relationships with correlation coefficients, (2) monotonic relationships with rank correlation coefficients, (3) trends in central tendency as defined by means, medians and the Kruskal-Wallis statistic, (4) trends in variability as defined by variances and interquartile ranges, and (5) deviations from randomness as defined by the chi-square statistic. The following two topics related to the robustness of these procedures are considered for a sequence of example analyses with a large model for two-phase fluid flow: the presence of Type I and Type II errors, and the stability of results obtained with independent Latin hypercube samples. Observations from analysis include: (1) Type I errors are unavoidable, (2) Type II errors can occur when inappropriate analysis procedures are used, (3) physical explanations should always be sought for why statistical procedures identify variables as being important, and (4) the identification of important variables tends to be stable for independent Latin hypercube samples.
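
    The sketch below applies the sequence of increasingly complex scatterplot tests listed above to one synthetic (input, output) pair; the quantile-based binning and the variable names are illustrative assumptions, not the study's settings.

    ```python
    # Sketch of the five scatterplot tests: linear, monotonic, central tendency,
    # variability, and deviation from randomness. Synthetic data only.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 1, 300)                        # sampled model input
    y = np.sin(3 * x) + rng.normal(0, 0.3, 300)       # model output

    r, p_lin = stats.pearsonr(x, y)                   # (1) linear relationship
    rho, p_mono = stats.spearmanr(x, y)               # (2) monotonic relationship

    edges = np.quantile(x, [0.2, 0.4, 0.6, 0.8])      # five equal-count bins of x
    groups = [y[np.digitize(x, edges) == k] for k in range(5)]
    kw, p_ct = stats.kruskal(*groups)                 # (3) trend in central tendency
    spreads = [np.var(g, ddof=1) for g in groups]     # (4) trend in variability

    x_edges = np.quantile(x, np.linspace(0, 1, 6))    # (5) deviation from randomness:
    y_edges = np.quantile(y, np.linspace(0, 1, 6))    #     chi-square on a 5x5 grid
    grid, _, _ = np.histogram2d(x, y, bins=[x_edges, y_edges])
    chi2, p_rand, dof, expected = stats.chi2_contingency(grid)

    print(f"Pearson r={r:.2f}, Spearman rho={rho:.2f}, "
          f"Kruskal-Wallis p={p_ct:.3g}, chi-square p={p_rand:.3g}")
    ```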

  9. Comparison of automatic procedures in the selection of peaks over threshold in flood frequency analysis: A Canadian case study in the context of climate change

    NASA Astrophysics Data System (ADS)

    Durocher, M.; Mostofi Zadeh, S.; Burn, D. H.; Ashkar, F.

    2017-12-01

    Floods are one of the most costly hazards and frequency analysis of river discharges is an important part of the tools at our disposal to evaluate their inherent risks and to provide an adequate response. In comparison to the common examination of annual streamflow maximums, peaks over threshold (POT) is an interesting alternative that makes better use of the available information by including more than one flood event per year (on average). However, a major challenge is the selection of a satisfactory threshold above which peaks are assumed to respect certain conditions necessary for an adequate estimation of the risk. Additionally, studies have shown that POT is also a valuable approach to investigate the evolution of flood regimes in the context of climate change. Recently, automatic procedures for the selection of the threshold were suggested to guide that important choice, which otherwise rely on graphical tools and expert judgment. Furthermore, having an automatic procedure that is objective allows for quickly repeating the analysis on a large number of samples, which is useful in the context of large databases or for uncertainty analysis based on a resampling approach. This study investigates the impact of considering such procedures in a case study including many sites across Canada. A simulation study is conducted to evaluate the bias and predictive power of the automatic procedures in similar conditions as well as investigating the power of derived nonstationarity tests. The results obtained are also evaluated in the light of expert judgments established in a previous study. Ultimately, this study provides a thorough examination of the considerations that need to be addressed when conducting POT analysis using automatic threshold selection.
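
    A minimal sketch of the basic ingredients of threshold selection in POT analysis: fit a generalized Pareto distribution to exceedances over a grid of candidate thresholds and look for stability of the shape parameter. The daily flows are synthetic, declustering of dependent peaks is ignored, and the specific automatic procedures compared in the study are not reproduced.

    ```python
    # Threshold-stability sketch for peaks-over-threshold analysis (illustration only).
    import numpy as np
    from scipy.stats import genpareto

    rng = np.random.default_rng(2)
    flows = rng.gamma(shape=2.0, scale=50.0, size=30 * 365)   # ~30 years of daily flow

    for u in np.quantile(flows, [0.90, 0.95, 0.97, 0.99]):    # candidate thresholds
        exceedances = flows[flows > u] - u
        shape, loc, scale = genpareto.fit(exceedances, floc=0)  # GPD fit, location fixed at 0
        rate = len(exceedances) / 30.0                           # average peaks per year
        print(f"u={u:8.1f}  n={len(exceedances):5d}  peaks/yr={rate:6.1f}  "
              f"xi={shape:6.3f}  sigma={scale:7.1f}")
    # A threshold is typically retained once the shape parameter stops changing
    # systematically with u, while enough peaks per year remain for stable estimates.
    ```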

  10. Acellular dermal matrix for mucogingival surgery: a meta-analysis.

    PubMed

    Gapski, Ricardo; Parks, Christopher Allen; Wang, Hom-Lay

    2005-11-01

    Many clinical studies revealed the effectiveness of acellular dermal matrix (ADM) in the treatment of mucogingival defects. The purpose of this meta-analysis was to compare the efficacy of ADM-based root coverage (RC) and ADM-based increase in keratinized tissues to other commonly used mucogingival surgeries. Meta-analysis was limited to randomized clinical trials (RCT). Articles from January 1, 1990 to October 2004 related to ADM were searched utilizing the MEDLINE database from the National Library of Medicine, the Cochrane Oral Health Group Specialized Trials Registry, and through hand searches of reviews and recent journals. Relevant studies were identified, ranked independently, and mean data from each were weighted accordingly. Selected outcomes were analyzed using a meta-analysis software program. The significant estimates of the treatment effects from different trials were assessed by means of Cochrane's test of heterogeneity. 1) Few RCT studies were found to compile the data. In summary, selection identified eight RCT that met the inclusion criteria. There were four studies comparing ADM versus a connective tissue graft for root coverage procedures, two studies comparing ADM versus coronally advanced flap (CAF) for root coverage procedures, and two studies comparing ADM to free gingival graft in augmentation of keratinized tissue. 2) There were no statistically significant differences between groups for any of the outcomes measured (recession coverage, keratinized tissue formation, probing depths, and clinical attachment levels). 3) The majority of the analyses demonstrated moderate to high levels of heterogeneity. 4) Considering the heterogeneity values found among the studies, certain trends could be found: a) three out of four studies favored the ADM-RC group for recession coverage; b) a connective tissue graft tended to increase keratinized tissue compared to ADM (0.52-mm difference; P = 0.11); c) there were trends of increased clinical attachment gains comparing ADM to CAF procedures (0.56-mm difference; P = 0.16). Differences in study design and lack of data precluded an adequate and complete pooling of data for a more comprehensive analysis. Therefore, considering the trends presented in this study, there is a need for further randomized clinical studies of ADM procedures in comparison to common mucogingival surgical procedures to confirm our findings. It is difficult to draw anything other than tentative conclusions from this meta-analysis of ADM for mucogingival surgery, primarily because of the weakness in the design and reporting of existing trials.

  11. A single pre-operative antibiotic dose is as effective as continued antibiotic prophylaxis in implant-based breast reconstruction: A matched cohort study.

    PubMed

    Townley, William A; Baluch, Narges; Bagher, Shaghayegh; Maass, Saskia W M C; O'Neill, Anne; Zhong, Toni; Hofer, Stefan O P

    2015-05-01

    Infections following implant-based breast reconstruction can lead to devastating consequences. There is currently no consensus on the need for post-operative antibiotics in preventing immediate infection. This study compared two different methods of infection prevention in this group of patients. A retrospective matched cohort study was performed on consecutive women undergoing implant-based breast reconstruction at University Health Network, Toronto (November 2008-December 2012). All patients received a single pre-operative intravenous antibiotic dose. Group A received minimal interventions and Group B underwent maximal prophylactic measures. Patient (age, smoking, diabetes, co-morbidities), oncologic and procedural variables (timing and laterality) were collected. Univariate and multivariate logistic regression were performed to compare outcomes between the two groups. Two hundred and eight patients underwent 647 implant procedures. After matching the two treatment groups by BMI, 94 patients in each treatment group, yielding a total of 605 implant procedures, were selected for analysis. The two groups were comparable in terms of patient and disease variables. Post-operative wound infection was similar in Group A (n = 11, 12%) compared with Group B (n = 9, 10%; p = 0.8). Univariate analysis revealed only pre-operative radiotherapy to be associated with the development of infection (p = 0.004). Controlling for the effect of radiotherapy, multivariate analysis demonstrated that there was no statistically significant difference between the two methods of infection prevention. Our findings suggest that a single pre-operative dose of intravenous antibiotics is as effective as continued antibiotic prophylaxis in preventing immediate infection in patients undergoing implant-based breast reconstruction. Copyright © 2015 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  12. NASA trend analysis procedures

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This publication is primarily intended for use by NASA personnel engaged in managing or implementing trend analysis programs. 'Trend analysis' refers to the observation of current activity in the context of the past in order to infer the expected level of future activity. NASA trend analysis was divided into 5 categories: problem, performance, supportability, programmatic, and reliability. Problem trend analysis uncovers multiple occurrences of historical hardware or software problems or failures in order to focus future corrective action. Performance trend analysis observes changing levels of real-time or historical flight vehicle performance parameters such as temperatures, pressures, and flow rates as compared to specification or 'safe' limits. Supportability trend analysis assesses the adequacy of the spaceflight logistics system; example indicators are repair-turn-around time and parts stockage levels. Programmatic trend analysis uses quantitative indicators to evaluate the 'health' of NASA programs of all types. Finally, reliability trend analysis attempts to evaluate the growth of system reliability based on a decreasing rate of occurrence of hardware problems over time. Procedures for conducting all five types of trend analysis are provided in this publication, prepared through the joint efforts of the NASA Trend Analysis Working Group.

  13. Identifying reprioritization response shift in a stroke caregiver population: a comparison of missing data methods.

    PubMed

    Sajobi, Tolulope T; Lix, Lisa M; Singh, Gurbakhshash; Lowerison, Mark; Engbers, Jordan; Mayo, Nancy E

    2015-03-01

    Response shift (RS) is an important phenomenon that influences the assessment of longitudinal changes in health-related quality of life (HRQOL) studies. Given that RS effects are often small, missing data due to attrition or item non-response can contribute to failure to detect RS effects. Since missing data are often encountered in longitudinal HRQOL data, effective strategies to deal with missing data are important to consider. This study aims to compare the effect of different imputation methods on the detection of reprioritization RS in the HRQOL of caregivers of stroke survivors. Data were from a Canadian multi-center longitudinal study of caregivers of stroke survivors over a one-year period. The Stroke Impact Scale physical function score at baseline, with a cutoff of 75, was used to measure patient stroke severity for the reprioritization RS analysis. Mean imputation, likelihood-based expectation-maximization (EM) imputation, and multiple imputation methods were compared in test procedures based on changes in relative importance weights to detect RS in SF-36 domains over a 6-month period. Monte Carlo simulation methods were used to compare the statistical power of relative importance test procedures for detecting RS in incomplete longitudinal data under different missing data mechanisms and imputation methods. Of the 409 caregivers, 15.9% and 31.3% had missing data at baseline and 6 months, respectively. There were no statistically significant changes in relative importance weights on any of the domains when complete-case analysis was adopted, but statistically significant changes were detected on the physical functioning and/or vitality domains when mean imputation or EM imputation was adopted. There were also statistically significant changes in relative importance weights for the physical functioning, mental health, and vitality domains when the multiple imputation method was adopted. Our simulations revealed that relative importance test procedures were least powerful under the complete-case analysis method and most powerful when a mean imputation or multiple imputation method was adopted for missing data, regardless of the missing data mechanism and proportion of missing data. Test procedures based on relative importance measures are sensitive to the type and amount of missing data and to the imputation method. Relative importance test procedures based on mean imputation and multiple imputation are recommended for detecting RS in incomplete data.
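
    A minimal sketch, on synthetic HRQOL-like scores, of the imputation strategies compared above: mean imputation versus a chained-equations style imputation used here as a stand-in for one draw of multiple imputation. The relative-importance test itself is not reproduced, and all data and dimensions are placeholders.

    ```python
    # Mean imputation vs. iterative (multiple-imputation style) imputation, illustration only.
    import numpy as np
    from sklearn.experimental import enable_iterative_imputer  # noqa: F401
    from sklearn.impute import SimpleImputer, IterativeImputer

    rng = np.random.default_rng(3)
    n = 300
    complete = rng.multivariate_normal([50, 60, 55], [[100, 60, 50],
                                                      [60, 100, 40],
                                                      [50, 40, 100]], size=n)

    # Impose ~30% missingness completely at random on the follow-up columns.
    observed = complete.copy()
    mask = rng.random((n, 3)) < 0.3
    mask[:, 0] = False                       # keep the baseline column fully observed
    observed[mask] = np.nan

    mean_imputed = SimpleImputer(strategy="mean").fit_transform(observed)

    # One draw of a chained-equations style imputation; repeating this with different
    # random states and pooling the analyses would approximate multiple imputation.
    mi_imputed = IterativeImputer(sample_posterior=True, random_state=0).fit_transform(observed)

    print("true means      ", complete.mean(axis=0).round(2))
    print("mean imputation ", mean_imputed.mean(axis=0).round(2))
    print("iterative (MI)  ", mi_imputed.mean(axis=0).round(2))
    ```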

  14. Evaluation of Transverse Thermal Stresses in Composite Plates Based on First-Order Shear Deformation Theory

    NASA Technical Reports Server (NTRS)

    Rolfes, R.; Noor, A. K.; Sparr, H.

    1998-01-01

    A postprocessing procedure is presented for the evaluation of the transverse thermal stresses in laminated plates. The analytical formulation is based on the first-order shear deformation theory and the plate is discretized by using a single-field displacement finite element model. The procedure is based on neglecting the derivatives of the in-plane forces and the twisting moments, as well as the mixed derivatives of the bending moments, with respect to the in-plane coordinates. The calculated transverse shear stiffnesses reflect the actual stacking sequence of the composite plate. The distributions of the transverse stresses through-the-thickness are evaluated by using only the transverse shear forces and the thermal effects resulting from the finite element analysis. The procedure is implemented into a postprocessing routine which can be easily incorporated into existing commercial finite element codes. Numerical results are presented for four- and ten-layer cross-ply laminates subjected to mechanical and thermal loads.

  15. Analysis of trace contamination of phthalate esters in ultrapure water using a modified solid-phase extraction procedure and automated thermal desorption-gas chromatography/mass spectrometry.

    PubMed

    Liu, Hsu-Chuan; Den, Walter; Chan, Shu-Fei; Kin, Kuan Tzu

    2008-04-25

    The present study aimed to develop a procedure modified from the conventional solid-phase extraction (SPE) method for the analysis of trace concentrations of phthalate esters in industrial ultrapure water (UPW). The proposed procedure allows a UPW sample to be drawn through a sampling tube containing hydrophobic sorbent (Tenax TA) to concentrate the aqueous phthalate esters. The solid trap was then demoisturized by two-stage gas drying before being subjected to thermal desorption and analysis by gas chromatography-mass spectrometry. This process eliminates the solvent extraction step necessary for the conventional SPE method and permits automation of the analytical procedure for high-volume analyses. Several important parameters, including desorption temperature and duration, packing quantity, and the demoisturizing procedure, were optimized in this study based on the analytical sensitivity for a standard mixture containing five different phthalate esters. The method detection limits for the five phthalate esters were between 36 ng l⁻¹ and 95 ng l⁻¹, with recovery rates between 15% and 101%. Dioctyl phthalate (DOP) was not recovered adequately because the compound was both poorly adsorbed onto and poorly desorbed from the Tenax TA sorbent. Furthermore, analyses of material leaching from poly(vinyl chloride) (PVC) tubes as well as of actual water samples showed that di-n-butyl phthalate (DBP) and di(2-ethylhexyl) phthalate (DEHP) were the common contaminants detected in PVC-contaminated UPW and in the actual UPW, as well as in tap water. The reduction of DEHP in the production processes of actual UPW was clearly observed; however, a DEHP concentration of 0.20 µg l⁻¹ at the point of use was still quantified, suggesting that contamination by phthalate esters could present a barrier to future cleanliness requirements for UPW. The work demonstrated that the proposed modified SPE procedure provides an effective method for rapid analysis and contamination identification in UPW production lines.

  16. Convective-diffusion-based fluorescence correlation spectroscopy for detection of a trace amount of E. coli in water.

    PubMed

    Qing, De-Kui; Mengüç, M Pinar; Payne, Fred A; Danao, Mary-Grace C

    2003-06-01

    Fluorescence correlation spectroscopy (FCS) is adapted for a new procedure to detect trace amounts of Escherichia coli in water. The present concept is based on convective diffusion rather than Brownian diffusion and employs confocal microscopy as in traditional FCS. With this system it is possible to detect concentrations as small as 1.5 × 10⁵ E. coli per milliliter (2.5 × 10⁻¹⁶ M). This concentration corresponds to an approximately 1.0-nM level of Rhodamine 6G dyes. A detailed analysis of the optical system is presented, and further improvements for the procedure are discussed.
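
    For reference, the quoted detection limit can be converted from cells per millilitre to a molar concentration by dividing the number of particles per litre by Avogadro's number; the two figures in the abstract are mutually consistent:

    ```python
    # Quick consistency check of the concentration conversion quoted above.
    AVOGADRO = 6.022e23              # particles per mole

    cells_per_ml = 1.5e5
    cells_per_litre = cells_per_ml * 1000
    molarity = cells_per_litre / AVOGADRO
    print(f"{molarity:.2e} M")       # ~2.5e-16 M, as stated in the abstract
    ```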

  17. 76 FR 78015 - Revised Analysis and Mapping Procedures for Non-Accredited Levees

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-15

    ...] Revised Analysis and Mapping Procedures for Non-Accredited Levees AGENCY: Federal Emergency Management... comments on the proposed solution for Revised Analysis and Mapping Procedures for Non-Accredited Levees. This document proposes a revised procedure for the analysis and mapping of non-accredited levees on...

  18. Quantitative analysis of crystalline pharmaceuticals in tablets by pattern-fitting procedure using X-ray diffraction pattern.

    PubMed

    Takehira, Rieko; Momose, Yasunori; Yamamura, Shigeo

    2010-10-15

    A pattern-fitting procedure using an X-ray diffraction pattern was applied to the quantitative analysis of binary system of crystalline pharmaceuticals in tablets. Orthorhombic crystals of isoniazid (INH) and mannitol (MAN) were used for the analysis. Tablets were prepared under various compression pressures using a direct compression method with various compositions of INH and MAN. Assuming that X-ray diffraction pattern of INH-MAN system consists of diffraction intensities from respective crystals, observed diffraction intensities were fitted to analytic expression based on X-ray diffraction theory and separated into two intensities from INH and MAN crystals by a nonlinear least-squares procedure. After separation, the contents of INH were determined by using the optimized normalization constants for INH and MAN. The correction parameter including all the factors that are beyond experimental control was required for quantitative analysis without calibration curve. The pattern-fitting procedure made it possible to determine crystalline phases in the range of 10-90% (w/w) of the INH contents. Further, certain characteristics of the crystals in the tablets, such as the preferred orientation, size of crystallite, and lattice disorder were determined simultaneously. This method can be adopted to analyze compounds whose crystal structures are known. It is a potentially powerful tool for the quantitative phase analysis and characterization of crystals in tablets and powders using X-ray diffraction patterns. Copyright 2010 Elsevier B.V. All rights reserved.
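
    A toy sketch of the underlying idea: represent the measured pattern of a tablet as a weighted sum of the pure-component INH and MAN patterns and recover the normalization constants by least squares. The reference patterns below are synthetic Gaussian peaks; the actual procedure fits analytic profile expressions (including preferred orientation, crystallite size, and lattice disorder) rather than measured reference patterns.

    ```python
    # Two-component pattern fitting by least squares (illustration only).
    import numpy as np

    rng = np.random.default_rng(4)
    two_theta = np.linspace(5, 40, 1400)

    def fake_pattern(peaks):
        """Synthetic reference pattern: a few Gaussian peaks on a flat background."""
        y = np.zeros_like(two_theta)
        for center, height in peaks:
            y += height * np.exp(-0.5 * ((two_theta - center) / 0.08) ** 2)
        return y

    ref_inh = fake_pattern([(12.1, 1.0), (16.7, 0.6), (25.4, 0.8)])
    ref_man = fake_pattern([(10.5, 0.7), (14.6, 1.0), (23.4, 0.5)])

    true_w = np.array([0.3, 0.7])                       # 30% INH, 70% MAN by intensity
    observed = true_w[0] * ref_inh + true_w[1] * ref_man + rng.normal(0, 0.01, two_theta.size)

    # Linear least squares for the two normalization constants.
    A = np.column_stack([ref_inh, ref_man])
    weights, *_ = np.linalg.lstsq(A, observed, rcond=None)
    print("estimated weights:", weights.round(3))       # ~[0.3, 0.7]
    ```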

  19. Model of Procedure Usage – Results from a Qualitative Study to Inform Design of Computer-Based Procedures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johanna H Oxstrand; Katya L Le Blanc

    The nuclear industry is constantly trying to find ways to decrease the human error rate, especially the human errors associated with procedure use. As a step toward the goal of improving procedure use performance, researchers, together with the nuclear industry, have been looking at replacing the current paper-based procedures with computer-based procedure systems. The concept of computer-based procedures is not new by any means; however, most research has focused on procedures used in the main control room. Procedures reviewed in these efforts are mainly emergency operating procedures and normal operating procedures. Based on lessons learned from these previous efforts, we are now exploring a less familiar application for computer-based procedures - field procedures, i.e. procedures used by nuclear equipment operators and maintenance technicians. The Idaho National Laboratory, the Institute for Energy Technology, and participants from the U.S. commercial nuclear industry are collaborating in an applied research effort with the objective of developing requirements and specifications for a computer-based procedure system to be used by field operators. The goal is to identify the types of human errors that can be mitigated by using computer-based procedures and how to best design the computer-based procedures to do this. The underlying philosophy in the research effort is “Stop – Start – Continue”, i.e. what features from the use of paper-based procedures should we not incorporate (Stop), what should we keep (Continue), and what new features or work processes should be added (Start). One step in identifying the Stop – Start – Continue was to conduct a baseline study where affordances related to the current usage of paper-based procedures were identified. The purpose of the study was to develop a model of paper-based procedure use which will help to identify desirable features for computer-based procedure prototypes. Affordances such as note taking, markups, sharing procedures between fellow coworkers, the use of multiple procedures at once, etc. were considered. The model describes which affordances associated with paper-based procedures should be transferred to computer-based procedures as well as what features should not be incorporated. The model also provides a means to identify what new features not present in paper-based procedures need to be added to the computer-based procedures to further enhance performance. The next step is to use the requirements and specifications to develop concepts and prototypes of computer-based procedures. User tests and other data collection efforts will be conducted to ensure that the real issues with field procedures and their usage are being addressed and solved in the best manner possible. This paper describes the baseline study, the construction of the model of procedure use, and the requirements and specifications for computer-based procedures that were developed based on the model. It also addresses how the model and the insights gained from it were used to develop concepts and prototypes for computer-based procedures.

  20. Object-oriented philosophy in designing an adaptive finite-element package for 3D elliptic differential equations

    NASA Astrophysics Data System (ADS)

    Zhengyong, R.; Jingtian, T.; Changsheng, L.; Xiao, X.

    2007-12-01

    Although adaptive finite-element (AFE) analysis is receiving increasing attention in scientific and engineering fields, its efficient implementation remains an open problem because of the complexity of the procedures involved. In this paper, we propose a clear C++ framework to show the strengths of object-oriented philosophy (OOP) in designing such complex adaptive procedures. Using the modular facilities of an OOP language, the whole adaptive system is divided into separate parts such as mesh generation and refinement, the a-posteriori error estimator, the adaptive strategy, and the final post-processing. After these modules are designed individually, they are connected into a complete adaptive framework. Because the framework is built around the general elliptic differential equation, little additional effort is needed to run practical simulations. To demonstrate the advantages of the OOP design, two numerical examples are tested. The first is a 3D direct-current resistivity problem, in which the framework proves efficient because only small additions are required. In the second case, an induced polarization (IP) exploration problem, a new adaptive procedure is added easily, which demonstrates the extensibility and reusability of the OOP approach. We believe that, based on this modular adaptive framework implemented with an OOP methodology, more advanced adaptive analysis systems will become available in the future.
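
    A skeleton of the modular decomposition described above, written in Python rather than C++ for brevity; the class and method names are illustrative, not taken from the paper.

    ```python
    # Generic solve -> estimate -> mark -> refine loop built from separate modules,
    # mirroring the decomposition in the abstract (illustrative names only).
    from abc import ABC, abstractmethod

    class Solver(ABC):
        @abstractmethod
        def solve(self, mesh):
            """Solve the elliptic problem on the given mesh."""

    class ErrorEstimator(ABC):
        @abstractmethod
        def estimate(self, mesh, solution):
            """Return one a-posteriori error indicator per element."""

    class AdaptiveStrategy(ABC):
        @abstractmethod
        def mark(self, indicators):
            """Return the indices of elements to refine."""

    class Mesh(ABC):
        @abstractmethod
        def refine(self, marked_elements):
            """Return a new, locally refined mesh."""

    def adaptive_loop(mesh, solver, estimator, strategy, tolerance, max_cycles=10):
        """Drive the adaptive cycle purely through the module interfaces."""
        solution = None
        for _ in range(max_cycles):
            solution = solver.solve(mesh)
            indicators = estimator.estimate(mesh, solution)
            if sum(indicators) < tolerance:
                break
            mesh = mesh.refine(strategy.mark(indicators))
        return mesh, solution
    ```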

  1. Economic analysis of the future growth of cosmetic surgery procedures.

    PubMed

    Liu, Tom S; Miller, Timothy A

    2008-06-01

    The economic growth of cosmetic surgical and nonsurgical procedures has been tremendous. Between 1992 and 2005, annual U.S. cosmetic surgery volume increased by 725 percent, with over $10 billion spent in 2005. It is unknown whether this growth will continue for the next decade and, if so, what impact it will have on the plastic surgeon workforce. The authors analyzed annual U.S. cosmetic surgery procedure volume reported by the American Society of Plastic Surgeons (ASPS) National Clearinghouse of Plastic Surgery Statistics between 1992 and 2005. Reconstructive plastic surgery volume was not included in the analysis. The authors analyzed the ability of economic and noneconomic variables to predict annual cosmetic surgery volume. The authors also used growth rate analyses to construct models with which to predict the future growth of cosmetic surgery. None of the economic and noneconomic variables was a significant predictor of annual cosmetic surgery volume. Instead, based on current compound annual growth rates, the authors predict that total cosmetic surgery volume (surgical and nonsurgical) will exceed 55 million annual procedures by 2015. ASPS members are projected to perform 299 surgical and 2165 nonsurgical annual procedures. Non-ASPS members are projected to perform 39 surgical and 1448 nonsurgical annual procedures. If current growth rates continue into the next decade, the future demand in cosmetic surgery will be driven largely by nonsurgical procedures. The growth of surgical procedures will be met by ASPS members. However, meeting the projected growth in nonsurgical procedures could be a potential challenge and a potential area for increased competition.
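
    A back-of-the-envelope illustration of the growth-rate reasoning: only the 725 percent increase over 1992-2005 is taken from the abstract, and the 2005 base volume in the snippet is a placeholder rather than a figure from the study.

    ```python
    # Compound annual growth rate implied by a total percentage increase, and a
    # simple projection forward at that rate (placeholder base volume).
    def cagr(total_growth_pct, years):
        return (1 + total_growth_pct / 100) ** (1 / years) - 1

    def project(volume_now, rate, years):
        return volume_now * (1 + rate) ** years

    rate_1992_2005 = cagr(725, 2005 - 1992)
    print(f"implied CAGR 1992-2005: {rate_1992_2005:.1%}")       # roughly 18% per year

    hypothetical_2005_volume = 10_000_000                         # placeholder only
    print(f"2015 volume at that rate: {project(hypothetical_2005_volume, rate_1992_2005, 10):,.0f}")
    ```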

  2. Beyond Volume: Hospital-Based Healthcare Technology for Better Outcomes in Cerebrovascular Surgical Patients Diagnosed With Ischemic Stroke: A Population-Based Nationwide Cohort Study From 2002 to 2013.

    PubMed

    Kim, Jae-Hyun; Park, Eun-Cheol; Lee, Sang Gyu; Lee, Tae-Hyun; Jang, Sung-In

    2016-03-01

    We examined whether the level of hospital-based healthcare technology was related to the 30-day postoperative mortality rates, after adjusting for hospital volume, of ischemic stroke patients who underwent a cerebrovascular surgical procedure. Using the National Health Insurance Service-Cohort Sample Database, we reviewed records from 2002 to 2013 for data on patients with ischemic stroke who underwent cerebrovascular surgical procedures. Statistical analysis was performed using Cox proportional hazard models to test our hypothesis. A total of 798 subjects were included in our study. After adjusting for hospital volume of cerebrovascular surgical procedures as well as all for other potential confounders, the hazard ratio (HR) of 30-day mortality in low healthcare technology hospitals as compared to high healthcare technology hospitals was 2.583 (P < 0.001). We also found that, although the HR of 30-day mortality in low healthcare technology hospitals with high volume as compared to high healthcare technology hospitals with high volume was the highest (10.014, P < 0.0001), cerebrovascular surgical procedure patients treated in low healthcare technology hospitals had the highest 30-day mortality rate, irrespective of hospital volume. Although results of our study provide scientific evidence for a hospital volume/30-day mortality rate relationship in ischemic stroke patients who underwent cerebrovascular surgical procedures, our results also suggest that the level of hospital-based healthcare technology is associated with mortality rates independent of hospital volume. Given these results, further research into what components of hospital-based healthcare technology significantly impact mortality is warranted.

  3. Parallel-vector solution of large-scale structural analysis problems on supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Agarwal, Tarun K.

    1989-01-01

    A direct linear equation solution method based on the Choleski factorization procedure is presented which exploits both parallel and vector features of supercomputers. The new equation solver is described, and its performance is evaluated by solving structural analysis problems on three high-performance computers. The method has been implemented using Force, a generic parallel FORTRAN language.
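
    For illustration, a serial version of the underlying numerical step: factor a symmetric positive-definite stiffness matrix once by Cholesky decomposition and reuse the factor for several load cases. The parallel and vector scheduling that the paper implements in the Force parallel FORTRAN dialect is not represented here, and the matrix is synthetic.

    ```python
    # Cholesky factor-once, solve-many sketch for structural equations K u = f.
    import numpy as np
    from scipy.linalg import cho_factor, cho_solve

    rng = np.random.default_rng(5)
    n = 500
    A = rng.normal(size=(n, n))
    K = A @ A.T + n * np.eye(n)              # synthetic SPD "stiffness" matrix

    loads = rng.normal(size=(n, 3))          # three load cases

    factor = cho_factor(K)                   # O(n^3) factorization, done once
    displacements = cho_solve(factor, loads) # cheap solves per load case

    print(np.allclose(K @ displacements, loads))
    ```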

  4. Key Issues in the Analysis of Remote Sensing Data: A report on the workshop

    NASA Technical Reports Server (NTRS)

    Swain, P. H. (Principal Investigator)

    1981-01-01

    The procedures of a workshop assessing the state of the art of machine analysis of remotely sensed data are summarized. Areas discussed were: data bases, image registration, image preprocessing operations, map oriented considerations, advanced digital systems, artificial intelligence methods, image classification, and improved classifier training. Recommendations of areas for further research are presented.

  5. Meta-Analysis as a Choice to Improve Research in Career and Technical Education

    ERIC Educational Resources Information Center

    Gordon, Howard R. D.; McClain, Clifford R.; Kim, Yeonsoo; Maldonado, Cecilia

    2010-01-01

    A search of the ERIC and Academic Search Premier databases, and a comprehensive review of the literature, suggest that meta-analysis is ignored by career and technical education (CTE) researchers, a situation that is regrettable but remediable. The purpose of this theoretical paper is to provide CTE researchers and consumers with procedures for…

  6. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR ANALYSIS OF 2-PHASE MULTISORBENT SAMPLERS FOR VOCS (BCO-L-31.0)

    EPA Science Inventory

    The purpose of this SOP is to describe the methodology for the analysis of certain trace volatile organic compounds (VOCs) in air that are captured on two-phase carbon-based multisorbent tubes packed with Carbotrap (graphitized carbon blacks) and Carbosieve S-III (a carbon molecu...

  7. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR ANALYSIS OF 3-PHASE MULTISORBENT SAMPLERS FOR VOCS (BCO-L-22.1)

    EPA Science Inventory

    The purpose of this SOP is to describe the methodology for the analysis of certain trace volatile organic compounds (VOCs) in air that are captured on carbon-based multisorbent tubes packed with Carbotrap C, Carbotrap (graphitized carbon blacks), and Carbosieve S-III (a carbon mo...

  8. Adaptation of instructional materials: a commentary on the research on adaptations of Who Polluted the Potomac

    NASA Astrophysics Data System (ADS)

    Ercikan, Kadriye; Alper, Naim

    2009-03-01

    This commentary first summarizes and discusses the analysis of the two translation processes described in the Oliveira, Colak, and Akerson article and the inferences these researchers make based on their research. In the second part of the commentary, we describe procedures and criteria used in adapting tests into different languages and how they may apply to the adaptation of instructional materials. The authors provide a good theoretical analysis of what took place in two translation instances and make an important contribution by taking the first step in providing a systematic discussion of the adaptation of instructional materials. Our discussion proposes procedures for adapting instructional materials and for examining the equivalence of source and target versions of adapted materials. We highlight that many of the procedures and criteria used in examining the comparability of educational tests are missing in this emerging area of research.

  9. New insights into old methods for identifying causal rare variants.

    PubMed

    Wang, Haitian; Huang, Chien-Hsun; Lo, Shaw-Hwa; Zheng, Tian; Hu, Inchi

    2011-11-29

    The advance of high-throughput next-generation sequencing technology makes possible the analysis of rare variants. However, the investigation of rare variants in unrelated-individuals data sets faces the challenge of low power, and most methods circumvent the difficulty by using various collapsing procedures based on genes, pathways, or gene clusters. We suggest a new way to identify causal rare variants using the F-statistic and sliced inverse regression. The procedure is tested on the data set provided by the Genetic Analysis Workshop 17 (GAW17). After preliminary data reduction, we ranked markers according to their F-statistic values. Top-ranked markers were then subjected to sliced inverse regression, and those with higher absolute coefficients in the most significant sliced inverse regression direction were selected. The procedure yields good false discovery rates for the GAW17 data and thus is a promising method for future study on rare variants.
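
    A simplified sketch of the two-stage idea (F-statistic screening followed by sliced inverse regression) on synthetic genotype data; the marker counts, slice number, and quantitative trait below are illustrative assumptions, not the GAW17 data or the authors' exact workflow.

    ```python
    # Stage 1: rank markers by univariate F statistic. Stage 2: minimal sliced
    # inverse regression (SIR) on the retained markers. Synthetic data throughout.
    import numpy as np
    from sklearn.feature_selection import f_regression

    rng = np.random.default_rng(6)
    n, p = 500, 200
    X = rng.binomial(2, 0.05, size=(n, p)).astype(float)    # rare-variant genotype counts
    beta = np.zeros(p); beta[[3, 17, 42]] = [0.8, -0.6, 0.7]
    y = X @ beta + rng.normal(0, 1, n)                       # synthetic quantitative trait

    F, _ = f_regression(X, y)                                # univariate F statistics
    top = np.argsort(F)[::-1][:20]                           # keep the 20 top-ranked markers

    Xs = X[:, top]
    Xc = Xs - Xs.mean(axis=0)
    cov = np.cov(Xc, rowvar=False) + 1e-6 * np.eye(len(top))
    whiten = np.linalg.inv(np.linalg.cholesky(cov)).T        # Z = Xc @ whiten has identity covariance
    Z = Xc @ whiten

    H = 10                                                   # number of slices of y
    slices = np.array_split(np.argsort(y), H)
    M = sum(len(s) / n * np.outer(Z[s].mean(axis=0), Z[s].mean(axis=0)) for s in slices)
    eigvals, eigvecs = np.linalg.eigh(M)
    direction = whiten @ eigvecs[:, -1]                      # leading SIR direction in X space

    ranked = top[np.argsort(np.abs(direction))[::-1]]
    print("markers ranked by |SIR coefficient|:", ranked[:10])
    ```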

  10. Thresholds of Principle and Preference: Exploring Procedural Variation in Postgraduate Surgical Education.

    PubMed

    Apramian, Tavis; Cristancho, Sayra; Watling, Chris; Ott, Michael; Lingard, Lorelei

    2015-11-01

    Expert physicians develop their own ways of doing things. The influence of such practice variation in clinical learning is insufficiently understood. Our grounded theory study explored how residents make sense of, and behave in relation to, the procedural variations of faculty surgeons. We sampled senior postgraduate surgical residents to construct a theoretical framework for how residents make sense of procedural variations. Using a constructivist grounded theory approach, we used marginal participant observation in the operating room across 56 surgical cases (146 hours), field interviews (38), and formal interviews (6) to develop a theoretical framework for residents' ways of dealing with procedural variations. Data analysis used constant comparison to iteratively refine the framework and data collection until theoretical saturation was reached. The core category of the constructed theory was called thresholds of principle and preference and it captured how faculty members position some procedural variations as negotiable and others not. The term thresholding was coined to describe residents' daily experiences of spotting, mapping, and negotiating their faculty members' thresholds and defending their own emerging thresholds. Thresholds of principle and preference play a key role in workplace-based medical education. Postgraduate medical learners are occupied on a day-to-day level with thresholding and attempting to make sense of the procedural variations of faculty. Workplace-based teaching and assessment should include an understanding of the integral role of thresholding in shaping learners' development. Future research should explore the nature and impact of thresholding in workplace-based learning beyond the surgical context.

  11. COAL SULFUR MEASUREMENTS

    EPA Science Inventory

    The report describes a new technique for sulfur forms analysis based on low-temperature oxygen plasma ashing. The technique involves analyzing the low-temperature plasma ash by modified ASTM techniques after selectively removing the organic material. The procedure has been tested...

  12. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ticknor, Brian W.; Metzger, Shalina C.; McBay, Eddy H.

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  13. Self-constrained inversion of potential fields

    NASA Astrophysics Data System (ADS)

    Paoletti, V.; Ialongo, S.; Florio, G.; Fedi, M.; Cella, F.

    2013-11-01

    We present a potential-field-constrained inversion procedure based on a priori information derived exclusively from the analysis of the gravity and magnetic data (self-constrained inversion). The procedure is designed to be applied to underdetermined problems and involves scenarios where the source distribution can be assumed to be of simple character. To set up effective constraints, we first estimate through the analysis of the gravity or magnetic field some or all of the following source parameters: the source depth-to-the-top, the structural index, the horizontal position of the source body edges and their dip. The second step is incorporating the information related to these constraints in the objective function as depth and spatial weighting functions. We show, through 2-D and 3-D synthetic and real data examples, that potential field-based constraints, for example, structural index, source boundaries and others, are usually enough to obtain substantial improvement in the density and magnetization models.

  14. Procedures for woody vegetation surveys in the Kazgail rural council area, Kordofan, Sudan

    USGS Publications Warehouse

    Falconer, Allan; Cross, Matthew D.; Orr, Donald G.

    1990-01-01

    Efforts to reforest parts of the Kordofan Province of Sudan are receiving support from international development agencies. These efforts include planning and implementing reforestation activities that require the collection of natural resources and socioeconomic data, and the preparation of base maps. A combination of remote sensing, geographic information system and global positioning systems procedures are used in this study to meet these requirements.Remote sensing techniques were used to provide base maps and to guide the compilation of vegetation resources maps. These techniques provided a rapid and efficient method for documenting available resources. Pocket‐sized global positioning system units were used to establish the location of field data collected for mapping and resource analysis. A microcomputer data management system tabulated and displayed the field data. The resulting system for data analysis, management, and planning has been adopted for the mapping and inventory of the Gum Belt of Sudan.

  15. Perforator-based propeller flaps reliability in upper extremity soft tissue reconstruction: a systematic review.

    PubMed

    Vitse, J; Bekara, F; Bertheuil, N; Sinna, R; Chaput, B; Herlin, C

    2017-02-01

    Current data on upper extremity propeller flaps are poor and do not allow the assessment of the safety of this technique. A systematic literature review was conducted searching the PubMed, EMBASE, and Cochrane Library electronic databases, and the selection process was adapted from the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) statement. The final analysis included ten relevant articles involving 117 flaps. The majority of flaps were used for the hand, distal wrist, and elbow. The radial artery perforator and ulnar artery perforator were the most frequently used flaps. Venous congestion occurred in 7% of flaps and complete necrosis in 3%. No difference in complication rate was found across flap sites. Perforator-based propeller flaps appear to be an interesting procedure for covering soft tissue defects involving the upper extremities, even for large defects, but the procedure requires experience and close monitoring. Level of Evidence: II.

  16. A study of stiffness, residual strength and fatigue life relationships for composite laminates

    NASA Technical Reports Server (NTRS)

    Ryder, J. T.; Crossman, F. W.

    1983-01-01

    The relationship between stiffness, strength, fatigue life, residual strength, and damage of unnotched graphite/epoxy laminates subjected to tension loading is explored qualitatively and quantitatively. Clarifying the mechanics of tension loading is intended to explain previous contradictory observations and hypotheses, to develop a simple procedure for anticipating strength, fatigue life, and stiffness changes, and to motivate the study of more complex cases involving compression, notches, and spectrum fatigue loading. Mathematical models are developed from analysis of the observed damage states, based on laminate analysis, free-body modeling, or strain energy release rates. Enough understanding of the tension-loaded case is developed to support a proposed, simple procedure for calculating strain to failure, stiffness, strength, data scatter, and the shape of the stress-life curve for unnotched laminates subjected to tension loading.

  17. Cost-Effectiveness of Procedures for Treatment of Ostium Secundum Atrial Septal Defects Occlusion Comparing Conventional Surgery and Septal Percutaneous Implant

    PubMed Central

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Senna, Kátia Marie Simões e.; Tura, Bernardo Rangel; Goulart, Marcelo Correia

    2014-01-01

    Objectives The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a septal percutaneous implant. Methods An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion at each branch was considered the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. Results The decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. Conclusions The proposed decision model seeks to fill a void in the academic literature and includes the outcomes that have the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces the physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation. PMID:25302806

  18. Cost-effectiveness of procedures for treatment of ostium secundum atrial septal defects occlusion comparing conventional surgery and septal percutaneous implant.

    PubMed

    da Costa, Márcia Gisele Santos; Santos, Marisa da Silva; Sarti, Flávia Mori; Simões e Senna, Kátia Marie; Tura, Bernardo Rangel; Correia, Marcelo Goulart; Goulart, Marcelo Correia

    2014-01-01

    The study performs a cost-effectiveness analysis of procedures for atrial septal defect occlusion, comparing conventional surgery to a septal percutaneous implant. An analytical decision model was structured with symmetric branches to estimate the cost-effectiveness ratio between the procedures. The decision tree model was based on evidence gathered through a meta-analysis of the literature and validated by a panel of specialists. The lower number of surgical procedures performed for atrial septal defect occlusion at each branch was considered the effectiveness outcome. Direct medical costs and probabilities for each event were inserted in the model using data available from the Brazilian public sector database system and information extracted from the literature review, using a micro-costing technique. Sensitivity analysis included price variations of the percutaneous implant. The decision model demonstrated that the percutaneous implant was more cost-effective, at a cost of US$8,936.34, with a reduction in the probability of surgery occurrence in 93% of the cases. The probability of atrial septal communication occlusion and the cost of the implant are the determinant factors of the cost-effectiveness ratio. The proposed decision model seeks to fill a void in the academic literature and includes the outcomes that have the greatest impact on the overall costs of the procedure. Atrial septal defect occlusion using a percutaneous implant reduces the physical and psychological distress to patients relative to conventional surgery, which represents intangible costs in the context of economic evaluation.
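
    A toy version of the symmetric decision-tree comparison described in these two records: expected cost and expected number of surgeries are computed for each strategy and combined into an incremental ratio. Only the 93% occlusion figure echoes the abstract; the costs and the rescue-surgery assumption are placeholders.

    ```python
    # Expected-value roll-back of a two-strategy decision tree (placeholder numbers).
    def expected(branches):
        """branches: list of (probability, cost, surgeries_performed)."""
        cost = sum(p * c for p, c, _ in branches)
        surgeries = sum(p * s for p, _, s in branches)
        return cost, surgeries

    # Strategy A: percutaneous septal implant; failure leads to rescue surgery.
    implant = expected([(0.93, 9000.0, 0),                 # occlusion achieved, no surgery
                        (0.07, 9000.0 + 12000.0, 1)])      # implant failed, surgery needed

    # Strategy B: conventional surgery for everyone.
    surgery = expected([(1.00, 12000.0, 1)])

    d_cost = implant[0] - surgery[0]
    surgeries_avoided = surgery[1] - implant[1]
    print(f"incremental cost per surgery avoided: {d_cost / surgeries_avoided:,.2f}")
    ```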

  19. Incidence and Risk Factors for Major Hematomas in Aesthetic Surgery: Analysis of 129,007 Patients.

    PubMed

    Kaoutzanis, Christodoulos; Winocour, Julian; Gupta, Varun; Ganesh Kumar, Nishant; Sarosiek, Konrad; Wormer, Blair; Tokin, Christopher; Grotting, James C; Higdon, K Kye

    2017-10-16

    Postoperative hematomas are one of the most frequent complications following aesthetic surgery. Identifying risk factors for hematoma has been limited by underpowered studies from single-institution experiences. To examine the incidence and identify independent risk factors for postoperative hematomas following cosmetic surgery utilizing a prospective, multicenter database. A prospectively enrolled cohort of patients who underwent aesthetic surgery between 2008 and 2013 was identified from the CosmetAssure database. The primary outcome was occurrence of major hematomas requiring emergency room visit, hospital admission, or reoperation within 30 days of the index operation. Univariate and multivariate analysis was used to identify potential risk factors for hematomas including age, gender, body mass index (BMI), smoking, diabetes, type of surgical facility, procedure by body region, and combined procedures. Of 129,007 patients, 1180 (0.91%) had a major hematoma. Mean age (42.0 ± 13.0 years vs 40.9 ± 13.9 years, P < 0.01) and BMI (24.5 ± 5.0 kg/m2 vs 24.3 ± 4.6 kg/m2, P < 0.01) were higher in patients with hematomas. Males suffered more hematomas than females (1.4% vs 0.9%, P < 0.01). Hematoma rates were higher in patients undergoing combined procedures compared to single procedures (1.1% vs 0.8%, P < 0.01), and breast procedures compared to body/extremity or face procedures (1.0% vs 0.8% vs 0.7%, P < 0.01). On multivariate analysis, independent predictors of hematoma included age (Relative Risk [RR] 1.01), male gender (RR 1.98), the procedure being performed in a hospital setting rather than an office-based setting (RR 1.68), combined procedures (RR 1.35), and breast procedures rather than body/extremity and face procedures (RR 1.81). Major hematoma is the most common complication following aesthetic surgery. Male patients and those undergoing breast or combined procedures have a significantly higher risk of developing hematomas. Level of Evidence: 2. © 2017 The American Society for Aesthetic Plastic Surgery, Inc. Reprints and permission: journals.permissions@oup.com

  20. Wind Characterization for the Assessment of Collision Risk During Flight Level Changes

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Chartrand, Ryan

    2009-01-01

    A model of vertical wind gradient is presented based on National Oceanic and Atmospheric Administration (NOAA) wind data. The objective is to have an accurate representation of wind to be used in Collision Risk Models (CRM) of aircraft procedures. Depending on how an aircraft procedure is defined, wind and the different characteristics of the wind will have a more severe or less severe impact on distances between aircraft. For the In-Trail Procedure, the non-linearity of the vertical wind gradient has the greatest impact on longitudinal distance. The analysis in this paper extracts standard deviation, mean, maximum, and linearity characteristics from the NOAA data.

  1. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.

  2. A Step-by-Step Design Methodology for a Base Case Vanadium Redox-Flow Battery

    ERIC Educational Resources Information Center

    Moore, Mark; Counce, Robert M.; Watson, Jack S.; Zawodzinski, Thomas A.; Kamath, Haresh

    2012-01-01

    The purpose of this work is to develop an evolutionary procedure to be used by Chemical Engineering students for the base-case design of a Vanadium Redox-Flow Battery. The design methodology is based on the work of Douglas (1985) and provides a profitability analysis at each decision level so that more profitable alternatives and directions can be…

  3. Meta-Analysis of a Continuous Outcome Combining Individual Patient Data and Aggregate Data: A Method Based on Simulated Individual Patient Data

    ERIC Educational Resources Information Center

    Yamaguchi, Yusuke; Sakamoto, Wataru; Goto, Masashi; Staessen, Jan A.; Wang, Jiguang; Gueyffier, Francois; Riley, Richard D.

    2014-01-01

    When some trials provide individual patient data (IPD) and the others provide only aggregate data (AD), meta-analysis methods for combining IPD and AD are required. We propose a method that reconstructs the missing IPD for AD trials by a Bayesian sampling procedure and then applies an IPD meta-analysis model to the mixture of simulated IPD and…

  4. Automated reconstruction of rainfall events responsible for shallow landslides

    NASA Astrophysics Data System (ADS)

    Vessia, G.; Parise, M.; Brunetti, M. T.; Peruccacci, S.; Rossi, M.; Vennari, C.; Guzzetti, F.

    2014-04-01

    Over the last 40 years, many contributions have been devoted to identifying the empirical rainfall thresholds (e.g. intensity vs. duration ID, cumulated rainfall vs. duration ED, cumulated rainfall vs. intensity EI) for the initiation of shallow landslides, based on local as well as worldwide inventories. Although different methods to trace the threshold curves have been proposed and discussed in literature, a systematic study to develop an automated procedure to select the rainfall event responsible for the landslide occurrence has rarely been addressed. Nonetheless, objective criteria for estimating the rainfall responsible for the landslide occurrence (effective rainfall) play a prominent role on the threshold values. In this paper, two criteria for the identification of the effective rainfall events are presented: (1) the first is based on the analysis of the time series of rainfall mean intensity values over one month preceding the landslide occurrence, and (2) the second on the analysis of the trend in the time function of the cumulated mean intensity series calculated from the rainfall records measured through rain gauges. The two criteria have been implemented in an automated procedure written in R language. A sample of 100 shallow landslides collected in Italy by the CNR-IRPI research group from 2002 to 2012 has been used to calibrate the proposed procedure. The cumulated rainfall E and duration D of rainfall events that triggered the documented landslides are calculated through the new procedure and are fitted with power law in the (D,E) diagram. The results are discussed by comparing the (D,E) pairs calculated by the automated procedure and the ones by the expert method.

  5. The Gap Procedure: for the identification of phylogenetic clusters in HIV-1 sequence data.

    PubMed

    Vrbik, Irene; Stephens, David A; Roger, Michel; Brenner, Bluma G

    2015-11-04

    In the context of infectious disease, sequence clustering can be used to provide important insights into the dynamics of transmission. Cluster analysis is usually performed using a phylogenetic approach whereby clusters are assigned on the basis of sufficiently small genetic distances and high bootstrap support (or posterior probabilities). The computational burden involved in this phylogenetic threshold approach is a major drawback, especially when a large number of sequences are being considered. In addition, this method requires a skilled user to specify the appropriate threshold values which may vary widely depending on the application. This paper presents the Gap Procedure, a distance-based clustering algorithm for the classification of DNA sequences sampled from individuals infected with the human immunodeficiency virus type 1 (HIV-1). Our heuristic algorithm bypasses the need for phylogenetic reconstruction, thereby supporting the quick analysis of large genetic data sets. Moreover, this fully automated procedure relies on data-driven gaps in sorted pairwise distances to infer clusters, thus no user-specified threshold values are required. The clustering results obtained by the Gap Procedure on both real and simulated data, closely agree with those found using the threshold approach, while only requiring a fraction of the time to complete the analysis. Apart from the dramatic gains in computational time, the Gap Procedure is highly effective in finding distinct groups of genetically similar sequences and obviates the need for subjective user-specified values. The clusters of genetically similar sequences returned by this procedure can be used to detect patterns in HIV-1 transmission and thereby aid in the prevention, treatment and containment of the disease.
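
    A simplified caricature of the data-driven-gap idea, for intuition only: choose a distance cutoff at the largest gap in the sorted pairwise distances and report connected components below that cutoff as clusters. The published Gap Procedure is more involved, and the points below merely stand in for genetic distances.

    ```python
    # Threshold-free clustering via the largest gap in sorted pairwise distances.
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from scipy.sparse import csr_matrix
    from scipy.sparse.csgraph import connected_components

    rng = np.random.default_rng(8)
    # Stand-in for genetic distances: three tight groups of points in 5-D space.
    data = np.vstack([rng.normal(c, 0.05, size=(20, 5)) for c in (0.0, 1.0, 2.0)])

    d = pdist(data)                              # condensed pairwise distance vector
    d_sorted = np.sort(d)
    gap_index = np.argmax(np.diff(d_sorted))     # largest jump between consecutive distances
    cutoff = d_sorted[gap_index]

    adjacency = csr_matrix(squareform(d) <= cutoff)
    n_clusters, labels = connected_components(adjacency, directed=False)
    print(n_clusters, labels)
    ```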

  6. The Use of Isotope Dilution Alpha Spectrometry and Liquid Scintillation Counting to Determine Radionuclides in Environmental Samples (abstract)

    NASA Astrophysics Data System (ADS)

    Bylyku, Elida

    2009-04-01

    In Albania in recent years it has been of increasing interest to determine various pollutants in the environment and their possible effects on human health. The radiochemical procedure used to identify Pu, Am, U, Th, and Sr radioisotopes in soil, sediment, water, coal, and milk samples is described. The analysis is carried out in the presence of respective tracer solutions and combines the procedure for Pu analysis based on anion exchange, the selective method for Sr isolation based on extraction chromatography using Sr-Spec resin, and the application of the TRU-Spec column for separation of Am fraction. An acid digestion method has been applied for the decomposition of samples. The radiochemical procedure involves the separation of Pu from Th, Am, and Sr by anion exchange, followed by the preconcentration of Am and Sr by coprecipitation with calcium oxalate. Am is separated from Sr by extraction chromatography. Uranium is separated from the bulk elements by liquid-liquid extraction using UTEVA® resin. Thin sources for alpha spectrometric measurements are prepared by microprecipitation with NdF3. Two International Atomic Energy Agency reference materials were analyzed in parallel with the samples.

  7. Variable Selection for Regression Models of Percentile Flows

    NASA Astrophysics Data System (ADS)

    Fouad, G.

    2017-12-01

    Percentile flows describe the flow magnitude equaled or exceeded for a given percent of time, and are widely used in water resource management. However, these statistics are normally unavailable since most basins are ungauged. Percentile flows of ungauged basins are often predicted using regression models based on readily observable basin characteristics, such as mean elevation. The number of these independent variables is too large to evaluate all possible models. A subset of models is typically evaluated using automatic procedures, like stepwise regression. This ignores a large variety of methods from the field of feature (variable) selection and physical understanding of percentile flows. A study of 918 basins in the United States was conducted to compare an automatic regression procedure to the following variable selection methods: (1) principal component analysis, (2) correlation analysis, (3) random forests, (4) genetic programming, (5) Bayesian networks, and (6) physical understanding. The automatic regression procedure only performed better than principal component analysis. Poor performance of the regression procedure was due to a commonly used filter for multicollinearity, which rejected the strongest models because they had cross-correlated independent variables. Multicollinearity did not decrease model performance in validation because of a representative set of calibration basins. Variable selection methods based strictly on predictive power (numbers 2-5 from above) performed similarly, likely indicating a limit to the predictive power of the variables. Similar performance was also reached using variables selected based on physical understanding, a finding that substantiates recent calls to emphasize physical understanding in modeling for predictions in ungauged basins. The strongest variables highlighted the importance of geology and land cover, whereas widely used topographic variables were the weakest predictors. Variables suffered from a high degree of multicollinearity, possibly illustrating the co-evolution of climatic and physiographic conditions. Given the ineffectiveness of many variables used here, future work should develop new variables that target specific processes associated with percentile flows.
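
    For orientation, a small sketch (synthetic data, not the study's 918 basins) comparing two of the variable-selection strategies mentioned above, correlation ranking and random-forest importance, before fitting a regression model for a percentile flow; the number of variables, the "top 4" cutoff, and the data are all assumptions for illustration.

      # Illustrative comparison of correlation-based vs random-forest variable selection.
      import numpy as np
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.linear_model import LinearRegression
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(1)
      X = rng.normal(size=(300, 12))                                       # basin characteristics (synthetic)
      y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + rng.normal(scale=0.5, size=300)  # percentile flow (synthetic)

      # Correlation analysis: rank variables by |Pearson r| with the target.
      corr = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(X.shape[1])])
      corr_rank = np.argsort(-corr)

      # Random forests: rank variables by impurity-based importance.
      rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
      rf_rank = np.argsort(-rf.feature_importances_)

      for name, rank in [("correlation", corr_rank), ("random forest", rf_rank)]:
          top = rank[:4]
          score = cross_val_score(LinearRegression(), X[:, top], y, cv=5).mean()
          print(f"{name:>13}: top variables {top.tolist()}, CV R^2 = {score:.2f}")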

  8. New approaches to the analysis of population trends in land birds: Comment

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1997-01-01

    James et al. (1996, Ecology 77:13-27) used data from the North American Breeding Bird Survey (BBS) to examine geographic variability in patterns of population change for 26 species of wood warblers. They emphasized the importance of evaluating nonlinear patterns of change in bird populations, proposed LOESS-based non-parametric and semi-parametric analyses of BBS data, and contrasted their results with other analyses, including those of Robbins et al. (1989, Proceedings of the National Academy of Sciences 86: 7658-7662) and Peterjohn et al. (1995, Pages 3-39 in T. E. Martin and D. M. Finch, eds. Ecology and management of Neotropical migratory birds: a synthesis and review of critical issues. Oxford University Press, New York.). In this note, we briefly comment on some of the issues that arose from their analysis of BBS data, suggest a few aspects of the survey that should inspire caution in analysts, and review the differences between the LOESS-based procedures and other procedures (e.g., Link and Sauer 1994). We strongly discourage the use of James et al.'s completely non-parametric procedure, which fails to account for observer effects. Our comparisons of estimators add to the evidence already present in the literature of the bias associated with omitting observer information in analyses of BBS data. Bias resulting from change in observer abilities should be a consideration in any analysis of BBS data.

  9. Prototype of a computer method for designing and analyzing heating, ventilating and air conditioning proportional, electronic control systems

    NASA Astrophysics Data System (ADS)

    Barlow, Steven J.

    1986-09-01

    The Air Force needs a better method of designing new and retrofit heating, ventilating and air conditioning (HVAC) control systems. Air Force engineers currently use manual design/predict/verify procedures taught at the Air Force Institute of Technology, School of Civil Engineering, HVAC Control Systems course. These existing manual procedures are iterative and time-consuming. The objectives of this research were to: (1) Locate and, if necessary, modify an existing computer-based method for designing and analyzing HVAC control systems that is compatible with the HVAC Control Systems manual procedures, or (2) Develop a new computer-based method of designing and analyzing HVAC control systems that is compatible with the existing manual procedures. Five existing computer packages were investigated in accordance with the first objective: MODSIM (for modular simulation), HVACSIM (for HVAC simulation), TRNSYS (for transient system simulation), BLAST (for building load and system thermodynamics) and Elite Building Energy Analysis Program. None were found to be compatible or adaptable to the existing manual procedures, and consequently, a prototype of a new computer method was developed in accordance with the second research objective.

  10. In Vivo Myeloperoxidase Imaging and Flow Cytometry Analysis of Intestinal Myeloid Cells.

    PubMed

    Hülsdünker, Jan; Zeiser, Robert

    2016-01-01

    Myeloperoxidase (MPO) imaging is a non-invasive method to detect cells that produce the enzyme MPO, which is most abundant in neutrophils, macrophages, and inflammatory monocytes. While lacking specificity for any of these three cell types, MPO imaging can provide guidance for further flow cytometry-based analysis of tissues where these cell types reside. Isolation of leukocytes from the intestinal tract is an error-prone procedure. Here, we describe a protocol for intestinal leukocyte isolation that works reliably in our hands and allows for flow cytometry-based analysis, in particular of neutrophils.

  11. An integrated data-analysis and database system for AMS 14C

    NASA Astrophysics Data System (ADS)

    Kjeldsen, Henrik; Olsen, Jesper; Heinemeier, Jan

    2010-04-01

    AMSdata is the name of a combined database and data-analysis system for AMS 14C and stable-isotope work that has been developed at Aarhus University. The system (1) contains routines for data analysis of AMS and MS data, (2) allows a flexible and accurate description of sample extraction and pretreatment, also when samples are split into several fractions, and (3) keeps track of all measured, calculated and attributed data. The structure of the database is flexible and allows an unlimited number of measurement and pretreatment procedures. The AMS 14C data analysis routine is fairly advanced and flexible, and it can be easily optimized for different kinds of measuring processes. Technically, the system is based on a Microsoft SQL server and includes stored SQL procedures for the data analysis. Microsoft Office Access is used for the (graphical) user interface, and in addition Excel, Word and Origin are exploited for input and output of data, e.g. for plotting data during data analysis.

  12. Argonne National Laboratory Expedited Site Characterization: First International Symposium on Integrated Technical Approaches to Site Characterization - Proceedings Volume

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-06-08

    Laboratory applications for the analysis of PCBs (polychlorinated biphenyls) in environmental matrices such as soil/sediment/sludge and oil/waste oil were evaluated for potential reduction in waste, source reduction, and alternative techniques for final determination. As a consequence, new procedures were studied for solvent substitution, miniaturization of extraction and cleanups, minimization of reagent consumption, reduction of cost per analysis, and reduction of time. These new procedures provide adequate data that meet all the performance requirements for the determination of PCBs. Use of the new procedures reduced costs for all sample preparation techniques. Time and cost were also reduced by combining the new sample preparation procedures with the power of fast gas chromatography. Separation of Aroclor 1254 was achieved in less than 6 min by using DB-1 and SPB-608 columns. With the greatly shortened run times, reproducibility can be tested quickly and consequently with low cost. With performance-based methodology, the applications presented here can be applied now, without waiting for regulatory approval.

  13. Digital image processing and analysis for activated sludge wastewater treatment.

    PubMed

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters such as total suspended solids (TSSol), sludge volume index (SVI) and chemical oxygen demand (COD). These laboratory tests take many hours to yield a final measurement. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of activated sludge but also to predict its future state. The characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of floc and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation and analysis in the specific context of activated sludge wastewater treatment. In the latter part, additional procedures such as z-stacking and image stitching are introduced for wastewater image preprocessing, which have not previously been used in the context of activated sludge. Different preprocessing and segmentation techniques are proposed, along with a survey of imaging procedures reported in the literature. Finally, the image analysis-based morphological parameters and their correlation with the monitoring and prediction of activated sludge are discussed. It is thus observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
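
    A minimal sketch of the kind of segmentation and morphological-parameter step discussed above (not the chapter's pipeline): Otsu thresholding of a grayscale image followed by simple morphological cleanup and extraction of floc-like region properties. The image here is synthetic and the min_size value is an assumption.

      # Otsu thresholding + morphological cleanup + region properties (illustrative only).
      import numpy as np
      from skimage import filters, morphology, measure

      rng = np.random.default_rng(2)
      image = rng.normal(0.2, 0.05, size=(256, 256))   # dark background (synthetic micrograph)
      image[60:120, 80:160] += 0.5                     # a bright "floc" region

      threshold = filters.threshold_otsu(image)        # segmentation: global Otsu threshold
      binary = image > threshold
      binary = morphology.remove_small_objects(binary, min_size=64)   # cleanup of speckle

      # Morphological parameters (area, centroid) analogous to floc descriptors.
      for region in measure.regionprops(measure.label(binary)):
          print(f"area = {region.area} px, centroid = {region.centroid}")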

  14. Selection of Variables in Cluster Analysis: An Empirical Comparison of Eight Procedures

    ERIC Educational Resources Information Center

    Steinley, Douglas; Brusco, Michael J.

    2008-01-01

    Eight different variable selection techniques for model-based and non-model-based clustering are evaluated across a wide range of cluster structures. It is shown that several methods have difficulties when non-informative variables (i.e., random noise) are included in the model. Furthermore, the distribution of the random noise greatly impacts the…

  15. Image-based change estimation for land cover and land use monitoring

    Treesearch

    Jeremy Webb; C. Kenneth Brewer; Nicholas Daniels; Chris Maderia; Randy Hamilton; Mark Finco; Kevin A. Megown; Andrew J. Lister

    2012-01-01

    The Image-based Change Estimation (ICE) project resulted from the need to provide estimates and information for land cover and land use change over large areas. The procedure uses Forest Inventory and Analysis (FIA) plot locations interpreted using two different dates of imagery from the National Agriculture Imagery Program (NAIP). In order to determine a suitable...

  16. Data-Mining Techniques in Detecting Factors Linked to Academic Achievement

    ERIC Educational Resources Information Center

    Martínez Abad, Fernando; Chaparro Caso López, Alicia A.

    2017-01-01

    In light of the emergence of statistical analysis techniques based on data mining in education sciences, and the potential they offer to detect non-trivial information in large databases, this paper presents a procedure used to detect factors linked to academic achievement in large-scale assessments. The study is based on a non-experimental,…

  17. The effect of implementing cognitive load theory-based design principles in virtual reality simulation training of surgical skills: a randomized controlled trial.

    PubMed

    Andersen, Steven Arild Wuyts; Mikkelsen, Peter Trier; Konge, Lars; Cayé-Thomasen, Per; Sørensen, Mads Sølvsten

    2016-01-01

    Cognitive overload can inhibit learning, and cognitive load theory-based instructional design principles can be used to optimize learning situations. This study aims to investigate the effect of implementing cognitive load theory-based design principles in virtual reality simulation training of mastoidectomy. Eighteen novice medical students received 1 h of self-directed virtual reality simulation training of the mastoidectomy procedure randomized for standard instructions (control) or cognitive load theory-based instructions with a worked example followed by a problem completion exercise (intervention). Participants then completed two post-training virtual procedures for assessment and comparison. Cognitive load during the post-training procedures was estimated by reaction time testing on an integrated secondary task. Final-product analysis by two blinded expert raters was used to assess the virtual mastoidectomy performances. Participants in the intervention group had a significantly increased cognitive load during the post-training procedures compared with the control group (52 vs. 41%, p = 0.02). This was also reflected in the final-product performance: the intervention group had a significantly lower final-product score than the control group (13.0 vs. 15.4, p < 0.005). Initial instruction using worked examples followed by a problem completion exercise did not reduce the cognitive load or improve the performance of the following procedures in novices. Increased cognitive load when part tasks needed to be integrated in the post-training procedures could be a possible explanation for this. Other instructional designs and methods are needed to lower the cognitive load and improve the performance in virtual reality surgical simulation training of novices.

  18. Proceedings of the National Conference on Energy Resource Management. Volume 1: Techniques, Procedures and Data Bases

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O. (Editor); Schiffman, Y. M. (Editor)

    1982-01-01

    Topics dealing with the integration of remotely sensed data with geographic information system for application in energy resources management are discussed. Associated remote sensing and image analysis techniques are also addressed.

  19. 7 CFR 1486.303 - What specific contracting procedures must be adhered to?

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... financial liability for any costs or claims resulting from suits, challenges, or other disputes based on...) Perform some form of fee, price, or cost analysis, such as a comparison of price quotations to market...

  20. Landslide risk models for decision making.

    PubMed

    Bonachea, Jaime; Remondo, Juan; de Terán, José Ramón Díaz; González-Díez, Alberto; Cendrero, Antonio

    2009-11-01

    This contribution presents a quantitative procedure for landslide risk analysis and zoning considering hazard, exposure (or value of elements at risk), and vulnerability. The method provides the means to obtain landslide risk models (expressing expected damage due to landslides on material elements and economic activities in monetary terms, according to different scenarios and periods) useful to identify areas where mitigation efforts will be most cost effective. It allows identifying priority areas for the implementation of actions to reduce vulnerability (elements) or hazard (processes). The procedure proposed can also be used as a preventive tool, through its application to strategic environmental impact analysis (SEIA) of land-use plans. The underlying hypothesis is that reliable predictions about hazard and risk can be made using models based on a detailed analysis of past landslide occurrences in connection with conditioning factors and data on past damage. The results show that the approach proposed and the hypothesis formulated are essentially correct, providing estimates of the order of magnitude of expected losses for a given time period. Uncertainties, strengths, and shortcomings of the procedure and results obtained are discussed and potential lines of research to improve the models are indicated. Finally, comments and suggestions are provided to generalize this type of analysis.
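
    As a schematic sketch only: the abstract describes risk as a combination of hazard, exposure (value of elements at risk) and vulnerability. One common way to express this, assumed here purely for illustration and not taken from the authors' model, is expected monetary loss per zone as the product of the three components. All zone names and numbers below are invented.

      # Schematic risk calculation: risk = hazard probability * exposed value * vulnerability.
      scenarios = {"zone_A": 0.02, "zone_B": 0.005}      # annual landslide probability (illustrative)
      exposure = {"zone_A": 1.2e6, "zone_B": 4.0e6}      # value of elements at risk, EUR (illustrative)
      vulnerability = {"zone_A": 0.30, "zone_B": 0.10}   # expected fraction of value lost (illustrative)

      for zone in scenarios:
          risk = scenarios[zone] * exposure[zone] * vulnerability[zone]
          print(f"{zone}: expected annual loss = {risk:,.0f} EUR")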

  1. Stress Analysis for the Formation of En Echelon Veins and Vortex Structures: a Lesson Plan with a Brief Illumination

    NASA Astrophysics Data System (ADS)

    Zeng, Z.; Birnbaum, S.

    2006-12-01

    An English lesson plan exploring stress analysis of En Echelon veins and vortex structures, used in the bilingual course in Structural Geology at the National Science Training Base of China, is described. Two mechanical models are introduced in class and both mathematical and mechanical analyses are conducted. Samples, pictures and case studies are selected from Britain, Switzerland, and China. These case studies are drawn from the first author's previous research results. Students are guided through the entire thought process, including the methods and procedures used in the stress analysis of geologic structures. The teaching procedures are also illustrated. The method shown is effective in helping students gain initial knowledge of quantitative analysis of the formation of geological structures. This work is supported by the Ministry of Education of China, the Education Bureau of Hubei Province of China and China University of Geosciences (Wuhan).

  2. Flood analysis in mixed-urban areas reflecting interactions with the complete water cycle through coupled hydrologic-hydraulic modelling.

    PubMed

    Sto Domingo, N D; Refsgaard, A; Mark, O; Paludan, B

    2010-01-01

    The potential devastating effects of urban flooding have given high importance to thorough understanding and management of water movement within catchments, and computer modelling tools have found widespread use for this purpose. The state-of-the-art in urban flood modelling is the use of a coupled 1D pipe and 2D overland flow model to simultaneously represent pipe and surface flows. This method has been found to be accurate for highly paved areas, but inappropriate when land hydrology is important. The objectives of this study are to introduce a new urban flood modelling procedure that is able to reflect system interactions with hydrology, verify that the new procedure operates well, and underline the importance of considering the complete water cycle in urban flood analysis. A physically-based and distributed hydrological model was linked to a drainage network model for urban flood analysis, and the essential components and concepts used were described in this study. The procedure was then applied to a catchment previously modelled with the traditional 1D-2D procedure to determine if the new method performs similarly well. Then, results from applying the new method in a mixed-urban area were analyzed to determine how important hydrologic contributions are to flooding in the area.

  3. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Y; Huang, H; Su, T

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques on cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia of more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting the coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
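
    A minimal sketch (synthetic values, not the study's data) of the ROC step described above: a single heterogeneity index scored against a binary PCI-based ischemia label using scikit-learn; the labels, index values and the Youden-style cutoff are assumptions for illustration.

      # ROC/AUC analysis of a toy heterogeneity index vs a binary ischemia label.
      import numpy as np
      from sklearn.metrics import roc_curve, roc_auc_score

      rng = np.random.default_rng(3)
      ischemia = rng.integers(0, 2, size=200)                    # 1 = >70% stenosis (toy label)
      heterogeneity = ischemia * 0.8 + rng.normal(size=200)      # toy texture index

      auc = roc_auc_score(ischemia, heterogeneity)
      fpr, tpr, thresholds = roc_curve(ischemia, heterogeneity)
      best = np.argmax(tpr - fpr)                                # Youden's J to pick a cutoff
      print(f"AUC = {auc:.2f}, sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")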

  4. Recognition of surface lithologic and topographic patterns in southwest Colorado with ADP techniques

    NASA Technical Reports Server (NTRS)

    Melhorn, W. N.; Sinnock, S.

    1973-01-01

    Analysis of ERTS-1 multispectral data by automatic pattern recognition procedures is applicable toward grappling with current and future resource stresses by providing a means for refining existing geologic maps. The procedures used in the current analysis already yield encouraging results toward the eventual machine recognition of extensive surface lithologic and topographic patterns. Automatic mapping of a series of hogbacks, strike valleys, and alluvial surfaces along the northwest flank of the San Juan Basin in Colorado can be obtained by minimal man-machine interaction. The determination of causes for separable spectral signatures is dependent upon extensive correlation of micro- and macro field based ground truth observations and aircraft underflight data with the satellite data.

  5. High frequency vibration analysis by the complex envelope vectorization.

    PubMed

    Giannini, O; Carcaterra, A; Sestieri, A

    2007-06-01

    The complex envelope displacement analysis (CEDA) is a procedure to solve high frequency vibration and vibro-acoustic problems, providing the envelope of the physical solution. CEDA is based on a variable transformation mapping the high frequency oscillations into signals of low frequency content and has been successfully applied to one-dimensional systems. However, the extension to plates and vibro-acoustic fields met serious difficulties, so a general revision of the theory was carried out, leading finally to a new method, the complex envelope vectorization (CEV). In this paper the CEV method is described, highlighting the merits and limits of the procedure, and a set of applications to vibration and vibro-acoustic problems of increasing complexity is presented.

  6. POWER-ENHANCED MULTIPLE DECISION FUNCTIONS CONTROLLING FAMILY-WISE ERROR AND FALSE DISCOVERY RATES.

    PubMed

    Peña, Edsel A; Habiger, Joshua D; Wu, Wensong

    2011-02-01

    Improved procedures, in terms of smaller missed discovery rates (MDR), for performing multiple hypotheses testing with weak and strong control of the family-wise error rate (FWER) or the false discovery rate (FDR) are developed and studied. The improvement over existing procedures such as the Šidák procedure for FWER control and the Benjamini-Hochberg (BH) procedure for FDR control is achieved by exploiting possible differences in the powers of the individual tests. Results signal the need to take into account the powers of the individual tests and to have multiple hypotheses decision functions which are not limited to simply using the individual p-values, as is the case, for example, with the Šidák, Bonferroni, or BH procedures. They also enhance understanding of the role of the powers of individual tests, or more precisely the receiver operating characteristic (ROC) functions of decision processes, in the search for better multiple hypotheses testing procedures. A decision-theoretic framework is utilized, and through auxiliary randomizers the procedures could be used with discrete or mixed-type data or with rank-based nonparametric tests. This is in contrast to existing p-value based procedures whose theoretical validity is contingent on each of these p-value statistics being stochastically equal to or greater than a standard uniform variable under the null hypothesis. Proposed procedures are relevant in the analysis of high-dimensional "large M, small n" data sets arising in the natural, physical, medical, economic and social sciences, whose generation and creation is accelerated by advances in high-throughput technology, notably, but not limited to, microarray technology.
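
    For orientation, a sketch of the standard Benjamini-Hochberg FDR procedure, one of the baselines the paper improves upon; the power-enhanced decision functions themselves are not reproduced here, and the simulated p-values below are purely illustrative.

      # Standard Benjamini-Hochberg step-up procedure on simulated p-values (baseline only).
      import numpy as np

      def benjamini_hochberg(pvalues, alpha=0.05):
          """Return a boolean array of rejections at FDR level alpha."""
          p = np.asarray(pvalues)
          m = p.size
          order = np.argsort(p)
          thresholds = alpha * np.arange(1, m + 1) / m
          below = p[order] <= thresholds
          reject = np.zeros(m, dtype=bool)
          if below.any():
              k = np.nonzero(below)[0].max()        # largest index meeting the step-up criterion
              reject[order[: k + 1]] = True
          return reject

      rng = np.random.default_rng(4)
      pvals = np.concatenate([rng.uniform(size=90), rng.uniform(0, 0.01, size=10)])  # 10 true signals
      print("Rejections:", benjamini_hochberg(pvals).sum())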

  7. Isolation of Microarray-Grade Total RNA, MicroRNA, and DNA from a Single PAXgene Blood RNA Tube

    PubMed Central

    Kruhøffer, Mogens; Dyrskjøt, Lars; Voss, Thorsten; Lindberg, Raija L.P.; Wyrich, Ralf; Thykjaer, Thomas; Orntoft, Torben F.

    2007-01-01

    We have developed a procedure for isolation of microRNA and genomic DNA in addition to total RNA from whole blood stabilized in PAXgene Blood RNA tubes. The procedure is based on automatic extraction on a BioRobot MDx and includes isolation of DNA from a fraction of the stabilized blood and recovery of small RNA species that are otherwise lost. The procedure presented here is suitable for large-scale experiments and is amenable to further automation. Procured total RNA and DNA were tested using Affymetrix Expression and single-nucleotide polymorphism GeneChips, respectively, and isolated microRNA was tested using spotted locked nucleic acid-based microarrays. We conclude that the yield and quality of total RNA, microRNA, and DNA from a single PAXgene blood RNA tube are sufficient for downstream microarray analysis. PMID:17690207

  8. The (Un)Certainty of Selectivity in Liquid Chromatography Tandem Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Berendsen, Bjorn J. A.; Stolker, Linda A. M.; Nielen, Michel W. F.

    2013-01-01

    We developed a procedure to determine the "identification power" of an LC-MS/MS method operated in the MRM acquisition mode, which is related to its selectivity. The probability of any compound showing the same precursor ion, product ions, and retention time as the compound of interest is used as a measure of selectivity. This is calculated based upon empirical models constructed from three very large compound databases. Based upon the final probability estimation, additional measures to assure unambiguous identification can be taken, like the selection of different or additional product ions. The reported procedure in combination with criteria for relative ion abundances results in a powerful technique to determine the (un)certainty of the selectivity of any LC-MS/MS analysis and thus the risk of false positive results. Furthermore, the procedure is very useful as a tool to validate method selectivity.

  9. Global Warming Estimation From Microwave Sounding Unit

    NASA Technical Reports Server (NTRS)

    Prabhakara, C.; Iacovazzi, R., Jr.; Yoo, J.-M.; Dalu, G.

    1998-01-01

    Microwave Sounding Unit (MSU) Ch 2 data sets, collected from sequential, polar-orbiting, Sun-synchronous National Oceanic and Atmospheric Administration operational satellites, contain systematic calibration errors that are coupled to the diurnal temperature cycle over the globe. Since these coupled errors in MSU data differ between successive satellites, it is necessary to make compensatory adjustments to these multisatellite data sets in order to determine long-term global temperature change. With the aid of the observations during overlapping periods of successive satellites, we can determine such adjustments and use them to account for the coupled errors in the long-term time series of MSU Ch 2 global temperature. In turn, these adjusted MSU Ch 2 data sets can be used to yield the global temperature trend. In a pioneering study, Spencer and Christy (SC) (1990) developed a procedure to derive the global temperature trend from MSU Ch 2 data. In the SC procedure, the magnitude of the coupled errors is not determined explicitly; instead, based on some assumptions, these coupled errors are eliminated in three separate steps. Such a procedure can leave unaccounted residual errors in the time series of the temperature anomalies deduced by SC, which could lead to a spurious long-term temperature trend derived from their analysis. In the present study, we have developed a method that avoids the shortcomings of the SC procedure. Based on our analysis, we find there is a global warming of 0.23+/-0.12 K between 1980 and 1991. Also, in this study, the time series of global temperature anomalies constructed by removing the global mean annual temperature cycle compares favorably with a similar time series obtained from conventional observations of temperature.

  10. New approach to measure soil particulate organic matter in intact samples using X-ray computed micro-tomography

    NASA Astrophysics Data System (ADS)

    Kravchenko, Alexandra; Negassa, Wakene; Guber, Andrey; Schmidt, Sonja

    2014-05-01

    Particulate soil organic matter (POM) is a biologically and chemically active fraction of soil organic matter. It is a source of many agricultural and ecological benefits, among which is POM's contribution to C sequestration. Most conventional research methods for studying organic matter dynamics involve measurements conducted on pre-processed, i.e., ground and sieved, soil samples. Unfortunately, grinding and sieving completely destroy soil structure, the component crucial for soil functioning and C protection. The importance of a better understanding of the role of soil structure, and of the physical protection that it provides to soil C, cannot be overstated; and analysis of the quantities, characteristics, and decomposition rates of POM in soil samples with intact structure is among the key elements of gaining such understanding. However, a marked difficulty hindering progress in such analyses is the lack of tools for identification and quantitative analysis of POM in intact soil samples. Recent advancement in the application of X-ray computed micro-tomography (μ-CT) to soil science has provided an opportunity to conduct such analyses. The objective of the current study is to develop a procedure for identification and quantitative characterization of POM within intact soil samples using X-ray μ-CT images and to test the performance of the proposed procedure on a set of intact soil macro-aggregates. We used 16 4-6 mm soil aggregates collected at 0-15 cm depth from a Typic Hapludalf soil at multiple field sites with diverse agricultural management history. The aggregates were scanned at the SIMBIOS Centre, Dundee, Scotland at 10 micron resolution. POM was determined from the aggregate images using the developed procedure. The procedure was based on combining image pre-processing steps with discriminant analysis classification. The first component of the procedure consisted of image pre-processing steps based on the range of gray values (GV) along with the shape and size of POM pieces. This was followed by discriminant analysis conducted using statistical and geostatistical characteristics of the POM pieces. POM identified in the intact individual soil aggregates using the proposed procedure was in good agreement with POM measured in the studied aggregates using a conventional lab method (R2=0.75). Of particular importance for accurate identification of POM in the images was the information on the spatial characteristics of POM's GVs. Since this is the first attempt at such POM determination, future work will be needed to explore how the proposed procedure performs under a variety of potentially influential factors, such as POM's origin and decomposition stage, X-ray scanning settings, and image filtering and segmentation methods.
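
    A sketch of the classification component only (not the authors' pipeline): a linear discriminant analysis separating candidate "POM" pieces from other voxel clusters using a few summary features. The feature names (mean gray value, gray-value variance, piece size) and all numbers below are assumed placeholders.

      # Linear discriminant classification of candidate pieces (illustrative placeholder data).
      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(7)
      # Features per candidate piece: mean gray value, gray-value variance, size (voxels).
      pom = np.column_stack([rng.normal(90, 5, 50), rng.normal(40, 8, 50), rng.normal(800, 100, 50)])
      other = np.column_stack([rng.normal(120, 5, 50), rng.normal(15, 5, 50), rng.normal(300, 80, 50)])
      X = np.vstack([pom, other])
      y = np.array([1] * 50 + [0] * 50)                 # 1 = POM, 0 = non-POM

      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(f"Training accuracy of the discriminant classifier: {lda.score(X, y):.2f}")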

  11. Energy Navigation: Simulation Evaluation and Benefit Analysis

    NASA Technical Reports Server (NTRS)

    Williams, David H.; Oseguera-Lohr, Rosa M.; Lewis, Elliot T.

    2011-01-01

    This paper presents results from two simulation studies investigating the use of advanced flight-deck-based energy navigation (ENAV) and conventional transport-category vertical navigation (VNAV) for conducting a descent through a busy terminal area, using Continuous Descent Arrival (CDA) procedures. This research was part of the Low Noise Flight Procedures (LNFP) element within the Quiet Aircraft Technology (QAT) Project, and the subsequent Airspace Super Density Operations (ASDO) research focus area of the Airspace Project. A piloted simulation study addressed development of flight guidance, and supporting pilot and Air Traffic Control (ATC) procedures for high density terminal operations. The procedures and charts were designed to be easy to understand, and to make it easy for the crew to make changes via the Flight Management Computer Control-Display Unit (FMC-CDU) to accommodate changes from ATC.

  12. Bioprocessing feasibility analysis. [thymic hormone bioassay and electrophoresis

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The biology and pathophysiology of the thymus gland are discussed and a clinical procedure for thymic hormone assay is described. The separation of null lymphocytes from mouse spleens and the functional characteristics of the cells after storage and transportation were investigated to develop a clinical procedure for thymic hormone assay, and to determine whether a ground-based approach will provide the desired end-product in sufficient quantities, or whether the microgravity of space should be exploited for more economical preparation of the hormone.

  13. Environmental Risk Assessment of dredging processes - application to Marin harbour (NW Spain)

    NASA Astrophysics Data System (ADS)

    Gómez, A. G.; García Alba, J.; Puente, A.; Juanes, J. A.

    2014-04-01

    A methodological procedure to estimate the environmental risk of dredging operations in aquatic systems has been developed. Environmental risk estimates are based on numerical model results, which provide an appropriate spatio-temporal framework of analysis to guarantee an effective decision-making process. The methodological procedure has been applied to a real dredging operation in the port of Marin (NW Spain). Results from Marin harbour confirmed the suitability of the developed methodology and the conceptual approaches as a comprehensive and practical management tool.

  14. Workload Trend Analysis for the Military Graduate Medical Education Program in San Antonio

    DTIC Science & Technology

    2005-05-25

    [Abstract not available: the record consists of table-of-contents and figure-list fragments. Recoverable topics include the distribution of major vascular procedures and craniotomies by age group at WHMC and BAMC for FY 2000-2004, and a comparison of average craniotomy counts against the RRC requirement.]

  15. Investigation of lightweight designs and materials for LO2 and LH2 propellant tanks for space vehicles, phase 2 and phase 3

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Full size Tug LO2 and LH2 tank configurations were defined, based on selected tank geometries. These configurations were then locally modeled for computer stress analysis. A large subscale test tank, representing the selected Tug LO2 tank, was designed and analyzed. This tank was fabricated using procedures which represented production operations. An evaluation test program was outlined and a test procedure defined. The necessary test hardware was also fabricated.

  16. Automating approximate Bayesian computation by local linear regression.

    PubMed

    Thornton, Kevin R

    2009-07-07

    In several biological contexts, parameter inference often relies on computationally-intensive techniques. "Approximate Bayesian Computation", or ABC, methods based on summary statistics have become increasingly popular. A particular flavor of ABC based on using a linear regression to approximate the posterior distribution of the parameters, conditional on the summary statistics, is computationally appealing, yet no standalone tool exists to automate the procedure. Here, I describe a program to implement the method. The software package ABCreg implements the local linear-regression approach to ABC. The advantages are: 1. The code is standalone, and fully-documented. 2. The program will automatically process multiple data sets, and create unique output files for each (which may be processed immediately in R), facilitating the testing of inference procedures on simulated data, or the analysis of multiple data sets. 3. The program implements two different transformation methods for the regression step. 4. Analysis options are controlled on the command line by the user, and the program is designed to output warnings for cases where the regression fails. 5. The program does not depend on any particular simulation machinery (coalescent, forward-time, etc.), and therefore is a general tool for processing the results from any simulation. 6. The code is open-source, and modular. Examples of applying the software to empirical data from Drosophila melanogaster, and testing the procedure on simulated data, are shown. In practice, ABCreg simplifies implementing ABC based on local-linear regression.
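
    To illustrate the general local linear-regression ABC scheme that ABCreg automates (a Beaumont-style rejection step followed by a regression adjustment; this is a sketch, not the ABCreg code), on a toy model where the 1% acceptance rate, the prior, and the summary statistic are all assumptions:

      # Rejection ABC followed by a local linear-regression adjustment (toy example).
      import numpy as np

      rng = np.random.default_rng(5)
      theta_sim = rng.uniform(0, 10, size=20000)                       # draws from a toy prior
      s_sim = theta_sim + rng.normal(scale=1.0, size=theta_sim.size)   # summary statistic per draw
      s_obs = 4.2                                                      # "observed" summary (toy)

      # Rejection step: keep the simulations whose summaries are closest to the observation.
      dist = np.abs(s_sim - s_obs)
      keep = dist <= np.quantile(dist, 0.01)                           # accept the closest 1%

      # Regression step: fit theta ~ s among accepted draws, then shift each accepted theta
      # to the value it would have had if its summary equalled the observed one.
      slope, intercept = np.polyfit(s_sim[keep], theta_sim[keep], 1)
      theta_adj = theta_sim[keep] + slope * (s_obs - s_sim[keep])

      print(f"Posterior mean = {theta_adj.mean():.2f} (rejection-only: {theta_sim[keep].mean():.2f})")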

  17. Diagnostic flexible pharyngo-laryngoscopy: development of a procedure specific assessment tool using a Delphi methodology.

    PubMed

    Melchiors, Jacob; Henriksen, Mikael Johannes Vuokko; Dikkers, Frederik G; Gavilán, Javier; Noordzij, J Pieter; Fried, Marvin P; Novakovic, Daniel; Fagan, Johannes; Charabi, Birgitte W; Konge, Lars; von Buchwald, Christian

    2018-05-01

    Proper training and assessment of skill in flexible pharyngo-laryngoscopy are central in the education of otorhinolaryngologists. To facilitate an evidence-based approach to curriculum development in this field, a structured analysis of what constitutes flexible pharyngo-laryngoscopy is necessary. Our aim was to develop an assessment tool based on this analysis. We conducted an international Delphi study involving experts from twelve countries in five continents. Utilizing reiterative assessment, the panel defined the procedure and reached consensus (defined as 80% agreement) on the phrasing of an assessment tool. Fifty panelists completed the Delphi process. The median age of the panelists was 44 years (range 33-64 years). Median experience in otorhinolaryngology was 15 years (range 6-35 years). Twenty-five were specialized in laryngology, 16 were head and neck surgeons, and nine were general otorhinolaryngologists. An assessment tool was created consisting of twelve distinct items. Conclusion: The gathering of validity evidence for assessment of core procedural skills within otorhinolaryngology is central to the development of a competence-based education. The use of an international Delphi panel allows for the creation of an assessment tool which is widely applicable and valid. This work allows for an informed approach to technical skills training for flexible pharyngo-laryngoscopy and, as further validity evidence is gathered, allows for a valid assessment of clinical performance within this important skillset.

  18. The presence-absence coliform test for monitoring drinking water quality.

    PubMed Central

    Rice, E W; Geldreich, E E; Read, E J

    1989-01-01

    The concern for improved monitoring of the sanitary quality of drinking water has prompted interest in alternative methods for the detection of total coliform bacteria. A simplified qualitative presence-absence test has been proposed as an alternate procedure for detecting coliform bacteria in potable water. In this paper data from four comparative studies were analyzed to compare the recovery of total coliform bacteria from drinking water using the presence-absence test, the multiple fermentation tube procedure, and the membrane filter technique. The four studies were of water samples taken from four different geographic areas of the United States: Hawaii, New England (Vermont and New Hampshire), Oregon, and Pennsylvania. The results of these studies were compared based upon the number of positive samples detected by each method. Combined recoveries showed the presence-absence test detected significantly higher numbers of samples with coliforms than either the fermentation tube or membrane filter methods, P less than 0.01. The fermentation tube procedure detected significantly more positive samples than the membrane filter technique, P less than 0.01. Based upon the analysis of the combined data base, it is clear that the presence-absence test is as sensitive as the current coliform methods for the examination of potable water. The presence-absence test offers a viable alternative to water utility companies that elect to use the frequency-of-occurrence approach for compliance monitoring. PMID:2493663

  19. A simplified method in comparison with comprehensive interaction incremental dynamic analysis to assess seismic performance of jacket-type offshore platforms

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Ajamy, A.; Asgarian, B.

    2015-12-01

    The primary goal of seismic reassessment procedures in oil platform codes is to determine the reliability of a platform under extreme earthquake loading. Therefore, in this paper, a simplified method is proposed to assess the seismic performance of existing jacket-type offshore platforms (JTOP) in regimes ranging from near-elastic to global collapse. The simplified method exploits the good agreement between the static pushover (SPO) curve and the summarized comprehensive interaction incremental dynamic analysis (CI-IDA) curve of the platform. Although the CI-IDA method offers better understanding and better modelling of the phenomenon, it is a time-consuming and challenging task. To overcome these challenges, the simplified procedure, a fast and accurate approach, is introduced based on SPO analysis. An existing JTOP in the Persian Gulf is then presented to illustrate the procedure, and finally a comparison is made between the simplified method and the CI-IDA results. The simplified method is very informative and practical for current engineering purposes. It is able to predict seismic performance from elasticity to global dynamic instability with reasonable accuracy and little computational effort.

  20. Rapid Measurement of Soil Carbon in Rice Paddy Field of Lombok Island Indonesia Using Near Infrared Technology

    NASA Astrophysics Data System (ADS)

    Kusumo, B. H.; Sukartono, S.; Bustan, B.

    2018-02-01

    Measuring soil organic carbon (C) using conventional analysis is a tedious, time-consuming and expensive procedure; a simpler procedure that is cheap and saves time is needed. Near infrared technology offers such a rapid procedure, as it works on the basis of soil spectral reflectance and without any chemicals. The aim of this research is to test whether this technology is able to rapidly measure soil organic C in rice paddy fields. Soil samples were collected from rice paddy fields of Lombok Island, Indonesia, and the coordinates of the samples were recorded. Parts of the samples were analysed using conventional analysis (Walkley and Black) and other parts were scanned using near infrared spectroscopy (NIRS) for soil spectral collection. Partial Least Square Regression (PLSR) models were developed using data on soil C analysed by conventional analysis and data from soil spectral reflectance. The models were moderately successful in measuring soil C in the rice paddy fields of Lombok Island. This shows that NIR technology can be further used to monitor C change in rice paddy soil.
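
    A minimal sketch of the PLSR calibration step described above, using scikit-learn on synthetic spectra; the number of wavelengths, the component count and the data are illustrative assumptions, not the study's measurements.

      # PLSR calibration of a "soil C" target against synthetic NIR spectra, with cross-validation.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(6)
      n_samples, n_bands = 120, 200
      spectra = rng.normal(size=(n_samples, n_bands))            # NIR reflectance (synthetic)
      soil_c = 0.8 * spectra[:, 50] + 0.5 * spectra[:, 120] + rng.normal(scale=0.2, size=n_samples)

      pls = PLSRegression(n_components=8)
      predicted = cross_val_predict(pls, spectra, soil_c, cv=10).ravel()

      r2 = 1 - np.sum((soil_c - predicted) ** 2) / np.sum((soil_c - soil_c.mean()) ** 2)
      rmse = np.sqrt(np.mean((soil_c - predicted) ** 2))
      print(f"Cross-validated R^2 = {r2:.2f}, RMSEP = {rmse:.2f}")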

  1. Solvent-free MALDI-MS for the analysis of a membrane protein via the mini ball mill approach: case study of bacteriorhodopsin.

    PubMed

    Trimpin, Sarah; Deinzer, Max L

    2007-01-01

    A mini ball mill (MBM) solvent-free matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS) method allows for the analysis of bacteriorhodopsin (BR), an integral membrane protein that previously presented special analytical problems. For well-defined signals in the molecular ion region of the analytes, a desalting procedure of the MBM sample directly on the MALDI target plate was used to reduce adduction by sodium and other cations that are normally attendant with hydrophobic peptides and proteins as a result of the sample preparation procedure. Mass analysis of the intact hydrophobic protein and the few hydrophobic and hydrophilic tryptic peptides available in the digest is demonstrated with this robust new approach. MS and MS/MS spectra of BR tryptic peptides and intact protein were generally superior to the traditional solvent-based method using the desalted "dry" MALDI preparation procedure. The solvent-free method expands the range of peptides that can be effectively analyzed by MALDI-MS to those that are hydrophobic and solubility-limited.

  2. Statistical analysis of the calibration procedure for personnel radiation measurement instruments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, W.J.; Bengston, S.J.; Kalbeitzer, F.L.

    1980-11-01

    Thermoluminescent analyzer (TLA) calibration procedures were used to estimate personnel radiation exposure levels at the Idaho National Engineering Laboratory (INEL). A statistical analysis is presented herein based on data collected over a six month period in 1979 on four TLAs located in the Department of Energy (DOE) Radiological and Environmental Sciences Laboratory at the INEL. The data were collected according to the day-to-day procedure in effect at that time. Both gamma and beta radiation models are developed. Observed TLA readings of thermoluminescent dosimeters are correlated with known radiation levels. This correlation is then used to predict unknown radiation doses from future analyzer readings of personnel thermoluminescent dosimeters. The statistical techniques applied in this analysis include weighted linear regression, estimation of systematic and random error variances, prediction interval estimation using Scheffe's theory of calibration, the estimation of the ratio of the means of two normal bivariate distributed random variables and their corresponding confidence limits according to Kendall and Stuart, tests of normality, experimental design, a comparison between instruments, and quality control.
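
    A sketch of one element of such an analysis, a weighted linear regression relating analyzer readings to known delivered doses, on invented numbers; the weighting scheme (inverse-dose) is an assumption, and the report's prediction-interval and variance-component machinery is not reproduced here.

      # Weighted linear calibration of dose against analyzer reading (illustrative values only).
      import numpy as np

      known_dose = np.array([10.0, 20.0, 50.0, 100.0, 200.0])     # delivered dose, illustrative units
      tla_reading = np.array([11.2, 19.5, 52.3, 97.8, 204.1])     # analyzer response, illustrative
      weights = 1.0 / known_dose                                  # down-weight high-dose points (assumed)

      # np.polyfit applies w to the unsquared residuals, so pass sqrt of the desired weights.
      slope, intercept = np.polyfit(tla_reading, known_dose, 1, w=np.sqrt(weights))
      print(f"Calibration: dose = {slope:.3f} * reading + {intercept:.2f}")
      print(f"Predicted dose for a reading of 75: {slope * 75 + intercept:.1f}")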

  3. Effects of laser-aided circumferential supracrestal fiberotomy on root surfaces.

    PubMed

    Lee, Ji-Won; Park, Ki-Ho; Chung, Jong-Hyuk; Kim, Su-Jung

    2011-11-01

    To evaluate and compare the effects of circumferential supracrestal fiberotomy in vivo (using diode, CO(2), and Er∶YAG lasers) on the morphology and chemical composition of the root surface. Forty healthy premolar teeth, intended for extraction for orthodontic reasons, were used in this study. Root surfaces were treated using different laser methods, as follows: (1) control; (2) Er∶YAG laser (2.94 µm, 100 mJ, 10 Hz); (3) diode laser (808 nm, 1.2 W, continuous wave); and (4) CO(2) laser (10.6 µm, 3 W, continuous wave). Subsequently, the teeth were removed and subjected to scanning electron microscopic (SEM) examination and energy dispersive x-ray (EDX) spectrometric analysis. SEM analysis indicated that no thermal changes, including melting or carbonization, were observed following the lasing procedures. EDX analysis showed that the laser procedures resulted in similar mineral contents (weight % of calcium and phosphate) as compared to those in the control group. Based on these findings, we concluded that laser-aided procedures, when used at appropriate laser settings, preserve the original morphology and chemical composition of cementum.

  4. Training shelter volunteers to teach dog compliance.

    PubMed

    Howard, Veronica J; DiGennaro Reed, Florence D

    2014-01-01

    This study examined the degree to which training procedures influenced the integrity of behaviorally based dog training implemented by volunteers of an animal shelter. Volunteers were taught to implement discrete-trial obedience training to teach 2 skills (sit and wait) to dogs. Procedural integrity during the baseline and written instructions conditions was low across all participants. Although performance increased with use of a video model, integrity did not reach criterion levels until performance feedback and modeling were provided. Moreover, the integrity of the discrete-trial training procedure was significantly and positively correlated with dog compliance to instructions for all dyads. Correct implementation and compliance were observed when participants were paired with a novel dog and trainer, respectively, although generalization of procedural integrity from the discrete-trial sit procedure to the discrete-trial wait procedure was not observed. Shelter consumers rated the behavior change in dogs and trainers as socially significant. Implications of these findings and future directions for research are discussed. © Society for the Experimental Analysis of Behavior.

  5. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York - July 2001 Through June 2003

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2009-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's Lab Master data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality control samples analyzed from July 2001 through June 2003. Results for the quality-control samples for 19 analytical procedures were evaluated for bias and precision. Control charts indicate that data for six of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, chloride, magnesium, nitrate (ion chromatography), potassium, and sodium. The calcium procedure was biased throughout the analysis period for the high-concentration sample, but was within control limits. The total monomeric aluminum and fluoride procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The total aluminum, pH, specific conductance, and sulfate procedures were biased for the high-concentration and low-concentration samples, but were within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 16 of 18 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for the dissolved organic carbon or specific conductance procedures. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 18 of the 21 analytes. At least 90 percent of the samples met data-quality objectives for all procedures except total monomeric aluminum (83 percent of samples met objectives), total aluminum (76 percent of samples met objectives), ammonium (73 percent of samples met objectives), dissolved organic carbon (86 percent of samples met objectives), and nitrate (81 percent of samples met objectives). The data-quality objective was not met for the nitrite procedure. Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated satisfactory or above data quality over the time period, with most performance ratings for each sample in the good-to-excellent range. The N-sample (nutrient constituents) analysis had one unsatisfactory rating for the ammonium procedure in one study. The T-sample (trace constituents) analysis had one unsatisfactory rating for the magnesium procedure and one marginal rating for the potassium procedure in one study and one unsatisfactory rating for the sodium procedure in another. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 90 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were acid-neutralizing capacity, ammonium, dissolved organic carbon, and sodium. 
Data-quality objectives were not met in 37 percent of samples analyzed for acid-neutralizing capacity, 28 percent of samples analyzed for dissolved organic carbon, and 30 percent of samples analyzed for sodium. Results indicate a positive bias for the ammonium procedure in one study and a negative bias in another. Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 90 percent of the samples analyzed for calcium, chloride, magnesium, pH, potassium, and sodium. Data-quality objectives were met by 78 percent of

  6. A review of accuracy assessment for object-based image analysis: From per-pixel to per-polygon approaches

    NASA Astrophysics Data System (ADS)

    Ye, Su; Pontius, Robert Gilmore; Rakshit, Rahul

    2018-07-01

    Object-based image analysis (OBIA) has gained widespread popularity for creating maps from remotely sensed data. Researchers routinely claim that OBIA procedures outperform pixel-based procedures; however, it is not immediately obvious how to evaluate the degree to which an OBIA map compares to reference information in a manner that accounts for the fact that the OBIA map consists of objects that vary in size and shape. Our study reviews 209 journal articles concerning OBIA published between 2003 and 2017. We focus on the three stages of accuracy assessment: (1) sampling design, (2) response design and (3) accuracy analysis. First, we report the literature's overall characteristics concerning OBIA accuracy assessment. Simple random sampling was the most used method among probability sampling strategies, slightly more than stratified sampling. Office interpreted remotely sensed data was the dominant reference source. The literature reported accuracies ranging from 42% to 96%, with an average of 85%. A third of the articles failed to give sufficient information concerning accuracy methodology such as sampling scheme and sample size. We found few studies that focused specifically on the accuracy of the segmentation. Second, we identify a recent increase of OBIA articles in using per-polygon approaches compared to per-pixel approaches for accuracy assessment. We clarify the impacts of the per-pixel versus the per-polygon approaches respectively on sampling, response design and accuracy analysis. Our review defines the technical and methodological needs in the current per-polygon approaches, such as polygon-based sampling, analysis of mixed polygons, matching of mapped with reference polygons and assessment of segmentation accuracy. Our review summarizes and discusses the current issues in object-based accuracy assessment to provide guidance for improved accuracy assessments for OBIA.

  7. Percutaneous adhesiolysis procedures in the medicare population: analysis of utilization and growth patterns from 2000 to 2011.

    PubMed

    Manchikanti, Laxmaiah; Helm Ii, Standiford; Pampati, Vidyasagar; Racz, Gabor B

    2014-01-01

    Multiple reviews have shown that interventional techniques for chronic pain have increased dramatically over the years. Of these interventional techniques, both sacroiliac joint injections and facet joint interventions showed explosive growth, followed by epidural procedures. Percutaneous adhesiolysis procedures have not been assessed for their utilization patterns separately from epidural injections. This study analyzes the utilization patterns of percutaneous adhesiolysis procedures in managing chronic low back pain in the Medicare population from 2000 to 2011, with the objective of assessing their utilization and growth patterns. The study was performed utilizing the Centers for Medicare and Medicaid Services (CMS) Physician Supplier Procedure Summary Master of Fee-For-Service (FFS) Data from 2000 to 2011. Percutaneous adhesiolysis procedures increased 47%, with an annual growth rate of 3.6%, in the FFS Medicare population from 2000 to 2011. These growth rates are significantly lower than the growth rates for sacroiliac joint injections (331%), facet joint interventions (308%), and epidural injections (130%), and substantially lower than those for lumbar transforaminal injections (665%) and lumbar facet joint neurolysis (544%). Study limitations include lack of inclusion of Medicare Advantage patients. In addition, the statewide data are based on claims that may include services provided in contiguous or other states. Percutaneous adhesiolysis utilization increased moderately in Medicare beneficiaries from 2000 to 2011. Overall, there was an increase of 47% in the utilization of adhesiolysis procedures per 100,000 Medicare beneficiaries, with an annual geometric average increase of 3.6%.
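
    The relationship between the reported 47% cumulative increase and the 3.6% annual geometric growth rate can be checked with a few lines of arithmetic; the sketch below assumes eleven year-to-year intervals between 2000 and 2011.

        # Back-of-the-envelope check of the reported growth figures: a 47% cumulative
        # increase spread over 11 annual intervals (2000 to 2011) corresponds to an
        # annual geometric growth rate of about 3.6%.
        cumulative_increase = 0.47   # 47% overall growth, as reported
        n_intervals = 2011 - 2000    # 11 year-to-year intervals

        annual_rate = (1.0 + cumulative_increase) ** (1.0 / n_intervals) - 1.0
        print(f"annual geometric growth rate: {annual_rate:.1%}")  # prints ~3.6%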

  8. Determination of Hypochlorite in Bleaching Products with Flower Extracts to Demonstrate the Principles of Flow Injection Analysis

    ERIC Educational Resources Information Center

    Ramos, Luiz Antonio; Prieto, Katia Roberta; Carvalheiro, Eder Tadeu Gomes; Carvalheiro, Carla Cristina Schmitt

    2005-01-01

    Crude flower extracts were used to demonstrate the principles of analytical chemistry automation through a flow injection analysis (FIA) procedure developed to determine hypochlorite in household bleaching products. FIA comprises a group of techniques based on injection of a liquid sample into a moving, nonsegmented carrier stream of a…

  9. A Functional Analysis of Non-Vocal Verbal Behavior of a Young Child with Autism

    ERIC Educational Resources Information Center

    Normand, M. P.; Severtson, E. S.; Beavers, G. A.

    2008-01-01

    The functions of an American Sign Language response were experimentally evaluated with a young boy diagnosed with autism. A functional analysis procedure based on that reported by Lerman et al. (2005) was used to evaluate whether the target sign response would occur under mand, tact, mimetic, or control conditions. The target sign was observed…

  10. Analysis of swainsonine and swainsonine N-oxide as trimethylsilyl derivatives by Liquid Chromatography-Mass Spectrometry and their relative occurrence in plants toxic to livestock

    USDA-ARS?s Scientific Manuscript database

    A liquid chromatography-mass spectrometry method was developed for the analysis of the indolizidine alkaloid swainsonine and its N-oxide. The method is based on a one step solvent partitioning extraction procedure followed by trimethylsilylation of the dried extract and subsequent detection and qua...

  11. The Games Analysis Intervention: A Procedure to Increase the Peer Acceptance and Social Adjustment of a Retarded Child.

    ERIC Educational Resources Information Center

    Marlowe, Mike

    1979-01-01

    A study investigated the effectiveness of a therapeutic motor development program in increasing the social adjustment and peer acceptance of a mainstreamed 10-year-old educable mentally retarded boy. The motor development program was based on the games analysis model and involved the S and 13 of his normal classmates. (Author/DLS)

  12. 40 CFR 63.5997 - How do I conduct tests and procedures for tire cord production affected sources?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) to confirm the reported HAP content. If the results of an analysis by EPA Method 311 are different... determinations. (2) Unless you demonstrate otherwise, the HAP content analysis must be based on coatings prior to...) Methods to determine the mass percent of each HAP in coatings. (1) To determine the HAP content in the...

  13. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
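
    The following sketch illustrates the core idea of scaling a record to match an SDF deformation target, using a linear-elastic single-degree-of-freedom system integrated with the Newmark average-acceleration method as a stand-in for the inelastic first-'mode' SDF system of the actual MPS procedure. The synthetic record, period, damping, and target deformation are all assumptions for illustration, not values from the study.

        import numpy as np

        def peak_sdf_deformation(accel, dt, period, damping=0.05):
            """Peak deformation of a linear-elastic SDF system (unit mass) under a
            ground acceleration history, via Newmark average acceleration."""
            w = 2.0 * np.pi / period
            k, c, m = w**2, 2.0 * damping * w, 1.0
            beta, gamma = 0.25, 0.5
            keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
            u, v, a = 0.0, 0.0, -accel[0]
            umax = 0.0
            for ag in accel[1:]:
                p = (-m * ag
                     + m * (u / (beta * dt**2) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
                     + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                            + dt * (gamma / (2 * beta) - 1) * a))
                u_new = p / keff
                v_new = (gamma / (beta * dt) * (u_new - u) + (1 - gamma / beta) * v
                         + dt * (1 - gamma / (2 * beta)) * a)
                a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
                u, v, a = u_new, v_new, a_new
                umax = max(umax, abs(u))
            return umax

        def scale_factor_for_target(accel, dt, period, target_deformation):
            """Scale factor so the SDF peak deformation equals the target. For the
            linear system used here the relation is proportional; the actual MPS
            procedure iterates on an inelastic SDF system instead."""
            return target_deformation / peak_sdf_deformation(accel, dt, period)

        # Illustrative use with a synthetic record (not a real ground motion).
        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        record = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.15 * t)
        print(scale_factor_for_target(record, dt, period=1.0, target_deformation=0.05))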

  14. The SCALE Verified, Archived Library of Inputs and Data - VALID

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marshall, William BJ J; Rearden, Bradley T

    The Verified, Archived Library of Inputs and Data (VALID) at ORNL contains high quality, independently reviewed models and results that improve confidence in analysis. VALID is developed and maintained according to a procedure of the SCALE quality assurance (QA) plan. This paper reviews the origins of the procedure and its intended purpose, the philosophy of the procedure, some highlights of its implementation, and the future of the procedure and associated VALID library. The original focus of the procedure was the generation of high-quality models that could be archived at ORNL and applied to many studies. The review process associated with model generation minimized the chances of errors in these archived models. Subsequently, the scope of the library and procedure was expanded to provide high quality, reviewed sensitivity data files for deployment through the International Handbook of Evaluated Criticality Safety Benchmark Experiments (IHECSBE). Sensitivity data files for approximately 400 such models are currently available. The VALID procedure and library continue fulfilling these multiple roles. The VALID procedure is based on the quality assurance principles of ISO 9001 and nuclear safety analysis. Some of these key concepts include: independent generation and review of information, generation and review by qualified individuals, use of appropriate references for design data and documentation, and retrievability of the models, results, and documentation associated with entries in the library. Some highlights of the detailed procedure are discussed to provide background on its implementation and to indicate limitations of data extracted from VALID for use by the broader community. Specifically, external users of data generated within VALID must take responsibility for ensuring that the files are used within the QA framework of their organization and that use is appropriate. The future plans for the VALID library include expansion to include additional experiments from the IHECSBE, to include experiments from areas beyond criticality safety, such as reactor physics and shielding, and to include application models. In the future, external SCALE users may also obtain qualification under the VALID procedure and be involved in expanding the library. The VALID library provides a pathway for the criticality safety community to leverage modeling and analysis expertise at ORNL.

  15. Cognitive Dissonance and Pediatric Procedural Pain Management: A Concept Clarification.

    PubMed

    Bice, April A

    2018-06-01

    Pediatric nurses have often reported that pain management is a vital part of patient care. Evidence, however, suggests pediatric procedural pain treatments are often underused. Cognitive dissonance, the mental conflict leading to unpleasant thoughts and/or feelings, may be related to this evidence-based gap found between what pediatric nurses claim about procedural pain management (that it is important) and what they actually do (underutilize pain treatments). The purpose of this manuscript is to clarify and further develop the concept of cognitive dissonance in terms of its relationship to nurses' mental struggles with underutilization of pediatric procedural pain treatments. A more relevant and extended definition of cognitive dissonance is presented. The concept of cognitive dissonance was examined using Rodgers' evolutionary concept analysis approach/framework. Analysis Methods: Through a six-step process of concept identification, setting and sample identification, data collection, data analysis, and future implication discussion, a more accurate and representative definition of cognitive dissonance is described. Databases used included CINAHL, Google Scholar, PsycINFO, ERIC, and PubMed. Seminal, recent, and relevant works were included in the review to adequately develop and clarify the concept. Procedural pain management breach among pediatric nurses is proposed to occur before the mental conflict is produced. The unpleasant mental conflict created after the breach is followed by the nurse's determination to reduce mental conflict through attitude change followed by cognition change, which more closely reflects his or her behavior.

  16. Evaluation of shoulder function in clavicular fracture patients after six surgical procedures based on a network meta-analysis.

    PubMed

    Huang, Shou-Guo; Chen, Bo; Lv, Dong; Zhang, Yong; Nie, Feng-Feng; Li, Wei; Lv, Yao; Zhao, Huan-Li; Liu, Hong-Mei

    2017-01-01

    Purpose: Using a network meta-analysis approach, our study aims to develop a ranking of six surgical procedures, that is, Plate, titanium elastic nail (TEN), tension band wire (TBW), hook plate (HP), reconstruction plate (RP), and Knowles pin, by comparing the post-surgery Constant shoulder scores in patients with clavicular fracture (CF). Methods: A comprehensive search of electronic scientific literature databases was performed to retrieve publications investigating surgical procedures for CF; with stringent eligibility criteria, clinical experimental studies of high quality and relevance to our area of interest were selected for network meta-analysis. Statistical analyses were conducted using Stata 12.0. Results: A total of 19 studies that met our inclusion criteria were enrolled into our network meta-analysis, representing 1164 patients who had undergone surgical procedures for CF (TEN group = 240; Plate group = 164; TBW group = 180; RP group = 168; HP group = 245; Knowles pin group = 167). The network meta-analysis revealed that RP significantly improved the Constant shoulder score in patients with CF when compared with TEN, whereas the post-operative Constant shoulder scores after Plate, TBW, HP, Knowles pin, and TEN were similar, with no statistically significant differences. The relative treatment ranking of predictive probabilities of Constant shoulder scores after surgery revealed that the surface under the cumulative ranking curve (SUCRA) value was highest for RP. Conclusion: The current network meta-analysis suggests that RP may be the optimum surgical treatment among the six interventions for patients with CF, and that it can improve the shoulder score of patients with CF. Implications for Rehabilitation: RP improves shoulder joint function after the surgical procedure. RP achieves stability with minimal complications after surgery. RP may be the optimum surgical treatment for rehabilitation of patients with CF.
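
    The SUCRA ranking mentioned above can be computed directly from a rank-probability matrix; the sketch below uses invented probabilities for the six procedures (they are not the values estimated in the study) purely to show the calculation.

        import numpy as np

        # Hypothetical rank-probability matrix: rows are treatments, columns are the
        # probabilities of occupying rank 1 (best) through rank 6 (worst).
        treatments = ["Plate", "TEN", "TBW", "HP", "RP", "Knowles pin"]
        rank_probs = np.array([
            [0.10, 0.20, 0.25, 0.20, 0.15, 0.10],
            [0.05, 0.10, 0.15, 0.20, 0.25, 0.25],
            [0.10, 0.15, 0.20, 0.20, 0.20, 0.15],
            [0.15, 0.20, 0.15, 0.20, 0.15, 0.15],
            [0.45, 0.20, 0.15, 0.10, 0.05, 0.05],
            [0.15, 0.15, 0.10, 0.10, 0.20, 0.30],
        ])

        # SUCRA for a treatment is the mean of its cumulative rank probabilities
        # over the first a-1 ranks, where a is the number of treatments.
        a = rank_probs.shape[1]
        cumulative = np.cumsum(rank_probs, axis=1)
        sucra = cumulative[:, :-1].sum(axis=1) / (a - 1)

        for name, value in sorted(zip(treatments, sucra), key=lambda pair: -pair[1]):
            print(f"{name}: SUCRA = {value:.2f}")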

  17. Quality-Assurance Data for Routine Water Analyses by the U.S. Geological Survey Laboratory in Troy, New York--July 1999 through June 2001

    USGS Publications Warehouse

    Lincoln, Tricia A.; Horan-Ross, Debra A.; McHale, Michael R.; Lawrence, Gregory B.

    2006-01-01

    The laboratory for analysis of low-ionic-strength water at the U.S. Geological Survey (USGS) Water Science Center in Troy, N.Y., analyzes samples collected by USGS projects throughout the Northeast. The laboratory's quality-assurance program is based on internal and interlaboratory quality-assurance samples and quality-control procedures that were developed to ensure proper sample collection, processing, and analysis. The quality-assurance and quality-control data were stored in the laboratory's LabMaster data-management system, which provides efficient review, compilation, and plotting of data. This report presents and discusses results of quality-assurance and quality-control samples analyzed from July 1999 through June 2001. Results for the quality-control samples for 18 analytical procedures were evaluated for bias and precision. Control charts indicate that data for eight of the analytical procedures were occasionally biased for either high-concentration or low-concentration samples but were within control limits; these procedures were: acid-neutralizing capacity, total monomeric aluminum, total aluminum, calcium, chloride and nitrate (ion chromatography and colorimetric method) and sulfate. The total aluminum and dissolved organic carbon procedures were biased throughout the analysis period for the high-concentration sample, but were within control limits. The calcium and specific conductance procedures were biased throughout the analysis period for the low-concentration sample, but were within control limits. The magnesium procedure was biased for the high-concentration and low-concentration samples, but was within control limits. Results from the filter-blank and analytical-blank analyses indicate that the procedures for 14 of 15 analytes were within control limits, although the concentrations for blanks were occasionally outside the control limits. The data-quality objective was not met for dissolved organic carbon. Sampling and analysis precision are evaluated herein in terms of the coefficient of variation obtained for triplicate samples in the procedures for 17 of the 18 analytes. At least 90 percent of the samples met data-quality objectives for all analytes except ammonium (81 percent of samples met objectives), chloride (75 percent of samples met objectives), and sodium (86 percent of samples met objectives). Results of the USGS interlaboratory Standard Reference Sample (SRS) Project indicated good data quality over the time period, with most ratings for each sample in the good-to-excellent range. The P-sample (low-ionic-strength constituents) analysis had one satisfactory rating for the specific conductance procedure in one study. The T-sample (trace constituents) analysis had one satisfactory rating for the aluminum procedure in one study and one unsatisfactory rating for the sodium procedure in another. The remainder of the samples had good or excellent ratings for each study. Results of Environment Canada's National Water Research Institute (NWRI) program indicated that at least 89 percent of the samples met data-quality objectives for 10 of the 14 analytes; the exceptions were ammonium, total aluminum, dissolved organic carbon, and sodium. Results indicate a positive bias for the ammonium procedure in all studies. Data-quality objectives were not met in 50 percent of samples analyzed for total aluminum, 38 percent of samples analyzed for dissolved organic carbon, and 27 percent of samples analyzed for sodium. 
Results from blind reference-sample analyses indicated that data-quality objectives were met by at least 91 percent of the samples analyzed for calcium, chloride, fluoride, magnesium, pH, potassium, and sulfate. Data-quality objectives were met by 75 percent of the samples analyzed for sodium and 58 percent of the samples analyzed for specific conductance.
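
    The control-chart evaluation described in this report can be illustrated with a simple Shewhart-style check: warning and control limits are set from the mean and standard deviation of past quality-control analyses, and each new result is compared against them. The concentrations and the 2-sigma/3-sigma limits below are assumptions for illustration; the laboratory's actual control limits may be derived differently.

        import statistics

        # Hypothetical history of quality-control standard analyses for one analyte
        # (for example, a high-concentration calcium standard, in mg/L).
        qc_history = [2.01, 1.98, 2.03, 2.00, 1.97, 2.02, 1.99, 2.04, 2.01, 1.98]

        mean = statistics.mean(qc_history)
        sd = statistics.stdev(qc_history)
        warning_limits = (mean - 2 * sd, mean + 2 * sd)   # assumed warning limits
        control_limits = (mean - 3 * sd, mean + 3 * sd)   # assumed control limits

        new_result = 2.06
        if not (control_limits[0] <= new_result <= control_limits[1]):
            print("outside control limits: stop and investigate")
        elif not (warning_limits[0] <= new_result <= warning_limits[1]):
            print("biased but within control limits: flag for review")
        else:
            print("in control")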

  18. Racial/ethnic disparities in provision of dental procedures to children enrolled in Delta Dental insurance in Milwaukee, Wisconsin.

    PubMed

    Bhagavatula, Pradeep; Xiang, Qun; Eichmiller, Fredrick; Szabo, Aniko; Okunseri, Christopher

    2014-01-01

    Most studies on the provision of dental procedures have focused on Medicaid enrollees known to have inadequate access to dental care. Little information on private insurance enrollees exists. This study documents the rates of preventive, restorative, endodontic, and surgical dental procedures provided to children enrolled in Delta Dental of Wisconsin (DDWI) in Milwaukee. We analyzed DDWI claims data for Milwaukee children aged 0-18 years between 2002 and 2008. We linked the ZIP codes of enrollees to the 2000 U.S. Census information to derive racial/ethnic estimates in the different ZIP codes. We estimated the rates of preventive, restorative, endodontic, and surgical procedures provided to children in different racial/ethnic groups based on the population estimates derived from the U.S. Census data. Descriptive and multivariable analyses were performed using Poisson regression modeling of dental procedures per year. Over the 7-year period, a total of 266,380 enrollees in 46 ZIP codes were covered in the database. Approximately 64 percent, 44 percent, and 49 percent of White, African American, and Hispanic children, respectively, had at least one dental visit during the study period. The rates of preventive procedures increased up to the age of 9 years and decreased thereafter among children in all three racial groups included in the analysis. African American and Hispanic children received half as many preventive procedures as White children. Our study shows that substantial racial disparities may exist in the types of dental procedures received by children.
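
    A Poisson regression of procedure counts like the one described above can be sketched with statsmodels; the data frame, column names, and reference category below are hypothetical stand-ins rather than the actual DDWI claims fields, and the exposure term represents enrollee-years.

        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Hypothetical per-child summary of procedures per year; values are invented.
        df = pd.DataFrame({
            "procedures": [2, 1, 0, 3, 1, 0, 2, 1],
            "age":        [6, 9, 14, 7, 10, 15, 8, 12],
            "group":      ["White", "White", "White", "AfricanAmerican",
                           "AfricanAmerican", "AfricanAmerican", "Hispanic", "Hispanic"],
            "years":      [1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0, 1.0],  # enrollee-years
        })

        # Poisson regression of procedure counts on group and age, with White as the
        # reference category and enrollee-years as the exposure offset.
        model = smf.glm(
            "procedures ~ C(group, Treatment('White')) + age",
            data=df,
            family=sm.families.Poisson(),
            exposure=df["years"],
        ).fit()
        print(model.summary())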

  19. Evaluation of standardized and applied variables in predicting treatment outcomes of polytrauma patients.

    PubMed

    Aksamija, Goran; Mulabdic, Adi; Rasic, Ismar; Muhovic, Samir; Gavric, Igor

    2011-01-01

    Polytrauma is defined as an injury in which at least two different organ systems or body regions are affected, with at least one injury being life-threatening. Given the multilevel model of care for polytrauma patients within KCUS, weaknesses in the management of this category of patients are inevitable. The aims were to determine the dynamics of existing procedures in the treatment of polytrauma patients on admission to KCUS and, based on statistical analysis of the applied variables, to determine and define the factors that influence the final outcome of treatment and their mutual relationships, which may help eliminate flaws in the approach to the problem. The study was based on 263 polytrauma patients. Parametric and non-parametric statistical methods were used. Basic statistics were calculated; based on the calculated parameters, multicorrelation analysis, image analysis, discriminant analysis, and multifactorial analysis were used to achieve the research objectives. From the universe of variables for this study we selected a sample of n = 25 variables, of which the first two are modular and the remaining variables (n = 23) belong to the common measurement space, defined in this paper as the system of variables for methods, procedures, and assessments of polytrauma patients. After the multicorrelation analysis, and because the image analysis gave reliable measurement results, we proceeded to the analysis of eigenvalues, that is, to defining the factors that provide information about the existing model and its correlation with treatment outcome. The study singled out the essential factors that determine the current organizational model of care, which may affect treatment and lead to better outcomes for polytrauma patients. This analysis showed the maximum correlative relationships among these practices and contributed to the development of guidelines defined by the isolated factors.

  20. An automated procedure for detection of IDP's dwellings using VHR satellite imagery

    NASA Astrophysics Data System (ADS)

    Jenerowicz, Malgorzata; Kemper, Thomas; Soille, Pierre

    2011-11-01

    This paper presents results for the estimation of dwelling structures in the Al Salam IDP Camp, Southern Darfur, obtained by applying mathematical morphology analysis to very high resolution multispectral satellite images. A series of image-processing procedures, feature-extraction methods, and textural analyses were applied to provide reliable information about dwelling structures. One issue in this context is the similarity of the spectral response of thatched dwelling roofs to that of the surroundings in IDP camps, where the exploitation of multispectral information is crucial. The study shows the advantage of an automatic extraction approach and highlights the importance of detailed spatial and spectral analysis based on a multi-temporal dataset. The additional fusion of the high-resolution panchromatic band with the lower resolution multispectral bands of the WorldView-2 satellite has a positive influence on the results and can thereby be useful for humanitarian aid agencies, supporting decisions and population estimates, especially in situations when frequent revisits by spaceborne imaging systems are the only possibility for continued monitoring.
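
    A minimal version of the morphology-based dwelling extraction can be sketched with SciPy: a white top-hat transform suppresses the background and keeps bright features smaller than the structuring element, after which thresholding and connected-component labelling yield candidate dwellings. The synthetic image, structuring-element size, and threshold below are assumptions for illustration; the study's actual processing chain is more elaborate and multi-temporal.

        import numpy as np
        from scipy import ndimage as ndi

        # Synthetic panchromatic-like image: dark background with a few small bright
        # blobs standing in for thatched dwelling roofs (illustrative values only).
        rng = np.random.default_rng(0)
        image = rng.normal(0.2, 0.02, size=(200, 200))
        for row, col in [(40, 50), (90, 120), (150, 60)]:
            image[row:row + 6, col:col + 6] += 0.3

        # White top-hat (original minus its grey opening) keeps bright features that
        # are smaller than the structuring element and suppresses the background.
        footprint = np.ones((15, 15))
        tophat = ndi.white_tophat(image, footprint=footprint)

        # Threshold the top-hat response and label connected components as dwellings.
        mask = tophat > 0.15
        labels, n_dwellings = ndi.label(mask)
        sizes = ndi.sum(mask, labels, index=range(1, n_dwellings + 1))
        print(f"detected {n_dwellings} candidate dwellings; sizes (pixels): {sizes}")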
