Sample records for practical analysis method

  1. A practical reliability analysis method for engineers

    Microsoft Academic Search

    Chang Huajian; Shi Yongchang

    1995-01-01

    According to the basic theory of interference, we assume that the material resistance and the stress in a complicated engineering structure follow normal distributions. A Practical Reliability Analysis Method for Engineers (PRAME) has been developed in order to avoid directly seeking the function of stress intensity related to structure size and external loads. The stress given any independent random variables of sizes and

  2. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  3. A Practical Method for the Direct Analysis of Transient Stability

    Microsoft Academic Search

    T. Athay; R. Podmore; S. Virmani

    1979-01-01

    This paper describes the development and evaluation of an analytical method for the direct determination of transient stability. The method developed is based on the analysis of transient energy and accounts for the nature of the system disturbance as well as for the effects of transfer conductances on system behavior. It has been evaluated on a 10-generator, 39-bus system and

  4. Buckling analysis in deviated wells: A practical method

    SciTech Connect

    Mitchell, R.F.

    1996-12-31

    Current helical buckling models are valid for vertical wells, but provide only approximate solutions for horizontal wells. Solutions of the non-linear buckling equations for arbitrary well deviation have been developed, but are too complex for practical use. This paper presents a set of correlations that match the exact solutions extremely well, but are simple to use. These correlations show the effects of well deviation on buckling shape, tubing length change, contact force and bending stress.

  5. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  6. AN ANALYSIS OF SOME PRACTICAL METHODS FOR ESTIMATING HEATS OF COMBUSTION IN

    E-print Network

    Paris-Sud XI, Université de

    99-42. AN ANALYSIS OF SOME PRACTICAL METHODS FOR ESTIMATING HEATS OF COMBUSTION IN FIRE SAFETY. Factory Mutual Research Corporation, Norwood, MA, USA. ABSTRACT: The theoretical (net) heat of combustion … the heats of combustion, that is to say when at most a simple datasheet processor is the only tool required

  7. Review of Bayesian statistical analysis methods for cytogenetic radiation biodosimetry, with a practical example.

    PubMed

    Ainsbury, Elizabeth A; Vinnikov, Volodymyr A; Puig, Pedro; Higueras, Manuel; Maznyk, Nataliya A; Lloyd, David C; Rothkamm, Kai

    2014-12-01

    Classical methods of assessing the uncertainty associated with radiation doses estimated using cytogenetic techniques are now extremely well defined. However, several authors have suggested that a Bayesian approach to uncertainty estimation may be more suitable for cytogenetic data, which are inherently stochastic in nature. The Bayesian analysis framework focuses on identification of probability distributions (for yield of aberrations or estimated dose), which also means that uncertainty is an intrinsic part of the analysis, rather than an 'afterthought'. In this paper Bayesian, as well as some more advanced classical, data analysis methods for radiation cytogenetics are reviewed that have been proposed in the literature. A practical overview of Bayesian cytogenetic dose estimation is also presented, with worked examples from the literature. PMID:24282320
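    The Bayesian framing the review describes can be illustrated with a minimal conjugate-update sketch: a Gamma prior on the mean aberration yield per cell, updated by Poisson counts. All prior parameters and counts below are invented for illustration; this is not the paper's own method or data.

```python
# Sketch: conjugate Gamma-Poisson update for the yield of aberrations per cell,
# in the spirit of Bayesian cytogenetic dose estimation (illustrative values).

def posterior_yield(prior_shape, prior_rate, aberrations, cells):
    """Gamma(shape, rate) prior on the mean yield; Poisson counts update it."""
    shape = prior_shape + aberrations   # add total observed aberrations
    rate = prior_rate + cells           # add number of scored cells
    mean = shape / rate                 # posterior mean yield (per cell)
    return shape, rate, mean

# Weakly informative prior, then 12 dicentrics observed in 500 scored cells.
shape, rate, mean = posterior_yield(1.0, 1.0, 12, 500)
print(round(mean, 4))  # → 0.0259
```

    The posterior distribution, not just its mean, is the point of the Bayesian approach: uncertainty is carried through to the dose estimate rather than bolted on afterwards.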

  8. A topography analysis incorporated optimization method for the selection and placement of best management practices.

    PubMed

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, the use of a topography analysis incorporated optimization method (TAIOM) was proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes, suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
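    The genetic-algorithm step the abstract mentions can be sketched in miniature as a budget-constrained BMP selection. The costs, benefits, budget, and GA settings below are made-up toy numbers, not TAIOM inputs or the authors' code.

```python
import random

# Toy genetic-algorithm sketch for choosing a BMP subset under a budget,
# loosely inspired by the TAIOM idea; all numbers are invented.

COSTS    = [4.0, 2.5, 3.0, 1.5, 5.0]   # cost of each candidate BMP
BENEFITS = [6.0, 3.0, 4.5, 1.0, 7.5]   # NPS-pollution reduction of each BMP
BUDGET   = 8.0

def fitness(bits):
    cost = sum(c for c, b in zip(COSTS, bits) if b)
    gain = sum(g for g, b in zip(BENEFITS, bits) if b)
    return gain if cost <= BUDGET else -1.0   # penalize over-budget plans

def evolve(pop_size=30, generations=60, seed=42):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in COSTS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, len(COSTS))    # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                # occasional single-bit mutation
                i = rng.randrange(len(child))
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

    A real TAIOM run would replace the toy fitness with SWAT-derived pollutant loads and spatially resolved costs; the chromosome-per-landscape-unit structure is the same.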

  9. A practical method to estimate the cost of equity capital for a firm using cluster analysis

    Microsoft Academic Search

    Marcus Ingram; Speros Margetis

    2010-01-01

    Purpose – The purpose of this paper is to propose and test a method for selecting a portfolio of public firms which can be used for computing the cost of equity capital for a non-public firm or division of a firm. Design/methodology/approach – This method relies on cluster analysis and a large sample of firms. Using the accounting data from

  10. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  11. Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom

    ERIC Educational Resources Information Center

    Hjelm, Titus

    2013-01-01

    Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

  12. A practical method for reliability analysis of phased-mission systems

    Microsoft Academic Search

    Suprasad V. Amari; ASQ CRE

    2011-01-01

    SUMMARY & CONCLUSIONS: Many practical systems are phased-mission systems, where the mission consists of multiple, consecutive, nonoverlapping phases. For the mission to be a success, the system must operate successfully during each of the phases. In each phase, the system has to accomplish a specific task and may be subject to different stresses. Thus, the system configuration, success criteria, and

  13. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analyses using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA-certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods, and it is hoped that Type B RAM package designers and analysts can use them for their applications.

  14. Formal Fault Tree Analysis: Practical Experiences

    E-print Network

    Paris-Sud XI, Université de

    AVoCS 2006. Formal Fault Tree Analysis: Practical Experiences. Frank Ortmeier, Gerhard Schellhorn. … one of the most widespread safety analysis methods: fault tree analysis (FTA). Formal FTA allows one to reason rigorously about FTA by using model checking. Keywords: fault tree analysis, dependability, safety analysis, formal

  15. Practice development: a concept analysis

    Microsoft Academic Search

    M. Hanrahan

    2004-01-01

    Exploration of the term 'practice development' is required for the discipline of infection control nursing. Improved understanding of the term would allow practitioners to approach practice development in a more constructive and measurable fashion. A concept analysis based on the model of Walker and Avant is therefore presented. The analysis includes the definition of the term 'practice development' and discussion

  16. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    SciTech Connect

    Ramadhar, Timothy R. [Department of Biological Chemistry and Molecular Pharmacology, Harvard Medical School, 240 Longwood Avenue, Boston, Massachusetts, 02115 (United States); Zheng, Shao-Liang [Department of Chemistry and Chemical Biology, Harvard University, 12 Oxford Street, Cambridge, Massachusetts, 02138 (United States); Chen, Yu-Sheng [ChemMatCARS, Center for Advanced Radiation Sources, The University of Chicago c/o Advanced Photon Source, Argonne National Laboratory, 9700 South Cass Avenue, Argonne, Illinois, 60439 (United States); Clardy, Jon, E-mail: jon-clardy@hms.harvard.edu [Department of Biological Chemistry and Molecular Pharmacology, Harvard Medical School, 240 Longwood Avenue, Boston, Massachusetts, 02115 (United States)

    2015-01-01

    This report describes complete practical guidelines and insights for the crystalline sponge method, which have been derived through the first use of synchrotron radiation on these systems, and includes a procedure for faster synthesis of the sponges. These guidelines will be applicable to crystal sponge data collected at synchrotrons or in-house facilities, and will allow researchers to obtain reliable high-quality data and construct chemically and physically sensible models for guest structural determination. A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. 
In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination.

  17. Methods of Cognitive Analysis to Support the Design and Evaluation of Biomedical Systems: The Case of Clinical Practice Guidelines

    Microsoft Academic Search

    Vimla L. Patel; José F. Arocha; Melissa Diermeier; Robert A. Greenes; Edward H. Shortliffe

    2001-01-01

    This article provides a theoretical and methodological framework for the use of cognitive analysis to support the representation of biomedical knowledge and the design of clinical systems, using clinical-practice guidelines (CPGs) as an example. We propose that propositional and semantic analyses, when used as part of the system-development process, can improve the validity, usability, and comprehension of the resulting biomedical

  18. Advanced practice nursing: a concept analysis.

    PubMed

    Dowling, Maura; Beauchesne, Michelle; Farrelly, Frances; Murphy, Kathy

    2013-04-01

    A variety of terms are used to describe advanced practice nursing roles internationally. This has resulted in confusion in terminology around these roles. The aim of this concept analysis was to clarify what is meant by advanced practice nursing internationally, what attributes signify advanced practice nursing and what are its antecedents, consequences, references and related terms. Rodgers's evolutionary method of concept analysis was used. Data sources included Medline, CINAHL, Applied Social Sciences Index and Abstracts (ASSIA), Cochrane Library, Science Direct, SCOPUS, Web of Science, Dissertation Abstracts and DARE as well as relevant nursing texts and professional organization websites. The analysis reveals that there are many different articulations of the advanced practice nursing role outlined in the literature. This variety in terminology hinders developments in advanced practice nursing roles. Consensus on advanced practice nursing definitions, terminology, educational requirements and regulatory approaches is integral to the implementation of the advanced practice nursing role internationally. PMID:23577970

  19. A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer

    ERIC Educational Resources Information Center

    Mellone, James T.

    2010-01-01

    This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

  20. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This method includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  1. Good practices for quantitative bias analysis.

    PubMed

    Lash, Timothy L; Fox, Matthew P; MacLehose, Richard F; Maldonado, George; McCandless, Lawrence C; Greenland, Sander

    2014-12-01

    Quantitative bias analysis serves several objectives in epidemiological research. First, it provides a quantitative estimate of the direction, magnitude and uncertainty arising from systematic errors. Second, the acts of identifying sources of systematic error, writing down models to quantify them, assigning values to the bias parameters and interpreting the results combat the human tendency towards overconfidence in research results, syntheses and critiques and the inferences that rest upon them. Finally, by suggesting aspects that dominate uncertainty in a particular research result or topic area, bias analysis can guide efficient allocation of sparse research resources. The fundamental methods of bias analyses have been known for decades, and there have been calls for more widespread use for nearly as long. There was a time when some believed that bias analyses were rarely undertaken because the methods were not widely known and because automated computing tools were not readily available to implement the methods. These shortcomings have been largely resolved. We must, therefore, contemplate other barriers to implementation. One possibility is that practitioners avoid the analyses because they lack confidence in the practice of bias analysis. The purpose of this paper is therefore to describe what we view as good practices for applying quantitative bias analysis to epidemiological data, directed towards those familiar with the methods. We focus on answering questions often posed to those of us who advocate incorporation of bias analysis methods into teaching and research. These include the following. When is bias analysis practical and productive? How does one select the biases that ought to be addressed? How does one select a method to model biases? How does one assign values to the parameters of a bias model? How does one present and interpret a bias analysis?
We hope that our guide to good practices for conducting and presenting bias analyses will encourage more widespread use of bias analysis to estimate the potential magnitude and direction of biases, as well as the uncertainty in estimates potentially influenced by the biases. PMID:25080530
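    One of the simplest bias analyses the paper's questions point at, back-correcting a 2x2 table for nondifferential exposure misclassification, can be sketched as follows. The sensitivity, specificity, and cell counts are hypothetical, chosen only to show the arithmetic.

```python
# Sketch of a simple bias-analysis step: correcting observed exposed counts
# for misclassification with assumed (hypothetical) sensitivity/specificity.

def corrected_exposed(observed_exposed, total, sensitivity, specificity):
    """Invert nondifferential exposure misclassification for one study arm."""
    return (observed_exposed - (1 - specificity) * total) / (sensitivity + specificity - 1)

def bias_adjusted_or(a, b, c, d, se, sp):
    """Odds ratio after correcting exposed counts in cases (a+b) and controls (c+d)."""
    A = corrected_exposed(a, a + b, se, sp)   # corrected exposed cases
    C = corrected_exposed(c, c + d, se, sp)   # corrected exposed controls
    B, D = (a + b) - A, (c + d) - C
    return (A * D) / (B * C)

# Observed table: 45/55 exposure split in cases, 25/75 in controls;
# assume Se = 0.9, Sp = 0.95 for the exposure measure.
print(round(bias_adjusted_or(45, 55, 25, 75, 0.9, 0.95), 2))  # → 2.89
```

    The observed odds ratio here is about 2.45; assigning plausible distributions (rather than point values) to Se and Sp is what turns this deterministic step into the probabilistic bias analysis the paper advocates.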

  2. Doing Conversation Analysis: A Practical Guide.

    ERIC Educational Resources Information Center

    ten Have, Paul

    Noting that conversation analysis (CA) has developed into one of the major methods of analyzing speech in the disciplines of communications, linguistics, anthropology and sociology, this book demonstrates in a practical way how to become a conversation analyst. As well as providing an overall introduction to the approach, it focuses on the…

  3. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method.

    PubMed

    Ramadhar, Timothy R; Zheng, Shao Liang; Chen, Yu Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation are reported. The procedure for the synthesis of the zinc-based metal-organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  4. Method for optical inspection of nanoscale objects based upon analysis of their defocused images and features of its practical implementation.

    PubMed

    Ryabko, M V; Koptyaev, S N; Shcherbakov, A V; Lantsov, A D; Oh, S Y

    2013-10-21

    A microscopic method to inspect isolated sub-100 nm structures made of silicon is presented. The method is based upon an analysis of light intensity distributions in defocused images obtained along the optical axis normal to the sample plane. Experimental measurements of calibrated lines (height 50 nm, length 100 µm, and widths of 40-150 nm in 10 nm steps) on top of a monocrystalline silicon substrate are presented. A library of defocused images of the calibrated lines was obtained experimentally and numerically in accordance with the experimental setup parameters and measurement conditions. Processing the measured defocused images and comparing them with simulated ones from the library allows one to distinguish between objects differing by 10 nm in width. It is shown that the influence of optical system aberrations must be taken into account in order to achieve agreement between simulated and measured results and to improve the accuracy of line-width inspection. The limits of accuracy for object width measurements using this optical method are discussed. PMID:24150293
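    The library-matching step can be sketched as a nearest-neighbour comparison of a measured intensity profile against simulated library entries. The 1-D profiles and widths below are toy values, not the paper's data, and the real method compares 2-D defocused image stacks.

```python
# Minimal sketch of library matching: compare a measured defocused intensity
# profile against simulated entries by sum of squared differences (SSD) and
# report the best-matching line width.

def ssd(p, q):
    return sum((a - b) ** 2 for a, b in zip(p, q))

def match_width(measured, library):
    """library maps line width (nm) -> simulated defocused profile."""
    return min(library, key=lambda w: ssd(measured, library[w]))

library = {
    40: [0.1, 0.4, 0.9, 0.4, 0.1],
    50: [0.1, 0.5, 1.0, 0.5, 0.1],
    60: [0.2, 0.6, 1.1, 0.6, 0.2],
}
measured = [0.12, 0.48, 0.98, 0.51, 0.09]
print(match_width(measured, library))  # → 50
```

    In practice the library entries must be simulated with the actual setup's aberrations included, which is exactly the point the abstract makes about simulation/measurement agreement.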

  5. Practical state of health estimation of power batteries based on Delphi method and grey relational grade analysis

    NASA Astrophysics Data System (ADS)

    Sun, Bingxiang; Jiang, Jiuchun; Zheng, Fangdan; Zhao, Wei; Liaw, Bor Yann; Ruan, Haijun; Han, Zhiqiang; Zhang, Weige

    2015-05-01

    The state of health (SOH) estimation is very critical to the battery management system to ensure the safety and reliability of EV battery operation. Here, we used a unique hybrid approach to enable complex SOH estimations. The approach hybridizes the Delphi method, known for its simplicity and effectiveness in applying weighting factors for complicated decision-making, and the grey relational grade analysis (GRGA) for multi-factor optimization. Six critical factors were used in the consideration for SOH estimation: peak power at 30% state-of-charge (SOC), capacity, the voltage drop at 30% SOC with a C/3 pulse, the temperature rises at the end of discharge and of charge at 1C, respectively, and the open circuit voltage at the end of charge after a 1-h rest. The weighting of these factors for SOH estimation was scored by the 'experts' in the Delphi method, indicating the influencing power of each factor on SOH. The parameters for these factors expressing the battery state variations are optimized by GRGA. Eight battery cells were used to illustrate the principle and methodology to estimate the SOH by this hybrid approach, and the results were compared with those based on capacity and power capability. The contrast among different SOH estimations is discussed.
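    The GRGA step can be sketched as computing a grey relational grade for one cell against an ideal reference sequence, with Delphi-style expert weights collapsing the factor-wise coefficients into one grade. The weights, reference, and factor values below are illustrative, not the paper's.

```python
# Hedged sketch of the grey relational grade (GRGA) computation: each cell's
# normalized factor vector is compared to a reference (ideal) sequence and
# collapsed to one grade with expert weights (all numbers invented).

def grey_relational_grade(seq, reference, weights, rho=0.5):
    deltas = [abs(s - r) for s, r in zip(seq, reference)]
    dmin, dmax = min(deltas), max(deltas)
    coeffs = [(dmin + rho * dmax) / (d + rho * dmax) for d in deltas]
    return sum(w * c for w, c in zip(weights, coeffs))

reference = [1.0, 1.0, 1.0]   # ideal normalized factor values
weights   = [0.5, 0.3, 0.2]   # expert (Delphi) weights, summing to 1
cell      = [0.9, 0.8, 1.0]   # normalized factors for one cell
print(round(grey_relational_grade(cell, reference, weights), 3))  # → 0.55
```

    A grade near 1 means the cell's factor profile sits close to the ideal reference; ranking cells by grade is the multi-factor SOH comparison the abstract describes.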

  6. An analysis of the teaching methods and sources of information used in adopting improved practices in rice production in Texas

    E-print Network

    Kibria, A. K. M. Anwarul

    1967-01-01

    preparation, fertilization, irrigation, weed control, control of insects and diseases, and harvesting are considered to be seven essential steps for rice production. CHAPTER II: LITERATURE REVIEW. Baker, in his study in Rice County, Minnesota... made in Chenango, Monroe, and Jefferson Counties of New York, Wilson and Crossby interviewed 1,005 farmers and found that 51 percent adopted practices to control smut disease of oats by treating seeds with fungicides as advocated through the mass...

  7. Project evaluation : a practical asset pricing method

    E-print Network

    Jacoby, Henry D.

    1992-01-01

    This paper presents a practical approach to project evaluation using techniques of modern financial economics, with a sample application to oil development under a complex tax system. The method overcomes shortcomings of ...

  8. The piezoelectric sorption technique: a practical method

    E-print Network

    Flipse, Eugene Charles

    1983-01-01

    THE PIEZOELECTRIC SORPTION TECHNIQUE, A PRACTICAL METHOD. A Thesis by EUGENE CHARLES FLIPSE. Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE, August 1983. Major Subject: Chemical Engineering. Approved as to style and content by: J. C. Holste (Chairman of Committee), C. J. Glover (Member), K. Lou (Member...

  9. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  10. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J. [Pajarito Scientific Corporation, 2976 Rodeo Park Drive East, Santa Fe, NM 87505 (United States)] [Pajarito Scientific Corporation, 2976 Rodeo Park Drive East, Santa Fe, NM 87505 (United States)

    2013-07-01

    It is a universal requirement for characterization of radioactive waste that the consignor shall calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities such as nuclide activity. For Non-destructive Assay systems, the TMU analysis is typically performed on an individual container basis. However, in many cases, the waste consignor treats, transports, stores and disposes of containers in groups, for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and to use a simple summation/averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty in the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence in the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as the source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional', i.e. designed to account for potential sources of bias. An analysis has been performed with population groups of a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine overall TMU for batches of containers that have been assayed on the same system. (authors)
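    The contrast the abstract draws, between adding every term in quadrature as if independent and treating a shared systematic term as fully correlated across the group, can be sketched numerically. All uncertainty values below are invented; this is an illustration of the combination rules, not the authors' model.

```python
import math

# Sketch: group TMU when random per-container terms add in quadrature but a
# shared systematic term (same assay system) adds linearly across containers.

def group_tmu(random_terms, systematic_terms):
    """Random terms add in quadrature; fully correlated terms add linearly."""
    random_part = math.sqrt(sum(u ** 2 for u in random_terms))
    systematic_part = sum(systematic_terms)
    return math.sqrt(random_part ** 2 + systematic_part ** 2)

# Four containers assayed on the same system (made-up uncertainty values, in %):
random_u     = [3.0, 3.0, 3.0, 3.0]   # e.g. counting statistics per container
systematic_u = [2.0, 2.0, 2.0, 2.0]   # shared bias term, common to the system

naive   = math.sqrt(sum(u ** 2 for u in random_u + systematic_u))  # all-in-quadrature
grouped = group_tmu(random_u, systematic_u)
print(round(naive, 2), round(grouped, 2))  # → 7.21 10.0
```

    Treating the shared term as correlated gives a larger (more honest) group uncertainty than naive quadrature; conversely, genuinely random terms shrink relative to the total as the group grows, which is the reduction effect the abstract mentions.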

  11. Task Analysis Strategies and Practices. Practice Application Brief.

    ERIC Educational Resources Information Center

    Brown, Bettina Lankard

    Worker-oriented, job-oriented, and cognitive task analyses have all been used as tools for closing the gap between what curriculum teaches and what workers do. Although they share a commonality of purpose, the focus, cost, and practicality of task analysis techniques vary. Worker-oriented task analysis focuses on general human behaviors required…

  12. A practical application of wavelet moment method on the quantitative analysis of Shuanghuanglian oral liquid based on three-dimensional fingerprint spectra.

    PubMed

    Chen, Jing; Li, Bao Qiong; Zhai, Hong Lin; Lü, Wen Juan; Zhang, Xiao Yun

    2014-07-25

    Overlapping peaks, peak shifts, and noise signals frequently appear in high performance liquid chromatography (HPLC) experiments. A practical application of the wavelet moment method to the quantitative analysis of the main active components in Shuanghuanglian oral liquid samples is presented, based on HPLC determination coupled with a photodiode array detector (PAD). The wavelet moments were calculated from regions of the grayscale images of three-dimensional (3D) HPLC-PAD fingerprint spectra selected according to the target peak(s), and then used to establish linear models. The correlation coefficients (R) were more than 0.9980 within the test ranges. The intra- and inter-day variations were less than 1.13% and 1.10%, respectively. The recovery ranged from 96.2% to 102.7%. The overall LODs and LOQs were less than 0.2 μg/mL and 0.7 μg/mL, respectively. Our study indicated that the wavelet moment approach can cope with overlapping and shifted peaks and with noise in the chromatographic determination owing to its multi-resolution and inherent invariance properties. Thus the analytical time was shortened, and the obtained results were reliable and accurate. PMID:24913368
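
The record does not give the authors' formulas, but the flavor of moment-based descriptors for a 2-D region can be sketched with ordinary central moments of a grayscale array (a simplification: the paper combines a wavelet transform with moment invariants):

```python
import numpy as np

def central_moments(img, max_order=2):
    # Central moments mu_pq of a 2-D grayscale region: intensity-weighted
    # sums of (x - xbar)^p (y - ybar)^q, which are translation-invariant.
    y, x = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    m00 = img.sum()
    xbar, ybar = (x * img).sum() / m00, (y * img).sum() / m00
    mu = {}
    for p in range(max_order + 1):
        for q in range(max_order + 1 - p):
            mu[(p, q)] = ((x - xbar) ** p * (y - ybar) ** q * img).sum()
    return mu

# Toy stand-in for one region of a 3-D fingerprint image: a Gaussian peak
# on a retention-time x wavelength grid.
t = np.linspace(-3, 3, 61)
peak = np.exp(-0.5 * (t[:, None] ** 2 + t[None, :] ** 2))
mu = central_moments(peak)
print(round(mu[(0, 0)], 3), round(mu[(2, 0)], 3))
```

Because the first-order central moments vanish by construction, small shifts of a peak inside its region do not change the descriptors, which is the property the abstract credits for tolerating peak drift.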

  13. Sensitivity analysis practices: Strategies for model-based inference

    Microsoft Academic Search

    Andrea Saltelli; Marco Ratto; Stefano Tarantola; Francesca Campolongo

    2006-01-01

    Fourteen years after Science's review of sensitivity analysis (SA) methods in 1989 (System analysis at molecular scale, by H. Rabitz) we search Science Online to identify and then review all recent articles having “sensitivity analysis” as a keyword. In spite of the considerable developments which have taken place in this discipline, of the good practices which have emerged, and of
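
As a concrete sketch of the variance-based methods the authors advocate, a first-order Sobol index can be estimated by Monte Carlo with the classic "freeze one input" trick. The toy model and sample size below are assumptions for illustration:

```python
import numpy as np

# Monte Carlo estimate of first-order Sobol sensitivity indices for a toy
# model Y = 2*X1 + X2 with independent X_i ~ U(0, 1). Analytically the
# indices are S1 = 0.8 and S2 = 0.2.
rng = np.random.default_rng(0)

def model(x):
    return 2.0 * x[:, 0] + x[:, 1]

n = 200_000
A = rng.uniform(size=(n, 2))
B = rng.uniform(size=(n, 2))
yA = model(A)
f0, var_y = yA.mean(), yA.var()

S = []
for i in range(A.shape[1]):
    Ci = B.copy()
    Ci[:, i] = A[:, i]                 # "freeze" X_i, redraw all other inputs
    Vi = np.mean(yA * model(Ci)) - f0 ** 2
    S.append(Vi / var_y)

print(np.round(S, 3))                  # analytic values are 0.8 and 0.2
```

Each S_i is the fraction of output variance attributable to input i alone, the quantity that "model-based inference" in the book's sense apportions among inputs.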

  14. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…
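
One of the debated choices the article mentions, how many factors to retain, is often first screened with the eigenvalues of the correlation matrix. A minimal sketch on simulated two-factor data (the data, loadings, and the Kaiser eigenvalue-greater-than-one rule are illustrative, and the rule itself is contested):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated data: 6 observed variables driven by 2 latent factors
# (loadings and noise level are invented for the illustration).
n = 1000
factors = rng.normal(size=(n, 2))
loadings = np.array([[0.8, 0.0], [0.7, 0.1], [0.9, 0.0],
                     [0.0, 0.8], [0.1, 0.7], [0.0, 0.9]])
X = factors @ loadings.T + 0.4 * rng.normal(size=(n, 6))

R = np.corrcoef(X, rowvar=False)       # correlation matrix of the items
eigvals = np.linalg.eigvalsh(R)[::-1]  # eigenvalues, largest first

# Kaiser criterion: retain factors with eigenvalue > 1 (a screening rule only).
n_retain = int(np.sum(eigvals > 1.0))
print("eigenvalues:", eigvals.round(2), "-> retain", n_retain, "factors")
```

Here the screen recovers the two factors that generated the data; on real educational data the article's caution applies, and parallel analysis or scree inspection would normally accompany this check.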

  15. Visionlearning: Research Methods: The Practice of Science

    NSDL National Science Digital Library

    2010-10-01

    This instructional module introduces four types of research methods: experimentation, description, comparison, and modeling. It was developed to help learners understand that the classic definition of the "scientific method" does not capture the dynamic nature of science investigation. As learners explore each methodology, they develop an understanding of why scientists use multiple methods to gather data and develop hypotheses. It is appropriate for introductory physics courses and for teachers seeking content support in research practices. Editor's Note: Secondary students often cling to the notion that scientific research follows a stock, standard "scientific method". They may be unaware of the differences between experimental research, correlative studies, observation, and computer-based modeling research. In this resource, they can glimpse each methodology in the context of a real study done by respected scientists. This resource is part of Visionlearning, an award-winning set of classroom-tested modules for science education.

  16. Good Research Practices for Comparative Effectiveness Research: Analytic Methods to Improve Causal Inference from Nonrandomized Studies of Treatment Effects Using Secondary Data Sources: The ISPOR Good Research Practices for Retrospective Database Analysis Task Force Report—Part III

    Microsoft Academic Search

    Michael L. Johnson; William Crown; Bradley C. Martin; Colin R. Dormuth; Uwe Siebert; Michael E. DeBakey

    Objectives: Most contemporary epidemiologic studies require complex analytical methods to adjust for bias and confounding. New methods are constantly being developed, and older, more established methods remain appropriate. Careful application of statistical analysis techniques can improve causal inference of comparative treatment effects from nonrandomized studies using secondary databases. A Task Force was formed to offer a review of

  17. Good Research Practices for Comparative Effectiveness Research: Analytic Methods to Improve Causal Inference from Nonrandomized Studies of Treatment Effects Using Secondary Data Sources: The ISPOR Good Research Practices for Retrospective Database Analysis Task Force Report—Part III

    Microsoft Academic Search

    Michael L. Johnson; William Crown; Bradley C. Martin; Colin R. Dormuth; Uwe Siebert

    2009-01-01

    Objectives: Most contemporary epidemiologic studies require complex analytical methods to adjust for bias and confounding. New methods are constantly being developed, and older, more established methods remain appropriate. Careful application of statistical analysis techniques can improve causal inference of comparative treatment effects from nonrandomized studies using secondary databases. A Task Force was formed to offer a review of the more

  18. Practical method for balancing airplane moments

    NASA Technical Reports Server (NTRS)

    Hamburger, H

    1924-01-01

    The present contribution is the sequel to a paper written by Messrs. R. Fuchs, L. Hopf, and H. Hamburger, and proposes to show that the methods therein contained can be practically utilized in computations. Furthermore, the calculations leading up to the diagram of moments for three airplanes, whose performance in war service gave reason for complaint, are analyzed. Finally, it is shown what conclusions can be drawn from the diagram of moments with regard to the defects in these planes and what steps may be taken to remedy them.

  19. Practical applications of the method of generalized pole figures

    NASA Astrophysics Data System (ADS)

    Perlovich, Yu; Isaenkova, M.; Fesenko, V.; Krymskaya, O.; Dobrokhotov, P.

    2015-04-01

    Several current practical applications of the new X-ray method of generalized pole figures are considered. Among them are the determination of c- and a-dislocation densities in shell tubes of Zr-1%Nb alloy, the analysis of strain hardening at opposite sides of shell tubes of ferritic-martensitic steel with oxide-dispersion-strengthening particles for high-temperature nuclear reactors, and the revealing of substructure non-uniformity in Cu rods subjected to equal-channel angular pressing.

  20. Practical Application of Second Law Efficiency Analysis 

    E-print Network

    Gaggioli, R. A.; Wepfer, W. J.

    1983-01-01

    Slowly but surely the direct application of the Second Law of Thermodynamics is being recognized, for its practical usefulness in engineering. This paper will describe the methods, and show the results of applications to whole economic sectors...

  1. An Online Forum As a Qualitative Research Method: Practical Issues

    PubMed Central

    Im, Eun-Ok; Chee, Wonshik

    2008-01-01

    Background Despite positive aspects of online forums as a qualitative research method, very little is known about practical issues involved in using online forums for data collection, especially for a qualitative research project. Objectives The purpose of this paper is to describe the practical issues that the researchers encountered in implementing an online forum as a qualitative component of a larger study on cancer pain experience. Method Throughout the study process, the research staff recorded issues ranging from minor technical problems to serious ethical dilemmas as they arose and wrote memos about them. The memos and written records of discussions were reviewed and analyzed using the content analysis suggested by Weber. Results Two practical issues related to credibility were identified: a high response and retention rate and automatic transcripts. An issue related to dependability was the participants’ easy forgetfulness. The issues related to confirmability were difficulties in theoretical saturation and unstandardized computer and Internet jargon. A security issue related to hacking attempts was noted as well. Discussion The analysis of these issues suggests several implications for future researchers who want to use online forums as a qualitative data collection method. PMID:16849979

  2. Spiritual Assessment in Counseling: Methods and Practice

    ERIC Educational Resources Information Center

    Oakes, K. Elizabeth; Raphel, Mary M.

    2008-01-01

    Given the widely expanding professional and empirical support for integrating spirituality into counseling, the authors present a practical discussion for raising counselors' general awareness and skill in the critical area of spiritual assessment. A discussion of rationale, measurement, and clinical practice is provided along with case examples.…

  3. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.

  4. Council on Certification Professional Practice Analysis.

    PubMed

    Zaglaniczny, K L

    1993-06-01

    The CCNA has completed a PPA and will begin implementing its recommendations with the December 1993 certification examination. The results of the PPA provide content validation for the CCNA certification examination. The certification examination is reflective of the knowledge and skill required for entry-level practice. Assessment of this knowledge is accomplished through the use of questions that are based on the areas represented in the content outline. Analysis of the PPA has resulted in changes in the examination content outline and percentages of questions in each area to reflect current entry-level nurse anesthesia practice. The new outline is based on the major domains of knowledge required for nurse anesthesia practice. These changes are justified by the consistency in the responses of the practitioners surveyed. There was overall agreement as to the knowledge and skills related to patient conditions, procedures, agents, techniques, and equipment that an entry-level CRNA must have to practice. Members of the CCNA and Examination Committee will use the revised outline to develop questions for the certification examination. The questions will be focused on the areas identified as requiring high levels of expertise and those that appeared higher in frequency. The PPA survey will be used as a basis for subsequent content validation studies. It will be revised to reflect new knowledge, technology, and techniques related to nurse anesthesia practice. The CCNA has demonstrated its commitment to the certification process through completion of the PPA and implementation of changes in the structure of the examination. PMID:8291387

  5. Practical stochastic fatigue analysis of offshore platforms

    Microsoft Academic Search

    R. Skjong; H. O. Madsen

    1987-01-01

    A method for stochastic fatigue analysis of offshore platforms is presented. The method accounts for the nonlinearity in the drag loading term, and for the systematic deviation from a Gaussian process for a platform response. The method is based on an assumed form of the stress response process at a hot spot. A number of full scale measurements for elements

  6. Requirements Analysis for the Small Office Practice

    PubMed Central

    Giannakopoulos, Stephen M.E.; Hanmer, Jean C.

    1980-01-01

    Physicians are beginning to automate their office practices and many are not familiar with industry's techniques for evaluating such systems. A number of requirements analysis steps defined in business can be applied to the analysis of an office practice. The procedures suggested may be carried out by a physician and his auxiliary staff (with the aid of a consultant if desired). Our purpose is to help the practitioner define the problems impeding the flow of work in the office. The resolution of these problems will not necessarily justify the installation of a computerized system. For practitioners who are acquiring a computer system, a series of cost and quality questions are provided for use in evaluating proposed systems.

  7. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  8. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  9. PRACTICAL STEREOLOGICAL METHODS FOR MORPHOMETRIC CYTOLOGY

    Microsoft Academic Search

    EWALD R. WEIBEL; S. KISTLER; WALTER F. SCHERLE

    2009-01-01

    Stereological principles provide efficient and reliable tools for the determination of quantitative parameters of tissue structure on sections. Some principles which allow the estimation of volumetric ratios, surface areas, surface-to-volume ratios, thicknesses of tissue or cell sheets, and the number of structures are reviewed and presented in general form; means for their practical application in electron microscopy are outlined.

  10. Integrating analysis and design methods (abstract)

    Microsoft Academic Search

    Derek Coleman; Paul Jeremaes

    1993-01-01

    Most of those who practice object-oriented analysis and design do not follow any standard method exactly, but combine different techniques to suit their own unique requirements. Each method employs its own set of models, notations, and processes, so it can be difficult to combine them. This tutorial shows how to design a method by

  11. A practical approach for nonlinear analysis of tensegrity systems

    Microsoft Academic Search

    Ayhan Nuhoglu; Kasim Armagan Korkmaz

    Tensegrity systems are lightweight structures composed of cables and struts. The nonlinear behavior of tensegrity systems is critical; therefore, the design of these types of structures is relatively complex. In the present study, a practical and efficient approach for geometrical nonlinear analysis of tensegrity systems is proposed. The approach is based on the point iterative method. Static equilibrium equations are
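
The record names a point iterative method for the nonlinear static equilibrium equations. As a hedged, much-simplified illustration of that kind of iteration, the snippet below solves the geometrically nonlinear equilibrium of a single node held by two elastic cables under a vertical load, using Newton iteration with a finite-difference tangent (the geometry, stiffness, and load values are invented):

```python
import numpy as np

# Single node held by two elastic cables anchored at (-L, 0) and (L, 0),
# carrying a vertical load P. Geometric nonlinearity enters through the
# cable length; cables carry tension only.
L, L0, k, P = 1.0, 0.95, 1000.0, 50.0   # half-span, slack length, EA/L0, load

def residual(y):
    l = np.hypot(L, y)                  # current cable length at sag y
    T = k * max(l - L0, 0.0)            # cable tension (no compression)
    return 2.0 * T * (y / l) - P        # vertical equilibrium residual

# Newton iteration with a finite-difference tangent stiffness.
y, h = 0.5, 1e-7
for _ in range(50):
    r = residual(y)
    if abs(r) < 1e-10:
        break
    drdy = (residual(y + h) - r) / h
    y -= r / drdy
print(f"equilibrium sag y = {y:.4f}, residual = {residual(y):.2e}")
```

A real tensegrity analysis iterates a vector of such equilibrium equations over every free node, but the structure of the iteration, residual, tangent, update, repeat, is the same.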

  12. Statistical evaluation of the influence of several sample pretreatment methods on the mercury content detectable by chemical analysis of contaminated soil samples under practical conditions

    Microsoft Academic Search

    W. Rasemann; U. Seltmann; M. Hempel

    1995-01-01

    The estimation of the environmental risk of contaminated sites caused by hazardous components may be obtained, for instance, by means of a soil survey. There, unavoidable errors occur in sampling, sample preparation and chemical analysis. Furthermore, in case of mercury contaminations, the mercury content detectable by chemical analysis can be falsified, if between sampling, on the one hand, and sample

  13. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for ultrasonic test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice facilitates the interoperability of ultrasonic imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E 2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E 2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, transfer and archival storage. The goal of Practice E 2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E 2339 provides a data dictionary and set of information modules that are applicable to all NDE modalities. This practice supplements Practice E 2339 by providing information object definitions, information ...

  14. A Practical Guide to Wavelet Analysis

    NSDL National Science Digital Library

    Researchers dealing with time series data will find this powerful resource extremely helpful. Drs. Christopher Torrence (National Center for Atmospheric Research) and Gilbert Compo (NOAA/ CIRES Climate Diagnostics Center) have put together this Website for researchers interested in using wavelet analysis, a technique that decomposes a time series into time-frequency space. The site provides information on "both the amplitude of any periodic signals within the series, and how this amplitude varies with time." The nicely written introductory section (Wavelet Analysis & Monte Carlo) is complete with algorithms, graphically illustrated examples, and references (including some links). First time users may wish to consult the on-site article "A Practical Guide to Wavelet Analysis," originally published in 1998 (.pdf format), or browse the FAQ section. The heart of the site is the Interactive Wavelet Plots section; here, users may experiment with wavelet analysis using time series data provided at the site (i.e., Sea Surface Temperature, Sunspots) or provided by the user. As if that weren't enough, the site also offers free Wavelet software (Fortran, IDL, or Matlab; acknowledgment required) and several abbreviated data sets for experimentation.
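
A minimal sketch of the core idea, decomposing a time series into time-frequency space with a Morlet wavelet, can be written in plain NumPy (this is an illustrative implementation, not the Torrence and Compo code distributed on the site):

```python
import numpy as np

def morlet_cwt(x, dt, scales, w0=6.0):
    # Continuous wavelet transform with a Morlet mother wavelet: correlate
    # the signal with scaled copies of the wavelet at each requested scale.
    n = len(x)
    out = np.empty((len(scales), n), dtype=complex)
    t = (np.arange(n) - n // 2) * dt
    for i, s in enumerate(scales):
        u = t / s
        psi = np.pi ** -0.25 * np.exp(1j * w0 * u) * np.exp(-0.5 * u ** 2)
        # correlation = convolution with the conjugated, time-reversed wavelet
        out[i] = np.convolve(x, np.conj(psi[::-1]), mode="same") * dt / np.sqrt(s)
    return out

# A 1 Hz sine sampled at 64 Hz: with w0 = 6 the Morlet scale matching 1 Hz
# is about 6 / (2*pi) ~ 0.95, so power should concentrate at scale 1.0.
dt = 1 / 64
t = np.arange(0, 8, dt)
x = np.sin(2 * np.pi * 1.0 * t)
scales = np.array([0.25, 0.5, 1.0, 2.0, 4.0])
power = np.abs(morlet_cwt(x, dt, scales)) ** 2
best = scales[np.argmax(power.mean(axis=1))]
print("dominant scale:", best)
```

The wavelet power array is exactly the quantity plotted in the site's interactive wavelet plots: amplitude of periodic signals as a function of both scale and time.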

  15. Practical Applications of Student Response Analysis.

    ERIC Educational Resources Information Center

    Switzer, Deborah M.; Connell, Michael L.

    This paper describes teacher usage of the microcomputer programs Test Analysis Package (TAP) and Student Problem Package (SPP) to analyze students' test item responses. These methods of organizing, analyzing, and reporting test results have proven useful to classroom teachers. The TAP consists of four integrated microcomputer programs to edit,…
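
The record does not specify TAP's computations, but classical item analysis of the kind such packages report can be sketched as an item difficulty (proportion correct) plus a corrected item-total discrimination. The response matrix is hypothetical:

```python
import numpy as np

# Hypothetical scored responses: rows = students, columns = items (1 = correct).
resp = np.array([
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 0, 0, 0],
])

difficulty = resp.mean(axis=0)   # proportion correct (the classical p-value)

# Discrimination: correlate each item with the total score of the *other*
# items, so the item does not inflate its own correlation.
total = resp.sum(axis=1)
disc = np.array([np.corrcoef(resp[:, j], total - resp[:, j])[0, 1]
                 for j in range(resp.shape[1])])
print("difficulty:    ", difficulty.round(2))
print("discrimination:", disc.round(2))
```

Items with very high or very low difficulty, or with near-zero discrimination, are the ones a teacher would flag for revision after a test administration.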

  16. Optimizing Distributed Practice: Theoretical Analysis and Practical Implications

    ERIC Educational Resources Information Center

    Cepeda, Nicholas J.; Coburn, Noriko; Rohrer, Doug; Wixted, John T.; Mozer, Michael C,; Pashler, Harold

    2009-01-01

    More than a century of research shows that increasing the gap between study episodes using the same material can enhance retention, yet little is known about how this so-called distributed practice effect unfolds over nontrivial periods. In two three-session laboratory studies, we examined the effects of gap on retention of foreign vocabulary,…

  17. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair same as new, leading to a renewal process, and repair same as old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second process, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues that are discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
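
One simple trend test of the kind alluded to is, in common practice, the Laplace test: it checks whether failure times cluster late in the observation window, which would argue against a stationary renewal model and for a nonhomogeneous Poisson process. A sketch with invented failure data:

```python
import math

def laplace_trend(failure_times, t_end):
    # Laplace statistic for a failure point process observed on (0, t_end].
    # Under a homogeneous Poisson process, U is approximately N(0, 1);
    # a large positive U suggests failures clustering late, i.e., aging.
    n = len(failure_times)
    return (sum(failure_times) - n * t_end / 2) / (t_end * math.sqrt(n / 12))

# Invented failure record (hours): inter-failure gaps shrink over time.
times = [200, 450, 650, 800, 900, 960, 990]
u = laplace_trend(times, t_end=1000)
print(f"U = {u:.2f}")
```

Here U is near 1.9, just above the one-sided 5% normal critical value of 1.645, so the repair-same-as-old (NHPP) assumption would merit a closer look before fitting a renewal distribution.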

  18. Risk Analysis of Hematopoietic Stem Cell Transplant Process: Failure Mode, Effect, and Criticality Analysis and Hazard Analysis Critical Control Point Methods Integration Based on Guidelines to Good Manufacturing Practice for Medicinal Product ANNEX 20 (February 2008)

    Microsoft Academic Search

    S. Gianassi; S. Bisin; B. Bindi; I. Spitaleri; F. Bambi

    2010-01-01

    The collection and handling of hematopoietic stem cells (HSCs) must meet high quality requirements. An integrated Quality Risk Management can help to identify and contain potential risks related to HSC production. Risk analysis techniques allow one to “weigh” identified hazards, considering the seriousness of their effects, frequency, and detectability, seeking to prevent the most harmful hazards. The Hazard Analysis Critical
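
The "weighing" of hazards by seriousness, frequency, and detectability is conventionally condensed into a risk priority number (RPN), the product of the three scores. The hazards, scores, and action threshold below are illustrative, not taken from ANNEX 20:

```python
# Minimal FMECA-style scoring sketch: RPN = severity x occurrence x
# detectability, then rank hazards and act on the highest. All entries
# here are invented for illustration.
hazards = [
    # (description,                     severity, occurrence, detectability)
    ("bag mislabelled",                  9,        2,          4),
    ("cold-chain break in transport",    8,        3,          3),
    ("microbial contamination",         10,        1,          5),
    ("incomplete donor screening data",  6,        2,          2),
]

scored = sorted(((s * o * d, desc) for desc, s, o, d in hazards), reverse=True)
THRESHOLD = 50                      # act on hazards above this RPN
for rpn, desc in scored:
    flag = "ACT" if rpn > THRESHOLD else "monitor"
    print(f"{rpn:4d}  {flag:8s} {desc}")
```

The ranking step is what lets a quality team spend prevention effort on the most harmful hazards first, which is the stated aim of integrating FMECA with HACCP in the abstract.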

  19. Innovative Computational Methods For Transcriptomic Data Analysis: A Case Study in the Use Of FPT For Practical Algorithm Design and Implementation

    SciTech Connect

    Langston, Michael A [University of Tennessee, Knoxville (UTK); Perkins, Andy [University of Tennessee, Knoxville (UTK); Saxton, Arnold [University of Tennessee, Knoxville (UTK); Scharff, Jon [University of Tennessee, Knoxville (UTK); Voy, Brynn H [ORNL

    2007-01-01

    Tools of molecular biology and the evolving tools of genomics can now be exploited to study the genetic regulatory mechanisms that control cellular responses to a wide variety of stimuli. These responses are highly complex, and involve many genes and gene products. The main objectives of this paper are to describe a novel research program centered on understanding these responses by (i) developing powerful graph algorithms that exploit the innovative principles of fixed parameter tractability in order to generate distilled gene sets; (ii) producing scalable, high performance parallel and distributed implementations of these algorithms utilizing cutting-edge computing platforms and auxiliary resources; (iii) employing these implementations to identify gene sets suggestive of co-regulation; and (iv) performing sequence analysis and genomic data mining to examine, winnow and highlight the most promising gene sets for more detailed investigation. As a case study, we describe our work aimed at elucidating genetic regulatory mechanisms that control cellular responses to low-dose ionizing radiation (IR). A low-dose exposure, as defined here, is an exposure of at most 10 cGy (rads). While the consequences of high doses of radiation are well known, the net outcome of low-dose exposures continues to be debated, with support in the literature for both detrimental and beneficial effects. We use genome-scale gene expression data collected in response to low-dose IR exposure in vivo to identify the pathways that are activated or repressed as a tissue responds to the radiation insult. The driving motivation is that knowledge of these pathways will help clarify and interpret physiological responses to IR, which will advance our understanding of the health consequences of low-dose radiation exposures.
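
The textbook example of the fixed parameter tractability principle the authors exploit is vertex cover: a bounded search tree decides whether a cover of size at most k exists in time exponential only in k, not in the graph size. This sketch shows the generic algorithm, not the authors' implementations:

```python
# Bounded-search-tree vertex cover, the classic FPT example: decide in
# O(2^k * n) time whether the graph has a vertex cover of size <= k.
def vertex_cover(edges, k):
    edges = [e for e in edges if e[0] != e[1]]   # drop self-loops
    if not edges:
        return set()                             # everything already covered
    if k == 0:
        return None                              # edges remain but budget spent
    u, v = edges[0]
    # Branch: any cover of edge (u, v) must contain u or v.
    for pick in (u, v):
        rest = [e for e in edges if pick not in e]
        sub = vertex_cover(rest, k - 1)
        if sub is not None:
            return sub | {pick}
    return None

# Toy co-expression-style graph: a triangle plus a pendant edge.
g = [(1, 2), (2, 3), (1, 3), (3, 4)]
print(vertex_cover(g, 2))    # a 2-vertex cover exists
print(vertex_cover(g, 1))    # None: no single vertex covers a triangle
```

The same bounded-branching idea underlies the clique-centric gene-set extraction the paper describes, where the parameter k stays small even when the expression graph is large.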

  20. Compassion fatigue within nursing practice: a concept analysis.

    PubMed

    Coetzee, Siedine Knobloch; Klopper, Hester C

    2010-06-01

    "Compassion fatigue" was first introduced in relation to the study of burnout among nurses, but it was never defined within this context; it has since been adopted as a synonym for secondary traumatic stress disorder, which is far removed from the original meaning of the term. The aim of the study was to define compassion fatigue within nursing practice. The method that was used in this article was concept analysis. The findings revealed several categories of compassion fatigue: risk factors, causes, process, and manifestations. The characteristics of each of these categories are specified and a connotative (theoretical) definition, model case, additional cases, empirical indicators, and a denotative (operational) definition are provided. Compassion fatigue progresses from a state of compassion discomfort to compassion stress and, finally, to compassion fatigue, which if not effaced in its early stages of compassion discomfort or compassion stress, can permanently alter the compassionate ability of the nurse. Recommendations for nursing practice, education, and research are discussed. PMID:20602697

  1. A practical, precise method for frequency tracking and phasor estimation

    Microsoft Academic Search

    Maohai Wang; Yuanzhang Sun

    2004-01-01

    Comprehensive analysis of discrete Fourier transform (DFT) error is given in this paper, including why it is accurate in the case of synchronous sampling and how error arises when the sampling frequency is not synchronized to the signal frequency. Simple but precise expressions of phase angle error and amplitude error are given. Practical formulas to calculate the true phase angle
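
The effect described, an exact DFT phasor under synchronous sampling and errors once the signal frequency drifts off-nominal, is easy to reproduce numerically. The nominal frequency and sample count below are illustrative:

```python
import numpy as np

f_nom, N = 50.0, 32            # nominal frequency and samples per cycle
fs = f_nom * N                 # sampling rate locked to the nominal frequency
n = np.arange(N)

def phasor(x):
    # Fundamental-component DFT phasor over one nominal cycle.
    X = (2 / N) * np.sum(x * np.exp(-2j * np.pi * n / N))
    return np.abs(X), np.angle(X)

results = {}
for f in (50.0, 50.5):         # synchronous vs. off-nominal signal
    x = np.cos(2 * np.pi * f * n / fs)
    results[f] = phasor(x)
    amp, ph = results[f]
    print(f"f = {f} Hz: amplitude = {amp:.4f}, phase = {np.degrees(ph):.2f} deg")
```

With synchronous sampling the unit-amplitude, zero-phase phasor is recovered exactly; at 50.5 Hz a phase error of roughly a degree and a half and a small amplitude ripple appear, which is the error the paper's formulas quantify and its frequency-tracking scheme corrects.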

  2. Signal Processing: Signals

    E-print Network

    Rimon, Elon

    The characterization as well as the analysis methods depend on the signal structure. The following are some classification possibilities: deterministic vs. random; transient vs. continuous; stationary vs. nonstationary. In practice we often encounter combinations of signal

  3. Patients’ experiences of the choice of GP practice pilot, 2012/2013: a mixed methods evaluation

    PubMed Central

    Tan, Stefanie; Erens, Bob; Wright, Michael; Mays, Nicholas

    2015-01-01

    Objectives To investigate patients’ experiences of the choice of general practitioner (GP) practice pilot. Design Mixed-method, cross-sectional study. Setting Patients in the UK National Health Service (NHS) register with a general practice responsible for their primary medical care, and practices set geographic boundaries. In 2012/2013, 43 volunteer general practices in four English NHS primary care trusts (PCTs) piloted a scheme allowing patients living outside practice boundaries to register as an out of area patient or be seen as a day patient. Participants Analysis of routine data for 1108 out of area registered patients and 250 day patients; postal survey of out of area registered (315/886, 36%) and day (64/188, 34%) patients over 18 years of age, with a UK mailing address; comparison with General Practice Patient Survey (GPPS); semistructured interviews with 24 pilot patients. Results Pilot patients were younger and more likely to be working than non-pilot patients at the same practices and generally reported more positive, or at least as positive, experiences as patients registered at the same practices, practices in the same PCT and nationally, despite belonging to subgroups of the population who typically report poorer than average experiences. Out of area patients who joined a pilot practice did so: after moving house and not wanting to change practice (26.2%); for convenience (32.6%); as newcomers to an area who selected a practice although they lived outside its boundary (23.6%); because of dissatisfaction with their previous practice (13.9%). Day patients attended primarily on grounds of convenience (68.8%); 51.6% of the day patient visits were for acute infections, most commonly upper respiratory infections (20.4%). Sixty-six per cent of day patients received a prescription during their visit.
Conclusions Though the 12-month pilot was too brief to identify all costs and benefits, the scheme provided a positive experience for participating patients and practices. PMID:25667149

  4. Research in dental practice: a 'SWOT' analysis.

    PubMed

    Burke, F J T; Crisp, R J; McCord, J F

    2002-03-01

    Most dental treatment, in most countries, is carried out in general dental practice. There is therefore a potential wealth of research material, although clinical evaluations have generally been carried out on hospital-based patients. Many types of research, such as clinical evaluations and assessments of new materials, may be appropriate to dental practice. Principal problems are that dental practices are established to treat patients efficiently and to provide an income for the staff of the practice. Time spent on research therefore cannot be used for patient treatment, so there are cost implications. Critics of practice-based research have commented on the lack of calibration of operative diagnoses and other variables; however, this variability is the stuff of dental practice, the real-world situation. Many of the difficulties in carrying out research in dental practice may be overcome. For the enlightened, it may be possible to turn observations based on the volume of treatment carried out in practice into robust, clinically related and relevant research projects based in the real world of dental practice. PMID:11928346

  5. ACCA: An Architecture-Centric Concern Analysis Method

    Microsoft Academic Search

    Zhenyu Wang; Khalid Sherdil; Nazim H. Madhavji

    2005-01-01

    The architecture of a software system is a key asset for a software business. While there are several architecting and evaluation methods, literature and practice are devoid of architecture-centric concern analysis (ACCA) methods analogous to causal analysis methods for software defects. A concern is any aspect of an architecture considered undesirable. This paper describes an ACCA method which uses at its

  6. Clinical interview methods in mathematics education research and practice

    Microsoft Academic Search

    Robert P. Hunting

    1997-01-01

    Use of clinical interview methods in mathematics education research and as an assessment strategy in the mathematics classroom are contrasted. Differences and similarities between roles of researcher and practitioner are outlined. Uses of clinical interviews in research and practice are discussed by focusing on issues of how one prepares to administer an interview, kinds of tasks found to be most

  7. METHODS OF PLANKTON INVESTIGATION IN THEIR RELATION TO PRACTICAL PROBLEMS.

    E-print Network

    By JACOB REIGHARD. The total mass of plankton is, in most bodies of water, so great that, in comparison with it, it is customary to neglect the fixed plants along the shore and the animals that they harbor. That the plankton

  8. Methods to investigate coronary microvascular function in clinical practice.

    PubMed

    Lanza, Gaetano A; Camici, Paolo G; Galiuto, Leonarda; Niccoli, Giampaolo; Pizzi, Carmine; Di Monaco, Antonio; Sestito, Alfonso; Novo, Salvatore; Piscione, Federico; Tritto, Isabella; Ambrosio, Giuseppe; Bugiardini, Raffaele; Crea, Filippo; Marzilli, Mario

    2013-01-01

    A growing amount of data is increasingly showing the relevance of coronary microvascular dysfunction (CMVD) in several clinical contexts. This article reviews techniques and clinical investigations of the main noninvasive and invasive methods proposed to study coronary microcirculation and to identify CMVD in the presence of normal coronary arteries, also trying to provide indications for their application in clinical practice. PMID:23222188

  9. Time series analysis with the VSAA method

    NASA Astrophysics Data System (ADS)

    Tsantilas, S.; Kolenberg, K.; Rovithis-Livaniou, H.

    2009-03-01

    Time series analysis is a common task in many scientific fields, and so it is in astronomy, too. Fourier Transform and Wavelet Analysis are usually applied to handle the majority of the cases. Even so, problems arise when the time series signal presents modulation in the frequency under inspection. The Variable Sine Algorithmic Analysis (VSAA) is a new method focused exactly on this type of signals. It is based on a single sine function with variable coefficients and it is powered by the simplex algorithm. In cases of phenomena triggered by a single mechanism - that Fourier Transform and Wavelet Analysis fail to describe practically and efficiently - VSAA provides a straightforward solution. The method has already been applied to orbital period changes and magnetic field variations of binary stars, as well as to the Blazhko effect of the pulsating RR Lyrae stars and to sunspot activity.
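
    The abstract does not give VSAA's details, but its core idea, a single sine with slowly varying coefficients fitted by the simplex algorithm, can be sketched as follows. The windowing scheme, parameterization and all names here are illustrative assumptions, not the published algorithm:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def fit_sine(t, y, f0):
        """Least-squares fit of A*sin(2*pi*f*t + phi) in one window,
        minimized with the Nelder-Mead simplex algorithm."""
        def cost(p):
            a, f, phi = p
            return np.sum((y - a * np.sin(2 * np.pi * f * t + phi)) ** 2)
        res = minimize(cost, x0=[np.std(y) * np.sqrt(2), f0, 0.0],
                       method="Nelder-Mead")
        return res.x  # amplitude, frequency, phase

    def vsaa_track(t, y, f0, win=200, step=50):
        """Slide a window over the series; return per-window sine parameters,
        so a frequency-modulated signal shows up as a drifting f column."""
        out = []
        for i in range(0, len(t) - win, step):
            sl = slice(i, i + win)
            out.append((t[sl].mean(), *fit_sine(t[sl], y[sl], f0)))
        return np.array(out)  # columns: window centre, A, f, phi

    # Synthetic chirp: instantaneous frequency drifts upward from 1.0 Hz
    t = np.linspace(0, 20, 2000)
    y = np.sin(2 * np.pi * (1.0 + 0.01 * t) * t)
    track = vsaa_track(t, y, f0=1.0)
    ```

    A Fourier transform of the whole chirp smears the power over a frequency band, whereas the windowed sine fit recovers the drift directly, which is the kind of modulated signal the method targets.
    
    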

  10. Computational analysis of protein tyrosine phosphatases: practical guide to bioinformatics and data resources

    Microsoft Academic Search

    Jannik N. Andersen; Robert L. Del Vecchio; Natarajan Kannan; James Gergel; Andrew F. Neuwald; Nicholas K. Tonks

    2005-01-01

    The exponential growth of sequence data has become a challenge to database curators and end-users alike and biologists seeking to utilize the data effectively are faced with numerous analysis methods. Here, with practical examples from our bioinformatics analysis of the protein tyrosine phosphatases (PTPs), we show how computational analysis can be exploited to fuel hypothesis-driven experimental research through the exploration

  11. Gait analysis methods in rehabilitation

    Microsoft Academic Search

    Richard Baker; Hugh Williamson; Gait CCRE

    2006-01-01

    INTRODUCTION: Brand's four reasons for clinical tests and his analysis of the characteristics of valid biomechanical tests for use in orthopaedics are taken as a basis for determining what methodologies are required for gait analysis in a clinical rehabilitation context. MEASUREMENT METHODS IN CLINICAL GAIT ANALYSIS: The state of the art of optical systems capable of measuring the positions of

  12. Supplementary Methods Sequence Analysis

    E-print Network

    Hahn, Matthew

    and Yang (1994) as implemented in the PAML software package (Yang 1997). Statistical Analysis: Protein interaction networks for fly, yeast, and worm were obtained from the GRID database (Breitkreutz et al. 2003), covering methods that detect interactions among proteins, including affinity precipitation, affinity chromatography, and yeast two

  13. Item-Analysis Methods and Their Implications for the ILTA Guidelines for Practice: A Comparison of the Effects of Classical Test Theory and Item Response Theory Models on the Outcome of a High-Stakes Entrance Exam

    ERIC Educational Resources Information Center

    Ellis, David P.

    2011-01-01

    The current version of the International Language Testing Association (ILTA) Guidelines for Practice requires language testers to pretest items before including them on an exam, or when pretesting is not possible, to conduct post-hoc item analysis to ensure any malfunctioning items are excluded from scoring. However, the guidelines are devoid of…

  14. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
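
    The linear (first-order) part of the analysis described above can be illustrated in a few lines: given a Jacobian of observation sensitivities, the posterior parameter covariance follows from the Gauss-Markov formula, and a prediction's variance is a quadratic form in its own sensitivity vector. This is a toy sketch with synthetic numbers, not the actual Yucca Mountain computation:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_obs, n_par = 30, 4

    J = rng.normal(size=(n_obs, n_par))   # sensitivities of observations to parameters
    sigma_obs = 0.1                       # observation error standard deviation
    C_prior = np.eye(n_par)               # prior parameter covariance

    # Posterior parameter covariance after (linearized) calibration
    C_post = np.linalg.inv(np.linalg.inv(C_prior) + J.T @ J / sigma_obs**2)

    y = rng.normal(size=n_par)            # sensitivity of the prediction to parameters
    var_prior = y @ C_prior @ y           # predictive variance before calibration
    var_post = y @ C_post @ y             # predictive variance after calibration
    print(var_prior, var_post)
    ```

    Data worth falls out of the same algebra: recomputing `var_post` with a candidate observation row appended to `J` measures how much that yet-to-be-gathered datum would reduce predictive uncertainty.
    
    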

  15. Will genomic selection be a practical method for plant breeding?

    PubMed Central

    Nakaya, Akihiro; Isobe, Sachiko N.

    2012-01-01

    Background Genomic selection or genome-wide selection (GS) has been highlighted as a new approach for marker-assisted selection (MAS) in recent years. GS is a form of MAS that selects favourable individuals based on genomic estimated breeding values. Previous studies have suggested the utility of GS, especially for capturing small-effect quantitative trait loci, but GS has not become a popular methodology in the field of plant breeding, possibly because there is insufficient information available on GS for practical use. Scope In this review, GS is discussed from a practical breeding viewpoint. Statistical approaches employed in GS are briefly described, before the recent progress in GS studies is surveyed. GS practices in plant breeding are then reviewed before future prospects are discussed. Conclusions Statistical concepts used in GS are discussed with genetic models and variance decomposition, heritability, breeding value and linear model. Recent progress in GS studies is reviewed with a focus on empirical studies. For the practice of GS in plant breeding, several specific points are discussed including linkage disequilibrium, feature of populations and genotyped markers and breeding scheme. Currently, GS is not perfect, but it is a potent, attractive and valuable approach for plant breeding. This method will be integrated into many practical breeding programmes in the near future with further advances and the maturing of its theory. PMID:22645117
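
    As a minimal illustration of the idea, not any specific method from the review, genomic estimated breeding values can be obtained by shrinking all marker effects simultaneously, in the style of ridge regression (RR-BLUP). The data, shrinkage value and names below are synthetic assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_train, n_markers = 200, 500

    # Genotypes coded 0/1/2 (allele counts); a few markers carry true effects
    X = rng.integers(0, 3, size=(n_train, n_markers)).astype(float)
    true_beta = np.zeros(n_markers)
    causal = rng.choice(n_markers, 20, replace=False)
    true_beta[causal] = rng.normal(0, 0.5, 20)
    y = X @ true_beta + rng.normal(0, 1.0, n_train)   # phenotypes = genetics + noise

    # Ridge solution: shrink all marker effects toward zero at once
    lam = 50.0
    beta_hat = np.linalg.solve(X.T @ X + lam * np.eye(n_markers), X.T @ y)

    # Genomic estimated breeding values for new, unphenotyped candidates
    X_new = rng.integers(0, 3, size=(50, n_markers)).astype(float)
    gebv = X_new @ beta_hat
    ```

    Selection then simply ranks candidates by `gebv`; the appeal for small-effect quantitative trait loci is that every marker contributes, rather than only those passing a significance threshold.
    
    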

  16. Encouraging Gender Analysis in Research Practice

    ERIC Educational Resources Information Center

    Thien, Deborah

    2009-01-01

    Few resources for practical teaching or fieldwork exercises exist which address gender in geographical contexts. This paper adds to teaching and fieldwork resources by describing an experience with designing and implementing a "gender intervention" for a large-scale, multi-university, bilingual research project that brought together a group of…

  17. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  18. Die attach using silver sintering practical implementation and analysis

    E-print Network

    Paris-Sud XI, Université de

    Die attach using silver sintering: practical implementation and analysis. Amandine Masson; Wissam … This paper presents a review of the different implementations and gives practical details about one of them, based on silver nano-particles; options are available based on various silver particle sizes and sintering additives.

  19. Practical methods to improve the development of computational software

    SciTech Connect

    Osborne, A. G.; Harding, D. W.; Deinert, M. R. [Department of Mechanical Engineering, University of Texas, Austin (United States)

    2013-07-01

    The use of computation has become ubiquitous in science and engineering. As the complexity of computer codes has increased, so has the need for robust methods to minimize errors. Past work has shown that the number of functional errors is related to the number of commands that a code executes. Since the late 1960s, major participants in the field of computation have encouraged the development of best practices for programming to help reduce coder-induced error, and this has led to the emergence of 'software engineering' as a field of study. Best practices for coding and software production have now evolved and become common in the development of commercial software. These same techniques, however, are largely absent from the development of computational codes by research groups. Many of the best practice techniques from the professional software community would be easy for research groups in nuclear science and engineering to adopt. This paper outlines the history of software engineering, as well as issues in modern scientific computation, and recommends practices that should be adopted by individual scientific programmers and university research groups. (authors)

  20. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E. [Emory Univ. School of Public Health, Atlanta, GA (United States); Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1995-02-01

    Certain genetic disorders are rare in the general population, but more common in individuals with specific trisomies. Examples of this include leukemia and duodenal atresia in trisomy 21. This paper presents a linkage analysis method for using trisomic individuals to map genes for such traits. It is based on a very general gene-specific dosage model that posits that the trait is caused by specific effects of different alleles at one or a few loci and that duplicate copies of "susceptibility" alleles inherited from the nondisjoining parent give increased likelihood of having the trait. Our mapping method is similar to identity-by-descent-based mapping methods using affected relative pairs and also to methods for mapping recessive traits using inbred individuals by looking for markers with greater than expected homozygosity by descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected homozygosity in the chromosomes inherited from the nondisjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the trait gene, a confidence interval for that distance, and methods for computing power and sample sizes. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers and how to test candidate genes. 20 refs., 5 figs., 1 tab.
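
    The test for linkage described above amounts to asking whether chromosomes inherited from the nondisjoining parent are homozygous at a marker more often than chance predicts. A toy version can be written as a one-sided binomial test; the null probability and counts here are placeholders, not values from the paper (in practice the null homozygosity probability depends on marker informativeness and the meiotic stage of nondisjunction):

    ```python
    from scipy.stats import binomtest

    p_null = 0.5          # placeholder homozygosity probability under no linkage
    n_informative = 40    # informative trisomic individuals at the marker
    n_homozygous = 29     # observed reduced-to-homozygosity count

    # One-sided test for an excess of homozygosity (evidence of linkage)
    result = binomtest(n_homozygous, n_informative, p_null, alternative="greater")
    print(result.pvalue)
    ```

    A small p-value indicates the marker is homozygous by descent more often than expected, suggesting linkage to a trait locus nearby.
    
    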

  1. A Practical Introduction to Analysis and Synthesis

    ERIC Educational Resources Information Center

    Williams, R. D.; Cosart, W. P.

    1976-01-01

    Discusses an introductory chemical engineering course in which mathematical models are used to analyze experimental data. Concepts illustrated include dimensional analysis, scaleup, heat transfer, and energy conservation. (MLH)

  2. A practical gait analysis system using gyroscopes

    Microsoft Academic Search

    Kaiyu Tong; Malcolm H Granat

    1999-01-01

    This study investigated the possibility of using uni-axial gyroscopes to develop a simple portable gait analysis system. Gyroscopes were attached to the skin surface of the shank and thigh segments, and the angular velocity of each segment was recorded. Segment inclinations and knee angle were derived from segment angular velocities. The angular signals from a motion analysis

  3. Comparison of four teaching methods on Evidence-based Practice skills of postgraduate nursing students.

    PubMed

    Fernandez, Ritin S; Tran, Duong Thuy; Ramjan, Lucie; Ho, Carey; Gill, Betty

    2014-01-01

    The aim of this study was to compare four teaching methods on the evidence-based practice knowledge and skills of postgraduate nursing students. Students enrolled in the Evidence-based Nursing (EBN) unit in Australia and Hong Kong in 2010 and 2011 received education via either the standard distance teaching method, computer laboratory teaching method, Evidence-based Practice-Digital Video Disc (EBP-DVD) teaching method or the didactic classroom teaching method. Evidence-based Practice (EBP) knowledge and skills were evaluated using student assignments that comprised validated instruments. One-way analysis of covariance was implemented to assess group differences on outcomes after controlling for the effects of age and grade point average (GPA). Data were obtained from 187 students. The crude mean score among students receiving the standard+DVD method of instruction was higher for developing a precise clinical question (8.1±0.8) and identifying the level of evidence (4.6±0.7) compared to those receiving other teaching methods. These differences were statistically significant after controlling for age and grade point average. Significant improvement in cognitive and technical EBP skills can be achieved for postgraduate nursing students by integrating a DVD as part of the EBP teaching resources. The EBP-DVD is an easy teaching method to improve student learning outcomes and ensure that external students receive equivalent and quality learning experiences. PMID:23107585
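
    The analysis-of-covariance step described above can be sketched as a nested-model F-test with ordinary least squares: compare a model with only the covariates (age, GPA) against one that adds the teaching-method groups. The data and effect sizes below are synthetic, purely for illustration:

    ```python
    import numpy as np
    from scipy import stats

    def rss(X, y):
        """Residual sum of squares from an OLS fit."""
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return float(np.sum((y - X @ beta) ** 2))

    rng = np.random.default_rng(1)
    n = 120
    method = rng.integers(0, 4, size=n)        # four teaching methods, coded 0..3
    age = rng.normal(30, 5, size=n)
    gpa = rng.normal(5.0, 0.8, size=n)
    effects = np.array([0.0, 0.2, 1.0, 0.1])   # method 2 (a "DVD" arm) helps most
    score = 6 + 0.3 * gpa + effects[method] + rng.normal(0, 0.5, size=n)

    ones = np.ones(n)
    dummies = (method[:, None] == np.arange(1, 4)).astype(float)  # reference: method 0
    X_reduced = np.column_stack([ones, age, gpa])       # covariates only
    X_full = np.column_stack([X_reduced, dummies])      # covariates + method groups

    df1 = 3                                  # extra parameters in the full model
    df2 = n - X_full.shape[1]                # residual degrees of freedom
    F = ((rss(X_reduced, score) - rss(X_full, score)) / df1) / (rss(X_full, score) / df2)
    p = stats.f.sf(F, df1, df2)              # p-value for the method effect
    print(F, p)
    ```

    Because the covariates sit in both models, the F-test isolates the group effect after controlling for age and GPA, which is exactly what the one-way ANCOVA in the study does.
    
    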

  4. Practical aspects of genome-wide association interaction analysis.

    PubMed

    Gusareva, Elena S; Van Steen, Kristel

    2014-11-01

    Large-scale epistasis studies can give new clues to system-level genetic mechanisms and a better understanding of the underlying biology of human complex disease traits. Though many novel methods have been proposed to carry out such studies, so far only a few of them have demonstrated replicable results. Here, we propose a minimal protocol for genome-wide association interaction (GWAI) analysis to identify gene-gene interactions from large-scale genomic data. The different steps of the developed protocol are discussed and motivated, and encompass interaction screening in a hypothesis-free and hypothesis-driven manner. In particular, we examine a wide range of aspects related to epistasis discovery in the context of complex traits in humans, hereby giving practical recommendations for data quality control, variant selection or prioritization strategies and analytic tools, replication and meta-analysis, biological validation of statistical findings and other related aspects. The minimal protocol provides guidelines and attention points for anyone involved in GWAI analysis and aims to enhance the biological relevance of GWAI findings. At the same time, the protocol improves a better assessment of strengths and weaknesses of published GWAI methodologies. PMID:25164382

  5. Practical chaos time series analysis with financial applications

    Microsoft Academic Search

    Ikuo Matsuba; Hiroki Suyari; Sekjun Weon; D. Sato

    2000-01-01

    We describe the practical implementation of the nonlinear (chaos) time series analysis based on the paradigm of deterministic chaos. Some important techniques of statistical test for nonlinearity, phase space reconstruction, and nonlinear prediction are discussed with some applications to finance. The use of the nonlinear time series analysis is illustrated with particular emphasis on issues of choices of time delay

  6. MAD Skills: New Analysis Practices for Big Data

    Microsoft Academic Search

    Jeffrey Cohen; Brian Dolan; Mark Dunlap; Joseph M. Hellerstein; Caleb Welton

    2009-01-01

    As massive data acquisition and storage becomes increasingly affordable, a wide variety of enterprises are employing statisticians to engage in sophisticated data analysis. In this paper we highlight the emerging practice of Magnetic, Agile, Deep (MAD) data analysis as a radical departure from traditional Enterprise Data Warehouses and Business Intelligence. We present our design philosophy, techniques and

  7. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.

  8. Assessing methods for measurement of clinical outcomes and quality of care in primary care practices

    PubMed Central

    2012-01-01

    Purpose To evaluate the appropriateness of potential data sources for the population of performance indicators for primary care (PC) practices. Methods This project was a cross sectional study of 7 multidisciplinary primary care teams in Ontario, Canada. Practices were recruited and 5-7 physicians per practice agreed to participate in the study. Patients of participating physicians (20-30) were recruited sequentially as they presented to attend a visit. Data collection included patient, provider and practice surveys, chart abstraction and linkage to administrative data sets. Matched pairs analysis was used to examine the differences in the observed results for each indicator obtained using multiple data sources. Results Seven teams, 41 physicians, 94 associated staff and 998 patients were recruited. The survey response rate was 81% for patients, 93% for physicians and 83% for associated staff. Chart audits were successfully completed on all but 1 patient and linkage to administrative data was successful for all subjects. There were significant differences noted between the data collection methods for many measures. No single method of data collection was best for all outcomes. For most measures of technical quality of care chart audit was the most accurate method of data collection. Patient surveys were more accurate for immunizations, chronic disease advice/information dispensed, some general health promotion items and possibly for medication use. Administrative data appears useful for indicators including chronic disease diagnosis and osteoporosis/ breast screening. Conclusions Multiple data collection methods are required for a comprehensive assessment of performance in primary care practices. The choice of which methods are best for any one particular study or quality improvement initiative requires careful consideration of the biases that each method might introduce into the results. 
In this study, both patients and providers were willing to participate in, and consent to, the collection and linkage of information from multiple sources that would be required for such assessments. PMID:22824551

  9. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for digital radiographic (DR) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of digital X-ray imaging equipment by specifying image data transfer and archival methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitions, information modules and a ...

  10. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

  11. A practical interprocedural data flow analysis algorithm

    Microsoft Academic Search

    Jeffrey M. Barth

    1978-01-01

    A new interprocedural data flow analysis algorithm is presented and analyzed. The algorithm associates with each procedure in a program information about which variables may be modified, which may be used, and which are possibly preserved by a call on the procedure, and all of its subcalls. The algorithm is sufficiently powerful to be used on recursive programs and to

  12. Translational Behavior Analysis and Practical Benefits

    ERIC Educational Resources Information Center

    Pilgrim, Carol

    2011-01-01

    In his article, Critchfield ("Translational Contributions of the Experimental Analysis of Behavior," "The Behavior Analyst," v34, p3-17, 2011) summarizes a previous call (Mace & Critchfield, 2010) for basic scientists to reexamine the inspiration for their research and turn increasingly to translational approaches. Interestingly, rather than…

  13. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  14. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S.sup.T, by performing a constrained alternating least-squares analysis of D=CS.sup.T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
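
    The factorization step D = CSᵀ via constrained alternating least squares can be sketched in a few lines. This toy version uses nonnegativity clipping as the constraint and omits the patent's weighting/unweighting steps, so treat it as an illustration of the technique rather than the patented procedure; all names are chosen for the example:

    ```python
    import numpy as np

    def als_factor(D, k, n_iter=200, seed=0):
        """Factor D (m x n) into nonnegative C (m x k) and S (n x k) so that
        D ~= C @ S.T, by alternating constrained least-squares updates."""
        rng = np.random.default_rng(seed)
        m, n = D.shape
        C = rng.random((m, k))
        S = rng.random((n, k))
        for _ in range(n_iter):
            # Solve D ~= C S^T for S with C fixed, then clip to enforce S >= 0
            S = np.clip(np.linalg.lstsq(C, D, rcond=None)[0].T, 0, None)
            # Solve D^T ~= S C^T for C with S fixed, then clip to enforce C >= 0
            C = np.clip(np.linalg.lstsq(S, D.T, rcond=None)[0].T, 0, None)
        return C, S

    # Synthetic data: two nonnegative spectral shapes mixed with
    # nonnegative concentration intensities
    rng = np.random.default_rng(1)
    C_true = rng.random((50, 2))     # concentration intensities
    S_true = rng.random((40, 2))     # spectral shapes
    D = C_true @ S_true.T

    C, S = als_factor(D, k=2)
    err = np.linalg.norm(D - C @ S.T) / np.linalg.norm(D)
    print(err)
    ```

    On real spectral images, inspecting the columns of C (component maps) and S (component spectra) is what reveals the sample's constituents, as the abstract describes.
    
    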

  15. Practical histological microdissection for PCR analysis.

    PubMed

    Going, J J; Lamb, R F

    1996-05-01

    Recovery of cells by histological microdissection is increasingly used for analysis by polymerase chain reaction (PCR) or microchemical techniques. This paper describes techniques of histological microdissection. Sections of archival formalin-fixed, paraffin-embedded tissue up to 15 years old were mounted on plain glass slides. Sections 6-7 microns in thickness stained with toluidine blue were dissected under proteinase K buffer solution, using an electrolytically sharpened tungsten needle in a bacteriological loop-holder and a Leitz mechanical micromanipulator (model M). Detached cell groups were recovered in a silicone-coated pipette tip for PCR analysis after digestion in 25-50 microliters of proteinase K (500/ml) in TRIS-HCl buffer (pH 8.3). Consistent amplification and analysis of microsatellite loci were obtained from 2 microliters of crude lysate using 28-30 cycles of PCR incorporating a 32P 5'-end-labelled primer, electrophoresis under denaturing conditions on 6 per cent polyacrylamide gels, and autoradiographic detection. PMID:8691336

  16. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV), as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  17. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV), as well as high-energy γ rays (>1 MeV). An 8192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  18. Practical semen analysis: from A to Z.

    PubMed

    Brazil, Charlene

    2010-01-01

    Accurate semen analysis is critical for decisions about patient care, as well as for studies addressing overall changes in semen quality, contraceptive efficacy and effects of toxicant exposure. The standardization of semen analysis is very difficult for many reasons, including the use of subjective techniques with no standards for comparison, poor technician training, problems with proficiency testing and a reluctance to change techniques. The World Health Organization (WHO) Semen handbook (2010) offers a vastly improved set of standardized procedures, all at a level of detail that will preclude most misinterpretations. However, there is a limit to what can be learned from words and pictures alone. A WHO-produced DVD that offers complete demonstrations of each technique along with quality assurance standards for motility, morphology and concentration assessments would enhance the effectiveness of the manual. However, neither the manual nor a DVD will help unless there is general acknowledgement of the critical need to standardize techniques and rigorously pursue quality control to ensure that laboratories actually perform techniques 'according to WHO' instead of merely reporting that they have done so. Unless improvements are made, patient results will continue to be compromised and comparison between studies and laboratories will have limited merit. PMID:20111076

  19. Exhaled breath analysis: physical methods, instruments, and medical diagnostics

    NASA Astrophysics Data System (ADS)

    Vaks, V. L.; Domracheva, E. G.; Sobakinskaya, E. A.; Chernyaeva, M. B.

    2014-07-01

    This paper reviews the analysis of exhaled breath, a rapidly growing field in noninvasive medical diagnostics that lies at the intersection of physics, chemistry, and medicine. Current data are presented on gas markers in human breath and their relation to human diseases. Various physical methods for breath analysis are described. It is shown how measurement precision and data volume requirements have stimulated technological developments and identified the problems that have to be solved to put this method into clinical practice.

  20. Extended morphological processing: a practical method for automatic spot detection of biological markers from microscopic images

    PubMed Central

    2010-01-01

    Background A reliable extraction technique for resolving multiple spots in light or electron microscopic images is essential in investigations of the spatial distribution and dynamics of specific proteins inside cells and tissues. Currently, automatic spot extraction and characterization in complex microscopic images poses many challenges to conventional image processing methods. Results A new method to extract closely located, small target spots from biological images is proposed. This method starts with a simple but practical operation based on the extended morphological top-hat transformation to subtract an uneven background. The core of our novel approach is the following: first, the original image is rotated in an arbitrary direction and each rotated image is opened with a single straight line-segment structuring element. Second, the opened images are unified and then subtracted from the original image. To evaluate these procedures, model images of simulated spots with closely located targets were created and the efficacy of our method was compared to that of conventional morphological filtering methods. The results showed that our method performed better. Spots in real microscope images were also quantified, confirming that the method is applicable in practice. Conclusions Our method achieved effective spot extraction under various image conditions, including aggregated target spots, poor signal-to-noise ratio, and large variations in the background intensity. Furthermore, it has no restrictions with respect to the shape of the extracted spots. The features of our method allow its broad application in biological and biomedical image information analysis. PMID:20615231
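    The core idea, opening with line segments in several directions, unifying the openings, and subtracting the result from the original, can be approximated in a few lines. This sketch rotates the structuring element rather than the image (a simplification of the paper's rotation scheme), and the function name, segment length, and demo image are illustrative only:

    ```python
    import numpy as np
    from scipy import ndimage

    def line_tophat(img, length=9):
        """Top-hat with line-segment openings at 0, 45, 90, and 135 degrees."""
        footprints = [
            np.ones((1, length), dtype=bool),        # horizontal
            np.ones((length, 1), dtype=bool),        # vertical
            np.eye(length, dtype=bool),              # 45-degree diagonal
            np.fliplr(np.eye(length, dtype=bool)),   # 135-degree diagonal
        ]
        # Union (pixel-wise max) of the directional openings keeps any structure
        # that contains a line segment of the given length in some direction
        opened = np.max([ndimage.grey_opening(img, footprint=f) for f in footprints],
                        axis=0)
        # Subtracting it leaves only compact spots smaller than the segment
        return img - opened

    # Demo: a long bright line is suppressed, an isolated spot survives
    img = np.zeros((40, 40))
    img[20, 5:35] = 1.0   # elongated structure (not a spot)
    img[10, 10] = 1.0     # small target spot
    result = line_tophat(img)
    ```

    Because the line is longer than the structuring element in at least one direction, the opening reconstructs it and the subtraction removes it, while the isolated spot, which contains no full-length segment in any direction, is retained.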

  1. Multivariate analysis methods for spectroscopic blood analysis

    NASA Astrophysics Data System (ADS)

    Wood, Michael F. G.; Rohani, Arash; Ghazalah, Rashid; Vitkin, I. Alex; Pawluczyk, Romuald

    2012-01-01

    Blood tests are an essential tool in clinical medicine with the ability to diagnose or monitor various diseases and conditions; however, the complexities of these measurements currently restrict them to a laboratory setting. P&P Optica has developed and currently produces patented high-performance spectrometers and is developing a spectrometer-based system for rapid reagent-free blood analysis. An important aspect of this analysis is the need to extract the analyte-specific information from the measured signal such that the analyte concentrations can be determined. To this end, advanced chemometric methods are currently being investigated and have been tested using simulated spectra. A blood plasma model was used to generate Raman, near infrared, and optical rotatory dispersion spectra with glucose as the target analyte. The potential of combined chemometric techniques, where multiple spectroscopy modalities are used in a single regression model to improve the prediction ability, was investigated using unfold partial least squares and multiblock partial least squares. Results show improvement in the predictions of glucose levels using the combined methods and demonstrate potential for multiblock chemometrics in spectroscopic blood analysis.
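    The "unfold" combined-modality idea, concatenating spectra from multiple modalities into one predictor block for a single PLS regression, can be sketched on synthetic data. Everything below (the spectra, dimensions, and noise levels) is made up for illustration; true multiblock PLS adds per-block weighting and is more involved:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    n = 200
    glucose = rng.uniform(3.0, 10.0, n)  # hypothetical analyte levels (mmol/L)

    # Synthetic "spectra": each modality responds linearly to glucose plus noise
    raman = np.outer(glucose, rng.normal(size=50)) + rng.normal(scale=0.1, size=(n, 50))
    nir = np.outer(glucose, rng.normal(size=40)) + rng.normal(scale=0.1, size=(n, 40))

    X = np.hstack([raman, nir])  # unfold: concatenate modality blocks column-wise
    pls = PLSRegression(n_components=3).fit(X, glucose)
    pred = pls.predict(X).ravel()
    ```

    In practice the model would be validated on held-out samples rather than the training set, since PLS with many latent variables can overfit spectral noise.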

  2. An analysis of remanufacturing practices in Japan

    Microsoft Academic Search

    Mitsutaka Matsumoto; Yasushi Umeda

    2011-01-01

    Purpose: This study presents case studies of selected remanufacturing operations in Japan. It investigates Japanese companies' motives and incentives for remanufacturing, clarifies the requirements and obstacles facing remanufacturers, itemizes what measures companies take to address them, and discusses the influence of Japanese laws related to remanufacturing. Methods: This study involves case studies of four product areas: photocopiers, single-use cameras, auto parts, and

  3. Practical advanced analysis and design of three-dimensional truss bridges

    Microsoft Academic Search

    Seung-Eock Kim; Moon-Ho Park; Se-Hyu Choi

    2001-01-01

    A new design method of three-dimensional truss bridges using practical advanced analysis is presented. Separate member capacity checks encompassed by the code specifications are not required, because the stability of separate members and the structure as a whole can be rigorously treated in determining the maximum strength of the structures. The geometric nonlinearity is considered using the updated Lagrangian formulation.

  4. Theory and Practice of Logic Programming 1 State Space Computation and Analysis of Time

    E-print Network

    Paris-Sud XI, Université de

    Time Petri Nets provide a general framework to specify the behaviors of real-time reactive systems. This work presents a zone-based algorithm to compute the state space of a bounded Time Petri Net; the method is different

  5. SUSTAINABLE THEATRE: AN ANALYSIS OF THEORIES AND PRACTICES

    Microsoft Academic Search

    Calder Arthur Johnson

    2009-01-01

    This thesis is a qualitative analysis of techniques and theories for creating a more sustainable mode of production of live theater. In-depth, one-on-one interviewing is the primary research tool employed. Groups that interviewees have been drawn from include theatre practitioners, sustainable living advocates, academics in related

  6. Reconstructing northern Chinese Neolithic subsistence practices by isotopic analysis

    E-print Network

    Pechenkina, Ekaterina

    Bone samples indicate that Neolithic farmers of the Yellow and Wei River basins in China potentially cultivated, specifically pigs, dogs, and perhaps chicken. Bone samples were analyzed from four Neolithic sites: Jiangzhai

  7. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  8. MAD Skills: New Analysis Practices for Big Data Jeffrey Cohen

    E-print Network

    California at Irvine, University of

    MAD Skills: New Analysis Practices for Big Data Jeffrey Cohen Greenplum Brian Dolan Fox Audience and industrial development throughout the 1990's. Traditionally, a carefully designed EDW is considered to have as the rallying point for disciplined data integration within a large enterprise, rationalizing the outputs

  9. Practical evaluation of Mung bean seed pasteurization method in Japan.

    PubMed

    Bari, M L; Enomoto, K; Nei, D; Kawamoto, S

    2010-04-01

    The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

  10. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.
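    Identification from current-versus-potential data reduces to locating peak potentials and matching them against known values for candidate elements. A toy sketch of that matching step (the peak-potential library, tolerance, and simulated sweep are illustrative only, not values from the patent):

    ```python
    import numpy as np

    # Hypothetical peak potentials (V vs. reference electrode) for a few metals
    LIBRARY = {"Cd": -0.60, "Pb": -0.43, "Cu": 0.02}

    def identify(potentials, current, tol=0.05):
        """Match the potential of the largest current peak against the library."""
        e_peak = potentials[np.argmax(current)]
        name = min(LIBRARY, key=lambda k: abs(LIBRARY[k] - e_peak))
        return name if abs(LIBRARY[name] - e_peak) < tol else None

    # Simulated potential sweep with a single Gaussian current peak near -0.43 V
    E = np.linspace(-0.8, 0.2, 501)
    i_meas = np.exp(-(((E + 0.43) / 0.03) ** 2))
    ```

    With five working electrodes, as in the patent, the same matching would be applied to each electrode's current trace, and agreement across electrodes would strengthen the identification.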

  11. Voltametric analysis apparatus and method

    DOEpatents

    Almon, Amy C. (410 Waverly Dr., Augusta, GA 30909)

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  12. ENGL/DTC 560: Critical Theories, Methods, and Practice in Digital Humanities

    E-print Network

    Collins, Gary S.

    ENGL/DTC 560: Critical Theories, Methods, and Practice in Digital Humanities. 3 credits, and practice of digital humanities, while interrogating how digital humanities transforms knowledge across the humanities. COURSE DESCRIPTION: Critical Theories, Methods, and Practice in Digital Humanities examines

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Stated preference analysis of travel choices: the state of practice

    Microsoft Academic Search

    David A. Hensher

    1994-01-01

    Stated preference (SP) methods are widely used in travel behaviour research and practice to identify behavioural responses to choice situations which are not revealed in the market, and where the attribute levels offered by existing choices are modified to such an extent that the reliability of revealed preference models as predictors of response is brought into question. This paper reviews

  15. Practice

    NSDL National Science Digital Library

    Paul Goldenberg

    2011-10-25

    This article focuses on the role and techniques of effective ("distributed") practice that leads to full and fluent mastery of mental mathematics as well as conceptual growth around properties of arithmetic. It lists the essential mental math skills needed for fluent computation at grades 1, 2, and 3. The article describes a number of strategies for developing mental skills and links to pages with more details on others (some not yet complete). While this article refers to the Think Math! curriculum published by EDC, the methods generalize to any program. The Fact of the Day technique and a related video are cataloged separately.

  16. Generalized finite element method for multiscale analysis 

    E-print Network

    Zhang, Lin

    2004-11-15

    This dissertation describes a new version of the Generalized Finite Element Method (GFEM), which is well suited for problems set in domains with a large number of internal features (e.g. voids, inclusions, etc.), which are practically impossible...

  17. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  18. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  19. Methods used by Dental Practice-Based Research Network (DPBRN) dentists to diagnose dental caries

    PubMed Central

    Gordan, Valeria V.; Riley, Joseph L; Carvalho, Ricardo M.; Snyder, John; Sanderson, James L; Anderson, Mary; Gilbert, Gregg H.

    2010-01-01

    Objectives To (1) identify the methods that dentists in The Dental Practice-Based Research Network (DPBRN) use to diagnose dental caries; (2) quantify their frequency of use; and (3) test the hypothesis that certain dentist and dental practice characteristics are significantly associated with their use. Methods A questionnaire about methods used for caries diagnosis was sent to DPBRN dentists who reported doing at least some restorative dentistry; 522 dentists participated. Questions included use of dental radiographs, dental explorer, laser fluorescence, air-drying, fiber optic devices, and magnification, as used when diagnosing primary, secondary/recurrent, or non-specific caries lesions. Variations on the frequency of their use were tested using multivariate analysis and Bonferroni tests. Results Overall, the dental explorer was the instrument most commonly used to detect primary occlusal caries as well as to detect caries at the margins of existing restorations. In contrast, laser fluorescence was rarely used to help diagnose occlusal primary caries. For proximal caries, radiographs were used to help diagnose 75-100% of lesions by 96% of the DPBRN dentists. Dentists who use radiographs most often to assess proximal surfaces of posterior teeth, were significantly more likely to also report providing a higher percentage of patients with individualized caries prevention (p = .040) and seeing a higher percentage of pediatric patients (p = .001). Conclusion Use of specific diagnostic methods varied substantially. The dental explorer and radiographs are still the most commonly used diagnostic methods. PMID:21488724

  20. Neuroscience and the Feldenkrais Method: evidence in research and clinical practice

    E-print Network

    Hickman, Mark

    Some say evidence-based practice stifles the creative therapies and learning modalities. The Feldenkrais Method draws on principles of exploratory practice rather than prescribed exercises and can work at different

  1. Practical method to identify orbital anomaly as spacecraft breakup in the geostationary region

    NASA Astrophysics Data System (ADS)

    Hanada, Toshiya; Uetsuhara, Masahiko; Nakaniwa, Yoshitaka

    2012-07-01

    Identifying a spacecraft breakup is an essential issue in defining the current orbital debris environment. This paper proposes a practical method to identify an orbital anomaly, which appears as a significant discontinuity in the observation data, as a spacecraft breakup. The proposed method is applicable to orbital anomalies in the geostationary region. Long-term orbital evolution of breakup fragments suggests that their orbital planes will converge into several corresponding regions in inertial space even if the breakup epoch is not specified. This empirical method combines the aforementioned conclusion with the search strategy developed at Kyushu University, which can identify origins of observed objects as fragments released from a specified spacecraft. This practical method starts with selecting a spacecraft that experienced an orbital anomaly, and formulates a hypothesis to generate fragments from the anomaly. Then, the search strategy is applied to predict the behavior of groups of fragments hypothetically generated. The outcome of this predictive analysis specifies effectively when, where, and how optical measurements using ground-based telescopes should be conducted. Objects detected based on the outcome are supposed to be from the anomaly, so that the anomaly can be confirmed as a spacecraft breakup that released the detected objects. This paper also demonstrates observation planning for a spacecraft anomaly in the geostationary region.

  2. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

    Kazuko, Ishikawa-Takata; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767
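    The two-point approach described above reduces to first-order elimination kinetics: each isotope's disappearance rate is computed from enrichments at two time points, and the difference between the oxygen and deuterium rates reflects CO2 production. A simplified sketch of the underlying Lifson relation; the numbers are illustrative, and a real analysis must add fractionation and isotope-pool-size corrections:

    ```python
    import math

    def elimination_rate(enrich_start, enrich_end, days):
        """First-order rate constant (per day) from two enrichment measurements."""
        return math.log(enrich_start / enrich_end) / days

    def co2_production(body_water_mol, k_oxygen, k_deuterium):
        """Simplified Lifson equation: rCO2 = (N / 2) * (kO - kD).

        18-O leaves the body as both water and CO2, while deuterium leaves
        only as water, so the rate difference reflects CO2 output."""
        return body_water_mol / 2.0 * (k_oxygen - k_deuterium)

    # Illustrative numbers (not from the paper)
    k_o = elimination_rate(12.0, 4.0, 10.0)   # per day
    k_d = elimination_rate(10.0, 4.5, 10.0)   # per day
    r_co2 = co2_production(2200.0, k_o, k_d)  # mol CO2 per day
    ```

    Total energy expenditure then follows by multiplying rCO2 by an energy equivalent of CO2 chosen for an assumed food quotient.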

  3. Methods of stability analysis in nonlinear mechanics

    SciTech Connect

    Warnock, R.L.; Ruth, R.D.; Gabella, W.; Ecklund, K.

    1989-01-01

    We review our recent work on methods to study stability in nonlinear mechanics, especially for the problems of particle accelerators, and compare our ideas to those of other authors. We emphasize methods that (1) show promise as practical design tools, (2) are effective when the nonlinearity is large, and (3) have a strong theoretical basis. 24 refs., 2 figs., 2 tabs.

  4. Designing for scientific data analysis: From practice to prototype

    Microsoft Academic Search

    Springmeyer

    1992-01-01

    Designers charged with creating tools for processes foreign to their own experience need a reliable source of application knowledge. This dissertation presents an empirical study of the scientific data analysis process in order to inform the design of tools for this important aspect of scientific computing. Interaction analysis and contextual inquiry methods were adapted to observe scientists analyzing their own

  5. Cluster analysis in community research: Epistemology and practice

    Microsoft Academic Search

    Bruce D. Rapkin; Douglas A. Luke

    1993-01-01

    Cluster analysis refers to a family of methods for identifying cases with distinctive characteristics in heterogeneous samples and combining them into homogeneous groups. This approach provides a great deal of information about the types of cases and the distributions of variables in a sample. This paper considers cluster analysis as a quantitative complement to the traditional linear statistics that often
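    As a concrete instance of the family of methods described above, a k-means sketch on synthetic two-dimensional data (the groups and coordinates are made up; community research would typically cluster standardized survey scales instead):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    # Two well-separated synthetic groups of cases
    group_a = rng.normal(loc=(0.0, 0.0), scale=0.3, size=(30, 2))
    group_b = rng.normal(loc=(5.0, 5.0), scale=0.3, size=(30, 2))
    X = np.vstack([group_a, group_b])

    # Partition the heterogeneous sample into two homogeneous groups
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    ```

    Choosing the number of clusters and the distance metric are the substantive decisions; hierarchical methods, which the cluster-analysis literature also covers, avoid fixing the cluster count in advance.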

  6. Practice patterns in FNA technique: A survey analysis

    PubMed Central

    DiMaio, Christopher J; Buscaglia, Jonathan M; Gross, Seth A; Aslanian, Harry R; Goodman, Adam J; Ho, Sammy; Kim, Michelle K; Pais, Shireen; Schnoll-Sussman, Felice; Sethi, Amrita; Siddiqui, Uzma D; Robbins, David H; Adler, Douglas G; Nagula, Satish

    2014-01-01

    AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how this relates to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment. PMID:25324922

  7. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

    Subsurface coal fires destroy millions of tons of coal each year, have an immense impact to the ecological surrounding and threaten further coal reservoirs. Due to enormous dimensions a coal seam fire can develop, high operational expenses are needed. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China" the research team of University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and the correct application of different extinction techniques further factors are essential for the successful extinction. Appropriate tactics, well trained and protected personnel and the choice of the best fitting extinguishing agents are necessary for the successful extinction of a coal seam fire. The chosen strategy for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation, on environmental protection (e.g. commitment to green house gases (GHG) reductions), national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and computer-aided models and simulations of coal fire development from self ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - Returns on investments (costs of fire fighting compared to value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results, or the results define the budget; i.e. 
whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results shall be achieved. For an effective and efficient fire fighting optimal tactics are requiered and can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal, so that the coal can not begin to burn; or forming a barrier, so that the fire can not reach the not burning coal), - Rescue the coal (coal mining of a not burning seam), - Attack (active and direct cooling of burning seam), - Retreat (only monitoring till self-extinction of a burning seam). The last one is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics or a combination of them can be selected for control of a particular coal fire. For the extinguishing works different extinguishing agents are available. They can be applied by different application techniques and varying distinctive operating expenses. One application method may be the drilling of boreholes from the surface or covering the surface with low permeability soils. The mainly used extinction agents for coal field fire are as followed: Water (with or without additives), Slurry, Foaming mud/slurry, Inert gases, Dry chemicals and materials and Cryogenic agents. Because of its tremendous dimension and its complexity the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best fitting strategies and tactics, organisation and research as well as the dedication of the involved fire fighters, who work under extreme individual risks on the burning coal fields.

  8. Promoting nurses' knowledge in evidence-based practice: do educational methods matter?

    PubMed

    Toole, Belinda M; Stichler, Jaynelle F; Ecoff, Laurie; Kath, Lisa

    2013-01-01

    Evidence-based practice (EBP) is a mandate for nursing practice. Education on EBP has occurred in academic settings, but not all nurses have received this training. The authors describe a randomized controlled pretest/posttest design testing the differences in effectiveness of two educational methods to improve nurses' knowledge, attitudes, and practice of EBP. Results indicated both methods improved self-reported practice. On the basis of the study findings, staff development educators can select the teaching method that best complements their organizational environment. PMID:23877287

  9. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

    Background The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages: problem exploration, idea generation and solution evaluation - were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  10. Homotopy analysis method for heat radiation equations

    Microsoft Academic Search

    S. Abbasbandy

    2007-01-01

    Here, the homotopy analysis method (HAM), one of the newest analytical methods, which is powerful and easy to use, is applied to solve heat transfer problems with a high order of nonlinearity. The results are compared with the perturbation method, numerical Runge–Kutta methods, and the homotopy perturbation method (HPM). The homotopy analysis method is used to solve an unsteady nonlinear convective–radiative equation containing two

  11. Strength-based Supervision: Frameworks, Current Practice, and Future Directions: A Wu-wei Method.

    ERIC Educational Resources Information Center

    Edwards, Jeffrey K.; Chen, Mei-Whei

    1999-01-01

    Discusses a method of counseling supervision similar to the wu-wei practice in Zen and Taoism. Suggests that this strength-based method and an understanding of isomorphy in supervisory relationships are the preferred practice for the supervision of family counselors. States that this model of supervision potentiates the person-of-the-counselor.…

  12. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  13. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that for the airline industry as a whole, the flight equipment investment decision is in a state of transition from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  14. Root Cause Analysis: Methods and Mindsets.

    ERIC Educational Resources Information Center

    Kluch, Jacob H.

    This instructional unit is intended for use in training operations personnel and others involved in scram analysis at nuclear power plants in the techniques of root cause analysis. Four lessons are included. The first lesson provides an overview of the goals and benefits of the root cause analysis method. Root cause analysis techniques are covered…

  15. Generalized Method of Fault Analysis

    Microsoft Academic Search

    V. Brandwajn; W. F. Tinney

    1985-01-01

    A generalized method is given for solving short-circuit faults of any conceivable complexity. The method efficiently combines the application of sparsity-oriented compensation techniques to sequence networks with the simulation of fault conditions in phase coordinates. All recent advances in features and modeling aspects of fault studies are incorporated in the method. Sparse vector techniques are extensively used to enhance speed

  16. Fourier Spectral Method for Shape Asymmetry Analysis

    E-print Network

    Wisconsin at Madison, University of

    Fourier Spectral Method for Shape Asymmetry Analysis. Moo K. Chung, Department of Biostatistics. Keywords recoverable from the fragmentary record: Brechbühler's 3D Fourier descriptor; surface shape registration; surface data smoothing; population asymmetry analysis framework; clinical population vs. normal controls; template image registration.

  17. A Practical Face Relighting Method for Directional Lighting Normalization

    Microsoft Academic Search

    Kuang-chih Lee; Baback Moghaddam

    2005-01-01

    We propose a simplified and practical computational technique for estimating directional lighting in uncalibrated images of faces in frontal pose. We show that this inverse problem can be solved using constrained least-squares and class-specific priors on shape and reflectance. For simplicity, the principal illuminant is modeled as a mixture of Lambertian and ambient components. By using a generic 3D

  18. Methods of Building Cost Analysis.

    ERIC Educational Resources Information Center

    Building Research Inst., Inc., Washington, DC.

    Presentation of symposium papers includes--(1) a study describing techniques for economic analysis of building designs, (2) three case studies of analysis techniques, (3) procedures for measuring the area and volume of buildings, and (4) an open forum discussion. Case studies evaluate--(1) the thermal economics of building enclosures, (2) an…

  19. Grounded Theory in Practice: Is It Inherently a Mixed Method?

    ERIC Educational Resources Information Center

    Johnson, R. B.; McGowan, M. W.; Turner, L. A.

    2010-01-01

    We address 2 key points of contention in this article. First, we engage the debate concerning whether particular methods are necessarily linked to particular research paradigms. Second, we briefly describe a mixed methods version of grounded theory (MM-GT). Grounded theory can be tailored to work well in any of the 3 major forms of mixed methods…

  20. Gas Hydrate Investigations Using Pressure Core Analysis: Current Practice

    NASA Astrophysics Data System (ADS)

    Schultheiss, P.; Holland, M.; Roberts, J.; Druce, M.

    2006-12-01

    Recently there have been a number of major gas hydrate expeditions, both academic and commercially oriented, that have benefited from advances in the practice of pressure coring and pressure core analysis, especially using the HYACINTH pressure coring systems. We report on the now mature process of pressure core acquisition, pressure core handling and pressure core analysis and the results from the analysis of pressure cores, which have revealed important in situ properties along with some remarkable views of gas hydrate morphologies. Pressure coring success rates have improved as the tools have been modified and adapted for use on different drilling platforms. To ensure that pressure cores remain within the hydrate stability zone, tool deployment, recovery and on-deck handling procedures now mitigate against unwanted temperature rises. Core analysis has been integrated into the core transfer protocol and automated nondestructive measurements, including P-wave velocity, gamma density, and X-ray imaging, are routinely made on cores. Pressure cores can be subjected to controlled depressurization experiments while nondestructive measurements are being made, or cores can be stored at in situ conditions for further analysis and subsampling.

  1. A report on the CCNA 2007 professional practice analysis.

    PubMed

    Muckle, Timothy J; Apatov, Nathaniel M; Plaus, Karen

    2009-06-01

    The purpose of this column is to present the results of the 2007 Professional Practice Analysis (PPA) of the field of nurse anesthesia, conducted by the Council on Certification of Nurse Anesthetists. The PPA used survey and rating scale methodologies to collect data regarding the relative emphasis of various aspects of the nurse anesthesia knowledge domain and competencies. A total of 3,805 survey responses were analyzed using the Rasch rating scale model, which aggregates and transforms ordinal (rating scale) responses into linear measures of relative importance and frequency. Summaries of respondent demographics and educational and professional background are provided, as well as descriptions of how the survey results are used to develop test specifications. The results of this analysis provide evidence for the content outline and test specifications (content percentages) and thus serve as a basis of content validation for the National Certification Examination. PMID:19645167
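The Rasch rating scale model named in the abstract converts ordinal rating-scale responses into linear measures. A minimal sketch of its category-probability computation follows; the symbols (θ for person measure, δ for item difficulty, τ for step thresholds) and all numeric values are illustrative, not taken from the PPA data:

```python
import math

def rasch_rating_probs(theta, delta, thresholds):
    """Category probabilities under the Rasch (Andrich) rating scale
    model for a respondent of measure theta rating an item of
    difficulty delta with shared step thresholds tau_1..tau_m."""
    cum = [0.0]  # cumulative logit for the lowest category
    s = 0.0
    for tau in thresholds:
        s += theta - (delta + tau)
        cum.append(s)
    exps = [math.exp(v) for v in cum]
    z = sum(exps)
    return [e / z for e in exps]

# hypothetical 4-category importance rating; all values illustrative
probs = rasch_rating_probs(theta=1.0, delta=0.0, thresholds=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])
```

With θ above δ, probability mass shifts toward the higher rating categories, which is how ordinal responses map onto a linear scale.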

  2. Speech analysis by parametric methods: A survey

    NASA Astrophysics Data System (ADS)

    Gueguen, C.

    Speech analysis parametric modeling and parameter extraction methods are summarized. Linear prediction principles and associated fast algorithms are reviewed. Global analysis methods on short time windows, with variable frequency resolution, and additive noise; time evolving methods where a time varying parametric model is adjusted to model the transitions between quasistationary periods; and time adaptive sequential methods using fast algorithms along with a synchronous detection of temporal events are examined.
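The "fast algorithms" associated with linear prediction are typified by the Levinson-Durbin recursion, which solves the Toeplitz normal equations in O(p²) operations. A self-contained sketch follows; the AR(2) test signal and its coefficients are invented for illustration:

```python
import random

def autocorr(x, maxlag):
    """Biased sample autocorrelation r[0..maxlag]."""
    n = len(x)
    return [sum(x[i] * x[i + k] for i in range(n - k)) / n
            for k in range(maxlag + 1)]

def levinson_durbin(r, order):
    """Levinson-Durbin recursion: solves the Toeplitz normal equations
    for the forward linear-prediction coefficients a[1..order]."""
    a = [0.0] * (order + 1)
    e = r[0]
    for i in range(1, order + 1):
        acc = r[i] - sum(a[j] * r[i - j] for j in range(1, i))
        k = acc / e                      # reflection coefficient
        new_a = a[:]
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a = new_a
        e *= 1.0 - k * k                 # prediction-error power
    return a[1:], e

# synthesize a "speech-like" AR(2) signal with known coefficients
random.seed(0)
true_a = (1.3, -0.6)   # x[n] = 1.3*x[n-1] - 0.6*x[n-2] + noise
x = [0.0, 0.0]
for _ in range(20000):
    x.append(true_a[0] * x[-1] + true_a[1] * x[-2] + random.gauss(0.0, 1.0))
r = autocorr(x, 2)
est_a, err_power = levinson_durbin(r, 2)
print([round(v, 2) for v in est_a])
```

The recovered coefficients approximate the true AR parameters, which is the core of short-time-window parametric speech analysis.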

  3. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    PubMed Central

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or “one-shot,” in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  4. Rules and methods: a sociology of subcultural knowledge and practice

    E-print Network

    Staggs, Frank Maurice

    1978-01-01

    programs. ... The implications of studying homosexuality as a group phenomenon are quite profound, especially if each study can take as a logical assumption the existence of a social reality giving substance to a gay way of life. To do... implicated in a recognizable continuing social system which serves to recruit, teach, and provide support and opportunities for the deviant practices" (Akers, 1973: 142). Social learning theorists note that the physiological capacity for sexual...

  5. Adapting community based participatory research (CBPR) methods to the implementation of an asthma shared decision making intervention in ambulatory practices

    PubMed Central

    Kuhn, Lindsay; Alkhazraji, Thamara; Steuerwald, Mark; Ludden, Tom; Wilson, Sandra; Mowrer, Lauren; Mohanan, Sveta; Dulin, Michael F.

    2014-01-01

    Objective Translating research findings into clinical practice is a major challenge to improve the quality of healthcare delivery. Shared decision making (SDM) has been shown to be effective but has not yet been widely adopted by health providers. This paper describes the participatory approach used to adapt and implement an evidence-based asthma SDM intervention into primary care practices. Methods A participatory research approach was initiated through partnership development between practice staff and researchers. The collaborative team worked together to adapt and implement a SDM toolkit. Using the RE-AIM framework and qualitative analysis, we evaluated both the implementation of the intervention into clinical practice, and the level of partnership that was established. Analysis included the number of adopting clinics and providers, the patients' perception of the SDM approach, and the number of clinics willing to sustain the intervention delivery after 1 year. Results All six clinics and physician champions implemented the intervention using half-day dedicated asthma clinics, while 16% of all providers within the practices have participated in the intervention. Themes from the focus groups included the importance of being part of the development process, belief that the intervention would benefit patients, and concerns around sustainability and productivity. One year after initiation, 100% of clinics have sustained the intervention, and 90% of participating patients reported a shared decision experience. Conclusions Use of a participatory research process was central to the successful implementation of a SDM intervention in multiple practices with diverse patient populations. PMID:24350877

  6. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

    Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., where an approximation is obtained at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
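As a toy illustration of the Schwarz idea (a sketch only, not the report's inexact-Krylov machinery), the code below runs an alternating (multiplicative) Schwarz iteration with two overlapping subdomains on a 1D Poisson model problem and checks the result against a direct tridiagonal solve. Grid size, overlap, and sweep count are arbitrary choices:

```python
def thomas(sub, diag, sup, rhs):
    """Direct tridiagonal solve (Thomas algorithm)."""
    n = len(diag)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0] = sup[0] / diag[0] if n > 1 else 0.0
    dp[0] = rhs[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - sub[i - 1] * cp[i - 1]
        if i < n - 1:
            cp[i] = sup[i] / denom
        dp[i] = (rhs[i] - sub[i - 1] * dp[i - 1]) / denom
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# -u'' = 1 on (0,1), u(0) = u(1) = 0, discretized on n interior points
n = 30
h = 1.0 / (n + 1)
d, o = 2.0 / h**2, -1.0 / h**2   # diagonal and off-diagonal of A
f = [1.0] * n

def residual(x):
    """r = f - A x for the tridiagonal Laplacian."""
    r = []
    for i in range(n):
        ax = d * x[i]
        if i > 0:
            ax += o * x[i - 1]
        if i < n - 1:
            ax += o * x[i + 1]
        r.append(f[i] - ax)
    return r

# two overlapping subdomains (overlap of 6 grid points)
domains = [list(range(0, 18)), list(range(12, 30))]

x = [0.0] * n
for _ in range(50):                      # alternating Schwarz sweeps
    for dom in domains:
        r = residual(x)
        m = len(dom)
        corr = thomas([o] * (m - 1), [d] * m, [o] * (m - 1),
                      [r[i] for i in dom])
        for k, i in enumerate(dom):      # add local correction
            x[i] += corr[k]

x_direct = thomas([o] * (n - 1), [d] * n, [o] * (n - 1), f)
err = max(abs(a - b) for a, b in zip(x, x_direct))
print(err)
```

Each sweep solves the local subdomain problems exactly; in the inexact setting studied in the report, those local solves themselves would be approximate.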

  7. Recruitment ad analysis offers new opportunities to attract GPs to short-staffed practices.

    PubMed

    Hemphill, Elizabeth; Kulik, Carol T

    2013-01-01

    As baby-boomer practitioners exit the workforce, physician shortages present new recruitment challenges for practices seeking GPs. This article reports findings from two studies examining GP recruitment practice. GP recruitment ad content analysis (Study 1) demonstrated that both Internet and print ads emphasize job attributes but rarely present family or practice attributes. Contacts at these medical practices reported that their practices offer distinctive family and practice attributes that could be exploited in recruitment advertising (Study 2). Understaffed medical practices seeking to attract GPs may differentiate their job offerings in a crowded market by incorporating family and/or practice attributes into their ads. PMID:23697854

  8. Practice Environments of Nurses in Ambulatory Oncology Settings: A Thematic Analysis

    PubMed Central

    Kamimura, Akiko; Schneider, Karin; Lee, Cheryl S.; Crawford, Scott D.; Friese, Christopher R.

    2010-01-01

    Background The practice environments of nurses have been studied extensively in inpatient settings, but rarely in the ambulatory context. As the majority of cancer care is delivered in ambulatory settings, a better understanding of the nursing practice environment may contribute to quality improvement efforts. Objective We sought to examine the features of nursing practice environments that contribute to quality patient care and nursing job satisfaction. Interventions/Methods In 2009-2010, we conducted focus groups with nurses who cared for adults with cancer outside of inpatient units. A semi-structured moderator guide explored practice environment features that promoted safe, high-quality care, and high job satisfaction. We also asked nurses to identify practice environment features that hindered quality care and reduced job satisfaction. We conducted thematic analysis to report themes, and to construct a conceptual framework. Results From two focus groups, comprised of 13 participants, nurses reported that variability in workloads, support from managers and medical assistants, and the practice's physical resources could facilitate or hinder high-quality care and job satisfaction. High-quality communication across team members improved patient safety and satisfaction. Conclusions Consistent with research findings from inpatient settings, nurses identified staffing and resource adequacy, management support, and collegiality as important inputs to high-quality care. Implications for Practice These findings can inform quality improvement initiatives in ambulatory oncology practices. Strengthening nurse-medical assistant relationships, smoothing patient workload variability, and implementing strategies to strengthen communication, may contribute to quality cancer care. Studies to test our proposed conceptual framework would bridge existing knowledge gaps in ambulatory settings. PMID:21372702

  9. Propel: Tools and Methods for Practical Source Code Model Checking

    NASA Technical Reports Server (NTRS)

    Mansouri-Samani, Massoud; Mehlitz, Peter; Markosian, Lawrence; OMalley, Owen; Martin, Dale; Moore, Lantz; Penix, John; Visser, Willem

    2003-01-01

    The work reported here is an overview and snapshot of a project to develop practical model checking tools for in-the-loop verification of NASA's mission-critical, multithreaded programs in Java and C++. Our strategy is to develop and evaluate both a design concept that enables the application of model checking technology to C++ and Java, and a model checking toolset for C++ and Java. The design concept and the associated model checking toolset is called Propel. It builds upon the Java PathFinder (JPF) tool, an explicit state model checker for Java applications developed by the Automated Software Engineering group at NASA Ames Research Center. The design concept that we are developing is Design for Verification (D4V). This is an adaptation of existing best design practices that has the desired side-effect of enhancing verifiability by improving modularity and decreasing accidental complexity. D4V, we believe, enhances the applicability of a variety of V&V approaches; we are developing the concept in the context of model checking. The model checking toolset, Propel, is based on extending JPF to handle C++. Our principal tasks in developing the toolset are to build a translator from C++ to Java, productize JPF, and evaluate the toolset in the context of D4V. Through all these tasks we are testing Propel capabilities on customer applications.

  10. Imaging laser analysis of building materials - practical examples

    SciTech Connect

    Wilsch, G.; Schaurich, D.; Wiggenhauser, H. [BAM, Federal Institute for Materials Research and Testing, Berlin (Germany)

    2011-06-23

    Laser-induced breakdown spectroscopy (LIBS) is a supplement to and extension of standard chemical methods and of SEM or micro-XRF applications for the evaluation of building materials. As a laboratory method, LIBS is used to gain color-coded images representing the composition and distribution of characteristic ions and/or the ingress characteristics of damaging substances. To create a depth profile of element concentration, a core has to be taken and split along the core axis. LIBS has been proven able to detect all important elements in concrete, e.g. chlorine, sodium or sulfur, which are responsible for certain degradation mechanisms, and also light elements like lithium or hydrogen. Practical examples are given and a mobile system for on-site measurements is presented.

  11. [Evidence-based practices published in Brazil: identification and analysis of their types and methodological approaches].

    PubMed

    Lacerda, Rúbia Aparecida; Nunes, Bruna Kosar; Batista, Arlete de Oliveira; Egry, Emiko Yoshikawa; Graziano, Kazuko Uchikawa; Angelo, Margareth; Merighi, Miriam Aparecida Barbosa; Lopes, Nadir Aparecida; Fonseca, Rosa Maria Godoy Serpa da; Castilho, Valéria

    2011-06-01

    This is an integrative review of Brazilian studies on evidence-based practices (EBP) in health, published in ISI/JCR journals in the last 10 years. The aim was to identify the specialty areas that most accomplished these studies, their foci and methodological approaches. Based on inclusion criteria, 144 studies were selected. The results indicate that most EBP studies addressed childhood and adolescence, infectious diseases, psychiatry/mental health and surgery. The predominant foci were prevention, treatment/rehabilitation, diagnosis and assessment. The most used methods were systematic review with or without meta-analysis, protocol review or synthesis of available evidence studies, and integrative review. A strong multiprofessional expansion of EBP is found in Brazil, contributing to the search for more selective practices by collecting, recognizing and critically analyzing the produced knowledge. The study also contributes to the analysis of ways of doing research and to new research possibilities. PMID:21710089

  12. Short time-series microarray analysis: methods and challenges.

    PubMed

    Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

    2008-01-01

    The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data only contain a limited number of time points, giving rise to short-time-series data, which imposes challenges for traditional methods of extracting meaningful information. To obtain useful information from the wealth of short-time series data requires addressing the problems that arise due to limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short-time series analysis including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions to fully exploit the potential of this data. PMID:18605994

  13. General practice fundholding: observations on prescribing patterns and costs using the defined daily dose method

    Microsoft Academic Search

    M Maxwell; D Heaney; J G Howie; S Noble

    1993-01-01

    OBJECTIVE--To compare prescribing patterns between a group of fundholding practices and a group of non-fundholding practices in north east Scotland using a method which provides more accurate statements about volumes prescribed than standard NHS statistics. DESIGN--The pharmacy practice division of the National Health Service in Scotland provided data for selected British National Formulary sections over two years. Each prescription issued
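The defined daily dose (DDD) method normalises prescribing volume by the WHO's assumed average maintenance dose, which is what allows volume comparisons across practices. A minimal sketch of the arithmetic; the practice size, dispensed quantity, and the 1.5 g figure for oral amoxicillin are illustrative (consult the current WHO ATC/DDD index for official values):

```python
def ddds_per_1000_patients_per_day(total_mg, who_ddd_mg, patients, days):
    """Defined daily dose method: convert a dispensed quantity into
    WHO DDDs, normalised per 1000 patients per day."""
    ddds = total_mg / who_ddd_mg
    return ddds * 1000 / (patients * days)

# hypothetical practice: 150 g of oral amoxicillin dispensed to a list
# of 5000 patients over 90 days, taking 1.5 g as the WHO DDD
rate = ddds_per_1000_patients_per_day(150_000, 1_500, 5000, 90)
print(round(rate, 3))  # -> 0.222 DDDs per 1000 patients per day
```

Because the denominator is list size and time rather than prescription counts, this statement of volume is more accurate than standard NHS item counts, which is the comparison the abstract draws.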

  14. Dynamic analysis methods for nuclear facilities

    Microsoft Academic Search

    Horsager

    1979-01-01

    A comparison is made between three different dynamic analysis methods commonly used in the analysis of nuclear facilities. The methods are applied to a typical non-reactor type nuclear facility; namely, an early configuration of the High Performance Fuel Laboratory which was to have been designed and constructed to house an automated fuel process line on the Hanford Reservation near Richland,

  15. A signature analysis method for IC failure analysis

    SciTech Connect

    Henderson, C.L.; Soden, J.M.

    1996-10-01

    A new method of signature analysis is presented and explained. This method of signature analysis can be based on either experiential knowledge of failure analysis, observed data, or a combination of both. The method can also be used on low numbers of failures or even single failures. It uses the Dempster-Shafer theory to calculate failure mechanism confidence. The model is developed in the paper and an example is given for its use. 9 refs., 5 figs., 9 tabs.
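The Dempster-Shafer calculation named in the abstract combines independent bodies of evidence via Dempster's rule of combination. A minimal sketch follows; the failure-mechanism hypotheses and mass values are invented for illustration, not taken from the paper:

```python
def combine(m1, m2):
    """Dempster's rule of combination for two basic probability
    assignments, given as dicts {frozenset of hypotheses: mass}."""
    combined, conflict = {}, 0.0
    for a, x in m1.items():
        for b, y in m2.items():
            c = a & b
            if c:
                combined[c] = combined.get(c, 0.0) + x * y
            else:
                conflict += x * y   # mass assigned to the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# two hypothetical bodies of evidence about an IC failure mechanism
theta = frozenset({"open", "short", "leakage"})       # frame of discernment
m_test = {frozenset({"open"}): 0.6, theta: 0.4}       # electrical test
m_image = {frozenset({"open", "leakage"}): 0.7, theta: 0.3}  # imaging
m = combine(m_test, m_image)
print({tuple(sorted(k)): round(v, 3) for k, v in m.items()})
```

Agreeing evidence concentrates mass on the shared hypothesis, yielding the kind of failure-mechanism confidence the method computes from experiential knowledge and observed data.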

  16. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  17. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  18. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining what instructional practices, if any, of the Teaching SMART professional development program, implemented from 2005--2008 in elementary schools with teachers in grades three through eight, were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART and district administrators share about the sustainability of Teaching SMART practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic. 
Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both years (2008 and 2013). Additionally, a classroom observation was conducted with one of the interviewed teachers in 2013. Qualitative analyses were conducted following the constant comparative method and were facilitated by ATLAS.ti v. 6.2, a qualitative analysis software program. Qualitative findings identified themes at the district level that influenced teachers' use of Teaching SMART strategies. All the themes were classified as obstacles to sustainability: economic downturn, turnover of teachers and lack of hiring, new reform policies, such as Race to the Top, Student Success Act, Common Core State Standards, and mandated blocks of time for specific content. Results from the survey data showed no statistically significant difference through time in perceived instructional practices except for a perceived decrease in the use of hands-on instructional activities from 2008 to 2013. Analyses conducted at the individual teacher level found change scores were statistically significant for a few teachers, but overall, teachers reported similarly on the teacher survey at both time points. This sustainability study revealed the lack of facilitating factors to support the continuation of reform practices; however, teachers identified strategies to continue to implement some of the reform practices through time in spite of a number of system-wide obstacles. This sustainability study adds to the literature by documenting obstacles to sustainability in this specific context, which overlap with what is known in the literature. Additionally, the strategies teachers identified to overcome some of the obstacles to implement reform practices and the recommendations by district level administrators add to the literature on how stakeholders may support sustainability of reform through time.
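The Reliable Change Index used for the individual teacher change scores is commonly computed in the Jacobson-Truax form: the change score divided by the standard error of the difference. A sketch with invented numbers, not the study's data:

```python
import math

def reliable_change_index(pre, post, sd_baseline, reliability):
    """Jacobson-Truax Reliable Change Index: change score divided by
    the standard error of the difference between two measurements."""
    se = sd_baseline * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * se
    return (post - pre) / s_diff

# invented numbers: a survey factor score rises from 50 to 65, with
# baseline SD 10 and test-retest reliability 0.8
rci = reliable_change_index(pre=50.0, post=65.0, sd_baseline=10.0,
                            reliability=0.8)
print(round(rci, 2),
      "reliable change" if abs(rci) > 1.96 else "no reliable change")
```

An |RCI| above 1.96 indicates change beyond measurement error at roughly the 95% level, which is how individual (rather than group-mean) change is judged.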

  19. Analysis of Mixed Data: Methods & Applications

    E-print Network

    Carrière Chough, Keumhee

    Analysis of Mixed Data: Methods & Applications. Edited by Alexander R. de Leon and Keumhee Carrière Chough. Data of mixed types occur frequently in many fields of science and social science. The analysis of such data has ... measures to mixed outcomes. Despite the attention researchers have given to mixed data analysis in recent

  20. Probabilistic analysis of sequencing methods

    E-print Network

    Sontag, Eduardo

    to the point where it is possible to sequence entire genomes. Indeed, the International Human Genome Project announced the sequencing of the entire human genome in the Spring of 2001. The human genome is approximately 3 × 10⁹ bp long, so this was an ambitious project requiring massive amounts of lab work and data analysis. In this chapter we discuss some
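A classic piece of probabilistic analysis for shotgun sequencing is the Lander-Waterman coverage model. A minimal sketch; the read count and read length below are illustrative choices, and only the roughly 3 × 10⁹ bp genome size comes from the text:

```python
import math

def lander_waterman_coverage(n_reads, read_len, genome_len):
    """Lander-Waterman model: with redundancy c = N*L/G, the expected
    fraction of the genome covered by at least one read is 1 - e^-c."""
    c = n_reads * read_len / genome_len
    return c, 1.0 - math.exp(-c)

# illustrative shotgun project: 60 million reads of 500 bp against a
# 3e9 bp genome gives 10x redundancy
c, frac = lander_waterman_coverage(60_000_000, 500, 3_000_000_000)
print(c, round(frac, 6))
```

The exponential tail explains why even very high redundancy leaves some gaps, motivating the finishing work such projects require.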

  1. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.
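The kinematic (upper-bound) approach mentioned in the abstract can be illustrated on the textbook portal-frame example: each candidate collapse mechanism gives a load factor equal to internal plastic work over external work, and the minimum over mechanisms governs. All geometry, loads, and mechanism work terms below are the standard textbook example, invented for illustration rather than taken from the paper:

```python
def collapse_load_factor(mechanisms, loads):
    """Kinematic (upper-bound) theorem of rigid-plastic analysis:
    each mechanism yields load factor = internal plastic work /
    external work; the smallest over all mechanisms governs."""
    return min(internal / sum(l * d for l, d in zip(loads, ext))
               for internal, ext in mechanisms)

# textbook portal frame: plastic moment Mp, column height h, beam
# span L, horizontal load H at beam level, vertical load V at midspan;
# work terms are per unit hinge rotation
Mp, h, L = 100.0, 3.0, 4.0
H, V = 20.0, 50.0
mechanisms = [
    (4 * Mp, (0.0, L / 2)),  # beam mechanism: hinge work vs V
    (4 * Mp, (h, 0.0)),      # sway mechanism: hinge work vs H
    (6 * Mp, (h, L / 2)),    # combined mechanism
]
lam = collapse_load_factor(mechanisms, (H, V))
print(lam)  # -> 3.75 (combined mechanism governs)
```

Enumerating mechanisms this way is the kinematic half of the primal-dual pair the abstract describes; the static (lower-bound) approach is its linear-programming dual.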

  2. Moodtrack : practical methods for assembling emotion-driven music

    E-print Network

    Vercoe, G. Scott

    2006-01-01

    This thesis presents new methods designed for the deconstruction and reassembly of musical works based on a target emotional contour. Film soundtracks provide an ideal testing ground for organizing music around strict ...

  3. Development of Practical NDE Methods for Composite Aircraft Structures

    NASA Astrophysics Data System (ADS)

    Barnard, Daniel J.; Hsu, David K.

    2006-03-01

    With each new model, designers are specifying more composites for use in aircraft structures. Composites are alternatives to metallic alloys because of their higher specific stiffness or strength, which translates to lower weight and higher fuel efficiency. However, the increased attenuation and anisotropy of composite materials brings about an increased difficulty of in-service inspections, quantifying damage and evaluating repairs. Here, we will describe several methods developed by the Center for NDE and tested at various airline maintenance and repair facilities. The methods all make use of C-scan type outputs, making interpretation of the soundness of the part simpler, particularly with respect to differentiating defects from internal structure. We will demonstrate how lessons learned from applying these methods and by engaging inspectors in testing these systems can identify future directions for method development.

  4. Practical considerations for incomplete factorization methods in reservoir simulation

    SciTech Connect

    Behie, A.; Forsyth, P.A.

    1983-11-01

    Various incomplete factorization (ILU) methods coupled with ORTHOMIN acceleration are discussed. These include natural, D2 and D4 orderings with several degrees of factorization, the modified factorization (MILU) and the COMBINATIVE method. These techniques can also be used with the bordered systems resulting from fully-coupled, fully-implicit multi-block wells. Test results are reported for fully implicit black oil and fully implicit thermal simulations. Some results are also reported for vector and scalar modes on the CRAY.

  5. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  6. Promoting evidence-based practice through an integrated model of care: patient case studies as a teaching method

    Microsoft Academic Search

    Robert McSherry; Tracey Proctor-Childs

    2001-01-01

    This paper intends to encourage debate on how evidence-based practice may be achieved through developing an integrated approach to teaching clinical practice skills by using case study as a teaching method. This is achieved by examining societal, social, and political changes as well as the changes that have occurred in nurse education over the past 15 years. What evidence-based nursing ...

  7. SWECS tower dynamics analysis methods and results

    NASA Technical Reports Server (NTRS)

    Wright, A. D.; Sexton, J. H.; Butterfield, C. P.; Thresher, R. M.

    1981-01-01

    Several different tower dynamics analysis methods and computer codes were used to determine the natural frequencies and mode shapes of both guyed and freestanding wind turbine towers. These analysis methods are described and the results for two types of towers, a guyed tower and a freestanding tower, are shown. The advantages and disadvantages in the use of and the accuracy of each method are also described.

  8. Nonlinear structural analysis using integrated force method

    Microsoft Academic Search

    N R B Krishnam Raju; J Nagabhushanam

    2000-01-01

    Though the use of the integrated force method for linear investigations is well-recognised, no efforts were made to extend this method to nonlinear structural analysis. This paper presents attempts to use this method for analysing nonlinear structures. A general formulation of nonlinear structural analysis is given. Typical highly nonlinear benchmark problems are considered. The characteristic matrices of the elements used ...

  9. Testing the quasi-absolute method in photon activation analysis

    SciTech Connect

    Sun, Z. J. [Chemical Sciences and Engineering Division, Argonne National Laboratory, 9700 S. Cass Ave., Argonne, IL 60439 (United States); Wells, D. [Physics Department, South Dakota School of Mines and Technology, 501 E. Saint Joseph St. Rapid City, SD 57701 (United States); Starovoitova, V.; Segebade, C. [Idaho Accelerator Center, Idaho State University, 921 S. 8th Ave. Pocatello, ID 83209 (United States)

    2013-04-19

    In photon activation analysis (PAA), relative methods are widely used because of their accuracy and precision. Absolute methods, which are conducted without any assistance from calibration materials, are seldom applied because of the difficulty of obtaining the photon flux in measurements. This research attempts a new absolute approach in PAA, the quasi-absolute method, by retrieving the photon flux in the sample through Monte Carlo simulation. With the simulated photon flux and a database of experimental cross sections, it is possible to calculate the concentration of target elements in the sample directly. The QA/QC procedures that support the research are discussed in detail. Our results show that the accuracy of the method for certain elements is close to a useful level in practice. Furthermore, future results from the quasi-absolute method can also serve as a validation technique for experimental data on cross sections. The quasi-absolute method looks promising.
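The direct concentration calculation the quasi-absolute method relies on reduces to the standard activation equation. The sketch below is a simplified, monoenergetic illustration (a real PAA calculation integrates the simulated flux and the cross section over photon energy); the function name and all numeric inputs are invented:

```python
import math

N_A = 6.022e23  # Avogadro's number, atoms/mol

def target_mass_grams(activity_bq, sigma_cm2, flux, half_life_s,
                      t_irr_s, molar_mass_g):
    """Solve the activation equation
        A = N * sigma * flux * (1 - exp(-lambda * t_irr))
    for the number of target atoms N, then convert N to grams."""
    lam = math.log(2) / half_life_s
    saturation = 1.0 - math.exp(-lam * t_irr_s)  # buildup toward saturation
    n_atoms = activity_bq / (sigma_cm2 * flux * saturation)
    return n_atoms * molar_mass_g / N_A

# Illustrative inputs: 1000 Bq induced activity, 1 barn cross section,
# simulated flux of 1e13 photons/cm^2/s, 1 h half-life, 1 h irradiation,
# molar mass 60 g/mol.
mass = target_mass_grams(1000.0, 1e-24, 1e13, 3600.0, 3600.0, 60.0)
print(f"{mass:.2e} g")
```

With the flux simulated rather than measured, every quantity on the right-hand side is known, which is what makes the method "absolute".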

  10. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  11. Recommended Methods for Manure Analysis

    E-print Network

    Balser, Teri C.

    Only fragments of this presentation survived extraction (source: http://uwlab.soils.wisc.edu/pubs/A3769.pdf). Recoverable points: nutrient concentrations can be estimated using "book" values for available N, P2O5, and K2O; manure testing is discussed as an alternative; one slide tabulates the effect of agitation on analysis results for a concrete-pit manure store.

  12. Identification of potential strategies, methods, and tools for improving cost estimating practices for highway projects 

    E-print Network

    Donnell, Kelly Elaine

    2005-08-29

    of strategies, methods, and tools for project cost estimation practices aimed at achieving greater consistency and accuracy between the project development phases. A literature review was conducted that assisted in identifying factors that lead to the cost...

  13. Improving educational environment in medical colleges through transactional analysis practice of teachers

    PubMed Central

    Rajan, Marina

    2012-01-01

    Context: A FAIMER (Foundation for Advancement in International Medical Education and Research) fellow organized a comprehensive faculty development program to improve faculty awareness, resulting in changed teaching practices and better teacher-student relationships, using Transactional Analysis (TA). Practicing TA tools helps develop awareness of intrapersonal and interpersonal processes. Objectives: To improve self-awareness among medical educators; to bring about self-directed change in practices among medical educators; to assess the usefulness of TA tools for the same. Methods: An experienced trainer conducted a basic course (12 hours) in TA for faculty members. The PAC model of personality structure, the functional fluency model of personal functioning, stroke theory on motivation, and passivity and script theories of adult functional styles were taught experientially with examples from the medical education scenario. Self-reported improvement in awareness and changes in practices were assessed immediately after, at three months, and one year after training. Findings: The mean improvement in self-awareness was 13.3% (95% CI 9.3-17.2) among nineteen participants. This persisted one year after training. Changes in practices within a year included collecting feedback, new teaching styles, and better relationships with students. Discussion and Conclusions: These findings demonstrate sustainable and measurable improvement in self-awareness through practice of TA tools. Improvement in the self-awareness of faculty resulted in self-directed changes in teaching practices. Medical faculty judged the TA tools effective for improving self-awareness leading to self-directed changes. PMID:24358808

  14. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  15. Preparing Special Education Teacher Candidates: Extending Case Method to Practice

    ERIC Educational Resources Information Center

    Lengyel, Linda; Vernon-Dotson, Lisa

    2010-01-01

    Case methodology is receiving more recognition in the field of education as a viable pedagogy for use in the preparation of future educators. In this article, the coauthors explore two examples of case method instruction that extend beyond university classrooms to field sites: case report and case study. Both examples were used in special…

  16. Practical method of diffusion-welding steel plate in air

    NASA Technical Reports Server (NTRS)

    Holko, K. H.; Moore, T. J.

    1971-01-01

    Method is ideal for critical service requirements where parent metal properties are equaled in notch toughness, stress rupture and other characteristics. Welding technique variations may be used on a variety of materials, such as carbon steels, alloy steels, stainless steels, ceramics, and reactive and refractory materials.

  17. Practice and Progression in Second Language Research Methods

    ERIC Educational Resources Information Center

    Mackey, Alison

    2014-01-01

    Since its inception, the field of second language research has utilized methods from a number of areas, including general linguistics, psychology, education, sociology, anthropology and, recently, neuroscience and corpus linguistics. As the questions and objectives expand, researchers are increasingly pushing methodological boundaries to gain a…

  18. Practical approach for using Medicare data to estimate costs for cost-effectiveness analysis.

    PubMed

    Tumeh, John W; Moore, Susan G; Shapiro, Rachel; Flowers, Christopher R

    2005-04-01

    Many methods have been used to measure costs for cost-effectiveness analysis in healthcare. A central challenge in cost estimation is determining the direct cost of medical goods and services from a societal perspective. This review applies the methodology for calculating Medicare reimbursements for physician services, hospital services and medications as a means of estimating healthcare costs from a US societal perspective. This review provides the tools and information needed to calculate direct medical costs related to in- and outpatient services provided by physicians and hospitals, as well as drug costs using Medicare reimbursement data. The data used in calculating Medicare reimbursements was obtained from the Centers for Medicare and Medicaid Services website. Methods for estimating costs for a particular service in a specific location using Medicare and Medicaid Services are described and demonstrated. A method based on Medicare data that uses the unadjusted geographic practice cost index and standard hospital base rate to estimate healthcare costs that can be generalized to the US population is described and demonstrated. This review provides cost-effectiveness analysts with the tools needed to calculate healthcare service costs for economic research. It contains links to all websites needed for obtaining Medicare and Medicaid Services data and provides a step-by-step analysis of the methodology involved in calculating costs. A practical guide for applying the methodology used by Medicare and Medicaid Services to calculate direct medical costs in order to estimate US societal costs in cost-effectiveness analysis is provided. PMID:19807571
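The reimbursement arithmetic such a guide walks through can be sketched in a few lines. This is a simplified illustration, not the published CMS formula: the function name and all numeric inputs are invented, and real RVUs, geographic practice cost indices (GPCIs) and the conversion factor must be looked up from the Centers for Medicare and Medicaid Services data the review describes:

```python
def physician_fee(rvus, gpcis, conversion_factor):
    """Sketch of a Medicare physician-fee calculation: each relative
    value unit (work, practice expense, malpractice) is scaled by its
    geographic index, summed, and multiplied by the national
    conversion factor (dollars per RVU)."""
    work, pe, mp = rvus
    g_work, g_pe, g_mp = gpcis
    total_rvu = work * g_work + pe * g_pe + mp * g_mp
    return round(total_rvu * conversion_factor, 2)

# Placeholder numbers only, for illustration.
fee = physician_fee(rvus=(1.3, 0.55, 0.1),
                    gpcis=(1.0, 1.0, 1.0),
                    conversion_factor=36.0)
print(fee)
```

Using unadjusted (national-average) GPCIs of 1.0, as the review suggests, yields a cost estimate that generalizes to the US population rather than one locality.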

  19. Sloshing analysis method using existing FEM structural analysis code

    SciTech Connect

    Tokuda, N.; Sakurai, T.; Teraoku, T. [Ishikawajima-Harima Heavy Industries Co., Ltd., Yokohama (Japan)

    1995-08-01

    A fluid analysis method using an analogy relating the pressure wave equation of fluid to elasticity equations is applied to sloshing analysis, where existing FEM structural analysis codes are available. It is seen from theoretical consideration that the present method is equivalent to the classical FEM formulation of linear sloshing analysis. The numerical analyses of liquid sloshing in a rigid cubic tank and of vibration of tubulous fluid under gravitational force are performed by using the present method. The results are shown to be in excellent agreement with the theoretical values. This study is relevant for sloshing conditions in fast breeder reactors.

  20. Practical applications of activation analysis and other nuclear techniques

    SciTech Connect

    Lyon, W.S.

    1982-01-01

    Neutron activation analysis (NAA) is a versatile, sensitive, multielement, usually nondestructive analytical technique used to determine elemental concentrations in a variety of materials. Samples are irradiated with neutrons in a nuclear reactor, removed, and, for the nondestructive technique, the induced radioactivity measured. This measurement of gamma rays emitted from specific radionuclides makes possible the quantitative determination of elements present. The method is described, advantages and disadvantages are listed, and a number of examples of its use are given. Two other nuclear methods, particle-induced X-ray emission and synchrotron-produced X-ray fluorescence, are also briefly discussed.

  1. [Methods for detection of biofilm formation in routine microbiological practice].

    PubMed

    R?zicka, F; Holá, V; Votava, M

    2006-02-01

    The increasing use of catheters, artificial implants and antimicrobials, as well as high numbers of immunocompromised patients, are major causes for concern over biofilm infections. These infections are characterized particularly by high resistance to antimicrobials and formation of persistent foci that may complicate therapy. Therefore, detection of biofilm formation is of high relevance to the clinician and his/her approach to the treatment. Reliable and sensitive methods for detection of this pathogenicity factor in clinically important organisms, suitable for use in routine microbiological laboratories, are needed for this purpose. Currently, a wide array of techniques is available for detection of this virulence factor, such as biofilm visualization by microscopy, culture detection, detection of particular components, detection of physical and chemical differences between biofilm-positive organisms and their planktonic forms, and detection of genes responsible for biofilm formation. Since each of these methods has limitations, the best results can be achieved by combining different approaches. PMID:16528896

  2. Rules and methods: a sociology of subcultural knowledge and practice 

    E-print Network

    Staggs, Frank Maurice

    1978-01-01

    objective of this thesis was to discern and describe the tactics employed by members of the gay subculture in seeking out other homosexuals in problematic situations. By placing ethnomethodological and phenomenological programs within the frameworks of the sociology of knowledge, these "methods of cruising" were used to study the system of knowledge incorporated within the gay subculture. Data were collected in the form of tape-recorded interview situations. Interviewees were selected from...

  3. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  4. A practical guide to value of information analysis.

    PubMed

    Wilson, Edward C F

    2015-02-01

    Value of information analysis is a quantitative method to estimate the return on investment in proposed research projects. It can be used in a number of ways. Funders of research may find it useful to rank projects in terms of the expected return on investment from a variety of competing projects. Alternatively, trialists can use the principles to identify the efficient sample size of a proposed study as an alternative to traditional power calculations, and finally, a value of information analysis can be conducted alongside an economic evaluation as a quantitative adjunct to the 'future research' or 'next steps' section of a study write up. The purpose of this paper is to present a brief introduction to the methods, a step-by-step guide to calculation and a discussion of issues that arise in their application to healthcare decision making. Worked examples are provided in the accompanying online appendices as Microsoft Excel spreadsheets. PMID:25336432
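The underlying EVPI (expected value of perfect information) calculation is straightforward to sketch from simulated net benefits. Everything below, from the function name to the two hypothetical treatment options, is invented for illustration:

```python
import random
import statistics

def evpi(net_benefit_samples):
    """Expected value of perfect information.

    net_benefit_samples: one tuple per simulation draw, holding the
    net benefit of every decision option under that draw of the
    uncertain parameters.
    """
    n_options = len(net_benefit_samples[0])
    # Decision under current information: the option with the best
    # average net benefit across draws.
    means = [statistics.mean(draw[i] for draw in net_benefit_samples)
             for i in range(n_options)]
    value_current = max(means)
    # Decision under perfect information: the best option within each
    # individual draw, averaged over draws.
    value_perfect = statistics.mean(max(draw) for draw in net_benefit_samples)
    return value_perfect - value_current

random.seed(1)
# Two hypothetical treatments with uncertain net benefits.
samples = [(random.gauss(10000, 3000), random.gauss(10500, 5000))
           for _ in range(20000)]
print(round(evpi(samples)))
```

Multiplying the per-patient EVPI by the population expected to benefit gives an upper bound on the value of further research, which is how funders can rank competing projects.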

  5. [Analysis of heart rate variability. Mathematical description and practical application].

    PubMed

    Sammito, S; Böckelmann, I

    2015-03-01

    The analysis of heart rate variability (HRV) has recently become established as a non-invasive measurement for estimation of demands on the cardiovascular system. The HRV reflects the interaction of the sympathetic and parasympathetic nervous systems and allows the influence of the autonomic nervous system on the regulation of the cardiovascular system to be mathematically described. This review explicates the analysis method of HRV for time, frequency and non-linear methods as well as the range of parameters and the demand on acquisition time. The necessity and possibilities of artefact correction and advice for the selection of a reasonable acquisition period are discussed and standard values for selected HRV parameters are presented. PMID:25298003
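Two of the standard time-domain HRV parameters such a review covers, SDNN and RMSSD, are simple enough to sketch directly; the RR-interval series below is invented for illustration:

```python
import math
import statistics

def sdnn(rr_ms):
    """SDNN: standard deviation of all normal RR intervals (ms)."""
    return statistics.stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval
    differences (ms), reflecting parasympathetic activity."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 798, 805, 830, 821, 809, 815, 802]  # illustrative RR series, ms
print(round(sdnn(rr), 1), round(rmssd(rr), 1))
```

In practice these would be computed only after artefact correction of the RR series and over an acquisition period long enough for the parameter in question, as the review discusses.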

  6. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

    The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archive, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  7. A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.

    PubMed

    Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A

    2014-12-01

    Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face to face learning. Mixed method analysis revealed high self-confidence in preparing food safely with low safe food handling knowledge and the presence of some cultural beliefs. On-site Spanish classes and materials were the preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children. PMID:25178898

  8. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, Judith Lynn

    2002-06-01

    With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "Re-Engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  10. Computer-intensive methods in statistical analysis

    Microsoft Academic Search

    D. N. Politis

    1998-01-01

    As far back as the late 1970s, the impact of affordable, high-speed computers on the theory and practice of modern statistics was recognized by Efron (1979, 1982). As a result, the bootstrap and other computer-intensive statistical methods (such as subsampling and the jackknife) have been developed extensively since that time and now constitute very powerful (and intuitive) tools to do
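The bootstrap referred to here resamples the observed data with replacement to approximate the sampling distribution of a statistic; a minimal percentile-interval sketch with invented data (the function name is ours):

```python
import random
import statistics

def bootstrap_ci(data, stat, n_boot=5000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for an arbitrary
    statistic: resample the data with replacement, recompute the
    statistic for each resample, and read off empirical quantiles."""
    rng = random.Random(seed)
    reps = sorted(stat([rng.choice(data) for _ in data])
                  for _ in range(n_boot))
    lo = reps[int(n_boot * alpha / 2)]
    hi = reps[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

sample = [4.1, 5.6, 4.8, 6.2, 5.0, 4.4, 5.9, 5.3, 4.7, 5.1]
lo, hi = bootstrap_ci(sample, statistics.mean)
print(lo, hi)
```

The same `stat` argument accepts the median, a trimmed mean, or any statistic whose sampling distribution is awkward to derive analytically, which is exactly the appeal of computer-intensive methods.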

  11. Validation of analytical methods in compliance with good manufacturing practice: a practical approach

    PubMed Central

    2013-01-01

    Background: The quality and safety of cell therapy products must be maintained throughout their production and quality control cycle, ensuring their final use in the patient. We validated the Limulus Amebocyte Lysate (LAL) test and immunophenotype according to the International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, considering accuracy, precision, repeatability, linearity and range. Methods: For the endotoxin test we used a kinetic chromogenic LAL test. As this is a limit test for the control of impurities, in compliance with the International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, we evaluated the specificity and detection limit. For the immunophenotype test, an identity test, we evaluated specificity through the Fluorescence Minus One method, and we repeated all experiments thrice to verify precision. The immunophenotype validation required a performance qualification of the flow cytometer using two types of standard beads, which have to be used daily to check that the cytometer is reproducibly set up. The results were compared. Collected data were statistically analyzed by calculating the mean, standard deviation and coefficient of variation percentage (CV%). Results: The LAL test is repeatable and specific. The spike recovery value of each sample was between 0.25 EU/ml and 1 EU/ml with a CV% < 10%. The correlation coefficient (≥ 0.980) and CV% (< 10%) of the standard curve tested in duplicate showed the test's linearity and a minimum detectable concentration value of 0.005 EU/ml. The immunophenotype method performed thrice on our cell therapy products is specific and repeatable, as shown by an inter-experiment CV% < 10%. Conclusions: Our data demonstrate that the validated analytical procedures are suitable as quality controls for the batch release of cell therapy products. Our paper could offer an important contribution for the scientific community in the field of CTPs, above all to small Cell Factories such as ours, where it is not always possible to have CFR21-compliant software. PMID:23981284

  12. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.

  13. The moderating effect of supply chain role on the relationship between supply chain practices and performance : An empirical analysis

    Microsoft Academic Search

    Lori S. Cook; Daniel R. Heiser; Kaushik Sengupta

    2011-01-01

    Purpose – The purpose of this paper is to examine the relationships between specific supply chain practices and organizational performance and whether this relationship is moderated by the role that a company assumes in its respective supply chain. Design/methodology/approach – This paper uses regression analysis and the relative weights method to analyze a set of survey data from respondents within ...

  14. Queueing network analysis: concepts, terminology, and methods

    Microsoft Academic Search

    Rusty O. Baldwin; Nathaniel J. Davis Iv; Scott F. Midkiff; John E. Kobza

    2003-01-01

    Queueing network analysis can be a valuable tool to analyze network models. However, the vast number and diverse nature of the tools available to analyze a problem can often leave the uninitiated frustrated or bewildered--awash in concepts, terminology, and methods not encountered elsewhere. As a primer for queueing network analysis, this paper emphasizes essential concepts and terminology. Selection of analytical
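As a concrete example of the analytical results such a primer introduces, the single-server M/M/1 queue has closed-form metrics: utilization ρ = λ/μ, mean number in system L = ρ/(1-ρ), and mean time in system W = 1/(μ-λ), consistent with Little's law L = λW. A minimal sketch (the function name is ours):

```python
def mm1_metrics(arrival_rate, service_rate):
    """Closed-form M/M/1 results: utilization, mean number in
    system (L), and mean time in system (W)."""
    rho = arrival_rate / service_rate
    if rho >= 1:
        raise ValueError("queue is unstable: utilization >= 1")
    L = rho / (1 - rho)            # mean customers in system
    W = 1 / (service_rate - arrival_rate)  # mean sojourn time
    return rho, L, W

# 8 arrivals/s against a 10/s server.
rho, L, W = mm1_metrics(arrival_rate=8.0, service_rate=10.0)
print(rho, L, W)  # -> rho = 0.8, L ~ 4 customers, W = 0.5 s
```

Note the check that Little's law holds here: L = λW = 8.0 × 0.5 = 4, matching ρ/(1-ρ).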

  15. Trace Element Analysis Core Lab methods

    E-print Network

    Lotko, William

    (or trace metal grade for soils and sediments) HNO3 is added. Low biomass organic samples or toenail of high total dissolved salts; please inform us if your samples have high TDS. Trace Metals in solidsTrace Element Analysis Core Lab methods General All prices are based on analysis of up to 10

  16. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

    EPA regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations in Virginia were studied that have supply water and/or stormwater runoff ranging in hardness from 5--30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5--30 mg/L. Results indicated that osmotic stresses were present in the acute tests with the fathead minnow as well as in the chronic tests for both the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meet or exceed EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but remained within two standard deviations of the sensitivities of organisms cultured in moderately hard water. Latitude for use of organisms cultured in 30 mg/L water was documented for waters ranging in hardness from 10--100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

  17. This paper describes a graphical method of nonlinear circuit analysis. The method combines circuit analysis

    E-print Network

    Phang, Khoman

    is demonstrated in the distortion analysis of a common-emitter amplifier. Simulation results are provided ANALYSIS As an example, we will analyze the distortion in the common-emitter (CE) amplifier shown in Figure and flexible than traditional methods. The method is demonstrated in the analysis of a com- mon-emitter

  18. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

    A first-order approximation formula for the propagation of error in the fission track age equation is given by PA = C[Ps^2 + Pi^2 + Pphi^2 - 2rPsPi]^(1/2), where PA, Ps, Pi, and Pphi are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. (c) 1979.
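    The error-propagation formula quoted in the abstract can be sketched directly in code (the numeric inputs below are invented, and C is set to 1 purely for illustration):

    ```python
    import math

    def age_percent_error(Ps: float, Pi: float, Pphi: float, r: float, C: float = 1.0) -> float:
        """First-order percentage error of a fission-track age:
        PA = C * sqrt(Ps^2 + Pi^2 + Pphi^2 - 2*r*Ps*Pi),
        where Ps, Pi, Pphi are percentage errors of the spontaneous track
        density, induced track density, and neutron dose, and r is the
        correlation between spontaneous and induced track densities."""
        return C * math.sqrt(Ps**2 + Pi**2 + Pphi**2 - 2.0 * r * Ps * Pi)

    # A positive correlation between track densities reduces the age error:
    uncorrelated = age_percent_error(Ps=5.0, Pi=5.0, Pphi=2.0, r=0.0)
    correlated = age_percent_error(Ps=5.0, Pi=5.0, Pphi=2.0, r=0.8)
    ```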

  19. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
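    The spectrophotometer quantity described in the abstract is an irradiance-weighted average of spectral reflectance. A minimal discrete sketch of that average (the two-band reflectance and irradiance values below are invented, and equal wavelength spacing is assumed):

    ```python
    def weighted_solar_reflectance(reflectance, irradiance):
        """Discrete irradiance-weighted average of spectral reflectance:
        R* = sum(r_i * I_i) / sum(I_i), assuming equal wavelength spacing."""
        num = sum(r * i for r, i in zip(reflectance, irradiance))
        den = sum(irradiance)
        return num / den

    # Hypothetical surface: reflectance 0.9 in a visible band, 0.4 in a
    # near-infrared band, with a crude 55/45 irradiance split.
    r_star = weighted_solar_reflectance([0.9, 0.4], [0.55, 0.45])
    ```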

  20. Measuring solar reflectance - Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul [Heat Island Group, Environmental Energy Technologies Division, Lawrence Berkeley National Laboratory, 1 Cyclotron Road, Berkeley, CA 94720 (United States)

    2010-09-15

    A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.

  1. Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences

    PubMed Central

    Danielson, Jennifer; Weber, Stanley S.

    2014-01-01

    Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

  2. Coupled Flow Deformation Analysis Using Meshfree Method

    Microsoft Academic Search

    Arman Khoshghalb; Nasser Khalili; Somasundaram Valliappan

    2010-01-01

    A fully coupled meshfree algorithm is proposed for numerical analysis of Biot's formulation. Spatial discretisation of the governing equations is accomplished using radial point interpolation method. Temporal discretization is achieved based on a novel three-point approximation technique with variable time step, which has second order accuracy and avoids oscillatory response observed in the conventional methods of time discretization. Application of

  3. Causal Moderation Analysis Using Propensity Score Methods

    ERIC Educational Resources Information Center

    Dong, Nianbo

    2012-01-01

    This paper is based on previous studies in applying propensity score methods to study multiple treatment variables to examine the causal moderator effect. The propensity score methods will be demonstrated in a case study to examine the causal moderator effect, where the moderators are categorical and continuous variables. Moderation analysis is an…

  4. Cutting Plane Methods in Decision Analysis

    Microsoft Academic Search

    Xiaosong Ding; Faiz Al-Khayyal

    Several computational decision analysis approaches have been developed over a number of years for solving decision problems when vague and numerically imprecise information prevails. However, the evaluation phases in the DELTA method and similar methods often give rise to special bilinear programming problems, which are time-consuming to solve in an interactive environment with general nonlinear programming solvers. This paper proposes

  5. Optical methods for the analysis of dermatopharmacokinetics

    Microsoft Academic Search

    Juergen Lademann; Hans-Juergen Weigmann; R. von Pelchrzim; Wolfram Sterry

    2002-01-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used for the characterization of the amount of corneocytes on the tape strips. It was compared to the increase of weight of the tapes after removing

  6. Methods to enhance compost practices as an alternative to waste disposal

    SciTech Connect

    Stuckey, H.T.; Hudak, P.F.

    1998-12-31

    Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

  7. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M. (Albuquerque, NM)

    2002-01-01

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following estimation or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The "hybrid" method herein means a combination of an initial classical least squares analysis calibration step with subsequent analysis by an inverse multivariate analysis method. A "spectral shape" herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The "shape" can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
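    The idea of augmenting a classical least squares (CLS) model with an extra spectral shape can be illustrated numerically. This is a minimal numpy sketch, not the patented implementation: the Gaussian component bands and the linear drift shape are invented, and the "hybrid" step is reduced to refitting with the augmented shape matrix.

    ```python
    import numpy as np

    n_channels = 50
    k = np.arange(n_channels)
    s1 = np.exp(-((k - 15) ** 2) / 20.0)        # pure-component band 1
    s2 = np.exp(-((k - 35) ** 2) / 20.0)        # pure-component band 2
    drift = np.linspace(0.0, 1.0, n_channels)   # baseline drift absent from calibration

    # Synthetic mixture spectrum: known concentrations plus a drift contribution.
    true_c = np.array([2.0, 3.0])
    y = true_c[0] * s1 + true_c[1] * s2 + 0.5 * drift

    S = np.vstack([s1, s2])                 # calibration-only CLS model
    S_aug = np.vstack([s1, s2, drift])      # model augmented with the added shape

    c_plain = np.linalg.lstsq(S.T, y, rcond=None)[0]    # biased by the drift
    c_hybrid = np.linalg.lstsq(S_aug.T, y, rcond=None)[0]  # recovers true_c and 0.5
    ```

    Because the drift overlaps both component bands, the plain CLS fit misattributes part of it to the analytes, while the augmented fit recovers the true concentrations exactly.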

  8. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

    This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  9. Protein-protein interactions: methods for detection and analysis.

    PubMed Central

    Phizicky, E M; Fields, S

    1995-01-01

    The function and activity of a protein are often modulated by other proteins with which it interacts. This review is intended as a practical guide to the analysis of such protein-protein interactions. We discuss biochemical methods such as protein affinity chromatography, affinity blotting, coimmunoprecipitation, and cross-linking; molecular biological methods such as protein probing, the two-hybrid system, and phage display; and genetic methods such as the isolation of extragenic suppressors, synthetic mutants, and unlinked noncomplementing mutants. We next describe how binding affinities can be evaluated by techniques including protein affinity chromatography, sedimentation, gel filtration, fluorescence methods, solid-phase sampling of equilibrium solutions, and surface plasmon resonance. Finally, three examples of well-characterized domains involved in multiple protein-protein interactions are examined. The emphasis of the discussion is on variations in the approaches, concerns in evaluating the results, and advantages and disadvantages of the techniques. PMID:7708014

  10. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...2014-04-01 2014-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  11. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...2011-04-01 2011-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  12. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...2013-04-01 2013-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  13. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...2012-04-01 2012-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  14. Theoretical Analysis and Practical Insights on EAP Estimation via a Unified HARDI Framework

    E-print Network

    Boyer, Edmond

    of Automation, Chinese Academy of Sciences, China 2 Athena Project Team, INRIA Sophia Antipolis ­ M to analytically solve the Fourier transform. To our knowledge, there is no theoretical analysis and practicalTheoretical Analysis and Practical Insights on EAP Estimation via a Unified HARDI Framework Jian

  15. A New Method for Transient Stability Analysis

    NASA Astrophysics Data System (ADS)

    Yorino, Naoto; Saito, Takeshi; Kamei, Yoshifumi; Sasaki, Hiroshi

    This paper proposes a new method for transient stability analysis in power systems. The proposed method directly computes the critical trajectory for a given contingency to obtain the critical condition of the studied system. Although the method may be useful for general nonlinear dynamic systems, it is applied here to the problem of obtaining a controlling unstable equilibrium point (UEP), which provides essential information for energy function methods to assess transient stability. Namely, the proposed method effectively yields a critical trajectory on the PEBS together with the controlling UEP, thus improving the conventional BCU method. The effectiveness of the proposed method is demonstrated on a 3-machine 9-bus system and a 6-machine 30-bus system.

  16. The Frankfurt Patient Safety Climate Questionnaire for General Practices (FraSiK): analysis of psychometric properties.

    PubMed

    Hoffmann, Barbara; Domanska, Olga Maria; Albay, Zeycan; Mueller, Vera; Guethlin, Corina; Thomas, Eric J; Gerlach, Ferdinand M

    2011-09-01

    BACKGROUND Safety culture has been identified as having a major impact on how safety is managed in healthcare. However, it has not received much attention in general practices. Hence, no instrument yet exists to assess safety climate, the measurable artefact of safety culture, in this setting. This study aims to evaluate psychometric properties of a newly developed safety climate questionnaire for use in German general practices. METHODS The existing Safety Attitudes Questionnaire, Ambulatory Version, was considerably modified and enhanced in order to be applicable in general practice. After pilot tests and its application in a random sample of 400 German practices, a first psychometric analysis led to modifications in several items. A further psychometric analysis was conducted with an additional sample of 60 practices and a response rate of 97.08%. Exploratory factor analysis with orthogonal varimax rotation was carried out and the internal consistency of the identified factors was calculated. RESULTS Nine factors emerged, representing a wide range of dimensions associated with safety culture: teamwork climate, error management, safety of clinical processes, perception of causes of errors, job satisfaction, safety of office structure, receptiveness to healthcare assistants and patients, staff perception of management, and quality and safety of medical care. Internal consistency of factors is moderate to good. CONCLUSIONS This study demonstrates the development of a patient safety climate instrument. The questionnaire displays established features of safety climate and additionally contains features that might be specific to small-scale general practices. PMID:21571753

  17. Drosophila hematopoiesis: markers and methods for molecular genetic analysis

    PubMed Central

    Evans, Cory J.; Liu, Ting; Banerjee, Utpal

    2014-01-01

    Analyses of the Drosophila hematopoietic system are becoming more and more prevalent as developmental and functional parallels with vertebrate blood cells become more evident. Investigative work on the fly blood system has, out of necessity, led to the identification of new molecular markers for blood cell types and lineages and to the refinement of useful molecular genetic tools and analytical methods. This review briefly describes the Drosophila hematopoietic system at different developmental stages, summarizes the major useful cell markers and tools for each stage, and provides basic protocols for practical analysis of circulating blood cells and of the lymph gland, the larval hematopoietic organ. PMID:24613936

  18. Multicultural Issues in School Psychology Practice: A Critical Analysis

    ERIC Educational Resources Information Center

    Ortiz, Samuel O.

    2006-01-01

    Once thought of largely as a sideline issue, multiculturalism is fast becoming a major topic on the central stage of psychology and practice. That cultural factors permeate the whole of psychological foundations and influence the manner in which the very scope of practice is shaped is undeniable. The rapidly changing face of the U.S. population…

  19. Researching "Practiced Language Policies": Insights from Conversation Analysis

    ERIC Educational Resources Information Center

    Bonacina-Pugh, Florence

    2012-01-01

    In language policy research, "policy" has traditionally been conceptualised as a notion separate from that of "practice". In fact, language practices were usually analysed with a view to evaluate whether a policy is being implemented or resisted to. Recently, however, Spolsky in ("Language policy". Cambridge University press, Cambridge, 2004;…

  20. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  1. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  2. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.

  3. Advanced analysis methods in particle physics

    SciTech Connect

    Bhat, Pushpalatha C. (Fermilab)

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider, have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  4. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

    The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observations and in-depth interviews with semi-structured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values were highest for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and medicinal species, the records group in the center around the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research method of this study, the comparative analysis methods will contribute to the availability of orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as an analytical tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330
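    The abstract reports informant consensus factor (ICF) values without restating the formula. The commonly used definition is ICF = (Nur - Nt) / (Nur - 1), where Nur is the number of use reports for a disorder category and Nt the number of taxa cited; the counts in the sketch below are invented for illustration.

    ```python
    def informant_consensus_factor(use_reports: int, taxa: int) -> float:
        """ICF = (Nur - Nt) / (Nur - 1). A value of 1.0 means all informants
        cite the same taxa for a disorder category; values near 0 indicate
        little consensus."""
        if use_reports <= 1:
            raise ValueError("need more than one use report")
        return (use_reports - taxa) / (use_reports - 1)

    # Hypothetical category: 25 use reports citing only 3 species.
    icf = informant_consensus_factor(use_reports=25, taxa=3)
    ```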

  5. Groundwater Flooding: Practical Methods for the Estimation of Extreme Groundwater Levels

    NASA Astrophysics Data System (ADS)

    Bichler, A.; Fürst, J.

    2012-04-01

    Floods are generally recognized as a consequence of high flows in surface waters. Only recently has awareness been raised of potential flooding and flood risk from groundwater sources. In particular, information about high groundwater levels is relevant where basements of buildings or vulnerable installations might be affected. The EU Floods Directive also addresses the potential flood risk arising from groundwater sources. While the statistical analysis of extreme values is widely used in surface hydrology, there are currently only a few studies that consider the specific properties of extreme groundwater levels. The main objective of this investigation is the application of at-site and regional frequency analysis in the field of hydrogeology. Extreme groundwater levels with a given return period (e.g. 100 years) are estimated with the method of L-moments and their uncertainty is quantified. Moreover, software tools are developed in order to make extreme value analysis a feasible technique for practical application by the Austrian Hydrological Service. These tools address the demand for user-friendly handling as well as integration and updating of existing and readily derivable data. Lastly, the estimates are regionalized, so that information on extreme groundwater levels and the accuracy of estimation can be retrieved at any point of the investigation area. The analysis is applied in four shallow, porous aquifers in Austria, with a total of more than 1000 time series records of groundwater levels covering 10--50 years of observation. Firstly, local frequency analysis (LFA) is performed on series of annual maximum peaks. The analysis of annual maxima allows for easy handling, but comes with the drawback of requiring 20--30 years of observation as a minimum sample size. Due to anthropogenic impacts, natural changes of the hydrologic system, etc., this requirement cannot be met in numerous cases. Hence, the peaks-over-threshold (POT) approach and regional frequency analysis (RFA) are implemented. Thus, a sufficiently large sample size can be derived from shorter time series either by selecting exceedances over a variable threshold (POT) or by accounting for data from related observations (RFA, "trading space for time"). The results show that at-site frequency analysis is applicable to 63% of the records, for which the peaks-over-threshold method yields more accurate estimates than the annual maxima. Regional frequency analysis can be applied to 51% of the samples and results in even further reduction of uncertainty. In the four case studies, 12--45% of the investigated area is susceptible to groundwater flood risk, i.e. an event with a return period of 100 years is likely to reach the terrain surface. As one of the outcomes, maps of depth to the groundwater table make it possible to identify areas prone to groundwater flooding, or suitable for development, at a glance.
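    The L-moments estimation mentioned in the abstract can be sketched for the simplest case: fitting a Gumbel distribution to annual maxima and returning the level with return period T. This is a minimal illustration, not the study's software, and the groundwater levels below are invented.

    ```python
    import math

    EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

    def gumbel_quantile_from_lmoments(annual_maxima, T):
        """Fit a Gumbel distribution by its first two sample L-moments and
        return the T-year quantile x_T = xi - alpha * ln(-ln(1 - 1/T))."""
        x = sorted(annual_maxima)
        n = len(x)
        b0 = sum(x) / n
        b1 = sum((i / (n - 1)) * x[i] for i in range(n)) / n  # prob.-weighted moment
        l1, l2 = b0, 2 * b1 - b0                              # first two L-moments
        alpha = l2 / math.log(2)                              # Gumbel scale
        xi = l1 - EULER_GAMMA * alpha                         # Gumbel location
        return xi - alpha * math.log(-math.log(1 - 1 / T))

    # Invented annual maximum groundwater levels (m above datum):
    levels = [3.1, 3.4, 2.9, 3.8, 3.2, 3.6, 3.0, 3.5, 3.3, 3.7]
    x100 = gumbel_quantile_from_lmoments(levels, T=100)  # estimated 100-year level
    ```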

  6. Probabilistic structural analysis methods development for SSME

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Hopkins, D. A.

    1988-01-01

    The development of probabilistic structural analysis methods is a major part of the SSME Structural Durability Program and consists of three program elements: composite load spectra, probabilistic finite element structural analysis, and probabilistic structural analysis applications. Recent progress includes: (1) the effects of the uncertainties of several factors on the HPFP blade temperature pressure and torque, (2) the evaluation of the cumulative distribution function of structural response variables based on assumed uncertainties on primitive structural variables, and (3) evaluation of the failure probability. Collectively, the results obtained demonstrate that the structural durability of critical SSME components can be probabilistically evaluated.

  7. Nutritional counselling in general practice: a cost effective analysis

    PubMed Central

    Pritchard, D. A.; Hyndman, J.; Taba, F.

    1999-01-01

    STUDY OBJECTIVE: To study the clinical and cost outcomes of providing nutritional counselling to patients with one or more of the following conditions: overweight, hypertension and type 2 diabetes. DESIGN: The study was designed as a randomised controlled trial. Consecutive patients were screened opportunistically for one or more of the above conditions and randomly allocated to one of two intervention groups (doctor/dietitian or dietitian) or a control group. Both intervention groups received six counselling sessions over 12 months from a dietitian. However, in the doctor/dietitian group it was the doctor and not the dietitian who invited the patient to join the study and the same doctor also reviewed progress at two of the six counselling sessions. SETTING: The study was conducted in a university group general practice set in a lower socioeconomic outer suburb of Perth, Western Australia. PATIENTS: Of the 273 patients randomly allocated to a study group, 198 were women. Age ranged from 25 to 65 years. Seventy eight per cent of patients resided in the lower two socioeconomic quartiles, 56 per cent described their occupation as home duties and 78 per cent were partnered. RESULTS: Both intervention groups reduced weight and blood pressure compared with the control group. Patients in the doctor/dietitian group were more likely to complete the 12 month programme than those in the dietitian group. Patients in the doctor/dietitian group lost an average of 6.7 kg at a cost of $A9.76 per kilogram, while the dietitian group lost 5.6 kg at a cost of $A7.30 per kilogram. CONCLUSION: General practitioners, in conjunction with a dietitian, can produce significant weight and blood pressure improvement by health promotion methods.   PMID:10396539

  8. Education Policy as a Practice of Power: Theoretical Tools, Ethnographic Methods, Democratic Options

    ERIC Educational Resources Information Center

    Levinson, Bradley A. U.; Sutton, Margaret; Winstead, Teresa

    2009-01-01

    This article outlines some theoretical and methodological parameters of a critical practice approach to policy. The article discusses the origins of this approach, how it can be uniquely adapted to educational analysis, and why it matters--not only for scholarly interpretation but also for the democratization of policy processes as well. Key to…

  9. A Renormalisation Group Method. IV. Stability Analysis

    NASA Astrophysics Data System (ADS)

    Brydges, David C.; Slade, Gordon

    2015-05-01

    This paper is the fourth in a series devoted to the development of a rigorous renormalisation group method for lattice field theories involving boson fields, fermion fields, or both. The third paper in the series presents a perturbative analysis of a supersymmetric field theory which represents the continuous-time weakly self-avoiding walk on $\mathbb{Z}^d$. We now present an analysis of the relevant interaction functional of the supersymmetric field theory, which permits a nonperturbative analysis to be carried out in the critical dimension $d = 4$. The results in this paper include: proof of stability of the interaction, estimates which enable control of Gaussian expectations involving both boson and fermion fields, estimates which bound the errors in the perturbative analysis, and a crucial contraction estimate to handle irrelevant directions in the flow of the renormalisation group. These results are essential for the analysis of the general renormalisation group step in the fifth paper in the series.

  10. A renormalisation group method. IV. Stability analysis

    E-print Network

    David C. Brydges; Gordon Slade

    2014-11-25

    This paper is the fourth in a series devoted to the development of a rigorous renormalisation group method for lattice field theories involving boson fields, fermion fields, or both. The third paper in the series presents a perturbative analysis of a supersymmetric field theory which represents the continuous-time weakly self-avoiding walk on $\mathbb{Z}^d$. We now present an analysis of the relevant interaction functional of the supersymmetric field theory, which permits a nonperturbative analysis to be carried out in the critical dimension $d = 4$. The results in this paper include: proof of stability of the interaction, estimates which enable control of Gaussian expectations involving both boson and fermion fields, estimates which bound the errors in the perturbative analysis, and a crucial contraction estimate to handle irrelevant directions in the flow of the renormalisation group. These results are essential for the analysis of the general renormalisation group step in the fifth paper in the series.

  11. PRACTICAL SENSITIVITY AND UNCERTAINTY ANALYSIS TECHNIQUES APPLIED TO AGRICULTURAL SYSTEMS MODELS

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We present a practical evaluation framework for analysis of two complex, process-based agricultural system models, WEPP and RZWQM. The evaluation framework combines sensitivity analysis and the uncertainty analysis techniques of first order error analysis (FOA) and Monte Carlo simulation with Latin ...
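
    First order error analysis (FOA) propagates input variances through local derivatives of the model, while Monte Carlo samples the inputs directly. The sketch below compares the two on a toy two-variable model; the model function and all parameter values are assumptions for illustration, not WEPP or RZWQM.

```python
import random

def model(x1, x2):
    """Toy stand-in for a process-model output (assumed, not WEPP/RZWQM)."""
    return 3.0 * x1 + x1 * x2

def foa_variance(mu, sigma, h=1e-5):
    """First order error analysis: var(f) ~= sum_i (df/dx_i)^2 * var(x_i),
    with derivatives taken by central differences at the mean point."""
    d1 = (model(mu[0] + h, mu[1]) - model(mu[0] - h, mu[1])) / (2.0 * h)
    d2 = (model(mu[0], mu[1] + h) - model(mu[0], mu[1] - h)) / (2.0 * h)
    return d1 ** 2 * sigma[0] ** 2 + d2 ** 2 * sigma[1] ** 2

def mc_variance(mu, sigma, n=200_000, seed=7):
    """Plain Monte Carlo estimate of the output variance (independent normals)."""
    rng = random.Random(seed)
    ys = [model(rng.gauss(mu[0], sigma[0]), rng.gauss(mu[1], sigma[1]))
          for _ in range(n)]
    mean = sum(ys) / n
    return sum((y - mean) ** 2 for y in ys) / (n - 1)
```

    For mildly nonlinear models with small input variances the two agree closely; FOA misses the higher-order interaction terms that Monte Carlo (or Latin hypercube sampling) captures.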

  12. Practical considerations for volumetric wear analysis of explanted hip arthroplasties

    PubMed Central

    Langton, D. J.; Sidaginamale, R. P.; Holland, J. P.; Deehan, D.; Joyce, T. J.; Nargol, A. V. F.; Meek, R. D.; Lord, J. K.

    2014-01-01

    Objectives Wear debris released from bearing surfaces has been shown to provoke negative immune responses in the recipient. Excessive wear has been linked to early failure of prostheses. Analysis using coordinate measuring machines (CMMs) can provide estimates of total volumetric material loss of explanted prostheses and can help to understand device failure. The accuracy of volumetric testing has been debated, with some investigators stating that only protocols involving hundreds of thousands of measurement points are sufficient. We sought to examine this assumption and to apply the findings to the clinical arena. Methods We examined the effects on the calculated material loss from a ceramic femoral head when different CMM scanning parameters were used. Calculated wear volumes were compared with gold standard gravimetric tests in a blinded study. Results Various scanning parameters including point pitch, maximum point to point distance, the number of scanning contours or the total number of points had no clinically relevant effect on volumetric wear calculations. Gravimetric testing showed that material loss can be calculated to provide clinically relevant degrees of accuracy. Conclusions Prosthetic surfaces can be analysed accurately and rapidly with currently available technologies. Given these results, we believe that routine analysis of explanted hip components would be a feasible and logical extension to National Joint Registries. Cite this article: Bone Joint Res 2014;3:60–8. PMID:24627327

  13. Monte Carlo methods for design and analysis of radiation detectors

    E-print Network

    Shultis, J. Kenneth

    Keywords: radiation detectors; inverse problems; detector design. An overview of Monte Carlo as a practical method for designing and analyzing radiation detectors is provided. The emphasis is on detectors

  14. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  15. Revenue and Expense Analysis: An Alternative Method for Analyzing University Operations.

    ERIC Educational Resources Information Center

    Jacquin, Jules C.

    1994-01-01

    Common university budgeting practices are summarized, and the efforts at Rensselaer Polytechnic Institute (New York) to develop an improved analytical method are described. The method, revenue and expense analysis, evaluates the university's financial operating performance by treating each school and program as autonomous financial organizations,…

  16. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of the second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
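
    For the elastic SDF systems used in the higher-mode check, peak deformation is linear in record amplitude, so the scale factor is simply the ratio of target to computed deformation. The sketch below is a simplified stand-in for the MPS procedure (the first-mode system in the actual method is inelastic, which requires iteration): it computes the elastic peak with Newmark average-acceleration integration. Units, damping, and parameter values are assumed.

```python
import math

def sdf_peak_deformation(ag, dt, period, zeta=0.05):
    """Peak deformation of a linear-elastic SDF system (unit mass) under a ground
    acceleration history `ag`, via Newmark average-acceleration integration."""
    w = 2.0 * math.pi / period
    k, c = w * w, 2.0 * zeta * w
    u = v = 0.0
    a = -ag[0] - c * v - k * u               # initial acceleration (m = 1)
    kh = k + 2.0 * c / dt + 4.0 / dt ** 2    # effective stiffness (gamma=1/2, beta=1/4)
    peak = 0.0
    for i in range(len(ag) - 1):
        dp = -(ag[i + 1] - ag[i]) + (4.0 / dt + 2.0 * c) * v + 2.0 * a
        du = dp / kh                          # incremental displacement
        dv = 2.0 * du / dt - 2.0 * v          # incremental velocity
        da = 4.0 * (du - v * dt) / dt ** 2 - 2.0 * a
        u, v, a = u + du, v + dv, a + da
        peak = max(peak, abs(u))
    return peak

def mps_scale_factor(ag, dt, period, target_deformation):
    """Elastic case only: deformation is linear in record amplitude, so the
    scale factor is the ratio of target to computed peak deformation."""
    return target_deformation / sdf_peak_deformation(ag, dt, period)
```

    For the inelastic first-'mode' system of the actual MPS method the amplitude-deformation relation is nonlinear, and the factor must be found iteratively rather than by this direct ratio.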

  17. Moving environmental DNA methods from concept to practice for monitoring aquatic macroorganisms

    USGS Publications Warehouse

    Goldberg, Caren S.; Strickler, Katherine M.; Pilliod, David S.

    2015-01-01

    The discovery that macroorganisms can be detected from their environmental DNA (eDNA) in aquatic systems has immense potential for the conservation of biological diversity. This special issue contains 11 papers that review and advance the field of eDNA detection of vertebrates and other macroorganisms, including studies of eDNA production, transport, and degradation; sample collection and processing to maximize detection rates; and applications of eDNA for conservation using citizen scientists. This body of work is an important contribution to the ongoing efforts to take eDNA detection of macroorganisms from technical breakthrough to established, reliable method that can be used in survey, monitoring, and research applications worldwide. While the rapid advances in this field are remarkable, important challenges remain, including consensus on best practices for collection and analysis, understanding of eDNA diffusion and transport, and avoidance of inhibition in sample collection and processing. Nonetheless, as demonstrated in this special issue, eDNA techniques for research and monitoring are beginning to realize their potential for contributing to the conservation of biodiversity globally.

  18. Practical method for evaluating the visibility of moire patterns for CRT design

    NASA Astrophysics Data System (ADS)

    Shiramatsu, Naoki; Tanigawa, Masashi; Iwata, Shuji

    1995-04-01

    The high resolution CRT displays used for computer monitors and high-performance TVs often produce a pattern of bright and dark stripes on the screen called a moire pattern. The elimination of the moire is an important consideration in CRT design. The objective of this study is to provide a practical method for estimating and evaluating a moire pattern considering its visibility to human vision. On the basis of the mathematical model of moire generation, precise values of the period and intensity of a moire are calculated from the actual data of the electron beam profile and the transmittance distribution of the apertures of the shadow mask. The visibility of the moire is evaluated by plotting the calculation results on the contrast-period plane, which consists of visible and invisible moire pattern regions based on experimental results of psychological tests. Not only fundamental design parameters such as the shadow mask pitch and the scanning line pitch but also details of the electron beam profile such as a distortion or an asymmetry can be examined. In addition to the analysis, image simulation of a moire using image memory is also available.
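
    The period of the moire between the shadow-mask pitch and the scanning-line pitch follows from the beat of the two spatial frequencies. A minimal sketch of that relation (the pitch values in the example are hypothetical, and the full model in the paper also computes intensity from the beam profile and mask transmittance):

```python
def moire_period(pitch_a, pitch_b):
    """Beat period of two nearly equal pitches: 1 / |1/pa - 1/pb|.
    The closer the pitches, the longer (and usually more visible) the moire."""
    fa, fb = 1.0 / pitch_a, 1.0 / pitch_b   # spatial frequencies
    if fa == fb:
        return float("inf")                 # identical pitches: no moire
    return 1.0 / abs(fa - fb)
```

    For example, a 0.25 mm mask pitch against a 0.26 mm scan-line pitch beats at a 6.5 mm period, large enough to be conspicuous on screen.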

  19. Methodical Analysis of Adaptive Load Sharing Algorithms

    Microsoft Academic Search

    Orly Kremien; Jeff Kramer

    1992-01-01

    This paper presents a method for qualitative and quantitative analysis of load sharing algorithms, using a number of well known examples as illustration. Algorithm design choices are considered with respect to the main activities of information dissemination and allocation decision making. We argue that nodes must be capable of making local decisions, and for this efficient state dissemination techniques are

  20. Laboratory Policy Methods of Analysis & Quality Assurance

    E-print Network

    Maxwell, Bruce D.

    Laboratory Policy: Methods of Analysis & Quality Assurance. The Cereal Quality Lab (CQL) uses quality data. The laboratory participates in collaborative studies organized by AACC and Wheat Quality. Data generated by the laboratory is treated confidentially and will not be revealed to third parties without prior consent

  1. A Mathematical Analysis of the PML Method

    Microsoft Academic Search

    Saul Abarbanel; David Gottlieb

    1997-01-01

    A detailed mathematical analysis of the Berenger PML method for the electromagnetic equations is carried out on the PDE level, as well as for the semidiscrete and fully discrete formulations. It is shown that the split set of equations is not strongly well-posed and that under certain conditions its solutions may be inappropriate.

  2. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M. (Philadelphia, TN); Ng, Esmond G. (Concord, TN)

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time-serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.
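
    The patent does not spell out its nonlinear measures here, so the sketch below substitutes one common choice from chaotic time series analysis: the correlation sum of a time-delay embedding, applied to two regimes of the logistic map as stand-ins for "similar but different states". All function names and parameters are illustrative assumptions.

```python
def embed(series, dim=2, delay=1):
    """Time-delay embedding of a scalar series into dim-dimensional vectors."""
    n = len(series) - (dim - 1) * delay
    return [tuple(series[i + j * delay] for j in range(dim)) for i in range(n)]

def correlation_sum(points, radius):
    """Fraction of point pairs closer than `radius` (Chebyshev distance);
    a standard nonlinear measure of how tightly the dynamics cluster."""
    n = len(points)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            if max(abs(a - b) for a, b in zip(points[i], points[j])) < radius:
                count += 1
    return 2 * count / (n * (n - 1))

def logistic(r, x0=0.4, n=400, discard=100):
    """Logistic map x -> r*x*(1-x); transient samples are discarded."""
    xs, x = [], x0
    for i in range(n + discard):
        x = r * x * (1 - x)
        if i >= discard:
            xs.append(x)
    return xs
```

    Comparing the measure across states is then a matter of tracking its trend: a periodic regime (r = 3.2) concentrates pairs at a few points and yields a much larger correlation sum at small radius than the spread-out chaotic regime (r = 4.0).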

  3. Automated Analysis of Java Methods for Confidentiality

    E-print Network

    Cerny, Pavol

    Automated Analysis of Java Methods for Confidentiality. Pavol Cerný and Rajeev Alur. University ... midlets for mobile devices, where a central correctness requirement concerns confidentiality of data, and ... are not applicable to checking confidentiality properties that require reasoning about equivalence among executions

  4. Biomedical Signal Analysis - Contemporary Methods and Applications

    Microsoft Academic Search

    F.J. Theis; A. Meyer-Baese

    2010-01-01

    Biomedical signal analysis has become one of the most important visualization and interpretation methods in biology and medicine. Many new and powerful instruments for detecting, storing, transmitting, analyzing, and displaying images have been developed in recent years, allowing scientists and physicians to obtain quantitative measurements to support scientific hypotheses and medical diagnoses. This book offers an overview of a range

  5. ASAAM: Aspectual Software Architecture Analysis Method

    Microsoft Academic Search

    Bedir Tekinerdo

    Software architecture analysis methods aim to predict the quality of a system before it has been developed. In general, the quality of the architecture is validated by analyzing the impact of predefined scenarios on architectural components. Hereby, it is implicitly assumed that an appropriate refactoring of the architecture design can help in coping with critical scenarios and mending the architecture.

  6. Particle Filtering Methods for Subcellular Motion Analysis

    Microsoft Academic Search

    I. Smal

    2009-01-01

    Advances in fluorescent probing and microscopic imaging technology have revolutionized biology in the past decade and have opened the door for studying subcellular dynamical processes. However, accurate and reproducible methods for processing and analyzing the images acquired for such studies are still lacking. Since manual image analysis is time consuming, potentially inaccurate, and poorly reproducible, many biologically highly relevant questions

  7. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  8. Least square methods in structural analysis

    Microsoft Academic Search

    L. G. Selna; M. A. Hakam

    2002-01-01

    A new approach to structural analysis is presented. The method uses equilibrium and deformation geometry field relations directly without resorting to an energy formulation. A matrix vector of errors in the field relations is minimized first with respect to the internal quantities, e.g., the moments. Subsequently the error is minimized with respect to the external forces and displacements. In the

  9. Component Analysis Methods for Computer Vision and Pattern Recognition

    E-print Network

    Botea, Adi

    Component Analysis Methods for Computer Vision and Pattern Recognition. Fernando De la Torre. Computer Vision and Pattern Recognition Easter School, March

  10. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  11. Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport.

    PubMed

    Suk, Heejun

    2012-01-01

    In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe more accurately various advection-dominated transport problems in a real field than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
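
    Suk's semi-analytical tracking scheme itself is not reproduced here; the sketch below shows only the generic random-walk particle tracking (RWPT) step that the article modifies: an advective displacement plus a Gaussian dispersive jump. The 1-D setting and all parameter values are hypothetical.

```python
import math
import random

def rwpt_step(x, v, D, dt, rng):
    """One random-walk particle tracking step in 1-D:
    advective displacement v*dt plus dispersive jump sqrt(2*D*dt)*N(0,1)."""
    return x + v * dt + math.sqrt(2.0 * D * dt) * rng.gauss(0.0, 1.0)

def transport(n_particles=20_000, v=1.0, D=0.01, dt=0.1, steps=100, seed=3):
    """Track a pulse of particles released at x = 0; returns final positions.
    For constant v and D the cloud mean should approach v*t and its
    variance 2*D*t, matching the advection-dispersion equation."""
    rng = random.Random(seed)
    xs = [0.0] * n_particles
    for _ in range(steps):
        xs = [rwpt_step(x, v, D, dt, rng) for x in xs]
    return xs
```

    In a modified RWPT of the kind described in the article, the advective part of `rwpt_step` would be replaced by a more accurate path-line integration through the velocity field, while the dispersive jump is retained.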

  12. Mixed-methods research in pharmacy practice: basics and beyond (part 1).

    PubMed

    Hadi, Muhammad Abdul; Alldred, David Phillip; Closs, S José; Briggs, Michelle

    2013-10-01

    This is the first of two papers which explore the use of mixed-methods research in pharmacy practice. In an era of evidence-based medicine and policy, high-quality research evidence is essential for the development of effective pharmacist-led services. Over the past decade, the use of mixed-methods research has become increasingly common in healthcare, although to date its use has been relatively limited in pharmacy practice research. In this article, the basic concepts of mixed-methods research including its definition, typologies and advantages in relation to pharmacy practice research are discussed. Mixed-methods research brings together qualitative and quantitative methodologies within a single study to answer or understand a research problem. There are a number of mixed-methods designs available, but the selection of an appropriate design must always be dictated by the research question. Importantly, mixed-methods research should not be seen as a 'tool' to collect qualitative and quantitative data, rather there should be some degree of 'integration' between the two data sets. If conducted appropriately, mixed-methods research has the potential to generate quality research evidence by combining strengths and overcoming the respective limitations of qualitative and quantitative methodologies. PMID:23418918

  13. Cask crush pad analysis using detailed and simplified analysis methods

    SciTech Connect

    Uldrich, E.D.; Hawkes, B.D.

    1997-12-31

    A crush pad has been designed and analyzed to absorb the kinetic energy of a hypothetically dropped spent nuclear fuel shipping cask into a 44-ft. deep cask unloading pool at the Fluorinel and Storage Facility (FAST). This facility, located at the Idaho Chemical Processing Plant (ICPP) at the Idaho National Engineering and Environmental Laboratory (INEEL), is a US Department of Energy site. The basis for this study is an analysis by Uldrich and Hawkes. The purpose of this analysis was to evaluate various hypothetical cask drop orientations to ensure that the crush pad design was adequate and the cask deceleration at impact was less than 100 g. It is demonstrated herein that a large spent fuel shipping cask, when dropped onto a foam crush pad, can be analyzed by either hand methods or by sophisticated dynamic finite element analysis using computer codes such as ABAQUS. Results from the two methods are compared to evaluate the accuracy of the simplified hand analysis approach.
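
    The hand-method side of such a comparison can be sketched with an energy balance for an idealized crushable foam: the drop energy is absorbed at a constant plateau force, which fixes both the crush depth and the deceleration. The numbers in the example are hypothetical, not the FAST crush pad design, and the sketch neglects pool hydrodynamics and foam densification.

```python
def crush_pad_response(mass, drop_height, plateau_stress, contact_area, g=9.81):
    """Hand-method energy balance for an ideal crushable foam:
    drop energy m*g*h is absorbed at constant force F = sigma*A,
    giving crush depth d = E/F and deceleration a = F/(m*g) in g's."""
    energy = mass * g * drop_height          # J
    force = plateau_stress * contact_area    # N
    crush_depth = energy / force             # m
    decel_g = force / (mass * g)             # deceleration in g's
    return crush_depth, decel_g
```

    With a hypothetical 90-tonne cask, a 13.4 m (44 ft) drop, a 10 MPa foam plateau stress, and 2 m^2 of contact area, the pad crushes about 0.6 m and the deceleration is well under the 100 g limit; a detailed finite element model would refine both numbers.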

  14. Practical use of three-dimensional inverse method for compressor blade design

    Microsoft Academic Search

    S. Damle; T. Dang; J. Stringham; E. Razinsky

    1999-01-01

    The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading

  15. METHODS FOR MONITORING LEACHATE LOSSES UNDER IRRIGATED CORN BEST MANAGEMENT PRACTICES

    Microsoft Academic Search

    Nathan E. Derby; Raymond E. Knighton; Dean D. Steele; Bruce R. Montgomery

    Leaching of applied agricultural chemicals is a process which must be fully understood if we are to reduce agriculture's impact on the environment. A Best Management Practices (BMP) project was initiated in 1989 near Oakes, ND. A primary objective of the study was to develop and evaluate methods for measuring leachate losses from a corn (Zea mays L.) root zone.

  16. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  17. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  18. A comparison of South Australia's driver licensing methods: competency-based training vs. practical examination

    E-print Network

    5th Floor CDRC Building, The Queen Elizabeth Hospital, Woodville Road, Woodville SA 5011, Australia; Transport Systems Centre, School of Geoinformatics, Planning and Building, University of South Australia

  19. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices

  20. Job task analysis: Guide to good practice. [Contains glossary

    Microsoft Academic Search

    T. Mazour; J. Riplinger; M. Wetzel; P. Wolfe; G. Harris

    1989-01-01

    A job analysis is a process used to determine what a job includes. A task analysis is a process used to determine how to perform a job. Simply put, a job/task analysis (JTA) tells you exactly what is included in a particular job and exactly how it is supposed to be done. This manual was designed for the analysis team(s)

  2. Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries

    PubMed Central

    Crocker, Jonny; Bartram, Jamie

    2014-01-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduce a country’s ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  3. Practice characteristics and prior authorization costs: secondary analysis of data collected by SALT-Net in 9 central New York primary care practices

    PubMed Central

    2014-01-01

    Background An increase in prior authorization (PA) requirements from health insurance companies is placing administrative and financial burdens on primary care offices across the United States. As time allocation for these cases continues to grow, physicians are concerned with additional workload and inefficiency in the workplace. The objective is to estimate the effects of practice characteristics on time spent per prior authorization request in primary care practices. Methods Secondary analysis was performed using data on nine primary care practices in Central New York. Practice characteristics and demographics were collected at the onset of the study. In addition, participants were instructed to complete an "event form" (EF) to document each prior authorization event during a 4–6 week period; prior authorizations included requests for medication as well as other health care services. Stepwise Ordinary Least Squares (OLS) Regression was used to model Time in Minutes of each event as an outcome of various factors. Results Prior authorization events (N = 435) took roughly 20 minutes to complete (beta = 20.017, p…

  4. Job task analysis: Guide to good practice. [Contains glossary

    SciTech Connect

    Mazour, T.; Riplinger, J.; Wetzel, M.; Wolfe, P.; Harris, G.

    1989-08-01

    A job analysis is a process used to determine what a job includes. A task analysis is a process used to determine how to perform a job. Simply put, a job/task analysis (JTA) tells you exactly what is included in a particular job and exactly how it is supposed to be done. This manual was designed for the analysis team(s) and supervisors of the team(s) assigned the responsibility of conducting a JTA and/or needs analysis.

  5. Structural sensitivity analysis: Methods, applications and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. The techniques include a finite difference step size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Some of the critical needs in the structural sensitivity area are indicated along with plans for dealing with some of those needs.
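
    The finite-difference step-size selection problem mentioned above balances truncation error (step too large) against round-off error (step too small). The NASA algorithm is not given here; the sketch below uses a crude stand-in heuristic, picking the candidate step whose derivative estimate changes least when the step is halved, since both error sources inflate that change.

```python
import math

def central_diff(f, x, h):
    """Central finite-difference estimate of df/dx."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

def select_step(f, x, candidates):
    """Crude step-size selection (assumed heuristic, not the NASA algorithm):
    pick the step whose estimate changes least when the step is halved."""
    best_h, best_change = None, float("inf")
    for h in candidates:
        change = abs(central_diff(f, x, h) - central_diff(f, x, h / 2.0))
        if change < best_change:
            best_h, best_change = h, change
    return best_h
```

    For smooth functions in double precision the winning step tends to sit near the classic optimum of a few orders of magnitude above machine epsilon, where neither truncation nor round-off dominates.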

  6. Structural sensitivity analysis: Methods, applications, and needs

    NASA Technical Reports Server (NTRS)

    Adelman, H. M.; Haftka, R. T.; Camarda, C. J.; Walsh, J. L.

    1984-01-01

    Some innovative techniques applicable to sensitivity analysis of discretized structural systems are reviewed. These techniques include a finite-difference step-size selection algorithm, a method for derivatives of iterative solutions, a Green's function technique for derivatives of transient response, a simultaneous calculation of temperatures and their derivatives, derivatives with respect to shape, and derivatives of optimum designs with respect to problem parameters. Computerized implementations of sensitivity analysis and applications of sensitivity derivatives are also discussed. Finally, some of the critical needs in the structural sensitivity area are indicated along with Langley plans for dealing with some of these needs.

  7. Finite element methods for integrated aerodynamic heating analysis

    NASA Technical Reports Server (NTRS)

    Peraire, J.

    1990-01-01

    Over the past few years, finite element based procedures for the solution of high-speed viscous compressible flows have been developed. The objective of this research is to build upon the finite element concepts which have already been demonstrated and to develop these ideas to produce a method which is applicable to the solution of large-scale practical problems. The problems of interest range from three-dimensional full vehicle Euler simulations to local analysis of three-dimensional viscous laminar flow. Transient Euler flow simulations involving moving bodies are also to be included. An important feature of the research is the coupling of the flow solution methods with thermal/structural modeling techniques to provide an integrated fluid/thermal/structural modeling capability. The progress made towards achieving these goals during the first twelve-month period of the research is presented.

  8. Eigenmode Analysis of Galaxy Redshift Surveys I. Theory and Methods

    E-print Network

    Michael S. Vogeley; Alexander S. Szalay

    1996-01-30

    We describe a method for estimating the power spectrum of density fluctuations from galaxy redshift surveys that yields improvement in both accuracy and resolution over direct Fourier analysis. The key feature of this analysis is expansion of the observed density field in the unique set of statistically orthogonal spatial functions which obtains for a given survey's geometry and selection function and the known properties of galaxy clustering (the Karhunen-Loeve transform). Each of these eigenmodes of the observed density field optimally weights the data to yield the cleanest (highest signal/noise) possible measure of clustering power as a function of wavelength scale for any survey. Using Bayesian methods, we simultaneously estimate the mean density, power spectrum of density fluctuations, and redshift distortion parameters that best fit the observed data. This method is particularly important for analysis of surveys with small sky coverage, that are comprised of disjoint regions (e.g., an ensemble of pencil beams or slices), or that have large fluctuations in sampling density. We present algorithms for practical application of this technique to galaxy survey data.
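
    The Karhunen-Loeve idea described above can be sketched on a toy covariance (this is not the survey pipeline, just the diagonalization step that makes the modes statistically orthogonal):

```python
import numpy as np

# Toy Karhunen-Loeve expansion: diagonalize the sample covariance of a
# correlated "density" field so each mode carries uncorrelated power.
rng = np.random.default_rng(1)
n_cells, n_real = 16, 2000
mix = np.eye(n_cells) + 0.1 * np.tril(rng.normal(0, 1, (n_cells, n_cells)), -1)
data = rng.normal(0, 1, (n_real, n_cells)) @ mix.T   # correlated realizations

C = np.cov(data, rowvar=False)            # sample covariance of the field
evals, evecs = np.linalg.eigh(C)          # KL modes = eigenvectors of C
coeffs = data @ evecs                     # expansion coefficients per mode
mode_cov = np.cov(coeffs, rowvar=False)   # diagonal: modes are uncorrelated
```

    In the survey application the covariance is built from the survey geometry, selection function, and a prior clustering model rather than estimated from realizations, and the mode amplitudes feed a Bayesian likelihood for the power spectrum.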

  9. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), reduce the weight [under 100 lb (45 kg)], and reduce the power consumption (under 100 W). This method can be used in microscope or macroscope configurations to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  10. Results from three years of the world's largest interlaboratory comparison for total mercury and methylmercury: Method performance and best practices

    NASA Astrophysics Data System (ADS)

    Creswell, J. E.; Engel, V.; Carter, A.; Davies, C.

    2013-12-01

    Brooks Rand Instruments has conducted the world's largest interlaboratory comparison study for total mercury and methylmercury in natural waters annually for three years. Each year, roughly 50 laboratories registered to participate and the majority of participants submitted results. Each laboratory was assigned a performance score based on the distance between its results and the consensus mean, as well as the precision of its replicate analyses. Participants were also asked to provide detailed data on their analytical methodology and equipment. We used the methodology data and performance scores to assess the performance of the various methods reported and equipment used. Although the majority of methods in use show no systematic trend toward poor analytical performance, there are noteworthy exceptions. We present results from each of the three years of the interlaboratory comparison exercise, as well as aggregated method performance data. We compare the methods used in this study to methods from other published interlaboratory comparison studies and present a list of recommended best practices. Our goals in creating a list of best practices are to maximize participation, ensure inclusiveness, minimize non-response bias, guarantee high data quality, and promote transparency of analysis. We seek to create a standardized methodology for interlaboratory comparison exercises for total mercury and methylmercury analysis in water, which will lead to more directly comparable results between studies. We show that in most cases, the coefficient of variation between labs measuring replicates of the same sample is greater than 20% after the removal of outlying data points (e.g. Figure 1). It is difficult to make comparisons between studies and ecosystems with such a high variability between labs. We highlight the need for regular participation in interlaboratory comparison studies and continuous analytical method improvement in order to ensure accurate data. Figure 1: Results from one sample analyzed in the 2013 Interlaboratory Comparison Study.
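
    The between-lab coefficient of variation cited above can be sketched as follows; the replicate values and the 2-sigma outlier screen are illustrative, not the study's data or procedure.

```python
import statistics

# Hypothetical replicate results (ng/L) for one sample, one value per lab;
# neither the values nor the 2-sigma outlier screen are the study's.
lab_results = [4.1, 5.0, 3.2, 4.8, 6.0, 2.9, 5.5, 19.7]

mean = statistics.mean(lab_results)
sd = statistics.stdev(lab_results)
kept = [x for x in lab_results if abs(x - mean) <= 2 * sd]   # drop outliers

# Between-lab coefficient of variation, in percent.
cv_pct = 100 * statistics.stdev(kept) / statistics.mean(kept)
```

    Even after the single gross outlier is removed, the remaining spread leaves the CV above the 20% figure the abstract highlights.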

  11. Job task analysis: Guide to good practice. [Contains glossary]

    SciTech Connect

    Mazour, T.; Riplinger, J.; Wetzel, M.; Wolfe, P.; Harris, G.

    1989-08-01

    A JOB ANALYSIS is a process used to determine what a job includes. A TASK ANALYSIS is a process used to determine how to perform a job. Simply put, a job/task analysis (JTA) tells you exactly what is included in a particular job and exactly how it is supposed to be done. This manual was designed for the analysis team(s) and supervisors of the team(s) assigned the responsibility of conducting a JTA and/or needs analysis. However, anyone involved with a JTA or needs analysis (such as subject-matter experts) may find this information helpful.

  12. Measurement methods for human exposure analysis.

    PubMed Central

    Lioy, P J

    1995-01-01

    The general methods used to complete measurements of human exposures are identified and illustrations are provided for the cases of indirect and direct methods used for exposure analysis. The application of the techniques for external measurements of exposure, microenvironmental and personal monitors, are placed in the context of the need to test hypotheses concerning the biological effects of concern. The linkage of external measurements to measurements made in biological fluids is explored for a suite of contaminants. This information is placed in the context of the scientific framework used to conduct exposure assessment. Examples are taken from research on volatile organics and for a large scale problem: hazardous waste sites. PMID:7635110

  13. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  14. Analysis of factors influencing project cost estimating practice

    Microsoft Academic Search

    Akintola Akintoye

    2000-01-01

    Although extensive research has been undertaken on factors influencing the decision to tender and mark-up and tender price determination for construction projects, very little of this research contains information appropriate to the factors involved in costing construction projects. The object of this study was to gain an understanding of the factors influencing contractors' cost estimating practice. This was achieved through

  15. Foreign Corrupt Practices Act: A legal and moral analysis

    Microsoft Academic Search

    Bill Shaw; Ashland Oil

    1988-01-01

    The author examines the categories of bribes that are prohibited under the Foreign Corrupt Practices Act from the perspective of three significant moral theories: utility, rights and justice. He concludes that the Act does not go too far in demanding ethical behaviors from U.S. business people doing business in foreign markets, therefore, it is not in need of a major

  16. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  17. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

    Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  18. A Meta-Analysis of Intervention Research with Problem Behavior: Treatment Validity and Standards of Practice.

    ERIC Educational Resources Information Center

    Scotti, Joseph R.; And Others

    1991-01-01

    This meta-analysis of the developmental disabilities literature on remediation of problem behaviors evaluated relations between standards of practice, intervention and participant characteristics, and treatment validity. Results largely failed to support widespread assumptions of clinical practice such as the superiority of more intrusive…

  19. Nursing Faculty Decision Making about Best Practices in Test Construction, Item Analysis, and Revision

    ERIC Educational Resources Information Center

    Killingsworth, Erin Elizabeth

    2013-01-01

    With the widespread use of classroom exams in nursing education there is a great need for research on current practices in nursing education regarding this form of assessment. The purpose of this study was to explore how nursing faculty members make decisions about using best practices in classroom test construction, item analysis, and revision in…

  20. Reporting Practices in Confirmatory Factor Analysis: An Overview and Some Recommendations

    ERIC Educational Resources Information Center

    Jackson, Dennis L.; Gillaspy, J. Arthur, Jr.; Purc-Stephenson, Rebecca

    2009-01-01

    Reporting practices in 194 confirmatory factor analysis studies (1,409 factor models) published in American Psychological Association journals from 1998 to 2006 were reviewed and compared with established reporting guidelines. Three research questions were addressed: (a) how do actual reporting practices compare with published guidelines? (b) how…

  1. A Practical Ion Trap Mass Spectrometer for the Analysis of Peptides by Matrix-Assisted Laser

    E-print Network

    Chait, Brian T.

    The new instrument is demonstrated to be a highly practical tool for analyzing proteins. In particular, mixtures containing as many as 30 peptide components can be rapidly and sensitively analyzed.

  2. Bridging Work Practice and System Design: Integrating Systemic Analysis, Appreciative Intervention and Practitioner Participation

    Microsoft Academic Search

    Helena Karasti

    2001-01-01

    This article discusses the integration of work practice and system design. By scrutinising the unfolding discourse of workshop participants the co-construction of work practice issues as relevant design considerations is described. Through a mutual exploration of ethnography and participatory design the contributing constituents to the co-construction process are identified and put forward as elements in the integration of "systemic analysis"

  3. Digital dream analysis: a revised method.

    PubMed

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. PMID:25286125
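
    A toy sketch of the word search approach described above follows; the two categories are illustrative, not the SDDb's 40-category template.

```python
import re

# Count the fraction of dream reports containing at least one word from
# each category. Categories and reports here are invented examples.
categories = {
    "friends_family": {"mother", "father", "friend", "brother", "sister"},
    "fear": {"afraid", "scared", "terrified", "fear"},
}

reports = [
    "I was scared and ran to my mother.",
    "My friend and I walked along a beach.",
    "A quiet room, nothing else.",
]

def category_rate(reports, words):
    # A report "hits" a category if any category word appears in it.
    hits = sum(
        1 for r in reports
        if words & set(re.findall(r"[a-z]+", r.lower()))
    )
    return hits / len(reports)

rates = {name: category_rate(reports, words)
         for name, words in categories.items()}
```

    Comparing such category rates across dream sets is what lets a digital search reproduce content patterns that hand coding finds far more slowly.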

  4. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  5. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.
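
    The accuracy gap between standard second-order differencing and a compact (Pade-type) scheme of the kind attributed to Lele can be sketched on a model problem; this is the classical fourth-order tridiagonal member of that family on a periodic grid, not the codes' full machinery.

```python
import numpy as np

# Fourth-order Pade-type compact first derivative on a periodic grid:
#   (1/4) f'_{i-1} + f'_i + (1/4) f'_{i+1} = (3/(4h)) (f_{i+1} - f_{i-1})
# compared with the standard second-order central difference.
n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
h = x[1] - x[0]
f, exact = np.sin(x), np.cos(x)

A = (np.eye(n)
     + 0.25 * np.roll(np.eye(n), 1, axis=1)     # couples f'_{i+1}
     + 0.25 * np.roll(np.eye(n), -1, axis=1))   # couples f'_{i-1}
rhs = (3 / (4 * h)) * (np.roll(f, -1) - np.roll(f, 1))
compact = np.linalg.solve(A, rhs)

central = (np.roll(f, -1) - np.roll(f, 1)) / (2 * h)

err_compact = np.max(np.abs(compact - exact))
err_central = np.max(np.abs(central - exact))
```

    On the same three-point data, the implicit coupling of the unknown derivatives buys two extra orders of accuracy, which is the kind of fidelity gain the comparison above is probing.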

  6. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model, although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  7. Meaning and challenges in the practice of multiple therapeutic massage modalities: a combined methods study

    PubMed Central

    2011-01-01

    Background Therapeutic massage and bodywork (TMB) practitioners are predominantly trained in programs that are not uniformly standardized, and in variable combinations of therapies. To date no studies have explored this variability in training and how this affects clinical practice. Methods Combined methods, consisting of a quantitative, population-based survey and qualitative interviews with practitioners trained in multiple therapies, were used to explore the training and practice of TMB practitioners in Alberta, Canada. Results Of the 5242 distributed surveys, 791 were returned (15.1%). Practitioners were predominantly female (91.7%), worked in a range of environments, primarily private (44.4%) and home clinics (35.4%), and were not significantly different from other surveyed massage therapist populations. Seventy-seven distinct TMB therapies were identified. Most practitioners were trained in two or more therapies (94.4%), with a median of 8 and range of 40 therapies. Training programs varied widely in number and type of TMB components, training length, or both. Nineteen interviews were conducted. Participants described highly variable training backgrounds, resulting in practitioners learning unique combinations of therapy techniques. All practitioners reported providing individualized patient treatment based on a responsive feedback process throughout practice that they described as being critical to appropriately address the needs of patients. They also felt that research treatment protocols were different from clinical practice because researchers do not usually sufficiently acknowledge the individualized nature of TMB care provision. Conclusions The training received, the number of therapies trained in, and the practice descriptors of TMB practitioners are all highly variable. In addition, clinical experience and continuing education may further alter or enhance treatment techniques. 
Practitioners individualize each patient's treatment through a highly adaptive process. Therefore, treatment provision is likely unique to each practitioner. These results may be of interest to researchers considering similar practice issues in other professions. The use of a combined-methods design effectively captured this complexity of TMB practice. TMB research needs to consider research approaches that can capture or adapt to the individualized nature of practice. PMID:21929823

  8. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  9. Contrasting Portraits of Email Practices: Visual approaches to reflection and analysis

    E-print Network

    Golbeck, Jennifer

    Over time, many people accumulate extensive email repositories. This work presents contrasting visualizations that capture hierarchical, correlational, and temporal patterns present in a user's email.

  10. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined, along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.
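
    The coupling of a deterministic model with parameter uncertainty described above can be sketched with a toy Monte Carlo propagation; the point-source flux model and every distribution below are hypothetical, not the paper's.

```python
import math
import random

random.seed(42)

# Toy propagation of parameter uncertainty through a deterministic
# radiant-flux model; all numbers and distributions are hypothetical.
def ignition_probability(trials=20000):
    hits = 0
    for _ in range(trials):
        power = random.gauss(1500.0, 200.0)      # heater power, W
        dist = random.uniform(0.3, 1.5)          # separation, m
        q_crit = random.gauss(800.0, 100.0)      # ignition flux, W/m^2
        q = power / (4 * math.pi * dist ** 2)    # point-source flux at dist
        hits += q > q_crit
    return hits / trials

p_ignite = ignition_probability()
```

    Sampling the inputs and pushing each draw through the deterministic model yields a distribution (here just a frequency) for the ignition outcome, which is the structure the abstract describes.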

  11. Comment on Pearl: Practical implications of theoretical results for causal mediation analysis.

    PubMed

    Imai, Kosuke; Keele, Luke; Tingley, Dustin; Yamamoto, Teppei

    2014-12-01

    Mediation analysis has been extensively applied in psychological and other social science research. A number of methodologists have recently developed a formal theoretical framework for mediation analysis from a modern causal inference perspective. In Imai, Keele, and Tingley (2010), we have offered such an approach to causal mediation analysis that formalizes identification, estimation, and sensitivity analysis in a single framework. This approach has been used by a number of substantive researchers, and in subsequent work we have also further extended it to more complex settings and developed new research designs. In an insightful article, Pearl (2014) proposed an alternative approach that is based on a set of assumptions weaker than ours. In this comment, we demonstrate that the theoretical differences between our identification assumptions and his alternative conditions are likely to be of little practical relevance in the substantive research settings faced by most psychologists and other social scientists. We also show that our proposed estimation algorithms can be easily applied in the situations discussed in Pearl (2014). The methods discussed in this comment and many more are implemented via mediation, an open-source software (Tingley, Yamamoto, Hirose, Keele, & Imai, 2013). PMID:25486116
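
    A minimal linear sketch of the quantity at issue follows: the product-of-coefficients indirect effect under a simple linear model. The actual framework (and the mediation package) handles far more general settings; this is only the textbook special case.

```python
import numpy as np

# Linear product-of-coefficients sketch of the average causal mediation
# effect (ACME); the true indirect effect here is 0.8 * 0.5 = 0.4.
rng = np.random.default_rng(3)
n = 5000
t = rng.integers(0, 2, n)                     # randomized treatment
m = 0.8 * t + rng.normal(0, 1, n)             # mediator model
y = 0.5 * m + 0.3 * t + rng.normal(0, 1, n)   # outcome model

def ols(cols, target):
    X = np.column_stack([np.ones(n)] + cols)
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([t], m)[1]        # effect of treatment on the mediator
b = ols([m, t], y)[1]     # effect of mediator on outcome, holding t fixed
acme = a * b              # indirect (mediated) effect
```

    The identification debate the comment addresses is about when this kind of decomposition is causally warranted, not about the arithmetic itself.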

  12. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate that three blocks of variables (professional background, beliefs/efficacy, and local school context) contributed significantly to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science-content-focused professional development, and the perceived level of professional autonomy.
    Using the results from phase one, the second qualitative phase selected six case study teachers based on their levels of reform-based teaching practices to highlight teachers across the range of practices, from low through average to high levels of implementation. Using multiple interview sources, phase two helped to further explain the variation in levels of reform-based practices. Themes related to teachers' backgrounds, local contexts, and state policy environments were developed as they related to teachers' socialization experiences across these contexts. The results of the qualitative analysis identified the following factors differentiating teachers who enacted reform-based instructional practices from those who did not: 1) extensive science research experiences prior to their preservice teacher preparation; 2) the structure and quality of their field placements; 3) developing and valuing a research-based understanding of teaching and learning as a result of their preservice teacher preparation experiences; 4) the professional culture of their school context where there was support for a high degree of professional autonomy and receiving support from "educational companions" with a specific focus on teacher pedagogy to support student learning; and 5) a greater sense of agency to navigate their districts' interpretation and implementation of state policies. Implications for key stakeholders as well as directions for future research are discussed.
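
    The hierarchical (block-entry) regression strategy described above can be sketched with simulated data; the block names mirror the study's, but nothing else does.

```python
import numpy as np

# Block-entry (hierarchical) OLS: enter predictor blocks in sequence and
# track the cumulative R^2 after each step. Data are simulated.
rng = np.random.default_rng(7)
n = 120                                     # matches the study's sample size
background = rng.normal(size=(n, 2))        # professional background
beliefs = rng.normal(size=(n, 2))           # beliefs/efficacy
context = rng.normal(size=(n, 2))           # local school context
y = (background @ [0.5, 0.3] + beliefs @ [0.4, 0.2]
     + context @ [0.3, 0.3] + rng.normal(0, 1, n))

def r_squared(X, y):
    X1 = np.column_stack([np.ones(len(y)), X])
    resid = y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
    return 1 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

r2_steps, X = [], np.empty((n, 0))
for block in (background, beliefs, context):
    X = np.column_stack([X, block])         # add the next block
    r2_steps.append(r_squared(X, y))        # cumulative R^2 so far
```

    The increment in R^2 at each step is the variance attributed to that block over and above the blocks already entered, which is how a figure like "nearly 38%" is decomposed.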

  13. Methods for Model Selection & Checking Sensitivity Analysis for Bayes Factors

    E-print Network

    Masci, Frank

    Methods for Model Selection & Checking: Sensitivity Analysis for Bayes Factors. A Radical Suggestion.

  14. Accommodations for patients with disabilities in primary care: a mixed methods study of practice administrators.

    PubMed

    Pharr, Jennifer R

    2014-01-01

    Structural barriers that limit access to health care services for people with disabilities have been identified through qualitative studies; however, little is known about how patients with disabilities are accommodated in the clinical setting when a structural barrier is encountered. The purpose of this study was to identify how primary care medical practices in the United States accommodated people with disabilities when a barrier to service was encountered. Primary care practice administrators from the medical management organization were identified through the organization's website. Sixty-three administrators from across the US participated in this study. Practice administrators reported that patients were examined in their wheelchairs (76%), that parts of the exam were skipped when a barrier was encountered (44%), that patients were asked to bring someone with them (52.4%), or that patients were refused treatment due to an inaccessible clinic (3.2%). These methods of accommodation would not be in compliance with requirements of the Americans with Disabilities Act. There was not a significant difference (p > 0.05) in accommodations for patients with disabilities between administrators who could describe the application of the ADA to their clinic and those who could not. Practice administrators need a comprehensive understanding of the array of challenges encountered by patients with disabilities throughout the health care process and of how best to accommodate patients with disabilities in their practice. PMID:24373261

  15. Don Quixote in the digital age: an analysis of traditional editorial practices and current electronic editions 

    E-print Network

    Lugo Ibarra, Cruz Yolanda

    1999-01-01

    DON QUIXOTE IN THE DIGITAL AGE: AN ANALYSIS OF TRADITIONAL EDITORIAL PRACTICES AND CURRENT ELECTRONIC EDITIONS. A Thesis by CRUZ YOLANDA LUGO IBARRA, submitted to the Office of Graduate Studies of Texas A&M University in partial fulfillment of the requirements for the degree of MASTER OF ARTS, December 1999. Major Subject: Modern Languages.

  16. Pulsed plasma measurement method using harmonic analysis

    NASA Astrophysics Data System (ADS)

    Kim, Yu-Sin; Kim, Dong-Hwan; Lee, Hyo-Chang; Chung, Chin-Wook

    2015-06-01

    A phase delay harmonic analysis method (PDHAM) with high time resolution is proposed to measure the plasma parameters of pulsed plasmas. The PDHAM, which is based on the floating harmonic method, applies phase-delayed voltages to a probe tip and obtains each of the currents in the phase domain at a given time. The time resolution of this method is 0.8 μs, and the total measurement is done within 2 s in the case of a pulsed plasma with a frequency of 1 kHz. The measurement result of the plasma parameters was compared with a conventional Langmuir probe using a boxcar mode, and shows good agreement. Because the PDHAM can measure the plasma parameters even in processing discharges, it is expected to be usefully applied to plasma diagnostics for pulsed processing plasmas.

  17. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fi technique, is presented in detail.

  18. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips. It was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied at the same concentration in different formulations on the skin, are presented.

  19. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  20. Developing a preliminary ‘never event’ list for general practice using consensus-building methods

    PubMed Central

    de Wet, Carl; O’Donnell, Catherine; Bowie, Paul

    2014-01-01

    Background The ‘never event’ concept has been implemented in many acute hospital settings to help prevent serious patient safety incidents. Benefits include increasing awareness of highly important patient safety risks among the healthcare workforce, promoting proactive implementation of preventive measures, and facilitating incident reporting. Aim To develop a preliminary list of never events for general practice. Design and setting Application of a range of consensus-building methods in Scottish and UK general practices. Method A total of 345 general practice team members suggested potential never events. Next, ‘informed’ staff (n = 15) developed criteria for defining never events and applied the criteria to create a list of candidate never events. Finally, UK primary care patient safety ‘experts’ (n = 17) reviewed, refined, and validated a preliminary list via a modified Delphi group and by completing a content validity index exercise. Results There were 721 written suggestions received as potential never events. Thematic categorisation reduced this to 38. Five criteria specific to general practice were developed and applied to produce 11 candidate never events. The expert group endorsed a preliminary list of 10 items with a content validity index (CVI) score of >80%. Conclusion A preliminary list of never events was developed for general practice through practitioner experience and consensus-building methods. This is an important first step to determine the potential value of the never event concept in this setting. It is now intended to undertake further testing of this preliminary list to assess its acceptability, feasibility, and potential usefulness as a safety improvement intervention. PMID:24567655

  1. Regulating forest practices in Texas: a problem analysis

    E-print Network

    Dreesen, Alan D

    1977-01-01

    Stabilization and Conservation Service (ASCS) (Federal) U. S. Forest Service (USFS) (Federal) Environmental Protection Agency (EPA) (Federal) Soil Conservation Service (SCS) (Federal) Texas Forest Service (TFS) (State) Texas Agricultural Extension Service... application in future research efforts dealing with impacts of various forest practices. Dr. R. G. Merrifield, Head, Department of Forest Science served as a lighthouse throughout the problem selection and research processes. His experience and judgement...

  2. Sensitivity Analysis of Parallel Manipulators using an Interval Linearization Method

    E-print Network

    Paris-Sud XI, Université de

    Sensitivity Analysis of Parallel Manipulators using an Interval Linearization Method. An interval linearization method for the sensitivity analysis of manipulators to variations in their geometric parameters; the linearization method automatically detects such situations. Keywords: parallel manipulators, sensitivity

  3. A Comparison of Low and High Structure Practice for Learning Interactional Analysis Skills

    ERIC Educational Resources Information Center

    Davis, Matthew James

    2011-01-01

    Innovative training approaches in work domains such as professional athletics, aviation, and the military have shown that specific types of practice can reliably lead to higher levels of performance for the average professional. This study describes the development of an initial effort toward creating a similar practice method for psychotherapy…

  4. Structural correlation method for model reduction and practical estimation of patient specific parameters illustrated on heart rate regulation.

    PubMed

    Ottesen, Johnny T; Mehlsen, Jesper; Olufsen, Mette S

    2014-11-01

    We consider the inverse and patient specific problem of short term (seconds to minutes) heart rate regulation specified by a system of nonlinear ODEs and corresponding data. We show how a recent method termed the structural correlation method (SCM) can be used for model reduction and for obtaining a set of practically identifiable parameters. The structural correlation method includes two steps: sensitivity and correlation analysis. When combined with an optimization step, it is possible to estimate model parameters, enabling the model to fit dynamics observed in data. This method is illustrated in detail on a model predicting baroreflex regulation of heart rate and applied to analysis of data from a rat and healthy humans. Numerous mathematical models have been proposed for prediction of baroreflex regulation of heart rate, yet most of these have been designed to provide qualitative predictions of the phenomena, though some recent models have been developed to fit observed data. In this study we show that the model put forward by Bugenhagen et al. can be simplified without loss of its ability to predict measured data and to be interpreted physiologically. Moreover, we show that with minimal changes in nominal parameter values the simplified model can be adapted to predict observations from both rats and humans. The use of these methods makes the model suitable for estimation of parameters from individuals, allowing it to be adopted for diagnostic procedures. PMID:25050793
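    The two SCM steps described above, sensitivity analysis followed by correlation analysis, can be sketched numerically. The helper names and the toy exponential model below are illustrative assumptions, not the heart rate model from the abstract:

```python
import numpy as np

def sensitivity_matrix(model, params, t, h=1e-6):
    """Finite-difference sensitivities of the model output with respect
    to each parameter: S[:, j] = d model / d params[j]."""
    base = model(t, params)
    S = np.empty((base.size, params.size))
    for j in range(params.size):
        p = params.copy()
        step = h * max(1.0, abs(p[j]))
        p[j] += step
        S[:, j] = (model(t, p) - base) / step
    return S

def parameter_correlations(S):
    """Pairwise correlations of the parameter estimates implied by the
    sensitivity matrix; entries near +/-1 flag practically
    unidentifiable parameter pairs, candidates for model reduction."""
    C = np.linalg.inv(S.T @ S)        # covariance up to a noise factor
    d = np.sqrt(np.diag(C))
    return C / np.outer(d, d)

# Toy two-parameter model y = a * exp(-b t) observed on a time grid.
t = np.linspace(0.0, 5.0, 50)
model = lambda t, p: p[0] * np.exp(-p[1] * t)
S = sensitivity_matrix(model, np.array([2.0, 0.7]), t)
R = parameter_correlations(S)
```

    In a full SCM workflow, strongly correlated parameter pairs would be fixed or merged before the optimization step fits the reduced model to data.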

  5. Problems with outlier test methods in flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Hu, Siyi

    1987-12-01

    In flood frequency analysis, the term "outlier" is commonly used to denote large floods in the systematic record or historical floods which lie far above the majority of the floods in the sample. The mere existence of these outliers complicates the frequency analysis procedure. In order to avoid any subjectivity in the detection and treatment of outliers, the U.S. Water Resources Council (WRC) recommended a method based on the principles of hypothesis testing. In spite of the fact that it has been extensively applied in the United States, there are some theoretical and practical aspects which require further consideration. A study of principles reveals that outlier tests in a statistical context postulate an assumption that outliers have a unique distribution which is different from that of the remaining sample observations. Thus, the theory underlying the outlier test is in conflict with the phenomenon of outliers in flood data because it is generally accepted that both historical floods and extraordinary floods in systematic records all come from a common unknown population including all floods. Consequently, it would not be reasonable to introduce outlier tests into a flood frequency analysis. In addition, the so-called "masking effect" encountered in the practical use of the outlier test method in U.S. Water Resources Council (1981) is analytically discussed. Observed flood records at several stations are used to illustrate that the test in WRC Bulletin 17B does not guarantee the detection of outliers if more than one is present in the sample due to this "masking effect".
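    The "masking effect" can be demonstrated with a deterministic toy example. The sketch below uses a simplified one-sided, Grubbs-style test in log space with an arbitrary threshold k; it is not the actual Bulletin 17B procedure and k = 2.2 is not one of its tabulated K_N values:

```python
import numpy as np

def flags_high_outlier(peaks, k=2.2):
    """Flag the largest flood if its log exceeds mean + k * std of the
    log-transformed sample. Illustrative threshold, not Bulletin 17B."""
    logs = np.log10(peaks)
    return bool(logs.max() > logs.mean() + k * logs.std(ddof=1))

# Ten ordinary annual peaks (log10 values between 2.8 and 3.2)...
base = 10.0 ** np.array([2.8, 2.85, 2.9, 2.95, 3.0,
                         3.0, 3.05, 3.1, 3.15, 3.2])
one = np.append(base, 10.0 ** 4.0)                 # ...plus one extreme flood
two = np.append(base, [10.0 ** 4.0, 10.0 ** 3.9])  # ...plus two extreme floods

# The single extreme flood is detected, but adding a second inflates the
# sample mean and standard deviation enough to hide both: masking.
```

    Because the test statistic is built from the sample mean and standard deviation, each additional extreme value raises the detection threshold, which is exactly why the test cannot guarantee detection when more than one outlier is present.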

  6. Practical and Accurate Low-Level Pointer Analysis

    Microsoft Academic Search

    Bolei Guo; Matthew J. Bridges; Spyridon Triantafyllis; Guilherme Ottoni; Easwaran Raman; David I. August

    2005-01-01

    Pointer analysis is traditionally performed once, early in the compilation process, upon an intermediate representation (IR) with source-code semantics. However, performing pointer analysis only once at this level imposes a phase-ordering constraint, causing alias information to become stale after subsequent code transformations. Moreover, high-level pointer analysis cannot be used at link time or run time, where the source code is

  7. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the problems with the FMED, namely that it is discontinuous and flat over each fractile interval. We present a heuristic approximation to a distribution if, in addition to its fractiles, we also know it is continuous, and work through full examples to illustrate the approach.
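    The shape described, flat over each fractile interval and discontinuous at the fractile points, is a piecewise-uniform density. A minimal sketch (the function name and the example fractiles are illustrative assumptions):

```python
import numpy as np

def fractile_maxent_pdf(xs, ps):
    """Maximum entropy density given fractiles: piecewise uniform.

    xs: increasing fractile points x_0 < ... < x_n.
    ps: cumulative probabilities at those points, p_0 = 0, ..., p_n = 1.
    Returns pdf(x), constant over each fractile interval and zero outside.
    """
    xs, ps = np.asarray(xs, float), np.asarray(ps, float)
    heights = np.diff(ps) / np.diff(xs)   # probability mass / interval width

    def pdf(x):
        i = np.clip(np.searchsorted(xs, x, side="right") - 1,
                    0, len(heights) - 1)
        inside = (x >= xs[0]) & (x <= xs[-1])
        return np.where(inside, heights[i], 0.0)

    return pdf

# Median at 1.0 on the support [0, 3]: density 0.5 on [0, 1), 0.25 on [1, 3].
pdf = fractile_maxent_pdf([0.0, 1.0, 3.0], [0.0, 0.5, 1.0])
```

    Between consecutive fractiles the density is constant, which reproduces the discontinuity and flatness that motivate the paper's heuristic continuous approximation.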

  8. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
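    For reference, the three variogram model families compared above have simple closed forms. This sketch uses the common "practical range" convention (the factor of 3) for the exponential and Gaussian models; parameter names are illustrative:

```python
import numpy as np

def spherical(h, nugget, sill, rng):
    """Spherical variogram: reaches the sill exactly at the range."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, sill, np.where(h == 0, 0.0, g))

def exponential(h, nugget, sill, rng):
    """Exponential variogram: approaches the sill asymptotically."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * h / rng))
    return np.where(h == 0, 0.0, g)

def gaussian(h, nugget, sill, rng):
    """Gaussian variogram: parabolic near the origin, very smooth."""
    h = np.asarray(h, float)
    g = nugget + (sill - nugget) * (1.0 - np.exp(-3.0 * (h / rng) ** 2))
    return np.where(h == 0, 0.0, g)
```

    The Gaussian model's flat behavior near the origin implies very smooth fields, one plausible reason it underperformed on noisy workplace measurements relative to the exponential and spherical forms.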

  9. Practical decoy-state quantum key distribution method considering dark count rate fluctuation

    NASA Astrophysics Data System (ADS)

    Zhou, Yuan-yuan; Jiang, Hua; Wang, Ying-jian

    2012-09-01

    Considering fluctuant dark count rate in practical quantum key distribution (QKD) system, a new decoy-state method with one vacuum state and one weak decoy state is presented based on a heralded single photon source (HSPS). The method assumes that the dark count rate of each pulse is random and independent. The lower bound of the count rate and the upper bound of the error rate of a single photon state are estimated. The method is applied to the decoy-state QKD system with and without the fluctuation of dark count rate. Because the estimation of the upper bound of a single photon state's error rate is stricter, the method can obtain better performance than the existing methods under the same condition of implementation.

  10. A practical database method for predicting arrivals of “average” interplanetary shocks at Earth

    Microsoft Academic Search

    X. S. Feng; Y. Zhang; W. Sun; M. Dryer; C. D. Fry; C. S. Deehr

    2009-01-01

    A practical database method for predicting the interplanetary shock arrival time at L1 point is presented here. First, a shock transit time database (hereinafter called Database-I) based on HAFv.1 (version 1 of the Hakamada-Akasofu-Fry model) is preliminarily established with hypothetical solar events. Then, on the basis of the prediction test results of 130 observed solar events during the period from

  11. How nurse teachers keep up-to-date: their methods and practices

    Microsoft Academic Search

    Christine Love

    1996-01-01

    A total of 316 nurse teachers completed a self-administered questionnaire in an investigation of the methods and practices they use to keep up-to-date. The study was in two stages. In the first stage, 240 respondents were asked to rate 13 activities as indispensable, useful or of minimal value for updating. Responses were not found to depend significantly on qualifications or

  12. Development of a Practical Method for Using Ozone Gas as a Virus Decontaminating Agent

    Microsoft Academic Search

    James B. Hudson; Manju Sharma; Selvarani Vimalanathan

    2009-01-01

    Our objective was to develop a practical method of utilizing the known anti-viral properties of ozone in a mobile apparatus that could be used to decontaminate rooms in health care facilities, hotels and other buildings. Maximum anti-viral efficacy required a short period of high humidity (>90% relative humidity) after the attainment of peak ozone gas concentration (20–25 ppm). All 12

  13. Probabilistic Requirements (Partial) Verification Methods Best Practices Improvement. Variables Acceptance Sampling Calculators: Empirical Testing. Volume 2

    NASA Technical Reports Server (NTRS)

    Johnson, Kenneth L.; White, K. Preston, Jr.

    2012-01-01

    The NASA Engineering and Safety Center was requested to improve on the Best Practices document produced for the NESC assessment, Verification of Probabilistic Requirements for the Constellation Program, by giving a recommended procedure for using acceptance sampling by variables techniques as an alternative to the potentially resource-intensive acceptance sampling by attributes method given in the document. In this paper, the results of empirical tests intended to assess the accuracy of acceptance sampling plan calculators implemented for six variable distributions are presented.

  14. Detailed balance The key idea in most practical MCMC methods is reversibility or

    E-print Network

    Green, Peter

    Detailed balance. The key idea in most practical MCMC methods is reversibility, or detailed balance: a chain with transition kernel P is in detailed balance with respect to the target distribution π if π(x)P(x, x′) = π(x′)P(x′, x) for all x, x′. Why does detailed balance ensure the correct stationary distribution? Suppose π(x)P(x, x′) = π(x′)P(x′, x); summing both sides over x gives "global balance": Σ_x π(x)P(x, x′) = π(x′).
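    The argument can be checked numerically. A minimal sketch, assuming a three-state toy target with a uniform symmetric proposal and Metropolis acceptance (a standard construction that satisfies detailed balance):

```python
import numpy as np

# Target distribution pi on three states and a uniform symmetric proposal Q.
pi = np.array([0.2, 0.3, 0.5])
Q = np.full((3, 3), 1.0 / 3.0)

# Metropolis acceptance min(1, pi(y)/pi(x)) yields the transition kernel P.
P = np.zeros((3, 3))
for x in range(3):
    for y in range(3):
        if y != x:
            P[x, y] = Q[x, y] * min(1.0, pi[y] / pi[x])
    P[x, x] = 1.0 - P[x].sum()        # rejected moves stay at x

# flux[x, y] = pi(x) P(x, y); detailed balance says flux is symmetric,
# and summing its columns recovers pi (global balance / stationarity).
flux = pi[:, None] * P
```

    Symmetry of `flux` is the detailed balance condition, and `pi @ P == pi` is the global balance obtained by summing it over x, exactly as in the derivation above.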

  15. Methods for Proteomic Analysis of Transcription Factors

    PubMed Central

    Jiang, Daifeng; Jarrett, Harry W.; Haskins, William E.

    2009-01-01

    Investigation of the transcription factor (TF) proteome presents challenges including the large number of low abundance and post-translationally modified proteins involved. Specialized purification and analysis methods have been developed over the last decades which facilitate the study of the TF proteome and these are reviewed here. Generally applicable proteomics methods that have been successfully applied are also discussed. TFs are selectively purified by affinity techniques using the DNA response element (RE) as the basis for highly specific binding, and several agents have been discovered that either enhance binding or diminish non-specific binding. One such affinity method called “trapping” enables purification of TFs bound to nM concentrations and recovery of TF complexes in a highly purified state. The electrophoretic mobility shift assay (EMSA) is the most important assay of TFs because it provides both measures of the affinity and amount of the TF present. Southwestern (SW) blotting and DNA-protein crosslinking (DPC) allow in vitro estimates of DNA-binding-protein mass, while chromatin immunoprecipitation (ChIP) allows confirmation of promoter binding in vivo. Two-dimensional gel electrophoresis methods (2-DE), and 3-DE methods, which combine EMSA with 2-DE, allow further resolution of TFs. The synergy of highly selective purification and analytical strategies has led to an explosion of knowledge about the TF proteome and the proteomes of other DNA- and RNA-binding proteins. PMID:19726046

  16. Screening Workers: An Examination and Analysis of Practice and Public Policy.

    ERIC Educational Resources Information Center

    Greenfield, Patricia A.; And Others

    1989-01-01

    Discusses methods of screening job applicants and issues raised by screening procedures. Includes legal ramifications, current practices in Britain and the United States, future directions, and the employment interview. (JOW)

  17. Program Analysis and Monitoring (PAM) in Writing: Practical Implementation.

    ERIC Educational Resources Information Center

    Hayford, Paul D.

    Program Analysis and Monitoring (PAM) in Writing is a decision-support tool which provides the principal with an informative report, the Program Analysis Report, that identifies needs and suggests solutions to problems in educational writing programs. The Report relates information on student achievement in writing to information on how resources…

  18. A practical method to avoid zero-point leak in molecular dynamics calculations: Application to the water dimer

    NASA Astrophysics Data System (ADS)

    Czakó, Gábor; Kaledin, Alexey L.; Bowman, Joel M.

    2010-04-01

    We report the implementation of a previously suggested method to constrain a molecular system to have mode-specific vibrational energy greater than or equal to the zero-point energy in quasiclassical trajectory calculations [J. M. Bowman et al., J. Chem. Phys. 91, 2859 (1989); W. H. Miller et al., J. Chem. Phys. 91, 2863 (1989)]. The implementation is made practical by using a technique described recently [G. Czakó and J. M. Bowman, J. Chem. Phys. 131, 244302 (2009)], where a normal-mode analysis is performed during the course of a trajectory and which gives only real-valued frequencies. The method is applied to the water dimer, where its effectiveness is shown by computing mode energies as a function of integration time. Radial distribution functions are also calculated using constrained quasiclassical and standard classical molecular dynamics at low temperature and at 300 K and compared to rigorous quantum path integral calculations.

  19. Accessing the ethics of complex health care practices: would a "domains of ethics analysis" approach help?

    PubMed

    Kirby, Jeffrey

    2010-06-01

    This paper explores how using a "domains of ethics analysis" approach might constructively contribute to an enhanced understanding (among those without specialized ethics training) of ethically-complex health care practices through the consideration of one such sample practice, i.e., deep and continuous palliative sedation (DCPS). For this purpose, I select four sample ethics domains (from a variety of possible relevant domains) for use in the consideration of this practice, i.e., autonomous choice, motives, actions and consequences. These particular domains were chosen because of their relevance to the analysis of DCPS and their relative ease of access to those without ethics training. The analysis demonstrates that such an approach could facilitate the emergence of accessible arguments and discussion points that could enhance the understanding and appreciation of this and other health care practices with strong ethics dimensions. PMID:20505981

  20. Analysis of Manual Lifting Tasks: A Qualitative Alternative to the NIOSH Work Practices Guide

    Microsoft Academic Search

    W. MONROE KEYSERLING

    1989-01-01

    A new method for evaluating ergonomic stresses on lifting tasks has been developed. This method utilizes the general procedures and hazard classification categories described in the National Institute for Occupational Safety and Health's (NIOSH) Work Practices Guide for Manual Lifting. The quantitative measurements of workplace dimensions and computations used by the NIOSH method to classify a job as “acceptable,” “administrative

  1. Practical use of three-dimensional inverse method for compressor blade design

    SciTech Connect

    Damle, S.; Dang, T. [Syracuse Univ., NY (United States). Dept. of Mechanical, Aerospace and Mfg. Engineering; Stringham, J.; Razinsky, E. [Solar Turbines, Inc., San Diego, CA (United States)

    1999-04-01

    The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading distribution (which is the prescribed flow quantity in this inverse method), the modified blade geometry is predicted to perform better than the original design over a wide range of operating points, including an improvement in choke margin.

  2. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
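    The model structure described, direct consumption acting on every molecular-weight class plus β-oxidation transferring weight from larger to smaller classes, can be sketched as a linear ODE system. The rates, class count, and step sizes below are illustrative assumptions, not the paper's fitted values:

```python
import numpy as np

# w[i]: total weight in molecular-weight class i (class i+1 is larger).
# Direct consumption removes weight from every class; beta-oxidation
# transfers weight from each class to the next-smaller one.
n = 5
consumption = 0.05                     # direct consumption rate, 1/week
oxidation = 0.20                       # beta-oxidation rate, 1/week

A = -(consumption + oxidation) * np.eye(n)
A += oxidation * np.eye(n, k=1)        # inflow from the next-larger class

w = np.ones(n)                         # initial weight distribution
dt, T = 0.01, 5.0                      # integrate over 5 weeks
for _ in range(int(T / dt)):
    w = w + dt * (A @ w)               # explicit Euler step
```

    In the inverse problem discussed above, the two rate constants would be adjusted until the simulated weight distribution matches the measured GPC patterns before and after cultivation.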

  3. High-speed fringe analysis method using frequency demodulation technology

    Microsoft Academic Search

    Yasuhiko Arai; Shunsuke Yokozeki; Tomoharu Yamada

    1994-01-01

    Interferometer fringe pattern analysis is a very precise measuring method. However, this analysis method has not received significant attention in industry because of its long calculation times. The majority of conventional computer analysis methods process an image in the spatial domain using sophisticated algorithms. A new time domain fringe analysis technique is discussed in this

  4. A market analysis comparing the practices of psychiatrists and psychologists.

    PubMed

    Knesper, D J; Belcher, B E; Cross, J G

    1989-04-01

    Data are analyzed that describe the clinical work of representative samples of psychiatrists and clinical psychologists who practiced in one of 62 markets for mental health services in the United States during 1982 and 1983. For psychiatrists, intensity of treatment (i.e., mean face-to-face treatment minutes per patient per month) varied from 107 to 368 minutes vs 124 to 283 minutes for psychologists. Multiple regression models explain these variations somewhat differently for each provider group. Whereas the patient's severity or stage of illness is explanatory for both psychiatrists and psychologists, only psychologists appear to alter intensity of treatment in response to local economic conditions. Psychiatrists have large diversified practices, whereas psychologists tend to treat fewer persons and the bulk of these have less severe mental and emotional conditions. Neither practitioner group appeared to provide services in excess of their perceptions of patient need. These and other important similarities and differences are explored, and the advantages of local market focus for examining relevant public policy issues are discussed. PMID:2649037

  5. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. 
Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

  6. Analysis method for Fourier transform spectroscopy

    NASA Technical Reports Server (NTRS)

    Park, J. H.

    1983-01-01

    A fast Fourier transform technique is given for the simulation of those distortion effects in the instrument line shape of the interferometric spectrum that are due to errors in the measured interferogram. The technique is applied to analyses of atmospheric absorption spectra and laboratory spectra. It is shown that the nonlinear least squares method can retrieve the correct information from the distorted spectrum. Analyses of HF absorption spectra obtained in a laboratory and solar CO absorption spectra gathered by a balloon-borne interferometer indicate that the retrieved amount of absorbing gas is less than the correct value in most cases, if the interferogram distortion effects are not included in the analysis.
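
    As a minimal sketch of the distortion mechanism (not the paper's retrieval code), the following pure-Python DFT shows how a hypothetical sampling-position error in the interferogram depresses the recovered line amplitude, consistent with the under-retrieval noted above; N, k0 and phase_err are illustrative values.

```python
import cmath
import math

def dft(signal):
    """Naive discrete Fourier transform (adequate for small N)."""
    N = len(signal)
    return [sum(signal[n] * cmath.exp(-2j * math.pi * k * n / N)
                for n in range(N)) for k in range(N)]

N, k0 = 256, 20          # number of samples, line position in DFT bins
phase_err = 0.3          # fractional sampling-position error (hypothetical)

# Interferogram of a single monochromatic line, ideal and shifted.
ideal = [math.cos(2 * math.pi * k0 * n / N) for n in range(N)]
shifted = [math.cos(2 * math.pi * k0 * (n + phase_err) / N) for n in range(N)]

# In FTS the spectrum is taken from the real part; a shift error rotates
# the complex amplitude, so the real-part peak drops by cos(2*pi*k0*err/N).
peak_true = dft(ideal)[k0].real
peak_dist = dft(shifted)[k0].real
```

The distorted peak is lower than the true one, which is the direction of the bias the abstract reports for the retrieved absorber amount.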

  7. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

    The 2009 National Academy of Sciences report on forensics focused criticism on pattern-evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified from dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutiae and to investigate replication of detail in human skin. Results indicated that skin, being a viscoelastic substrate, effectively reduces the resolution with which dental detail can be measured. The conclusions counsel caution in individualization statements.

  8. Numerical analysis method for linear induction machines.

    NASA Technical Reports Server (NTRS)

    Elliott, D. G.

    1972-01-01

    A numerical analysis method has been developed for linear induction machines such as liquid metal MHD pumps and generators and linear motors. Arbitrary phase currents or voltages can be specified and the moving conductor can have arbitrary velocity and conductivity variations from point to point. The moving conductor is divided into a mesh and coefficients are calculated for the voltage induced at each mesh point by unit current at every other mesh point. Combining the coefficients with the mesh resistances yields a set of simultaneous equations which are solved for the unknown currents.
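
    The final step described above, combining the induced-voltage coefficients with the mesh resistances into simultaneous equations for the unknown currents, amounts to solving a linear system Z·I = V. A sketch with an invented 3-point mesh (the matrix entries are illustrative, not machine data):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting for the mesh equations."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]   # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# Hypothetical impedance matrix: induction coefficients combined with mesh
# resistances; v holds the voltages induced by the specified phase currents.
Z = [[4.0, -1.0,  0.0],
     [-1.0, 4.0, -1.0],
     [0.0, -1.0,  4.0]]
v = [1.0, 0.0, 0.5]
currents = solve(Z, v)
```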

  9. Which method of breastfeeding supplementation is best? The beliefs and practices of paediatricians and nurses

    PubMed Central

    Al-Sahab, Ban; Feldman, Mark; Macpherson, Alison; Ohlsson, Arne; Tamim, Hala

    2010-01-01

    The aims of the present study were to assess the practices of breastfeeding supplementation methods, and to explore the opinions and beliefs of health professionals with regard to breastfeeding supplementation methods and the nipple confusion phenomenon. The study was cross-sectional in design, and participants were recruited from five hospitals in Toronto, Ontario. All of the nurses and attending paediatricians on postpartum floors and in level II nurseries were invited to participate in the study. A total of 87 nurses and 16 paediatricians completed the survey questionnaire. Bottle feeding was the most common breastfeeding supplementation method used in the nurseries, followed by cup feeding. Only 15.0% of the level II nurses agreed that frequent bottle feeds lead to the nipple confusion phenomenon, compared with 44.4% of the postpartum nurses and 56.2% of the paediatricians. Findings demonstrated considerable variation in the practices and beliefs surrounding supplementation methods. A randomized controlled trial comparing the safety, efficiency and subsequent breastfeeding rates of different breastfeeding supplementation methods is warranted. PMID:21886446

  10. Are larger dental practices more efficient? An analysis of dental services production.

    PubMed Central

    Lipscomb, J; Douglass, C W

    1986-01-01

    Whether cost-efficiency in dental services production increases with firm size is investigated through application of an activity analysis production function methodology to data from a national survey of dental practices. Under this approach, service delivery in a dental practice is modeled as a linear programming problem that acknowledges distinct input-output relationships for each service. These service-specific relationships are then combined to yield projections of overall dental practice productivity, subject to technical and organizational constraints. The activity analysis reported here represents arguably the most detailed evaluation yet of the relationship between dental practice size and cost-efficiency, controlling for such confounding factors as fee and service-mix differences across firms. We conclude that cost-efficiency does increase with practice size, over the range from solo to four-dentist practices. Largely because of data limitations, we were unable to test satisfactorily for scale economies in practices with five or more dentists. Within their limits, our findings are generally consistent with results from the neoclassical production function literature. From the standpoint of consumer welfare, the critical question raised (but not resolved) here is whether these apparent production efficiencies of group practice are ultimately translated by the market into lower fees, shorter queues, or other nonprice benefits. PMID:3102404

  11. Principles of Good Practice for Budget Impact Analysis: Report of the ISPOR Task Force on Good Research Practices—Budget Impact Analysis

    Microsoft Academic Search

    Josephine A. Mauskopf; Sean D. Sullivan; Lieven Annemans; Jaime Caro; C. Daniel Mullins; Mark Nuijten; Ewa Orlewska; John Watkins; Paul Trueman

    2007-01-01

    Objectives: There is growing recognition that a comprehensive economic assessment of a new health-care intervention at the time of launch requires both a cost-effectiveness analysis (CEA) and a budget impact analysis (BIA). National regulatory agencies such as the National Institute for Health and Clinical Excellence in England and Wales and the Pharmaceutical Benefits Advisory Committee in Australia,

  12. Best practice analysis of bank branches: An application of DEA in a large Canadian bank

    Microsoft Academic Search

    Claire Schaffnit; Dan Rosen; Joseph C. Paradi

    1997-01-01

    This paper presents a best practice analysis of the Ontario based branches of a large Canadian bank. Consistent with managerial goals, the analysis focuses on the performance of branch personnel; it considers as outputs both transactions and maintenance work. To sharpen our efficiency estimates, we use DEA AR models with output multiplier constraints based on standard transaction and maintenance times.

  13. Analysis of practical slab configurations using automated yield-line analysis and geometric optimization of fracture patterns

    Microsoft Academic Search

    A. C. A. Ramsay; D. Johnson

    1998-01-01

    Automated yield-line analysis and geometric optimization are used in the analysis and design of a number of practical reinforced concrete slab configurations. For two of the chosen configurations comparisons between experimental and theoretical results are possible. The final configuration is included to illustrate the significance and importance of carrying out some form of geometric optimization.

  14. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

    A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica of various graftings and related column parameters such as particle size, core-shell and monolithic formats was studied. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 µm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase compared with an acetonitrile-based one. Therefore, the strategy of shortening the analysis by increasing the flow rate reduced efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 µm, was selected as the best compromise between analysis time, statin separation and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption during sample preparation, 0.5 mg/mL of each statin was found to be the highest concentration that respected detector linearity. These conditions were validated for content determination of each statin in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL in hydro-alcoholic binary mixtures between 35% and 55% ethanol in water. Using atorvastatin instead of its calcium salt improved solubility. Highly concentrated statin solutions offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. PMID:25582487

  15. A concise method for mine soils analysis

    SciTech Connect

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

    A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid-forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO3/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for use as a construction material for the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.

  16. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  17. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

    Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose for the analysis of the modeling methods included in this document is to examine these methods with the goal being to include them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  18. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-03-01

    This introduction provides the chemist, chemical engineer, or materials scientist with a starting point for understanding the applications of dynamic mechanical analysis, its workings, and its advantages and limitations. The book also serves as a systematic guide for manufacturing polymeric materials and components, as well as for developing new materials. Contents include: introduction to dynamic mechanical analysis; basic rheological concepts: stress, strain, and flow; rheology basics: creep-recovery and stress relaxation; dynamic testing; time-temperature scans part 1: transitions in polymers; time and temperature studies part 2: thermosets; frequency scans; DMA applications to real problems: guidelines; and appendix: sample experiments for the DMA.

  19. Homotopy analysis method for quadratic Riccati differential equation

    Microsoft Academic Search

    Yue Tan; Saeid Abbasbandy

    2008-01-01

    In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made among the Adomian decomposition method (ADM), the homotopy perturbation method (HPM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
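
    A quadratic Riccati test problem commonly used in this literature, y' = 2y − y² + 1 with y(0) = 0, has a closed-form solution, which makes it a convenient benchmark for any of these series methods. A sketch that checks a standard fourth-order Runge-Kutta integration against that exact solution (the step count is arbitrary):

```python
import math

def f(t, y):
    """Quadratic Riccati right-hand side: y' = 2y - y**2 + 1."""
    return 2 * y - y * y + 1

def rk4(f, y0, t_end, steps):
    """Classical fourth-order Runge-Kutta integration from t = 0."""
    h, t, y = t_end / steps, 0.0, y0
    for _ in range(steps):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

def exact(t):
    """Closed-form solution satisfying y(0) = 0."""
    c = 0.5 * math.log((math.sqrt(2) - 1) / (math.sqrt(2) + 1))
    return 1 + math.sqrt(2) * math.tanh(math.sqrt(2) * t + c)

numeric = rk4(f, 0.0, 1.0, 1000)
error = abs(numeric - exact(1.0))   # should be tiny at h = 1e-3
```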

  20. Suspension, Race, and Disability: Analysis of Statewide Practices and Reporting

    ERIC Educational Resources Information Center

    Krezmien, Michael P.; Leone, Peter E.; Achilles, Georgianna M.

    2006-01-01

    This analysis of statewide suspension data from 1995 to 2003 in Maryland investigated disproportionate suspensions of minority students and students with disabilities. We found substantial increases in overall rates of suspensions from 1995 to 2003, as well as disproportionate rates of suspensions for African American students, American Indian…

  1. The Analysis of Athletic Performance: Some Practical and Philosophical Considerations

    ERIC Educational Resources Information Center

    Nelson, Lee J.; Groom, Ryan

    2012-01-01

    This article presents a hypothetical dialogue between a notational analyst (NA) recently schooled in the positivistic assessment of athletic performance, an "old-school" traditional coach (TC) who favours subjective analysis, and a pragmatic educator (PE). The conversation opens with NA and TC debating the respective value of quantitative and…

  2. The design, analysis and application of a practical vibrational platform

    NASA Technical Reports Server (NTRS)

    Tcheng, P.

    1977-01-01

    The design, analysis and application of a unique vibrational platform employing flexural pivots is described. A Lagrangian equation for the slightly-damped second-order system was derived and verified experimentally. Platform frequencies from 0.7 to 20 Hz with an extremely low damping coefficient were observed. Various educational and instrumentation applications of this simple mechanical system are included.
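
    A slightly damped second-order system of the kind described can be simulated directly. The sketch below uses illustrative parameters (2 Hz natural frequency, damping ratio 0.01, semi-implicit Euler integration) and recovers the oscillation frequency from zero crossings, mimicking how a platform frequency would be measured:

```python
import math

# Lightly damped second-order model: x'' + 2*zeta*w0*x' + w0**2 * x = 0
f_nat, zeta = 2.0, 0.01            # natural frequency [Hz], damping ratio
w0 = 2 * math.pi * f_nat
h, steps = 1e-4, 200_000           # 20 s of simulated free vibration

x, v = 1.0, 0.0                    # released from rest at unit displacement
crossings, prev = [], x
for i in range(steps):
    # Semi-implicit Euler keeps the oscillation energy well-behaved.
    v += h * (-2 * zeta * w0 * v - w0 * w0 * x)
    x += h * v
    if prev > 0 >= x:              # downward zero crossing: one per period
        crossings.append(i * h)
    prev = x

periods = [b - a for a, b in zip(crossings, crossings[1:])]
f_meas = 1 / (sum(periods) / len(periods))   # measured frequency [Hz]
```

At this low damping ratio the damped frequency is essentially the natural frequency, so the measured value comes out very close to 2 Hz.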

  3. An Analysis of Ethical Considerations in Programme Design Practice

    ERIC Educational Resources Information Center

    Govers, Elly

    2014-01-01

    Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

  4. System Testing Aided by Structured Analysis: A Practical Experience

    Microsoft Academic Search

    Thomas J. Mccabe; G. Gordon Schulmeyer

    1985-01-01

    This paper deals with the use of Structured Analysis just prior to system acceptance testing. Specifically, the drawing of data flow diagrams (DFDs) was done after integration testing. The DFDs provided a picture of the logical flow through the integrated system for thorough system acceptance testing. System test sets were derived from the flows in the DFDs. System test repeatability

  5. Analysis Method for Quantifying Vehicle Design Goals

    NASA Technical Reports Server (NTRS)

    Fimognari, Peter; Eskridge, Richard; Martin, Adam; Lee, Michael

    2007-01-01

    A document discusses a method for using Design Structure Matrices (DSM), coupled with high-level tools representing important life-cycle parameters, to comprehensively conceptualize a flight/ground space transportation system design by dealing with such variables as performance, up-front costs, downstream operations costs, and reliability. This approach also weighs operational approaches based on their effect on upstream design variables, so that linkages between operations and these upstream variables can be established readily, yet defensibly. To avoid the large range of problems that have defeated previous methods of dealing with the complex problems of transportation design, and to cut down the inefficient use of resources, the method identifies those areas that are of sufficient promise and provides a higher grade of analysis for those issues, as well as for the linkages between operations and other factors. Ultimately, the method is designed to save resources and time, and allows for the evolution of operable space transportation system technology, designs, and conceptual system approaches.
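
    A DSM encodes which design tasks consume the outputs of which others; sequencing the tasks is then a topological-sort problem, with any tasks left unordered forming a coupled block that must be iterated. A sketch with hypothetical task names (not from the document):

```python
from collections import defaultdict, deque

# Hypothetical DSM, stored as deps[task] = tasks whose output 'task' consumes.
deps = {
    "performance":     ["up_front_cost"],
    "up_front_cost":   [],
    "operations_cost": ["performance", "reliability"],
    "reliability":     ["up_front_cost"],
}

def sequence(deps):
    """Kahn's algorithm: order tasks so every dependency is resolved first."""
    indeg = {t: len(d) for t, d in deps.items()}
    users = defaultdict(list)
    for t, d in deps.items():
        for u in d:
            users[u].append(t)
    ready = deque(t for t, n in indeg.items() if n == 0)
    order = []
    while ready:
        t = ready.popleft()
        order.append(t)
        for u in users[t]:
            indeg[u] -= 1
            if indeg[u] == 0:
                ready.append(u)
    coupled = [t for t in deps if t not in order]   # cyclic block, if any
    return order, coupled

order, coupled = sequence(deps)
```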

  6. Preventing childhood obesity during infancy in UK primary care: a mixed-methods study of HCPs' knowledge, beliefs and practice

    PubMed Central

    2011-01-01

    Background There is a strong rationale for intervening in early childhood to prevent obesity. Over a quarter of infants gain weight more rapidly than desirable during the first six months of life, putting them at greater risk of obesity in childhood. However, little is known about UK healthcare professionals' (HCPs) approach to primary prevention. This study explored obesity-related knowledge of UK HCPs and the beliefs and current practice of general practitioners (GPs) and practice nurses in relation to identifying infants at risk of developing childhood obesity. Method Survey of UK HCPs (GPs, practice nurses, health visitors, nursery, community and children's nurses). HCPs (n = 116) rated their confidence in providing infant feeding advice and completed the Obesity Risk Knowledge Scale (ORK-10). Semi-structured interviews with a sub-set of 12 GPs and 6 practice nurses were audio-recorded and transcribed verbatim. Thematic analysis was applied using an interpretative, inductive approach. Results GPs were less confident about giving advice about infant feeding than health visitors (p = 0.001) and nursery nurses (p = 0.009) but more knowledgeable about the health risks of obesity (p < 0.001) than nurses (p = 0.009). HCPs who were consulted more often about feeding were less knowledgeable about the risks associated with obesity (r = -0.34, n = 114, p < 0.001). There was no relationship between HCPs' ratings of confidence in their advice and their knowledge of the obesity risk.
Six main themes emerged from the interviews: 1) Attribution of childhood obesity to family environment, 2) Infant feeding advice as the health visitor's role, 3) Professional reliance on anecdotal or experiential knowledge about infant feeding, 4) Difficulties with recognition of, or lack of concern for, infants "at risk" of becoming obese, 5) Prioritising relationship with parent over best practice in infant feeding and 6) Lack of shared understanding for dealing with early years' obesity. Conclusions Intervention is needed to improve health visitors and nursery nurses' knowledge of obesity risk and GPs and practice nurses' capacity to identify and manage infants' at risk of developing childhood obesity. GPs value strategies that maintain relationships with vulnerable families and interventions to improve their advice-giving around infant feeding need to take account of this. Further research is needed to determine optimal ways of intervening with infants at risk of obesity in primary care. PMID:21699698

  7. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
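
    In the spatially simple limit the patent describes, where each pixel contains essentially one chemical species, unmixing degenerates into per-pixel matching against candidate endmember spectra. A toy sketch of that idea (the endmember names and spectra are invented, and this is not the patented algorithm itself):

```python
# Each pixel's spectrum is assumed to be a scaled copy of ONE endmember.
endmembers = {
    "oxide":   [1.0, 0.2, 0.0, 0.1],
    "sulfide": [0.0, 0.8, 1.0, 0.3],
}

def classify(pixel):
    """Pick the endmember whose direction best matches the pixel spectrum
    (cosine similarity), exploiting the one-species-per-pixel assumption."""
    def cosine(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb)
    return max(endmembers, key=lambda name: cosine(pixel, endmembers[name]))

image = [[2.0, 0.4, 0.0, 0.2],      # 2x the oxide spectrum
         [0.0, 1.6, 2.0, 0.6]]      # 2x the sulfide spectrum
labels = [classify(p) for p in image]
```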

  8. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R. (Albuquerque, NM)

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  9. Integration of Formal Job Hazard Analysis & ALARA Work Practice

    SciTech Connect

    NELSEN, D.P.

    2002-09-01

    ALARA work practices have traditionally centered on reducing radiological exposure and controlling contamination. As such, ALARA policies and procedures are not well suited to a wide range of chemical and human health issues. Assessing relative risk, identifying appropriate engineering/administrative controls and selecting proper Personal Protective Equipment (PPE) for non nuclear work activities extends beyond the limitations of traditional ALARA programs. Forging a comprehensive safety management program in today's (2002) work environment requires a disciplined dialog between health and safety professionals (e.g. safety, engineering, environmental, quality assurance, industrial hygiene, ALARA, etc.) and personnel working in the field. Integrating organizational priorities, maintaining effective pre-planning of work and supporting a team-based approach to safety management represents today's hallmark of safety excellence. Relying on the mandates of any single safety program does not provide industrial hygiene with the tools necessary to implement an integrated safety program. The establishment of tools and processes capable of sustaining a comprehensive safety program represents a key responsibility of industrial hygiene. Fluor Hanford has built integrated safety management around three programmatic attributes: (1) Integration of radiological, chemical and ergonomic issues under a single program. (2) Continuous improvement in routine communications among work planning/scheduling, job execution and management. (3) Rapid response to changing work conditions, formalized work planning and integrated worker involvement.

  10. The influence of deliberate practice on musical achievement: a meta-analysis

    PubMed Central

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C.; Wolf, Anna

    2014-01-01

    Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of rc = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music. PMID:25018742
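
    Aggregating correlations across studies, as in this meta-analysis, is commonly done with Fisher's z transform and inverse-variance (n − 3) weights; the abstract does not state the exact estimator used, so treat this as a generic fixed-effect sketch with invented study data:

```python
import math

def aggregate_r(studies):
    """Fixed-effect aggregation of correlations via Fisher's z transform,
    weighting each study by n - 3 (the inverse variance of z)."""
    num = den = 0.0
    for r, n in studies:
        z = math.atanh(r)          # Fisher r-to-z transform
        w = n - 3
        num += w * z
        den += w
    return math.tanh(num / den)    # back-transform the weighted mean z

# Hypothetical practice/achievement correlations as (r, sample size) pairs:
studies = [(0.55, 60), (0.70, 40), (0.60, 120)]
pooled = aggregate_r(studies)
```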

  11. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...false Methods of analysis. 133.5 Section 133.5 Food and Drugs FOOD AND DRUG...PRODUCTS General Provisions § 133.5 Methods of analysis. ...federal_regulations/ibr_locations.html ): (a) Moisture content—section...

  12. 21 CFR 133.5 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...determined by the following methods of analysis from “Official Methods of Analysis of the Association of Official Analytical Chemists,” 13th ed., 1980, which is incorporated by reference (copies are available from the AOAC INTERNATIONAL, 481...

  13. A Method and Its Practice for Teaching the Fundamental Technology of Communication Protocols and Coding

    NASA Astrophysics Data System (ADS)

    Kobayashi, Tetsuji

    Education in information and communication technologies is important for engineering, and it covers terminals, communication media, transmission, switching, software, communication protocols, coding, etc. The proposed teaching method for protocols is based on the HDLC (High-level Data Link Control) procedures using our newly developed software “HDLC trainer”, and includes extensions for understanding other protocols such as TCP/IP. For teaching the coding theory that is applied for error control in protocols, we use both a mathematical programming language and a general-purpose programming language. We have practiced and evaluated the proposed teaching method in our college, and it is shown that the method has remarkable effects on understanding the fundamental technology of protocols and coding.
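
    Error control in HDLC rests on a CRC frame check sequence built on the CCITT polynomial 0x1021. The sketch below implements a simplified, non-reflected variant (the real HDLC FCS additionally reflects the bits and complements the result) to show how a single-character corruption is caught:

```python
def crc16_ccitt(data: bytes, crc: int = 0xFFFF) -> int:
    """Bitwise CRC-16 with the CCITT polynomial 0x1021, the basis of the
    HDLC frame check sequence (simplified: no bit reflection or final XOR)."""
    for byte in data:
        crc ^= byte << 8
        for _ in range(8):
            if crc & 0x8000:
                crc = ((crc << 1) ^ 0x1021) & 0xFFFF
            else:
                crc = (crc << 1) & 0xFFFF
    return crc

frame = b"HDLC information field"
fcs = crc16_ccitt(frame)                 # value sent alongside the frame
corrupted = b"HDLC informatiom field"    # single-character transmission error
detected = crc16_ccitt(corrupted) != fcs
```

Any burst error of 16 bits or fewer, including the one-byte change above, is guaranteed to alter the checksum.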

  14. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

    Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for the approximation is presented to lowest order, showing that there are three problems which must be overcome in practice before it can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytical function of its argument is considered, and two methods of solving it are described. The asymptotic evaluation of the butterfly canonical integral is also addressed.

  15. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.

  16. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
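Once attack-graph edges are weighted by metrics such as attacker effort, identifying high-risk paths reduces to a shortest-path search. A minimal sketch, with hypothetical attack states and effort values rather than anything from the patent, is:

```python
import heapq

def cheapest_attack_path(graph, start, goal):
    """Dijkstra over an attack graph: nodes are attack states, edge
    weights model attacker effort; returns (total_effort, path)."""
    pq = [(0, start, [start])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, effort in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + effort, nxt, path + [nxt]))
    return float("inf"), []

# Toy attack graph: states and efforts are hypothetical
g = {
    "initial": [("user-shell", 3), ("phished-creds", 1)],
    "phished-creds": [("user-shell", 1)],
    "user-shell": [("root", 5)],
}
cost, path = cheapest_attack_path(g, "initial", "root")
```

Here the low-effort route through the phished credentials (total effort 7) beats the direct route (effort 8), which is exactly the kind of "epsilon optimal path" a defender would prioritize for countermeasures.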

  17. Radioisotope method of compound flow analysis

    NASA Astrophysics Data System (ADS)

    Petryka, Leszek; Zych, Marcin; Hanus, Robert; Sobota, Jerzy; Vlasak, Pavel; Malczewska, Beata

    2015-05-01

    The paper presents the application of gamma radiation to the analysis of a multicomponent or multiphase flow. Information such as the content of a selected component in the mixture transported through a pipe is crucial in many industrial and laboratory installations. A properly selected sealed radioactive source and collimators deliver a photon beam penetrating the cross section of the flow. Detectors mounted on the side of the pipe opposite the source allow recording of digital signals representing the composition of the stream. With the present state of electronics, detectors and computer software, significant progress in the know-how of this field may be observed. The paper describes the application of this method to the optimization and control of hydrotransport of solid particles and proposes monitoring that helps prevent pipe clogging and dangerous oscillations.

  18. Initial analysis of space target's stealth methods at laser wavelengths

    NASA Astrophysics Data System (ADS)

    Du, Haitao; Han, Yi; Sun, Huayan; Zhang, Tinghua

    2014-12-01

    Laser stealth for space targets is useful, important and urgent in practice. This paper introduces the defining expression of the laser radar cross section (LRCS) and the general laws governing the factors that influence a space target's LRCS, including surface material type and the target's shape and size. The paper then discusses possible laser stealth methods for space targets in practical applications from the two viewpoints of material stealth and shape stealth. These conclusions and suggestions can provide references for further research on target laser stealth.

  19. International methods validation for soil analysis in the third millennium

    Microsoft Academic Search

    KALRA Yash

    AOAC INTERNATIONAL is dedicated to methods validation and quality measurements in the analytical sciences. It collaborates with the Soil Science Society of America Committee S889 (Coordination of Official Methods of Soil Analysis Committee) on interlaboratory studies of soil analysis. Although AOAC has been conducting the official methods program for several years, the program for soil analysis is less

  20. Face Recognition Technique Using Symbolic Linear Discriminant Analysis Method

    Microsoft Academic Search

    P. S. Hiremath; C. J. Prabhakar

    2006-01-01

    Techniques that can introduce low-dimensional feature representations with enhanced discriminatory power are important in face recognition systems. This paper presents one of the symbolic factor analysis methods, i.e., the symbolic Linear Discriminant Analysis (symbolic LDA) method, for face representation and recognition. Classical factor analysis methods extract features which are single valued in nature to represent face images. These single valued

  1. Periodic Component Analysis: An Eigenvalue Method for Representing

    E-print Network

    Saul, Lawrence K.

    Periodic Component Analysis: An Eigenvalue Method for Representing Periodic Structure in Speech. Abstract: An eigenvalue method is developed for analyzing periodic structure in speech. Signals) and independent component analysis (ICA). Our method--called periodic component analysis (CA)--uses constructive

  2. Practical estimates of field-saturated hydraulic conductivity of bedrock outcrops using a modified bottomless bucket method

    USGS Publications Warehouse

    Mirus, Benjamin B.; Perkins, Kim S.

    2012-01-01

    The bottomless bucket (BB) approach (Nimmo et al., 2009a) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock.
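The Kfs calculation itself follows the analytical solution of Nimmo et al. (2009a), which is not reproduced in this record. As a stand-in illustration only, a generic falling-head relation with an assumed geometry factor G (not the paper's formula) can be sketched as:

```python
import math

def falling_head_k(h0, h1, t, geometry_factor):
    """Generic falling-head estimate K = (G / t) * ln(h0 / h1).

    h0, h1: ponded head at the start and end of the test (m);
    t: elapsed time (s); geometry_factor G (m): an assumed stand-in
    for the method-specific analytical factor, not the BB formula.
    """
    return (geometry_factor / t) * math.log(h0 / h1)

# Hypothetical bedrock test: head falls from 10 cm to 5 cm over one hour
k = falling_head_k(h0=0.10, h1=0.05, t=3600.0, geometry_factor=0.15)
```

Slow infiltration makes the head decline over hours to days, which is why the field procedure above requires a cover against evaporative losses.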

  3. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

    It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in the medical field is the visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; in particular, a CT scan causes X-ray exposure. For these reasons, the medical field needs an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. The article proposes a simple and practical method of visceral fat estimation employing bioelectrical impedance analysis and causal analysis. In the method, the abdominal shape and the dual impedances of the abdominal surface and the body as a whole are measured to estimate the visceral fat area based on a cause-effect structure. The structure is designed according to the nature of abdominal body composition and fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements was 0.88.
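The statistical fine-tuning described above can be illustrated, in heavily simplified form, by an ordinary least-squares fit. The features, coefficients, and data below are synthetic assumptions for illustration, not the study's model:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 180  # same order as the study's sample; the data here is synthetic

# Hypothetical predictors: abdominal width (cm), surface impedance (ohm),
# total body impedance (ohm), plus an intercept column
X = np.column_stack([
    rng.uniform(25, 40, n),
    rng.uniform(10, 60, n),
    rng.uniform(300, 600, n),
    np.ones(n),
])
true_coef = np.array([3.0, 1.5, -0.1, 20.0])  # invented coefficients
y = X @ true_coef + rng.normal(0.0, 5.0, n)   # "visceral fat area" target

coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # ordinary least squares
r = float(np.corrcoef(X @ coef, y)[0, 1])     # model-vs-reference correlation
```

The reported r = 0.88 against CT measurements is the same kind of model-vs-reference correlation computed in the last line, though the study's cause-effect structure is richer than a single linear regression.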

  4. Interpolation methods for shaped reflector analysis

    NASA Technical Reports Server (NTRS)

    Galindo-Israel, Victor; Imbriale, William A.; Rahmat-Samii, Yahya; Veruttipong, Thavath

    1988-01-01

    The diffraction analysis of reflector surfaces which are described only at a discrete set of locations usually leads to the requirement of an interpolation to determine the surface characteristics over a continuum of locations. Two methods of interpolation, the global and the local methods, are presented. The global interpolation representation is a closed-form or series expression valid over the entire surface. The coefficients of a series expression are found by an integration of all of the raw data. Since the number of coefficients used to describe the surface is much smaller than the number of raw data points, the integration effectively provides a smoothing of the raw data. The local interpolation provides a closed-form expression for only a small area of the reflector surface. The subreflector is divided into sectors each of which has constant discretized data. Each area segment is then locally described by a two-dimensional quadratic surface. The second derivative data give the desired smoothed values.
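The local interpolation step, fitting a two-dimensional quadratic surface to one patch of discretized data, can be sketched with a least-squares fit (the grid and surface below are illustrative, not reflector data):

```python
import numpy as np

def fit_quadratic_patch(x, y, z):
    """Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2
    over one patch of discretized surface data; returns the six
    coefficients. Second derivatives (2d, e, 2f) come out smoothed."""
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    coef, *_ = np.linalg.lstsq(A, z, rcond=None)
    return coef

# Recover a known quadratic from noiseless samples on a small grid
gx, gy = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
x, y = gx.ravel(), gy.ravel()
z = 1.0 + 0.5 * x - 0.25 * y + 2.0 * x * x + 0.3 * x * y - 1.5 * y * y
coef = fit_quadratic_patch(x, y, z)
```

Because the patch has more data points (25) than coefficients (6), the fit smooths noisy raw data, which is what supplies usable second-derivative values for diffraction analysis.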

  5. Spelling Practice Intervention: A Comparison of Tablet PC and Picture Cards as Spelling Practice Methods for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Seok, Soonhwa; DaCosta, Boaventura; Yu, Byeong Min

    2015-01-01

    The present study compared a spelling practice intervention using a tablet personal computer (PC) and picture cards with three students diagnosed with developmental disabilities. An alternating-treatments design with a non-concurrent multiple-baseline across participants was used. The aims of the present study were: (a) to determine if…

  6. Flutter and Divergence Analysis using the Generalized Aeroelastic Analysis Method

    NASA Technical Reports Server (NTRS)

    Edwards, John W.; Wieseman, Carol D.

    2003-01-01

    The Generalized Aeroelastic Analysis Method (GAAM) is applied to the analysis of three well-studied checkcases: restrained and unrestrained airfoil models, and a wing model. An eigenvalue iteration procedure is used for converging upon roots of the complex stability matrix. For the airfoil models, exact root loci are given which clearly illustrate the nature of the flutter and divergence instabilities. The singularities involved are enumerated, including an additional pole at the origin for the unrestrained airfoil case and the emergence of an additional pole on the positive real axis at the divergence speed for the restrained airfoil case. Inconsistencies and differences among published aeroelastic root loci and the new, exact results are discussed and resolved. The generalization of a Doublet Lattice Method computer code is described and the code is applied to the calculation of root loci for the wing model for incompressible and for subsonic flow conditions. The error introduced in the reduction of the singular integral equation underlying the unsteady lifting surface theory to a linear algebraic equation is discussed. Acknowledging this inherent error, the solutions of the algebraic equation by GAAM are termed 'exact.' The singularities of the problem are discussed and exponential series approximations used in the evaluation of the kernel function shown to introduce a dense collection of poles and zeroes on the negative real axis. Again, inconsistencies and differences among published aeroelastic root loci and the new 'exact' results are discussed and resolved. In all cases, aeroelastic flutter and divergence speeds and frequencies are in good agreement with published results. The GAAM solution procedure allows complete control over Mach number, velocity, density, and complex frequency. 
Thus all points on the computed root loci can be matched-point, consistent solutions without recourse to complex mode tracking logic or dataset interpolation, as in the k and p-k solution methods.
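GAAM's actual eigenvalue iteration on the complex stability matrix is specific to the aeroelastic formulation; as a generic stand-in, shifted inverse iteration converging to the eigenvalue nearest a chosen point in the complex plane can be sketched as:

```python
import numpy as np

def nearest_eigenvalue(A, shift, iters=60):
    """Shifted inverse iteration: converges to the eigenvalue of A
    nearest `shift`. A generic stand-in for GAAM's root iteration,
    not the method's actual formulation."""
    n = A.shape[0]
    v = np.ones(n, dtype=complex)
    for _ in range(iters):
        v = np.linalg.solve(A - shift * np.eye(n), v)  # amplify nearest mode
        v /= np.linalg.norm(v)
    return v.conj() @ A @ v  # Rayleigh quotient of the converged vector

# Toy "stability matrix" with eigenvalues 1, 3, 10; seek the root near 2.6
A = np.diag([1.0, 3.0, 10.0])
lam = nearest_eigenvalue(A, shift=2.6)
```

Sweeping the shift along a locus of interest (e.g. varying velocity) traces out root loci in the same spirit as converging on roots of a stability matrix.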

  7. Practical Security Analysis of Dirty Paper Trellis Watermarking

    NASA Astrophysics Data System (ADS)

    Bas, Patrick; Doërr, Gwenaël

    This paper analyses the security of dirty paper trellis (DPT) watermarking schemes, which use both informed coding and informed embedding. After recalling the principles of message embedding with DPT watermarking, the secret parameters of the scheme are highlighted. The security weaknesses of DPT watermarking are then presented: in the watermarked contents only attack (WOA) setup, the watermarked data-set exhibits clusters corresponding to the different patterns attached to the arcs of the trellis. The K-means clustering algorithm is used to estimate these patterns, and a co-occurrence analysis is performed to retrieve the connectivity of the trellis. Experimental results demonstrate that it is possible to accurately estimate the trellis configuration, which enables attacks much more efficient than simple additive white Gaussian noise (AWGN).
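The pattern-estimation step of the attack rests on ordinary K-means clustering. A minimal sketch, clustering noisy copies of two hypothetical arc patterns (stand-ins for watermarked-content features in the WOA setup), is:

```python
import numpy as np

def kmeans(points, k, iters=50):
    """Plain Lloyd's algorithm with farthest-point seeding.
    Returns (centroids, labels)."""
    centroids = [points[0]]
    for _ in range(k - 1):  # farthest-point init keeps seeds well separated
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centroids], axis=0)
        centroids.append(points[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(iters):
        dists = np.linalg.norm(points[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)  # assign each point to its nearest centroid
        centroids = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return centroids, labels

# Noisy observations of two hypothetical arc patterns
rng = np.random.default_rng(1)
p0, p1 = np.array([1.0, -1.0]), np.array([-1.0, 1.0])
pts = np.vstack([p0 + 0.1 * rng.normal(size=(100, 2)),
                 p1 + 0.1 * rng.normal(size=(100, 2))])
centroids, _ = kmeans(pts, k=2)
```

With enough watermarked observations, the cluster centroids approximate the secret patterns, which is precisely why clustered embedding leaks key material in the WOA setting.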

  8. Thyrotropin Isoforms: Implications for Thyrotropin Analysis and Clinical Practice

    PubMed Central

    Estrada, Joshua M.; Soldin, Danielle; Buckey, Timothy M.; Soldin, Offie P.

    2014-01-01

    Serum thyrotropin (TSH) is considered the single most sensitive and specific measure of thyroid function in the general population owing to its negative logarithmic association with free triiodothyronine and free thyroxine concentrations. It is therefore often the test of choice for screening, diagnosis, and monitoring of primary hypothyroidism. Serum TSH concentrations can be analyzed quantitatively using third-generation immunoassays, whereas its bioactivity can be measured by TSH activity assays in cell culture. Theoretically, if serum TSH concentrations are directly related to TSH activity, the two tests should yield comparable results. However, on occasion, the results are discordant, with serum concentrations being higher than TSH biological activity. This review focuses on the dissociation between the clinical state and serum TSH concentrations and addresses clinically important aspects of TSH analysis. PMID:24073798

  9. EIA practice in India and its evaluation using SWOT analysis

    SciTech Connect

    Paliwal, Ritu [Centre for Regulatory and Policy Research, TERI School of Advanced Studies, Habitat Centre, Lodhi Road, Delhi-110003 (India)]. E-mail: ritup@terischool.ac.in

    2006-07-15

    In India, Environmental Impact Assessment (EIA) was formally introduced in 1994. It relies on an institutional framework with strong supporting legislative, administrative and procedural set-ups. Central and state authorities share the responsibility for its development and management. The Strength, Weakness, Opportunity and Threat (SWOT) analysis undertaken in this article suggests that several issues need to be readdressed. It highlights several constraints, ranging from improper screening and scoping guidelines to ineffective monitoring and post-project evaluation. The opportunities identified include increasing public awareness, initiatives of environmental groups and the business community, and forward thinking to integrate environmental considerations into plans and policies. Poor governance, rapid economic reforms, and favours granted to small-scale units are some of the foreseen threats to the system. The article concludes with suggestions to improve the EIA process in India.

  10. Computer-aided analysis of conditions for optimizing practical electrorotation.

    PubMed

    Hughes, M P

    1998-12-01

    Previous studies have indicated that the variations in torque induced in particles in electrorotation electrode arrays are sufficiently large to cause errors in electrorotation measurements. In order to avoid this, experimenters usually study particles bounded by an arbitrary region near the centre of the electrodes. By simulating the time-dependent electric field for polynomial electrodes, we have assessed the variation in torque across the centre of the array. By considering both the variation in applied torque and the dielectrophoretic force in the electrode chamber, the optimal conditions for electrorotation experiments have been determined. Further to this, by comparing the torque variation across the electrode chamber for a number of common electrode designs, a comparison of the suitability of each electrode design for multiparticle electrorotation analysis has been made. PMID:9869038

  11. Computer Game Criticism: A Method for Computer Game Analysis

    Microsoft Academic Search

    Lars Konzack

    2002-01-01

    In this paper, we describe a method to analyse computer games. The analysis method is based on computer games in particular and is not a transfer from some other field of study, even though it is of course inspired by analysis methods from various fields of study. The method is based on seven different

  12. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  13. A Practical Method for Multi-Objective Scheduling through Soft Computing Approach

    NASA Astrophysics Data System (ADS)

    Shimizu, Yoshiaki; Tanaka, Yasutsugu

    Due to diversified customer demands and global competition, scheduling has increasingly been recognized as an important problem-solving activity in manufacturing. Since scheduling is considered at a stage close to practical operation in production planning, flexibility and agility in decision making are most important in real-world applications. In addition, since the final goal of such scheduling has many attributes whose relative importance is likely to change with the decision environment, it is of great significance to derive a flexible schedule through a plain multi-objective optimization method. To derive such a rational schedule, in this paper we apply a novel multi-objective optimization method named MOON2R (MOON2 of radial basis function), incorporating simulated annealing as the solution algorithm. Finally, illustrative examples are provided to outline and verify the effectiveness of the proposed method.
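The simulated-annealing component can be illustrated on a toy single-objective scheduling problem (the jobs, move set, and cooling schedule below are assumptions for illustration, not MOON2R itself):

```python
import math
import random

def simulated_annealing(jobs, iters=5000, t0=10.0, seed=0):
    """Toy SA for single-machine scheduling: minimize total weighted
    completion time by swapping adjacent jobs. A generic stand-in for
    the paper's MOON2R + SA scheme. jobs: list of (duration, weight)."""
    rng = random.Random(seed)

    def cost(order):
        t, total = 0.0, 0.0
        for i in order:
            t += jobs[i][0]              # completion time of job i
            total += jobs[i][1] * t      # weighted completion time
        return total

    order = list(range(len(jobs)))
    cur_cost = cost(order)
    best, best_cost = order[:], cur_cost
    for step in range(iters):
        temp = t0 * (1 - step / iters) + 1e-9   # linear cooling schedule
        i = rng.randrange(len(jobs) - 1)
        order[i], order[i + 1] = order[i + 1], order[i]  # propose adjacent swap
        c = cost(order)
        if c <= cur_cost or rng.random() < math.exp((cur_cost - c) / temp):
            cur_cost = c                         # accept (possibly uphill) move
            if c < best_cost:
                best, best_cost = order[:], c
        else:
            order[i], order[i + 1] = order[i + 1], order[i]  # undo the swap
    return best, best_cost

# Three hypothetical jobs: (duration, weight)
best, best_cost = simulated_annealing([(3, 1), (1, 4), (2, 2)])
```

For this toy instance the weighted-shortest-processing-time order is optimal (total weighted completion time 16), and the annealer finds it; multi-objective schemes like the paper's replace the single cost with an aggregated preference function.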

  14. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2013-07-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). 
Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

  15. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  16. A practical method for depth of interaction determination in monolithic scintillator PET detectors.

    PubMed

    van Dam, Herman T; Seifert, Stefan; Vinke, Ruud; Dendooven, Peter; Löhner, Herbert; Beekman, Freek J; Schaart, Dennis R

    2011-07-01

    Several new methods for determining the depth of interaction (DOI) of annihilation photons in monolithic scintillator detectors with single-sided, multi-pixel readout are investigated. The aim is to develop a DOI decoding method that allows for practical implementation in a positron emission tomography system. Specifically, calibration data, obtained with perpendicularly incident gamma photons only, are being used. Furthermore, neither detector modifications nor a priori knowledge of the light transport and/or signal variances is required. For this purpose, a clustering approach is utilized in combination with different parameters correlated with the DOI, such as the degree of similarity to a set of reference light distributions, the measured intensity on the sensor pixel(s) closest to the interaction position and the peak intensity of the measured light distribution. The proposed methods were tested experimentally on a detector comprised of a 20 mm × 20 mm × 12 mm polished LYSO:Ce crystal coupled to a 4 × 4 multi-anode photomultiplier. The method based on the linearly interpolated measured intensities on the sensor pixels closest to the estimated (x, y)-coordinate outperformed the other methods, yielding DOI resolutions between ~1 and ~4.5 mm FWHM depending on the DOI, the (x, y) resolution and the amount of reference data used. PMID:21693789

  17. A meta-analysis of intervention research with problem behavior: treatment validity and standards of practice.

    PubMed

    Scotti, J R; Evans, I M; Meyer, L H; Walker, P

    1991-11-01

    Published intervention research to remediate problem behavior provides a major source of empirical evidence regarding standards of practice and the relative effectiveness of intervention strategies. A meta-analysis of the developmental disabilities literature for the years 1976 through 1987 was performed. Two measures of intervention effectiveness were employed to evaluate the relations between standards of practice, intervention and participant characteristics, and the treatment validity of different levels of intervention for a range of excess behaviors. The results largely failed to support several widespread assumptions regarding precepts of clinical practice. Suggestions were made concerning clinical-experimental research and publication practices to ensure that future work will provide a more conclusive base. PMID:1836733

  18. A practical method for estimating non-isothermal and formation damage skin factors for cold water injection wells

    E-print Network

    Warland, Arild

    1986-01-01

    A PRACTICAL METHOD FOR ESTIMATING NON-ISOTHERMAL AND FORMATION DAMAGE SKIN FACTORS FOR COLD WATER INJECTION WELLS A Thesis by ARILD WARLAND Submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements... for the degree of MASTER OF SCIENCE December 1986 Major Subject: Petroleum Engineering A PRACTICAL METHOD FOR ESTIMATING NON-ISOTHERMAL AND FORMATION DAMAGE SKIN FACTORS FOR COLD WATER INJECTION WELLS A Thesis by ARILD WARLAND Approved as to style...

  19. Peering inside the Clock: Using Success Case Method to Determine How and Why Practice-Based Educational Interventions Succeed

    ERIC Educational Resources Information Center

    Olson, Curtis A.; Shershneva, Marianna B.; Brownstein, Michelle Horowitz

    2011-01-01

    Introduction: No educational method or combination of methods will facilitate implementation of clinical practice guidelines in all clinical contexts. To develop an empirical basis for aligning methods to contexts, we need to move beyond "Does it work?" to also ask "What works for whom and under what conditions?" This study employed Success Case…

  20. The solution of fractional wave equation by using modified trial equation method and homotopy analysis method

    NASA Astrophysics Data System (ADS)

    Kocak, Zeynep Fidan; Bulut, Hasan; Yel, Gulnur

    2014-12-01

    In this study, we applied the Homotopy Analysis Method to the nonlinear fractional wave equation. Then, we compared the analytical solution obtained by using the Modified Trial Equation Method with the approximate solution obtained via the Homotopy Analysis Method. Finally, we investigated the errors by drawing 3D and 2D graphics of the solution of the fractional wave equation for different values of alpha.

  1. Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course

    ERIC Educational Resources Information Center

    Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.

    2011-01-01

    To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

  2. Critical analysis of biomarkers in the current periodontal practice

    PubMed Central

    Khiste, Sujeet V.; Ranganath, V.; Nichani, Ashish S.; Rajani, V.

    2011-01-01

    Periodontal disease is a chronic microbial infection that triggers inflammation-mediated loss of the periodontal ligament and alveolar bone that supports the teeth. Because of the increasing prevalence and associated comorbidities, there is a need for the development of new diagnostic tests that can detect the presence of active disease, predict future disease progression, and evaluate the response to periodontal therapy, thereby improving the clinical management of periodontal patients. The diagnosis of active phases of periodontal disease and the identification of patients at risk for active disease represent challenges for clinical investigators and practitioners. Advances in diagnostic research are moving toward methods whereby the periodontal risk can be identified and quantified by objective measures using biomarkers. Patients with periodontitis may have elevated circulating levels of specific inflammatory markers that can be correlated to the severity of the disease. Advances in the use of oral fluids as possible biological samples for objective measures of the current disease state, treatment monitoring, and prognostic indicators have boosted saliva- and other oral-based fluids to the forefront of technology. Gingival crevicular fluid (GCF) is an inflammatory exudate that can be collected at the gingival margin or within the gingival crevice. This article highlights recent advances in the use of biomarker-based disease diagnostics that focus on the identification of active periodontal disease from plaque biofilms, GCF, and saliva. PMID:21976831

  3. Development of model-based remote maintenance robot system. IV. A practical stiffness control method for redundant robot arm

    Microsoft Academic Search

    Yukio ASARI; Hirokazu SATO; Takashi YOSHIMI; Kyorich TATSUNO; K. Asano

    1993-01-01

    For part III, see ibid., pp. 1237-1244. A simple stiffness control method for a seven-degree-of-freedom redundant robot arm is presented. The method is very practical for application to a real robot system. The robot arm is able to move compliantly with this method without heavy computation. The method uses the transpose of the Jacobian matrix to transform the end-point

  4. A 5-Year Analysis of Peer-Reviewed Journal Article Publications of Pharmacy Practice Faculty Members

    PubMed Central

    Spivey, Christina; Martin, Jennifer R.; Wyles, Christina; Ehrman, Clara; Schlesselman, Lauren S.

    2012-01-01

    Objectives. To evaluate scholarship, as represented by peer-reviewed journal articles, among US pharmacy practice faculty members; contribute evidence that may better inform benchmarking by academic pharmacy practice departments; and examine factors that may be related to publication rates. Methods. Journal articles published by all pharmacy practice faculty members between January 1, 2006, and December 31, 2010, were identified. College and school publication rates were compared based on public vs. private status, being part of a health science campus, having a graduate program, and having doctor of pharmacy (PharmD) faculty members funded by the National Institutes of Health (NIH). Results. Pharmacy practice faculty members published 6,101 articles during the 5-year study period, and a pharmacy practice faculty member was the primary author on 2,698 of the articles. Pharmacy practice faculty members published an average of 0.51 articles per year. Pharmacy colleges and schools affiliated with health science campuses, at public institutions, with NIH-funded PharmD faculty members, and with graduate programs had significantly higher total publication rates compared with those that did not have these characteristics (p<0.006). Conclusion. Pharmacy practice faculty members contributed nearly 6,000 unique publications over the 5-year period studied. However, this reflects a rate of less than 1 publication per faculty member per year, suggesting that a limited number of faculty members produced the majority of publications. PMID:23049099

  5. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods.

    PubMed

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists' attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  6. Echinacea purpurea: Pharmacology, phytochemistry and analysis methods

    PubMed Central

    Manayi, Azadeh; Vazirian, Mahdi; Saeidnia, Soodabeh

    2015-01-01

    Echinacea purpurea (Asteraceae) is a perennial medicinal herb with important immunostimulatory and anti-inflammatory properties, especially the alleviation of cold symptoms. The plant also attracted scientists’ attention to assess other aspects of its beneficial effects. For instance, antianxiety, antidepression, cytotoxicity, and antimutagenicity as induced by the plant have been revealed in various studies. The findings of the clinical trials are controversial in terms of side effects. While some studies revealed the beneficial effects of the plant on the patients and no severe adverse effects, some others have reported serious side effects including abdominal pain, angioedema, dyspnea, nausea, pruritus, rash, erythema, and urticaria. Other biological activities of the plant such as antioxidant, antibacterial, antiviral, and larvicidal activities have been reported in previous experimental studies. Different classes of secondary metabolites of the plant such as alkamides, caffeic acid derivatives, polysaccharides, and glycoproteins are believed to be biologically and pharmacologically active. Actually, concurrent determination and single analysis of cichoric acid and alkamides have been successfully developed mainly by using high-performance liquid chromatography (HPLC) coupled with different detectors including UV spectrophotometric, coulometric electrochemical, and electrospray ionization mass spectrometric detectors. The results of the studies which were controversial revealed that in spite of major experiments successfully accomplished using E. purpurea, many questions remain unanswered and future investigations may aim for complete recognition of the plant's mechanism of action using new, complementary methods. PMID:26009695

  7. Practical analysis of tide gauges records from Antarctica

    NASA Astrophysics Data System (ADS)

    Galassi, Gaia; Spada, Giorgio

    2015-04-01

    We have collected and analyzed in a basic way the currently available time series from tide gauges deployed along the coasts of Antarctica. The database of the Permanent Service for Mean Sea Level (PSMSL) holds relative sea level information for 17 stations, which are mostly concentrated in the Antarctic Peninsula (8 out of 17). For 7 of the PSMSL stations, Revised Local Reference (RLR) monthly and yearly observations are available, spanning from year 1957.79 (Almirante Brown) to 2013.95 (Argentine Islands). For the remaining 11 stations, only metric monthly data can be obtained during the time window 1957-2013. The record length of the available time series generally does not exceed 20 years. Remarkable exceptions are the RLR station of Argentine Islands, located in the Antarctic Peninsula (AP) (time span: 1958-2013, record length: 54 years, completeness 98%), and the metric station of Syowa in East Antarctica (1975-2012, 37 years, 92%). The general quality (geographical coverage and length of record) of the time series hinders a coherent geophysical interpretation of the relative sea-level data along the coasts of Antarctica. However, in an attempt to characterize the available relative sea-level signals, we have stacked (i.e., averaged) the RLR time series for the AP and for the whole of Antarctica. The resulting time series have been analyzed using simple regression in order to estimate a trend and a possible sea-level acceleration. For the AP, the trend is 1.8 ± 0.2 mm/yr, and for the whole of Antarctica it is 2.1 ± 0.1 mm/yr (both over 1957-2013). The modeled values of Glacial Isostatic Adjustment (GIA) obtained with ICE-5G(VM2) using the program SELEN range between -0.7 and -1.6 mm/yr, showing that the sea-level trend recorded by tide gauges is strongly influenced by GIA. Subtracting the average GIA contribution (-1.1 mm/yr) from the observed sea-level trends of the two stacks, we obtain 3.2 and 2.9 mm/yr for Antarctica and the AP respectively, which we interpret as the effect of current ice melting and steric ocean contributions. Using the Ensemble Empirical Mode Decomposition method, we have detected different oscillations embedded in the sea-level signals for Antarctica and the AP. This confirms previously recognized connections between sea-level variations in Antarctica and ocean modes such as ENSO.
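The trend-plus-GIA arithmetic described in this record can be sketched in a few lines. This is an illustrative reconstruction on synthetic data (not the PSMSL series), using the whole-Antarctica figures quoted in the abstract: an observed trend of about 2.1 mm/yr and an average modeled GIA contribution of about -1.1 mm/yr.

```python
import numpy as np

def sea_level_trend(years, heights_mm):
    """Least-squares linear trend (mm/yr) of a (stacked) tide-gauge series."""
    slope, _ = np.polyfit(years, heights_mm, 1)
    return slope

# Synthetic stack: a 2.1 mm/yr rise (the whole-Antarctica figure quoted in
# the abstract) plus noise, sampled yearly over 1957-2013.
rng = np.random.default_rng(0)
years = np.arange(1957, 2014)
heights = 2.1 * (years - years[0]) + rng.normal(0.0, 5.0, years.size)

observed = sea_level_trend(years, heights)

# Removing the modeled GIA contribution (about -1.1 mm/yr per the abstract)
# leaves the ice-melt plus steric signal: 2.1 - (-1.1) = 3.2 mm/yr.
gia_trend = -1.1
corrected = observed - gia_trend
```

A real analysis would of course also weight stations by record completeness and test for acceleration, as the abstract notes.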

  8. Practical Analysis of Wind-Induced Human Comfort Condition of Tall Buildings

    Microsoft Academic Search

    Shuifu Chen; Chao Li

    2010-01-01

    A practical analysis procedure for estimation of the wind-induced root mean square acceleration response and the human comfort condition of tall buildings is presented in this paper. The modal wind-force time history curves and the corresponding power spectrum curves are first determined through analysis of the wind-pressure coefficient data obtained from the aerodynamic wind-tunnel model test and the dynamic characteristics

  9. A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices

    PubMed Central

    McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G

    2011-01-01

    This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

  10. Identifying Evidence-Based Practices in Special Education through High Quality Meta-Analysis

    ERIC Educational Resources Information Center

    Friedt, Brian

    2012-01-01

    The purpose of this study was to determine if meta-analysis can be used to enhance efforts to identify evidence-based practices (EBPs). In this study, the quality of included studies acted as the moderating variable. I used the quality indicators for experimental and quasi-experimental research developed by Gersten, Fuchs, Coyne, Greenwood, and…

  11. A Longitudinal Analysis of Parenting Practices, Couple Satisfaction, and Child Behavior Problems

    Microsoft Academic Search

    Deanna Linville; Krista Chronister; Tom Dishion; Jeff Todahl; John Miller; Daniel Shaw; Francis Gardner; Melvin Wilson

    2010-01-01

    This longitudinal study examined the relationship between couple relationship satisfaction, parenting practices, parent depression, and child problem behaviors. The study participants (n = 148) were part of a larger experimental study that examined the effectiveness of a brief family-centered intervention, the Family Check-Up model. Regression analysis results indicated that our proposed model accounted for 38% of the variance in

  12. Practical Semantic Analysis of Web Sites and Documents

    E-print Network

    Paris-Sud XI, Université de

    We make a parallel between programs and Web sites, and present some examples of semantic constraints. Keywords: semantics, logic programming, web sites, information systems, knowledge management, content management

  13. A probabilistic approach for analysis of uncertainty in the evaluation of watershed management practices

    Microsoft Academic Search

    Mazdak Arabi; Rao S. Govindaraju; Mohamed M. Hantush

    2007-01-01

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (<10 km²) watersheds in Indiana. The analysis specifically recognizes the significance of the difference between the magnitude of uncertainty associated with absolute hydrologic and water quality predictions, and uncertainty in estimated benefits of BMPs. The Soil

  14. Discounting in Cost-Utility Analysis of Healthcare Interventions: Reassessing Current Practice

    Microsoft Academic Search

    Brian J. Cohen

    2003-01-01

    Cost-utility analysis (CUA) is a technique that can potentially be used as a guide to allocating healthcare resources so as to obtain the maximum health benefits possible under a given budget constraint. However, it is not clear that current practice captures societal preferences regarding health benefits. In analyses of healthcare interventions providing survival benefits, the market rate of interest is

  15. Interventions for Adolescent Struggling Readers: A Meta-Analysis with Implications for Practice

    ERIC Educational Resources Information Center

    Scammacca, Nancy; Roberts, Greg; Vaughn, Sharon; Edmonds, Meaghan; Wexler, Jade; Reutebuch, Colleen Klein; Torgesen, Joseph K.

    2007-01-01

    This meta-analysis offers decision-makers research-based guidance for intervening with adolescent struggling readers. The authors outline major implications for practice: (1) Adolescence is not too late to intervene. Interventions do benefit older students; (2) Older students with reading difficulties benefit from interventions focused at both the…

  16. A Secondary Analysis of the Impact of School Management Practices on School Performance

    ERIC Educational Resources Information Center

    Talbert, Dale A.

    2009-01-01

    The purpose of this study was to conduct a secondary analysis of the impact of school management practices on school performance utilizing a survey design of School and Staffing (SASS) data collected by the National Center for Education Statistics (NCES) of the U.S. Department of Education, 1999-2000. The study identifies those school management…

  17. Ensuring Quality and Productivity in Higher Education: An Analysis of Assessment Practices. ERIC Digest.

    ERIC Educational Resources Information Center

    Gates, Susan M.; Augustine, Catherine H.; Benjamin, Roger; Bikson, Tora K.; Kaganoff, Tessa; Levy, Dina G.; Moini, Joy S.; Zimmer, Ron W.

    This digest summarizes the highlights of an analysis of assessment practices used in education and professional development for educators. A research team from RAND conducted a broad review of the literature and reviewed the documents of organizations engaged in such assessments. Researchers interviewed experts, attended conferences, and conducted…

  18. AN EMPIRICAL ANALYSIS OF QUALITY MANAGEMENT PRACTICES IN JAPANESE MANUFACTURING COMPANIES

    Microsoft Academic Search

    Phan Chi Anh; Yoshiki Matsui

    Quality management represents company-wide activities to improve the quality of products and work through customer orientation, continuous quality improvement, employees' involvement, etc., in order to establish and sustain a competitive advantage. This paper presents the results of an empirical analysis of quality management practices and their impact on competitive performance in Japanese manufacturing companies. The results have been derived from the third

  19. Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of "Complete Streets" Practices

    EPA Science Inventory

    Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of "Complete Streets" Practices. Primary Author: Nicholas R. Flanders, 109 T.W. Alexander Drive, Mail Code: E343-02, Research Triangle Park, NC 27709; 919-541-3660; Flanders.nick@Epa.gov. Topic categ...

  20. Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.

    ERIC Educational Resources Information Center

    Kunneman, Dale E.; Sleezer, Catherine M.

    2000-01-01

    This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

  1. Ranking corporations based on sustainable and socially responsible practices. A data envelopment analysis (DEA) approach

    Microsoft Academic Search

    Constantin Belu

    2009-01-01

    This study ranks publicly listed corporations based on social and environmental (i.e. sustainable) achievements in relation to financial results, by using a data envelopment analysis (DEA) approach with financial performance indicators (return on assets, return on equity and yearly stock return) as inputs and sustainability scores as outputs. The sustainability scores cover a wide range of sustainable practices and were

  2. Design of a practical model-observer-based image quality assessment method for CT imaging systems

    NASA Astrophysics Data System (ADS)

    Tseng, Hsin-Wu; Fan, Jiahua; Cao, Guangzhi; Kupinski, Matthew A.; Sainath, Paavana

    2014-03-01

    The channelized Hotelling observer (CHO) is a powerful method for quantitative image quality evaluation of CT systems and their image reconstruction algorithms. It has recently been used to validate the dose reduction capability of iterative image-reconstruction algorithms implemented on CT imaging systems. The use of the CHO for routine and frequent system evaluations is desirable both for quality assurance and for further system optimization. The use of channels substantially reduces the amount of data required to achieve accurate estimates of observer performance. However, the number of scans required is still large even with the use of channels. This work explores different data reduction schemes and designs a new approach that requires only a few CT scans of a phantom. For this work, the leave-one-out likelihood (LOOL) method developed by Hoffbeck and Landgrebe is studied as an efficient method of estimating the covariance matrices needed to compute CHO performance. Three different kinds of approaches are included in the study: a conventional CHO estimation technique with a large sample size, a conventional technique with fewer samples, and the new LOOL-based approach with fewer samples. The mean value and standard deviation of the area under the ROC curve (AUC) are estimated by a shuffle method. Both simulation and real-data results indicate that an 80% data reduction can be achieved without loss of accuracy. This data reduction makes the proposed approach a practical tool for routine CT system assessment.
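The core CHO computation this record relies on can be sketched as follows. This is a minimal illustration on synthetic channel outputs using plain sample covariances; the LOOL covariance estimator of Hoffbeck and Landgrebe that the abstract studies is not reproduced here.

```python
import numpy as np

def cho_snr(v_signal, v_absent):
    """Channelized Hotelling observer detectability from channel outputs
    (rows = images, columns = channels), using plain sample covariances."""
    d = v_signal.mean(axis=0) - v_absent.mean(axis=0)       # channel mean difference
    s = 0.5 * (np.cov(v_signal, rowvar=False)
               + np.cov(v_absent, rowvar=False))            # pooled covariance
    w = np.linalg.solve(s, d)                               # Hotelling template
    return float(np.sqrt(d @ w))                            # observer SNR

# Synthetic example: 5 channel outputs, unit covariance, small mean shift.
rng = np.random.default_rng(1)
shift = np.array([0.5, 0.3, 0.0, 0.0, 0.0])
v_absent = rng.normal(size=(500, 5))
v_signal = rng.normal(size=(500, 5)) + shift
snr = cho_snr(v_signal, v_absent)   # ideal value here is sqrt(0.34), about 0.58
```

The point of the paper is precisely that `s` is the expensive part: with few scans the sample covariance is noisy, which is what the LOOL estimator addresses.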

  3. Concepts, tools/methods, and practices of water-energy-food NEXUS

    NASA Astrophysics Data System (ADS)

    Endo, A.; Tsurita, I.; Orencio, P. M.; Taniguchi, M.

    2014-12-01

    The need to consider the NEXUS of food and water was emphasized in international dialogues and publications around the end of the 20th century. In fact, the United Nations University had already launched a Food-Energy Nexus Programme in 1983 to fill the gaps between the issues of food and energy. The term "NEXUS" linking water, food, and trade was also used in the World Bank during the 1990s. The idea of NEXUS likely developed further under the discussion of "virtual water" and "water footprints". Through several international discussions, such as the Kyoto World Water Forum in 2003, scholars and practitioners around the globe acknowledged the need to include energy among the pillars of NEXUS. Finally, the importance of the three NEXUS pillars, water, energy, and food, was officially announced at the Bonn 2011 Nexus Conference, a turning point for the NEXUS idea in the international community, as a contribution to the United Nations Conference on Sustainable Development (Rio+20) in 2012, which highlighted the concept of the "green economy". The concept of NEXUS is becoming a requisite for achieving sustainable development, given the global concerns embedded in society, the economy, and the environment. The concept stresses cooperation among sectors such as water, energy, food, and climate change, since these complex global issues are interdependent and interconnected, and can no longer be solved by sectoral approaches. NEXUS practices are currently shared among different stakeholders through various modes, including literature, conferences, workshops, and research projects. However, since NEXUS practices are not led by any particular organization, their concepts, theories, policies, tools, methods, and applications are diverse and incoherent. In terms of tools/methods, the potential of an integrated modeling approach is introduced to avoid pressures and to promote interactions among water, energy, and food.
This paper explores the concepts, tools/methods, and practices of water-energy-food NEXUS to evaluate human environmental security under the RIHN project on "Human-Environmental Security in the Asia-Pacific Ring of Fire: Water-Energy-Food Nexus".

  4. Quantitative Data Analysis Methods for 3D Microstructure

    E-print Network

    Quantitative Data Analysis Methods for 3D Microstructure Characterization of Solid Oxide Cells, demonstrated by the analysis of Ni-YSZ and LSC-CGO electrode samples. Automatic methods for preprocessing the raw 3D image data are presented to handle variation in the 3D image data and to enable routine use of quantitative three-dimensional analysis of microstructure

  5. COMPARISON OF METHODS FOR THE ANALYSIS OF PANEL STUDIES

    EPA Science Inventory

    Three different methods of analysis of panels were compared using asthma panel data from a 1970-1971 study done by EPA in Riverhead, New York. The methods were (1) regression analysis using raw attack rates; (2) regression analysis using the ratio of observed attacks to expected ...

  6. MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD

    E-print Network

    DeLucia, Evan H.

    MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD. 1. Overview: A novel method; a Monte Carlo analysis was performed to investigate the power of our statistical approach. 2. Methods and Assumptions: The Monte Carlo analysis was performed as follows: Natural variation. The only study to date

  7. A Qualitative Analysis of an Advanced Practice Nurse–Directed Transitional Care Model Intervention

    PubMed Central

    Bradway, Christine; Trotta, Rebecca; Bixby, M.Brian; McPartland, Ellen; Wollman, M. Catherine; Kapustka, Heidi; McCauley, Kathleen; Naylor, Mary D.

    2012-01-01

    Purpose: The purpose of this study was to describe barriers and facilitators to implementing a transitional care intervention for cognitively impaired older adults and their caregivers led by advanced practice nurses (APNs). Design and Methods: APNs implemented an evidence-based protocol to optimize transitions from hospital to home. An exploratory, qualitative directed content analysis examined 15 narrative case summaries written by APNs and fieldnotes from biweekly case conferences. Results: Three central themes emerged: patients and caregivers having the necessary information and knowledge, care coordination, and the caregiver experience. An additional category was also identified, APNs going above and beyond. Implications: APNs implemented individualized approaches and provided care that exceeds the type of care typically staffed and reimbursed in the American health care system by applying a Transitional Care Model, advanced clinical judgment, and doing whatever was necessary to prevent negative outcomes. Reimbursement reform as well as more formalized support systems and resources are necessary for APNs to consistently provide such care to patients and their caregivers during this vulnerable time of transition. PMID:21908805

  8. A Practical Method for Transforming Free-Text Eligibility Criteria into Computable Criteria

    PubMed Central

    Tu, Samson W.; Peleg, Mor; Carini, Simona; Bobak, Michael; Ross, Jessica; Rubin, Daniel; Sim, Ida

    2011-01-01

    Formalizing eligibility criteria in a computer-interpretable language would facilitate eligibility determination for study subjects and the identification of studies on similar patient populations. Because such formalization is extremely labor intensive, we transform the problem from one of fully capturing the semantics of criteria directly in a formal expression language to one of annotating free-text criteria in a format called ERGO Annotation. The annotation can be done manually, or it can be partially automated using natural-language processing techniques. We evaluated our approach in three ways. First, we assessed the extent to which ERGO Annotations capture the semantics of 1000 eligibility criteria randomly drawn from ClinicalTrials.gov. Second, we demonstrated the practicality of the annotation process in a feasibility study. Finally, we demonstrate the computability of ERGO Annotation by using it to (1) structure a library of eligibility criteria, (2) search for studies enrolling specified study populations, and (3) screen patients for potential eligibility for a study. We therefore demonstrate a new and practical method for incrementally capturing the semantics of free-text eligibility criteria into computable form. PMID:20851207

  9. Practical methods of tracking of nonstationary time series applied to real-world data

    NASA Astrophysics Data System (ADS)

    Nabney, Ian T.; McLachlan, Alan; Lowe, David

    1996-03-01

    In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Two real world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters, to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real world non-stationary data through the use of more complex adaptive models.
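The RAN novelty criterion whose parameter sensitivity this record highlights is, in Platt's original formulation, a two-part test: allocate a new hidden unit only when the input is far from every existing centre and the prediction error is large. A minimal sketch follows; the threshold names `eps` and `e_min` are illustrative, not taken from the paper.

```python
import numpy as np

def ran_should_allocate(x, y_pred, y_true, centers, eps=0.5, e_min=0.1):
    """Platt-style novelty criterion for a resource-allocating RBF network:
    allocate a new hidden unit only when the input lies far from every
    existing centre AND the prediction error is large. The thresholds eps
    and e_min are the sensitive parameters the abstract refers to."""
    is_far = (len(centers) == 0
              or min(np.linalg.norm(x - c) for c in centers) > eps)
    is_bad = abs(y_true - y_pred) > e_min
    return is_far and is_bad

# Far from the lone centre and badly predicted: a new unit is allocated.
allocate = ran_should_allocate(np.array([2.0]), 0.0, 1.0, [np.array([0.0])])
```

Because allocation requires both conditions, small changes to `eps` or `e_min` change how quickly the model order grows on non-stationary data, which is the sensitivity the authors report.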

  10. STUDY ON PRACTICAL USE OF DETECTION METHOD OF IMPACT FORCE USING BRIDGE GIRDER STRENGTH

    NASA Astrophysics Data System (ADS)

    Misaki, Norikazu; Sakamoto, Yasuhiko; Ikoma, Noboru

    When a report is received that a motor vehicle has collided with a steel bridge girder of a railway overbridge, the accident situation and the degree of damage to the girder must be confirmed. However, this safety confirmation is time-consuming, so train service may be delayed. Therefore, in order to detect only those impact forces that would induce deformation, damage, or movement of the bridge girder, and hence delay of train service, the strength of the bridge girder and the corresponding harmful impact force were calculated. Simultaneously, a system to detect the harmful impact force was developed, and a tolerance for the detection was determined. From these results, the method to detect harmful impact forces was developed and put into practical use.

  11. A practical method to detect and correct for lens distortion in the TEM.

    PubMed

    Capitani, Gian Carlo; Oleynikov, Peter; Hovmöller, Sven; Mellini, Marcello

    2006-01-01

    A practical, offline method for experimental detection and correction of projector lens distortion in the transmission electron microscope (TEM) operating in high-resolution (HR) and selected area electron diffraction (SAED) modes is described. Typical TEM work shows that, in the simplest case, the distortion transforms what would be a circle on the recording device into an ellipse. The first goal of the procedure described here is to determine the elongation and orientation of the ellipse. The second goal is to correct for the distortion using an ordinary graphics program. The same experimental data set may also be used to determine the actual microscope magnification and the rotation between SAED patterns and HR images. The procedure may be helpful in several quantitative applications of electron diffraction and HR imaging, for instance while performing accurate lattice parameter determination, or while determining possible metrical deviations (cell edges and angles) from a given symmetry. PMID:16046067
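Once the ellipse's elongation and orientation are known, the correction step amounts to an inverse affine transform. A minimal sketch on point coordinates (the paper itself works on recorded images in a graphics program, so this is an assumption-laden simplification):

```python
import numpy as np

def undistort(points, elongation, theta):
    """Undo a circle-to-ellipse distortion: rotate the major axis onto x,
    compress that axis by 1/elongation, then rotate back."""
    c, s = np.cos(theta), np.sin(theta)
    rot = np.array([[c, s], [-s, c]])
    m = rot.T @ np.diag([1.0 / elongation, 1.0]) @ rot
    return points @ m.T

# Demo: distort a unit circle by 20% along an axis at 0.5 rad, then recover it.
ang = np.linspace(0.0, 2.0 * np.pi, 100)
circle = np.stack([np.cos(ang), np.sin(ang)], axis=1)
c, s = np.cos(0.5), np.sin(0.5)
rot = np.array([[c, s], [-s, c]])
distort = rot.T @ np.diag([1.2, 1.0]) @ rot
ellipse = circle @ distort.T
restored = undistort(ellipse, 1.2, 0.5)
```

The undistortion matrix is the exact inverse of the distortion built with the same elongation and angle, so the recovered points lie back on the unit circle.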

  12. Confirmatory Factor Analysis on the Professional Suitability Scale for Social Work Practice

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Twigg, Robert C.; Boey, Kam-Wing; Kwok, Siu-Ming

    2013-01-01

    Objective: This article presents a validation study to examine the factor structure of an instrument designed to measure professional suitability for social work practice. Method: Data were collected from registered social workers in a provincial mailed survey. The response rate was 23.2%. After eliminating five cases with multivariate outliers,…

  13. Current Philosophy & Practice in ESL/EFL Reading Education: An Analysis.

    ERIC Educational Resources Information Center

    Meredith, David

    This paper reviews a small sampling of recent research-supported method studies that were consistent with the prominent current philosophical practice in English-as-a-Second-Language/English-as-a-Foreign-Language (ESL/EFL) reading education. It focuses on the following: The Advent of Whole Language in ESL/EFL Study; Teaching from a Constructivist…

  14. Optimum compression to ventilation ratios in CPR under realistic, practical conditions: a physiological and mathematical analysis

    Microsoft Academic Search

    Charles F. Babbs; Karl B. Kern

    2002-01-01

    Objective: To develop and evaluate a practical formula for the optimum ratio of compressions to ventilations in cardiopulmonary resuscitation (CPR). The optimum value of a variable is that for which a desired result is maximized. Here the desired result is assumed to be either oxygen delivery to peripheral tissues or a combination of oxygen delivery and waste product removal. Method:

  15. An Analysis of Agricultural Mechanics Safety Practices in Agricultural Science Laboratories.

    ERIC Educational Resources Information Center

    Swan, Michael K.

    North Dakota secondary agricultural mechanics instructors were surveyed regarding instructional methods and materials, safety practices, and equipment used in the agricultural mechanics laboratory. Usable responses were received from 69 of 89 instructors via self-administered mailed questionnaires. Findings were consistent with results of similar…

  16. The practice of participatory research and gender analysis in natural resource management

    Microsoft Academic Search

    Nancy Johnson; Nina Lilja; Jacqueline A. Ashby; James A. Garcia

    2004-01-01

    Stakeholder participation is expected to improve the efficiency, equity, and sustainability of natural resource management research and development (R&D) projects by ensuring that research reflects users' priorities, needs, capabilities, and constraints. Use of participatory methods and tools is growing rapidly; however, there is little systematic evidence about what participation actually means in practice, or about what difference it makes. Based

  17. Physical Methods for Intracellular Delivery: Practical Aspects from Laboratory Use to Industrial-Scale Processing

    PubMed Central

    Meacham, J. Mark; Durvasula, Kiranmai; Degertekin, F. Levent; Fedorov, Andrei G.

    2015-01-01

    Effective intracellular delivery is a significant impediment to research and therapeutic applications at all processing scales. Physical delivery methods have long demonstrated the ability to deliver cargo molecules directly to the cytoplasm or nucleus, and the mechanisms underlying the most common approaches (microinjection, electroporation, and sonoporation) have been extensively investigated. In this review, we discuss established approaches, as well as emerging techniques (magnetofection, optoinjection, and combined modalities). In addition to operating principles and implementation strategies, we address applicability and limitations of various in vitro, ex vivo, and in vivo platforms. Importantly, we perform critical assessments regarding (1) treatment efficacy with diverse cell types and delivered cargo molecules, (2) suitability to different processing scales (from single cell to large populations), (3) suitability for automation/integration with existing workflows, and (4) multiplexing potential and flexibility/adaptability to enable rapid changeover between treatments of varied cell types. Existing techniques typically fall short in one or more of these criteria; however, introduction of micro-/nanotechnology concepts, as well as synergistic coupling of complementary method(s), can improve performance and applicability of a particular approach, overcoming barriers to practical implementation. For this reason, we emphasize these strategies in examining recent advances in development of delivery systems. PMID:23813915

  18. Intergenerational analysis of dietary practices and health perceptions of Hispanic women and their adult daughters.

    PubMed

    Garcia-Maas, L D

    1999-07-01

    This descriptive, correlational, two-group study investigated differences between dietary practices, acculturation, and health perceptions in a convenience sample of Hispanic mothers and their adult daughters (N = 76, 47 mother-daughter dyads). Analysis (paired t tests) of the Block Screening Questionnaire, General Acculturation Index, and Self-Rated Health Subindex of the Multilevel Assessment Instrument showed significant differences: Daughters ate more fat (p = .04) and were more acculturated than their mothers (p = .0001). The Pearson correlation yielded a significant relationship for the 76 subjects between fat intake (dietary practice) and health perception: The more fat (meat/snacks) intake, the more negatively women perceived their health status (p = .0001). PMID:10693408
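    The two analyses reported above (paired t tests across mother-daughter dyads, then a Pearson correlation between fat intake and health perception) can be sketched as follows. The numbers below are synthetic stand-ins, not the study's data:

```python
# Sketch of the abstract's two analyses on hypothetical data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_dyads = 47

# Hypothetical paired fat-intake screening scores for each dyad.
mothers = rng.normal(20.0, 4.0, n_dyads)
daughters = mothers + rng.normal(1.5, 3.0, n_dyads)  # daughters score higher on average

t, p = stats.ttest_rel(daughters, mothers)  # paired t test across dyads
print(f"paired t = {t:.2f}, p = {p:.4f}")

# Hypothetical association: higher fat intake, lower self-rated health.
fat = np.concatenate([mothers, daughters])
health = -0.1 * fat + rng.normal(0.0, 0.5, fat.size)
r, p_r = stats.pearsonr(fat, health)
print(f"Pearson r = {r:.2f}, p = {p_r:.4f}")
```

    With real data the two vectors must stay aligned by pair; that pairing is what distinguishes `ttest_rel` from an unpaired test.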

  19. Statistical methods for dealing with publication bias in meta-analysis.

    PubMed

    Jin, Zhi-Chao; Zhou, Xiao-Hua; He, Jia

    2015-01-30

    Publication bias is an inevitable problem in systematic reviews and meta-analyses, and one of the main threats to the validity of meta-analysis. Although several statistical methods have been developed to detect and adjust for publication bias since the early 1980s, some of them are not well known and are not used properly in either the statistical or the clinical literature. In this paper, we provide a critical and extensive discussion of methods for dealing with publication bias, including statistical principles, implementation, and software, as well as the advantages and limitations of these methods. We illustrate a practical application of these methods in a meta-analysis of continuous support for women during childbirth. PMID:25363575
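    Among the methods such a review typically covers is Egger's regression test for funnel-plot asymmetry: regress each study's standardized effect (effect/SE) on its precision (1/SE) and test whether the intercept departs from zero. A minimal sketch on synthetic effect sizes (all values hypothetical):

```python
# Egger's regression test for small-study asymmetry, sketched with numpy.
import numpy as np

def egger_test(effects, ses):
    """Regress standardized effect (effect/SE) on precision (1/SE).

    Returns (intercept, t_statistic); an intercept far from zero in |t|
    suggests funnel-plot asymmetry consistent with publication bias.
    """
    z = np.asarray(effects) / np.asarray(ses)
    prec = 1.0 / np.asarray(ses)
    X = np.column_stack([np.ones_like(prec), prec])
    beta, _, _, _ = np.linalg.lstsq(X, z, rcond=None)
    resid = z - X @ beta
    s2 = resid @ resid / (len(z) - 2)              # residual variance
    cov = s2 * np.linalg.inv(X.T @ X)              # OLS parameter covariance
    return beta[0], beta[0] / np.sqrt(cov[0, 0])

rng = np.random.default_rng(1)
ses = rng.uniform(0.05, 0.5, 30)
# Biased scenario: small studies (large SE) report inflated effects.
effects = 0.2 + 1.5 * ses + rng.normal(0, ses)
b0, t_stat = egger_test(effects, ses)
print(f"Egger intercept = {b0:.2f}, t = {t_stat:.2f}")
```

    In an unbiased collection the intercept should be near zero; the inflation term built into the synthetic effects pushes it positive here.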

  20. METHODS OF FREQUENCY ANALYSIS OF A COMPLEX MAMMALIAN VOCALISATION

    Microsoft Academic Search

    SAFI K. DARDEN; SIMON B. PEDERSEN; TORBEN DABELSTEEN

    2003-01-01

    The prevalence of complex acoustic structures in mammalian vocalisations can make it difficult to quantify frequency characteristics. We describe two methods developed for the frequency analysis of a complex swift fox Vulpes velox vocalisation, the barking sequence: (1) autocorrelation function analysis and (2) instantaneous frequency analysis. The autocorrelation function analysis results in an energy density spectrum of the signal's averaged
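    The first of the two methods, an energy density spectrum obtained from the signal's autocorrelation function, rests on the Wiener-Khinchin relation and can be sketched on a synthetic tone (no fox-call data are used; the sampling rate is made up):

```python
# Energy density spectrum via the autocorrelation function (Wiener-Khinchin).
import numpy as np

fs = 40_000.0                        # sampling rate, Hz (hypothetical)
t = np.arange(4096) / fs
x = np.sin(2 * np.pi * 4000.0 * t)   # synthetic 4 kHz component

# Biased one-sided autocorrelation estimate at lags 0..N-1.
N = len(x)
acf = np.correlate(x, x, mode="full")[N - 1:] / N

# Spectrum as the Fourier transform of the autocorrelation; the magnitude
# of the one-sided transform is enough for locating spectral peaks.
spectrum = np.abs(np.fft.rfft(acf))
freqs = np.fft.rfftfreq(N, d=1 / fs)
peak = freqs[np.argmax(spectrum)]
print(f"spectral peak at {peak:.0f} Hz")  # ~4000 Hz
```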

  1. Causal Network Methods for Integrated Project Portfolio Risk Analysis 

    E-print Network

    Govan, Paul

    2014-08-06

    Corporate portfolio risk analysis is of primary concern for many organizations, as the success of strategic objectives greatly depends on an accurate risk assessment. Current risk analysis methods typically involve statistical models of risk...

  2. A practical discrete-adjoint method for high-fidelity compressible turbulence simulations

    NASA Astrophysics Data System (ADS)

    Vishnampet, Ramanathan; Bodony, Daniel J.; Freund, Jonathan B.

    2015-03-01

    Advances in methods and computing hardware have enabled accurate predictions of complex compressible turbulence phenomena, such as the generation of jet noise that motivates the present effort. However, limited understanding of underlying physical mechanisms restricts the utility of such predictions since they do not, by themselves, indicate a route to design improvements. Gradient-based optimization using adjoints can circumvent the flow complexity to guide designs, though this is predicated on the availability of a sufficiently accurate solution of the forward and adjoint systems. These are challenging to obtain, since both the chaotic character of the turbulence and the typical use of discretizations near their resolution limits in order to efficiently represent its smaller scales will amplify any approximation errors made in the adjoint formulation. Formulating a practical exact adjoint that avoids such errors is especially challenging if it is to be compatible with state-of-the-art simulation methods used for the turbulent flow itself. Automatic differentiation (AD) can provide code to calculate a nominally exact adjoint, but existing general-purpose AD codes are inefficient to the point of being prohibitive for large-scale turbulence simulations. Here, we analyze the compressible flow equations as discretized using the same high-order workhorse methods used for many high-fidelity compressible turbulence simulations, and formulate a practical space-time discrete-adjoint method without changing the basic discretization. A key step is the definition of a particular discrete analog of the continuous norm that defines our cost functional; our selection leads directly to an efficient Runge-Kutta-like scheme, though it would be just first-order accurate if used outside the adjoint formulation for time integration, with finite-difference spatial operators for the adjoint system. Its computational cost only modestly exceeds that of the flow equations.
We confirm that its accuracy is limited by computing precision, and we demonstrate it on the aeroacoustic control of a mixing layer with a challengingly broad range of turbulence scales. For comparison, the error from a corresponding discretization of the continuous-adjoint equations is quantified to potentially explain its limited success in past efforts to control jet noise. The differences are illuminating: the continuous-adjoint is shown to suffer from exponential error growth in (reverse) time even for the best-resolved largest turbulence scales. Implications for jet noise reduction and turbulence control in general are discussed.
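    The key property claimed for a discrete adjoint, gradients consistent with the discretized objective down to machine precision, can be illustrated on a toy problem. This sketch uses forward Euler on a small linear ODE, not the paper's high-order scheme; the dynamics matrix and step count are arbitrary:

```python
# Discrete adjoint of an explicit time-marching scheme, checked against
# finite differences. For the discrete adjoint, the reverse sweep applies
# the exact transpose of the forward step operator.
import numpy as np

A = np.array([[0.0, 1.0], [-4.0, -0.3]])   # toy damped-oscillator dynamics
dt, n_steps = 0.01, 200

def forward(u0):
    """March u_{k+1} = (I + dt*A) u_k and return J = 0.5*||u_N||^2."""
    u = u0.copy()
    for _ in range(n_steps):
        u = u + dt * (A @ u)
    return 0.5 * (u @ u), u

def adjoint_gradient(u0):
    """Exact discrete adjoint: transposed forward step, run in reverse."""
    _, uN = forward(u0)
    lam = uN                        # dJ/du_N for the quadratic objective
    M = np.eye(2) + dt * A          # forward step matrix
    for _ in range(n_steps):
        lam = M.T @ lam             # reverse sweep
    return lam                      # dJ/du_0

u0 = np.array([1.0, 0.5])
g = adjoint_gradient(u0)

# Central finite-difference check of the gradient.
eps = 1e-6
g_fd = np.array([(forward(u0 + eps * e)[0] - forward(u0 - eps * e)[0]) / (2 * eps)
                 for e in np.eye(2)])
print(np.max(np.abs(g - g_fd)))     # small: limited only by FD/roundoff error
```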

  3. Maximum likelihood principal component analysis with correlated measurement errors: theoretical and practical considerations

    Microsoft Academic Search

    Peter D. Wentzell; Mitchell T. Lohnes

    1999-01-01

    Procedures to compensate for correlated measurement errors in multivariate data analysis are described. These procedures are based on the method of maximum likelihood principal component analysis (MLPCA), previously described in the literature. MLPCA is a decomposition method similar to conventional PCA, but it takes into account measurement uncertainty in the decomposition process, placing less emphasis on measurements with large variance.
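    The central idea of MLPCA, weighting each element of the decomposition by its measurement uncertainty, can be sketched for the simplest case of independent (uncorrelated) errors via alternating weighted least squares. This rank-one illustration is not the authors' full algorithm for correlated errors:

```python
# Rank-1 maximum-likelihood-style fit: minimize sum_ij (X_ij - t_i u_j)^2 / var_ij
# by alternating weighted least squares on scores t and loadings u.
import numpy as np

def mlpca_rank1(X, var, n_iter=100):
    W = 1.0 / var                       # elementwise weights from known variances
    t = X[:, 0].copy()                  # crude initialization
    for _ in range(n_iter):
        u = ((W * X).T @ t) / (W.T @ (t ** 2))   # weighted LS for loadings
        t = ((W * X) @ u) / (W @ (u ** 2))       # weighted LS for scores
    return t, u

rng = np.random.default_rng(2)
t_true = rng.normal(size=50)
u_true = rng.normal(size=8)
var = rng.uniform(0.01, 0.4, size=(50, 8))       # heteroscedastic known variances
X = np.outer(t_true, u_true) + rng.normal(0, np.sqrt(var))

t, u = mlpca_rank1(X, var)
err = np.linalg.norm(np.outer(t, u) - np.outer(t_true, u_true))
print(f"reconstruction error: {err:.3f}")
```

    The scale split between t and u is indeterminate (t c, u/c give the same model), so only the product t u^T is compared; higher-rank fits alternate over score and loading matrices in the same way.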

  4. Advanced Computer Methods for Grounding Analysis Ignasi Colominas1

    E-print Network

    Colominas, Ignasi

    …grounding system are to safeguard persons working or walking in the surroundings of the grounded installation. …grounding grids of large electrical substations in practical cases present some difficulties, mainly due… The methods have been implemented in a CAD tool for grounding systems comprising all stages of the analysis

  5. RAPID ON-SITE METHODS OF CHEMICAL ANALYSIS

    EPA Science Inventory

    The analysis of potentially hazardous air, water and soil samples collected and shipped to service laboratories off-site is time consuming and expensive. This Chapter addresses the practical alternative of performing the requisite analytical services on-site. The most significant...

  6. Promoting recovery-oriented practice in mental health services: a quasi-experimental mixed-methods study

    PubMed Central

    2013-01-01

    Background Recovery has become an increasingly prominent concept in mental health policy internationally. However, there is a lack of guidance regarding organisational transformation towards a recovery orientation. This study evaluated the implementation of recovery-orientated practice through training across a system of mental health services. Methods The intervention comprised four full-day workshops and an in-team half-day session on supporting recovery. It was offered to 383 staff in 22 multidisciplinary community and rehabilitation teams providing mental health services across two contiguous regions. A quasi-experimental design was used for evaluation, comparing behavioural intent with staff from a third contiguous region. Behavioural intent was rated by coding points of action on the care plans of a random sample of 700 patients (400 intervention, 300 control), before and three months after the intervention. Action points were coded for (a) focus of action, using predetermined categories of care; and (b) responsibility for action. Qualitative inquiry was used to explore staff understanding of recovery, implementation in services and the wider system, and the perceived impact of the intervention. Semi-structured interviews were conducted with 16 intervention group team leaders post-training and an inductive thematic analysis undertaken. Results A total of 342 (89%) staff received the intervention. Care plans of patients in the intervention group showed significantly more changes, with evidence of change in the content of patients’ care plans (OR 10.94, 95% CI 7.01-17.07) and in the attributed responsibility for the actions detailed (OR 2.95, 95% CI 1.68-5.18). Nine themes emerged from the qualitative analysis, split into two superordinate categories. ‘Recovery, individual and practice’ describes the perception and provision of recovery orientated care by individuals and at a team level.
It includes themes on care provision, the role of hope, language of recovery, ownership and multidisciplinarity. ‘Systemic implementation’, describes organizational implementation and includes themes on hierarchy and role definition, training approaches, measures of recovery and resources. Conclusions Training can provide an important mechanism for instigating change in promoting recovery-orientated practice. However, the challenge of systemically implementing recovery approaches requires further consideration of the conceptual elements of recovery, its measurement, and maximising and demonstrating organizational commitment. PMID:23764121

  7. Applying the 5-Step Method to Children and Affected Family Members: Opportunities and Challenges within Policy and Practice

    ERIC Educational Resources Information Center

    Harwin, Judith

    2010-01-01

    The main aim of this article is to consider how the 5-Step Method could be developed to meet the needs of affected family members (AFMs) with children under the age of 18. This would be an entirely new development. This article examines opportunities and challenges within practice and policy and makes suggestions on how the Method could be taken…

  8. "Methods Practiced in Social Studies Instruction: A Review of Public School Teachers' Strategies"

    ERIC Educational Resources Information Center

    Bolinger, Kevin; Warren, Wilson J.

    2007-01-01

    In this study, the authors explore the "apparent gulf between professionals' advice and actual teachers' practices." Their study recognizes the imperviousness of teacher practice to the efforts of researchers and curricular reformers. For over a century, the authors argue, "best practices" in social studies have always included instruction that…

  9. Combining the soilwater balance and water-level fluctuation methods to estimate natural groundwater recharge: Practical aspects

    USGS Publications Warehouse

    Sophocleous, M.A.

    1991-01-01

    A relatively simple and practical approach for calculating groundwater recharge in semiarid plain environments with a relatively shallow water table, such as the Kansas Prairies, is outlined. Major uncertainties in the Darcian, water balance, and groundwater fluctuation analysis approaches are outlined, and a combination methodology for reducing some of the uncertainties is proposed. By combining a storm-based soilwater balance (lasting several days) with the resulting water table rise, effective storativity values of the region near the water table are obtained. This combination method is termed the 'hybrid water-fluctuation method'. Using a simple average of several such estimates results in a site-calibrated effective storativity value that can be used to translate each major water-table rise tied to a specific storm period into a corresponding amount of groundwater recharge. Examples of soilwater balance and water-level fluctuation analyses based on field-measured data from Kansas show that the proposed methodology gives better and more reliable results than either of the two well-established approaches used singly. © 1991.
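    The combination step reduces to simple arithmetic: each calibration storm yields an effective storativity estimate Sy ≈ R/Δh, and the averaged Sy then converts later water-table rises into recharge. A sketch with entirely hypothetical numbers:

```python
# Hybrid water-fluctuation method, sketched with made-up calibration storms.
# Units must be consistent, e.g. mm of recharge water and mm of water-table rise.
calibration = [
    (12.0, 150.0),   # (soil-water-balance recharge, water-table rise) per storm
    (8.5, 100.0),
    (20.0, 260.0),
]
sy_estimates = [r / dh for r, dh in calibration]
sy = sum(sy_estimates) / len(sy_estimates)   # site-calibrated effective storativity

# Apply to subsequent major storm-period rises: recharge = Sy * rise.
rises = [120.0, 310.0, 75.0]
recharge = [sy * dh for dh in rises]
print(f"Sy = {sy:.3f}; recharge per storm = {[round(r, 1) for r in recharge]} mm")
```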

  10. Multiscale Methods for Nuclear Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared to the standard techniques that use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large.
The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly interface, the fuel/reflector interface, and assemblies where control rods are inserted. The embedded method also allows for multiple solution levels to be applied in a single calculation. The addition of intermediate levels to the solution improves the accuracy of the method. Both multiscale methods considered here have benefits and drawbacks, but both can provide improvements over the current PPR methodology.

  11. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...the USDA, Agricultural Marketing Service, Dairy Programs, or Official Methods of Analysis of the Association of Official Analytical Chemists or Standard Methods for the Examination of Dairy Products. [67 FR 48976, July 29,...

  12. 7 CFR 58.812 - Methods of sample analysis.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ...Agricultural Marketing Service, Dairy Programs, or the Official Methods of Analysis of the Association of Official Analytical Chemists, or Standard Methods for the Examination of Dairy Products. [67 FR 48976, July 29, 2002] Requirements for...

  13. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...Methods of Analysis of the Association of Official Analytical Chemists,” and the supplements thereto (“Changes in Methods” as...of the “Journal of the Association of Official Analytical Chemists”), which are incorporated by reference, when...

  14. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...Methods of Analysis of the Association of Official Analytical Chemists,” and the supplements thereto (“Changes in Methods” as...of the “Journal of the Association of Official Analytical Chemists”), which are incorporated by reference, when...

  15. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...Methods of Analysis of the Association of Official Analytical Chemists,” and the supplements thereto (“Changes in Methods” as...of the “Journal of the Association of Official Analytical Chemists”), which are incorporated by reference, when...

  16. 21 CFR 2.19 - Methods of analysis.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...Methods of Analysis of the Association of Official Analytical Chemists,” and the supplements thereto (“Changes in Methods” as...of the “Journal of the Association of Official Analytical Chemists”), which are incorporated by reference, when...

  17. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    SciTech Connect

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  18. Common cause analysis : a review and extension of existing methods

    E-print Network

    Heising, Carolyn D.

    1982-01-01

    The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments are used ...

  19. Numerical analysis and the scientific method

    SciTech Connect

    Glimm, J.; Sharp, D.H.

    1986-01-01

    The computer has given rise to a new mode of scientific practice, and today computational science stands beside theory and experiment as a fundamental methodology. The impact of the computer revolution on science can be projected from current trends. The demands to be made on computing methodologies will be reviewed. One of the demands is an ongoing need for excellence in computational methodologies. Generic difficulties encountered in meeting these challenges will be discussed. Recent work of the authors and others will be reviewed in this context.

  20. Web-Based Systems Development: Analysis and Comparison of Practices in Croatia and Ireland

    NASA Astrophysics Data System (ADS)

    Lang, Michael; Vukovac, Dijana Plantak

    The “dot.com” hysteria that sparked fears of a “Web crisis” a decade ago has long subsided, and firms established in the 1990s now have mature development processes in place. This chapter presents a timely re-assessment of the state of Web development practices, comparing data gathered in Croatia and Ireland. Given the growth in popularity of “agile” methods in the past few years, a secondary objective of this research was to analyse the extent to which Web development practices are guided by or otherwise consistent with the underlying principles of agile development.

  1. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  2. Methods of Plasma Theory 3. Singular Point Analysis

    NASA Astrophysics Data System (ADS)

    Tokuda, Shinji

    An introductory review is given on recent developments in the methods for stability analysis of a toroidally confined plasma. Emphasis is put on the perturbation analysis of a magnetohydrodynamic system that has the marginally stable state as a terminal point of continuous spectra. We address ourselves to the asymptotic matching method pertinent to such a problem. The Newcomb equation and inner layer equations are essential ingredients in the methods and the numerical methods for solving them are discussed.

  3. Technology transfer through a network of standard methods and recommended practices - The case of petrochemicals

    NASA Astrophysics Data System (ADS)

    Batzias, Dimitris F.; Karvounis, Sotirios

    2012-12-01

    Technology transfer may take place in parallel with cooperative action between companies participating in the same organizational scheme or using one another as subcontractors (outsourcing). In this case, cooperation should be realized by means of Standard Methods and Recommended Practices (SRPs) to achieve (i) quality of intermediate/final products according to specifications and (ii) industrial process control as required to guarantee such quality with minimum deviation (corresponding to maximum reliability) from preset mean values of representative quality parameters. This work deals with the design of the network of SRPs needed in each case for successful cooperation, which also implies the corresponding technology transfer, effected through a methodological framework developed in the form of an algorithmic procedure with 20 activity stages and 8 decision nodes. The functionality of this methodology is proved by presenting the path leading from (and relating) a standard test method for toluene, as petrochemical feedstock in toluene diisocyanate production, to the (6 generations distance upstream) performance evaluation of industrial process control systems (i.e., from ASTM D5606 to BS EN 61003-1:2004 in the SRPs network).

  4. Randomized Comparison of 3 Methods to Screen for Domestic Violence in Family Practice

    PubMed Central

    Chen, Ping-Hsin; Rovi, Sue; Washington, Judy; Jacobs, Abbie; Vega, Marielos; Pan, Ko-Yu; Johnson, Mark S.

    2007-01-01

    PURPOSE We undertook a study to compare 3 ways of administering brief domestic violence screening questionnaires: self-administered questionnaire, medical staff interview, and physician interview. METHODS We conducted a randomized trial of 3 screening protocols for domestic violence in 4 urban family medicine practices with mostly minority patients. We randomly assigned 523 female patients, aged 18 years or older and currently involved with a partner, to 1 of 3 screening protocols. Each included 2 brief screening tools: HITS and WAST-Short. Outcome measures were domestic violence disclosure, patient and clinician comfort with the screening, and time spent screening. RESULTS Overall prevalence of domestic violence was 14%. Most patients (93.4%) and clinicians (84.5%) were comfortable with the screening questions and method of administering them. Average time spent screening was 4.4 minutes. Disclosure rates, patient and clinician comfort with screening, and time spent screening were similar among the 3 protocols. In addition, WAST-Short was validated in this sample of minority women by comparison with HITS and with the 8-item WAST. CONCLUSIONS Domestic violence is common, and we found that most patients and clinicians are comfortable with domestic violence screening in urban family medicine settings. Patient self-administered domestic violence screening is as effective as clinician interview in terms of disclosure, comfort, and time spent screening. PMID:17893385

  5. Ethical practice in sex offender assessment: consideration of actuarial and polygraph methods.

    PubMed

    Vess, James

    2011-09-01

    The current generation of community protection laws represents a shift in priorities that may see the individual rights of sex offenders compromised for the goal of public safety. At the center of many judicial decisions under these laws are the risk assessment reports provided by mental health practitioners. The widespread enactment of laws allowing for additional sanctions for sex offenders, and a burgeoning research literature regarding the methods used to assess risk have served to heighten rather than resolve the ethical concerns associated with professional practice in this area. This article examines ethical issues inherent in the use of two assessment methods commonly used with sex offenders in the correctional context, focusing on actuarial measures and polygraph tests. Properly conducted and adequately reported actuarial findings are considered to provide useful information of sufficient accuracy to inform rather than mislead judicial decision makers, although careful consideration must be given to the limitations of current measures in each individual case. Despite its increasing use, polygraph testing is considered controversial, with little consensus regarding its accuracy or appropriate applications. On the basis of the current state of the professional literature regarding the polygraph, its use with sex offenders raises unresolved ethical concerns. PMID:20944058

  6. Application of Discrete Choice Methods in Consumer Preference Analysis

    Microsoft Academic Search

    Andrzej Bąk; Aneta Rybicka

    Stated consumer preferences refer to hypothetical market behaviour of consumers. In this case the analytical methods are based on data collected a priori by means of surveys to register intentions stated by consumers at the moment of survey taking. The methods used for stated preference analysis include, for instance, discrete choice methods. The general concept of discrete choice methods results

  7. Milk Sharing in Practice: A Descriptive Analysis of Peer Breastmilk Sharing.

    PubMed

    Reyes-Foster, Beatriz M; Carter, Shannon K; Hinojosa, Melanie Sberna

    2015-06-01

    Peer breastmilk sharing has emerged in recent years as a subject of investigation and occasional controversy. Although researchers know that thousands of milk exchanges are facilitated through milk sharing Web sites every week, there is only limited research into milk sharing practices on the ground. This study examines these practices through a 102-item online survey that asked questions about milk sharing practices, perceptions of milk sharing, and demographic characteristics. Participants were recruited through social media sites specific to breastfeeding and parenting events in Central Florida. The sample consisted of 392 respondents. Data were analyzed using univariate analysis. We found that breastmilk sharing is a complex practice, showing high levels of overlap in which some donors are also recipients, and that cross-nursing sometimes occurs simultaneously with the exchange of expressed milk. Respondents often donated and received milk from people they knew; however, exchanging milk with strangers was also common. Many but not all used the Internet to facilitate milk exchange; participants used well-known milk sharing Web sites as well as their private virtual networks. The study found that most milk exchanges happen in-person as gifts and that selling and shipping breastmilk were rare. We suggest that further research is needed on breastmilk sharing practices to inform breastmilk safety research and policy recommendations. PMID:25973632

  8. A dendrite method for cluster analysis

    Microsoft Academic Search

    T. Caliński; J. Harabasz

    1974-01-01

    A method for identifying clusters of points in a multidimensional Euclidean space is described and its application to taxonomy considered. It reconciles, in a sense, two different approaches to the investigation of the spatial relationships between the points, viz., the agglomerative and the divisive methods. A graph, the shortest dendrite of Florek et al. (1951a), is constructed on a nearest neighbour
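    This paper also introduced the variance-ratio criterion now widely known as the Calinski-Harabasz index, CH = (B/(k-1)) / (W/(n-k)), where B and W are the between- and within-cluster sums of squares. A minimal sketch on clearly separated synthetic points:

```python
# Variance-ratio criterion (Calinski-Harabasz index), computed from scratch.
import numpy as np

def ch_index(X, labels):
    n, k = len(X), len(set(labels))
    grand = X.mean(axis=0)
    B = W = 0.0
    for c in set(labels):
        pts = X[labels == c]
        centroid = pts.mean(axis=0)
        B += len(pts) * np.sum((centroid - grand) ** 2)   # between-cluster SS
        W += np.sum((pts - centroid) ** 2)                # within-cluster SS
    return (B / (k - 1)) / (W / (n - k))

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(5, 0.3, (20, 2))])
good = np.array([0] * 20 + [1] * 20)   # the true grouping
bad = np.tile([0, 1], 20)              # an arbitrary mislabeling
print(ch_index(X, good), ">", ch_index(X, bad))
```

    Higher values indicate tighter, better-separated clusters, which is why the index is commonly maximized over candidate numbers of clusters.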

  9. Basic Methods for Sensitivity Analysis of Biases

    Microsoft Academic Search

    SANDER GREENLAND

    1996-01-01

    Despite the importance of systematic errors and biases, quantitative methods that take account of these biases have seen much less development than methods for addressing random error. There are at least two reasons for this. First, until recently, randomized experiments supplied most of the impetus for statistical developments. These experiments were concentrated in agriculture, manufacturing, and clinical medicine,

  10. Generic residue analysis and BV method comparison

    NASA Astrophysics Data System (ADS)

    Dorville, Nicolas; Anekallu, Chandra; Haaland, Stein; Belmont, Gerard

    2015-04-01

    Determining the orientation of the normal direction to the magnetopause layer is a key issue for studying the structure of this boundary in detail. Both conservation-law methods and the new iterative BV method, which fits the magnetic field and ion normal flow velocity with an elliptic model, have been developed for this purpose. These methods have different model assumptions and validity ranges. Unlike the conservation-law methods, the BV method also provides spatial profiles inside the layer. However, it is compatible only with a subset of magnetopause crossings with a single-layer current sheet. Here we compare their results on artificial magnetopause data with noise, to understand their sensitivity to small departures from their physical assumptions. We then present a statistical comparison on a list of 149 flank and dayside magnetopause crossings.
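    For context, the classical single-spacecraft baseline for boundary-normal determination, minimum variance analysis of the magnetic field (MVAB), reduces to a single eigenproblem; the BV method's model fit is more involved. This sketch is not the BV method itself, and the synthetic crossing (normal direction, field values) is made up:

```python
# Minimum variance analysis: the boundary normal is the eigenvector of the
# magnetic field covariance matrix with the smallest eigenvalue.
import numpy as np

def mva_normal(B):
    M = np.cov(B.T)                    # 3x3 magnetic variance matrix
    vals, vecs = np.linalg.eigh(M)     # eigenvalues in ascending order
    return vecs[:, 0]                  # minimum-variance direction

rng = np.random.default_rng(4)
n_true = np.array([1.0, 0.0, 0.0])

# Synthetic crossing: the tangential field reverses across the layer while
# the normal component stays nearly constant (small fluctuations only).
theta = np.linspace(-1.0, 1.0, 400)
B = np.column_stack([0.05 * rng.normal(size=400),        # ~constant along normal
                     30.0 * np.tanh(3 * theta),          # reversing component
                     10.0 + rng.normal(size=400)])       # noisy guide field
n_est = mva_normal(B)
print(abs(n_est @ n_true))             # close to 1 (sign is arbitrary)
```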

  11. Performance Factors Analysis of a Wavelet-based Watermarking Method

    Microsoft Academic Search

    Chaw-seng Woo; Jiang Du; Binh Pham

    2005-01-01

    The essential performance metrics of a robust watermark include robustness, imperceptibility, watermark capacity and security. In addition, computational cost is important for practicality. Wavelet-based image watermarking methods exploit the frequency information and spatial information of the transformed data in multiple resolutions to gain robustness. Although the Human Visual System (HVS) model offers imperceptibility in wavelet-based watermarking, it suffers high computational

  12. Nurses’ self-efficacy and practices relating to weight management of adult patients: a path analysis

    PubMed Central

    2013-01-01

    Background Health professionals play a key role in the prevention and treatment of excess weight and obesity, but many have expressed a lack of confidence in their ability to manage obese patients, with their delivery of weight-management care remaining limited. The specific mechanism underlying inadequate practices in professional weight management remains unclear. The primary purpose of this study was to examine a self-efficacy theory-based model in understanding Registered Nurses’ (RNs) professional performance relating to weight management. Methods A self-report questionnaire was developed based upon the hypothesized model and administered to a convenience sample of 588 RNs. Data were collected regarding socio-demographic variables, psychosocial variables (attitudes towards obese people, professional role identity, teamwork beliefs, perceived skills, perceived barriers and self-efficacy) and professional weight management practices. Structural equation modeling was conducted to identify correlations between the above variables and to test the goodness of fit of the proposed model. Results The survey response rate was 71.4% (n = 420). The respondents reported a moderate level of weight management practices. Self-efficacy directly and positively predicted the weight management practices of the RNs (β = 0.36, p < 0.001), with the other psychosocial variables influencing practices indirectly through self-efficacy. The final model constructed in this study demonstrated a good fit to the data [χ2(14) = 13.90, p = 0.46; GFI = 0.99; AGFI = 0.98; NNFI = 1.00; CFI = 1.00; RMSEA = 0.00; AIC = 57.90], accounting for 38.4% and 43.2% of the variance in weight management practices and self-efficacy, respectively. Conclusions Self-efficacy theory appears to be useful in understanding the weight management practices of RNs. Interventions targeting the enhancement of self-efficacy may be effective in promoting RNs’ professional performance in managing overweight and obese patients. PMID:24304903

  13. Project Uncertainty, Management Practice and Project Performance: An Empirical Analysis on Customized Information Systems Development Projects

    Microsoft Academic Search

    Q. Z. Wang; J. Liu

    2006-01-01

    This paper develops a risk-based integrated model of information systems development (ISD) project performance to explain the effect of project inherent uncertainty and management practices on project performance. Based on a dataset collected from customized information systems development projects of software houses in Hangzhou City, China, this paper carries out an empirical analysis of the research model. The results reveal that

  14. Analysis of a Multigrid Method for a Transport Equation by Numerical Fourier Analysis

    E-print Network

    Oliveira, Suely

    In this paper, we perform Fourier analysis for a multigrid method with two-cell line relaxation for solving isotropic transport equations. Our numerical results show that the Fourier analysis prediction

  15. Singular value decomposition methods for wave propagation analysis

    E-print Network

    Santolik, Ondrej

    We describe several newly developed methods for propagation analysis of electromagnetic waves. Index terms: Waves and instabilities; Magnetospheric Physics: Instruments and techniques; Electromagnetics: Wave propagation

  16. Method for chromium analysis and speciation

    DOEpatents

    Aiken, Abigail M.; Peyton, Brent M.; Apel, William A.; Petersen, James N.

    2004-11-02

    A method of detecting a metal in a sample comprising a plurality of metals is disclosed. The method comprises providing the sample containing the metal to be detected. The sample is added to a reagent solution comprising an enzyme and a substrate, where the enzyme is inhibited by the metal to be detected. An array of chelating agents is used to eliminate the inhibitory effects of additional metals in the sample. The enzymatic activity in the sample is determined and compared to the enzymatic activity in a control solution to detect the metal. A method of determining the concentration of the metal in the sample is also disclosed, as is a method of detecting the valence state of a metal.
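
    The detect-by-inhibition and quantify-by-calibration scheme in this patent abstract can be sketched as follows; the activity threshold, the calibration pairs, and the linear interpolation are illustrative assumptions, not the patented procedure.

```python
# Detection: the target metal inhibits the enzyme, so activity in the
# sample (after chelators mask other metals) drops below the control.
def is_metal_present(sample_activity, control_activity, threshold=0.8):
    """Flag the target metal if activity falls below a fraction of control."""
    return sample_activity / control_activity < threshold

# Quantification: interpolate concentration from an inhibition curve.
def estimate_concentration(sample_activity, control_activity, calibration):
    """calibration: (fraction_of_control_activity, concentration) pairs,
    sorted by decreasing activity fraction."""
    frac = sample_activity / control_activity
    for (f_hi, c_lo), (f_lo, c_hi) in zip(calibration, calibration[1:]):
        if f_lo <= frac <= f_hi:
            # linear interpolation between the bracketing calibration points
            t = (f_hi - frac) / (f_hi - f_lo)
            return c_lo + t * (c_hi - c_lo)
    return None  # outside the calibrated range

# toy calibration: 100% activity -> 0 uM, 50% -> 10 uM, 10% -> 50 uM
cal = [(1.0, 0.0), (0.5, 10.0), (0.1, 50.0)]
```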

  17. Diversity and evolution of methods and practices for the molecular diagnosis of congenital toxoplasmosis in France: a 4-year survey.

    PubMed

    Sterkers, Y; Varlet-Marie, E; Marty, P; Bastien, P

    2010-10-01

    The prenatal diagnosis of congenital toxoplasmosis is currently based upon molecular biology using a sample of amniotic fluid. The vast majority of centres globally (and all centres in France) performing this diagnosis use 'in house' or laboratory-developed PCR assays. This may be the source of considerable inter-laboratory variation in the performances of the assays, hampering any valuable comparison of data among different centres. The present study was based upon questionnaires that were sent to 21-25 centres between 2002 and 2005 enquiring about methods and practices of the PCR-based prenatal diagnosis of congenital toxoplasmosis. An extreme diversity of PCR methods and practices was observed. Thus, in 2005, 35 PCR methods, differing in one of the main steps of the whole process, were reported as being in use for routine diagnosis, with nine centres using two or three methods. We provide comprehensive information on the extraction methods, DNA targets, primer pairs and detection methods used for this diagnosis, as well as their evolution, during the period of study. Interestingly, in this period (2002-2005), a rapid progression of the number of laboratories using real-time PCR technology, which increased from four to 19, was observed. We also studied general PCR practices concerning, for example, the number of reaction tubes used for each biological sample and the inclusion of controls. The return of information in a yearly report provided the opportunity for writing proposals aiming to improve laboratory practices for this diagnosis at the national level. The high diversity of methods and practices currently used emphasizes the need for external quality assessment of the performances of the molecular diagnostic methods. PMID:19886905

  18. Implementation of infection control best practice in intensive care units throughout Europe: a mixed-method evaluation study

    PubMed Central

    2013-01-01

    Background The implementation of evidence-based infection control practices is essential, yet challenging for healthcare institutions worldwide. Although it is acknowledged that implementation success varies with contextual factors, little is known regarding the most critical specific conditions within the complex cultural milieu of varying economic, political, and healthcare systems. Given the increasing reliance on unified global schemes to improve patient safety and healthcare effectiveness, research on this topic is needed and timely. The ‘InDepth’ work package of the European FP7 Prevention of Hospital Infections by Intervention and Training (PROHIBIT) consortium aims to assess barriers and facilitators to the successful implementation of catheter-related bloodstream infection (CRBSI) prevention in intensive care units (ICU) across several European countries. Methods We use a qualitative case study approach in the ICUs of six purposefully selected acute care hospitals among the 15 participants in the PROHIBIT CRBSI intervention study. As sensitizing schemes we apply the theory of diffusion of innovation, published implementation frameworks, sensemaking, and new institutionalism. We conduct interviews with hospital health providers/agents at different organizational levels, perform ethnographic observations, and carry out rich artifact collection and photography during two rounds of on-site visits, once before and once one year into the intervention. Data analysis is based on grounded theory. Given the challenge of different languages and cultures, we enlist the help of local interpreters, allot two days for site visits, and perform triangulation across multiple data sources. Qualitative measures of implementation success will consider the longitudinal interaction between the initiative and the institutional context.
Quantitative outcomes on catheter-related bloodstream infections and performance indicators from another work package of the consortium will produce a final mixed-methods report. Conclusion A mixed-methods study of this scale with longitudinal follow-up is unique in the field of infection control. It highlights the ‘Why’ and ‘How’ of best practice implementation, revealing key factors that determine success of a uniform intervention in the context of several varying cultural, economic, political, and medical systems across Europe. These new insights will guide future implementation of more tailored and hence more successful infection control programs. Trial registration Trial number: PROHIBIT-241928 (FP7 reference number) PMID:23421909

  19. Analysis of hemoglobin electrophoresis results and physicians investigative practices in Saudi Arabia

    PubMed Central

    Mehdi, Syed Riaz; Al Dahmash, Badr Abdullah

    2013-01-01

    BACKGROUND AND OBJECTIVES: Riyadh and the central province fall in a zone of moderate prevalence of hemoglobinopathies in Saudi Arabia. However, it has been observed that physicians working in Saudi Arabia invariably refer all cases of anemia for hemoglobin electrophoresis (HE). The present work was carried out to study the yield of HE in Riyadh and the investigative practices of the physicians advising HE. SETTINGS AND DESIGN: The study was carried out in the hospitals of King Saud University from 2009 to 2011 in order to assess the yield of HE in referred cases of clinical anemia. MATERIALS AND METHODS: A total of 1073 cases, divided into two groups of males and females, underwent complete blood count and red blood cell morphology. Cellulose acetate HE was performed, and all positive results were reconfirmed by high performance liquid chromatography (HPLC). The results were analyzed for the type of hemoglobinopathy. For statistical analysis, the Statistical Package for Social Sciences version 15 (SPSS Inc., Chicago, IL, USA) was used. RESULTS: Blood samples from 405 males and 668 females were included in the present study. 116 (28.5%) males and 167 (25%) females showed an abnormal pattern on HE. The incidence of beta thalassemia trait was higher in females, while sickle cell trait was seen predominantly in males. Red cell indices were reduced considerably in thalassemias but were unaffected in sickle cell disorders, except those with a concurrent alpha trait. The total yield of HE was 26.6%, which was much less than expected. CONCLUSION: Physicians are advised to rule out iron deficiency and other common causes of anemia before investigating cases for hemoglobinopathies, which employs the time-consuming and expensive tests of HE and HPLC. PMID:24339548

  20. Analysis Method for Giant Magnetostrictive Material Based Actuator Using FEM

    NASA Astrophysics Data System (ADS)

    Yoo, Byungjin; Hirata, Katsuhiro

    Giant magnetostrictive materials (GMM) have attracted attention because of their large magnetostriction under moderate prestress, strong force, and fast response. The design process of devices using this kind of smart material increasingly relies on computational tools, and high-accuracy analysis helps to optimize device performance at the design stage. Various analysis methods have been studied; however, no fully effective method has yet been developed, chiefly because the nonlinearity of GMM is not considered in conventional methods. In this paper, a magneto-mechanical coupled analysis method for a GMM actuator is proposed using 3D-FEM under various prestress conditions, in which the nonlinearity of the GMM characteristics is considered. Moreover, the validity of the proposed analysis method is verified through comparison with experiments on a prototype.

  1. Extraction of brewer's yeasts using different methods of cell disruption for practical biodiesel production.

    PubMed

    ?ezanka, Tomáš; Matoulková, Dagmar; Kolouchová, Irena; Masák, Jan; Viden, Ivan; Sigler, Karel

    2015-05-01

    Methods for the preparation of fatty acids from brewer's yeast and their use in the production of biofuels and in different branches of industry are described. Isolation of fatty acids from cell lipids includes cell disintegration (e.g., with liquid nitrogen, KOH, NaOH, petroleum ether, nitrogenous basic compounds, etc.) and subsequent processing of the extracted lipids, including analysis of fatty acids and computation of biodiesel properties such as viscosity, density, cloud point, and cetane number. Methyl esters obtained from brewer's waste yeast are well suited for the production of biodiesel. All 49 samples (7 breweries and 7 methods) meet the requirements for biodiesel quality, in both the composition of fatty acids and the properties of the biofuel, required by the US and EU standards. PMID:25394535

  2. Computational Methods for the Fourier Analysis of Sparse

    E-print Network

    Potts, Daniel

    We focus on two major topics regarding the Fourier analysis of sparse high-dimensional functions. A central task in high-dimensional Fourier analysis is the fast computation of certain trigonometric sums; a straightforward evaluation

  3. A Content Analysis Method for Developing User-Based Objectives.

    ERIC Educational Resources Information Center

    Nahl-Jakobovits, Diane; Jakobovits, Leon A.

    1992-01-01

    Describes a method of developing user-based objectives for bibliographic instruction that utilizes content analysis to code self-reports by library users according to a taxonomy of library speech acts in the affective, cognitive, and sensorimotor domains. Analysis of one student's library reports and translation of the analysis into enabling…

  4. Probabilistic structural analysis methods of hot engine structures

    Microsoft Academic Search

    C. C. Chamis; D. A. Hopkins

    1989-01-01

    Development of probabilistic structural analysis methods for hot engine structures at Lewis Research Center is presented. Three elements of the research program are: (1) composite load spectra methodology; (2) probabilistic structural analysis methodology; and (3) probabilistic structural analysis application. Recent progress includes: (1) quantification of the effects of uncertainties for several variables on high pressure fuel turbopump (HPFT) turbine blade

  5. Nonparametric Bayesian Methods for Manifold Analysis

    E-print Network

    Maggioni, Mauro

    Motivation: data on a manifold may be approximated statistically as being drawn from a mixture of low-rank Gaussians. How many Gaussians are needed to represent the data {xi}i=1,n? What is the rank J of the Gaussians (in general different

  6. Environmental Impact Analysis: Philosophy and Methods.

    ERIC Educational Resources Information Center

    Ditton, Robert B.; Goodale, Thomas L.

    Proceedings of the Conference on Environmental Impact Analysis held in Green Bay, Wisconsin, January 4-5, 1972, are compiled in this report. The conference served as a forum for exchange of information among State and Federal agencies and educators on experiences with the National Environmental Policy Act of 1970. Hopefully, results of the…

  7. Decomposition Methods for Fault Tree Analysis

    Microsoft Academic Search

    Arnon Rosenthal

    1980-01-01

    Some kinds of fault tree analysis are described for which cut set enumeration is inadequate. Modularization leads to more efficient computer programs, and also identifies subsystems which are intuitively meaningful. The problem of finding all modules of a fault tree is formulated as an extension of the problem of finding all 'cut-points' of an undirected graph. The major result is

  8. Analysis of Two Methods to Evaluate Antioxidants

    ERIC Educational Resources Information Center

    Tomasina, Florencia; Carabio, Claudio; Celano, Laura; Thomson, Leonor

    2012-01-01

    This exercise is intended to introduce undergraduate biochemistry students to the analysis of antioxidants as a biotechnological tool. In addition, some statistical resources will also be used and discussed. Antioxidants play an important metabolic role, preventing oxidative stress-mediated cell and tissue injury. Knowing the antioxidant content…

  9. [Experiences with instrumental methods for urinary calculi analysis].

    PubMed

    Asper, R; Schmucki, O

    1979-08-01

    To reduce the incidence of urinary calculi in stone formers, it is important to know the composition of the stones. Unfortunately, chemical analysis does not give very reliable results. Looking for a better method to analyse urinary calculi, three instrumental methods were tested: infrared spectroscopy, thermal analysis and X-ray diffraction. The experimental results and economical considerations show that X-ray diffraction analysis of urinary calculi would meet the goal of improved care of patients with stones. PMID:489410

  10. Comparative analysis of selected hydromorphological assessment methods.

    PubMed

    Sípek, Václav; Matousková, Milada; Dvorák, Martin

    2010-10-01

    The European Water Framework Directive 2000/60/EC aims to achieve a good ecological status of all surface water bodies in Europe. The definition of the ecological status is based on the hydromorphological, hydrochemical, and hydrobiological features of water bodies. Numerous methods are applied for the purpose of hydromorphological status assessment. This study compares four methods (EcoRivHab, the LAWA Field Survey, the LAWA Overview Survey, and the Rapid Bioassessment Protocol) that were applied in two study areas in the Czech part of the Elbe River Basin. The selected catchments represent areas with different sizes and physical-geographic as well as socioeconomic characteristics. All the methods applied were able to identify both the natural and the completely changed reaches and provided good information on the state of the river's physical habitat. However, they differ in the number of parameters, the number of monitored zones, and the time and expertise demanded by the assessment. PMID:19760083

  11. Passive sampling methods for contaminated sediments: Practical guidance for selection, calibration, and implementation

    PubMed Central

    Ghosh, Upal; Driscoll, Susan Kane; Burgess, Robert M; Jonker, Michiel To; Reible, Danny; Gobas, Frank; Choi, Yongju; Apitz, Sabine E; Maruya, Keith A; Gala, William R; Mortimer, Munro; Beegan, Chris

    2014-01-01

    This article provides practical guidance on the use of passive sampling methods (PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific application include clear delineation of measurement goals for Cfree, whether laboratory-based “ex situ” and/or field-based “in situ” application is desired, and ultimately which PSM is best-suited to fulfill the measurement objectives. Guidelines for proper calibration and validation of PSMs, including use of provisional values for polymer–water partition coefficients, determination of equilibrium status, and confirmation of nondepletive measurement conditions are defined. A hypothetical example is described to illustrate how the measurement of Cfree afforded by PSMs reduces uncertainty in assessing narcotic toxicity for sediments contaminated with polycyclic aromatic hydrocarbons. The article concludes with a discussion of future research that will improve the quality and robustness of Cfree measurements using PSMs, providing a sound scientific basis to support risk assessment and contaminated sediment management decisions. Integr Environ Assess Manag 2014;10:210–223. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288273
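
    The central passive-sampler calculation behind Cfree can be sketched as below: at equilibrium the freely dissolved concentration is the polymer-phase concentration divided by the polymer-water partition coefficient, and the sampler must not deplete the system. The units, the 5% depletion rule of thumb, and the numbers are illustrative assumptions, not values from the article.

```python
def cfree(c_polymer_ug_per_kg, log_kpw):
    """Freely dissolved concentration (ug/L) from the equilibrium
    polymer-phase concentration and log10 of the polymer-water
    partition coefficient Kpw (L water per kg polymer)."""
    kpw = 10.0 ** log_kpw
    return c_polymer_ug_per_kg / kpw

def is_nondepletive(mass_sorbed_ug, mass_in_system_ug, max_fraction=0.05):
    """Nondepletive measurement: the sampler should remove only a small
    fraction of the analyte present in the system."""
    return mass_sorbed_ug / mass_in_system_ug <= max_fraction
```

    For example, 1000 ug/kg in the polymer with log Kpw = 3 corresponds to Cfree = 1 ug/L; equilibrium status and provisional Kpw values would be checked as the guidance describes.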

  12. Advanced stress analysis methods applicable to turbine engine structures

    NASA Technical Reports Server (NTRS)

    Pian, T. H. H.

    1985-01-01

    Advanced stress analysis methods applicable to turbine engine structures are investigated. Constructions of special elements containing traction-free circular boundaries are investigated. New versions of the mixed variational principle and of hybrid stress elements are formulated. A method is established for the suppression of kinematic deformation modes. SemiLoof plate and shell elements are constructed by the assumed stress hybrid method. An elastic-plastic analysis is conducted by viscoplasticity theory using the mechanical subelement model.

  13. Practical considerations for conducting ecotoxicity test methods with manufactured nanomaterials: what have we learnt so far?

    PubMed

    Handy, Richard D; van den Brink, Nico; Chappell, Mark; Mühling, Martin; Behra, Renata; Dušinská, Maria; Simpson, Peter; Ahtiainen, Jukka; Jha, Awadhesh N; Seiter, Jennifer; Bednar, Anthony; Kennedy, Alan; Fernandes, Teresa F; Riediker, Michael

    2012-05-01

    This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs compared to conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs and end points that rely on the test substance penetrating the cell may be insensitive. Instead assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. 
ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new test(s) are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications and recommendations are made to aid the researcher at the bench. PMID:22422174

  14. Good Manufacturing Practices production and analysis of a DNA vaccine against dental caries

    Microsoft Academic Search

    Ya-ping Yang; Yu-hong Li; Ai-hua Zhang; Lan Bi; Ming-wen Fan

    2009-01-01

    Aim: To prepare a clinical-grade anti-caries DNA vaccine pGJA-P/VAX and explore its immune effect and protective efficacy against a cariogenic bacterial challenge. Methods: A large-scale industrial production process was developed under Good Manufacturing Practices (GMP) by combining and optimizing common unit operations such as alkaline lysis, precipitation, endotoxin removal and column chromatography. Quality controls of the purified bulk and final lyophilized vaccine were

  15. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  16. Analysis Resistant Cipher Method and Apparatus

    NASA Technical Reports Server (NTRS)

    Oakley, Ernest C. (Inventor)

    2009-01-01

    A system for encoding and decoding data words including an anti-analysis encoder unit for receiving an original plaintext and producing a recoded data, a data compression unit for receiving the recoded data and producing a compressed recoded data, and an encryption unit for receiving the compressed recoded data and producing an encrypted data. The recoded data has an increased non-correlatable data redundancy compared with the original plaintext in order to mask the statistical distribution of characters in the plaintext data. The system of the present invention further includes a decryption unit for receiving the encrypted data and producing a decrypted data, a data decompression unit for receiving the decrypted data and producing an uncompressed recoded data, and an anti-analysis decoder unit for receiving the uncompressed recoded data and producing a recovered plaintext that corresponds with the original plaintext.
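
    The three-stage pipeline described in this abstract (anti-analysis recoding to mask character statistics, then compression, then encryption) and its inverse can be sketched as below. The pad-based recoder and the XOR stream are illustrative stand-ins for the patented encoder and for a real cipher; they are not secure and are not the patented method.

```python
import random
import zlib

def recode(plaintext: bytes, seed: int = 0) -> bytes:
    """Add non-correlatable redundancy: pair every byte with a random pad
    so the statistical distribution of plaintext characters is hidden."""
    rng = random.Random(seed)
    out = bytearray()
    for b in plaintext:
        pad = rng.randrange(256)
        out += bytes([pad, b ^ pad])
    return bytes(out)

def unrecode(recoded: bytes) -> bytes:
    """Invert recode: each (pad, b ^ pad) pair XORs back to b."""
    return bytes(recoded[i] ^ recoded[i + 1] for i in range(0, len(recoded), 2))

def xor_stream(data: bytes, key: int) -> bytes:
    """Toy keystream cipher (self-inverse). Placeholder only, NOT secure."""
    rng = random.Random(key)
    return bytes(b ^ rng.randrange(256) for b in data)

def encode(plaintext: bytes, key: int) -> bytes:
    """Recode -> compress -> encrypt, mirroring the pipeline above."""
    return xor_stream(zlib.compress(recode(plaintext)), key)

def decode(blob: bytes, key: int) -> bytes:
    """Decrypt -> decompress -> decode, recovering the plaintext."""
    return unrecode(zlib.decompress(xor_stream(blob, key)))
```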

  17. THE EVAPORATION METHOD IN SPECTRAL ANALYSIS

    Microsoft Academic Search

    Shteinberg

    1962-01-01

    The rapid development of science and the use of new elements, especially in the atomic industry and in semiconductor technology, place a great emphasis on the purity of materials, requiring analytical methods which are able to detect impurities in the 10⁻⁴ to 10⁻⁶% range; the specifications are expected to become even more rigorous in the

  18. NUMERICAL ANALYSIS: MULTISCALE METHODS, ADAPTIVITY AND COMPLEXITY

    E-print Network

    Jensen, Max

    Workshop programme excerpt: G. Baxter (Reading), "Towards multiscale data scattering"; Z. Stoyanov (Bath), "The sensitivity ..."; Veldhuizen (Delft), "Solving implicit relations by inexact Newton methods: Application to chemical vapour ..."; and a talk on methods "for systems with multiple timescales".

  19. A method of streamflow drought analysis

    Microsoft Academic Search

    Emir Zelenhasi?; Atila Salvai

    1987-01-01

    A method of completely describing and analyzing the stochastic process of streamflow droughts has been recommended. All important components of streamflow droughts such as deficit, duration, time of occurrence, number of streamflow droughts in a given time interval [0, t], the largest streamflow drought deficit, and the largest streamflow drought duration in a given time interval [0, t] are taken
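
    The run-based description of droughts summarized above (deficit, duration, time of occurrence, number of events, and the largest deficit and duration in a time interval [0, t]) can be sketched with a simple threshold scan; the flow series and truncation threshold below are illustrative.

```python
def drought_events(flow, threshold):
    """Return (start_index, duration, deficit) for each run of the flow
    series below the truncation threshold."""
    events, start, deficit = [], None, 0.0
    for i, q in enumerate(flow):
        if q < threshold:
            if start is None:
                start, deficit = i, 0.0     # a new drought begins
            deficit += threshold - q        # accumulate the deficit
        elif start is not None:
            events.append((start, i - start, deficit))
            start = None                    # the drought has ended
    if start is not None:                   # drought still running at t
        events.append((start, len(flow) - start, deficit))
    return events

# toy flow series with threshold 4: two droughts
flow = [5, 3, 2, 6, 7, 1, 1, 4]
events = drought_events(flow, 4)
largest_deficit = max(d for _, _, d in events)
largest_duration = max(dur for _, dur, _ in events)
```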

  20. Eco-efficiency analysis by BASF: the method

    Microsoft Academic Search

    Peter Saling; Andreas Kicherer; Brigitte Dittrich-Krämer; Rolf Wittlinger; Winfried Zombik; Isabell Schmidt; Wolfgang Schrott; Silke Schmidt

    2002-01-01

    Intention, Goal, Scope, Background. BASF has developed the tool of eco-efficiency analysis to address not only strategic issues, but also issues posed by the marketplace, politics and research. It was a goal to develop a tool for decision-making processes which is useful for a lot of applications in chemistry and other industries. Objectives. The objectives were the development of a common

  1. Chemical Methods of Analysis of Glycoproteins

    Microsoft Academic Search

    Elizabeth F. Hounsell; Michael J. Davies; Kevin D. Smith

    \\u000a The first analysis of glycoconjugates that often needs to be carried out is to see if they indeed contain sugar. For glycoproteins\\u000a in gels or oligosaccharides in solution, this can be readily achieved by periodate oxidation at two concentrations, the first\\u000a to detect sialic acids, and the second, any monosaccharide that has two free vicinal hydroxyl groups (1). Periodate cleaves

  2. Managing visitor impacts in parks: A multi-method study of the effectiveness of alternative management practices

    USGS Publications Warehouse

    Park, L.O.; Marion, J.L.; Manning, R.E.; Lawson, S.R.; Jacobi, C.

    2008-01-01

    How can recreation use be managed to control associated environmental impacts? What management practices are most effective and why? This study explored these and related questions through a series of experimental 'treatments' and associated 'controls' at the summit of Cadillac Mountain in Acadia National Park, a heavily used and environmentally fragile area. The treatments included five management practices designed to keep visitors on maintained trails, and these practices ranged from 'indirect' (information/education) to 'direct' (a fence bordering the trail). Research methods included unobtrusive observation of visitors to determine the percentage of visitors who walked off-trail and a follow-up visitor survey to explore why management practices did or didn't work. All of the management practices reduced the percentage of visitors who walked off-trail. More aggressive applications of indirect practices were more effective than less aggressive applications, and the direct management practice of fencing was the most effective of all. None of the indirect management practices reduced walking off-trail to a degree that is likely to control damage to soil and vegetation at the study site. Study findings suggest that an integrated suite of direct and indirect management practices be implemented on Cadillac Mountain (and other, similar sites) that includes a) a regulation requiring visitors to stay on the maintained trail, b) enforcement of this regulation as needed, c) unobtrusive fencing along the margins of the trail, d) redesign of the trail to extend it, widen it in key places, and provide short spur trails to key 'photo points', and e) an aggressive information/education program to inform visitors of the regulation to stay on the trail and the reasons for it. These recommendations are a manifestation of what may be an emerging principle of park and outdoor recreation management: intensive use requires intensive management.

  3. PRACTICAL EXPERIENCE IN ANALYSIS OF ORGANIC COMPOUNDS IN AMBIENT AIR USING CANISTERS AND SORBENTS

    EPA Science Inventory

    Generation of accurate ambient air VOC pollutant measurement data as a base for regulatory decisions is critical. Numerous methods and procedures for sampling and analysis are available from a variety of sources. Air methods available through the Environmental Protection Agency are con...

  4. Stochastic Plane Stress Analysis with Elementary Stiffness Matrix Decomposition Method

    NASA Astrophysics Data System (ADS)

    Er, G. K.; Wang, M. C.; Iu, V. P.; Kou, K. P.

    2010-05-01

    In this study, the efficient analysis method named the elementary stiffness matrix decomposition (ESMD) method is further investigated and utilized for the moment evaluation of stochastic plane stress problems, in comparison with the conventional perturbation method in stochastic finite element analysis. In order to evaluate the performance of this method, computer programs are written and numerical results for stochastic plane stress problems are obtained. The numerical analysis shows that computational efficiency is much increased and the computer memory requirement much reduced by using the ESMD method.
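
    For context, the conventional first-order perturbation method that the abstract compares against can be sketched for a stochastic linear system K(θ)u = f with one zero-mean random stiffness parameter θ; the 2-DOF stiffness, its sensitivity, and the variance below are illustrative, not the paper's model.

```python
import numpy as np

def perturbation_moments(K0, dK, f, var_theta):
    """First-order mean and covariance of u for K(theta) = K0 + theta*dK,
    with theta zero-mean and variance var_theta."""
    u0 = np.linalg.solve(K0, f)            # mean response at theta = 0
    du = -np.linalg.solve(K0, dK @ u0)     # sensitivity du/dtheta at theta = 0
    cov_u = var_theta * np.outer(du, du)   # first-order covariance of u
    return u0, cov_u

# toy 2-DOF system with one random stiffness parameter
K0 = np.array([[2.0, -1.0], [-1.0, 2.0]])
dK = np.array([[1.0, 0.0], [0.0, 0.0]])
f = np.array([1.0, 0.0])
u_mean, u_cov = perturbation_moments(K0, dK, f, var_theta=0.04)
```

    Methods like ESMD aim at the same moments while avoiding the cost that this elementwise sensitivity assembly incurs on large meshes.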

  5. Combined Finite Element - Finite Volume Method (Convergence Analysis)

    E-print Network

    Magdeburg, Universität

    The idea is to combine finite volume and finite element methods in an appropriate way. Diffusion terms are discretized by the conforming piecewise linear finite element method

  6. High-speed fringe analysis method using frequency demodulation technology

    Microsoft Academic Search

    Yasuhiko Arai; Shunsuke Yokozeki; Kazuhiro Shiraki; Tomoharu Yamada

    1996-01-01

    Interferometer fringe-pattern analysis using fast Fourier transform (FFT) technology is a very precise measuring method. However, this method has not received significant attention in industry because of its long calculation time, since it processes an image using sophisticated algorithms. In this work, in order to reduce the calculation time, the fringes imaged on
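    For background, FFT-based fringe analysis (the Fourier-transform method of Takeda et al.) recovers phase by isolating the fringe carrier's sideband in the frequency domain. A minimal 1-D numpy sketch, with all parameters chosen purely for illustration:

```python
import numpy as np

# Synthetic fringe signal: carrier frequency f0 plus slowly varying phase phi(x).
N = 1024
x = np.arange(N)
f0 = 64 / N                                  # carrier, cycles per sample
phi = 0.5 * np.sin(2 * np.pi * x / N)        # test phase to recover
signal = 1.0 + np.cos(2 * np.pi * f0 * x + phi)

spectrum = np.fft.fft(signal)
# Keep only a band around the +f0 sideband (bin k0 = 64); DC and the
# -f0 sideband are discarded.
band = np.zeros(N, dtype=complex)
k0 = 64
band[k0 - 32:k0 + 32] = spectrum[k0 - 32:k0 + 32]

analytic = np.fft.ifft(band)                 # complex fringe signal
wrapped = np.angle(analytic)                 # carrier + phase, wrapped
recovered = np.unwrap(wrapped) - 2 * np.pi * f0 * x
recovered -= recovered.mean() - phi.mean()   # remove constant offset
print(np.max(np.abs(recovered - phi)))
```

    The whole demodulation is three array operations (FFT, band selection, inverse FFT), which is what makes hardware acceleration of this method attractive.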

  7. Optoelectronic method for analysis of biomolecular interaction dynamics

    NASA Astrophysics Data System (ADS)

    Nepomnyashchaya, E.; Velichko, E.; Aksenov, E.; Bogomaz, T.

    2014-10-01

    An optoelectronic method of laser correlation spectroscopy for the study of intermolecular interactions in biomolecular suspensions is presented. The method of laser correlation spectroscopy is integrated with orthogonal laser light scattering and an ultramicroscopy technique for visual control of biomolecular interactions. The capabilities of the method for analyzing the dynamics of biomolecular conglomerates are considered.

  8. Synthesis and Analysis of Automatic Assessment Methods in CS1: Generating Intelligent MCQs

    E-print Network

    Gibson, J. Paul

    Multiple-choice questions have proved to be a feasible method of testing whether students have suitable knowledge. This paper describes the use of random code generation and mutation as a method

  9. New methods for strategic analysis: Automating the wargame

    Microsoft Academic Search

    Morlie H. Graubard; Carl H. Builder

    1982-01-01

    A new method for automating political-military games as a means of analyzing strategic forces was recently developed and demonstrated by The Rand Corporation. Interest in this technique sprang from dissatisfaction with the dominant methods of analyzing strategic forces: manual political-military games and force exchange models. While each brings important capabilities to the analysis of strategic forces, neither method can independently

  10. AEROACOUSTIC ANALYSIS USING A HYBRID FINITE ELEMENT METHOD

    E-print Network

    Giles, Mike

    M. C. Duta, A. Laird, M. B. Giles. Aeroacoustic analysis can follow either the time-domain or the frequency-domain approach. Frequency-domain methods [..., 2004] are much faster than time-domain methods and treat acoustic liners in a natural way; time-domain methods [Özyörük and Long, 1996]

  11. Sequential Monte Carlo Methods for Statistical Analysis of Tables

    E-print Network

    Liu, Jun

    Yuguo Chen, Persi Diaconis, Susan ... Our method produces Monte Carlo samples that are remarkably close to the uniform distribution. It compares favorably with other existing Monte Carlo-based algorithms, and sometimes

  12. The Semi-Regenerative Method of Simulation Output Analysis

    E-print Network

    Nakayama, Marvin K.

    James M. Calvin (New Jersey Institute). We develop a class of techniques for analyzing the output of simulations of a semi-regenerative process. Called the semi-regenerative method, the approach is a generalization of the regenerative method.
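    The regenerative method that this approach generalizes estimates a steady-state mean as a ratio of per-cycle quantities, with a confidence interval from the delta method. A minimal sketch on a two-state Markov chain (the chain and reward are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
# Two-state Markov chain; each return to state 0 is a regeneration point.
P = np.array([[0.7, 0.3], [0.4, 0.6]])
n_cycles = 20000
Y, T = [], []                      # per-cycle reward and cycle length
for _ in range(n_cycles):
    state, y, t = 0, 0.0, 0
    while True:
        y += state                 # reward = time spent in state 1
        t += 1
        state = rng.choice(2, p=P[state])
        if state == 0:             # regeneration: cycle complete
            break
    Y.append(y)
    T.append(t)
Y, T = np.array(Y), np.array(T)

est = Y.sum() / T.sum()            # ratio estimator of the steady-state mean
Z = Y - est * T                    # delta-method linearization
half = 1.96 * Z.std(ddof=1) / (T.mean() * np.sqrt(n_cycles))
print(est, est - half, est + half)
```

    For this chain the stationary probability of state 1 is 3/7, and the interval above covers it; the independence of cycles is what makes the simple i.i.d. variance estimate valid.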

  13. Recruitment methods in Alzheimer's disease research: general practice versus population based screening by mail

    Microsoft Academic Search

    Fred Andersen; Torgeir A Engstad; Bjørn Straume; Matti Viitanen; Dag S. Halvorsen; Samuel Hykkerud; Kjell Sjøbrend

    2010-01-01

    BACKGROUND: In Alzheimer's disease (AD) research, patients are usually recruited from clinical practice, memory clinics, or nursing homes. Lack of standardised inclusion and diagnostic criteria is a major concern in current AD studies. The aim of the study was to explore whether patient characteristics differ between study samples recruited from general practice and from a population based screening by mail

  14. Predatory vs. Dialogic Ethics: Constructing an Illusion or Ethical Practice as the Core of Research Methods

    ERIC Educational Resources Information Center

    Cannella, Gaile S.; Lincoln, Yvonna S.

    2007-01-01

    The ethical conduct of research is addressed from two perspectives, as a regulatory enterprise that creates an illusion of ethical practice and as a philosophical concern for equity and the imposition of power within the conceptualization and practice of research itself. The authors discuss various contemporary positions that influence…

  15. The Arrangement of Students' Extracurricular Piano Practice Process with the Asynchronous Distance Piano Teaching Method

    ERIC Educational Resources Information Center

    Karahan, Ahmet Suat

    2015-01-01

    That the students do their extracurricular piano practices in the direction of the teacher's warnings is a key factor in achieving success in the teaching-learning process. However, the teachers cannot adequately control the students' extracurricular practices in the process of traditional piano education. Under the influence of this lack of…

  16. Overview of methods for uncertainty analysis and sensitivity analysis in probabilistic risk assessment

    SciTech Connect

    Iman, R.L.; Helton, J.C.

    1984-01-01

    In this paper, available methods for uncertainty analysis and sensitivity analysis in a PRA are reviewed. This review first treats methods for use with individual components of a PRA and then considers how these methods could be combined in the performance of a complete PRA. The goal of uncertainty analysis is to measure the imprecision in PRA outcomes of interest, and the goal of sensitivity analysis is to identify the major contributors to this imprecision. 56 references.
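    One common sampling-based technique in this family propagates input uncertainty through the model by Monte Carlo and uses rank correlations between inputs and output as sensitivity measures. A sketch with a made-up three-input model (the distributions and the model are illustrative only):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
# Hypothetical risk model with three uncertain inputs; x1 dominates.
x1 = rng.lognormal(mean=0.0, sigma=0.5, size=n)
x2 = rng.uniform(0.5, 1.5, size=n)
x3 = rng.normal(1.0, 0.1, size=n)
y = x1 * x2 + 0.1 * x3             # output of interest

def ranks(a):
    r = np.empty(len(a))
    r[np.argsort(a)] = np.arange(len(a))
    return r

def spearman(a, b):
    # Rank (Spearman) correlation: Pearson correlation of the ranks.
    return np.corrcoef(ranks(a), ranks(b))[0, 1]

sens = {name: spearman(x, y) for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]}
print(sens)
```

    The resulting rank correlations recover the intended ordering of importance (x1 > x2 > x3); rank transforms make the measure robust to the monotone nonlinearity of the model.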

  17. Femtosecond protein nanocrystallography—data analysis methods

    PubMed Central

    Kirian, Richard A.; Wang, Xiaoyu; Weierstall, Uwe; Schmidt, Kevin E.; Spence, John C. H.; Hunter, Mark; Fromme, Petra; White, Thomas; Chapman, Henry N.; Holton, James

    2014-01-01

    X-ray diffraction patterns may be obtained from individual submicron protein nanocrystals using a femtosecond pulse from a free-electron X-ray laser. Many “single-shot” patterns are read out every second from a stream of nanocrystals lying in random orientations. The short pulse terminates before significant atomic (or electronic) motion commences, minimizing radiation damage. Simulated patterns for Photosystem I nanocrystals are used to develop a method for recovering structure factors from tens of thousands of snapshot patterns from nanocrystals varying in size, shape and orientation. We determine the number of shots needed for a required accuracy in structure factor measurement and resolution, and investigate the convergence of our Monte-Carlo integration method. PMID:20389587
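    The Monte-Carlo integration idea can be illustrated in miniature: each snapshot measures only a random partial intensity of a reflection, and averaging over many snapshots recovers the squared structure factor. The toy model below assumes a known mean partiality, which the real method does not require; the distribution and counts are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
true_F2 = 100.0                    # squared structure factor of one reflection
n_shots = 200000
# Each snapshot sees a random "partiality" (size/orientation factor in [0, 1]).
partiality = rng.beta(2.0, 5.0, size=n_shots)
intensity = rng.poisson(true_F2 * partiality)   # photon-counting noise

# Monte Carlo average: mean observed intensity / mean partiality -> |F|^2
est_F2 = intensity.mean() / partiality.mean()
print(est_F2)
```

    The estimate converges at the usual 1/sqrt(n_shots) Monte Carlo rate, which is why the paper's question of "how many shots for a required accuracy" is the central practical one.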

  18. Analysis and design of preconditioning methods for the Euler equations

    NASA Astrophysics Data System (ADS)

    Zaccanti, Marco Rodolfo

    Preconditioning of a system of equations at the differential level represents a relatively new area of research in convergence acceleration of discrete schemes for the fluid dynamics equations. This technique attempts to remove the intrinsic stiffness of the equations caused by the different time scales of the dynamic problem. Specifically, for the Euler equations, preconditioning aims at equalizing the speed of propagation of the different waves of the hyperbolic problem. This 'fix' becomes particularly useful in the incompressible limit and in the neighborhood of the sonic point. Practical examples of application of preconditioning include modern transonic supercritical airfoils, and nearly incompressible flows with embedded regions where compressibility is important (e.g., low speed combustion). This study attempts the ambitious project of reviewing most of the work done in Euler preconditioning, while at the same time extending some of the existing methods and correcting their robustness problems. Several original contributions to the theory of Euler preconditioning are given, including a thorough exposition of the symmetrizability property of the preconditioned equations, a discussion of the positive definiteness property of the preconditioning matrix, and the study of the eigenvalues for the full form of the preconditioner. Considering the numerical implementation of the preconditioned methods using the Roe flux function, a scheme based on the classical one-dimensional Riemann problem normal to the cell interface is proposed, and its advantages over other formulations found in the literature, as well as its drawbacks, are discussed. Then, the analysis of several existing preconditioning methods is conducted, and the complete eigenvector structure of the equations preconditioned with the Van Leer-Lee-Roe matrix, the Turkel matrix, the Choi-Merkle matrix, and a few others, is obtained and analyzed.
A comprehensive exploration of preconditioning in one and two spatial dimensions is attempted, which allows a better understanding of the properties of existing preconditioners. While this investigation suggests new interesting families of one-dimensional preconditioners, for the two-dimensional case the analysis is complete only for the sparse form of the preconditioner, and shows that in this instance it is not possible to remove all of the robustness problems usually found in Euler preconditioning. When considering the full form of the preconditioning matrix the analysis is not complete, because of the formidable algebraic problem involved. Nonetheless, some specific solutions are considered, and a few general conclusions are also drawn in this case. Finally, it is shown that there exist at least two sparse preconditioners that are sufficiently robust in computing stagnation point flow, while preserving the overall effectiveness of preconditioning for low speed flow. One of these matrices is a modification of the popular Turkel method. Using this matrix in regions of low Mach number, in conjunction with the Van Leer-Lee-Roe preconditioner in the transonic and supersonic parts of the flow field, yields a very robust and efficient preconditioning procedure for the entire Mach regime.
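    The wave-speed equalization that motivates this work can be checked numerically. The sketch below uses the 1-D Euler equations in symmetrizing variables and a Turkel-type diagonal preconditioner diag(beta^2, 1, 1) with beta^2 ~ M^2; this particular variable choice and scaling are standard in the literature, but the numbers here are only a demonstration:

```python
import numpy as np

def char_speeds(M, beta2=1.0, c=1.0):
    # 1-D Euler in symmetrizing variables: acoustic block [[u, c], [c, u]]
    # coupled with the convective (entropy) speed u.
    u = M * c
    A = np.array([[u,   c,   0.0],
                  [c,   u,   0.0],
                  [0.0, 0.0, u]])
    P = np.diag([beta2, 1.0, 1.0])       # Turkel-type preconditioner
    lam = np.abs(np.linalg.eigvals(P @ A))
    return lam.max() / lam.min()         # characteristic condition number

M = 0.01
print(char_speeds(M))                    # unpreconditioned: ~(1+M)/M, stiff
print(char_speeds(M, beta2=M**2))        # preconditioned: O(1)
```

    At M = 0.01 the unpreconditioned speed ratio is about 101, while the preconditioned ratio stays near 2.6 independent of Mach number, which is exactly the "equalizing the speed of propagation" effect the abstract describes.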

  19. The Som Palaeobarometry Method: A Critical Analysis

    NASA Astrophysics Data System (ADS)

    Kavanagh, L.; Goldblatt, C.

    2013-12-01

    Som et al. (2012) created the first method to quantify ancient atmospheric density using lithified raindrop imprints. This method is based on a series of assumptions, all of which must hold for the calculation of air density to be valid. Crater area is assumed to act as a proxy for drop momentum, which is used to calculate terminal velocity by estimating a drop size. The drop size estimation assumes that the largest stable ancient drops are similar to those today and this limit is independent of air density. By breaking down the overall method into individual postulates, each was examined separately before assessing the process as a whole. These were evaluated using a combination of physical theory, a large dataset of modern rainfall characteristics, and an application of the Som et al. (2012) method to recent raindrop imprints. The relationship between imprint area and drop momentum was confirmed. However, maximum drop size appears to depend strongly on rainfall rate and weakly on atmospheric density. Thus the central assumption that the largest imprint area can be mapped directly to drop velocity is not supported. Our provisional conclusion is that the distribution of fossil raindrop imprints can likely lead to an estimate of past rainfall rate, but is unlikely to result in a valid calculation of ancient atmospheric pressure. Som, S. M., D. C. Catling, J. P. Harnmeijer, P. M. Polivka, and R. Buick (2012), Air Density 2.7 Billion Years Ago Limited to Less Than Twice Modern Levels by Fossil Raindrop Imprints, Nature, 484 (7394), 359-362, doi:10.1038/nature10890.
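    The physical link between imprint momentum and air density is straightforward: at terminal velocity, drag balances weight, so for a fixed drop size the impact momentum scales as 1/sqrt(rho_air). A sketch with an assumed constant drag coefficient (the value 0.6 and the 3.8 mm diameter are illustrative, not taken from the paper):

```python
import math

RHO_W = 1000.0   # water density, kg/m^3
G = 9.81         # gravitational acceleration, m/s^2
CD = 0.6         # assumed constant drag coefficient (a simplification)

def terminal_velocity(d, rho_air):
    # Weight = drag:  rho_w*g*(pi/6)*d^3 = (1/2)*cd*rho_air*v^2*(pi/4)*d^2
    # =>  v = sqrt(4*rho_w*g*d / (3*cd*rho_air))
    return math.sqrt(4.0 * RHO_W * G * d / (3.0 * CD * rho_air))

def drop_momentum(d, rho_air):
    mass = RHO_W * (math.pi / 6.0) * d**3
    return mass * terminal_velocity(d, rho_air)

d = 3.8e-3                          # a near-maximum modern drop diameter, m
p_modern = drop_momentum(d, 1.2)    # roughly modern sea-level air density
p_double = drop_momentum(d, 2.4)    # doubled air density
print(p_modern, p_double, p_modern / p_double)
```

    Doubling air density reduces the impact momentum by only a factor of sqrt(2), which is why the method needs the maximum drop size to be density-independent: any dependence of drop size on rainfall rate, as the authors find, easily swamps this weak density signal.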

  20. Continuous geodetic time-transfer analysis methods.

    PubMed

    Dach, Rolf; Schildknecht, Thomas; Hugentobler, Urs; Bernier, Laurent-Guy; Dudle, Gregor

    2006-07-01

    We address two issues that limit the quality of time and frequency transfer by carrier phase measurements from the Global Positioning System (GPS). The first issue is related to inconsistencies between code and phase observations. We describe and classify several types of events that can cause inconsistencies and observe that some of them are related to the internal clock of the GPS receiver. Strategies to detect and overcome time-code inconsistencies have been developed and implemented into the Bernese GPS Software package. For the moment, only inconsistencies larger than the 20 ns code measurement noise level can be detected automatically. The second issue is related to discontinuities at the day boundaries that stem from the processing of the data in daily batches. Two new methods are discussed: clock handover and ambiguity stacking. The two approaches are tested on data obtained from a network of stations, and the results are compared with an independent time-transfer method. Both methods improve the stability of the transfer for short averaging times, but there is no benefit for averaging times longer than 8 days. We show that continuous solutions are sufficiently robust against modeling and preprocessing errors to prevent the solution from accumulating a permanent bias. PMID:16889332
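    Stability "for short averaging times" in time transfer is conventionally quantified with the Allan deviation. A minimal sketch computing it from simulated clock offsets sampled every 30 s (the noise model, white frequency noise, is illustrative; for it, ADEV should fall as tau^(-1/2)):

```python
import numpy as np

def allan_deviation(phase, tau0, m):
    """Non-overlapping Allan deviation from phase (time-offset) data.
    phase: time offsets x_i in seconds, sampled every tau0 seconds.
    m: averaging factor, so the averaging time is tau = m * tau0."""
    x = phase[::m]                   # decimate to spacing tau
    d2 = np.diff(x, n=2)             # second differences of the phase
    tau = m * tau0
    avar = np.mean(d2**2) / (2.0 * tau**2)
    return np.sqrt(avar)

rng = np.random.default_rng(3)
tau0 = 30.0                          # 30 s sampling interval
# White frequency noise: the phase is a random walk of frequency samples.
freq_noise = rng.normal(0.0, 1e-12, size=100000)
phase = np.cumsum(freq_noise) * tau0

adev1 = allan_deviation(phase, tau0, 1)
adev10 = allan_deviation(phase, tau0, 10)
print(adev1, adev10)
```

    For this noise type the ratio adev10/adev1 comes out near 1/sqrt(10); day-boundary discontinuities of the kind discussed in the abstract would instead show up as a bump in the deviation around one-day averaging times.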