Sample records for practical analysis method

  1. A Practical Guide to the Why-Because Analysis Method: Performing a Why-Because Analysis

    E-print Network

    Moeller, Ralf

    …by Hierarchical Task Analysis [Paul-Stüve 05]. The flow charts follow the IBM Flowcharting Techniques guide necessary to perform a WBA. The data to create this guide and the flow charts were determined…

  2. A Practical Method of Policy Analysis by Simulating Policy Options

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    This article focuses on a method of policy analysis that has evolved from the previous articles in this issue. The first section, "Toward a Theory of Educational Production," identifies concepts from science and achievement production to be incorporated into this policy analysis method. Building on Kuhn's (1970) discussion regarding paradigms, the…

  3. A Practical Method of Policy Analysis by Estimating Effect Size

    ERIC Educational Resources Information Center

    Phelps, James L.

    2011-01-01

    The previous articles on class size and other productivity research paint a complex and confusing picture of the relationship between policy variables and student achievement. Missing is a conceptual scheme capable of combining the seemingly unrelated research and dissimilar estimates of effect size into a unified structure for policy analysis and…

  4. A Topography Analysis Incorporated Optimization Method for the Selection and Placement of Best Management Practices

    PubMed Central

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, a topography analysis incorporated optimization method (TAIOM) is proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917
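
The abstract above pairs a watershed model with a genetic algorithm that searches for a cost-effective subset of BMP placements. As a rough illustration of that optimization step only (the sites, costs, reduction values, and budget below are all invented, and the real TAIOM couples the search to SWAT outputs and topography rather than a static table), a minimal genetic algorithm might look like:

```python
import random

random.seed(42)

# Toy data: per-site pollutant reduction and cost (hypothetical values).
REDUCTION = [30, 22, 41, 15, 27, 33, 18, 25]
COST      = [12, 10, 20,  6, 11, 16,  7,  9]
BUDGET = 50

def fitness(genome):
    """Total reduction achieved; an over-budget plan simply scores zero."""
    cost = sum(c for g, c in zip(genome, COST) if g)
    if cost > BUDGET:
        return 0
    return sum(r for g, r in zip(genome, REDUCTION) if g)

def evolve(pop_size=40, generations=60, p_mut=0.05):
    n = len(COST)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]          # keep the fitter half
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, n)      # one-point crossover
            child = a[:cut] + b[cut:]
            # bit-flip mutation
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Selection keeps the fitter half of the population; crossover and mutation generate the rest, and elitism guarantees the best feasible plan found so far is never lost.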

  5. A topography analysis incorporated optimization method for the selection and placement of best management practices.

    PubMed

    Shen, Zhenyao; Chen, Lei; Xu, Liang

    2013-01-01

    Best Management Practices (BMPs) are one of the most effective methods to control nonpoint source (NPS) pollution at a watershed scale. In this paper, a topography analysis incorporated optimization method (TAIOM) is proposed, which integrates topography analysis with cost-effective optimization. The surface status, slope and the type of land use were evaluated as inputs for the optimization engine. A genetic algorithm program was coded to obtain the final optimization. The TAIOM was validated in conjunction with the Soil and Water Assessment Tool (SWAT) in the Yulin watershed in Southwestern China. The results showed that the TAIOM was more cost-effective than traditional optimization methods. The distribution of selected BMPs throughout landscapes comprising relatively flat plains and gentle slopes suggests the need for a more operationally effective scheme, such as the TAIOM, to determine the practicability of BMPs before widespread adoption. The TAIOM developed in this study can easily be extended to other watersheds to help decision makers control NPS pollution. PMID:23349917

  6. A Practical Slack-time Analysis Method for DVS Real-time Scheduling Da-Ren Chen

    E-print Network

    Paris-Sud XI, Université de

    …dynamic voltage scaling can utilize slack times when adjusting voltage levels. … are considered as well. The proposed online approach can cooperate with many slack-time analysis methods based…

  7. A practical method for incorporating Real Options analysis into US federal benefit-cost analysis procedures

    E-print Network

    Rivey, Darren

    2007-01-01

    This research identifies how Real Options (RO) thinking might acceptably and effectively complement the current mandates for Benefit-Cost Analysis (BCA) defined by the Office of Management and Budget (OMB) in Circular A-94. ...

  8. A Practical Guide to Practice Analysis for Credentialing Examinations.

    ERIC Educational Resources Information Center

    Raymond, Mark R.

    2002-01-01

    Offers recommendations for the conduct of practice analysis (i.e., job analysis) concerning these issues: (1) selecting a method of practice analysis; (2) developing rating scales; (3) determining the content of test plans; (4) using multi-variate procedures for structuring test plans; and (5) determining topic weights for test plans. (SLD)

  9. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory--Determination of Trihalomethane Formation Potential, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Fram, Miranda S.; Bush, Noel

    2004-01-01

    An analytical method for the determination of the trihalomethane formation potential of water samples has been developed. The trihalomethane formation potential is measured by dosing samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine, and then analyzing the resulting trihalomethanes by purge and trap/gas chromatography equipped with an electron capture detector. Detailed explanations of the method and quality-control practices are provided. Method validation experiments showed that the trihalomethane formation potential varies as a function of time between sample collection and analysis, residual-free chlorine concentration, method of sample dilution, and the concentration of bromide in the sample.

  10. Analysis of the upper massif of the craniofacial with the radial method: practical use

    PubMed Central

    Lepich, Tomasz; Dąbek, Józefa; Stompel, Daniel; Gielecki, Jerzy S.

    2011-01-01

    Introduction: The analysis of the upper massif of the craniofacial (UMC) is widely used in many fields of science. The aim of the study was to create a high-resolution computer system, based on a digital information record and on vector graphics, that could enable dimension measuring and evaluation of craniofacial shape using the radial method. Material and methods: The study was carried out on 184 skulls, in a good state of preservation, from the early Middle Ages. The examined skulls were fixed into Molisson's craniostat in the authors' own modification, oriented in space towards the Frankfurt plane, and photographed in frontal norm with a digital camera. The parameters describing the planar and dimensional structure of the UMC and orbits were obtained through computer analysis of the recordings picturing the craniofacial structures, using software combining raster graphics with vector graphics. Results: Mean values for both orbits were compared separately for the male and female groups. In female skulls, the comparison of the left and right sides did not show statistically significant differences. In the male group, higher values were observed for the right side; only the circularity index presented higher values for the left side. Conclusions: Computer graphics, with the software used for analysing digital pictures of the UMC and orbits, increase the precision of measurements as well as the calculation possibilities. Recognition of the face in post mortem examination is crucial for those working on identification in anthropology and criminology laboratories. PMID:22291834

  11. Animal Disease Import Risk Analysis - a Review of Current Methods and Practice.

    PubMed

    Peeler, E J; Reese, R A; Thrush, M A

    2013-10-30

    The application of risk analysis to the spread of disease with international trade in animals and their products, that is, import risk analysis (IRA), has been largely driven by the Sanitary and Phytosanitary (SPS) agreement of the World Trade Organization (WTO). The degree to which the IRA standard established by the World Organization for Animal Health (OIE), and associated guidance, meets the needs of the SPS agreement is discussed. The use of scenario trees is the core modelling approach used to represent the steps necessary for the hazard to occur. There is scope to elaborate scenario trees for commodity IRA so that the quantity of hazard at each step is assessed, which is crucial to the likelihood of establishment. The dependence between exposure and establishment suggests that they should fall within the same subcomponent. IRA undertaken for trade reasons must include an assessment of consequences to meet SPS criteria, but guidance is sparse. The integration of epidemiological and economic modelling may open a path for better methods. Matrices have been used in qualitative IRA to combine estimates of entry and exposure, and consequences with likelihood, but this approach has flaws and better methods are needed. OIE IRA standards and guidance indicate that the volume of trade should be taken into account, but offer no detail. Some published qualitative IRAs have assumed current levels and patterns of trade without specifying the volume of trade, which constrains the use of IRA to determine mitigation measures (to reduce risk to an acceptable level) and whether the principle of equivalence, fundamental to the SPS agreement, has been observed. It is questionable whether qualitative IRA can meet all the criteria set out in the SPS agreement. Nevertheless, scope exists to elaborate the current standards and guidance, so they better serve the principle of science-based decision-making. PMID:24237667
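
The scenario-tree modelling discussed above chains conditional probabilities from entry through exposure to establishment, and the volume of trade enters by repeating that per-unit trial across all imported units. A toy quantitative sketch (every step name and probability here is invented for illustration, not taken from the OIE standard):

```python
from math import prod

# Hypothetical release -> exposure -> establishment pathway for one unit
# of a traded commodity. Each value is a conditional probability.
steps = {
    "infected_at_origin": 0.02,   # P(unit carries the hazard)
    "survives_processing": 0.30,  # P(hazard survives | present)
    "exposure": 0.10,             # P(susceptible host exposed | survival)
    "establishment": 0.50,        # P(disease establishes | exposure)
}

# Probability the full pathway completes for a single imported unit.
p_per_unit = prod(steps.values())

# Risk scales with trade volume: P(at least one establishment in n units),
# treating units as independent.
n_units = 10_000
p_any = 1 - (1 - p_per_unit) ** n_units
print(p_per_unit, round(p_any, 3))
```

Even a tiny per-unit probability yields a high annual risk at realistic trade volumes, which is why the abstract stresses that qualitative IRAs assuming unspecified trade volumes constrain the determination of mitigation measures.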

  12. Empowering Discourse: Discourse Analysis as Method and Practice in the Sociology Classroom

    ERIC Educational Resources Information Center

    Hjelm, Titus

    2013-01-01

    Collaborative learning and critical pedagogy are widely recognized as "empowering" pedagogies for higher education. Yet, the practical implementation of both has a mixed record. The question, then, is: How could collaborative and critical pedagogies be empowered themselves? This paper makes a primarily theoretical case for discourse…

  13. APA's Learning Objectives for Research Methods and Statistics in Practice: A Multimethod Analysis

    ERIC Educational Resources Information Center

    Tomcho, Thomas J.; Rice, Diana; Foels, Rob; Folmsbee, Leah; Vladescu, Jason; Lissman, Rachel; Matulewicz, Ryan; Bopp, Kara

    2009-01-01

    Research methods and statistics courses constitute a core undergraduate psychology requirement. We analyzed course syllabi and faculty self-reported coverage of both research methods and statistics course learning objectives to assess the concordance with APA's learning objectives (American Psychological Association, 2007). We obtained a sample of…

  14. Alternative method for wave propagation analysis within bounded linear media: conceptual and practical implications

    E-print Network

    Alberto Lencina; Beatriz Ruiz; Pablo Vaveliuk

    2006-07-02

    This paper uses an alternative approach to study monochromatic plane wave propagation within dielectric and conducting linear media with plane-parallel faces. This approach introduces the time-averaged Poynting vector modulus as the field variable. The conceptual implication of this formalism is that the nonequivalence between the time-averaged Poynting vector and the squared-field amplitude modulus is naturally manifested as a consequence of interface effects. Two practical implications are also considered: first, the exact transmittance is compared with that given by Beer's law, commonly employed in experiments; the discrepancy between them can be significant for certain material parameter values. Second, when the exact reflectance is studied for negative-permittivity slabs, it is shown that the high reflectance can be diminished if a small amount of absorption is present.

  15. Practical methods for incorporating summary time-to-event data into meta-analysis

    Microsoft Academic Search

    Jayne F Tierney; Lesley A Stewart; Davina Ghersi; Sarah Burdett; Matthew R Sydes

    2007-01-01

    BACKGROUND: In systematic reviews and meta-analyses, time-to-event outcomes are most appropriately analysed using hazard ratios (HRs). In the absence of individual patient data (IPD), methods are available to obtain HRs and/or associated statistics by carefully manipulating published or other summary data. Awareness and adoption of these methods is somewhat limited, perhaps because they are published in the statistical literature using…
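
For context, summary-data methods of this kind typically recover ln(HR) from the logrank observed-minus-expected event count (O - E) and its variance V, as ln(HR) = (O - E)/V with variance 1/V. A minimal sketch of that calculation and of fixed-effect pooling across trials (the numbers below are invented):

```python
from math import exp, sqrt

def hr_from_logrank(o_minus_e, var, z=1.96):
    """Hazard ratio and confidence interval from logrank O-E and variance."""
    ln_hr = o_minus_e / var
    se = 1 / sqrt(var)                       # SE of ln(HR) is 1/sqrt(V)
    return exp(ln_hr), exp(ln_hr - z * se), exp(ln_hr + z * se)

def pool(trials):
    """Fixed-effect pooled HR from per-trial (O-E, variance) pairs."""
    num = sum(oe for oe, _ in trials)        # sum of (O-E) across trials
    den = sum(v for _, v in trials)          # sum of variances
    return exp(num / den)

# One hypothetical trial: O-E = -5.2 with variance 12 favours the treatment.
hr, lo, hi = hr_from_logrank(-5.2, 12.0)
print(round(hr, 3), round(lo, 3), round(hi, 3))
```

Pooling sums the (O - E) terms and the variances before exponentiating, so trials with more events (larger V) automatically carry more weight.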

  16. A practical method of analysis of the current-voltage characteristics of solar cells

    Microsoft Academic Search

    J. P. Charles; M. Abdelkrim; Y. H. Muoy; P. Mialhe

    1981-01-01

    A numerical method suitable for use with a programmable calculator is developed for determining the current-voltage (I-V) equation parameters of a solar cell driven as a generator only. The exact magnitude of the photocurrent is found and the fill factor is deduced. High- and low-quality Si solar cells were studied under illumination from a 12 V lamp in AM1…
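
The I-V parameters in question come from the single-diode generator model I(V) = Iph - I0*(exp(V/nVT) - 1) (series and shunt resistance neglected here for simplicity). A toy sketch of extracting Isc, Voc and the fill factor from sampled points, using invented cell parameters rather than the paper's own procedure:

```python
from math import exp

# Single-diode generator model (hypothetical parameters, amps / volts):
IPH, I0, N_VT = 3.0, 1e-9, 0.05   # photocurrent, saturation current, n*kT/q

def current(v):
    """Ideal single-diode equation, series/shunt resistance neglected."""
    return IPH - I0 * (exp(v / N_VT) - 1)

# Sample the I-V curve in 1 mV steps and keep the generating quadrant.
vs = [i * 0.001 for i in range(0, 1200)]
points = [(v, current(v)) for v in vs if current(v) >= 0]

isc = current(0.0)                              # short-circuit current
voc = max(v for v, i in points)                 # last voltage with i >= 0
v_mp, i_mp = max(points, key=lambda p: p[0] * p[1])   # max-power point
ff = (v_mp * i_mp) / (voc * isc)                # fill factor
print(round(voc, 3), round(ff, 3))
```

The fill factor is the ratio of the maximum deliverable power to the Voc*Isc product; for an ideal diode with these parameters it lands a little above 0.8.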

  17. Thermographical analysis of the warmth of the hands during the practice of self-regulation method.

    PubMed

    Ikemi, A; Tomita, S; Hayashida, Y

    1988-01-01

    The changes in surface temperature of the hands, and the permeation of warmth in the hands, during the self-regulation method (SRM), a new method of self-control, were studied in 5 trained subjects and in 5 beginners. Temperature changes were measured by thermography. Results indicated that both beginners and trained subjects showed similar significant increases in maximum hand temperature during SRM; however, trained subjects showed significant increases during the control period as well. Further, both groups showed significant increases in the area of warmth permeation during SRM; again, trained subjects also showed a significant increase during the control period. There was a statistical difference between the two groups in the area of warmth permeation at both the end of the control period and the end of SRM. Thus, although beginners seem able to raise their hand temperatures to a similar degree as trained subjects, the area of warmth permeation is significantly greater in trained subjects than in beginners. These results indicate that SRM is relatively easy for beginners to master. From these results, the technique of SRM, psychophysiological aspects of SRM and of self-control methods in general, and some issues regarding their clinical applications are discussed. PMID:3072586

  18. Practical Thermal Evaluation Methods For HAC Fire Analysis In Type B Radioactive Material (RAM) Packages

    SciTech Connect

    Abramczyk, Glenn; Hensel, Stephen J; Gupta, Narendra K.

    2013-03-28

    Title 10 of the United States Code of Federal Regulations Part 71 for the Nuclear Regulatory Commission (10 CFR Part 71.73) requires that Type B radioactive material (RAM) packages satisfy certain Hypothetical Accident Conditions (HAC) thermal design requirements to ensure package safety during accidental fire conditions. Compliance with thermal design requirements can be met by prototype tests, by analyses only, or by a combination of tests and analyses. Normally, it is impractical to meet all the HAC requirements using tests alone, and purely analytical methods are too complex due to the multi-physics, non-linear nature of the fire event. Therefore, a combination of tests and thermal analysis methods using commercial heat transfer software is used to meet the necessary design requirements. The authors, along with their colleagues at Savannah River National Laboratory in Aiken, SC, USA, have successfully used this 'tests and analyses' approach in the design and certification of several United States DOE/NNSA certified packages, e.g. 9975, 9977, 9978, 9979, H1700, and the Bulk Tritium Shipping Package (BTSP). This paper describes these methods in the hope that Type B RAM package designers and analysts can use them for their applications.

  19. A Critical Analysis of SocINDEX and Sociological Abstracts Using an Evaluation Method for the Practicing Bibliographer

    ERIC Educational Resources Information Center

    Mellone, James T.

    2010-01-01

    This study provides a database evaluation method for the practicing bibliographer that is more than a brief review yet less than a controlled experiment. The author establishes evaluation criteria in the context of the bibliographic instruction provided to meet the research requirements of undergraduate sociology majors at Queens College, City…

  20. Insight into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    ERIC Educational Resources Information Center

    Christie, Christina A.; Fleischer, Dreolin Nesbitt

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004-2006). The authors chose this time span because it follows the scientifically based research (SBR)…

  1. Qualitative Approaches to Mixed Methods Practice

    ERIC Educational Resources Information Center

    Hesse-Biber, Sharlene

    2010-01-01

    This article discusses how methodological practices can shape and limit how mixed methods is practiced and makes visible the current methodological assumptions embedded in mixed methods practice that can shut down a range of social inquiry. The article argues that there is a "methodological orthodoxy" in how mixed methods is practiced that…

  2. Insight Into Evaluation Practice: A Content Analysis of Designs and Methods Used in Evaluation Studies Published in North American Evaluation-Focused Journals

    Microsoft Academic Search

    Christina A. Christie; Dreolin Nesbitt Fleischer

    2010-01-01

    To describe the recent practice of evaluation, specifically method and design choices, the authors performed a content analysis on 117 evaluation studies published in eight North American evaluation-focused journals for a 3-year period (2004—2006). The authors chose this time span because it follows the scientifically based research (SBR) movement, which prioritizes the use of randomized controlled trials (RCTs) to study

  3. Method of Analysis by the U.S. Geological Survey California District Sacramento Laboratory-- Determination of Dissolved Organic Carbon in Water by High Temperature Catalytic Oxidation, Method Validation, and Quality-Control Practices

    USGS Publications Warehouse

    Bird, Susan M.; Fram, Miranda S.; Crepeau, Kathryn L.

    2003-01-01

    An analytical method has been developed for the determination of dissolved organic carbon concentration in water samples. This report includes the results of the tests used to validate the method and the quality-control practices used for dissolved organic carbon analysis. Prior to analysis, water samples are filtered to remove suspended particulate matter. A Shimadzu TOC-5000A Total Organic Carbon Analyzer in the nonpurgeable organic carbon mode is used to analyze the samples by high temperature catalytic oxidation. The analysis usually is completed within 48 hours of sample collection. The laboratory reporting level is 0.22 milligrams per liter.

  4. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method.

    PubMed

    Ramadhar, Timothy R; Zheng, Shao Liang; Chen, Yu Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation is reported. The procedure for the synthesis of the zinc-based metal-organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  5. Analysis of rapidly synthesized guest-filled porous complexes with synchrotron radiation: practical guidelines for the crystalline sponge method

    PubMed Central

    Ramadhar, Timothy R.; Zheng, Shao-Liang; Chen, Yu-Sheng; Clardy, Jon

    2015-01-01

    A detailed set of synthetic and crystallographic guidelines for the crystalline sponge method based upon the analysis of expediently synthesized crystal sponges using third-generation synchrotron radiation is reported. The procedure for the synthesis of the zinc-based metal–organic framework used in initial crystal sponge reports has been modified to yield competent crystals in 3 days instead of 2 weeks. These crystal sponges were tested on some small molecules, with two being unexpectedly difficult cases for analysis with in-house diffractometers in regard to data quality and proper space-group determination. These issues were easily resolved by the use of synchrotron radiation using data-collection times of less than an hour. One of these guests induced a single-crystal-to-single-crystal transformation to create a larger unit cell with over 500 non-H atoms in the asymmetric unit. This led to a non-trivial refinement scenario that afforded the best Flack x absolute stereochemical determination parameter to date for these systems. The structures did not require the use of PLATON/SQUEEZE or other solvent-masking programs, and are the highest-quality crystalline sponge systems reported to date where the results are strongly supported by the data. A set of guidelines for the entire crystallographic process were developed through these studies. In particular, the refinement guidelines include strategies to refine the host framework, locate guests and determine occupancies, discussion of the proper use of geometric and anisotropic displacement parameter restraints and constraints, and whether to perform solvent squeezing/masking. The single-crystal-to-single-crystal transformation process for the crystal sponges is also discussed. 
The presented general guidelines will be invaluable for researchers interested in using the crystalline sponge method at in-house diffraction or synchrotron facilities, will facilitate the collection and analysis of reliable high-quality data, and will allow construction of chemically and physically sensible models for guest structural determination. PMID:25537388

  6. MAKING FORMAL METHODS PRACTICAL Marc Zimmerman, Mario Rodriguez, Benjamin Ingram,

    E-print Network

    Leveson, Nancy

    …and analysis that requires a large learning time. Contributing to this skepticism is the fact that some types … used in industrial practice for day-to-day software development. Several reasons may be hypothesized…

  7. The piezoelectric sorption technique: a practical method

    E-print Network

    Flipse, Eugene Charles

    1983-01-01

    The Piezoelectric Sorption Technique: A Practical Method. A Thesis by Eugene Charles Flipse, submitted to the Graduate College of Texas A&M University in partial fulfillment of the requirements for the degree of Master of Science, August 1983. … D. Holland (Head of Department). ABSTRACT: The Piezoelectric Sorption Technique, A Practical Method. (August 1983) Eugene Charles Flipse, B.A., Virginia Polytechnic Institute and State University. Chairman of Advisory Committee: Dr…

  8. Assessment of management in general practice: validation of a practice visit method.

    PubMed Central

    van den Hombergh, P; Grol, R; van den Hoogen, H J; van den Bosch, W J

    1998-01-01

    BACKGROUND: Practice management (PM) in general practice is as yet ill-defined; a systematic description of its domain, as well as a valid method to assess it, are necessary for research and assessment. AIM: To develop and validate a method to assess PM of general practitioners (GPs) and practices. METHOD: Relevant and potentially discriminating indicators were selected from a systematic framework of 2410 elements of PM to be used in an assessment method (VIP = visit instrument PM). The method was first tested in a pilot study and, after revision, was evaluated in order to select discriminating indicators and to determine validity of dimensions (factor and reliability analysis, linear regression). RESULTS: One hundred and ten GPs were assessed with the practice visit method using 249 indicators; 208 of these discriminated sufficiently at practice level or at GP level. Factor analysis resulted in 34 dimensions and in a taxonomy of PM. Dimensions and indicators showed marked variation between GPs and practices. Training practices scored higher on five dimensions; single-handed and dispensing practices scored lower on delegated tasks, but higher on accessibility and availability. CONCLUSION: A visit method to assess PM has been developed and its validity studied systematically. The taxonomy and dimensions of PM were in line with other classifications. Selection of a balanced number of useful and relevant indicators was nevertheless difficult. The dimensions could discriminate between groups of GPs and practices, establishing the value of the method for assessment. The VIP method could be an important contribution to the introduction of continuous quality improvement in the profession. PMID:10198481

  9. Thermographical Analysis of the Warmth of the Hands during the Practice of Self-Regulation Method (With 1 color plate)

    Microsoft Academic Search

    Akira Ikemi; Sayuri Tomita; Yoshiaki Hayashida

    1988-01-01

    The changes in surface temperatures of the hands, and the permeation of warmth in the hands during self-regulation method (SRM), a new method of self-control, were studied in 5 trained subjects and in 5 beginners. Temperature changes were measured by thermography. Results indicated that both beginners and trained subjects showed similar significant increases in maximum hand temperatures during SRM, however, …

  10. Practical limitations of epidemiologic methods.

    PubMed Central

    Lilienfeld, A M

    1983-01-01

    Epidemiologic methods can be categorized into demographic studies of mortality and morbidity and observational studies that are either retrospective or prospective. Some of the limitations of demographic studies are illustrated by a review of one specific mortality study showing possible relationship of nuclear fallout to leukemia. Problems of accuracy of diagnosis or causes of death on death certificates, estimates of population, migration from areas of study, and the issue of "ecological fallacy" are discussed. Retrospective studies have such problems as recall of previous environmental exposure, selection bias and survivor bias. In environmental epidemiology, prospective studies have been used. The problems associated with these studies are illustrated by reviewing some of the details of the study of effects of microwave radiation on embassy employees in Moscow. The study population had to be reconstructed, individuals had to be located and information on exposure status had to be obtained by questionnaire. The relatively small size of the exposed group permitted the detection of only fairly large relative risks. Despite these limitations, epidemiologic studies have been remarkably productive in elucidating etiological factors. They are necessary since "the proper study of man is man." PMID:6653534

  11. A Method for Optimizing Waste Management and Disposal Practices Using a Group-Based Uncertainty Model for the Analysis of Characterization Data - 13191

    SciTech Connect

    Simpson, A.; Clapham, M.; Lucero, R.; West, J. [Pajarito Scientific Corporation, 2976 Rodeo Park Drive East, Santa Fe, NM 87505 (United States)]

    2013-07-01

    It is a universal requirement for the characterization of radioactive waste that the consignor calculate and report a Total Measurement Uncertainty (TMU) value associated with each of the measured quantities, such as nuclide activity. For non-destructive assay systems, the TMU analysis is typically performed on an individual container basis. However, in many cases the waste consignor treats, transports, stores and disposes of containers in groups, for example by over-packing smaller containers into a larger container or emplacing containers into groups for final disposal. The current standard practice for container-group data analysis is usually to treat each container as independent and uncorrelated and to use a simple summation/averaging method (or in some cases summation of TMU in quadrature) to define the overall characteristics and associated uncertainty of the container group. In reality, many groups of containers are assayed on the same system, so there will be a large degree of co-dependence among the individual uncertainty elements. Many uncertainty terms may be significantly reduced when addressing issues such as source position and variability in matrix contents over large populations. The systematic terms encompass both inherently 'two-directional' random effects (e.g. variation of source position) and other terms that are 'one-directional', i.e. designed to account for potential sources of bias. An analysis has been performed with population groups on a variety of non-destructive assay platforms in order to define a quantitative mechanism for waste consignors to determine overall TMU for batches of containers that have been assayed on the same system. (authors)
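
The distinction the abstract draws, that uncorrelated (random) terms add in quadrature while fully correlated (systematic) terms add linearly across a container group, can be illustrated with a toy calculation (all activities and uncertainty values below are invented):

```python
from math import sqrt

# Per-container activity (hypothetical Bq) with independent random
# uncertainties, plus a calibration uncertainty fraction that is fully
# correlated across the group because all containers share one assay system.
activities   = [120.0, 85.0, 140.0, 60.0]
random_uncs  = [12.0, 9.0, 15.0, 7.0]     # independent: add in quadrature
SYS_FRACTION = 0.08                        # correlated: scales the total

total = sum(activities)
u_random = sqrt(sum(u * u for u in random_uncs))  # quadrature sum
u_sys = SYS_FRACTION * total                      # linear (correlated) sum
tmu = sqrt(u_random**2 + u_sys**2)                # combine the two classes
print(round(total, 1), round(tmu, 2))
```

Treating the systematic term as if it were independent per container would shrink it by roughly the square root of the group size, understating the group TMU, which is why correlated terms are kept out of the quadrature sum.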

  12. The LMDI approach to decomposition analysis: a practical guide

    Microsoft Academic Search

    B. W. Ang

    2005-01-01

    In a recent study, Ang (Energy Policy 32 (2004)) compared various index decomposition analysis methods and concluded that the logarithmic mean Divisia index method is the preferred method. Since the literature on the method tends to be either too technical or specific for most potential users, this paper provides a practical guide that includes the general formulation process, summary tables
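
    As a sketch of what the logarithmic mean Divisia index method computes (the sector numbers below are invented for illustration), the additive LMDI-I effect of each factor uses the logarithmic mean as a weight, and the factor effects sum exactly to the total change in the aggregate:

    ```python
    import math

    def logmean(a, b):
        """Logarithmic mean L(a, b) = (a - b) / (ln a - ln b), with L(a, a) = a."""
        return a if a == b else (a - b) / (math.log(a) - math.log(b))

    def lmdi_additive(v0, vT, f0, fT):
        """Additive LMDI-I effect of one factor f on the aggregate V = sum_i V_i.
        v0, vT: sub-category values in periods 0 and T; f0, fT: the factor's values."""
        return sum(logmean(vT[i], v0[i]) * math.log(fT[i] / f0[i])
                   for i in range(len(v0)))

    # Two sectors: energy E_i = activity Q_i * intensity I_i (illustrative numbers)
    Q0, QT = [100.0, 50.0], [120.0, 60.0]
    I0, IT = [2.0, 4.0], [1.8, 4.4]
    E0 = [q * i for q, i in zip(Q0, I0)]
    ET = [q * i for q, i in zip(QT, IT)]
    activity_effect = lmdi_additive(E0, ET, Q0, QT)
    intensity_effect = lmdi_additive(E0, ET, I0, IT)
    # LMDI is perfectly decomposable: the effects sum to the total change, no residual
    print(sum(ET) - sum(E0), activity_effect + intensity_effect)
    ```

    The absence of a residual term is the property that led Ang to prefer this method over other index decomposition approaches.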

  13. METHODS OF PLANKTON INVESTIGATION IN THEIR RELATION TO PRACTICAL PROBLEMS.

    E-print Network

    By JACOB REIGHARD. The plants and animals which inhabit a body of water are interdependent. In the final analysis all the fishes... it is customary to neglect the fixed plants along the shore and the animals that they harbor. That the plankton...

  14. Practical Considerations for Using Exploratory Factor Analysis in Educational Research

    ERIC Educational Resources Information Center

    Beavers, Amy S.; Lounsbury, John W.; Richards, Jennifer K.; Huck, Schuyler W.; Skolits, Gary J.; Esquivel, Shelley L.

    2013-01-01

    The uses and methodology of factor analysis are widely debated and discussed, especially the issues of rotational use, methods of confirmatory factor analysis, and adequate sample size. The variety of perspectives and often conflicting opinions can lead to confusion among researchers about best practices for using factor analysis. The focus of the…

  15. Visionlearning: Research Methods: The Practice of Science

    NSDL National Science Digital Library

    Anthony Carpi

    This instructional module introduces four types of research methods: experimentation, description, comparison, and modeling. It was developed to help learners understand that the classic definition of the "scientific method" does not capture the dynamic nature of science investigation. As learners explore each methodology, they develop an understanding of why scientists use multiple methods to gather data and develop hypotheses. It is appropriate for introductory physics courses and for teachers seeking content support in research practices. Editor's Note: Secondary students often cling to the notion that scientific research follows a stock, standard "scientific method". They may be unaware of the differences between experimental research, correlative studies, observation, and computer-based modeling research. In this resource, they can glimpse each methodology in the context of a real study done by respected scientists. This resource is part of Visionlearning, an award-winning set of classroom-tested modules for science education.

  16. Practical Analysis of Nutritional Data

    NSDL National Science Digital Library

    This online textbook, created by faculty members at Tulane University, provides information on the statistical analysis of nutritional data. Techniques covered include data cleaning, descriptive statistics, histograms, graphics, scatterplots, outlier identification, regression and correlation, confounding, and interactions. Each chapter includes exercises with real data and self-tests to be used with SPSS. Additionally, the site contains information on using SPSS for statistical testing, the basics of Epi-info, and the basics of Stat Analysis. The programs require Windows 95 or later.

  17. Achieving integration in mixed methods designs-principles and practices.

    PubMed

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  18. Achieving Integration in Mixed Methods Designs—Principles and Practices

    PubMed Central

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  19. A Practical Analysis of Clustering Strategies for Hierarchical Radiosity

    Microsoft Academic Search

    Jean-marc Hasenfratz; Cyrille Damez; François X. Sillion; George Drettakis

    1999-01-01

    The calculation of radiant energy balance in complex scenes has been made possible by hierarchical radiosity methods based on clustering mechanisms. Although clustering offers an elegant theoretical solution by reducing the asymptotic complexity of the algorithm, its practical use raises many difficulties, and may result in image artifacts or unexpected behavior. This paper proposes a detailed analysis of the expectations

  20. Exploratory and Confirmatory Analysis of the Trauma Practices Questionnaire

    ERIC Educational Resources Information Center

    Craig, Carlton D.; Sprang, Ginny

    2009-01-01

    Objective: The present study provides psychometric data for the Trauma Practices Questionnaire (TPQ). Method: A nationally randomized sample of 2,400 surveys was sent to self-identified trauma treatment specialists, and 711 (29.6%) were returned. Results: An exploratory factor analysis (N = 319) conducted on a randomly split sample (RSS) revealed…

  1. Method of analysis and quality-assurance practices by the U. S. Geological Survey Organic Geochemistry Research Group; determination of four selected mosquito insecticides and a synergist in water using liquid-liquid extraction and gas chrom

    USGS Publications Warehouse

    Zimmerman, L.R.; Strahan, A.P.; Thurman, E.M.

    2001-01-01

    A method of analysis and quality-assurance practices were developed for the determination of four mosquito insecticides (malathion, methoprene, phenothrin, and resmethrin) and one synergist (piperonyl butoxide) in water. The analytical method uses liquid-liquid extraction (LLE) and gas chromatography/mass spectrometry (GC/MS). Good precision and accuracy were demonstrated in reagent water, urban surface water, and ground water. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 50 nanograms per liter ranged from 68 to 171 percent, with standard deviations in concentrations of 27 nanograms per liter or less. The method detection limit for all compounds was 5.9 nanograms per liter or less for 247-milliliter samples. This method is valuable for acquiring information about the fate and transport of these mosquito insecticides and one synergist in water.
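
    The abstract reports method detection limits without showing the computation. A common convention (the EPA procedure, which may or may not be the one used in this study) estimates the MDL as the one-tailed 99% Student's t value times the standard deviation of low-level spike replicates; the replicate values below are invented for illustration.

    ```python
    import statistics

    # Seven low-level spike replicates, ng/L (hypothetical values)
    replicates = [5.1, 4.6, 5.8, 5.3, 4.9, 5.5, 5.0]

    # One-tailed 99% Student's t for n - 1 = 6 degrees of freedom
    t_99_df6 = 3.143

    # EPA-style method detection limit: MDL = t * s
    mdl = t_99_df6 * statistics.stdev(replicates)
    print(round(mdl, 2))  # ng/L
    ```

    With seven replicates the MDL lands a little above one standard deviation's worth of spread times three, which is why spike levels are chosen only slightly above the expected detection limit.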

  2. Comprehensive rotorcraft analysis methods

    NASA Technical Reports Server (NTRS)

    Stephens, Wendell B.; Austin, Edward E.

    1988-01-01

    The development and application of comprehensive rotorcraft analysis methods in the field of rotorcraft technology are described. These large scale analyses and the resulting computer programs are intended to treat the complex aeromechanical phenomena that describe the behavior of rotorcraft. They may be used to predict rotor aerodynamics, acoustic, performance, stability and control, handling qualities, loads and vibrations, structures, dynamics, and aeroelastic stability characteristics for a variety of applications including research, preliminary and detail design, and evaluation and treatment of field problems. The principal comprehensive methods developed or under development in recent years and generally available to the rotorcraft community because of US Army Aviation Research and Technology Activity (ARTA) sponsorship of all or part of the software systems are the Rotorcraft Flight Simulation (C81), Dynamic System Coupler (DYSCO), Coupled Rotor/Airframe Vibration Analysis Program (SIMVIB), Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics (CAMRAD), General Rotorcraft Aeromechanical Stability Program (GRASP), and Second Generation Comprehensive Helicopter Analysis System (2GCHAS).

  3. Science Teaching Methods: A Rationale for Practices

    ERIC Educational Resources Information Center

    Osborne, Jonathan

    2011-01-01

    This article is a version of the talk given by Jonathan Osborne as the Association for Science Education (ASE) invited lecturer at the National Science Teachers' Association Annual Convention in San Francisco, USA, in April 2011. The article provides an explanatory justification for teaching about the practices of science in school science that…

  4. Towards Practical User Experience Evaluation Methods

    Microsoft Academic Search

    Kaisa Väänänen-Vainio-Mattila; Virpi Roto; Marc Hassenzahl

    2008-01-01

    In the last decade, User eXperience (UX) research in the academic community has produced a multitude of UX models and frameworks. These models address the key issues of UX: its subjective, highly situated and dynamic nature, as well as the pragmatic and hedonic factors leading to UX. At the same time, industry is adopting the UX term but the practices

  5. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  6. Practical Applications of Simple Cosserat Methods

    Microsoft Academic Search

    David A. Burton; Robin W. Tucker

    Motivated by the need to construct models of slender elastic media that are versatile enough to accommodate non-linear phenomena under dynamical evolution, an overview is presented of recent practical applications of simple Cosserat theory. This theory offers a methodology for modeling non-linear continua that is physically accurate and amenable to controlled numerical approximation. By contrast to linear models, where non-linearities

  7. Standard practice for digital imaging and communication nondestructive evaluation (DICONDE) for computed radiography (CR) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of computed radiography (CR) imaging and data acquisition equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This practice is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information objec...

  8. APPLICATIONS OF RESAMPLING METHODS IN ACTUARIAL PRACTICE

    E-print Network

    Ostaszewski, Krzysztof M.

    ... OSTASZEWSKI, AND GRZEGORZ A. REMPALA. Abstract: Actuarial analysis can be viewed as the process of studying the profitability and solvency of an insurance firm under a realistic and integrated model of key input random... In modern analysis of the financial models of property-casualty companies the input variables can

  9. Practical Experience Applying Formal Methods to Air Traffic Management Software

    E-print Network

    Andrews, Jamie

    Richard Yates, Jamie Andrews, Phil Gray. Abstract: This paper relates experiences with formal methods that are relevant... There is a need to experiment with these new methods, and to communicate

  10. [Kaplan-Meier analysis in urological practice].

    PubMed

    Rink, M; Kluth, L A; Shariat, S F; Fisch, M; Dahlem, R; Dahm, P

    2013-06-01

    Kaplan-Meier curves and estimates of survival are the most common way to present survival data in medicine, especially for cohorts with different lengths of follow-up. Moreover, Kaplan-Meier curves present a frequently used general graphic approach to display time-to-event outcomes. A solid understanding of how Kaplan-Meier curves are generated and how they should be analyzed and interpreted is of great importance for appraising the urological literature in daily clinical practice. This article describes the basic principles of Kaplan-Meier analysis, possible variants, and pitfalls. Improved knowledge of Kaplan-Meier analysis can help to improve evidence-based urology and its application in patient care. PMID:23703691
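
    A minimal illustration of how the product-limit estimate behind a Kaplan-Meier curve is generated (the patient times are invented; this is not code from the article): at each distinct event time the survival estimate is multiplied by (1 - deaths/at-risk), while censored observations leave the curve unchanged but still shrink the risk set.

    ```python
    def kaplan_meier(times, events):
        """Product-limit survival estimate.
        times: time to event or censoring; events: 1 = event, 0 = censored."""
        s = 1.0
        curve = []
        for t in sorted(set(times)):
            d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
            n = sum(1 for ti in times if ti >= t)  # patients still at risk at t
            if d:
                s *= 1.0 - d / n
                curve.append((t, s))
        return curve

    # Six patients; 0 marks a censored follow-up (lost before an event occurred)
    times  = [3, 5, 5, 8, 10, 12]
    events = [1, 1, 0, 1, 0, 1]
    for t, s in kaplan_meier(times, events):
        print(t, round(s, 3))
    ```

    Note how the censored patient at time 5 does not produce a step but does reduce the denominator at time 8, which is exactly the different-lengths-of-follow-up situation the article addresses.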

  11. Critical practice in nursing care: analysis, action and reflexivity.

    PubMed

    Timmins, F

    This article examines critical practice and its underlying principles: analysis, action and reflexivity. Critical analysis involves the examination of knowledge that underpins practice. Critical action requires nurses to assess their skills and identify potential gaps in need of professional development. Critical reflexivity is personal analysis that involves challenging personal beliefs and assumptions to improve professional and personal practice. Incorporating these aspects into nursing can benefit nursing practice. PMID:16786927

  12. Dream Interpretation: Practical Methods for Pastoral Care and Counseling

    Microsoft Academic Search

    Kelly Bulkeley

    2000-01-01

    This article outlines a simple, straightforward set of practical dream interpretation methods which pastoral caregivers can use any time a client's dream becomes relevant to the counseling situation. The article describes seven elements to the successful interpretation of a dream, offers a brief illustration of how dream interpretation works in practice, and explains how to test the validity of an

  13. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1929-01-01

    This report on a method of analysis of aircraft accidents has been prepared by a special committee on the nomenclature, subdivision, and classification of aircraft accidents organized by the National Advisory Committee for Aeronautics in response to a request dated February 18, 1928, from the Air Coordination Committee, consisting of the Assistant Secretaries for Aeronautics in the Departments of War, Navy, and Commerce. The work was undertaken in recognition of the difficulty of drawing correct conclusions from efforts to analyze and compare reports of aircraft accidents prepared by different organizations using different classifications and definitions. The Air Coordination Committee's request was made "in order that practices used may henceforth conform to a standard and be universally comparable." The purpose of the special committee therefore was to prepare a basis for the classification and comparison of aircraft accidents, both civil and military. (author)

  14. Design Methods for Young Sustainable Architecture Practice

    Microsoft Academic Search

    D. Jauslin; H. Drexler; F. Curiel

    2012-01-01

    This paper introduces landscape aesthetics as an innovative design method for sustainable architecture. It is based on the framework of a recent paper in which the young, little-known authors criticized three of the most prominent architects of today with regard to sustainable architecture and its aesthetics. Leading architects expressed their skepticism as to whether there is such a thing as aesthetics

  15. Deepen the GIS spatial analysis theory studying through the gradual process of practice

    NASA Astrophysics Data System (ADS)

    Yi, Y. G.; Liu, H. P.; Liu, X. P.

    2014-04-01

    Spatial analysis is a core topic of the basic GIS theory course. This paper discusses the importance of practice teaching for the study of GIS spatial analysis theory, and its implementation method, drawing on the arrangement of spatial analysis practice teaching in the course "GIS theory and practice" and on the basic principles of procedural teaching theory and its teaching model. In addition, the concrete, gradual practice process is described in four aspects. In this way, the study of GIS spatial analysis theory can be deepened and the cultivation of students' comprehensive ability in Geographic Science can be strengthened.

  16. A Practical Guide to Wavelet Analysis

    NSDL National Science Digital Library

    Researchers dealing with time series data will find this powerful resource extremely helpful. Drs. Christopher Torrence (National Center for Atmospheric Research) and Gilbert Compo (NOAA/ CIRES Climate Diagnostics Center) have put together this Website for researchers interested in using wavelet analysis, a technique that decomposes a time series into time-frequency space. The site provides information on "both the amplitude of any periodic signals within the series, and how this amplitude varies with time." The nicely written introductory section (Wavelet Analysis & Monte Carlo) is complete with algorithms, graphically illustrated examples, and references (including some links). First time users may wish to consult the on-site article "A Practical Guide to Wavelet Analysis," originally published in 1998 (.pdf format), or browse the FAQ section. The heart of the site is the Interactive Wavelet Plots section; here, users may experiment with wavelet analysis using time series data provided at the site (i.e., Sea Surface Temperature, Sunspots) or provided by the user. As if that weren't enough, the site also offers free Wavelet software (Fortran, IDL, or Matlab; acknowledgment required) and several abbreviated data sets for experimentation.
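
    Following the frequency-domain construction described in the Torrence and Compo guide, a minimal Morlet continuous wavelet transform can be sketched with NumPy (normalization per their 1998 paper; edge effects are ignored here, whereas the guide discusses zero-padding and the cone of influence):

    ```python
    import numpy as np

    def morlet_cwt(x, dt, scales, w0=6.0):
        """Continuous wavelet transform with a Morlet mother wavelet,
        computed in the frequency domain (after Torrence & Compo, 1998)."""
        n = len(x)
        omega = 2 * np.pi * np.fft.fftfreq(n, dt)   # angular frequencies
        xf = np.fft.fft(x)
        out = np.empty((len(scales), n), dtype=complex)
        for i, s in enumerate(scales):
            # Fourier transform of the normalized Morlet wavelet at scale s
            psi = (np.pi ** -0.25) * np.sqrt(2 * np.pi * s / dt) \
                  * np.exp(-((s * omega - w0) ** 2) / 2) * (omega > 0)
            out[i] = np.fft.ifft(xf * np.conj(psi))
        return out

    # A sine with a 50-sample period: wavelet power should peak near the
    # matching scale (Fourier period ~ 1.03 * scale for w0 = 6)
    t = np.arange(1024)
    x = np.sin(2 * np.pi * t / 50)
    scales = np.arange(2, 100, dtype=float)
    power = np.abs(morlet_cwt(x, 1.0, scales)) ** 2
    best = scales[power.mean(axis=1).argmax()]
    ```

    The `|W|^2` array is exactly the "amplitude of any periodic signals within the series, and how this amplitude varies with time" that the site describes; the free Fortran/IDL/Matlab software offered there implements the same transform with the significance testing added.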

  17. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for ultrasonic test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2008-01-01

    1.1 This practice facilitates the interoperability of ultrasonic imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E 2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E 2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, transfer and archival storage. The goal of Practice E 2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E 2339 provides a data dictionary and set of information modules that are applicable to all NDE modalities. This practice supplements Practice E 2339 by providing information object definitions, information ...

  18. Reflections on Experiential Teaching Methods: Linking the Classroom to Practice

    ERIC Educational Resources Information Center

    Wehbi, Samantha

    2011-01-01

    This article explores the use of experiential teaching methods in social work education. The literature demonstrates that relying on experiential teaching methods in the classroom can have overwhelmingly positive learning outcomes; however, not much is known about the possible effect of these classroom methods on practice. On the basis of…

  19. Multi-criteria decision analysis: Limitations, pitfalls, and practical difficulties

    SciTech Connect

    Kujawski, Edouard

    2003-02-01

    The 2002 Winter Olympics women's figure skating competition is used as a case study to illustrate some of the limitations, pitfalls, and practical difficulties of Multi-Criteria Decision Analysis (MCDA). The paper compares several widely used models for synthesizing the multiple attributes into a single aggregate value. The various MCDA models can provide conflicting rankings of the alternatives for a common set of information even under states of certainty. Analysts involved in MCDA need to deal with the following challenging tasks: (1) selecting an appropriate analysis method, and (2) properly interpreting the results. An additional trap is the availability of software tools that implement specific MCDA models that can beguile the user with quantitative scores. These conclusions are independent of the decision domain and they should help foster better MCDA practices in many fields including systems engineering trade studies.
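
    The claim that different aggregation models can rank the same alternatives differently, even under certainty, is easy to demonstrate. The scores and weights below are invented: a weighted sum rewards alternative A's high score on the first criterion, while a weighted product (a different common MCDA model) penalizes its weakness on the second.

    ```python
    # Two common MCDA aggregation rules applied to the same scores and weights
    scores = {          # criterion scores on a 0-10 scale (illustrative)
        "A": [10.0, 2.0],
        "B": [6.0, 5.0],
    }
    weights = [0.5, 0.5]

    def weighted_sum(vals):
        return sum(w * v for w, v in zip(weights, vals))

    def weighted_product(vals):
        p = 1.0
        for w, v in zip(weights, vals):
            p *= v ** w
        return p

    rank_sum = max(scores, key=lambda k: weighted_sum(scores[k]))
    rank_prod = max(scores, key=lambda k: weighted_product(scores[k]))
    print(rank_sum, rank_prod)  # → A B
    ```

    Identical inputs, two defensible models, two different winners: this is the interpretation trap the paper warns about when software hands the user a single quantitative score.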

  20. Practical Issues in Component Aging Analysis

    SciTech Connect

    Dana L. Kelly; Andrei Rodionov; Jens Uwe-Klugel

    2008-09-01

    This paper examines practical issues in the statistical analysis of component aging data. These issues center on the stochastic process chosen to model component failures. The two stochastic processes examined are repair-same-as-new, leading to a renewal process, and repair-same-as-old, leading to a nonhomogeneous Poisson process. Under the first assumption, times between failures can be treated as statistically independent observations from a stationary process. The common distribution of the times between failures is called the renewal distribution. Under the second assumption, the times between failures will not be independently and identically distributed, and one cannot simply fit a renewal distribution to the cumulative failure times or the times between failures. The paper illustrates how the assumption made regarding the repair process is crucial to the analysis. Besides the choice of stochastic process, other issues discussed include qualitative graphical analysis and simple nonparametric hypothesis tests to help judge which process appears more appropriate. Numerical examples are presented to illustrate the issues discussed in the paper.
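
    One simple nonparametric trend test of the kind alluded to is the Laplace test, sketched below with invented failure times (the paper does not specify which tests it uses): a statistic near zero is consistent with a stationary, renewal-like process, while a large positive value indicates that times between failures are shrinking, suggesting a nonhomogeneous Poisson "repair same as old" model.

    ```python
    import math

    def laplace_trend_statistic(failure_times, T):
        """Laplace test for trend in a point process observed on (0, T].
        U = (mean(t_i) - T/2) / (T * sqrt(1 / (12 n))); approximately
        standard normal under the no-trend (stationary) hypothesis."""
        n = len(failure_times)
        return (sum(failure_times) / n - T / 2) / (T * math.sqrt(1.0 / (12 * n)))

    # Failures clustered late in the observation window -> positive statistic,
    # i.e. evidence of aging (hypothetical data)
    aging = [400, 600, 700, 760, 800, 830, 850, 860]
    u = laplace_trend_statistic(aging, 900)
    print(round(u, 2))  # ≈ 2.99, well above the ~1.64 one-sided 5% threshold
    ```

    A value this large argues against fitting a single renewal distribution to the times between failures, which is exactly the mistake the paper cautions against.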

  1. Methods of analysis and quality-assurance practices of the U.S. Geological Survey organic laboratory, Sacramento, California; determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Domagalski, Joseph L.; Kuivila, Kathryn M.

    1994-01-01

    Analytical method and quality-assurance practices were developed for a study of the fate and transport of pesticides in the Sacramento-San Joaquin Delta and the Sacramento and San Joaquin Rivers. Water samples were filtered to remove suspended particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide, and the pesticides were eluted with three 2-milliliter aliquots of hexane:diethyl ether (1:1). The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for analytes determined per 1,500-milliliter samples ranged from 0.006 to 0.047 microgram per liter. Recoveries ranged from 47 to 89 percent for 12 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.05 and 0.26 microgram per liter. The method was modified to improve the pesticide recovery by reducing the sample volume to 1,000 milliliters. Internal standards were added to improve quantitative precision and accuracy. The analysis also was expanded to include a total of 21 pesticides. The method detection limits for 1,000-milliliter samples ranged from 0.022 to 0.129 microgram per liter. Recoveries ranged from 38 to 128 percent for 21 pesticides in organic-free, Sacramento River and San Joaquin River water samples fortified at 0.10 and 0.75 microgram per liter.

  2. Practice and Evaluation of Blended Learning with Cross-Cultural Distance Learning in a Foreign Language Class: Using Mix Methods Data Analysis

    ERIC Educational Resources Information Center

    Sugie, Satoko; Mitsugi, Makoto

    2014-01-01

    The Information and Communication Technology (ICT) utilization in Chinese as a "second" foreign language has mainly been focused on Learning Management System (LMS), digital material development, and quantitative analysis of learners' grammatical knowledge. There has been little research that has analyzed the effectiveness of…

  3. Towards a practical parallelisation of the simplex method

    Microsoft Academic Search

    J. A. J. Hall

    2010-01-01

    The simplex method is frequently the most efficient method of solving linear programming (LP) problems. This paper reviews previous attempts to parallelise the simplex method in relation to efficient serial simplex techniques and the nature of practical LP problems. For the major challenge of solving general large sparse LP problems, there has been no parallelisation of the simplex method that

  4. Social Network Analysis as an Analytic Tool for Interaction Patterns in Primary Care Practices

    PubMed Central

    Scott, John; Tallia, Alfred; Crosson, Jesse C.; Orzano, A. John; Stroebel, Christine; DiCicco-Bloom, Barbara; O’Malley, Dena; Shaw, Eric; Crabtree, Benjamin

    2005-01-01

    PURPOSE Social network analysis (SNA) provides a way of quantitatively analyzing relationships among people or other information-processing agents. Using 2 practices as illustrations, we describe how SNA can be used to characterize and compare communication patterns in primary care practices. METHODS Based on data from ethnographic field notes, we constructed matrices identifying how practice members interact when practice-level decisions are made. SNA software (UCINet and KrackPlot) calculates quantitative measures of network structure including density, centralization, hierarchy and clustering coefficient. The software also generates a visual representation of networks through network diagrams. RESULTS The 2 examples show clear distinctions between practices for all the SNA measures. Potential uses of these measures for analysis of primary care practices are described. CONCLUSIONS SNA can be useful for quantitative analysis of interaction patterns that can distinguish differences among primary care practices. PMID:16189061
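
    Two of the named measures, density and degree centralization, are straightforward to compute without the UCINet software; the tiny practice network below is invented for illustration (role abbreviations are hypothetical, e.g. MD, RN).

    ```python
    def density(nodes, edges):
        """Undirected network density: observed ties / possible ties."""
        n = len(nodes)
        return 2 * len(edges) / (n * (n - 1))

    def degree_centralization(nodes, edges):
        """Freeman degree centralization for an undirected network:
        sum of (max degree - degree) over nodes, normalized by the
        maximum possible sum, attained by a star network."""
        deg = {v: 0 for v in nodes}
        for a, b in edges:
            deg[a] += 1
            deg[b] += 1
        dmax = max(deg.values())
        n = len(nodes)
        return sum(dmax - d for d in deg.values()) / ((n - 1) * (n - 2))

    # A star-shaped practice: every decision flows through the lead physician
    nodes = ["MD", "RN", "MA", "FD", "NP"]
    edges = [("MD", "RN"), ("MD", "MA"), ("MD", "FD"), ("MD", "NP")]
    print(density(nodes, edges))                 # 0.4
    print(degree_centralization(nodes, edges))   # 1.0 for a perfect star
    ```

    A hierarchical practice of this shape maximizes centralization at low density, while a practice where everyone interacts with everyone would score 1.0 on density and 0.0 on centralization; comparing the two numbers is what lets SNA distinguish communication patterns across practices.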

  5. Between practice and theory: Melanie Klein, Anna Freud and the development of child analysis.

    PubMed

    Donaldson, G

    1996-04-01

    An examination of the early history of child analysis in the writings of Melanie Klein and Anna Freud reveals how two different and opposing approaches to child analysis arose at the same time. The two methods of child analysis are rooted in a differential emphasis on psychoanalytic theory and practice. The Kleinian method derives from the application of technique while the Anna Freudian method is driven by theory. Furthermore, by holding to the Freudian theory of child development Anna Freud was forced to limit the scope of child analysis, while Klein's application of Freudian practice has led to new discoveries about the development of the infant psyche. PMID:8642183

  6. Practicing the practice: Learning to guide elementary science discussions in a practice-oriented science methods course

    NASA Astrophysics Data System (ADS)

    Shah, Ashima Mathur

    University methods courses are often criticized for telling pre-service teachers, or interns, about the theories behind teaching instead of preparing them to actually enact teaching. Shifting teacher education to be more "practice-oriented," or to focus more explicitly on the work of teaching, is a current trend for re-designing the way we prepare teachers. This dissertation addresses the current need for research that unpacks the shift to more practice-oriented approaches by studying the content and pedagogical approaches in a practice-oriented, masters-level elementary science methods course (n=42 interns). The course focused on preparing interns to guide science classroom discussions. Qualitative data, such as video records of course activities and interns' written reflections, were collected across eight course sessions. Codes were applied at the sentence and paragraph level and then grouped into themes. Five content themes were identified: foregrounding student ideas and questions, steering discussion toward intended learning goals, supporting students to do the cognitive work, enacting teacher role of facilitator, and creating a classroom culture for science discussions. Three pedagogical approach themes were identified. First, the teacher educators created images of science discussions by modeling and showing videos of this practice. They also provided focused teaching experiences by helping interns practice the interactive aspects of teaching both in the methods classroom and with smaller groups of elementary students in schools. Finally, they structured the planning and debriefing phases of teaching so interns could learn from their teaching experiences and prepare well for future experiences. The findings were analyzed through the lens of Grossman and colleagues' framework for teaching practice (2009) to reveal how the pedagogical approaches decomposed, represented, and approximated practice throughout course activities. 
Also, the teacher educators' purposeful use of both pedagogies of investigation (to study teaching) and pedagogies of enactment (to practice enacting teaching) was uncovered. This work provides insights for the design of courses that prepare interns to translate theories about teaching into the interactive work teachers actually do. Also, it contributes to building a common language for talking about the content of practice-oriented courses and for comparing the affordances and limitations of pedagogical approaches across teacher education settings.

  7. [A method for the implementation and promotion of access to comprehensive and complementary primary healthcare practices].

    PubMed

    Santos, Melissa Costa; Tesser, Charles Dalcanale

    2012-11-01

The rendering of integrated and complementary practices in the Brazilian Unified Health System is fostered to increase the comprehensiveness of care and access to it, though incorporating them into the services remains a challenge. Our objective is to provide a simple method for implementing such practices in Primary Healthcare, derived from analysis of experiences in municipalities, using partial results of a master's thesis that employed research-action methodology. The method involves four stages: 1 - definition of a nucleus responsible for implementation and consolidation; 2 - situational analysis, with identification of the existing competent professionals; 3 - regulation, organization of access and legitimation; and 4 - implementation cycle: local plans, mentoring and ongoing education in health. The phases are described, justified and briefly discussed. The method encourages the development of rational and sustainable actions, fosters participatory management and comprehensiveness, and broadens the care provided in Primary Healthcare by offering progressive and sustainable comprehensive and complementary practices. PMID:23175308

  8. Visualization Methods in Analysis of Detailed Chemical Kinetics Modelling

    Microsoft Academic Search

    Anders Broe Bendtsen; Peter Glarborg; Kim Dam-johansen

    2001-01-01

Sensitivity analysis, principal component analysis of the sensitivity matrix, and rate-of-production analysis are useful tools in interpreting detailed chemical kinetics calculations. This paper deals with the practical use and communication of sensitivity analysis, and the related methods are discussed. Some limitations of sensitivity analysis, originating from the mathematical concept (e.g. first-order or brute force methods) or from the software-specific
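
The brute-force (finite-difference) first-order sensitivity analysis mentioned in the abstract can be sketched as follows; the rate law, rate constants, and concentrations below are hypothetical illustrations, not taken from the paper:

```python
import numpy as np

def reaction_rate(k, conc):
    """Toy rate law r = k1*[A] - k2*[B] for a reversible step A <-> B."""
    return k[0] * conc[0] - k[1] * conc[1]

def brute_force_sensitivity(f, k, conc, rel_step=1e-6):
    """Normalized first-order sensitivities S_i = (k_i / r) * dr/dk_i,
    estimated by forward finite differences (the 'brute force' method).
    Assumes all rate constants are nonzero."""
    base = f(k, conc)
    sens = np.empty_like(k, dtype=float)
    for i in range(len(k)):
        kp = k.copy()
        dk = kp[i] * rel_step          # perturb one parameter at a time
        kp[i] += dk
        sens[i] = (f(kp, conc) - base) / dk * (k[i] / base)
    return sens

k = np.array([2.0, 0.5])      # hypothetical rate constants
conc = np.array([1.0, 0.4])   # hypothetical concentrations [A], [B]
print(brute_force_sensitivity(reaction_rate, k, conc))
```

Because each parameter requires one extra rate evaluation, this approach is simple but expensive for large mechanisms, which is one practical limitation the paper discusses.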

  9. Practical postcalibration uncertainty analysis: Yucca Mountain, Nevada.

    PubMed

    James, Scott C; Doherty, John E; Eddebbarh, Al-Aziz

    2009-01-01

    The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is "calibrated." Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the United States' proposed site for disposal of high-level radioactive waste. Linear and nonlinear uncertainty analyses are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction's uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This article applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. PMID:19744249
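
The linear analysis described above propagates parameter covariance through prediction sensitivities. A minimal sketch, under the simplifying assumption of a diagonal posterior covariance (all numbers are hypothetical, not from the Yucca Mountain model):

```python
import numpy as np

# Hypothetical posterior parameter covariance (diagonal => independent params)
C = np.diag([0.04, 0.25, 0.01])     # variances of 3 calibrated parameters
y = np.array([1.2, -0.3, 4.0])      # sensitivities dP/dp_i of the prediction P

var_pred = y @ C @ y                # linear propagation: sigma_P^2 = y^T C y
contrib = (y**2) * np.diag(C)       # per-parameter contribution (diagonal C)

print(var_pred, contrib / var_pred)
```

The per-parameter contributions are what identify which observations would most effectively reduce predictive uncertainty: parameters with large contributions are the ones worth constraining with new data.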

  10. [Good Practice of Secondary Data Analysis (GPS): Guidelines and Recommendations].

    PubMed

    Swart, E; Gothe, H; Geyer, S; Jaunzeme, J; Maier, B; Grobe, T G; Ihle, P

    2015-02-01

In 2005, the Working Group for the Survey and Utilisation of Secondary Data (AGENS) of the German Society for Social Medicine and Prevention (DGSMP) and the German Society for Epidemiology (DGEpi) first published "Good Practice in Secondary Data Analysis (GPS)", formulating a standard for conducting secondary data analyses. GPS is intended as a guide for planning and conducting analyses and can provide a basis for contracts between data owners. The scope of these guidelines includes not only data routinely gathered by statutory health insurance funds and other statutory social insurance funds, but all forms of secondary data. The 11 guidelines range from ethical principles and study planning through quality assurance measures and data preparation to data privacy, contractual conditions and responsible communication of analytical results. They are complemented by explanations and practical assistance in the form of recommendations. GPS is aimed at all persons who approach secondary data, their analysis and interpretation from a scientific point of view using scientific methods. This includes data owners. Furthermore, GPS is suitable for authors, referees and readers to assess the quality of scientific publications. In 2008, the first version of GPS was evaluated and revised by members of AGENS and the Epidemiological Methods Working Group of DGEpi, DGSMP and GMDS, together with other epidemiological experts, and was then accredited as implementation regulations of Good Epidemiological Practice (GEP). The third version of GPS has been available since 2012 and can be downloaded free of charge from the DGEpi website. The current revision chiefly incorporates linguistic refinements and improves internal consistency. With regard to content, further recommendations concerning the guideline on data privacy have been added.
Further revisions will follow as science and data privacy regulations develop. PMID:25622207

  11. Generalized multicoincidence analysis methods

    Microsoft Academic Search

    Glen A. Warren; Leon E. Smith; Craig E. Aalseth; J. Edward Ellis; Andrei B. Valsan; Wondwosen Mengesha

    2006-01-01

    The ability to conduct automated trace radionuclide analysis at or near the sample collection point would provide a valuable tool for emergency response, environmental monitoring, and verification of treaties and agreements. Pacific Northwest National Laboratory is developing systems for this purpose based on dual gamma-ray spectrometers, e.g., NaI(TI) or HPGe, combined with thin organic scintillator sensors to detect light charged

  12. Study of fuzzy controller based on analysis method with PLC

    Microsoft Academic Search

    Jingzhao Li; Chongwei Zhang

    2002-01-01

A fuzzy controller based on the analysis method is developed in this paper. Fuzzification of the input signal, fuzzy control rules based on the analysis method, and defuzzification of the output signal are investigated, and the PLC ladder diagram is given. Practice shows that this fuzzy controller is inexpensive and practical, providing wider application for fuzzy control technology.
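
As a hedged illustration of the fuzzify-infer-defuzzify pipeline the abstract describes (the membership functions, rule table, and output singletons below are invented for the sketch, not taken from the paper or its PLC implementation):

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_control(error):
    """Fuzzify the error, fire three rules, defuzzify by weighted average."""
    mu = {
        "neg":  tri(error, -2.0, -1.0, 0.0),
        "zero": tri(error, -1.0,  0.0, 1.0),
        "pos":  tri(error,  0.0,  1.0, 2.0),
    }
    # Rule consequents as singletons: negative error -> push output up, etc.
    out = {"neg": 1.0, "zero": 0.0, "pos": -1.0}
    num = sum(mu[k] * out[k] for k in mu)
    den = sum(mu.values())
    return num / den if den else 0.0

print(fuzzy_control(-0.5))
```

On a PLC, the same mapping is often precomputed into a lookup table so the ladder logic only needs comparisons and interpolation, which is one reason such controllers are cheap to deploy.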

  13. A practical method of estimating energy expenditure during tennis play.

    PubMed

    Novas, A M P; Rowbottom, D G; Jenkins, D G

    2003-03-01

This study aimed to develop a practical method of estimating energy expenditure (EE) during tennis. Twenty-four elite female tennis players first completed a tennis-specific graded test in which five different intensity levels were applied randomly. Each intensity level was intended to simulate a "game" of singles tennis and comprised six 14 s periods of activity alternated with 20 s of active rest. Oxygen consumption (VO2) and heart rate (HR) were measured continuously and each player's rating of perceived exertion (RPE) was recorded at the end of each intensity level. The rate of energy expenditure (EE(VO2)) during the test was calculated using the sum of VO2 during play and the 'O2 debt' during recovery, divided by the duration of the activity. There were significant individual linear relationships between EE(VO2) and RPE, and between EE(VO2) and HR (r ≥ 0.89 and r ≥ 0.93; p < 0.05). On a second occasion, six players completed a 60-min singles tennis match during which VO2, HR and RPE were recorded; EE(VO2) was compared with EE predicted from the previously derived RPE and HR regression equations. Analysis found that EE(VO2) was overestimated by EE(RPE) (92 ± 76 kJ·h(-1)) and EE(HR) (435 ± 678 kJ·h(-1)), but the error of estimation for EE(RPE) (t = -3.01; p = 0.03) was less than 5%, whereas for EE(HR) the error was 20.7%. The results of the study show that RPE can be used to estimate the energetic cost of playing tennis. PMID:12801209
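
The individual calibration the authors describe amounts to fitting a per-player linear regression of EE on RPE during the graded test and then applying it to match data. A sketch with made-up calibration numbers (the study's actual regressions are not published in the abstract):

```python
import numpy as np

# Hypothetical calibration data for one player: RPE vs measured EE (kJ/h)
rpe = np.array([8, 10, 12, 14, 16], dtype=float)
ee  = np.array([900, 1150, 1400, 1700, 1950], dtype=float)

slope, intercept = np.polyfit(rpe, ee, 1)   # individual linear regression

def ee_from_rpe(r):
    """Predict this player's energy expenditure from a match RPE."""
    return slope * r + intercept

print(ee_from_rpe(13.0))
```

Because each player gets their own slope and intercept, the method sidesteps between-subject variability, which is why RPE outperformed HR in the validation match.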

  14. Traditional Methods for Mineral Analysis

    NASA Astrophysics Data System (ADS)

    Ward, Robert E.; Carpenter, Charles E.

This chapter describes traditional methods for analysis of minerals involving titrimetric and colorimetric procedures, and the use of ion-selective electrodes. Other traditional methods of mineral analysis include gravimetric titration (i.e., insoluble forms of minerals are precipitated, rinsed, dried, and weighed) and redox reactions (i.e., the mineral is part of an oxidation-reduction reaction, and the product is quantitated). However, these latter two methods will not be covered because they currently see little use in the food industry. The traditional methods that will be described have maintained widespread usage in the food industry despite the development of more modern instrumentation such as atomic absorption spectroscopy and inductively coupled plasma-atomic emission spectroscopy (Chap. 24). Traditional methods generally require chemicals and equipment that are routinely available in an analytical laboratory and are within the experience of most laboratory technicians. Additionally, traditional methods often form the basis for rapid analysis kits (e.g., Quantab® for salt determination) that are increasingly in demand. Procedures for analysis of minerals of major nutritional or food-processing concern are used for illustrative purposes. For additional examples of traditional methods refer to references (1-6). Slight modifications of these traditional methods are often needed for specific foodstuffs to minimize interferences or to stay within the range of analytical performance. For analytical requirements for specific foods see the Official Methods of Analysis of AOAC International (5) and related official methods (6).
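
As a worked example of the titrimetric procedures covered, the Mohr (AgNO3) titration for salt reduces to simple stoichiometry: one mole of AgNO3 consumes one mole of chloride at the chromate endpoint. The volumes, molarity, and sample mass below are hypothetical:

```python
def percent_nacl(v_agno3_ml, molarity_agno3, sample_g):
    """Mohr titration: % NaCl (w/w) from the AgNO3 volume needed to reach
    the chromate endpoint. 1 mol AgNO3 reacts with 1 mol Cl-;
    NaCl molar mass = 58.44 g/mol."""
    mol_cl = v_agno3_ml / 1000.0 * molarity_agno3   # moles of titrant = moles Cl-
    return mol_cl * 58.44 / sample_g * 100.0

# Hypothetical run: 25.0 mL of 0.1 M AgNO3 titrates a 5.0 g food sample
print(round(percent_nacl(25.0, 0.1, 5.0), 3))
```

The same template (titrant volume → moles of analyte → mass → percentage) underlies most of the chapter's titrimetric determinations; only the stoichiometric ratio and molar mass change.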

  15. Practical Nursing. Ohio's Competency Analysis Profile.

    ERIC Educational Resources Information Center

    Ohio State Univ., Columbus. Vocational Instructional Materials Lab.

    Developed through a modified DACUM (Developing a Curriculum) process involving business, industry, labor, and community agency representatives in Ohio, this document is a comprehensive and verified employer competency profile for practical nursing. The list contains units (with and without subunits), competencies, and competency builders that…

  16. Practical analysis of welding processes using finite element analysis.

    SciTech Connect

    Cowles, J. H. (John H.); Dave, V. R. (Vivek R.); Hartman, D. A. (Daniel A.)

    2001-01-01

With advances in commercially available finite element software and computational capability, engineers can now model large-scale problems in mechanics, heat transfer, fluid flow, and electromagnetics as never before. With these enhancements in capability, it is increasingly tempting to include the fundamental process physics to help achieve greater accuracy (Refs. 1-7). While this goal is laudable, it adds complication and drives up cost and computational requirements. Practical analysis of welding relies on simplified user inputs to derive important relative trends in desired outputs such as residual stress or distortion due to changes in inputs like voltage, current, and travel speed. Welding is a complex three-dimensional phenomenon. The question becomes how much modeling detail is needed to accurately predict relative trends in distortion, residual stress, or weld cracking? In this work, a HAZ (Heat Affected Zone) weld-cracking problem was analyzed to rank two different welding cycles (weld speed varied) in terms of crack susceptibility. Figure 1 shows an aerospace casting GTA welded to a wrought skirt. The essentials of part geometry, welding process, and tooling were suitably captured to model the strain excursion in the HAZ over a crack-susceptible temperature range, and the weld cycles were suitably ranked. The main contribution of this work is the demonstration of a practical methodology by which engineering solutions to engineering problems may be obtained through weld modeling when time and resources are extremely limited. Typically, welding analysis suffers from the following unknowns: material properties over the entire temperature range, the heat-input source term, and environmental effects. Material properties of interest are conductivity, specific heat, latent heat, modulus, Poisson's ratio, yield strength, ultimate strength, and possible rate dependencies.
Boundary conditions are conduction into fixturing, radiation and convection to the environment, and any mechanical constraint. If conductivity, for example, is only known at a few temperatures it can be linearly extrapolated from the highest known temperature to the liquidus temperature. Over the liquidus to solidus temperature the conductivity is linearly increased by a factor of three to account for the enhanced heat transfer due to convection in the weld pool. Above the liquidus it is kept constant. Figure 2 shows an example of this type of approximation. Other thermal and mechanical properties and boundary conditions can be similarly approximated, using known physical material characteristics when possible. Sensitivity analysis can show that many assumptions have a small effect on the final outcome of the analysis. In the example presented in this work, simplified analysis procedures were used to model this process to understand why one set of parameters is superior to the other. From Lin (Ref. 8), mechanical strain is expected to drive HAZ cracking. Figure 3 shows a plot of principal tensile mechanical strain versus temperature during the welding process. By looking at the magnitudes of the tensile mechanical strain in the material's Brittle Temperature Region (BTR), it can be seen that on a relative basis the faster travel speed process that causes cracking results in about three times the strain in the temperature range of the BTR. In this work, a series of simplifying assumptions were used in order to quickly and accurately model a real welding process to respond to an immediate manufacturing need. The analysis showed that the driver for HAZ cracking, the mechanical strain in the BTR, was significantly higher in the process that caused cracking versus the process that did not. 
The main emphasis of the analysis was to determine whether there was a mechanical reason why the improved weld parameters would consistently produce an acceptable weld. The prediction of the mechanical strain magnitudes confirms the better process.
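
The property-approximation scheme described above (linear extrapolation of known conductivity data up to the solidus, a threefold enhancement across the solidus-liquidus interval to mimic weld-pool convection, and a constant value above the liquidus) can be sketched as follows; the measurement points and transition temperatures are hypothetical:

```python
def conductivity(T, k_known, t_solidus=1400.0, t_liquidus=1500.0):
    """Approximate thermal conductivity at temperature T (degC) from sparse
    data: linear extrapolation to the solidus, a ramp from 1x to 3x across
    the mushy zone (weld-pool convection proxy), constant above liquidus.
    k_known is a T-sorted list of (temperature, conductivity) pairs."""
    (t1, k1), (t2, k2) = k_known[-2], k_known[-1]
    slope = (k2 - k1) / (t2 - t1)
    k_solidus = k2 + slope * (t_solidus - t2)     # linear extrapolation
    if T <= t_solidus:
        return k2 + slope * (T - t2)
    if T >= t_liquidus:
        return 3.0 * k_solidus                    # constant above liquidus
    frac = (T - t_solidus) / (t_liquidus - t_solidus)
    return k_solidus * (1.0 + 2.0 * frac)         # ramp from 1x to 3x

data = [(20.0, 15.0), (800.0, 25.0)]              # hypothetical measurements
print(conductivity(1500.0, data))
```

As the text notes, sensitivity analysis can then confirm that many of these property assumptions have little effect on the relative ranking of weld cycles.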

  17. Practical Analysis of Gadget Framework on Android OS

    E-print Network

Studienarbeit (student research project) within the diploma degree programme. Table-of-contents excerpt: 2.1 USB Overview; 2.1.1 Topology; 2.2.5 Gadgetfs; 2.3 Embedded Android.

  18. Clinical simulation using deliberate practice in nursing education: a Wilsonian concept analysis.

    PubMed

    Chee, Jennifer

    2014-05-01

    Effective use of simulation is dependent on a complete understanding of simulation's central conceptual elements. Deliberate practice, a constituent of Ericsson's theory of expertise, has been identified as a central concept in effective simulation learning. Deliberate practice is compatible with simulation frameworks already being suggested for use in nursing education. This paper uses Wilson's Method of concept analysis for the purpose of exploring the concept of deliberate practice in the context of clinical simulation in nursing education. Nursing education should move forward in a manner that reflects best practice in nursing education. PMID:24120521

  19. Landscape analysis: Theoretical considerations and practical needs

    USGS Publications Warehouse

    Godfrey, A.E.; Cleaves, E.T.

    1991-01-01

Numerous systems of land classification have been proposed. Most have led directly to or have been driven by an author's philosophy of earth-forming processes. However, the practical need of classifying land for planning and management purposes requires that a system lead to predictions of the results of management activities. We propose a landscape classification system composed of 11 units, from realm (a continental mass) to feature (a splash impression). The classification concerns physical aspects rather than economic or social factors, and aims to merge land inventory with dynamic processes. Landscape units are organized using a hierarchical system so that information may be assembled and communicated at different levels of scale and abstraction. Our classification uses a geomorphic systems approach that emphasizes the geologic-geomorphic attributes of the units. Realm, major division, province, and section are formulated by subdividing large units into smaller ones. For the larger units we have followed Fenneman's delineations, which are well established in the North American literature. Areas and districts are aggregated into regions and regions into sections. Units smaller than areas have, in practice, been subdivided into zones and smaller units if required. We developed the theoretical framework embodied in this classification from practical applications aimed at land use planning and land management in Maryland (eastern Piedmont Province near Baltimore) and Utah (eastern Uinta Mountains). © 1991 Springer-Verlag New York Inc.

  20. New directions on agile methods: a comparative analysis

    Microsoft Academic Search

    Pekka Abrahamsson; Juhani Warsta; Mikko T. Siponen; Jussi Ronkainen

    2003-01-01

    Agile software development methods have caught the attention of software engineers and researchers worldwide. Scientific research is yet scarce. This paper reports results from a study, which aims to organize, analyze and make sense out of the dispersed field of agile software development methods. The comparative analysis is performed using the method's life-cycle coverage, project management support, type of practical

  1. A Practical Introduction to Analysis and Synthesis

    ERIC Educational Resources Information Center

    Williams, R. D.; Cosart, W. P.

    1976-01-01

    Discusses an introductory chemical engineering course in which mathematical models are used to analyze experimental data. Concepts illustrated include dimensional analysis, scaleup, heat transfer, and energy conservation. (MLH)

  2. A practical gait analysis system using gyroscopes

    Microsoft Academic Search

    Kaiyu Tong; Malcolm H Granat

    1999-01-01

This study investigated the possibility of using uni-axial gyroscopes to develop a simple portable gait analysis system. Gyroscopes were attached to the skin surface of the shank and thigh segments, and the angular velocity of each segment was recorded. Segment inclinations and knee angle were derived from segment angular velocities. The angular signals from a motion analysis

  3. Practical Application of Second Law Efficiency Analysis

    E-print Network

    Gaggioli, R. A.; Wepfer, W. J.

    1983-01-01

... efforts in this century to popularize its practical use (see, for examples, the classical engineering thermodynamics texts of Goodenough (3), Keenan (4), and Dodge (5)) have met with limited acceptance. Available energy, henceforth called exergy, is a... of Thermodynamics," 3rd ed., Henry Holt & Co., 1920. 4. Keenan, J.H., Thermodynamics, 1st MIT Press Ed., Cambridge, MA, 1970. Originally published by Wiley, New York, 1941. 5. Dodge, B. F., "Chemical Engineering Thermodynamics," McGraw-Hill, 1944...

  4. A practical method to evaluate radiofrequency exposure of mast workers.

    PubMed

    Alanko, Tommi; Hietanen, Maila

    2008-01-01

    Assessment of occupational exposure to radiofrequency (RF) fields in telecommunication transmitter masts is a challenging task. For conventional field strength measurements using manually operated instruments, it is difficult to document the locations of measurements while climbing up a mast. Logging RF dosemeters worn by the workers, on the other hand, do not give any information about the location of the exposure. In this study, a practical method was developed and applied to assess mast workers' exposure to RF fields and the corresponding location. This method uses a logging dosemeter for personal RF exposure evaluation and two logging barometers to determine the corresponding height of the worker's position on the mast. The procedure is not intended to be used for compliance assessments, but to indicate locations where stricter assessments are needed. The applicability of the method is demonstrated by making measurements in a TV and radio transmitting mast. PMID:19054796
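
Converting the two barometer logs to a worker height, as the method requires, can be done with the standard international barometric formula; the pressure readings below are hypothetical:

```python
def height_from_pressure(p_hpa, p0_hpa=1013.25):
    """International barometric formula: height (m) above the level where
    pressure equals the reference p0 (standard atmosphere constants)."""
    return 44330.0 * (1.0 - (p_hpa / p0_hpa) ** (1.0 / 5.255))

# Worker height on the mast = worker-logger height minus base-logger height;
# differencing the two loggers cancels slow weather-driven pressure drift.
ground = height_from_pressure(1008.0)   # logger at the mast base
worker = height_from_pressure(1002.0)   # logger worn by the worker
print(worker - ground)
```

Time-aligning this height series with the logging RF dosemeter then yields exposure as a function of position on the mast, which is the paper's central idea.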

  5. Development of a practical costing method for hospitals.

    PubMed

    Cao, Pengyu; Toyabe, Shin-Ichi; Akazawa, Kouhei

    2006-03-01

To realize effective cost control, a practical and accurate cost accounting system is indispensable in hospitals. Among traditional cost accounting systems, volume-based costing (VBC) is the most popular method. In this method, the indirect costs are allocated to each cost object (services or units of a hospital) using a single indicator named a cost driver (e.g., labor hours, revenues or the number of patients). However, this method often produces rough and inaccurate results. The activity-based costing (ABC) method introduced in the mid 1990s can provide more accurate results. With the ABC method, all events or transactions that cause costs are recognized as "activities", and a specific cost driver is prepared for each activity. Finally, the costs of activities are allocated to cost objects by the corresponding cost driver. However, it is much more complex and costly than other traditional cost accounting methods because the data collection for cost drivers is not always easy. In this study, we developed a simplified ABC (S-ABC) costing method to reduce the workload of ABC costing by reducing the number of cost drivers used in the ABC method. Using the S-ABC method, we estimated the cost of laboratory tests and obtained results similar in accuracy to those of the ABC method (largest difference was 2.64%). Simultaneously, this new method reduces the seven cost drivers used in the ABC method to four. Moreover, we performed an evaluation using other sample data from the physiological laboratory department to verify the effectiveness of this new method. In conclusion, the S-ABC method provides two advantages in comparison to the VBC and ABC methods: (1) it can obtain accurate results, and (2) it is simpler to perform. Once we reduce the number of cost drivers by applying the proposed S-ABC method to the data for the ABC method, we can easily perform the cost accounting using few cost drivers after the second round of costing.
PMID:16498229
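
The core of any ABC-style allocation, including a simplified variant like S-ABC, is distributing each activity's cost pool to cost objects in proportion to its driver counts. A sketch with invented activities, costs, and driver units (not the paper's data):

```python
# Hypothetical laboratory activities, each with a cost pool and a cost
# driver measured per cost object (here: two kinds of test).
activities = {
    "specimen_handling": {"cost": 600.0, "driver": {"test_A": 40, "test_B": 20}},
    "instrument_time":   {"cost": 900.0, "driver": {"test_A": 10, "test_B": 20}},
}

def allocate(activities):
    """Allocate each activity's cost to cost objects in driver proportion."""
    totals = {}
    for act in activities.values():
        driver_sum = sum(act["driver"].values())
        for obj, units in act["driver"].items():
            totals[obj] = totals.get(obj, 0.0) + act["cost"] * units / driver_sum
    return totals

print(allocate(activities))
```

Reducing the number of distinct drivers, as S-ABC does, shrinks the `driver` dictionaries that must be populated from operational records, which is where most of the data-collection workload lies.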

  6. Fourier methods for biosequence analysis.

    PubMed Central

    Benson, D C

    1990-01-01

    Novel methods are discussed for using fast Fourier transforms for DNA or protein sequence comparison. These methods are also intended as a contribution to the more general computer science problem of text search. These methods extend the capabilities of previous FFT methods and show that these methods are capable of considerable refinement. In particular, novel methods are given which (1) enable the detection of clusters of matching letters, (2) facilitate the insertion of gaps to enhance sequence similarity, and (3) accommodate to varying densities of letters in the input sequences. These methods use Fourier analysis in two distinct ways. (1) Fast Fourier transforms are used to facilitate rapid computation. (2) Fourier expansions are used to form an 'image' of the sequence comparison. PMID:2243777
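
The letter-match counting that underlies FFT-based sequence comparison can be sketched as follows: for each alphabet letter, the 0/1 indicator vectors of the two sequences are cross-correlated via FFT, and summing over letters gives the match count at every relative offset (the sequences below are toy examples):

```python
import numpy as np

def match_counts(s, t, alphabet="ACGT"):
    """Count matching letters between s and t at every relative offset by
    summing, over the alphabet, FFT cross-correlations of 0/1 indicators."""
    n = len(s) + len(t) - 1              # zero-pad so no shift wraps around
    total = np.zeros(n)
    for a in alphabet:
        x = np.array([c == a for c in s], dtype=float)
        y = np.array([c == a for c in t], dtype=float)
        # cross-correlation = IFFT(FFT(x) * conj(FFT(y)))
        total += np.fft.irfft(np.fft.rfft(x, n) * np.conj(np.fft.rfft(y, n)), n)
    return np.rint(total).astype(int)

counts = match_counts("ACGTAC", "GTAC")
print(counts.max())                      # best-alignment match count
```

This computes all alignments in O(n log n) per letter instead of O(n^2), which is the speedup the paper builds its refinements (cluster detection, gap insertion) on top of.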

  7. Meta-analysis: In Practice An article for the

    E-print Network

An article for the International Encyclopedia of the Social ... Meta-analysis is concerned with ensuring the validity and robustness of the entire enterprise of research synthesis. Often there is little control over the choice of the summary measure, and different summary measures

  8. Geographic data misalignment: Practical solutions for integrated data analysis

    E-print Network

    Hargrove, William W.

water supplies to pesticide residues in raw water depends in part on pesticide usage in a watershed ... B. Bhaduri, E. A. ... for integrated analysis using multiple spatial data. In a GIS, overlay and visual exploration of relationships

  9. MAD Skills: New Analysis Practices for Big Data

    Microsoft Academic Search

    Jeffrey Cohen; Brian Dolan; Mark Dunlap; Joseph M. Hellerstein; Caleb Welton

    2009-01-01

As massive data acquisition and storage becomes increasingly affordable, a wide variety of enterprises are employing statisticians to engage in sophisticated data analysis. In this paper we highlight the emerging practice of Magnetic, Agile, Deep (MAD) data analysis as a radical departure from traditional Enterprise Data Warehouses and Business Intelligence. We present our design philosophy, techniques and

  10. Comparative Lifecycle Energy Analysis: Theory and Practice.

    ERIC Educational Resources Information Center

    Morris, Jeffrey; Canzoneri, Diana

    1992-01-01

    Explores the position that more energy is conserved through recycling secondary materials than is generated from municipal solid waste incineration. Discusses one component of a lifecycle analysis--a comparison of energy requirements for manufacturing competing products. Includes methodological issues, energy cost estimates, and difficulties…

  11. Practical DNA Microarray Analysis: An Introduction

    E-print Network

    Spang, Rainer

DNA versus oligonucleotide microarrays, spotted vs. printed vs. in-situ synthesized chips, one-channel vs. ... statistical tests (t-test, Wilcoxon test). P values from these tests have to be corrected for multiple ... are nested, the appropriate statistical method is ANOVA. The problem of multiple testing persists

  12. Systematic Review and Meta-Analysis of Practice Facilitation Within Primary Care Settings

    PubMed Central

    Baskerville, N. Bruce; Liddy, Clare; Hogg, William

    2012-01-01

    PURPOSE This study was a systematic review with a quantitative synthesis of the literature examining the overall effect size of practice facilitation and possible moderating factors. The primary outcome was the change in evidence-based practice behavior calculated as a standardized mean difference. METHODS In this systematic review, we searched 4 electronic databases and the reference lists of published literature reviews to find practice facilitation studies that identified evidence-based guideline implementation within primary care practices as the outcome. We included randomized and nonrandomized controlled trials and prospective cohort studies published from 1966 to December 2010 in English language only peer-reviewed journals. Reviews of each study were conducted and assessed for quality; data were abstracted, and standardized mean difference estimates and 95% confidence intervals (CIs) were calculated using a random-effects model. Publication bias, influence, subgroup, and meta-regression analyses were also conducted. RESULTS Twenty-three studies contributed to the analysis for a total of 1,398 participating practices: 697 practice facilitation intervention and 701 control group practices. The degree of variability between studies was consistent with what would be expected to occur by chance alone (I2 = 20%). An overall effect size of 0.56 (95% CI, 0.43–0.68) favored practice facilitation (z = 8.76; P <.001), and publication bias was evident. Primary care practices are 2.76 (95% CI, 2.18–3.43) times more likely to adopt evidence-based guidelines through practice facilitation. Meta-regression analysis indicated that tailoring (P = .05), the intensity of the intervention (P = .03), and the number of intervention practices per facilitator (P = .004) modified evidence-based guideline adoption. CONCLUSION Practice facilitation has a moderately robust effect on evidence-based guideline adoption within primary care. 
Implementation fidelity factors, such as tailoring, the number of practices per facilitator, and the intensity of the intervention, have important resource implications. PMID:22230833
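
Random-effects pooling of standardized mean differences, as reported in this review, is conventionally done with a DerSimonian-Laird estimator of between-study variance. A sketch with invented per-study effects and variances (the actual review pooled 23 studies):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool effect sizes with a DerSimonian-Laird random-effects model;
    returns (pooled effect, 95% confidence interval)."""
    w = [1.0 / v for v in variances]                      # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                         # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]        # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

# Hypothetical per-study standardized mean differences and variances
est, ci = dersimonian_laird([0.2, 0.9, 0.5], [0.02, 0.02, 0.02])
print(est, ci)
```

When the between-study variance estimate is zero the model collapses to fixed-effect pooling, consistent with the review's finding that heterogeneity (I² = 20%) was compatible with chance.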

  13. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for X-ray computed tomography (CT) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of X-ray computed tomography (CT) imaging equipment by specifying image data transfer and archival storage methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE test results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitio...

  14. Standard practice for digital imaging and communication in nondestructive evaluation (DICONDE) for digital radiographic (DR) test methods

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2010-01-01

    1.1 This practice facilitates the interoperability of digital X-ray imaging equipment by specifying image data transfer and archival methods in commonly accepted terms. This document is intended to be used in conjunction with Practice E2339 on Digital Imaging and Communication in Nondestructive Evaluation (DICONDE). Practice E2339 defines an industrial adaptation of the NEMA Standards Publication titled Digital Imaging and Communications in Medicine (DICOM, see http://medical.nema.org), an international standard for image data acquisition, review, storage and archival storage. The goal of Practice E2339, commonly referred to as DICONDE, is to provide a standard that facilitates the display and analysis of NDE results on any system conforming to the DICONDE standard. Toward that end, Practice E2339 provides a data dictionary and a set of information modules that are applicable to all NDE modalities. This practice supplements Practice E2339 by providing information object definitions, information modules and a ...

  15. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  16. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  17. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ...articles involving unfair methods of competition or practices. 12.39 Section...SPECIAL CLASSES OF MERCHANDISE Unfair Competition § 12.39 Imported articles involving unfair methods of competition or practices. (a)...

  18. SNM measurement methods: the state of the practice

    SciTech Connect

    Rogers, D.R.

    1982-01-01

    In order to determine the specific applications and performance of special nuclear materials (SNM) measurement methods being used routinely in nuclear fuel production facilities, a survey of users was conducted at 22 commercial and DOE plants. The methods used and their performance when applied to the wide range of uranium and plutonium materials encountered during processing were evaluated. The most frequently used methods for safeguards material control and accounting involve a combination of bulk measurement (weight or volume), sampling, and chemical assay/isotopic analysis, particularly for feed, intermediate products, and final products. Non-destructive assay methods are used on materials that are difficult to sample or dissolve (when chemical assay methods are ineffective), such as inhomogeneous, impure scrap and waste.

  19. Methods for genetic linkage analysis using trisomies

    SciTech Connect

    Feingold, E.; Lamb, N.E.; Sherman, S.L. [Emory Univ., Atlanta, GA (United States)

    1994-09-01

    Certain genetic disorders (e.g. congenital cataracts, duodenal atresia) are rare in the general population, but more common in people with Down's syndrome. We present a method for using individuals with trisomy 21 to map genes for such traits. Our methods are analogous to methods for mapping autosomal dominant traits using affected relative pairs by looking for markers with greater than expected identity-by-descent. In the trisomy case, one would take trisomic individuals and look for markers with greater than expected reduction to homozygosity in the chromosomes inherited from the non-disjoining parent. We present statistical methods for performing such a linkage analysis, including a test for linkage to a marker, a method for estimating the distance from the marker to the gene, a confidence interval for that distance, and methods for computing power and sample sizes. The methods are described in the context of a gene-dosage model for the etiology of the disorder, but can be extended to other models. We also resolve some practical issues involved in implementing the methods, including how to use partially informative markers, how to test candidate genes, and how to handle the effect of reduced recombination associated with maternal meiosis I non-disjunction.
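    The core statistical idea above (testing for greater-than-expected reduction to homozygosity at a marker in trisomic individuals) can be illustrated with a simple one-sided binomial test. This is a hedged sketch: the counts and the null rate `p0` below are illustrative assumptions, not values or the actual model from the paper.

```python
from math import comb

# Hedged sketch: one-sided binomial test for excess "reduction to
# homozygosity" at a candidate marker. The sample counts and the null
# probability p0 are invented for illustration; the paper's own test
# accounts for marker informativeness and recombination.

def binom_sf(k, n, p):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

n_trisomic = 40   # informative trisomic individuals (assumed)
n_reduced = 30    # those showing reduction to homozygosity (assumed)
p0 = 0.5          # null probability under no linkage (assumed)

p_value = binom_sf(n_reduced, n_trisomic, p0)
print(p_value < 0.05)
```

    A small p-value would suggest the marker is linked to the trait locus under these (assumed) null conditions.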

  20. Method of multivariate spectral analysis

    DOEpatents

    Keenan, Michael R.; Kotula, Paul G.

    2004-01-06

    A method of determining the properties of a sample from measured spectral data collected from the sample by performing a multivariate spectral analysis. The method can include: generating a two-dimensional matrix A containing measured spectral data; providing a weighted spectral data matrix D by performing a weighting operation on matrix A; factoring D into the product of two matrices, C and S^T, by performing a constrained alternating least-squares analysis of D = CS^T, where C is a concentration intensity matrix and S is a spectral shapes matrix; unweighting C and S by applying the inverse of the weighting used previously; and determining the properties of the sample by inspecting C and S. This method can be used to analyze X-ray spectral data generated by operating a Scanning Electron Microscope (SEM) with an attached Energy Dispersive Spectrometer (EDS).
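    The constrained alternating least-squares factorization D ≈ C S^T described above can be sketched in a few lines of NumPy. This is a minimal illustration with synthetic data and a simple clip-to-nonnegative constraint, not the patent's actual weighting or constraint scheme.

```python
import numpy as np

# Hedged sketch of alternating least squares for D ~ C @ S.T with a
# non-negativity constraint applied by clipping. Matrix sizes and the
# synthetic two-component data are illustrative only.

def mcr_als(D, n_components, n_iter=200, seed=0):
    rng = np.random.default_rng(seed)
    n_rows, _ = D.shape
    C = np.abs(rng.standard_normal((n_rows, n_components)))
    for _ in range(n_iter):
        # Solve D ~ C S^T for S with C fixed, then enforce S >= 0.
        S = np.linalg.lstsq(C, D, rcond=None)[0].T
        S = np.clip(S, 0.0, None)
        # Solve for C with S fixed, same constraint.
        C = np.linalg.lstsq(S, D.T, rcond=None)[0].T
        C = np.clip(C, 0.0, None)
    return C, S

# Synthetic data with an exact non-negative rank-2 factorization.
C_true = np.array([[1.0, 0.0], [0.0, 2.0], [1.0, 1.0]])
S_true = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0], [0.2, 0.8]])
D = C_true @ S_true.T
C, S = mcr_als(D, 2)
print(np.linalg.norm(D - C @ S.T) < np.linalg.norm(D))
```

    Real implementations typically add the weighting/unweighting step the patent describes and stop on a convergence criterion rather than a fixed iteration count.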

  1. Method of photon spectral analysis

    DOEpatents

    Gehrke, R.J.; Putnam, M.H.; Killian, E.W.; Helmer, R.G.; Kynaston, R.L.; Goodwin, S.G.; Johnson, L.O.

    1993-04-27

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2,000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.

  2. Method of photon spectral analysis

    DOEpatents

    Gehrke, Robert J. (Idaho Falls, ID); Putnam, Marie H. (Idaho Falls, ID); Killian, E. Wayne (Idaho Falls, ID); Helmer, Richard G. (Idaho Falls, ID); Kynaston, Ronnie L. (Blackfoot, ID); Goodwin, Scott G. (Idaho Falls, ID); Johnson, Larry O. (Pocatello, ID)

    1993-01-01

    A spectroscopic method to rapidly measure the presence of plutonium in soils, filters, smears, and glass waste forms by measuring the uranium L-shell x-ray emissions associated with the decay of plutonium. In addition, the technique can simultaneously acquire spectra of samples and automatically analyze them for the amount of americium and γ-ray emitting activation and fission products present. The samples are counted with a large area, thin-window, n-type germanium spectrometer which is equally efficient for the detection of low-energy x-rays (10-2000 keV) as well as high-energy γ rays (>1 MeV). An 8,192- or 16,384-channel analyzer is used to acquire the entire photon spectrum at one time. A dual-energy, time-tagged pulser is injected into the test input of the preamplifier to monitor the energy scale and detector resolution. The L x-ray portion of each spectrum is analyzed by a linear-least-squares spectral fitting technique. The γ-ray portion of each spectrum is analyzed by a standard Ge γ-ray analysis program. This method can be applied to any analysis involving x- and γ-ray analysis in one spectrum and is especially useful when interferences in the x-ray region can be identified from the γ-ray analysis and accommodated during the x-ray analysis.
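    The linear-least-squares spectral fitting mentioned in both records above amounts to modeling the measured spectrum as a linear combination of known component shapes and solving for the amplitudes. A minimal sketch, with invented Gaussian line shapes standing in for calibrated L x-ray profiles:

```python
import numpy as np

# Hedged sketch of linear least-squares spectral fitting. The Gaussian
# line shapes, channel count, and amplitudes are illustrative, not the
# patent's calibration data.

channels = np.arange(200)

def gaussian(center, width=4.0):
    return np.exp(-0.5 * ((channels - center) / width) ** 2)

# Design matrix: one column per known line profile plus a flat background.
A = np.column_stack([gaussian(60), gaussian(90), np.ones_like(channels, dtype=float)])

true_amplitudes = np.array([120.0, 45.0, 10.0])
spectrum = A @ true_amplitudes  # noise-free for clarity

# Fitted amplitudes recover the component intensities.
fitted, *_ = np.linalg.lstsq(A, spectrum, rcond=None)
print(np.round(fitted, 3))
```

    With real counting data one would add Poisson weighting and let the γ-ray analysis inform which interference lines to include in the design matrix, as the abstract describes.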

  3. Multivariate analysis methods for spectroscopic blood analysis

    NASA Astrophysics Data System (ADS)

    Wood, Michael F. G.; Rohani, Arash; Ghazalah, Rashid; Vitkin, I. Alex; Pawluczyk, Romuald

    2012-01-01

    Blood tests are an essential tool in clinical medicine with the ability to diagnose or monitor various diseases and conditions; however, the complexities of these measurements currently restrict them to a laboratory setting. P&P Optica has developed and currently produces patented high performance spectrometers and is developing a spectrometer-based system for rapid reagent-free blood analysis. An important aspect of this analysis is the need to extract the analyte specific information from the measured signal such that the analyte concentrations can be determined. To this end, advanced chemometric methods are currently being investigated and have been tested using simulated spectra. A blood plasma model was used to generate Raman, near infrared, and optical rotatory dispersion spectra with glucose as the target analyte. The potential of combined chemometric techniques, where multiple spectroscopy modalities are used in a single regression model to improve the prediction ability, was investigated using unfold partial least squares and multiblock partial least squares. Results show improvement in the predictions of glucose levels using the combined methods and demonstrate potential for multiblock chemometrics in spectroscopic blood analysis.
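    The "combined modalities" idea above can be sketched with synthetic data: two spectral blocks that each carry partial information about an analyte are regressed on separately and after concatenation. Note this sketch uses plain least squares on concatenated blocks as a simpler stand-in for the multiblock PLS in the abstract, and all data are simulated.

```python
import numpy as np

# Hedged sketch: two synthetic spectral blocks (stand-ins for, e.g., Raman
# and NIR) are combined by concatenation before regression. Signal and
# noise levels are invented for illustration.

rng = np.random.default_rng(1)
n_samples = 200
glucose = rng.uniform(3.0, 9.0, n_samples)  # target analyte (arbitrary units)

def make_block(strength, n_channels, noise=0.5):
    loading = rng.standard_normal(n_channels)
    return (strength * np.outer(glucose, loading)
            + noise * rng.standard_normal((n_samples, n_channels)))

raman = make_block(0.3, 30)
nir = make_block(0.3, 30)

def fit_rmse(X, y):
    """Training RMSE of a centered least-squares fit of y on X."""
    Xc, yc = X - X.mean(axis=0), y - y.mean()
    w, *_ = np.linalg.lstsq(Xc, yc, rcond=None)
    return float(np.sqrt(np.mean((Xc @ w - yc) ** 2)))

err_raman = fit_rmse(raman, glucose)
err_combined = fit_rmse(np.hstack([raman, nir]), glucose)
print(err_combined <= err_raman)
```

    A fair comparison of predictive ability would use held-out samples (or cross-validation) and a latent-variable method such as PLS, since training error necessarily improves as blocks are added.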

  4. Using multi-layered supervision methods to develop creative practice

    Microsoft Academic Search

    Dan Richmond

    2009-01-01

    This article sets out to examine the mechanisms required for developing effective supervision processes which nurture creative practice within teams. In order to explore the subject effectively it will be necessary to consider aspects of theory and practice, highlighting some of the key issues and themes within the topic, such as what is meant by creative practice and how it

  5. An epidemiological method applied to practices to measure the representativeness of their prescribing characteristics

    Microsoft Academic Search

    D M Fleming

    1984-01-01

    The standardised report of the Prescription Pricing Authority, which is concerned with the prescribing characteristics of practices, was used as an epidemiological tool to evaluate the prescribing representativeness of practices. Study practices were compared with average prescribing results from family practitioner committees, which are specific for the geographical district and month sampled. The method was applied in 40 practices, representing

  6. Exhaled breath analysis: physical methods, instruments, and medical diagnostics

    NASA Astrophysics Data System (ADS)

    Vaks, V. L.; Domracheva, E. G.; Sobakinskaya, E. A.; Chernyaeva, M. B.

    2014-07-01

    This paper reviews the analysis of exhaled breath, a rapidly growing field in noninvasive medical diagnostics that lies at the intersection of physics, chemistry, and medicine. Current data are presented on gas markers in human breath and their relation to human diseases. Various physical methods for breath analysis are described. It is shown how measurement precision and data volume requirements have stimulated technological developments and identified the problems that have to be solved to put this method into clinical practice.

  7. A Deliberate Practice Approach to Teaching Phylogenetic Analysis

    ERIC Educational Resources Information Center

    Hobbs, F. Collin; Johnson, Daniel J.; Kearns, Katherine D.

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we…

  8. Mobile IPv6 Deployments: Graph-based Analysis and Practical Guidelines

    E-print Network

    Magnien, Clémence

    Mobile IPv6 Deployments: Graph-based Analysis and Practical Guidelines. Guillaume Valadon; LIP6, F-75005, Paris, France; TOYOTA InfoTechnology Center, U.S.A., Inc. Abstract: Mobile IPv6 is used to provide permanent IP addresses to end-users of WiMAX and 3GPP2. Mobile IPv6 relies on a specific router

  9. A Model of Practice in Special Education: Dynamic Ecological Analysis

    ERIC Educational Resources Information Center

    Hannant, Barbara; Lim, Eng Leong; McAllum, Ruth

    2010-01-01

    Dynamic Ecological Analysis (DEA) is a model of practice which increases a team's efficacy by enabling the development of more effective interventions through collaboration and collective reflection. This process has proved to be useful in: a) clarifying thinking and problem-solving, b) transferring knowledge and thinking to significant parties,…

  10. A Meta-Analysis of Published School Social Work Practice Studies: 1980-2007

    ERIC Educational Resources Information Center

    Franklin, Cynthia; Kim, Johnny S.; Tripodi, Stephen J.

    2009-01-01

    Objective: This systematic review examined the effectiveness of school social work practices using meta-analytic techniques. Method: Hierarchical linear modeling software was used to calculate overall effect size estimates as well as test for between-study variability. Results: A total of 21 studies were included in the final analysis.…

  11. Practical evaluation of Mung bean seed pasteurization method in Japan.

    PubMed

    Bari, M L; Enomoto, K; Nei, D; Kawamoto, S

    2010-04-01

    The majority of the seed sprout-related outbreaks have been associated with Escherichia coli O157:H7 and Salmonella. Therefore, an effective method for inactivating these organisms on the seeds before sprouting is needed. The current pasteurization method for mung beans in Japan (hot water treatment at 85 degrees C for 10 s) was more effective for disinfecting inoculated E. coli O157:H7, Salmonella, and nonpathogenic E. coli on mung bean seeds than was the calcium hypochlorite treatment (20,000 ppm for 20 min) recommended by the U.S. Food and Drug Administration. Hot water treatment at 85 degrees C for 40 s followed by dipping in cold water for 30 s and soaking in chlorine water (2,000 ppm) for 2 h reduced the pathogens to undetectable levels, and no viable pathogens were found in a 25-g enrichment culture and during the sprouting process. Practical tests using a working pasteurization machine with nonpathogenic E. coli as a surrogate produced similar results. The harvest yield of the treated seed was within the acceptable range. These treatments could be a viable alternative to the presently recommended 20,000-ppm chlorine treatment for mung bean seeds. PMID:20377967

  12. Practice

    NSDL National Science Digital Library

    Paul Goldenberg

    2011-10-25

    This article focuses on the role and techniques of effective ("distributed") practice that leads to full and fluent mastery of mental mathematics as well as conceptual growth around properties of arithmetic. It lists the essential mental math skills needed for fluent computation at grades 1, 2, and 3. The article describes a number of strategies for developing mental skills and links to pages with more details on others (some not yet complete). While this article refers to the Think Math! curriculum published by EDC, the methods generalize to any program. The Fact of the Day technique and a related video are cataloged separately.

  13. Diagnostic Methods for Bile Acid Malabsorption in Clinical Practice

    PubMed Central

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-01-01

    Altered bile acid (BA) concentrations in the colon may cause diarrhea or constipation. BA malabsorption (BAM) accounts for >25% of patients with irritable bowel syndrome (IBS) with diarrhea and chronic diarrhea in Western countries. As BAM is increasingly recognized, proper diagnostic methods are desired in clinical practice to help direct the most effective treatment course for the chronic bowel dysfunction. This review appraises the methodology, advantages and disadvantages of 4 tools that directly measure BAM: 14C-glycocholate breath and stool test, 75Selenium HomotauroCholic Acid Test (SeHCAT), 7α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. 14C-glycocholate is a laborious test no longer widely utilized. 75SeHCAT is validated, but not available in the United States. Serum C4 is a simple, accurate method that is applicable to a majority of patients, but requires further clinical validation. Fecal measurements to quantify total and individual fecal BAs are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the U.S., and a therapeutic trial with a BA binder is used as a surrogate for diagnosis of BAM. Recent data suggest there is an advantage to studying fecal excretion of the individual BAs and their role in BAM; this may constitute a significant advantage of the fecal BA method over the other tests. Fecal BA test could become a routine addition to fecal fat measurement in patients with unexplained diarrhea. In summary, availability determines the choice of test among C4, SeHCAT and fecal BA; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

  14. Combining the soilwater balance and water-level fluctuation methods to estimate natural groundwater recharge: Practical aspects

    Microsoft Academic Search

    MARIOS A. SOPHOCLEOUS

    1991-01-01

    Sophocleous, M.A., 1991. Combining the soilwater balance and water-level fluctuation methods to estimate natural groundwater recharge: practical aspects. J. Hydrol., 124: 229-241. A relatively simple and practical approach for calculating groundwater recharge in semiarid plain environments with a relatively shallow water table, such as the Kansas Prairies, is outlined. Major uncertainties in the Darcian, water balance, and groundwater fluctuation analysis

  15. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, Amy C. (410 Waverly Dr., Augusta, GA 30909)

    1993-01-01

    An apparatus and method for electrochemical analysis of elements in solution. An auxiliary electrode 14, a reference electrode 18, and five working electrodes 20, 22, 26, 28, and 30 are positioned in a container 12 containing a sample solution 34. The working electrodes are spaced apart evenly from each other and auxiliary electrode 14 to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between auxiliary electrode 14 and each of the working electrodes 20, 22, 26, 28, and 30. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in sample solution 34 and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.

  16. Voltammetric analysis apparatus and method

    DOEpatents

    Almon, A.C.

    1993-06-08

    An apparatus and method is described for electrochemical analysis of elements in solution. An auxiliary electrode, a reference electrode, and five working electrodes are positioned in a container containing a sample solution. The working electrodes are spaced apart evenly from each other and the auxiliary electrode to minimize any inter-electrode interference that may occur during analysis. An electric potential is applied between the auxiliary electrode and each of the working electrodes. Simultaneous measurements taken of the current flow through each of the working electrodes for each given potential in a potential range are used for identifying chemical elements present in the sample solution and their respective concentrations. Multiple working electrodes enable a more positive identification to be made by providing unique data characteristic of chemical elements present in the sample solution.
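    The identification step in the two patent records above (matching measured current-vs-potential data against characteristics of known elements) can be sketched with a simple peak-matching routine. The reference peak potentials below are invented for illustration only and are not from the patent.

```python
# Hedged sketch: locate the peak potential of a measured current-vs-
# potential sweep and match it against a small reference table. The
# reference potentials are illustrative assumptions, not real calibration
# data from the patent.

REFERENCE_PEAKS_V = {"Cd": -0.60, "Pb": -0.45, "Cu": 0.02}  # assumed values

def identify(potentials, currents, tolerance=0.05):
    """Return the reference species whose peak potential is nearest the
    measured current maximum, if within tolerance (volts), else None."""
    peak_v = potentials[max(range(len(currents)), key=currents.__getitem__)]
    name, ref = min(REFERENCE_PEAKS_V.items(), key=lambda kv: abs(kv[1] - peak_v))
    return name if abs(ref - peak_v) <= tolerance else None

# Simulated sweep with a Lorentzian-like current peak near -0.45 V.
potentials = [round(-1.0 + 0.01 * i, 2) for i in range(151)]
currents = [1.0 / (1.0 + 100 * (v + 0.45) ** 2) for v in potentials]
print(identify(potentials, currents))  # → Pb
```

    In the patented apparatus, the same sweep is measured on multiple working electrodes simultaneously, so an identification would be confirmed across several independent curves rather than one.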

  17. Adapting Job Analysis Methodology to Improve Evaluation Practice

    ERIC Educational Resources Information Center

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  18. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Microsoft Academic Search

    Susan M. Jenkins; Patrick Curtin

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs administer programs and provide program services, interpretation of outcome data, make comparisons

  19. Content Analysis as a Best Practice in Technical Communication Research

    ERIC Educational Resources Information Center

    Thayer, Alexander; Evans, Mary; McBride, Alicia; Queen, Matt; Spyridakis, Jan

    2007-01-01

    Content analysis is a powerful empirical method for analyzing text, a method that technical communicators can use on the job and in their research. Content analysis can expose hidden connections among concepts, reveal relationships among ideas that initially seem unconnected, and inform the decision-making processes associated with many technical…

  20. Practical implementation of nonlinear time series methods: The TISEAN package

    E-print Network

    Rainer Hegger; Holger Kantz; Thomas Schreiber

    1998-09-30

    Nonlinear time series analysis is becoming a more and more reliable tool for the study of complicated dynamics from measurements. The concept of low-dimensional chaos has proven to be fruitful in the understanding of many complex phenomena despite the fact that very few natural systems have actually been found to be low dimensional deterministic in the sense of the theory. In order to evaluate the long term usefulness of the nonlinear time series approach as inspired by chaos theory, it will be important that the corresponding methods become more widely accessible. This paper, while not a proper review on nonlinear time series analysis, tries to make a contribution to this process by describing the actual implementation of the algorithms, and their proper usage. Most of the methods require the choice of certain parameters for each specific time series application. We will try to give guidance in this respect. The scope and selection of topics in this article, as well as the implementational choices that have been made, correspond to the contents of the software package TISEAN which is publicly available from http://www.mpipks-dresden.mpg.de/~tisean . In fact, this paper can be seen as an extended manual for the TISEAN programs. It fills the gap between the technical documentation and the existing literature, providing the necessary entry points for a more thorough study of the theoretical background.
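    A basic step underlying many of the TISEAN routines described above is time-delay embedding of a scalar series. A minimal sketch (the actual TISEAN `delay` program offers many more options, and the embedding dimension m and delay tau must be chosen per application, as the paper emphasizes):

```python
import numpy as np

# Hedged sketch of time-delay embedding: build vectors
# (x[i], x[i+tau], ..., x[i+(m-1)*tau]) from a scalar series.
# Parameters m and tau here are arbitrary illustrative choices.

def delay_embed(x, m, tau):
    x = np.asarray(x, dtype=float)
    n = len(x) - (m - 1) * tau  # number of complete delay vectors
    return np.column_stack([x[i * tau : i * tau + n] for i in range(m)])

x = np.sin(0.2 * np.arange(100))
emb = delay_embed(x, m=3, tau=5)
print(emb.shape)  # → (90, 3)
```

    The embedded vectors are the starting point for dimension estimates, Lyapunov exponents, and nonlinear prediction in packages like TISEAN.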

  1. Urban logistics pooling viability analysis via a multicriteria multiactor method

    E-print Network

    Paris-Sud XI, Université de

    Urban logistics pooling viability analysis via a multicriteria multiactor method. Jesus Gonzalez. Collaborative transportation and logistics pooling are relatively new concepts in research, but are very popular in practice. In recent years, collaborative transportation seems a good city logistics alternative to classical urban

  2. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  3. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  4. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  5. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  6. 44 CFR 9.9 - Analysis and reevaluation of practicable alternatives.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...FLOODPLAIN MANAGEMENT AND PROTECTION OF WETLANDS § 9.9 Analysis and reevaluation...requirements to avoid floodplains and wetlands unless there is no practicable alternative...practicable alternative, or the floodplain or wetland is itself not a practicable...

  7. Practical Aspects of the Equation-Error Method for Aircraft Parameter Estimation

    NASA Technical Reports Server (NTRS)

    Morelli, Eugene A.

    2006-01-01

    Various practical aspects of the equation-error approach to aircraft parameter estimation were examined. The analysis was based on simulated flight data from an F-16 nonlinear simulation, with realistic noise sequences added to the computed aircraft responses. This approach exposes issues related to the parameter estimation techniques and results, because the true parameter values are known for simulation data. The issues studied include differentiating noisy time series, maximum likelihood parameter estimation, biases in equation-error parameter estimates, accurate computation of estimated parameter error bounds, comparisons of equation-error parameter estimates with output-error parameter estimates, analyzing data from multiple maneuvers, data collinearity, and frequency-domain methods.
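    The equation-error approach described above (numerically differentiating measured states, then estimating parameters by linear regression on the state equations) can be sketched on a scalar model. The first-order system, noise level, and true parameter values below are illustrative, not the F-16 simulation from the paper.

```python
import numpy as np

# Hedged sketch of equation-error parameter estimation for x_dot = a*x + b*u:
# simulate the system, add measurement noise, differentiate numerically with
# central differences, and solve for (a, b) by least squares. All values are
# illustrative.

dt = 0.001
t = np.arange(0.0, 10.0, dt)
u = np.sin(t)
a_true, b_true = -2.0, 1.0

# Forward-Euler integration of the "true" dynamics.
x = np.zeros_like(t)
for k in range(len(t) - 1):
    x[k + 1] = x[k] + dt * (a_true * x[k] + b_true * u[k])

rng = np.random.default_rng(0)
x_meas = x + 1e-4 * rng.standard_normal(len(x))  # measurement noise

# Central differences approximate x_dot from the noisy measurements.
x_dot = (x_meas[2:] - x_meas[:-2]) / (2 * dt)
A = np.column_stack([x_meas[1:-1], u[1:-1]])
(a_hat, b_hat), *_ = np.linalg.lstsq(A, x_dot, rcond=None)
print(round(float(a_hat), 2), round(float(b_hat), 2))
```

    The sketch also hints at the issues the paper studies: differentiating noisier data amplifies the regression error, and noise in the regressor `x_meas` biases the estimates, motivating the smoothing and bias-correction techniques discussed there.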

  8. A Mixed-Method Approach to Investigating the Adoption of Evidence-Based Pain Practices in Nursing Homes

    PubMed Central

    Ersek, Mary; Jablonski, Anita

    2014-01-01

    This mixed methods study examined perceived facilitators and obstacles to adopting evidence-based pain management protocols vis-a-vis documented practice changes that were measured using a chart audit tool. This analysis used data from a subgroup of four nursing homes that participated in a clinical trial. Focus group interviews with staff yielded qualitative data about perceived factors that affected their willingness and ability to use the protocols. Chart audits determined whether pain assessment and management practices changed over time in light of these reported facilitators and barriers. Reported facilitators included administrative support, staff consistency, and policy and procedure changes. Barriers were staff attitudes, regulatory issues, and provider mistrust of nurses’ judgment. Overall, staff reported improvements in pain practices. These reports were corroborated by modest but significant increases in adherence to recommended practices. Change in clinical practice is complex and requires attention to both structural and process aspects of care. PMID:24640959

  9. An analysis of revenues and expenses in a hospital-based ambulatory pediatric practice.

    PubMed

    Berkelhamer, J E; Rojek, K J

    1988-05-01

    We developed a method of analyzing revenues and expenses in a hospital-based ambulatory pediatric practice. Results of an analysis of the Children's Medical Group (CMG) at the University of Chicago Medical Center demonstrate how changes in collection rates, practice expenses, and hospital underwriting contribute to the financial outcome of the practice. In this analysis, certain programmatic goals of the CMG are achieved at a level of just under 12,000 patient visits per year. At this activity level, pediatric residency program needs are met and income to the CMG physicians is maximized. An ethical problem from the physician's perspective is created by seeking profit maximization. To accomplish this end, the CMG physicians would have to restrict their personal services to only the better-paying patients. This study serves to underscore the importance of hospital-based physicians and hospital administrators structuring fiscal incentives for physicians that mutually meet the institutional goals for the hospital and its physicians. PMID:3358399

  10. Application of agile method in the enterprise website backstage management system: Practices for extreme programming

    Microsoft Academic Search

    Linghui Liu; Yao Lu

    2012-01-01

    As traditional methods lack the ability to adapt to changing requirements, agile software development methods have appeared, whose flexible development mechanisms can control the risk that requirement changes bring. Taking Extreme Programming as an example, this paper introduces the ideas, values, and process practice rules of agile methods. Extreme Programming is an agile software development methodology grounded in practice. This

  11. Neuroscience and the Feldenkrais Method: evidence in research and clinical practice

    E-print Network

    Hickman, Mark

    Some say evidence-based practice stifles the creative therapies and learning modalities. The Feldenkrais Method draws on principles of exploratory practice rather than prescribed exercises and can work at different

  12. Using Qualitative Evaluation Methods to Identify Exemplary Practices in Early Childhood Education.

    ERIC Educational Resources Information Center

    DeStefano, Lizanne; And Others

    1992-01-01

    Describes an alternative method for identifying exemplary practices in early childhood education programs. In this approach, parents and practitioners identify exemplary practices after coming to know the practices and the context in which they operate through observation, interview, and document review. Discusses methodological issues raised by…

  13. Methods for cancer epigenome analysis.

    PubMed

    Nagarajan, Raman P; Fouse, Shaun D; Bell, Robert J A; Costello, Joseph F

    2013-01-01

    Accurate detection of epimutations in tumor cells is crucial for understanding the molecular pathogenesis of cancer. Alterations in DNA methylation in cancer are functionally important and clinically relevant, but even this well-studied area is continually re-evaluated in light of unanticipated results, such as the strong association between aberrant DNA methylation in adult tumors and polycomb group profiles in embryonic stem cells, cancer-associated genetic mutations in epigenetic regulators such as DNMT3A and TET family genes, and the discovery of altered 5-hydroxymethylcytosine, a product of TET proteins acting on 5-methylcytosine, in human tumors with TET mutations. The abundance and distribution of covalent histone modifications in primary cancer tissues relative to normal cells is an important but largely uncharted area, although there is good evidence for a mechanistic role of cancer-specific alterations in histone modifications in tumor etiology, drug response, and tumor progression. Meanwhile, the discovery of new epigenetic marks continues, and there are many useful methods for epigenome analysis applicable to primary tumor samples, in addition to cancer cell lines. For DNA methylation and hydroxymethylation, next-generation sequencing allows increasingly inexpensive and quantitative whole-genome profiling. Similarly, the refinement and maturation of chromatin immunoprecipitation with next-generation sequencing (ChIP-seq) has made possible genome-wide mapping of histone modifications, open chromatin, and transcription factor binding sites. Computational tools have been developed apace with these epigenome methods to better enable accurate interpretation of the profiling data. PMID:22956508

  14. Practicing Intersectionality in Sociological Research: A Critical Analysis of Inclusions, Interactions, and Institutions in the Study

    E-print Network

    Sheridan, Jennifer

Practicing Intersectionality in Sociological Research: A Critical Analysis of Inclusions… we draw attention to the comparative, contextual, and complex dimensions of sociological analysis… Moreover, whether such feminist appeals have practical consequences for sociology is hard…

  15. Practice patterns in FNA technique: A survey analysis

    PubMed Central

    DiMaio, Christopher J; Buscaglia, Jonathan M; Gross, Seth A; Aslanian, Harry R; Goodman, Adam J; Ho, Sammy; Kim, Michelle K; Pais, Shireen; Schnoll-Sussman, Felice; Sethi, Amrita; Siddiqui, Uzma D; Robbins, David H; Adler, Douglas G; Nagula, Satish

    2014-01-01

AIM: To ascertain fine needle aspiration (FNA) techniques by endosonographers with varying levels of experience and environments. METHODS: A survey study was performed on United States based endosonographers. The subjects completed an anonymous online electronic survey. The main outcome measurements were differences in needle choice, FNA technique, and clinical decision making among endosonographers and how these relate to years in practice, volume of EUS-FNA procedures, and practice environment. RESULTS: A total of 210 (30.8%) endosonographers completed the survey. Just over half (51.4%) identified themselves as academic/university-based practitioners. The vast majority of respondents identified themselves as high-volume endoscopic ultrasound (EUS) performers (> 150 EUS/year; 77.1%) and high-volume FNA performers (> 75 FNA/year; 73.3%). If final cytology is non-diagnostic, high-volume EUS physicians were more likely than low-volume physicians to repeat FNA with a core needle (60.5% vs 31.2%; P = 0.0004), and low-volume physicians were more likely to refer patients for either surgical or percutaneous biopsy (33.4% vs 4.9%, P < 0.0001). Academic physicians were more likely to repeat FNA with a core needle (66.7%) compared to community physicians (40.2%, P < 0.001). CONCLUSION: There is significant variation in EUS-FNA practices among United States endosonographers. Differences appear to be related to EUS volume and practice environment. PMID:25324922
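
    The between-group comparisons reported in this survey are standard two-proportion tests. The sketch below is illustrative only: the group sizes (roughly 77% of the 210 respondents as high-volume, the rest as low-volume) are assumptions, since the exact denominators are not given in the abstract.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test using a pooled standard error."""
    x1, x2 = p1 * n1, p2 * n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF via math.erf
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical group sizes: ~162 high-volume vs ~48 low-volume respondents
z, p = two_proportion_z(0.605, 162, 0.312, 48)
print(round(z, 2), round(p, 5))  # p is consistent with the reported P = 0.0004
```

    With these assumed denominators, the resulting p-value lands close to the P = 0.0004 the survey reports for the core-needle comparison.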

  16. Honesty in critically reflective essays: an analysis of student practice.

    PubMed

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-10-01

In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and the barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative reflective essays on clinical encounters using the modified Gibbs cycle, were invited to participate in an anonymous online survey. Student knowledge and beliefs about reflective practice, and disclosure of the truthfulness of their reflections, were assessed using a mixed method approach. A total of 34 students, from a maximum possible of 48 (71%), participated in the study activities. A total of 68% stated that they were at least 80% truthful about their experiences. There was general student consensus that reflective practice was important for their growth as a clinician. Students questioned the belief that the reflection needed to be based on a factual experience. Reflective practice can be a valuable addition to the clinical education of health care professionals, although this value can be diminished through dishonest reflections if it is not carefully implemented. Student influences on honest reflection include: (1) the design of any assessment criteria, and (2) student knowledge and competency in applying critical reflection. PMID:22926807

  17. Designing for scientific data analysis: From practice to prototype

    Microsoft Academic Search

    Springmeyer

    1992-01-01

Designers charged with creating tools for processes foreign to their own experience need a reliable source of application knowledge. This dissertation presents an empirical study of the scientific data analysis process in order to inform the design of tools for this important aspect of scientific computing. Interaction analysis and contextual inquiry methods were adapted to observe scientists analyzing their own…

  18. Estimating free-living human energy expenditure: Practical aspects of the doubly labeled water method and its applications

    PubMed Central

Ishikawa-Takata, Kazuko; Kim, Eunkyung; Kim, Jeonghyun; Yoon, Jinsook

    2014-01-01

    The accuracy and noninvasive nature of the doubly labeled water (DLW) method makes it ideal for the study of human energy metabolism in free-living conditions. However, the DLW method is not always practical in many developing and Asian countries because of the high costs of isotopes and equipment for isotope analysis as well as the expertise required for analysis. This review provides information about the theoretical background and practical aspects of the DLW method, including optimal dose, basic protocols of two- and multiple-point approaches, experimental procedures, and isotopic analysis. We also introduce applications of DLW data, such as determining the equations of estimated energy requirement and validation studies of energy intake. PMID:24944767

  19. Coal Field Fire Fighting - Practiced methods, strategies and tactics

    NASA Astrophysics Data System (ADS)

    Wündrich, T.; Korten, A. A.; Barth, U. H.

    2009-04-01

Subsurface coal fires destroy millions of tons of coal each year, have an immense impact on their ecological surroundings and threaten further coal reserves. Because of the enormous dimensions a coal seam fire can develop, high operational expenses are incurred. As part of the Sino-German coal fire research initiative "Innovative technologies for exploration, extinction and monitoring of coal fires in Northern China", the research team of the University of Wuppertal (BUW) focuses on fire extinction strategies and tactics as well as aspects of environmental and health safety. Besides the choice and correct application of different extinction techniques, further factors are essential for successful extinction: appropriate tactics, well-trained and protected personnel, and the best-fitting extinguishing agents. The strategy chosen for an extinction campaign is generally determined by urgency and importance. It may depend on national objectives and concepts of coal conservation; on environmental protection (e.g. commitments to greenhouse gas (GHG) reductions); on national funding and resources for fire fighting (e.g. personnel, infrastructure, vehicles, water pipelines); and on computer-aided models and simulations of coal fire development from self-ignition to extinction. In order to devise an optimal fire fighting strategy, "aims of protection" have to be defined in a first step. These may be: - directly affected coal seams; - neighboring seams and coalfields; - GHG emissions into the atmosphere; - returns on investment (costs of fire fighting compared to the value of saved coal). In a further step, it is imperative to decide whether the budget shall define the results or the results define the budget; i.e. whether there are fixed objectives for the mission that will dictate the overall budget, or whether the limited resources available shall set the scope within which the best possible results shall be achieved. For effective and efficient fire fighting, optimal tactics are required; these can be divided into four fundamental tactics to control fire hazards: - Defense (digging away the coal so that it cannot begin to burn, or forming a barrier so that the fire cannot reach the unburnt coal); - Rescue of the coal (mining of a seam that is not yet burning); - Attack (active and direct cooling of the burning seam); - Retreat (monitoring only, until self-extinction of the burning seam). The last is used when a fire exceeds the organizational and/or technical scope of a mission. In other words, "to control a coal fire" does not automatically and in all situations mean "to extinguish a coal fire". Best-practice tactics, or a combination of them, can be selected for the control of a particular coal fire. For the extinguishing works, different extinguishing agents are available. They can be applied by different application techniques with varying operating expenses. One application method may be the drilling of boreholes from the surface, or covering the surface with low-permeability soils. The extinguishing agents mainly used for coal field fires are as follows: water (with or without additives), slurry, foaming mud/slurry, inert gases, dry chemicals and materials, and cryogenic agents. Because of its tremendous dimensions and complexity, the worldwide challenge of coal fires is absolutely unique - it can only be solved with functional application methods, best-fitting strategies and tactics, organisation and research, as well as the dedication of the involved fire fighters, who work under extreme individual risk on the burning coal fields.

  20. A practical introduction to multivariate meta-analysis.

    PubMed

    Mavridis, Dimitris; Salanti, Georgia

    2013-04-01

Multivariate meta-analysis is becoming increasingly popular, and official routines or self-programmed functions are now included in many statistical software packages. In this article, we review the statistical methods and the related software for multivariate meta-analysis. Emphasis is placed on Bayesian methods using Markov chain Monte Carlo, and code in WinBUGS is provided. The various model-fitting options are illustrated in two examples, and specific guidance is provided on how to run a multivariate meta-analysis using various software packages. PMID:22275379
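
    The simplest frequentist counterpart of the models reviewed here is fixed-effect multivariate pooling by inverse-variance weighting: theta = (sum W_i)^(-1) sum W_i y_i with W_i the inverse of each study's within-study covariance matrix. A minimal sketch on made-up data (three hypothetical studies, two correlated outcomes each, not data from the article):

```python
import numpy as np

# Hypothetical per-study effect estimates and within-study covariance matrices
y = [np.array([0.30, 0.45]), np.array([0.10, 0.20]), np.array([0.25, 0.35])]
S = [np.array([[0.04, 0.01], [0.01, 0.05]]),
     np.array([[0.03, 0.005], [0.005, 0.04]]),
     np.array([[0.05, 0.02], [0.02, 0.06]])]

# Fixed-effect pooling: theta = (sum W_i)^-1 sum W_i y_i, with W_i = S_i^-1
W = [np.linalg.inv(Si) for Si in S]
V = np.linalg.inv(sum(W))                      # covariance of the pooled estimate
theta = V @ sum(Wi @ yi for Wi, yi in zip(W, y))
se = np.sqrt(np.diag(V))                       # standard errors per outcome
print(theta, se)
```

    Random-effects and Bayesian variants add a between-study covariance matrix on top of each S_i; the pooling formula is otherwise the same.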

  1. Polydispersity analysis of Taylor dispersion data: the cumulant method

    E-print Network

    Luca Cipelletti; Jean-Philippe Biron; Michel Martin; Hervé Cottet

    2014-08-26

Taylor dispersion analysis is an increasingly popular characterization method that measures the diffusion coefficient, and hence the hydrodynamic radius, of (bio)polymers, nanoparticles or even small molecules. In this work, we describe an extension to current data analysis schemes that allows size polydispersity to be quantified for an arbitrary sample, thereby significantly enhancing the potential of Taylor dispersion analysis. The method is based on a cumulant development similar to that used for the analysis of dynamic light scattering data. Specific challenges posed by the cumulant analysis of Taylor dispersion data are discussed, and practical ways to address them are proposed. We successfully test this new method by analyzing both simulated and experimental data for solutions of moderately polydisperse polymers and polymer mixtures.
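
    The cumulant idea borrowed from dynamic light scattering can be illustrated generically: fit a low-order polynomial to the logarithm of a decaying signal, so that the linear coefficient gives the mean decay rate (first cumulant) and the quadratic coefficient the spread (second cumulant), whose ratio quantifies polydispersity. A minimal sketch on simulated, hypothetical data (a two-component mixture; this is the DLS-style analogue, not the article's Taylor-dispersion-specific scheme):

```python
import numpy as np

# Simulated correlation function for a bimodal mixture (hypothetical rates)
tau = np.linspace(1e-5, 5e-4, 100)
g1 = 0.7 * np.exp(-800 * tau) + 0.3 * np.exp(-1600 * tau)

# Cumulant fit: ln g1(tau) ~ -Gamma*tau + (mu2/2)*tau^2 for small Gamma*tau
c2, c1, c0 = np.polyfit(tau, np.log(g1), 2)   # highest degree first
gamma = -c1           # mean decay rate (first cumulant); true value is 1040
mu2 = 2 * c2          # second cumulant (variance of the rate distribution)
pdi = mu2 / gamma**2  # relative polydispersity index
print(gamma, pdi)
```

    The fit window must be restricted to small decays (here Gamma*tau is at most about 0.5), otherwise higher cumulants bias the estimates; the abstract's "specific challenges" concern exactly this kind of truncation issue in the Taylor dispersion setting.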

  2. Parallel Processable Cryptographic Methods with Unbounded Practical Security.

    ERIC Educational Resources Information Center

    Rothstein, Jerome

    Addressing the problem of protecting confidential information and data stored in computer databases from access by unauthorized parties, this paper details coding schemes which present such astronomical work factors to potential code breakers that security breaches are hopeless in any practical sense. Two procedures which can be used to encode for…

  3. A Practical Guide to Interpretation of Large Collections of Incident Narratives Using the QUORUM Method

    NASA Technical Reports Server (NTRS)

    McGreevy, Michael W.

    1997-01-01

Analysis of incident reports plays an important role in aviation safety. Typically, a narrative description, written by a participant, is a central part of an incident report. Because there are so many reports, and the narratives contain so much detail, it can be difficult to efficiently and effectively recognize patterns among them. Recognizing and addressing recurring problems, however, is vital to continuing safety in commercial aviation operations. A practical way to interpret large collections of incident narratives is to apply the QUORUM method of text analysis, modeling, and relevance ranking. In this paper, QUORUM text analysis and modeling are surveyed, and QUORUM relevance ranking is described in detail with many examples. The examples are based on several large collections of reports from the Aviation Safety Reporting System (ASRS) database, and a collection of news stories describing the disaster of TWA Flight 800, the Boeing 747 which exploded in mid-air and crashed near Long Island, New York, on July 17, 1996. Reader familiarity with this disaster should make the relevance-ranking examples more understandable. The ASRS examples illustrate the practical application of QUORUM relevance ranking.

  4. Articulating current service development practices: a qualitative analysis of eleven mental health projects

    PubMed Central

    2014-01-01

Background: The utilisation of good design practices in the development of complex health services is essential to improving quality. Healthcare organisations, however, are often seriously out of step with modern design thinking and practice. As a starting point to encourage the uptake of good design practices, it is important to understand the context of their intended use. This study aims to do that by articulating current health service development practices. Methods: Eleven service development projects carried out in a large mental health service were investigated through in-depth interviews with six operation managers. The critical decision method in conjunction with diagrammatic elicitation was used to capture descriptions of these projects. Stage-gate design models were then formed to visually articulate, classify and characterise different service development practices. Results: Projects were grouped into three categories according to design process patterns: new service introduction and service integration; service improvement; service closure. Three common design stages (problem exploration, idea generation and solution evaluation) were then compared across the design process patterns. Consistent across projects were a top-down, policy-driven approach to exploration, underexploited idea generation and implementation-based evaluation. Conclusions: This study provides insight into where and how good design practices can contribute to the improvement of current service development practices. Specifically, the following suggestions for future service development practices are made: genuine user needs analysis for exploration; divergent thinking and innovative culture for idea generation; and fail-safe evaluation prior to implementation. Better training for managers through partnership working with design experts and researchers could be beneficial. PMID:24438471

  5. Comparison of homotopy analysis method and homotopy perturbation method

    E-print Network

    Jeffrey, David

    Comparison of homotopy analysis method and homotopy perturbation method through an evolution by Liao in 1992 and the homotopy perturbation method (HPM) proposed by He in 1998 are compared through solutions (either analytical ones or numerical ones) can be expected. Perturbation method is one of the well

  6. Analysis of flight equipment purchasing practices of representative air carriers

    NASA Technical Reports Server (NTRS)

    1977-01-01

The process through which representative air carriers decide whether or not to purchase flight equipment was investigated, as well as their practices and policies in retiring surplus aircraft. An analysis of the flight equipment investment decision process in ten airlines shows that, for the airline industry as a whole, the flight equipment investment decision is in a state of transition, from a wholly informal process in its earliest years to a much more organized and structured process in the future. Individual air carriers are in different stages with respect to the formality and sophistication associated with the flight equipment investment decision.

  7. Investigating the Efficacy of Practical Skill Teaching: A Pilot-Study Comparing Three Educational Methods

    ERIC Educational Resources Information Center

    Maloney, Stephen; Storr, Michael; Paynter, Sophie; Morgan, Prue; Ilic, Dragan

    2013-01-01

    Effective education of practical skills can alter clinician behaviour, positively influence patient outcomes, and reduce the risk of patient harm. This study compares the efficacy of two innovative practical skill teaching methods, against a traditional teaching method. Year three pre-clinical physiotherapy students consented to participate in a…

  8. Assessing Student Perception of Practice Evaluation Knowledge in Introductory Research Methods

    ERIC Educational Resources Information Center

    Baker, Lisa R.; Pollio, David E.; Hudson, Ashley

    2011-01-01

    The authors explored the use of the Practice Evaluation Knowledge Scale (PEKS) to assess student perception of acquisition and retention of practice evaluation knowledge from an undergraduate research methods class. The authors sampled 2 semesters of undergraduate social work students enrolled in an introductory research methods course.…

  9. Return on Investment in Electronic Health Records in Primary Care Practices: A Mixed-Methods Study

    PubMed Central

    Sanche, Steven

    2014-01-01

Background: The use of electronic health records (EHR) in clinical settings is considered pivotal to a patient-centered health care delivery system. However, uncertainty in cost recovery from EHR investments remains a significant concern in primary care practices. Objective: Guided by the question of "When implemented in primary care practices, what will be the return on investment (ROI) from an EHR implementation?", the objectives of this study are two-fold: (1) to assess ROI from EHR in primary care practices and (2) to identify principal factors affecting the realization of positive ROI from EHR. We used a break-even point, that is, the time required to achieve cost recovery from an EHR investment, as an ROI indicator of an EHR investment. Methods: Given the complexity exhibited by most EHR implementation projects, this study adopted a retrospective mixed-method research approach, particularly a multiphase study design approach. For this study, data were collected from community-based primary care clinics using EHR systems. Results: We collected data from 17 primary care clinics using EHR systems. Our data show that the sampled primary care clinics recovered their EHR investments within an average period of 10 months (95% CI 6.2-17.4 months), seeing more patients with an average increase of 27% in the active-patients-to-clinician-FTE (full time equivalent) ratio and an average increase of 10% in the active-patients-to-clinical-support-staff-FTE ratio after an EHR implementation. Our analysis suggests, with a 95% confidence level, that the increase in the number of active patients (P=.006), the increase in the active-patients-to-clinician-FTE ratio (P<.001), and the increase in the clinic net revenue (P<.001) are positively associated with the EHR implementation, likely contributing substantially to an average break-even point of 10 months. Conclusions: We found that primary care clinics can realize a positive ROI with EHR. Our analysis of the variances in the time required to achieve cost recovery from EHR investments suggests that a positive ROI does not appear automatically upon implementing an EHR and that a clinic’s ability to leverage EHR for process changes seems to play a role. Policies that provide support to help primary care practices successfully make EHR-enabled changes, such as support of clinic workflow optimization with an EHR system, could facilitate the realization of positive ROI from EHR in primary care practices. PMID:25600508
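
    The break-even indicator used in this study is simply the time for cumulative net gains to cover the initial outlay. A minimal sketch with hypothetical cost figures (not the study's data):

```python
# Hypothetical figures for one clinic, chosen for illustration only
ehr_investment = 120_000   # up-front EHR implementation cost
monthly_net_gain = 12_000  # monthly revenue increase minus ongoing EHR costs

def break_even_months(investment, monthly_gain):
    """Months until cumulative net gains recover the initial investment."""
    months, recovered = 0, 0.0
    while recovered < investment:
        months += 1
        recovered += monthly_gain
    return months

print(break_even_months(ehr_investment, monthly_net_gain))  # 10 months here
```

    The month-by-month loop (rather than a single division) generalizes directly to the realistic case where the net gain varies over time, e.g. ramping up as workflow changes take hold.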

  10. 78 FR 11611 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

…Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls…

  11. 78 FR 69604 - Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-20

…Current Good Manufacturing Practice and Hazard Analysis and Risk-Based Preventive Controls…

  12. Good practices in LIBS analysis: Review and advices

    NASA Astrophysics Data System (ADS)

    El Haddad, J.; Canioni, L.; Bousquet, B.

    2014-11-01

This paper presents a review of the analytical results obtained by laser-induced breakdown spectroscopy (LIBS). In the first part, results on identification and classification of samples are presented, including the risk of misclassification; in the second part, results on concentration measurement based on calibration are accompanied by significant figures of merit, including the concept of accuracy. Both univariate and multivariate approaches are discussed, with special emphasis on the methodology, the way of presenting the results and the assessment of the methods. Finally, good practices are proposed for both classification and concentration measurement.

  13. Error analysis in some recent versions of the Fry Method

    NASA Astrophysics Data System (ADS)

    Srivastava, D. C.; Kumar, R.

    2013-12-01

The Fry method is a graphical technique that directly displays the strain ellipse in the form of a central vacancy on a point distribution, the Fry plot. For accurate strain estimation from the Fry plot, the central vacancy must appear as a sharply focused perfect ellipse. The diffuse nature of the central vacancy, common in practice, introduces subjectivity into direct strain estimation from the Fry plot. Several alternative methods, based on point density contrast, image analysis, Delaunay triangulation, or point distribution analysis, exist for objective strain estimation from Fry plots. The relative merits and limitations of these methods are, however, not yet well explored and understood. This study compares the accuracy and efficacy of the six methods proposed for objective determination of strain from Fry plots. Our approach consists of: (i) graphical simulation of variously sorted object sets, (ii) distortion of different object sets by known strain in pure shear, simple shear and simultaneous pure-and-simple shear deformations, and (iii) error analysis and comparison of the six methods. Results from more than 1000 tests reveal that the Delaunay triangulation method, the point density contrast methods and the image analysis method are relatively more accurate and versatile. The amount and nature of distortion, and the degree of sorting, have little effect on the accuracy of results in these methods. The point distribution analysis methods are successful provided the pre-deformed objects were well sorted and defined by specific types of point distribution. Both the Delaunay triangulation method and the image analysis method are more time-efficient than the point distribution analysis methods; the time efficiency of the density contrast methods lies between these two extremes.

  14. Prognostic Analysis System and Methods of Operation

    NASA Technical Reports Server (NTRS)

    MacKey, Ryan M. E. (Inventor); Sneddon, Robert (Inventor)

    2014-01-01

    A prognostic analysis system and methods of operating the system are provided. In particular, a prognostic analysis system for the analysis of physical system health applicable to mechanical, electrical, chemical and optical systems and methods of operating the system are described herein.

  15. A deliberate practice approach to teaching phylogenetic analysis.

    PubMed

    Hobbs, F Collin; Johnson, Daniel J; Kearns, Katherine D

    2013-01-01

    One goal of postsecondary education is to assist students in developing expert-level understanding. Previous attempts to encourage expert-level understanding of phylogenetic analysis in college science classrooms have largely focused on isolated, or "one-shot," in-class activities. Using a deliberate practice instructional approach, we designed a set of five assignments for a 300-level plant systematics course that incrementally introduces the concepts and skills used in phylogenetic analysis. In our assignments, students learned the process of constructing phylogenetic trees through a series of increasingly difficult tasks; thus, skill development served as a framework for building content knowledge. We present results from 5 yr of final exam scores, pre- and postconcept assessments, and student surveys to assess the impact of our new pedagogical materials on student performance related to constructing and interpreting phylogenetic trees. Students improved in their ability to interpret relationships within trees and improved in several aspects related to between-tree comparisons and tree construction skills. Student feedback indicated that most students believed our approach prepared them to engage in tree construction and gave them confidence in their abilities. Overall, our data confirm that instructional approaches implementing deliberate practice address student misconceptions, improve student experiences, and foster deeper understanding of difficult scientific concepts. PMID:24297294

  16. Usability inspection methods after 15 years of research and practice

    Microsoft Academic Search

    Tasha Hollingsed; David G. Novick

    2007-01-01

Usability inspection methods, such as heuristic evaluation, the cognitive walkthrough, formal usability inspections, and the pluralistic usability walkthrough, were introduced fifteen years ago. Since then, these methods, analyses of their comparative effectiveness, and their use have evolved in different ways. In this paper, we track the fortunes of the methods and analyses, looking at which led to use and to…

  17. Usability Inspection Methods after 15 Years of Research and Practice

    E-print Network

    Novick, David G.

…introduced fifteen years ago. Since then, these methods, analyses of their comparative effectiveness… of empirical usability testing versus other, less costly, methods (see, e.g., [10], [19], [20], [24]). Full-blown usability testing was effective but expensive. Other methods, generally known under the category…

  18. Schwarz Preconditioners for Krylov Methods: Theory and Practice

    SciTech Connect

    Szyld, Daniel B.

    2013-05-10

Several numerical methods were produced and analyzed. The main thrust of the work relates to inexact Krylov subspace methods for the solution of linear systems of equations arising from the discretization of partial differential equations. These are iterative methods, i.e., methods in which an approximation is obtained at each step. Usually, a matrix-vector product is needed at each iteration. In the inexact methods, this product (or the application of a preconditioner) can be done inexactly. Schwarz methods, based on domain decompositions, are excellent preconditioners for these systems. We contributed towards their understanding from an algebraic point of view, developed new ones, and studied their performance in the inexact setting. We also worked on combinatorial problems to help define the algebraic partition of the domains, with the needed overlap, as well as on PDE-constrained optimization using the above-mentioned inexact Krylov subspace methods.
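
    The combination described here can be sketched as a one-level additive Schwarz (overlapping block) preconditioner inside the conjugate gradient Krylov method. The sketch below uses a 1-D Poisson test matrix and dense subdomain solves for clarity; it is illustrative only, not the report's implementation, and the block/overlap sizes are arbitrary choices.

```python
import numpy as np

# 1-D Poisson matrix (tridiagonal), a standard PDE-discretization test case
n = 64
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.ones(n)

def make_additive_schwarz(A, block=16, overlap=2):
    """One-level additive Schwarz: solve on overlapping blocks and sum."""
    n = A.shape[0]
    subdomains = []
    for start in range(0, n, block):
        lo, hi = max(0, start - overlap), min(n, start + block + overlap)
        idx = np.arange(lo, hi)
        subdomains.append((idx, np.linalg.inv(A[np.ix_(idx, idx)])))
    def M_inv(r):
        z = np.zeros_like(r)
        for idx, Ainv in subdomains:
            z[idx] += Ainv @ r[idx]   # restrict, solve locally, prolong, sum
        return z
    return M_inv

def pcg(A, b, M_inv, tol=1e-10, maxit=500):
    """Preconditioned conjugate gradients for SPD A."""
    x = np.zeros_like(b)
    r = b - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxit):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, it + 1
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxit

x, iters = pcg(A, b, make_additive_schwarz(A))
print(iters, np.linalg.norm(A @ x - b))
```

    In the inexact setting studied in the report, the subdomain solves (here exact inverses) would themselves be approximate, which is what motivates the algebraic analysis of how much inexactness the outer Krylov iteration tolerates.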

  19. Practical learning method for multi-scale entangled states

    NASA Astrophysics Data System (ADS)

    Landon-Cardinal, Olivier; Poulin, David

    2012-08-01

    We describe two related methods for reconstructing multi-scale entangled states from a small number of efficiently-implementable measurements and fast post-processing. Both methods only require single-particle measurements and the total number of measurements is polynomial in the number of particles. Data post-processing for state reconstruction uses standard tools, namely matrix diagonalization and conjugate gradient method, and scales polynomially with the number of particles. Both methods prevent the build-up of errors from both numerical and experimental imperfections. The first method is conceptually simpler but requires unitary control. The second method circumvents the need for unitary control but requires more measurements and produces an estimated state of lower fidelity.

  20. Practical Inexact Proximal Quasi-Newton Method with Global ...

    E-print Network

    2014-03-14

    Mar 14, 2014 ... Hessian approximations, and provide a global convergence rate analysis in the spirit of ...... which are preprocessed from breast cancer data and gene expression ... UCI Adult benchmark set a9a used for income classification, ...

  1. SNM measurement methods: the state of the practice

    Microsoft Academic Search

    1982-01-01

In order to determine the specific applications and performance of special nuclear materials (SNM) measurement methods being used routinely in nuclear fuel production facilities, a survey of users was conducted at 22 commercial and DOE plants. The methods used and their performance when applied to the wide range of uranium and plutonium materials encountered during processing were evaluated. The most…

  2. Impact Assessment: European Experience of Qualitative Methods and Practices

    Microsoft Academic Search

    Erkki Ormala

    1994-01-01

In Europe, impact assessment is an essential element of evaluation. Impact cannot usually be measured directly but has to be explored in view of a set of both quantitative and qualitative impact dimensions. The impact also needs to be placed in the relevant context. At present, a range of established methods is available for use by evaluators. The methods can…

  3. Status of Activator Methods Chiropractic Technique, Theory, and Practice

    Microsoft Academic Search

    Arlan W. Fuhr; J. Michael Menke

    2005-01-01

Objective: To provide an historical overview, description, synthesis, and critique of the Activator Adjusting Instrument (AAI) and Activator Methods Chiropractic Technique of clinical assessment. Methods: Online resources were searched including Index to Chiropractic Literature, EBSCO Online, MANTIS, CHIROLARS, CINAHL, eJournals, Ovid, MDConsult, Lane Catalog, SU Catalog, and Pubmed. Relevant peer-reviewed studies, commentaries, and reviews were selected. Studies fell into 2…

  4. Professional Suitability for Social Work Practice: A Factor Analysis

    ERIC Educational Resources Information Center

    Tam, Dora M. Y.; Coleman, Heather; Boey, Kam-Wing

    2012-01-01

    Objective: The purpose of this study was to identify the underlying dimensions of professional suitability. Method: Data were collected from a province-wide mail-out questionnaire surveying 341 participants from a random sample of registered social workers. Results: The use of an exploratory factor analysis identified a 5-factor solution on…

  5. Practical quantitative lithic use-wear analysis using multiple classifiers

    Microsoft Academic Search

    Nathan E. Stevens; Douglas R. Harro; Alan Hicklin

    2010-01-01

Although use-wear analysis of prehistoric stone tools using conventional microscopy has proven useful to archaeologists interested in tool function, critics have questioned the reliability and repeatability of the method. The research presented here shows it is possible to quantitatively discriminate between various contact materials (e.g., wood, antler) using laser scanning confocal microscopy in conjunction with conventional edge damage data. Experiments…

  6. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-01-01

    This book introduces DMA, its history, and its current position as part of thermal analysis on polymers. It discusses major types of instrumentation, including oscillatory rotational, oscillatory axial, and torsional pendulum. It also describes analytical techniques in terms of utility, quality of data, methods of calibration, and suitability for different types of materials and assesses applications for thermoplastics, thermosetting systems, and thermosets.

  7. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)

    2012-05-29

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  8. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    DOEpatents

    West, Phillip B. (Idaho Falls, ID); Novascone, Stephen R. (Idaho Falls, ID); Wright, Jerry P. (Idaho Falls, ID)

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  9. Current status of methods for shielding analysis

    SciTech Connect

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed.

  10. Moodtrack : practical methods for assembling emotion-driven music

    E-print Network

    Vercoe, G. Scott

    2006-01-01

    This thesis presents new methods designed for the deconstruction and reassembly of musical works based on a target emotional contour. Film soundtracks provide an ideal testing ground for organizing music around strict ...

  11. Documenting Elementary Teachers' Sustainability of Instructional Practices: A Mixed Method Case Study

    NASA Astrophysics Data System (ADS)

    Cotner, Bridget A.

School reform programs focus on making educational changes; however, research on interventions past the funded implementation phase to determine what was sustained is rarely done (Beery, Senter, Cheadle, Greenwald, Pearson, et al., 2005). This study adds to the research on sustainability by determining which instructional practices, if any, of the Teaching SMART® professional development program, implemented from 2005--2008 in elementary schools with teachers in grades three through eight, were continued, discontinued, or adapted five years post-implementation (in 2013). Specifically, this study sought to answer the following questions: What do teachers who participated in Teaching SMART® and district administrators share about the sustainability of Teaching SMART® practices in 2013? What teaching strategies do teachers who participated in the program (2005--2008) use in their science classrooms five years post-implementation (2013)? What perceptions about the roles of females in science, technology, engineering, and mathematics (STEM) do teachers who participated in the program (2005--2008) have five years later (2013)? And what classroom management techniques do the teachers who participated in the program (2005--2008) use five years post-implementation (2013)? A mixed method approach was used to answer these questions. Quantitative teacher survey data from 23 teachers who participated in 2008 and 2013 were analyzed in SAS v. 9.3. Descriptive statistics were reported and paired t-tests were conducted to determine mean differences by survey factors identified from an exploratory factor analysis, principal axis factoring, and parallel analysis conducted with teacher survey baseline data (2005). Individual teacher change scores (2008 and 2013) for identified factors were computed using the Reliable Change Index statistic.
Qualitative data consisted of interviews with two district administrators and three teachers who responded to the survey in both years (2008 and 2013). Additionally, a classroom observation was conducted with one of the interviewed teachers in 2013. Qualitative analyses were conducted following the constant comparative method and were facilitated by ATLAS.ti v. 6.2, a qualitative analysis software program. Qualitative findings identified themes at the district level that influenced teachers' use of Teaching SMART® strategies. All the themes were classified as obstacles to sustainability: economic downturn, turnover of teachers and lack of hiring, new reform policies (such as Race to the Top, the Student Success Act, and the Common Core State Standards), and mandated blocks of time for specific content. Results from the survey data showed no statistically significant difference through time in perceived instructional practices except for a perceived decrease in the use of hands-on instructional activities from 2008 to 2013. Analyses conducted at the individual teacher level found change scores were statistically significant for a few teachers, but overall, teachers reported similarly on the teacher survey at both time points. This sustainability study revealed the lack of facilitating factors to support the continuation of reform practices; however, teachers identified strategies to continue to implement some of the reform practices through time in spite of a number of system-wide obstacles. This study adds to the literature by documenting obstacles to sustainability in this specific context, which overlap with what is known in the literature. Additionally, the strategies teachers identified to overcome some of the obstacles to implement reform practices, and the recommendations by district-level administrators, add to the literature on how stakeholders may support sustainability of reform through time.
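The Reliable Change Index used above is the standard Jacobson-Truax statistic; a minimal sketch of the computation, with purely illustrative scores, baseline SD, and reliability (none of these values come from the study):

```python
import math

def reliable_change_index(score_pre, score_post, sd_baseline, reliability):
    """Jacobson-Truax Reliable Change Index.

    RCI = (post - pre) / S_diff, where S_diff = sqrt(2) * SEM and
    SEM = SD * sqrt(1 - reliability). |RCI| > 1.96 suggests change
    beyond measurement error at the 5% level.
    """
    sem = sd_baseline * math.sqrt(1.0 - reliability)
    s_diff = math.sqrt(2.0) * sem
    return (score_post - score_pre) / s_diff

# Illustrative values only (not from the study):
rci = reliable_change_index(score_pre=3.2, score_post=3.9,
                            sd_baseline=0.8, reliability=0.85)
print(abs(rci) > 1.96)  # is the change reliable at the 5% level?
```

With these toy numbers the RCI is about 1.6, so the change would not be counted as reliable.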

  12. Autoethnography as a Method for Reflexive Research and Practice in Vocational Psychology

    ERIC Educational Resources Information Center

    McIlveen, Peter

    2008-01-01

    This paper overviews the qualitative research method of autoethnography and its relevance to research in vocational psychology and practice in career development. Autoethnography is a reflexive means by which the researcher-practitioner consciously embeds himself or herself in theory and practice, and by way of intimate autobiographic account,…

  13. Measuring oral sensitivity in clinical practice: a quick and reliable behavioural method.

    PubMed

    Dovey, Terence M; Aldridge, Victoria K; Martin, Clarissa I

    2013-12-01

This article aims to offer a behavioural assessment strategy for oral sensitivity that can be readily applied in the clinical setting. Four children, ranging in age and with a variety of developmental and medical problems, were used as test cases for a task analysis of tolerance to touch probes in and around the mouth. In all cases, the assessment was sensitive to weekly measures of an intervention for oral sensitivity over a 3-week period. Employing an inexpensive, direct, individual-specific, replicable, reliable, and effective measure for a specific sensory problem would fit better with the edicts of evidence-based practice. The current method offered the initial evidence towards this goal. PMID:23515637

  14. Theory, Method and Practice of Neuroscientific Findings in Science Education

    ERIC Educational Resources Information Center

    Liu, Chia-Ju; Chiang, Wen-Wei

    2014-01-01

    This report provides an overview of neuroscience research that is applicable for science educators. It first offers a brief analysis of empirical studies in educational neuroscience literature, followed by six science concept learning constructs based on the whole brain theory: gaining an understanding of brain function; pattern recognition and…

  15. Grounded action research: a method for understanding IT in practice

    Microsoft Academic Search

    Richard Baskerville; Jan Pries-Heje

    1999-01-01

    This paper shows how the theory development portion of action research can be made more rigorous. The process of theory formulation is an essential part of action research, yet this process is not well understood. A case study demonstrates how units of analysis and techniques from grounded theory can be integrated into the action research cycle in order to add

  16. Short time-series microarray analysis: Methods and challenges

    PubMed Central

    Wang, Xuewei; Wu, Ming; Li, Zheng; Chan, Christina

    2008-01-01

The detection and analysis of steady-state gene expression has become routine. Time-series microarrays are of growing interest to systems biologists for deciphering the dynamic nature and complex regulation of biosystems. Most temporal microarray data contain only a limited number of time points, giving rise to short time-series data, which imposes challenges for traditional methods of extracting meaningful information. Obtaining useful information from the wealth of short time-series data requires addressing the problems that arise from limited sampling. Current efforts have shown promise in improving the analysis of short time-series microarray data, although challenges remain. This commentary addresses recent advances in methods for short time-series analysis, including simplification-based approaches and the integration of multi-source information. Nevertheless, further studies and development of computational methods are needed to provide practical solutions that fully exploit the potential of these data. PMID:18605994

  17. Evaluating participatory decision processes: which methods inform reflective practice?

    PubMed

    Kaufman, Sanda; Ozawa, Connie P; Shmueli, Deborah F

    2014-02-01

Evaluating participatory decision processes serves two key purposes: validating the usefulness of specific interventions for stakeholders, interveners and funders of conflict management processes, and improving practice. However, evaluation design remains challenging, partly because when attempting to serve both purposes we may end up serving neither well. In fact, the better we respond to one, the less we may satisfy the other. Evaluations tend to focus on endogenous factors (e.g., stakeholder selection, BATNAs, mutually beneficial tradeoffs, quality of the intervention), because we believe that the success of participatory decision processes hinges on them, and they also seem to lend themselves to ceteris paribus statistical comparisons across cases. We argue that context matters too and that contextual differences among specific cases are meaningful enough to undermine conclusions derived solely from comparisons of process-endogenous factors implicitly rooted in the ceteris paribus assumption. We illustrate this argument with an environmental mediation case. We compare data collected about it through surveys geared toward comparability across cases to information elicited through in-depth interviews geared toward case specifics. The surveys, designed by the U.S. Institute of Environmental Conflict Resolution, feed a database of environmental conflicts that can help make the (statistical) case for intervention in environmental conflict management. Our interviews elicit case details, including context, that enable interveners to link context specifics and intervention actions to outcomes. We argue that neither approach can "serve both masters." PMID:24121657

  18. Methods for the analysis of triacylglycerols

    Microsoft Academic Search

    V. Ruiz-Gutiérrez; L. J. R. Barron

    1995-01-01

This article discusses the methods most commonly employed in the analysis of the triacylglycerols (TAGs) in natural fats and considers the main advantages and disadvantages of each and the techniques for optimising analytical conditions. Complete analysis of the composition of a natural fat calls for a method of extracting and purifying the triglyceride fraction, normally by preparative thin-layer and column

  19. Text analysis methods, text analysis apparatuses, and articles of manufacture

    DOEpatents

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  20. Methods of DNA methylation analysis.

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The purpose of this review was to provide guidance for investigators who are new to the field of DNA methylation analysis. Epigenetics is the study of mitotically heritable alterations in gene expression potential that are not mediated by changes in DNA sequence. Recently, it has become clear that n...

  1. Probabilistic analysis of sequencing methods

    E-print Network

    Sontag, Eduardo

to the point where it is possible to sequence entire genomes. Indeed, the International Human Genome Project announced a draft of the entire human genome in the Spring of 2001. The human genome is approximately 3 × 10^9 bp long, so this was an ambitious project requiring massive amounts of lab work and data analysis. In this chapter we discuss some
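A classic result in the probabilistic analysis of shotgun sequencing is the Lander-Waterman estimate that, at coverage c = NL/G, an expected fraction e^(-c) of the genome remains uncovered; whether this chapter uses that exact model is an assumption, but it illustrates the kind of analysis described. A minimal sketch with illustrative read counts:

```python
import math

def expected_uncovered_fraction(num_reads, read_len, genome_len):
    """Lander-Waterman model: with coverage c = N*L/G, the expected
    fraction of the genome left uncovered by uniformly random reads
    is e^(-c)."""
    c = num_reads * read_len / genome_len
    return math.exp(-c)

# Illustrative: 30 million reads of 500 bp over a 3e9 bp genome (5x coverage)
print(round(expected_uncovered_fraction(30e6, 500, 3e9), 4))  # ≈ 0.0067
```

At 5x coverage, roughly 0.7% of the genome is still expected to be missed, which is why deep oversampling was needed for the human genome.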

  2. Practical Aspects of a Method of Elementary School Closing.

    ERIC Educational Resources Information Center

    Puleo, Vincent T.

    This method of determining which elementary school to close involves the identification of a number of factors to be considered and the weighting of those factors according to importance. It involves five steps: establishing a precise reason or purpose for school closing, selecting factors to quantify this purpose, deciding on a weighting scheme…

  3. A practical method for evaluating capability of scene matching algorithms

    Microsoft Academic Search

    Fangfang He; Jiyin Sun; Wenpu Guo

    2006-01-01

Evaluating the capability of matching algorithms for practical engineering applications of scene-matching navigation systems can advance research in this area. This paper proposes a practical, database-based capability evaluation method developed through computer-aided simulation studies. First, a systematic project plan is designed with four parts: database administration, image-pair creation, matching-experiment simulation, and experimental data

  4. Identification of potential strategies, methods, and tools for improving cost estimating practices for highway projects

    E-print Network

    Donnell, Kelly Elaine

    2005-08-29

    of strategies, methods, and tools for project cost estimation practices aimed at achieving greater consistency and accuracy between the project development phases. A literature review was conducted that assisted in identifying factors that lead to the cost...

  5. Probabilistic structural analysis by extremum methods

    NASA Technical Reports Server (NTRS)

    Nafday, Avinash M.

    1990-01-01

    The objective is to demonstrate discrete extremum methods of structural analysis as a tool for structural system reliability evaluation. Specifically, linear and multiobjective linear programming models for analysis of rigid plastic frames under proportional and multiparametric loadings, respectively, are considered. Kinematic and static approaches for analysis form a primal-dual pair in each of these models and have a polyhedral format. Duality relations link extreme points and hyperplanes of these polyhedra and lead naturally to dual methods for system reliability evaluation.

  6. Scenario driven requirements analysis method

    Microsoft Academic Search

    Wei Wang; Steve Hufnagel; Pei Hsia; Seung Min Yang

    1992-01-01

    Application specific scenarios are used to develop a system's requirements-specification document and then iterate through the software development lifecycle. A seamless object-oriented approach is presented, suitable for the development of large real-time systems. It starts with requirements-analysis and specifications definition. It then iterates through the design, implementation, and test phases. The items of interest in each phase of the lifecycle

  7. [Analysis of heart rate variability : Mathematical description and practical application].

    PubMed

    Sammito, S; Böckelmann, I

    2015-03-01

The analysis of heart rate variability (HRV) has recently become established as a non-invasive measurement for estimating demands on the cardiovascular system. The HRV reflects the interaction of the sympathetic and parasympathetic nervous systems and allows the influence of the autonomic nervous system on the regulation of the cardiovascular system to be described mathematically. This review explains HRV analysis using time-domain, frequency-domain and non-linear methods, as well as the range of parameters and the required acquisition time. The necessity of and possibilities for artefact correction, and advice on selecting a reasonable acquisition period, are discussed, and standard values for selected HRV parameters are presented. PMID:25298003
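The time-domain HRV parameters such a review typically covers include SDNN and RMSSD, whose definitions are standard; a minimal sketch with illustrative RR intervals (not data from the article):

```python
import statistics

def sdnn(rr_ms):
    """SDNN: sample standard deviation of all normal RR intervals (ms)."""
    return statistics.stdev(rr_ms)

def rmssd(rr_ms):
    """RMSSD: root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

rr = [812, 790, 805, 821, 798, 810]  # illustrative RR intervals in ms
print(round(sdnn(rr), 1), round(rmssd(rr), 1))  # → 10.9 18.1
```

SDNN reflects overall variability, while RMSSD emphasizes beat-to-beat (parasympathetically mediated) variation, which is why both are usually reported.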

  8. [Use of the method of immunothermistography in obstetrical practice].

    PubMed

    Pali?, G K; Berezovskaia, S B

    1990-05-01

This paper presents results of testing pregnant women for staphylococcal antigen sensitization using immunothermistography (ITG). The ITG method is based on registering the change in environmental heat conduction with a microthermistor resistor during the antigen-antibody reaction. The study group comprised 75 pregnant women immunized with staphylococcal anaphylotoxin, as well as nonimmunized pregnant and nonpregnant women. Blood alpha-antistaphylolysin levels showed a close direct correlation with ITG findings. Combined use of these methods identified a population of pregnant women requiring immunization. PMID:2204287

  9. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, J.D.

    2002-04-22

With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  10. Maximizing Return From Sound Analysis and Design Practices

    SciTech Connect

    Bramlette, Judith Lynn

    2002-06-01

With today's tightening budgets, computer applications must provide "true" long-term benefit to the company. Businesses are spending large portions of their budgets "re-engineering" old systems to take advantage of "new" technology. But what they are really getting is simply a new interface implementing the same incomplete or poorly defined requirements as before. "True" benefit can only be gained if sound analysis and design practices are used. WHAT data and processes are required of a system is not the same as HOW the system will be implemented within a company. It is the System Analyst's responsibility to understand the difference between these two concepts. The paper discusses some simple techniques to be used during the Analysis and Design phases of projects, as well as the information gathered and recorded in each phase and how it is transformed between these phases. The paper also covers production applications generated using Oracle Designer. Applying these techniques to "real world" problems, the applications will meet the needs of today's business and adapt easily to ever-changing business environments.

  11. Triangle area water supply monitoring project, October 1988 through September 2001, North Carolina -- description of the water-quality network, sampling and analysis methods, and quality-assurance practices

    USGS Publications Warehouse

    Oblinger, Carolyn J.

    2004-01-01

The Triangle Area Water Supply Monitoring Project was initiated in October 1988 to provide long-term water-quality data for six area water-supply reservoirs and their tributaries. In addition, the project provides data that can be used to determine the effectiveness of large-scale changes in water-resource management practices, document differences in water quality among water-supply types (large multiuse reservoir, small reservoir, run-of-river), and supply tributary-loading and in-lake data for water-quality modeling of Falls and Jordan Lakes. By September 2001, the project had progressed in four phases and included as many as 34 sites (in 1991). Most sites were sampled and analyzed by the U.S. Geological Survey. Some sites were already a part of the North Carolina Division of Water Quality statewide ambient water-quality monitoring network and were sampled by the Division of Water Quality. The network has provided data on streamflow, physical properties, and concentrations of nutrients, major ions, metals, trace elements, chlorophyll, total organic carbon, suspended sediment, and selected synthetic organic compounds. Project quality-assurance activities include written procedures for sample collection, record management and archiving, collection of field quality-control samples (blank samples and replicate samples), and monitoring the quality of field supplies. In addition to project quality-assurance activities, the quality of laboratory analyses was assessed through laboratory quality-assurance practices and an independent laboratory quality-control assessment provided by the U.S. Geological Survey Branch of Quality Systems through the Blind Inorganic Sample Project and the Organic Blind Sample Project.

  12. Understanding Online Teacher Best Practices: A Thematic Analysis to Improve Learning

    ERIC Educational Resources Information Center

    Corry, Michael; Ianacone, Robert; Stella, Julie

    2014-01-01

    The purpose of this study was to examine brick-and-mortar and online teacher best practice themes using thematic analysis and a newly developed theory-based analytic process entitled Synthesized Thematic Analysis Criteria (STAC). The STAC was developed to facilitate the meaningful thematic analysis of research based best practices of K-12…

  13. A Practical Blended Analysis for Dynamic Features in JavaScript

    E-print Network

    Ryder, Barbara G.

A Practical Blended Analysis for Dynamic Features in JavaScript. Shiyi Wei and Barbara G. Ryder, Department of Computer Science, Virginia Tech. Abstract: The JavaScript Blended Analysis Framework is designed to perform a general-purpose, practical combined static/dynamic analysis of JavaScript

  14. Microparticle analysis system and method

    NASA Technical Reports Server (NTRS)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  15. Aircraft accidents : method of analysis

    NASA Technical Reports Server (NTRS)

    1931-01-01

The revised report includes the chart for the analysis of aircraft accidents, combining consideration of the immediate causes, underlying causes, and results of accidents, as prepared by the special committee, with a number of the definitions clarified. It also includes a brief statement of the organization and work of the special committee and of the Committee on Aircraft Accidents, and statistical tables giving a comparison of the types of accidents and causes of accidents in the military services on the one hand and in civil aviation on the other, together with explanations of some of the important differences noted in these tables.

  16. Summarization Evaluation Methods: Experiments and Analysis

    E-print Network

    Elhadad, Michael

Summarization Evaluation Methods: Experiments and Analysis. Hongyan Jing, Dept. of Computer Science. Abstract: Two methods are used for the evaluation of summarization systems: an evaluation of the generated summaries themselves, and a task-based evaluation, such as information retrieval. We carried out two large experiments to study the two evaluation methods. Our

  17. Validation of analytical methods in compliance with good manufacturing practice: a practical approach

    PubMed Central

    2013-01-01

Background: The quality and safety of cell therapy products must be maintained throughout their production and quality control cycle, ensuring their final use in the patient. We validated the Limulus Amebocyte Lysate (LAL) test and the immunophenotype assay according to the International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, considering accuracy, precision, repeatability, linearity and range. Methods: For the endotoxin test we used a kinetic chromogenic LAL test. As this is a limit test for the control of impurities, in compliance with the International Conference on Harmonization Q2 Guidelines and the EU Pharmacopoeia, we evaluated the specificity and detection limit. For the immunophenotype test, an identity test, we evaluated specificity through the Fluorescence Minus One method, and we repeated all experiments three times to verify precision. The immunophenotype validation required a performance qualification of the flow cytometer using two types of standard beads, which have to be used daily to check that the cytometer is set up reproducibly. The results were then compared. Collected data were statistically analyzed by calculating the mean, standard deviation and percentage coefficient of variation (CV%). Results: The LAL test is repeatable and specific. The spike recovery value of each sample was between 0.25 EU/ml and 1 EU/ml with a CV% < 10%. The correlation coefficient (≥ 0.980) and CV% (< 10%) of the standard curve tested in duplicate showed the test's linearity and a minimum detectable concentration value of 0.005 EU/ml. The immunophenotype method, performed three times on our cell therapy products, is specific and repeatable, as shown by an inter-experiment CV% < 10%. Conclusions: Our data demonstrated that validated analytical procedures are suitable as quality controls for the batch release of cell therapy products.
Our paper could offer an important contribution for the scientific community in the field of CTPs, above all to small Cell Factories such as ours, where it is not always possible to have CFR21 compliant software. PMID:23981284
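The repeatability statistic reported here, CV%, is the usual coefficient of variation, 100 · SD / mean; a minimal sketch with hypothetical replicate readings (only the 10% acceptance threshold comes from the abstract):

```python
import statistics

def cv_percent(values):
    """Coefficient of variation as a percentage: 100 * SD / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical triplicate measurements of one sample (EU/ml):
replicates = [0.48, 0.50, 0.52]
print(round(cv_percent(replicates), 1))  # → 4.0, i.e. within the CV% < 10 criterion
```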

  18. A mixed methods study of food safety knowledge, practices and beliefs in Hispanic families with young children.

    PubMed

    Stenger, Kristen M; Ritter-Gooder, Paula K; Perry, Christina; Albrecht, Julie A

    2014-12-01

Children are at a higher risk for foodborne illness. The objective of this study was to explore food safety knowledge, beliefs and practices among Hispanic families with young children (≤10 years of age) living within a Midwestern state. A convergent mixed methods design collected qualitative and quantitative data in parallel. Food safety knowledge surveys were administered (n = 90) prior to exploration of beliefs and practices among six focus groups (n = 52) conducted by bilingual interpreters in community sites in five cities/towns. Descriptive statistics determined knowledge scores and thematic coding unveiled beliefs and practices. Data sets were merged to assess concordance. Participants were female (96%), 35.7 (±7.6) years of age, from Mexico (69%), with the majority having a low education level. Food safety knowledge was low (56% ± 11). Focus group themes were: Ethnic dishes popular, Relating food to illness, Fresh food in home country, Food safety practices, and Face-to-face learning. Mixed method analysis revealed high self-confidence in preparing food safely despite low safe food handling knowledge, along with the presence of some cultural beliefs. On-site Spanish classes and materials were the preferred venues for food safety education. Bilingual food safety messaging targeting common ethnic foods and cultural beliefs and practices is indicated to lower the risk of foodborne illness in Hispanic families with young children. PMID:25178898

  19. Comparison of three evidence-based practice learning assessment methods in dental curricula.

    PubMed

    Al-Ansari, Asim A; El Tantawi, Maha M A

    2015-02-01

    Incorporating evidence-based practice (EBP) training in dental curricula is now an accreditation requirement for dental schools, but questions remain about the most effective ways to assess learning outcomes. The purpose of this study was to evaluate and compare three assessment methods for EBP training and to assess their relation to students' overall course grades. Participants in the study were dental students from two classes who received training in appraising randomized controlled trials (RCTs) and systematic reviews in 2013 at the University of Dammam, Saudi Arabia. Repeated measures analysis of variance was used to compare students' scores on appraisal assignments, scores on multiple-choice question (MCQ) exams in which EBP concepts were applied to clinical scenarios, and scores for self-reported efficacy in appraisal. Regression analysis was used to assess the relationship among the three assessment methods, gender, program level, and overall grade. The instructors had acceptable reliability in scoring the assignments (overall intraclass correlation coefficient=0.60). The MCQ exams had acceptable discrimination indices although their reliability was less satisfactory (Cronbach's alpha=0.46). Statistically significant differences were observed among the three methods with MCQ exams having the lowest overall scores. Variation in the overall course grades was explained by scores on the appraisal assignment and MCQ exams (partial eta-squared=0.52 and 0.24, respectively), whereas score on the self-efficacy questionnaire was not significantly associated with overall grade. The results suggest that self-reported efficacy is not a valid method to assess dental students' RCT appraisal skills, whereas instructor-graded appraisal assignments explained a greater portion of variation in grade and had inherent validity and acceptable consistency and MCQ exams had good construct validity but low internal consistency. PMID:25640619

  20. Methods for Analysis of Outdoor Performance Data (Presentation)

    SciTech Connect

    Jordan, D.

    2011-02-01

The ability to accurately predict power delivery over the course of time is of vital importance to the growth of the photovoltaic (PV) industry. Two key cost drivers are the efficiency with which sunlight is converted to power and how this relationship develops over time. Accurate knowledge of power decline over time, also known as the degradation rate, is essential to all stakeholders--utility companies, integrators, investors, and scientists alike. Different methods to determine degradation rates, and discrete versus continuous data, are presented, and some general best practice methods are outlined. In addition, historical degradation rates and some preliminary analysis with respect to climate are given.

  1. Practical methods for dealing with 'not applicable' item responses in the AMC Linear Disability Score project

    PubMed Central

    Holman, Rebecca; Glas, Cees AW; Lindeboom, Robert; Zwinderman, Aeilko H; de Haan, Rob J

    2004-01-01

    Background Whenever questionnaires are used to collect data on constructs, such as functional status or health related quality of life, it is unlikely that all respondents will respond to all items. This paper examines ways of dealing with responses in a 'not applicable' category to items included in the AMC Linear Disability Score (ALDS) project item bank. Methods The data examined in this paper come from the responses of 392 respondents to 32 items and form part of the calibration sample for the ALDS item bank. The data are analysed using the one-parameter logistic item response theory model. The four practical strategies for dealing with this type of response are: cold deck imputation; hot deck imputation; treating the missing responses as if these items had never been offered to those individual patients; and using a model which takes account of the 'tendency to respond to items'. Results The item and respondent population parameter estimates were very similar for the strategies involving hot deck imputation; treating the missing responses as if these items had never been offered to those individual patients; and using a model which takes account of the 'tendency to respond to items'. The estimates obtained using the cold deck imputation method were substantially different. Conclusions The cold deck imputation method was not considered suitable for use in the ALDS item bank. The other three methods described can be usefully implemented in the ALDS item bank, depending on the purpose of the data analysis to be carried out. These three methods may be useful for other data sets examining similar constructs, when item response theory based methods are used. PMID:15200681
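The hot deck strategy described above can be sketched in a few lines. This is a minimal illustration, not the ALDS project's actual procedure: it assumes a respondent-by-item matrix with NaN marking 'not applicable' responses, and picks as donor the complete respondent whose mean observed response is closest to that of the respondent with missing items.

```python
import numpy as np

def hot_deck_impute(responses):
    """Fill missing (NaN) item responses by copying them from a 'donor':
    the complete respondent whose mean observed response is closest.

    responses: 2-D float array (respondents x items), NaN = 'not applicable'.
    Returns a filled copy; rows without missing values are unchanged.
    """
    filled = responses.copy()
    observed_mean = np.nanmean(responses, axis=1)   # mean of answered items per respondent
    complete = np.flatnonzero(~np.isnan(responses).any(axis=1))
    for i in np.flatnonzero(np.isnan(responses).any(axis=1)):
        # donor: complete respondent with the closest mean observed response
        donor = complete[np.argmin(np.abs(observed_mean[complete] - observed_mean[i]))]
        missing = np.isnan(responses[i])
        filled[i, missing] = responses[donor, missing]
    return filled
```

Unlike cold deck imputation, which substitutes a fixed value regardless of the respondent, this keeps the imputed responses tied to an observed response pattern, which is consistent with the paper's finding that hot deck imputation behaved like the model-based strategies.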

  2. Practical Methods for Locating Abandoned Wells in Populated Areas

    SciTech Connect

    Veloski, G.A.; Hammack, R.W.; Lynn, R.J.

    2007-09-01

    An estimated 12 million wells have been drilled during the 150 years of oil and gas production in the United States. Many old oil and gas fields are now populated areas where the presence of improperly plugged wells may constitute a hazard to residents. Natural gas emissions from wells have forced people from their houses and businesses and have caused explosions that injured or killed people and destroyed property. To mitigate this hazard, wells must be located and properly plugged, a task made more difficult by the presence of houses, businesses, and associated utilities. This paper describes well finding methods conducted by the National Energy Technology Laboratory (NETL) that were effective at two small towns in Wyoming and in a suburb of Pittsburgh, Pennsylvania.

  3. Method of analysis and quality-assurance practices by the U.S. Geological Survey Organic Geochemistry Research Group; determination of geosmin and methylisoborneol in water using solid-phase microextraction and gas chromatography/mass spectrometry

    USGS Publications Warehouse

    Zimmerman, L.R.; Ziegler, A.C.; Thurman, E.M.

    2002-01-01

    A method for the determination of two common odor-causing compounds in water, geosmin and 2-methylisoborneol, was modified and verified by the U.S. Geological Survey's Organic Geochemistry Research Group in Lawrence, Kansas. The optimized method involves the extraction of odor-causing compounds from filtered water samples using a divinylbenzene-carboxen-polydimethylsiloxane cross-link coated solid-phase microextraction (SPME) fiber. Detection of the compounds is accomplished using capillary-column gas chromatography/mass spectrometry (GC/MS). Precision and accuracy were demonstrated using reagent-water, surface-water, and ground-water samples. The mean accuracies as percentages of the true compound concentrations from water samples spiked at 10 and 35 nanograms per liter ranged from 60 to 123 percent for geosmin and from 90 to 96 percent for 2-methylisoborneol. Method detection limits were 1.9 nanograms per liter for geosmin and 2.0 nanograms per liter for 2-methylisoborneol in 45-milliliter samples. Typically, concentrations of 30 and 10 nanograms per liter of geosmin and 2-methylisoborneol, respectively, can be detected by the general public. The calibration range for the method is equivalent to concentrations from 5 to 100 nanograms per liter without dilution. The method is valuable for acquiring information about the production and fate of these odor-causing compounds in water.

  4. Methods for diagnosis of bile acid malabsorption in clinical practice.

    PubMed

    Vijayvargiya, Priya; Camilleri, Michael; Shin, Andrea; Saenger, Amy

    2013-10-01

Altered concentrations of bile acid (BA) in the colon can cause diarrhea or constipation. More than 25% of patients with irritable bowel syndrome with diarrhea or chronic diarrhea in Western countries have BA malabsorption (BAM). As BAM is increasingly recognized, proper diagnostic methods are needed to help direct the most effective course of treatment for the chronic bowel dysfunction. We review the methodologies, advantages, and disadvantages of tools that directly measure BAM: the ¹⁴C-glycocholate breath and stool test, the ⁷⁵selenium homotaurocholic acid test (SeHCAT), and measurements of 7α-hydroxy-4-cholesten-3-one (C4) and fecal BAs. The ¹⁴C-glycocholate test is laborious and no longer widely used. The ⁷⁵SeHCAT has been validated but is not available in the United States. Measurement of serum C4 is a simple and accurate method that can be used for most patients but requires further clinical validation. Assays to quantify fecal BA (total and individual levels) are technically cumbersome and not widely available. Regrettably, none of these tests are routinely available in the United States; assessment of the therapeutic effects of a BA binder is used as a surrogate for diagnosis of BAM. Recent data indicate the advantages to studying fecal excretion of individual BAs and their role in BAM; these could support the use of the fecal BA assay, compared with other tests. Measurement of fecal BA levels could become a routine addition to the measurement of fecal fat in patients with unexplained diarrhea. Availability ultimately determines whether the C4, SeHCAT, or fecal BA test is used; more widespread availability of such tests would enhance clinical management of these patients. PMID:23644387

  5. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    NASA Technical Reports Server (NTRS)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  6. Vibration analysis methods for piping

    NASA Astrophysics Data System (ADS)

    Gibert, R. J.

    1981-09-01

    Attention is given to flow vibrations in pipe flow induced by singularity points in the piping system. The types of pressure fluctuations induced by flow singularities are examined, including the intense wideband fluctuations immediately downstream of the singularity and the acoustic fluctuations encountered in the remainder of the circuit, and a theory of noise generation by unsteady flow in internal acoustics is developed. The response of the piping systems to the pressure fluctuations thus generated is considered, and the calculation of the modal characteristics of piping containing a dense fluid in order to obtain the system transfer function is discussed. The TEDEL program, which calculates the vibratory response of a structure composed of straight and curved pipes with variable mechanical characteristics forming a three-dimensional network by a finite element method, is then presented, and calculations of fluid-structural coupling in tubular networks are illustrated.

  7. Statistical and methodological issues in the analysis of complex sample survey data: Practical guidance for trauma researchers

    Microsoft Academic Search

    Brady T. West

    2008-01-01

    Standard methods for the analysis of survey data assume that the data arise from a simple random sample of the target population. In practice, analysts of survey data sets collected from nationally representative probability samples often pay little attention to important properties of the survey data. Standard statistical software procedures do not allow analysts to take these properties of survey

  8. Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture

    DOEpatents

    Sanfilippo, Antonio P. (Richland, WA); Cowell, Andrew J. (Kennewick, WA); Gregory, Michelle L. (Richland, WA); Baddeley, Robert L. (Richland, WA); Paulson, Patrick R. (Pasco, WA); Tratz, Stephen C. (Richland, WA); Hohimer, Ryan E. (West Richland, WA)

    2012-03-20

    Hypothesis analysis methods, hypothesis analysis devices, and articles of manufacture are described according to some aspects. In one aspect, a hypothesis analysis method includes providing a hypothesis, providing an indicator which at least one of supports and refutes the hypothesis, using the indicator, associating evidence with the hypothesis, weighting the association of the evidence with the hypothesis, and using the weighting, providing information regarding the accuracy of the hypothesis.

  9. Low hardness organisms: Culture methods, sensitivities, and practical applications

    SciTech Connect

    DaCruz, A.; DaCruz, N.; Bird, M.

    1995-12-31

EPA Regulations require biomonitoring of permitted effluent and stormwater runoff. Several permit locations were studied in Virginia that have supply water and/or stormwater runoff ranging in hardness from 5-30 mg/L. Ceriodaphnia dubia (dubia) and Pimephales promelas (fathead minnow) were tested in reconstituted water with hardnesses from 5-30 mg/L. Results indicated osmotic stresses present in the acute tests with the fathead minnow as well as chronic tests for the dubia and the fathead minnow. Culture methods were developed for both organism types in soft (30 mg/L) reconstituted freshwater. Reproduction and development for each organism type meets or exceeds EPA testing requirements for moderately hard organisms. Sensitivities were measured over an 18-month interval using cadmium chloride as a reference toxicant. Additionally, sensitivities were charted in contrast with those of organisms cultured in moderately hard water. The comparison showed that the sensitivities of both the dubia and the fathead minnow cultured in 30 mg/L water increased, but were within two standard deviations of the sensitivities of those cultured in moderately hard water. Latitude for use of organisms cultured at 30 mg/L was documented for waters ranging in hardness from 10-100 mg/L with no acclimation period required. The stability of the organism sensitivity was also validated. The application was most helpful in stormwater runoff and in effluents where the hardness was 30 mg/L or less.

  10. Qualitative Analysis of Common Definitions for Core Advanced Pharmacy Practice Experiences

    PubMed Central

    Danielson, Jennifer; Weber, Stanley S.

    2014-01-01

    Objective. To determine how colleges and schools of pharmacy interpreted the Accreditation Council for Pharmacy Education’s (ACPE’s) Standards 2007 definitions for core advanced pharmacy practice experiences (APPEs), and how they differentiated community and institutional practice activities for introductory pharmacy practice experiences (IPPEs) and APPEs. Methods. A cross-sectional, qualitative, thematic analysis was done of survey data obtained from experiential education directors in US colleges and schools of pharmacy. Open-ended responses to invited descriptions of the 4 core APPEs were analyzed using grounded theory to determine common themes. Type of college or school of pharmacy (private vs public) and size of program were compared. Results. Seventy-one schools (72%) with active APPE programs at the time of the survey responded. Lack of strong frequent themes describing specific activities for the acute care/general medicine core APPE indicated that most respondents agreed on the setting (hospital or inpatient) but the student experience remained highly variable. Themes were relatively consistent between public and private institutions, but there were differences across programs of varying size. Conclusion. Inconsistencies existed in how colleges and schools of pharmacy defined the core APPEs as required by ACPE. More specific descriptions of core APPEs would help to standardize the core practice experiences across institutions and provide an opportunity for quality benchmarking. PMID:24954931

  11. Measuring solar reflectance Part II: Review of practical methods

    SciTech Connect

    Levinson, Ronnen; Akbari, Hashem; Berdahl, Paul

    2010-05-14

A companion article explored how solar reflectance varies with surface orientation and solar position, and found that clear sky air mass 1 global horizontal (AM1GH) solar reflectance is a preferred quantity for estimating solar heat gain. In this study we show that AM1GH solar reflectance R_g,0 can be accurately measured with a pyranometer, a solar spectrophotometer, or an updated edition of the Solar Spectrum Reflectometer (version 6). Of primary concern are errors that result from variations in the spectral and angular distributions of incident sunlight. Neglecting shadow, background and instrument errors, the conventional pyranometer technique can measure R_g,0 to within 0.01 for surface slopes up to 5:12 [23°], and to within 0.02 for surface slopes up to 12:12 [45°]. An alternative pyranometer method minimizes shadow errors and can be used to measure R_g,0 of a surface as small as 1 m in diameter. The accuracy with which it can measure R_g,0 is otherwise comparable to that of the conventional pyranometer technique. A solar spectrophotometer can be used to determine R*_g,0, a solar reflectance computed by averaging solar spectral reflectance weighted with AM1GH solar spectral irradiance. Neglecting instrument errors, R*_g,0 matches R_g,0 to within 0.006. The air mass 1.5 solar reflectance measured with version 5 of the Solar Spectrum Reflectometer can differ from R*_g,0 by as much as 0.08, but the AM1GH output of version 6 of this instrument matches R*_g,0 to within about 0.01.
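The spectrophotometer-based quantity R*_g,0 described above reduces to an irradiance-weighted average of spectral reflectance. A minimal sketch follows; it assumes measured reflectance and irradiance spectra on a common wavelength grid (the AM1GH irradiance values themselves would come from a standard spectral table, which is not reproduced here):

```python
import numpy as np

def weighted_solar_reflectance(wavelength_nm, reflectance, irradiance):
    """Solar reflectance as an irradiance-weighted average:
    R* = integral(r(lambda) * E(lambda)) / integral(E(lambda)),
    evaluated with the trapezoidal rule on the given wavelength grid."""
    def trapz(y, x):
        # trapezoidal integration, valid for nonuniform wavelength spacing
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))
    return trapz(reflectance * irradiance, wavelength_nm) / trapz(irradiance, wavelength_nm)
```

Because the weighting spectrum appears in both numerator and denominator, a spectrally flat surface returns its (constant) reflectance regardless of the irradiance distribution, which is a convenient sanity check for an implementation.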

  12. TECHNICAL DESIGN NOTE: Practical application of an analytical method for calculating a coverage interval

    NASA Astrophysics Data System (ADS)

Fotowicz, Paweł

    2010-08-01

    This article presents a practical application of an analytical method for the calculation of the measurement uncertainty. The proposed method enables the determination of uncertainty in accordance with the new probabilistic definition of the coverage interval for a measurand. The proposed method ensures that the expanded uncertainty is calculated with the recommended number of significant digits at the recommended coverage probability. The method was used for the uncertainty evaluation of measurement of small outer diameters with a laser scanning instrument.

  13. Methods to enhance compost practices as an alternative to waste disposal

    SciTech Connect

    Stuckey, H.T.; Hudak, P.F.

    1998-12-31

    Creating practices that are ecologically friendly, economically profitable, and ethically sound is a concept that is slowly beginning to unfold in modern society. In developing such practices, the authors challenge long-lived human behavior patterns and environmental management practices. In this paper, they trace the history of human waste production, describe problems associated with such waste, and explore regional coping mechanisms. Composting projects in north central Texas demonstrate new methods for waste disposal. The authors studied projects conducted by municipalities, schools, agricultural organizations, and individual households. These efforts were examined within the context of regional and statewide solid waste plans. They conclude that: (1) regional composting in north central Texas will substantially reduce the waste stream entering landfills; (2) public education is paramount to establishing alternative waste disposal practices; and (3) new practices for compost will catalyze widespread and efficient production.

  14. Regulating forest practices in Texas: a problem analysis

    E-print Network

    Dreesen, Alan D

    1977-01-01

    information. The objectives of the study were: 1) to ascertain public sentiment regarding the need for public control of private forestry practices, 2) to determine opinions regarding the impacts of forest practices on resource values in Texas, 3... Forest Service was the agency that should be charged with its administration. The most formidable obstacle to achieving a successful Texas forest practice law was considered to be resistance by private landowners. Recommendations for additional...

  15. Manufacturing excellence through TPM implementation: a practical analysis

    Microsoft Academic Search

    Rajiv Kumar Sharma; Dinesh Kumar; Pradeep Kumar

    2006-01-01

Purpose – To examine the need to develop, practice and implement such maintenance practices, which not only reduce sudden sporadic failures in semi-automated cells but also reduce both operation and maintenance (O&M) costs. Design/methodology/approach – A case-based approach in conjunction with standard tools, techniques and practices is used to discuss various issues related with TPM implementation in a semi-automated cell.

  16. A practical method of estimating standard error of age in the fission track dating method

    USGS Publications Warehouse

    Johnson, N.M.; McGee, V.E.; Naeser, C.W.

    1979-01-01

A first-order approximation formula for the propagation of error in the fission track age equation is given by PA = C[Ps² + Pi² + Pφ² − 2rPsPi]^(1/2), where PA, Ps, Pi, and Pφ are the percentage errors of age, of spontaneous track density, of induced track density, and of neutron dose, respectively, and C is a constant. The correlation, r, between spontaneous and induced track densities is a crucial element in the error analysis, acting generally to improve the standard error of age. In addition, the correlation parameter r is instrumental in specifying the level of neutron dose, a controlled variable, which will minimize the standard error of age. The results from the approximation equation agree closely with the results from an independent statistical model for the propagation of errors in the fission-track dating method. © 1979.
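The propagation formula above is simple enough to evaluate directly. A minimal sketch, with the constant C and the percentage errors as placeholder inputs rather than values from the paper:

```python
import math

def percent_error_of_age(Ps, Pi, Pphi, r, C=1.0):
    """First-order propagated percentage error of a fission-track age:
    PA = C * sqrt(Ps^2 + Pi^2 + Pphi^2 - 2*r*Ps*Pi)

    Ps, Pi, Pphi: percentage errors of spontaneous track density,
    induced track density, and neutron dose; r: correlation between
    spontaneous and induced track densities; C: a constant.
    """
    return C * math.sqrt(Ps**2 + Pi**2 + Pphi**2 - 2.0 * r * Ps * Pi)
```

Note how the cross term behaves: a positive correlation r between the spontaneous and induced track-density errors subtracts from the sum of squares and so reduces PA, which is the "acting generally to improve the standard error of age" effect described in the abstract.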

  17. Airbreathing hypersonic vehicle design and analysis methods

    NASA Technical Reports Server (NTRS)

    Lockwood, Mary Kae; Petley, Dennis H.; Hunt, James L.; Martin, John G.

    1996-01-01

    The design, analysis, and optimization of airbreathing hypersonic vehicles requires analyses involving many highly coupled disciplines at levels of accuracy exceeding those traditionally considered in a conceptual or preliminary-level design. Discipline analysis methods including propulsion, structures, thermal management, geometry, aerodynamics, performance, synthesis, sizing, closure, and cost are discussed. Also, the on-going integration of these methods into a working environment, known as HOLIST, is described.

  18. Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home

    PubMed Central

    Jaén, Carlos Roberto; Crabtree, Benjamin F.; Palmer, Raymond F.; Ferrer, Robert L.; Nutting, Paul A.; Miller, William L.; Stewart, Elizabeth E.; Wood, Robert; Davila, Marivel; Stange, Kurt C.

    2010-01-01

    PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country’s first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration Project (NDP) was a group-randomized clinical trial of facilitated and self-directed implementation strategies for the PCMH. An independent evaluation team developed an integrated package of quantitative and qualitative methods to evaluate the process and outcomes of the NDP for practices and patients. Data were collected by an ethnographic analyst and a research nurse who visited each practice, and from multiple data sources including a medical record audit, patient and staff surveys, direct observation, interviews, and text review. Analyses aimed to provide real-time feedback to the NDP implementation team and lessons that would be transferable to the larger practice, policy, education, and research communities. RESULTS Real-time analyses and feedback appeared to be helpful to the facilitators. Medical record audits provided data on process-of-care outcomes. Patient surveys contributed important information about patient-rated primary care attributes and patient-centered outcomes. Clinician and staff surveys provided important practice experience and organizational data. Ethnographic observations supplied insights about the process of practice development. Most practices were not able to provide detailed financial information. CONCLUSIONS A multimethod approach is challenging, but feasible and vital to understanding the process and outcome of a practice development process. Additional longitudinal follow-up of NDP practices and their patients is needed. PMID:20530398

  19. Methods for Evaluating Practice Change Toward a Patient-Centered Medical Home

    Microsoft Academic Search

    Carlos Roberto Jaén; Benjamin F. Crabtree; Raymond F. Palmer; Robert L. Ferrer; Paul A. Nutting; William L. Miller; Elizabeth E. Stewart; Robert Wood; Marivel Davila; Kurt C. Stange

    2010-01-01

PURPOSE Understanding the transformation of primary care practices to patient-centered medical homes (PCMHs) requires making sense of the change process, multilevel outcomes, and context. We describe the methods used to evaluate the country's first national demonstration project of the PCMH concept, with an emphasis on the quantitative measures and lessons for multimethod evaluation approaches. METHODS The National Demonstration

  20. Scholarship and practice: the contribution of ethnographic research methods to bridging the gap

    Microsoft Academic Search

    Lynda J. Harvey; Michael D. Myers

    1995-01-01

Information systems research methods need to contribute to the scholarly requirements of the field of knowledge but also need to develop the potential to contribute to the practical requirements of practitioners' knowledge. This leads to possible conflicts in choosing research methods. Argues that the changing world of the IS practitioner is reflected in the changing world of the IS researcher

  1. Best Practices in Teaching Statistics and Research Methods in the Behavioral Sciences [with CD-ROM

    ERIC Educational Resources Information Center

    Dunn, Dana S., Ed.; Smith, Randolph A., Ed.; Beins, Barney, Ed.

    2007-01-01

    This book provides a showcase for "best practices" in teaching statistics and research methods in two- and four-year colleges and universities. A helpful resource for teaching introductory, intermediate, and advanced statistics and/or methods, the book features coverage of: (1) ways to integrate these courses; (2) how to promote ethical conduct;…

2. CONCURRENCY: PRACTICE & EXPERIENCE, 4(4), 269-291 (JUNE 1992) PARALLELIZING THE SPECTRAL TRANSFORM METHOD

    E-print Network

CONCURRENCY: PRACTICE & EXPERIENCE, 4(4), 269-291 (JUNE 1992) PARALLELIZING THE SPECTRAL TRANSFORM METHOD PATRICK H. WORLEY AND JOHN B. DRAKE Abstract. The spectral transform method at the National Center for Atmospheric Research. This paper describes initial experiences in parallelizing

  3. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ...2011-04-01 2011-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  4. 19 CFR 12.39 - Imported articles involving unfair methods of competition or practices.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ...2010-04-01 2010-04-01 false Imported articles involving unfair methods of competition...Unfair Competition § 12.39 Imported articles involving unfair methods of competition...practices in the importation or sale of articles, the effect or tendency of...

  5. Evaluation of texture methods for image analysis

    Microsoft Academic Search

    Mona Sharma; Sameer Singh

    2001-01-01

    The evaluation of texture features is important for several image processing applications. Texture analysis forms the basis of object recognition and classification in several domains. There is a range of texture extraction methods and their performance evaluation is an important part of understanding the utility of feature extraction tools in image analysis. In this paper we evaluate five different feature

  6. Data Analysis Method for Evaluating Dialogic Learning.

    ERIC Educational Resources Information Center

    Sarja, Anneli; Janhonen, Sirpa

    2000-01-01

    Describes a method for analyzing learning that takes place through dialogue, which requires voice analysis and attention to the connection between speaker and speech and between speech parts and whole. Presents three stages of analysis: limiting the object of discussion, analyzing utterances, and identifying individual self-realization. (SK)

  7. Two MIS Analysis Methods: An Experimental Comparison.

    ERIC Educational Resources Information Center

    Wang, Shouhong

    1996-01-01

    In China, 24 undergraduate business students applied data flow diagrams (DFD) to a mini-case, and 20 used object-oriented analysis (OOA). DFD seemed easier to learn, but after training, those using the OOA method for systems analysis made fewer errors. (SK)

  8. Multicultural Issues in School Psychology Practice: A Critical Analysis

    ERIC Educational Resources Information Center

    Ortiz, Samuel O.

    2006-01-01

    Once thought of largely as a sideline issue, multiculturalism is fast becoming a major topic on the central stage of psychology and practice. That cultural factors permeate the whole of psychological foundations and influence the manner in which the very scope of practice is shaped is undeniable. The rapidly changing face of the U.S. population…

  9. Optical methods for the analysis of dermatopharmacokinetics

    Microsoft Academic Search

    Juergen Lademann; Hans-Juergen Weigmann; R. von Pelchrzim; Wolfram Sterry

    2002-01-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used for the characterization of the amount of corneocytes on the tape strips. It was compared to the increase of weight of the tapes after removing

  10. Relating Actor Analysis Methods to Policy Problems

    Microsoft Academic Search

    T. E. Van der Lei

    2009-01-01

    For a policy analyst the policy problem is the starting point for the policy analysis process. During this process the policy analyst structures the policy problem and makes a choice for an appropriate set of methods or techniques to analyze the problem (Goeller 1984). The methods of the policy analyst are often referred to as the toolbox or toolkit of

  11. Multiple predictor smoothing methods for sensitivity analysis

    Microsoft Academic Search

    Curtis B. Storlie; Jon C. Helton

    2005-01-01

The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models (GAMs), (iii) projection pursuit regression (PP_REG), and (iv) recursive partitioning regression

  12. Multiple predictor smoothing methods for sensitivity analysis

    Microsoft Academic Search

    Jon Craig Helton; Curtis B. Storlie

    2006-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both

  13. Special education funding in Indiana's public schools: A financial analysis of current practices

    Microsoft Academic Search

    Robby B Goodman

    2009-01-01

This quantitative and qualitative study examines the current funding practices for special education programs in the state of Indiana. The analysis includes interviews with current practitioners (superintendents and cooperative directors) from the state of Indiana to gather their perceptions of the current funding practices for special education. A regression analysis was performed to determine if a

  14. Common Goals for the Science and Practice of Behavior Analysis: A Response to Critchfield

    ERIC Educational Resources Information Center

    Schneider, Susan M.

    2012-01-01

    In his scholarly and thoughtful article, "Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis," Critchfield (2011) discussed the science-practice frictions to be expected in any professional organization that attempts to combine these interests. He suggested that the Association for Behavior Analysis

  15. Human Reliability Analysis for Design: Using Reliability Methods for Human Factors Issues

    SciTech Connect

    Ronald Laurids Boring

    2010-11-01

This paper reviews the application of human reliability analysis methods to human factors design issues. An application framework is sketched in which aspects of modeling typically found in human reliability analysis are used in a complementary fashion to the existing human factors phases of design and testing. The paper provides best achievable practices for design, testing, and modeling. Such best achievable practices may be used to evaluate a human-system interface in the context of design safety certifications.

  16. Analysis of nonstandard and home-made explosives and post-blast residues in forensic practice

    NASA Astrophysics Data System (ADS)

    Kotrlý, Marek; Turková, Ivana

    2014-05-01

Nonstandard and home-made explosives may constitute a considerable threat, as well as a potential material for terrorist activities. Mobile analytical devices, particularly Raman and also FTIR spectrometers, are used for the initial detection. Various sorts of phlegmatizers (moderants) for decreasing the sensitivity of explosives were tested; some kinds of low-viscosity lubricants yielded very good results. If the character of the substance allows it, phlegmatized samples are taken in an amount of approx. 0.3 g for laboratory analysis. Various separation methods and methods of concentrating samples from post-blast scenes were tested. A wide range of methods is used for the laboratory analysis. XRD techniques, capable of direct phase identification of crystalline substances, namely in mixtures, have proved highly effective in practice for inorganic and organic phases. SEM-EDS/WDS methods are standardly employed for the inorganic phase. In analysing post-blast residues, techniques that allow analysis at the level of separate particles, rather than the overall composition of a mixed sample, are very important.

  17. Degradation of learned skills: Effectiveness of practice methods on visual approach and landing skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Zaitzeff, L. P.; Berge, W. A.

    1972-01-01

    Flight control and procedural task skill degradation, and the effectiveness of retraining methods were evaluated for a simulated space vehicle approach and landing under instrument and visual flight conditions. Fifteen experienced pilots were trained and then tested after 4 months either without the benefits of practice or with static rehearsal, dynamic rehearsal or with dynamic warmup practice. Performance on both the flight control and procedure tasks degraded significantly after 4 months. The rehearsal methods effectively countered procedure task skill degradation, while dynamic rehearsal or a combination of static rehearsal and dynamic warmup practice was required for the flight control tasks. The quality of the retraining methods appeared to be primarily dependent on the efficiency of visual cue reinforcement.

  18. Practical methods of computer-aided flat pattern development for sheet metal components

    Microsoft Academic Search

    K. S. R. K. Prasad; P. Selvaraj

    2004-01-01

This paper provides step-by-step illustrations of the practical methods of computer-aided flat pattern development (CAFPAD) for bend-formable sheet metal component (SMC) manufacture. Based on these methods, software systems are exclusively designed and developed for fully automated loft generation. The shape and complexity of the SMC geometry influence the choice among these methods, comprising contour augmentation, unfold, cross-section development

  19. Comparative analysis of the spatial analysis methods for hotspot identification.

    PubMed

    Yu, Hao; Liu, Pan; Chen, Jun; Wang, Hao

    2014-05-01

Spatial analysis techniques have been introduced as an innovative approach for hazardous road segments identification (HRSI). In this study, the performance of two spatial analysis methods and four conventional methods for HRSI was compared against three quantitative evaluation criteria. The spatial analysis methods considered in this study include the local spatial autocorrelation method and the kernel density estimation (KDE) method. It was found that the empirical Bayesian (EB) method and the KDE method outperformed other HRSI approaches. By transferring the kernel density function into a form that was analogous to the form of the EB function, we further proved that the KDE method can eventually be considered a simplified version of the EB method in which crashes reported at neighboring spatial units are used as the reference population for estimating the EB-adjusted crashes. Theoretically, the KDE method may outperform the EB method in HRSI when the neighboring spatial units provide more useful information on the expected crash frequency than a safety performance function does. PMID:24530515
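The KDE side of the comparison above reduces to evaluating a kernel density of crash locations along the route and flagging the density peak. A minimal illustrative sketch (synthetic crash positions and an assumed Gaussian kernel and bandwidth, not the paper's data or parameters):

```python
import numpy as np

def kde_density(crashes, grid, bandwidth=0.5):
    """Gaussian kernel density of crash locations evaluated on a route grid (km)."""
    u = (grid[:, None] - crashes[None, :]) / bandwidth
    return np.exp(-0.5 * u**2).sum(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
# crashes clustered around kilometre 5 of a 10 km route, plus uniform background
crashes = np.concatenate([rng.normal(5.0, 0.3, 40), rng.uniform(0, 10, 20)])
grid = np.linspace(0, 10, 201)
density = kde_density(crashes, grid)
hotspot_km = grid[np.argmax(density)]      # hazardous segment = density peak
print(round(hotspot_km, 1))
```

In the EB reading given by the authors, the kernel-weighted neighbours play the role of the reference population.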

  20. Method and apparatus for ceramic analysis

    DOEpatents

    Jankowiak, Ryszard J. (Ames, IA); Schilling, Chris (Ames, IA); Small, Gerald J. (Ames, IA); Tomasik, Piotr (Cracow, PL)

    2003-04-01

    The present invention relates to a method and apparatus for ceramic analysis, in particular, a method for analyzing density, density gradients and/or microcracks, including an apparatus with optical instrumentation for analysis of density, density gradients and/or microcracks in ceramics. The method provides analyzing density of a ceramic comprising exciting a component on a surface/subsurface of the ceramic by exposing the material to excitation energy. The method may further include the step of obtaining a measurement of an emitted energy from the component. The method may additionally include comparing the measurement of the emitted energy from the component with a predetermined reference measurement so as to obtain a density for said ceramic.

  1. An Overview of Longitudinal Data Analysis Methods for Neurological Research

    PubMed Central

    Locascio, Joseph J.; Atri, Alireza

    2011-01-01

    The purpose of this article is to provide a concise, broad and readily accessible overview of longitudinal data analysis methods, aimed to be a practical guide for clinical investigators in neurology. In general, we advise that older, traditional methods, including (1) simple regression of the dependent variable on a time measure, (2) analyzing a single summary subject level number that indexes changes for each subject and (3) a general linear model approach with a fixed-subject effect, should be reserved for quick, simple or preliminary analyses. We advocate the general use of mixed-random and fixed-effect regression models for analyses of most longitudinal clinical studies. Under restrictive situations or to provide validation, we recommend: (1) repeated-measure analysis of covariance (ANCOVA), (2) ANCOVA for two time points, (3) generalized estimating equations and (4) latent growth curve/structural equation models. PMID:22203825
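Approach (2) from the overview above — collapsing each subject's trajectory to a single summary number indexing change — can be shown in a short numpy sketch. All numbers are synthetic assumptions (30 subjects, 5 visits, a true mean decline of 1 point per visit); this illustrates the traditional summary-measure analysis, not the mixed-model approach the authors recommend for most studies.

```python
import numpy as np

rng = np.random.default_rng(2)
n_subjects, n_visits = 30, 5
t = np.arange(n_visits)                          # visit times
true_slope = -1.0                                # mean decline per visit (assumed)
subj_slopes = true_slope + 0.2 * rng.normal(size=n_subjects)
scores = (70 + 5 * rng.normal(size=(n_subjects, 1))          # random intercepts
          + subj_slopes[:, None] * t
          + rng.normal(size=(n_subjects, n_visits)))         # visit-level noise

# one summary number per subject: OLS slope of score on time
slopes = np.array([np.polyfit(t, row, 1)[0] for row in scores])
mean, se = slopes.mean(), slopes.std(ddof=1) / np.sqrt(n_subjects)
print(round(mean, 2), round(se, 2))              # group-level rate of change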

  2. Application of Stacking Technique in ANA: Method and Practice with PKU Seismological Array

    NASA Astrophysics Data System (ADS)

    Liu, J.; Tang, Y.; Ning, J.; Chen, Y. J.

    2010-12-01

Cross correlation of ambient noise records is now routinely used to obtain dispersion curves and then perform seismic tomography; however, little attention has been paid to array techniques. We present a spatial-stacking method to obtain high-resolution dispersion curves and demonstrate it with observation data from the PKU seismological array. Empirical Green's functions are generally obtained by correlation between two stations, and the dispersion curves are then obtained from frequency-time analysis (FTAN). The popular way to obtain high-resolution dispersion curves is to use long time records. At the same time, to obtain a usable signal, the distance between the two stations must be at least 3 times the longest wavelength, so both long time records and appropriately spaced stations are needed. We use a new method, spatial stacking, which allows a shorter observation period and utilizes observations from a group of closely distributed stations to obtain refined dispersion curves. We correlate the observations of every station in the station group with those of a far station, and then stack them together. However, owing to the dispersion characteristics of Rayleigh waves, we cannot simply stack them unless the stations in the group lie on a circle centered on the far station. Thus we apply an anti-dispersion correction to the observations of every station in the array before stacking. We test the method using theoretical seismic surface-wave records computed with qseis06 (by Rongjiang Wang), both with and without noise. For the case of three imaginary stations (spaced 1 degree apart) with the same underground structure and no noise, the center station had the same dispersion with and without spatial stacking. We then added noise to the theoretical records; the center station's dispersion curves obtained by our method are much closer to the noise-free dispersion curve than the contaminated ones, showing that our method improves the resolution of the dispersion curve. Finally, we test the method on real data from the PKU array, whose station interval is about 10 km, together with permanent IRIS stations far (more than 200 km) from the array. First, we compare the stacked correlation results of three consecutive stations with the unstacked ones, finding that the former yield better dispersion-curve resolution. Second, we compare the stacked results with the center station's traditional one-year correlation, and find that the two fit very well.
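The core operation described above — correlating many noise windows between a station pair and stacking the correlations so the inter-station travel-time peak emerges from the noise — can be sketched schematically in numpy. This is a toy 1-D illustration (no dispersion, no anti-dispersion correction, synthetic lag and noise levels), not the authors' processing chain:

```python
import numpy as np

rng = np.random.default_rng(3)
n, true_lag, windows = 512, 25, 50     # samples per window, assumed travel time

stack = np.zeros(2 * n - 1)
for _ in range(windows):
    source = rng.normal(size=n)                             # common noise field
    sta_a = np.roll(source, true_lag) + rng.normal(size=n)  # delayed + local noise
    sta_b = source + rng.normal(size=n)
    stack += np.correlate(sta_a, sta_b, mode="full")        # one window's correlation

lag = np.argmax(stack) - (n - 1)   # stacked peak position recovers the delay
print(lag)
```

A single window's correlation is dominated by the local noise; the stacked peak stands out clearly, which is the resolution gain the abstract describes.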

  3. Analysis of sourcing & procurement practices : a cross industry framework

    E-print Network

    Koliousis, Ioannis G

    2006-01-01

This thesis presents and analyzes the various practices in the functional area of Sourcing and Procurement. The 21 firms that are studied operate in one of the following industries: Aerospace, Apparel/Footwear, Automotive, ...

  4. Method of analysis and quality-assurance practices for determination of pesticides in water by solid-phase extraction and capillary-column gas chromatography/mass spectrometry at the U.S. Geological Survey California District Organic Chemistry Laboratory, 1996-99

    USGS Publications Warehouse

    Crepeau, Kathryn L.; Baker, Lucian M.; Kuivila, Kathryn M.

    2000-01-01

    A method of analysis and quality-assurance practices were developed to study the fate and transport of pesticides in the San Francisco Bay-Estuary by the U.S. Geological Survey. Water samples were filtered to remove suspended-particulate matter and pumped through C-8 solid-phase extraction cartridges to extract the pesticides. The cartridges were dried with carbon dioxide and the pesticides were eluted with three cartridge volumes of hexane:diethyl ether (1:1) solution. The eluants were analyzed using capillary-column gas chromatography/mass spectrometry in full-scan mode. Method detection limits for pesticides ranged from 0.002 to 0.025 microgram per liter for 1-liter samples. Recoveries ranged from 44 to 140 percent for 25 pesticides in samples of organic-free reagent water and Sacramento-San Joaquin Delta and Suisun Bay water fortified at 0.05 and 0.50 microgram per liter. The estimated holding time for pesticides after extraction on C-8 solid-phase extraction cartridges ranged from 10 to 257 days.

  5. Drosophila hematopoiesis: Markers and methods for molecular genetic analysis.

    PubMed

    Evans, Cory J; Liu, Ting; Banerjee, Utpal

    2014-06-15

    Analyses of the Drosophila hematopoietic system are becoming more and more prevalent as developmental and functional parallels with vertebrate blood cells become more evident. Investigative work on the fly blood system has, out of necessity, led to the identification of new molecular markers for blood cell types and lineages and to the refinement of useful molecular genetic tools and analytical methods. This review briefly describes the Drosophila hematopoietic system at different developmental stages, summarizes the major useful cell markers and tools for each stage, and provides basic protocols for practical analysis of circulating blood cells and of the lymph gland, the larval hematopoietic organ. PMID:24613936

  6. [The method of time series modeling and its application in the spectral analysis of lubricating oil].

    PubMed

    Gan, M; Zuo, H; Yang, Z; Jiang, Y

    2000-02-01

In this paper, we discuss the applications of the time series modeling method in the analysis of lubricating oil of mechanical equipment. We obtained satisfactory results by applying an AR model to perform time series modeling and forecasting analysis on the collected spectral analysis data of the aero engine. We have thus built a practical method for state monitoring and fault forecasting of mechanical equipment. PMID:12953452
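The AR modeling and one-step forecasting idea above can be sketched in a few lines of numpy. The series below is synthetic (an assumed AR(1) process standing in for a wear-metal concentration trend); the fit is ordinary least squares on lagged values, a common AR estimation route but not necessarily the one used in the paper:

```python
import numpy as np

rng = np.random.default_rng(4)
phi, c = 0.8, 1.0                       # assumed AR(1) dynamics of the oil series
x = np.empty(300)
x[0] = c / (1 - phi)                    # start at the stationary mean
for t in range(1, 300):
    x[t] = c + phi * x[t - 1] + 0.1 * rng.normal()

# least-squares AR(1) fit: x[t] ~ c_hat + phi_hat * x[t-1]
A = np.vstack([np.ones(299), x[:-1]]).T
c_hat, phi_hat = np.linalg.lstsq(A, x[1:], rcond=None)[0]
forecast = c_hat + phi_hat * x[-1]      # one-step-ahead state forecast
print(round(phi_hat, 2), round(forecast, 2))
```

The forecast, compared against an alarm threshold, is the "trouble forecasting" step of the monitoring scheme.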

  7. A Comparative Analysis of Ethnomedicinal Practices for Treating Gastrointestinal Disorders Used by Communities Living in Three National Parks (Korea)

    PubMed Central

    Kim, Hyun; Song, Mi-Jang; Brian, Heldenbrand; Choi, Kyoungho

    2014-01-01

The purpose of this study is to comparatively analyze the ethnomedicinal practices for gastrointestinal disorders within communities in Jirisan National Park, Gayasan National Park, and Hallasan National Park of Korea. Data were collected through participant observation and in-depth interviews with semistructured questionnaires. Comparative analysis was accomplished using the informant consensus factor, fidelity level, and internetwork analysis. A total of 490 ethnomedicinal practices recorded from the communities were classified into 110 families, 176 genera, and 220 species that included plants, animals, fungi, and algae. The informant consensus factor values in the disorder categories were highest for enteritis and gastralgia (1.0), followed by indigestion (0.94), constipation (0.93), and abdominal pain and gastroenteric trouble (0.92). In terms of fidelity levels, 71 plant species showed fidelity levels of 100%. In the internetwork analysis between disorders and all medicinal species, the species are grouped in the center by the four categories of indigestion, diarrhea, abdominal pain, and gastroenteric trouble, respectively. Regarding the research methods of this study, the comparative analysis methods will contribute to preserving orally transmitted ethnomedicinal knowledge. Among the methods of analysis, the use of internetwork analysis as an analytical tool in this study provides informative internetwork maps between gastrointestinal disorders and medicinal species. PMID:25202330

  8. Practical considerations for volumetric wear analysis of explanted hip arthroplasties

    PubMed Central

    Langton, D. J.; Sidaginamale, R. P.; Holland, J. P.; Deehan, D.; Joyce, T. J.; Nargol, A. V. F.; Meek, R. D.; Lord, J. K.

    2014-01-01

    Objectives Wear debris released from bearing surfaces has been shown to provoke negative immune responses in the recipient. Excessive wear has been linked to early failure of prostheses. Analysis using coordinate measuring machines (CMMs) can provide estimates of total volumetric material loss of explanted prostheses and can help to understand device failure. The accuracy of volumetric testing has been debated, with some investigators stating that only protocols involving hundreds of thousands of measurement points are sufficient. We looked to examine this assumption and to apply the findings to the clinical arena. Methods We examined the effects on the calculated material loss from a ceramic femoral head when different CMM scanning parameters were used. Calculated wear volumes were compared with gold standard gravimetric tests in a blinded study. Results Various scanning parameters including point pitch, maximum point to point distance, the number of scanning contours or the total number of points had no clinically relevant effect on volumetric wear calculations. Gravimetric testing showed that material loss can be calculated to provide clinically relevant degrees of accuracy. Conclusions Prosthetic surfaces can be analysed accurately and rapidly with currently available technologies. Given these results, we believe that routine analysis of explanted hip components would be a feasible and logical extension to National Joint Registries. Cite this article: Bone Joint Res 2014;3:60–8. PMID:24627327

  9. Practical post-calibration uncertainty analysis: Yucca Mountain, Nevada, USA

    NASA Astrophysics Data System (ADS)

    James, S. C.; Doherty, J.; Eddebbarh, A.

    2009-12-01

The values of parameters in a groundwater flow model govern the precision of predictions of future system behavior. Predictive precision, thus, typically depends on an ability to infer values of system properties from historical measurements through calibration. When such data are scarce, or when their information content with respect to parameters that are most relevant to predictions of interest is weak, predictive uncertainty may be high, even if the model is “calibrated.” Recent advances help recognize this condition, quantitatively evaluate predictive uncertainty, and suggest a path toward improved predictive accuracy by identifying sources of predictive uncertainty and by determining what observations will most effectively reduce this uncertainty. We demonstrate linear and nonlinear predictive error/uncertainty analyses as applied to a groundwater flow model of Yucca Mountain, Nevada, the US’s proposed site for disposal of high-level radioactive waste. Both of these types of uncertainty analysis are readily implemented as an adjunct to model calibration with medium to high parameterization density. Linear analysis yields contributions made by each parameter to a prediction’s uncertainty and the worth of different observations, both existing and yet-to-be-gathered, toward reducing this uncertainty. Nonlinear analysis provides more accurate characterization of the uncertainty of model predictions while yielding their (approximate) probability distribution functions. This paper applies the above methods to a prediction of specific discharge and confirms the uncertainty bounds on specific discharge supplied in the Yucca Mountain Project License Application. Furthermore, Monte Carlo simulations confirm that hydrogeologic units thought to be flow barriers have probability distributions skewed toward lower permeabilities.
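The linear analysis described above reduces to standard Bayesian linear algebra: propagate a posterior parameter covariance through the prediction's sensitivity vector, and score a candidate observation's worth by the drop in predictive variance. The sketch below uses tiny synthetic matrices (two parameters, one existing observation); every number is an assumption chosen only to make the mechanics visible:

```python
import numpy as np

C_prior = np.diag([1.0, 1.0])       # prior parameter covariance (assumed)
J = np.array([[1.0, 0.2]])          # sensitivity of existing observation to parameters
R = np.array([[0.1]])               # observation error variance
y = np.array([0.5, 1.0])            # sensitivity of the prediction to the parameters

def predictive_var(J, R):
    """Linear (FOSM/Bayesian) predictive variance given observation sensitivities."""
    C_post = np.linalg.inv(J.T @ np.linalg.inv(R) @ J + np.linalg.inv(C_prior))
    return y @ C_post @ y

v0 = predictive_var(J, R)
# candidate new observation, strongly sensitive to the poorly constrained parameter 2
J2 = np.vstack([J, [0.1, 1.0]])
R2 = np.diag([0.1, 0.1])
v1 = predictive_var(J2, R2)
print(v0, v1)   # data worth: predictive variance drops when the observation is added
```

Ranking candidate observations by this variance reduction is the "worth of yet-to-be-gathered observations" computation.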

  10. Hybrid least squares multivariate spectral analysis methods

    DOEpatents

    Haaland, David M.

    2004-03-23

    A set of hybrid least squares multivariate spectral analysis methods in which spectral shapes of components or effects not present in the original calibration step are added in a following prediction or calibration step to improve the accuracy of the estimation of the amount of the original components in the sampled mixture. The hybrid method herein means a combination of an initial calibration step with subsequent analysis by an inverse multivariate analysis method. A spectral shape herein means normally the spectral shape of a non-calibrated chemical component in the sample mixture but can also mean the spectral shapes of other sources of spectral variation, including temperature drift, shifts between spectrometers, spectrometer drift, etc. The shape can be continuous, discontinuous, or even discrete points illustrative of the particular effect.
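The augmentation step in the patent abstract above can be illustrated with a tiny classical-least-squares example: a calibration matrix of pure-component spectra mispredicts when an uncalibrated spectral shape (here, a drift ramp) appears in the sample, and appending that shape to the matrix in the prediction step restores the concentration estimates. All spectra below are synthetic Gaussians; this is a generic CLS sketch, not the patented method itself:

```python
import numpy as np

wav = np.linspace(0, 1, 100)
k1 = np.exp(-((wav - 0.3) ** 2) / 0.005)      # pure-component spectrum 1
k2 = np.exp(-((wav - 0.7) ** 2) / 0.005)      # pure-component spectrum 2
drift = wav                                   # spectral shape absent from calibration

true_c = np.array([2.0, 1.0])
sample = true_c[0] * k1 + true_c[1] * k2 + 0.5 * drift

K = np.vstack([k1, k2]).T                     # original calibration matrix
naive = np.linalg.lstsq(K, sample, rcond=None)[0]

K_aug = np.vstack([k1, k2, drift]).T          # augmented with the drift shape
hybrid = np.linalg.lstsq(K_aug, sample, rcond=None)[0][:2]
print(naive.round(2), hybrid.round(2))        # augmented fit recovers [2.0, 1.0]
```

The same augmentation works for any added shape: temperature drift, spectrometer shifts, or an uncalibrated chemical component.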

  11. A numerical method for the stress analysis of stiffened-shell structures under nonuniform temperature distributions

    NASA Technical Reports Server (NTRS)

    Heldenfels, Richard R

    1951-01-01

    A numerical method is presented for the stress analysis of stiffened-shell structures of arbitrary cross section under nonuniform temperature distributions. The method is based on a previously published procedure that is extended to include temperature effects and multicell construction. The application of the method to practical problems is discussed and an illustrative analysis is presented of a two-cell box beam under the combined action of vertical loads and a nonuniform temperature distribution.

  12. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2010-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  13. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor)

    2009-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted simultaneously with native fluorescence spectroscopy to provide high levels of sensitivity and specificity in the same instrument.

  14. A qualitative analysis of case managers' use of harm reduction in practice.

    PubMed

    Tiderington, Emmy; Stanhope, Victoria; Henwood, Benjamin F

    2013-01-01

    The harm reduction approach has become a viable framework within the field of addictions, yet there is limited understanding about how this approach is implemented in practice. For people who are homeless and have co-occurring psychiatric and substance use disorders, the Housing First model has shown promising results in employing such an approach. This qualitative study utilizes ethnographic methods to explore case managers' use of harm reduction within Housing First with a specific focus on the consumer-provider relationship. Analysis of observational data and in-depth interviews with providers and consumers revealed how communication between the two regarding the consumer's substance use interacted with the consumer-provider relationship. From these findings emerged a heuristic model of harm reduction practice that highlighted the profound influence of relationship quality on the paths of communication regarding substance use. This study provides valuable insight into how harm reduction is implemented in clinical practice that ultimately has public health implications in terms of more effectively addressing high rates of addiction that contribute to homelessness and health disparities. PMID:22520277

  15. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
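The scaling target in the abstract above is the deformation of an SDF oscillator. As a hedged toy of only the elastic skeleton of that idea (the actual MPS method uses an inelastic first-'mode' system and iterates to a tolerance), one can integrate an elastic SDF system under a record and scale the record so its peak deformation hits a target; for a linear system the scale factor is simply the ratio. The record, period, and target below are synthetic assumptions:

```python
import numpy as np

def sdf_peak(ag, dt, period=1.0, zeta=0.05):
    """Peak deformation of an elastic SDF system (central-difference integration)."""
    w = 2 * np.pi / period
    k, c, m = w**2, 2 * zeta * w, 1.0          # unit mass; p(t) = -m * ag(t)
    u = np.zeros(len(ag))
    for i in range(1, len(ag) - 1):
        u[i + 1] = ((-ag[i] - (k - 2 * m / dt**2) * u[i]
                     - (m / dt**2 - c / (2 * dt)) * u[i - 1])
                    / (m / dt**2 + c / (2 * dt)))
    return np.abs(u).max()

rng = np.random.default_rng(5)
dt = 0.01
ag = rng.normal(size=4000) * np.exp(-np.arange(4000) * dt / 5)  # synthetic record

target = 0.05                        # target peak deformation (assumed units)
scale = target / sdf_peak(ag, dt)    # elastic response is linear in the input
print(round(sdf_peak(scale * ag, dt), 3))
```

For the inelastic SDF system of the real method the response is no longer linear in the scale factor, which is why MPS iterates until the target is matched within tolerance.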

  16. Practical method for evaluating the visibility of moire patterns for CRT design

    NASA Astrophysics Data System (ADS)

    Shiramatsu, Naoki; Tanigawa, Masashi; Iwata, Shuji

    1995-04-01

The high-resolution CRT displays used for computer monitors and high-performance TVs often produce a pattern of bright and dark stripes on the screen called a moire pattern. The elimination of the moire is an important consideration in CRT design. The objective of this study is to provide a practical method for estimating and evaluating a moire pattern considering its visibility to human vision. On the basis of a mathematical model of moire generation, precise values of the period and the intensity of a moire are calculated from the actual data of the electron beam profile and the transmittance distribution of the apertures of the shadow mask. The visibility of the moire is evaluated by plotting the calculation results on the contrast-period plane, which consists of visible and invisible moire pattern regions based on experimental results of psychological tests. Not only fundamental design parameters such as the shadow mask pitch and the scanning line pitch but also details of the electron beam profile such as distortion or asymmetry can be examined. In addition to the analysis, image simulation of a moire using the image memory is also available.
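The period coordinate of the contrast-period plane follows the standard beat relation between two nearly equal spatial frequencies (mask pitch and scan-line pitch). A minimal numeric sketch with illustrative pitches, not values from the paper (the paper's intensity calculation, which needs the beam profile, is omitted):

```python
# moire (beat) period between two nearly equal spatial frequencies:
# 1/P_moire = |1/p_mask - 1/p_scan|
p_mask, p_scan = 0.25, 0.26              # mm, illustrative pitches
period = 1.0 / abs(1.0 / p_mask - 1.0 / p_scan)
print(round(period, 2))                  # mm; a 0.01 mm pitch mismatch beats at 6.5 mm
```

The computed (period, contrast) point is then checked against the experimentally determined visible region of the plane.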

  17. Impact of pedagogical method on Brazilian dental students' waste management practice.

    PubMed

    Victorelli, Gabriela; Flório, Flávia Martão; Ramacciato, Juliana Cama; Motta, Rogério Heládio Lopes; de Souza Fonseca Silva, Almenara

    2014-11-01

    The purpose of this study was to conduct a qualitative analysis of waste management practices among a group of Brazilian dental students (n=64) before and after implementing two different pedagogical methods: 1) the students attended a two-hour lecture based on World Health Organization standards; and 2) the students applied the lessons learned in an organized group setting aimed toward raising their awareness about socioenvironmental issues related to waste. All eligible students participated, and the students' learning was evaluated through their answers to a series of essay questions, which were quantitatively measured. Afterwards, the impact of the pedagogical approaches was compared by means of qualitative categorization of wastes generated in clinical activities. Waste categorization was performed for a period of eight consecutive days, both before and thirty days after the pedagogical strategies. In the written evaluation, 80 to 90 percent of the students' answers were correct. The qualitative assessment revealed a high frequency of incorrect waste disposal with a significant increase of incorrect disposal inside general and infectious waste containers (p<0.05). Although the students' theoretical learning improved, it was not enough to change behaviors established by cultural values or to encourage the students to adequately segregate and package waste material. PMID:25362694

  18. Moving environmental DNA methods from concept to practice for monitoring aquatic macroorganisms

    USGS Publications Warehouse

    Goldberg, Caren S.; Strickler, Katherine M.; Pilliod, David S.

    2015-01-01

    The discovery that macroorganisms can be detected from their environmental DNA (eDNA) in aquatic systems has immense potential for the conservation of biological diversity. This special issue contains 11 papers that review and advance the field of eDNA detection of vertebrates and other macroorganisms, including studies of eDNA production, transport, and degradation; sample collection and processing to maximize detection rates; and applications of eDNA for conservation using citizen scientists. This body of work is an important contribution to the ongoing efforts to take eDNA detection of macroorganisms from technical breakthrough to established, reliable method that can be used in survey, monitoring, and research applications worldwide. While the rapid advances in this field are remarkable, important challenges remain, including consensus on best practices for collection and analysis, understanding of eDNA diffusion and transport, and avoidance of inhibition in sample collection and processing. Nonetheless, as demonstrated in this special issue, eDNA techniques for research and monitoring are beginning to realize their potential for contributing to the conservation of biodiversity globally.

  19. Chromatographic methods for analysis of triazine herbicides.

    PubMed

    Abbas, Hana Hassan; Elbashir, Abdalla A; Aboul-Enein, Hassan Y

    2015-07-01

Gas chromatography (GC) and high-performance liquid chromatography (HPLC) coupled to different detectors, and in combination with different sample extraction methods, are most widely used for analysis of triazine herbicides in different environmental samples. Nowadays, many variations and modifications of extraction and sample preparation methods such as solid-phase microextraction (SPME), hollow fiber-liquid phase microextraction (HF-LPME), stir bar sorptive extraction (SBSE), headspace-solid phase microextraction (HS-SPME), dispersive liquid-liquid microextraction (DLLME), dispersive liquid-liquid microextraction based on solidification of floating organic droplet (DLLME-SFO), ultrasound-assisted emulsification microextraction (USAEME), and others have been introduced and developed to obtain sensitive and accurate methods for the analysis of these hazardous compounds. In this review, several analytical properties such as linearity, sensitivity, repeatability, and accuracy for each developed method are discussed, and excellent results were obtained for most of the developed methods combined with GC and HPLC techniques for the analysis of triazine herbicides. This review gives an overview of recent publications of the application of GC and HPLC for analysis of triazine herbicides residues in various samples. PMID:25849823

  20. A practical guide to methods of parentage analysis

    E-print Network

    Jones, Adam

    … (Jeffreys et al. 1985). This multi-locus DNA fingerprinting approach was rapidly adopted by avian … the spread of DNA fingerprinting applications outside of birds and mammals. Several years after the development of DNA fingerprinting, the discovery of microsatellite markers (Tautz 1989), also known as simple

  1. Practical Implementation of New Particle Tracking Method to the Real Field of Groundwater Flow and Transport

    PubMed Central

    Suk, Heejun

    2012-01-01

    In articles published in 2009 and 2010, Suk and Yeh reported the development of an accurate and efficient particle tracking algorithm for simulating a path line under complicated unsteady flow conditions, using a range of elements within finite elements in multidimensions. Here two examples, an aquifer storage and recovery (ASR) example and a landfill leachate migration example, are examined to enhance the practical implementation of the proposed particle tracking method, known as Suk's method, to a real field of groundwater flow and transport. Results obtained by Suk's method are compared with those obtained by Pollock's method. Suk's method produces superior tracking accuracy, which suggests that Suk's method can describe various advection-dominated transport problems in a real field more accurately than existing popular particle tracking methods, such as Pollock's method. To illustrate the wide and practical applicability of Suk's method to random-walk particle tracking (RWPT), the original RWPT has been modified to incorporate Suk's method. Performance of the modified RWPT using Suk's method is compared with the original RWPT scheme by examining the concentration distributions obtained by the modified RWPT and the original RWPT under complicated transient flow systems. PMID:22476629
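
    For orientation, the baseline scheme these comparisons are made against, Pollock's method, assumes velocity varies linearly across each grid cell, which yields a closed-form exit time along each coordinate axis. A minimal one-axis sketch (function and variable names are illustrative, not taken from either paper):

```python
import math

def pollock_exit_1d(p, lo, hi, v_lo, v_hi):
    """Time for a particle at position p to exit a cell [lo, hi] along one
    axis, with the face velocities v_lo, v_hi interpolated linearly (Pollock)."""
    A = (v_hi - v_lo) / (hi - lo)      # intra-cell velocity gradient
    v_p = v_lo + A * (p - lo)          # velocity at the particle
    if v_p == 0.0:
        return math.inf                # stagnation point: never exits
    v_face = v_hi if v_p > 0.0 else v_lo   # velocity at the face approached
    if abs(A) < 1e-12:                 # uniform velocity: straight-line travel
        return ((hi if v_p > 0.0 else lo) - p) / v_p
    if v_face / v_p <= 0.0:            # flow reverses before reaching the face
        return math.inf
    return math.log(v_face / v_p) / A  # t = (1/A) * ln(v_face / v_p)

# Uniform flow: 0.75 units of distance at unit velocity.
print(pollock_exit_1d(0.25, 0.0, 1.0, 1.0, 1.0))
# Accelerating flow: velocity doubles across the cell, so exit time is ln(2).
print(pollock_exit_1d(0.0, 0.0, 1.0, 1.0, 2.0))
```

    In multiple dimensions the particle leaves through the face whose axis gives the smallest such exit time, which is what makes the scheme attractive as a fast baseline.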

  2. Multiple predictor smoothing methods for sensitivity analysis

    Microsoft Academic Search

    Curtis B. Storlie; J. C. Helton

    2005-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models (GAMs), (iii) projection pursuit regression (PP_REG), and (iv) recursive partitioning regression (RP_REG). The indicated procedures are

  3. Simplified method for nonlinear structural analysis

    NASA Technical Reports Server (NTRS)

    Kaufman, A.

    1983-01-01

    A simplified inelastic analysis computer program was developed for predicting the stress-strain history of a thermomechanically cycled structure from an elastic solution. The program uses an iterative and incremental procedure to estimate the plastic strains from the material stress-strain properties and a simulated plasticity hardening model. The simplified method was exercised on a number of problems involving uniaxial and multiaxial loading, isothermal and nonisothermal conditions, and different materials and plasticity models. Good agreement was found between these analytical results and nonlinear finite element solutions for these problems. The simplified analysis program used less than 1 percent of the CPU time required for a nonlinear finite element analysis.

  4. Design analysis, robust methods, and stress classification

    SciTech Connect

    Bees, W.J. (ed.)

    1993-01-01

    This special edition publication volume is comprised of papers presented at the 1993 ASME Pressure Vessels and Piping Conference, July 25--29, 1993 in Denver, Colorado. The papers were prepared for presentations in technical sessions developed under the auspices of the PVPD Committees on Computer Technology, Design and Analysis, Operations Applications and Components. The topics included are: Analysis of Pressure Vessels and Components; Expansion Joints; Robust Methods; Stress Classification; and Non-Linear Analysis. Individual papers have been processed separately for inclusion in the appropriate data bases.

  5. A Practical Test Method for Mode I Fracture Toughness of Adhesive Joints with Dissimilar Substrates

    SciTech Connect

    Boeman, R.G.; Erdman, D.L.; Klett, L.B.; Lomax, R.D.

    1999-09-27

    A practical test method for determining the mode I fracture toughness of adhesive joints with dissimilar substrates will be discussed. The test method is based on the familiar Double Cantilever Beam (DCB) specimen geometry, but overcomes limitations in existing techniques that preclude their use when testing joints with dissimilar substrates. The test method is applicable to adhesive joints where the two bonded substrates have different flexural rigidities due to geometric and/or material considerations. Two specific features discussed are the use of backing beams to prevent substrate damage and a compliance matching scheme to achieve symmetric loading conditions. The procedure is demonstrated on a modified DCB specimen comprised of SRIM composite and thin-section, e-coat steel substrates bonded with an epoxy adhesive. Results indicate that the test method provides a practical means of characterizing the mode I fracture toughness of joints with dissimilar substrates.

  6. Connecting Practice, Theory and Method: Supporting Professional Doctoral Students in Developing Conceptual Frameworks

    ERIC Educational Resources Information Center

    Kumar, Swapna; Antonenko, Pavlo

    2014-01-01

    From an instrumental view, conceptual frameworks that are carefully assembled from existing literature in Educational Technology and related disciplines can help students structure all aspects of inquiry. In this article we detail how the development of a conceptual framework that connects theory, practice and method is scaffolded and facilitated…

  7. A comparison of South Australia's driver licensing methods: competency-based training vs. practical examination

    E-print Network

    A comparison of South Australia's driver licensing methods: competency-based training vs. practical examination. (a) … 5th Floor CDRC Building, The Queen Elizabeth Hospital, Woodville Road, Woodville SA 5011, Australia; (b) Transport Systems Centre, School of Geoinformatics, Planning and Building, University of South Australia

  8. Preaching History in a Social Studies Methods Course: A Portrait of Practice

    ERIC Educational Resources Information Center

    Slekar, Timothy D.

    2006-01-01

    This article presents a portrait of practice of one social studies methods professor engaged in teaching his course and analyzes the choices the professor makes during the semester. These choices are linked to his philosophy of social studies education with particular attention paid to his passionate belief in the American story as the core of…

  9. What Informs Practice and What Is Valued in Corporate Instructional Design? A Mixed Methods Study

    ERIC Educational Resources Information Center

    Thompson-Sellers, Ingrid N.

    2012-01-01

    This study used a two-phased explanatory mixed-methods design to explore in-depth what factors are perceived by Instructional Design and Technology (IDT) professionals as impacting instructional design practice, how these factors are valued in the field, and what differences in perspectives exist between IDT managers and non-managers. For phase 1…

  10. Simple, Practical Method for Determining Station Weights Using Thiessen Polygons and Isohyetal Maps

    E-print Network

    Fiedler, Fritz R.

    … (1) the classical Thiessen methodology; and (2) inverse distance squared weighting (NWS 2002). However, station data … estimated with the inverse distance squared technique (NWS 2002). The goal is to make sure that the MAP…
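
    The second of the two weighting schemes named in the snippet, inverse distance squared, is simple to state: each station's weight is proportional to 1/d², normalized to sum to one. A minimal sketch (coordinates are illustrative; it assumes no station coincides with the target point):

```python
def idw2_weights(target, stations):
    """Normalized inverse-distance-squared weights of gauging stations
    for a target point; target and stations are (x, y) pairs."""
    d2 = [(target[0] - x) ** 2 + (target[1] - y) ** 2 for (x, y) in stations]
    inv = [1.0 / d for d in d2]        # assumes d > 0 for every station
    total = sum(inv)
    return [w / total for w in inv]

# A station twice as far away receives a quarter of the unnormalized weight.
print(idw2_weights((0.0, 0.0), [(1.0, 0.0), (2.0, 0.0)]))  # [0.8, 0.2]
```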

  11. Using the Patient as Teacher: A Training Method for Family Practice Residents in Behavioral Science

    Microsoft Academic Search

    Janis L. Lewis; DeVon R. Stokes; Lawrence R. Fischetti; Aaron L. Rutledge

    1988-01-01

    Since the inception of family medicine as a specialty in allopathy and osteopathy in 1969 and 1973, respectively, there has been a need to develop integrative approaches of teaching behavioral science concepts without violating the scope of practice limitations between the fields. We describe a collaborative training method by which we attempt to achieve this balance. Residents referring patients for

  12. Econometric Methods for Causal Evaluation of Education Policies and Practices: A Non-Technical Guide

    ERIC Educational Resources Information Center

    Schlotter, Martin; Schwerdt, Guido; Woessmann, Ludger

    2011-01-01

    Education policy-makers and practitioners want to know which policies and practices can best achieve their goals. But research that can inform evidence-based policy often requires complex methods to distinguish causation from accidental association. Avoiding econometric jargon and technical detail, this paper explains the main idea and intuition…

  13. Self-Assessment Methods in Writing Instruction: A Conceptual Framework, Successful Practices and Essential Strategies

    ERIC Educational Resources Information Center

    Nielsen, Kristen

    2014-01-01

    Student writing achievement is essential to lifelong learner success, but supporting writing can be challenging for teachers. Several large-scale analyses of publications on writing have called for further study of instructional methods, as the current literature does not sufficiently address the need to support best teaching practices.…

  14. A Practical Copper Loss Measurement Method for the Planar Transformer in High-Frequency Switching Converters

    Microsoft Academic Search

    Yongtao Han; Wilson Eberle; Yan-Fei Liu

    2007-01-01

    In this paper, a new and practical measurement method is proposed to characterize the planar transformer copper loss operating in a high-frequency switching mode power supply (SMPS). The scheme is easy to set up, and it provides an equivalent winding alternating current resistance, which is the result of all the field effects on the transformer windings to achieve more accurate

  15. Passive Sampling Methods for Contaminated Sediments: Practical Guidance for Selection, Calibration, and Implementation

    EPA Science Inventory

    This article provides practical guidance on the use of passive sampling methods(PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific a...

  16. Developing a clinical hypermedia corpus: experiences from the use of a practice-centered method.

    PubMed Central

    Timpka, T.; Nyce, J. M.; Sjöberg, C.; Hedblom, P.; Lindblom, P.

    1992-01-01

    This paper outlines a practice-centered method for creation of a hypermedia corpus. It also describes experiences with creating such a corpus of information to support interprofessional work at a Primary Healthcare Center. From these experiences, a number of basic issues regarding information systems development within medical informatics will be discussed. PMID:1482924

  17. Practical use of three-dimensional inverse method for compressor blade design

    Microsoft Academic Search

    S. Damle; T. Dang; J. Stringham; E. Razinsky

    1999-01-01

    The practical utility of a three-dimensional inverse viscous method is demonstrated by carrying out a design modification of a first-stage rotor in an industrial compressor. In this design modification study, the goal is to improve the efficiency of the original blade while retaining its overall aerodynamic, structural, and manufacturing characteristics. By employing a simple modification to the blade pressure loading

  18. National Survey of Psychologists' Test Feedback Training, Supervision, and Practice: A Mixed Methods Study

    Microsoft Academic Search

    Kyle T. Curry; William E. Hanson

    2010-01-01

    In this empirical, mixed methods study, we explored test feedback training, supervision, and practice among psychologists, focusing specifically on how feedback is provided to clients and whether feedback skills are taught in graduate programs. Based on a 48.5% return rate, this national survey of clinical, counseling, and school psychologists suggests that psychologists provide test feedback to clients, but inconsistently. Most respondents,

  19. Amplitude and Phase Fluorescence-Spectroscopy Methods for Dissolved Oxygen Concentration Evaluation: Comparative Practical Results

    Microsoft Academic Search

    Gustavo J. Grillo; Miguel A. Pérez; Marta Valledor; Rubén Ramos

    2005-01-01

    This paper shows the practical results from a detailed comparative study of amplitude and phase fluorescence-spectroscopy methods for dissolved oxygen concentration evaluation. These results were obtained with an implemented optoelectronic measurement system that guarantees near-optimal operation conditions for both methods and a commercial fluorescence optical-fiber sensor, which is excited by a continuous-regulated sinusoidal-amplitude modulated light beam. The comparison was made

  20. Practical method for evaluating the sound field radiated from a waveguide.

    PubMed

    Feng, Xuelei; Shen, Yong; Chen, Simiao; Zhao, Ye

    2015-01-01

    This letter presents a simple and practical method for evaluating the sound field radiated from a waveguide. By using the proposed method, detailed information about the radiated sound field can be obtained by measuring the sound field in the mouth of the baffled waveguide. To examine this method's effectiveness, the radiated sound pressure distribution in space was first evaluated by using the proposed method, and then it was measured directly for comparison. Experiments using two different waveguides showed good agreement between the evaluated and the measured radiated sound pressure distributions. PMID:25618097
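
    The letter's derivation is not reproduced in this abstract. One standard route from data sampled across the mouth of a baffled source to the exterior field is Rayleigh's first integral, discretized over the mouth; the sketch below assumes normal-velocity samples on a grid of mouth points (all names and values are illustrative, not the authors' procedure):

```python
import numpy as np

def rayleigh_pressure(field_pt, src_pts, v_n, dS, k, rho=1.21, c=343.0):
    """Discretized Rayleigh first integral for a source in an infinite baffle:
    p(r) = (j*omega*rho / (2*pi)) * sum_i v_n_i * exp(-j*k*R_i) / R_i * dS."""
    R = np.linalg.norm(src_pts - field_pt, axis=1)   # source-to-field distances
    omega = k * c
    return 1j * omega * rho / (2.0 * np.pi) * np.sum(v_n * np.exp(-1j * k * R) / R * dS)
```

    For a rigid circular piston this sum converges to the textbook on-axis result, which gives a quick sanity check of any implementation.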

  1. A Mixed Methods Content Analysis of the Research Literature in Science Education

    ERIC Educational Resources Information Center

    Schram, Asta B.

    2014-01-01

    In recent years, more and more researchers in science education have been turning to the practice of combining qualitative and quantitative methods in the same study. This approach of using mixed methods creates possibilities to study the various issues that science educators encounter in more depth. In this content analysis, I evaluated 18…

  2. Monte Carlo methods for design and analysis of radiation detectors

    E-print Network

    Shultis, J. Kenneth

    William L. … Keywords: radiation detectors, inverse problems, detector design. An overview of Monte Carlo as a practical method for designing and analyzing radiation detectors is provided. The emphasis is on detectors

  3. Particle size analysis of nanocrystals: improved analysis method.

    PubMed

    Keck, Cornelia M

    2010-05-01

    The influence of optical parameters, additional techniques (e.g. PIDS technology) and the importance of light microscopy were investigated by comparing laser diffraction data obtained via the conventional method and an optimized analysis method. The influence of possible dissolution of nanocrystals during a measurement on the size result obtained was also assessed in this study. The results reveal that dissolution occurs if unsaturated medium or microparticle-saturated medium is used for the measurements. The dissolution is erratic and the results are not reproducible. Dissolution can be overcome by saturating the measuring medium prior to the measurement. If nanocrystals are analysed, the dispersion medium should be saturated with the nanocrystals, because their solubility is higher than that of coarse micro-sized drug material. The importance of using the optimized analysis method was proven by analysing 40 different nanosuspensions via the conventional versus the optimized sizing method. There was no large difference in the results obtained for the 40 nanosuspensions using the conventional method. This would have led to the conclusion that all 40 formulations investigated are physically stable. However, analysis via the optimized method revealed that of the 40 formulations investigated only four were physically stable. In conclusion, an optimized analysis saves time and money and avoids misleading developments, because discrimination between "stable" and "unstable" can be done reliably at a very early stage of development. PMID:19733647

  4. Methods for Chemical Analysis of Fresh Waters.

    ERIC Educational Resources Information Center

    Golterman, H. L.

    This manual, one of a series prepared for the guidance of research workers conducting studies as part of the International Biological Programme, contains recommended methods for the analysis of fresh water. The techniques are grouped in the following major sections: Sample Taking and Storage; Conductivity, pH, Oxidation-Reduction Potential,…

  5. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, L.M.; Ng, E.G.

    1998-09-29

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process by monitoring nonlinear data are disclosed. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated. 8 figs.

  6. Integrated method for chaotic time series analysis

    DOEpatents

    Hively, Lee M. (Philadelphia, TN); Ng, Esmond G. (Concord, TN)

    1998-01-01

    Methods and apparatus for automatically detecting differences between similar but different states in a nonlinear process monitor nonlinear data. Steps include: acquiring the data; digitizing the data; obtaining nonlinear measures of the data via chaotic time series analysis; obtaining time serial trends in the nonlinear measures; and determining by comparison whether differences between similar but different states are indicated.

  7. Epsilon substitution method for elementary analysis

    Microsoft Academic Search

    Grigori Mints; Sergei Tupailo; Wilfried Buchholz

    1996-01-01

    We formulate the epsilon substitution method for elementary analysis EA (second-order arithmetic with comprehension for arithmetical formulas with predicate parameters). Two proofs of its termination are presented. One uses an embedding into a ramified system of level one and cut-elimination for this system. The second proof uses a non-effective continuity argument.

  8. Analysis methods for tocopherols and tocotrienols

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Tocopherols and tocotrienols, sometimes called tocochromanols or tocols, are also collectively termed Vitamin E. Vitamins A, D, E, and K, are referred to as fat soluble vitamins. Since the discovery of Vitamin E in 1922, many methods have been developed for the analysis of tocopherols and tocotrie...

  9. Automated Analysis of Java Methods for Confidentiality

    E-print Network

    Cerny, Pavol

    Automated Analysis of Java Methods for Confidentiality. Pavol Černý and Rajeev Alur, University … midlets for mobile devices, where a central correctness requirement concerns confidentiality of data, and … are not applicable to checking confidentiality properties that require reasoning about equivalence among executions

  10. MARGIN AND SENSITIVITY METHODS FOR SECURITY ANALYSIS

    E-print Network

    MARGIN AND SENSITIVITY METHODS FOR SECURITY ANALYSIS OF ELECTRIC POWER SYSTEMS, by Scott Greene. … defined by limiting events and instabilities, and the sensitivity of those margins with respect to assump… The usefulness of computing the sensitivities of these margins with respect to interarea transfers, loading

  11. Comparison and Cost Analysis of Drinking Water Quality Monitoring Requirements versus Practice in Seven Developing Countries

    PubMed Central

    Crocker, Jonny; Bartram, Jamie

    2014-01-01

    Drinking water quality monitoring programs aim to support provision of safe drinking water by informing water quality management. Little evidence or guidance exists on best monitoring practices for low resource settings. Lack of financial, human, and technological resources reduces a country’s ability to monitor water supply. Monitoring activities were characterized in Cambodia, Colombia, India (three states), Jordan, Peru, South Africa, and Uganda according to water sector responsibilities, monitoring approaches, and marginal cost. The seven study countries were selected to represent a range of low resource settings. The focus was on monitoring of microbiological parameters, such as E. coli, coliforms, and H2S-producing microorganisms. Data collection involved qualitative and quantitative methods. Across seven study countries, few distinct approaches to monitoring were observed, and in all but one country all monitoring relied on fixed laboratories for sample analysis. Compliance with monitoring requirements was highest for operational monitoring of large water supplies in urban areas. Sample transport and labor for sample collection and analysis together constitute approximately 75% of marginal costs, which exclude capital costs. There is potential for substantive optimization of monitoring programs by considering field-based testing and by fundamentally reconsidering monitoring approaches for non-piped supplies. This is the first study to look quantitatively at water quality monitoring practices in multiple developing countries. PMID:25046632

  12. Cost management practices for supply chain management: an exploratory analysis

    Microsoft Academic Search

    Stephan M. Wagner

    2008-01-01

    Cost management within a supply chain management domain has lately received a great deal of interest from academics and practitioners; however, the literature is still dominated by conceptual and anecdotal work. The major issue is that it is difficult at best to draw conclusions with any level of confidence concerning the actual degree of usage of various cost management practices.

  13. An Analysis of Teacher Practices with Toddlers during Social Conflicts

    ERIC Educational Resources Information Center

    Gloeckler, Lissy R.; Cassell, Jennifer M.; Malkus, Amy J.

    2014-01-01

    Employing a quasi-experimental design, this pilot study on teacher practices with toddlers during social conflicts was conducted in the southeastern USA. Four child-care classrooms, teachers (n = 8) and children (n = 51) were assessed with the Classroom Assessment Scoring System -- Toddler [CLASS-Toddler; La Paro, K., Hamre, B. K., & Pianta,…

  14. An Analysis of Farm Injuries and Safety Practices in Mississippi

    Microsoft Academic Search

    Carey L. Ford; Terence L. Lynch

    2000-01-01

    In Mississippi, agriculture is the most dangerous industry employing over 30% of the state's workforce. Records from the Mississippi Cooperative Extension Service indicated that 18 tractor deaths occurred in 1997, a new all-time record. Also, there were two additional deaths involving other farm machinery. This study was designed to determine the magnitude of farm injuries, safety practices, and educational programs

  15. Professional Learning in Rural Practice: A Sociomaterial Analysis

    ERIC Educational Resources Information Center

    Slade, Bonnie

    2013-01-01

    Purpose: This paper aims to examine the professional learning of rural police officers. Design/methodology/approach: This qualitative case study involved interviews and focus groups with 34 police officers in Northern Scotland. The interviews and focus groups were transcribed and analysed, drawing on practice-based and sociomaterial learning…

  16. Honesty in Critically Reflective Essays: An Analysis of Student Practice

    ERIC Educational Resources Information Center

    Maloney, Stephen; Tai, Joanna Hong-Meng; Lo, Kristin; Molloy, Elizabeth; Ilic, Dragan

    2013-01-01

    In health professional education, reflective practice is seen as a potential means for self-improvement from everyday clinical encounters. This study aims to examine the level of student honesty in critical reflection, and barriers and facilitators for students engaging in honest reflection. Third year physiotherapy students, completing summative…

  17. A practical approach to fire hazard analysis for offshore structures.

    PubMed

    Krueger, Joel; Smith, Duncan

    2003-11-14

    Offshore quantitative risk assessments (QRA) have historically been complex and costly. For large offshore design projects, the level of detail required for a QRA is often not available until well into the detailed design phase of the project. In these cases, the QRA may be unable to provide timely hazard understanding. As a result, the risk reduction measures identified often come too late to allow for cost effective changes to be implemented. This forces project management to make a number of difficult or costly decisions. This paper demonstrates how a scenario-based approach to fire risk assessment can be effectively applied early in a project's development. The scenario or design basis fire approach calculates the consequences of a select number of credible fire scenarios; determines the potential impact on the platform process equipment, structural members, egress routes, and safety systems; and determines the effectiveness of potential options for mitigation. The early provision of hazard data allows the project team to select an optimum design that is safe and will meet corporate or regulatory risk criteria later in the project cycle. The focus of this paper is on the application of the scenario-based approach to gas jet fires. This paper draws on recent experience in the Gulf of Mexico (GOM) and other areas to outline an approach to fire hazard analysis and fire hazard management for deep-water structures. The methods presented will include discussions from the recent June 2002 International Workshop for Fire Loading and Response. PMID:14602403

  18. Multiple predictor smoothing methods for sensitivity analysis.

    SciTech Connect

    Helton, Jon Craig; Storlie, Curtis B.

    2006-08-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (1) locally weighted regression (LOESS), (2) additive models, (3) projection pursuit regression, and (4) recursive partitioning regression. The indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present.
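
    The first smoother in the list, locally weighted regression (LOESS), illustrates the core idea: smooth the output on one input at a time and read off the explained variance as a nonparametric sensitivity indicator. A minimal single-step sketch (not the authors' stepwise procedure; `frac` and the tricube kernel are conventional choices):

```python
import numpy as np

def loess_1d(x, y, frac=0.2):
    """LOESS: a weighted linear fit over the frac*n nearest neighbours of
    each sample point, with tricube weights."""
    n = len(x)
    k = max(3, int(frac * n))
    yhat = np.empty(n)
    for i in range(n):
        d = np.abs(x - x[i])
        idx = np.argsort(d)[:k]                        # k nearest neighbours
        w = (1.0 - (d[idx] / d[idx].max()) ** 3) ** 3  # tricube kernel
        sw = np.sqrt(w)                                # weighted least squares
        X = np.column_stack([np.ones(k), x[idx]])
        beta, *_ = np.linalg.lstsq(X * sw[:, None], y[idx] * sw, rcond=None)
        yhat[i] = beta[0] + beta[1] * x[i]
    return yhat

def loess_r2(x, y, frac=0.2):
    """Fraction of output variance explained by smoothing y on one input."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    return 1.0 - np.var(y - loess_1d(x, y, frac)) / np.var(y)
```

    An influential input scores near 1 even when its relationship to the output is strongly nonlinear, which is exactly where indicators based on linear or rank regression break down.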

  19. Empirical Analysis of Green Supply Chain Management Practices in Indian Automobile Industry

    NASA Astrophysics Data System (ADS)

    Luthra, S.; Garg, D.; Haleem, A.

    2014-04-01

    Environmental sustainability and green environmental issues are increasingly popular among researchers and supply chain practitioners. An attempt has been made to identify and empirically analyze green supply chain management (GSCM) practices in the Indian automobile industry. Six main GSCM practices (comprising 37 sub-practices) and four expected performance outcomes (comprising 16 performance measures) were identified from a literature review. A questionnaire-based survey was conducted to validate these practices and performance outcomes. 123 complete questionnaires were collected from Indian automobile organizations and used for the empirical analysis. Descriptive statistics were used to establish the current implementation status of GSCM practices in the Indian automobile industry, and multiple regression analysis was carried out to assess the impact of currently adopted GSCM practices on expected organizational performance outcomes. The results of the study suggest that environmental, economic, social and operational performance improves with the implementation of GSCM practices. This paper may help readers understand various GSCM implementation issues and help practicing managers improve their performance in the supply chain.

  20. Social Workers and Policy Practice: An Analysis of Job Descriptions in Israel

    Microsoft Academic Search

    Idit Weiss-Gal; Lia Levin

    2010-01-01

    Social workers' involvement in the policy-making process (policy practice) is an important aspect of social work. This article examines formal social work job descriptions in an effort to determine whether, and to what extent, social workers in Israel are required to engage in policy practice and which specific activities are required of them in this sense. A quantitative content analysis

  1. Bridging Work Practice and System Design: Integrating Systemic Analysis, Appreciative Intervention and Practitioner Participation

    Microsoft Academic Search

    Helena Karasti

    2001-01-01

    This article discusses the integration of work practice and system design. By scrutinising the unfolding discourse of workshop participants the co-construction of work practice issues as relevant design considerations is described. Through a mutual exploration of ethnography and participatory design the contributing constituents to the co-construction process are identified and put forward as elements in the integration of `systemic analysis'

  2. The Impact of Weather Extremes on Agricultural Production Methods: Does Drought Increase Adoption of Conservation Tillage Practices?

    Microsoft Academic Search

    Ya Ding; Karina Schoengold

    2007-01-01

    The adoption of conservation tillage practices such as ridge till, mulch till, or no-till has been shown to reduce soil erosion. An additional benefit of these conservation practices is that they also increase soil moisture. Therefore, these practices appear to be a method that agricultural producers can use to reduce their risk associated with abnormally dry or wet conditions (i.e.,

  3. Nonlinear analysis method can improve pipeline design

    SciTech Connect

    Aynbinder, A.; Taksa, B. [Gulf Interstate Engineering, Houston, TX (United States); Dalton, P.

    1996-03-25

    A nonlinear engineering method for analyzing pipe stress criteria has been developed and can be used in common spreadsheet software for pipeline design. Designs based on this method can enhance the operational reliability of pipeline systems because they can more accurately determine actual pipe stress and strain. Most pipeline design codes establish allowable equivalent-stress limits that are higher than the pipe steel's proportional limit (the end of the linear relationship between stress and strain), which is approximately 70% of the specified minimum yield strength (SMYS). Therefore, consideration of the nonlinear mechanical properties of the material is reasonable in pipeline stress analysis. The nonlinear, numerical engineering method proposed is based on small elastic-plastic deformation theories and design data for materials used to manufacture pipe according to industry specifications. The method allows elastic-plastic concepts to be easily incorporated in pipeline design. In some cases, this method allows the pipe's design wall thickness to be reduced; in other cases, an increase in the temperature differential can be tolerated. This method can also be used for calculating the rigidity characteristics of the pipe. The results may be used in pipeline system analysis and such design programs as TRIFLEX or CAESAR.
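
    The abstract does not state which constitutive fit the spreadsheet method uses. One common representation of a pipe steel's stress-strain curve beyond the proportional limit is the Ramberg-Osgood form, sketched here with illustrative X52-class values (the modulus, SMYS, and hardening exponent are assumptions, not taken from the paper):

```python
def ro_strain(sigma, E=207_000.0, smys=359.0, n=15.0):
    """Ramberg-Osgood total strain at stress sigma (units: MPa).
    The plastic term is calibrated to 0.2% plastic strain at yield."""
    return sigma / E + 0.002 * (sigma / smys) ** n

# Below roughly 70% of SMYS the plastic term is negligible (near-linear
# response); approaching yield, total strain departs visibly from sigma/E.
for s in (0.7 * 359.0, 0.9 * 359.0, 359.0):
    print(round(s, 1), ro_strain(s))
```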

  4. Language Ideology or Language Practice? An Analysis of Language Policy Documents at Swedish Universities

    ERIC Educational Resources Information Center

    Björkman, Beyza

    2014-01-01

    This article presents an analysis and interpretation of language policy documents from eight Swedish universities with regard to intertextuality, authorship and content analysis of the notions of language practices and English as a lingua franca (ELF). The analysis is then linked to Spolsky's framework of language policy, namely language…

  5. TxDOT Best Practices Model and Implementation Guide for Advance Planning Risk Analysis for

    E-print Network

    Texas at Austin, University of

    0-5478-P2: TxDOT Best Practices Model and Implementation Guide for Advance Planning Risk Analysis. Chapter 1, "What Is the APRA?", introduces the Advance Planning Risk Analysis (APRA); Section 1.2 covers Advance Planning Risk Analysis.

  6. Practical Blended Taint Analysis for JavaScript Shiyi Wei and Barbara G. Ryder

    E-print Network

    Ryder, Barbara G.

    Practical Blended Taint Analysis for JavaScript Shiyi Wei and Barbara G. Ryder Department of Computer Science Virginia Tech, USA {wei, ryder}@cs.vt.edu ABSTRACT JavaScript is widely used in Web analysis, an instantiation of our general-purpose analysis framework for JavaScript, to illustrate how

  7. A Practical Blended Analysis for Dynamic Features in JavaScript

    E-print Network

    Ryder, Barbara G.

    A Practical Blended Analysis for Dynamic Features in JavaScript Shiyi Wei and Barbara G. Ryder in Web applications; however, its dynamism renders static analysis ineffective. Our JavaScript Blended Analysis Framework is designed to handle JavaScript dynamic features. It performs a flexible combined

  8. A PROBABILISTIC APPROACH FOR ANALYSIS OF UNCERTAINTY IN THE EVALUATION OF WATERSHED MANAGEMENT PRACTICES

    EPA Science Inventory

    A computational framework is presented for analyzing the uncertainty in model estimates of water quality benefits of best management practices (BMPs) in two small (2) watersheds in Indiana. The analysis specifically recognizes the significance of the difference b...

  9. Text analysis devices, articles of manufacture, and text analysis methods

    DOEpatents

    Turner, Alan E; Hetzler, Elizabeth G; Nakamura, Grant C

    2013-05-28

    Text analysis devices, articles of manufacture, and text analysis methods are described according to some aspects. In one aspect, a text analysis device includes processing circuitry configured to analyze initial text to generate a measurement basis usable in analysis of subsequent text, wherein the measurement basis comprises a plurality of measurement features from the initial text, a plurality of dimension anchors from the initial text and a plurality of associations of the measurement features with the dimension anchors, and wherein the processing circuitry is configured to access a viewpoint indicative of a perspective of interest of a user with respect to the analysis of the subsequent text, and wherein the processing circuitry is configured to use the viewpoint to generate the measurement basis.

  10. Results from three years of the world's largest interlaboratory comparison for total mercury and methylmercury: Method performance and best practices

    NASA Astrophysics Data System (ADS)

    Creswell, J. E.; Engel, V.; Carter, A.; Davies, C.

    2013-12-01

    Brooks Rand Instruments has conducted the world's largest interlaboratory comparison study for total mercury and methylmercury in natural waters annually for three years. Each year, roughly 50 laboratories registered to participate and the majority of participants submitted results. Each laboratory was assigned a performance score based on the distance between its results and the consensus mean, as well as the precision of its replicate analyses. Participants were also asked to provide detailed data on their analytical methodology and equipment. We used the methodology data and performance scores to assess the performance of the various methods reported and equipment used. Although the majority of methods in use show no systematic trend toward poor analytical performance, there are noteworthy exceptions. We present results from each of the three years of the interlaboratory comparison exercise, as well as aggregated method performance data. We compare the methods used in this study to methods from other published interlaboratory comparison studies and present a list of recommended best practices. Our goals in creating a list of best practices are to maximize participation, ensure inclusiveness, minimize non-response bias, guarantee high data quality, and promote transparency of analysis. We seek to create a standardized methodology for interlaboratory comparison exercises for total mercury and methylmercury analysis in water, which will lead to more directly comparable results between studies. We show that in most cases, the coefficient of variation between labs measuring replicates of the same sample is greater than 20% after the removal of outlying data points (e.g. Figure 1). It is difficult to make comparisons between studies and ecosystems with such a high variability between labs. We highlight the need for regular participation in interlaboratory comparison studies and continuous analytical method improvement in order to ensure accurate data. Figure 1. Results from one sample analyzed in the 2013 Interlaboratory Comparison Study.
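
    The between-lab statistic quoted above can be sketched directly: the coefficient of variation (CV = standard deviation / mean) of results for one sample after discarding outliers. The data values are invented, and the two-standard-deviation outlier criterion is an assumption (the abstract does not specify the study's rule).

```python
# Minimal sketch of a between-lab CV computation with outlier removal.
# The total-mercury results (ng/L) below are made-up example values.
from statistics import mean, stdev

def cv_after_outlier_removal(values, z=2.0):
    m, s = mean(values), stdev(values)
    kept = [v for v in values if abs(v - m) <= z * s]
    return stdev(kept) / mean(kept)

# Hypothetical results for one sample reported by different labs:
lab_results = [4.8, 5.1, 5.3, 4.6, 5.0, 9.8, 4.9, 5.2]

print(f"between-lab CV: {cv_after_outlier_removal(lab_results):.1%}")
```

    Here the single high outlier inflates the raw CV to over 30%; removing it brings the between-lab CV down to a few percent, which is the kind of comparison the study's performance scoring rests on.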

  11. Infant-feeding practices among African American women: social-ecological analysis and implications for practice.

    PubMed

    Reeves, Elizabeth A; Woods-Giscombé, Cheryl L

    2015-05-01

    Despite extensive evidence supporting the health benefits of breastfeeding, significant disparities exist between rates of breastfeeding among African American women and women of other races. Increasing rates of breastfeeding among African American women can contribute to the improved health of the African American population by decreasing rates of infant mortality and disease and by enhancing cognitive development. Additionally, higher rates of breastfeeding among African American women could foster maternal-child bonding and could contribute to stronger families, healthier relationships, and emotionally healthier adults. The purpose of this article is twofold: (a) to use the social-ecological model to explore the personal, socioeconomic, psychosocial, and cultural factors that affect the infant feeding decision-making processes of African American women and (b) to discuss the implications of these findings for clinical practice and research to eliminate current disparities in rates of breastfeeding. PMID:24810518

  12. Interesting Times: Practice, Science, and Professional Associations in Behavior Analysis

    PubMed Central

    Critchfield, Thomas S

    2011-01-01

    Neither practitioners nor scientists appear to be fully satisfied with the world's largest behavior-analytic membership organization. Each community appears to believe that initiatives that serve the other will undermine the association's capacity to serve their own needs. Historical examples suggest that such discord is predicted when practitioners and scientists cohabit the same association. This is true because all professional associations exist to address guild interests, and practice and science are different professions with different guild interests. No association, therefore, can succeed in being all things to all people. The solution is to assure that practice and science communities are well served by separate professional associations. I comment briefly on how this outcome might be promoted. PMID:22532750

  13. Meaning and challenges in the practice of multiple therapeutic massage modalities: a combined methods study

    PubMed Central

    2011-01-01

    Background Therapeutic massage and bodywork (TMB) practitioners are predominantly trained in programs that are not uniformly standardized, and in variable combinations of therapies. To date no studies have explored this variability in training and how this affects clinical practice. Methods Combined methods, consisting of a quantitative, population-based survey and qualitative interviews with practitioners trained in multiple therapies, were used to explore the training and practice of TMB practitioners in Alberta, Canada. Results Of the 5242 distributed surveys, 791 were returned (15.1%). Practitioners were predominantly female (91.7%), worked in a range of environments, primarily private (44.4%) and home clinics (35.4%), and were not significantly different from other surveyed massage therapist populations. Seventy-seven distinct TMB therapies were identified. Most practitioners were trained in two or more therapies (94.4%), with a median of 8 and range of 40 therapies. Training programs varied widely in number and type of TMB components, training length, or both. Nineteen interviews were conducted. Participants described highly variable training backgrounds, resulting in practitioners learning unique combinations of therapy techniques. All practitioners reported providing individualized patient treatment based on a responsive feedback process throughout practice that they described as being critical to appropriately address the needs of patients. They also felt that research treatment protocols were different from clinical practice because researchers do not usually sufficiently acknowledge the individualized nature of TMB care provision. Conclusions The training received, the number of therapies trained in, and the practice descriptors of TMB practitioners are all highly variable. In addition, clinical experience and continuing education may further alter or enhance treatment techniques. 
Practitioners individualize each patient's treatment through a highly adaptive process. Therefore, treatment provision is likely unique to each practitioner. These results may be of interest to researchers considering similar practice issues in other professions. The use of a combined-methods design effectively captured this complexity of TMB practice. TMB research needs to consider research approaches that can capture or adapt to the individualized nature of practice. PMID:21929823

  14. Comment on Pearl: Practical implications of theoretical results for causal mediation analysis.

    PubMed

    Imai, Kosuke; Keele, Luke; Tingley, Dustin; Yamamoto, Teppei

    2014-12-01

    Mediation analysis has been extensively applied in psychological and other social science research. A number of methodologists have recently developed a formal theoretical framework for mediation analysis from a modern causal inference perspective. In Imai, Keele, and Tingley (2010), we have offered such an approach to causal mediation analysis that formalizes identification, estimation, and sensitivity analysis in a single framework. This approach has been used by a number of substantive researchers, and in subsequent work we have also further extended it to more complex settings and developed new research designs. In an insightful article, Pearl (2014) proposed an alternative approach that is based on a set of assumptions weaker than ours. In this comment, we demonstrate that the theoretical differences between our identification assumptions and his alternative conditions are likely to be of little practical relevance in the substantive research settings faced by most psychologists and other social scientists. We also show that our proposed estimation algorithms can be easily applied in the situations discussed in Pearl (2014). The methods discussed in this comment and many more are implemented via mediation, an open-source software (Tingley, Yamamoto, Hirose, Keele, & Imai, 2013). PMID:25486116

  15. Practical hyperdynamics method for systems with large changes in potential energy.

    PubMed

    Hirai, Hirotoshi

    2014-12-21

    A practical hyperdynamics method is proposed to accelerate systems with highly endothermic and exothermic reactions such as hydrocarbon pyrolysis and oxidation reactions. In this method, referred to as the "adaptive hyperdynamics (AHD) method," the bias potential parameters are adaptively updated according to the change in potential energy. The approach is intensively examined for JP-10 (exo-tetrahydrodicyclopentadiene) pyrolysis simulations using the ReaxFF reactive force field. Valid boost parameter ranges are clarified as a result. It is shown that AHD can be used to model pyrolysis at temperatures as low as 1000 K while achieving a boost factor of around 10^5. PMID:25527921
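
    The time acceleration quoted above follows from the standard hyperdynamics bookkeeping, which this sketch illustrates; it is not the AHD parameter-update rule itself, and the bias-potential samples are invented.

```python
# Sketch of the standard hyperdynamics boost factor: the trajectory
# average of exp(V_bias / kT). The bias samples below are hypothetical.
import math

K_B = 8.617333e-5  # Boltzmann constant, eV/K

def boost_factor(bias_energies_ev, temperature_k):
    """Average of exp(V_bias/kT) over sampled bias-potential values."""
    beta = 1.0 / (K_B * temperature_k)
    return sum(math.exp(beta * v) for v in bias_energies_ev) / len(bias_energies_ev)

# Hypothetical bias-potential samples (eV) along a trajectory at 1000 K:
samples = [0.95, 1.00, 1.05, 0.98, 1.02]
print(f"boost factor ~ {boost_factor(samples, 1000.0):.2e}")
```

    At 1000 K, a bias of about 1 eV yields a boost near 10^5, consistent with the magnitude the abstract reports.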

  16. Inductive analysis methods applied on questionnaires.

    PubMed

    Robertson, A; Sillén, R; Norén, J G

    1998-10-01

    The aim of this study was to evaluate subjective aspects of questionnaires dealing with dental trauma by applying different computerized inductive techniques from the field of artificial intelligence to questionnaires consisting of descriptive variables and of questions reflecting functional, personal, and social effects of patients' oral situation following dental trauma. As the methodology used is new to many readers in odontologic sciences, a detailed description of both the processes and the terminology is given. Utilizing a neural network as a first step in an analysis of data showed whether relations existed in the training set, but the network could not make the relations explicit, so other methods, inductive methods, had to be applied. Inductive methods have the potential to construct rules from a set of examples. The rules, combined with domain knowledge, can reveal relations between the variables. It can be concluded that the use of methods based on artificial intelligence can greatly improve explanatory value and make knowledge in databases explicit. PMID:9860094

  17. New Regularization Method for EXAFS Analysis

    SciTech Connect

    Reich, Tatiana Ye.; Reich, Tobias [Institute of Nuclear Chemistry, Johannes Gutenberg-Universitaet Mainz, 55099 Mainz (Germany); Korshunov, Maxim E.; Antonova, Tatiana V.; Ageev, Alexander L. [Institute of Mathematics and Mechanics, Ural Branch of Russian Academy of Sciences, ul. S. Kovalevskaja 16, 620219 Ekaterinburg GSP-384 (Russian Federation); Moll, Henry [Institute of Radiochemistry, Forschungszentrum Rossendorf, P.O. Box 510119, 01314 Dresden (Germany)

    2007-02-02

    As an alternative to the analysis of EXAFS spectra by conventional shell fitting, the Tikhonov regularization method has been proposed. An improved algorithm that utilizes a priori information about the sample has been developed and applied to the analysis of U L3-edge spectra of soddyite, (UO2)2SiO4·2H2O, and of U(VI) sorbed onto kaolinite. The partial radial distribution functions g1(UU), g2(USi), and g3(UO) of soddyite agree with crystallographic values and previous EXAFS results.
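
    As a generic illustration of the underlying idea (not the authors' improved algorithm), Tikhonov regularization stabilizes an ill-conditioned least-squares problem by penalizing the solution norm; the matrix and data below are toy values.

```python
# Generic Tikhonov sketch: for an ill-conditioned linear model A x = b,
# minimize ||Ax - b||^2 + lam*||x||^2 by solving the regularized normal
# equations (A^T A + lam I) x = A^T b for a 2-column A.

def tikhonov_2col(A, b, lam):
    g11 = sum(r[0] * r[0] for r in A) + lam
    g12 = sum(r[0] * r[1] for r in A)
    g22 = sum(r[1] * r[1] for r in A) + lam
    y1 = sum(r[0] * v for r, v in zip(A, b))
    y2 = sum(r[1] * v for r, v in zip(A, b))
    det = g11 * g22 - g12 * g12        # Cramer's rule for the 2x2 solve
    return ((g22 * y1 - g12 * y2) / det, (g11 * y2 - g12 * y1) / det)

# Nearly collinear columns make the unregularized solve unstable:
A = [[1.0, 1.001], [1.0, 0.999], [1.0, 1.000]]
b = [2.0, 2.0, 2.0]

x = tikhonov_2col(A, b, lam=0.1)
print("regularized solution:", x)
```

    The penalty term pulls the two nearly interchangeable coefficients toward a single well-behaved solution instead of letting them diverge in opposite directions.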

  18. Advanced Analysis Methods in High Energy Physics

    SciTech Connect

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  19. Reform-based science teaching: A mixed-methods approach to explaining variation in secondary science teacher practice

    NASA Astrophysics Data System (ADS)

    Jetty, Lauren E.

    The purpose of this two-phase, sequential explanatory mixed-methods study was to understand and explain the variation seen in secondary science teachers' enactment of reform-based instructional practices. Utilizing teacher socialization theory, this mixed-methods analysis was conducted to determine the relative influence of secondary science teachers' characteristics, backgrounds and experiences across their teacher development to explain the range of teaching practices exhibited by graduates from three reform-oriented teacher preparation programs. Data for this study were obtained from the Investigating the Meaningfulness of Preservice Programs Across the Continuum of Teaching (IMPPACT) Project, a multi-university, longitudinal study funded by NSF. In the first quantitative phase of the study, data for the sample (N=120) were collected from three surveys from the IMPPACT Project database. Hierarchical multiple regression analysis was used to examine the separate as well as the combined influence of factors such as teachers' personal and professional background characteristics, beliefs about reform-based science teaching, feelings of preparedness to teach science, school context, school culture and climate of professional learning, and influences of the policy environment on the teachers' use of reform-based instructional practices. Findings indicate that three blocks of variables (professional background, beliefs/efficacy, and local school context) contributed significantly to explaining nearly 38% of the variation in secondary science teachers' use of reform-based instructional practices. The five variables that significantly contributed to explaining variation in teachers' use of reform-based instructional practices in the full model were university of teacher preparation, sense of preparation for teaching science, the quality of professional development, science-content-focused professional development, and the perceived level of professional autonomy. 
Using the results from phase one, the second qualitative phase selected six case study teachers based on their levels of reform-based teaching practices to highlight teachers across the range of practices from low, average, to high levels of implementation. Using multiple interview sources, phase two helped to further explain the variation in levels of reform-based practices. Themes related to teachers' backgrounds, local contexts, and state policy environments were developed as they related to teachers' socialization experiences across these contexts. The results of the qualitative analysis identified the following factors differentiating teachers who enacted reform-based instructional practices from those who did not: 1) extensive science research experiences prior to their preservice teacher preparation; 2) the structure and quality of their field placements; 3) developing and valuing a research-based understanding of teaching and learning as a result of their preservice teacher preparation experiences; 4) the professional culture of their school context where there was support for a high degree of professional autonomy and receiving support from "educational companions" with a specific focus on teacher pedagogy to support student learning; and 5) a greater sense of agency to navigate their districts' interpretation and implementation of state polices. Implications for key stakeholders as well as directions for future research are discussed.
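
    The mechanics of the blockwise (hierarchical) regression described above can be sketched with synthetic data; the predictors stand in loosely for the study's variable blocks and are not the IMPPACT variables.

```python
# Illustrative hierarchical-regression mechanics (synthetic data): fit OLS
# with a first block of predictors, then add a second block and report the
# increment in R^2 attributable to it.

def ols_r2(X, y):
    """R^2 of y on the columns of X (with intercept), via normal equations."""
    rows = [[1.0] + list(r) for r in X]
    k = len(rows[0])
    # Augmented normal-equation system [X'X | X'y].
    M = [[sum(r[i] * r[j] for r in rows) for j in range(k)]
         + [sum(r[i] * yi for r, yi in zip(rows, y))] for i in range(k)]
    for c in range(k):                     # Gauss-Jordan elimination
        p = max(range(c, k), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(k):
            if r != c:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * b for a, b in zip(M[r], M[c])]
    beta = [M[i][k] / M[i][i] for i in range(k)]
    yhat = [sum(b * v for b, v in zip(beta, r)) for r in rows]
    ybar = sum(y) / len(y)
    ss_res = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))
    ss_tot = sum((yi - ybar) ** 2 for yi in y)
    return 1.0 - ss_res / ss_tot

# Synthetic sample: x1 ~ "professional background", x2 ~ "school context".
data = [(1, 2, 6.1), (2, 1, 7.4), (3, 4, 11.2), (4, 3, 12.4),
        (5, 6, 16.1), (6, 5, 17.4), (7, 8, 21.2), (8, 7, 22.4)]
x1 = [(r[0],) for r in data]
x12 = [(r[0], r[1]) for r in data]
y = [r[2] for r in data]

r2_block1 = ols_r2(x1, y)
r2_block2 = ols_r2(x12, y)
print(f"R^2 block 1: {r2_block1:.3f}, after block 2: {r2_block2:.3f}, "
      f"increment: {r2_block2 - r2_block1:.3f}")
```

    The increment in R^2 when a block is entered is the quantity the study interprets as that block's added contribution to explaining variation in practice.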

  20. Spectroscopic Chemical Analysis Methods and Apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F.; Reid, Ray D.

    2012-01-01

    This invention relates to non-contact spectroscopic methods and apparatus for performing chemical analysis and the ideal wavelengths and sources needed for this analysis. It employs deep ultraviolet (200- to 300-nm spectral range) electron-beam-pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light-emitting devices, and hollow cathode metal ion lasers. Three achieved goals for this innovation are to reduce the size (under 20 L), the weight [under 100 lb (45 kg)], and the power consumption (under 100 W). This method can be used in a microscope or macroscope to provide measurement of Raman and/or native fluorescence emission spectra either by point-by-point measurement, or by global imaging of emissions within specific ultraviolet spectral bands. In other embodiments, the method can be used in analytical instruments such as capillary electrophoresis, capillary electro-chromatography, high-performance liquid chromatography, flow cytometry, and related instruments for detection and identification of unknown analytes using a combination of native fluorescence and/or Raman spectroscopic methods. This design provides an electron-beam-pumped semiconductor radiation-producing method, or source, that can emit at a wavelength (or wavelengths) below 300 nm, e.g. in the deep ultraviolet between about 200 and 300 nm, and more preferably less than 260 nm. In some variations, the method is to produce incoherent radiation, while in other implementations it produces laser radiation. In some variations, this object is achieved by using an AlGaN emission medium, while in other implementations a diamond emission medium may be used. This instrument irradiates a sample with deep UV radiation, and then uses an improved filter for separating wavelengths to be detected. This provides a multi-stage analysis of the sample. 
To avoid the difficulties related to producing deep UV semiconductor sources, a pumping approach has been developed that uses ballistic electron beam injection directly into the active region of a wide bandgap semiconductor material.

  1. Forum discussion on probabilistic structural analysis methods

    SciTech Connect

    Rodriguez, E.A.; Girrens, S.P.

    2000-10-01

    The use of Probabilistic Structural Analysis Methods (PSAM) has received much attention over the past several decades due in part to enhanced reliability theories, computational capabilities, and efficient algorithms. The need for this development was already present and waiting at the doorstep. Automotive design and manufacturing have been greatly enhanced because of PSAM and reliability methods, including reliability-based optimization. This demand was also present in the US Department of Energy (DOE) weapons laboratories in support of the overarching national security responsibility of maintaining the nation's nuclear stockpile in a safe and reliable state.

  2. Applying decision analysis to pharmacy management and practice decisions.

    PubMed

    Barr, J T; Schumacher, G E

    1994-01-01

    Decision analysis, a structured approach to decision making, is presented and applied to a typical management situation. Decision analysis is an explicit, quantitative, and prescriptive approach to choosing among alternative outcomes. It engenders in the decision maker an analytical viewpoint, the need to structure the various courses of action and the resultant consequences of the actions, to assess the degree of uncertainty of the actions occurring, and to value the preferences for the alternative outcomes. Literature examples of using pharmacy-related decision analysis are provided, including its use in formulary additions, cost-effectiveness analysis, drug therapy evaluation, therapeutic drug monitoring, and health policy issues. PMID:10130685
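
    The prescriptive core described above, choosing among alternatives by probability-weighted value of their outcomes, can be shown in a few lines; the formulary-decision numbers are invented for illustration.

```python
# Minimal decision-analysis sketch: enumerate actions, weight each
# outcome's value by its probability, choose the best expected value.
# All probabilities and payoffs are hypothetical.

def expected_value(outcomes):
    """outcomes: list of (probability, value) pairs for one action."""
    return sum(p * v for p, v in outcomes)

actions = {
    # action: [(probability, net benefit in $ per patient), ...]
    "add drug to formulary": [(0.7, 120.0), (0.3, -60.0)],
    "keep current therapy":  [(1.0, 40.0)],
}

best = max(actions, key=lambda a: expected_value(actions[a]))
for name, outcomes in actions.items():
    print(f"{name}: EV = {expected_value(outcomes):.1f}")
print("choose:", best)
```

    Structuring a pharmacy decision this way forces the decision maker to make uncertainties (the probabilities) and preferences (the values) explicit, which is the analytical viewpoint the abstract describes.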

  3. Digital dream analysis: a revised method.

    PubMed

    Bulkeley, Kelly

    2014-10-01

    This article demonstrates the use of a digital word search method designed to provide greater accuracy, objectivity, and speed in the study of dreams. A revised template of 40 word search categories, built into the website of the Sleep and Dream Database (SDDb), is applied to four "classic" sets of dreams: The male and female "Norm" dreams of Hall and Van de Castle (1966), the "Engine Man" dreams discussed by Hobson (1988), and the "Barb Sanders Baseline 250" dreams examined by Domhoff (2003). A word search analysis of these original dream reports shows that a digital approach can accurately identify many of the same distinctive patterns of content found by previous investigators using much more laborious and time-consuming methods. The results of this study emphasize the compatibility of word search technologies with traditional approaches to dream content analysis. PMID:25286125
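
    The word search approach can be sketched as counting how many reports match each category's vocabulary; the three categories and reports below are tiny invented stand-ins for the SDDb's 40-category template.

```python
# Sketch of a digital word search over dream reports: count the reports
# containing at least one word from each category. Categories and reports
# are invented placeholders, not the SDDb template.
import re

CATEGORIES = {
    "fear":   {"afraid", "scared", "terrified"},
    "water":  {"ocean", "river", "swimming"},
    "flying": {"flying", "floating", "soaring"},
}

def category_counts(reports):
    counts = {name: 0 for name in CATEGORIES}
    for report in reports:
        words = set(re.findall(r"[a-z']+", report.lower()))
        for name, vocab in CATEGORIES.items():
            if words & vocab:
                counts[name] += 1
    return counts

reports = [
    "I was flying over a river, not scared at all.",
    "I felt terrified and started swimming.",
    "Walking through my old school.",
]
print(category_counts(reports))
```

    Because the matching is purely lexical, such counts are fast and reproducible, which is the accuracy-and-speed advantage the article claims over hand coding.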

  4. APPLYING NEW METHODS TO RESEARCH REACTOR ANALYSIS

    Microsoft Academic Search

    D. J. CHENG; L. HANSON; A. XU; J. CAREW

    2004-01-01

    Detailed reactor physics and safety analyses are being performed for the 20 MW DO-moderated research reactor at the National Institute of Standards and Technology (NIST). The analyses employ state-of-the-art calculational methods and will contribute to an update to the Final Safety Analysis Report (FSAR). Three-dimensional MCNP Monte Carlo neutron and photon transport calculations are performed to determine power and reactivity

  5. A Simplified Mesh Deformation Method Using Commercial Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    Hsu, Su-Yuen; Chang, Chau-Lyan; Samareh, Jamshid

    2004-01-01

    Mesh deformation in response to redefined or moving aerodynamic surface geometries is a frequently encountered task in many applications. Most existing methods are either mathematically too complex or computationally too expensive for use in practical design and optimization. We propose a simplified mesh deformation method based on linear elastic finite element analyses that can be easily implemented by using commercially available structural analysis software. Using a prescribed displacement at the mesh boundaries, a simple structural analysis is constructed based on a spatially varying Young's modulus to move the entire mesh in accordance with the surface geometry redefinitions. A variety of surface movements, such as translation, rotation, or incremental surface reshaping that often takes place in an optimization procedure, may be handled by the present method. We describe the numerical formulation and implementation using the NASTRAN software in this paper. The use of commercial software bypasses tedious reimplementation and takes advantage of the computational efficiency offered by the vendor. A two-dimensional airfoil mesh and a three-dimensional aircraft mesh were used as test cases to demonstrate the effectiveness of the proposed method. Euler and Navier-Stokes calculations were performed for the deformed two-dimensional meshes.
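
    A one-dimensional analogue conveys the idea: mesh edges act as springs whose stiffness (the spatially varying Young's modulus) controls how much of the prescribed boundary displacement each region absorbs. This is an illustration of the concept only, not the paper's NASTRAN-based implementation.

```python
# 1D mesh-deformation analogue: fix the left node, prescribe a displacement
# at the right node, and let springs in series distribute the motion in
# proportion to accumulated compliance. Stiffness values are illustrative.

def deform_1d(stiffness, boundary_disp):
    """stiffness: per-edge spring constants; returns per-node displacements."""
    compliance = [1.0 / k for k in stiffness]
    total = sum(compliance)
    disp, acc = [0.0], 0.0
    for c in compliance:
        acc += c
        disp.append(boundary_disp * acc / total)
    return disp

# Stiffer edges near the moving boundary absorb less of the motion:
u = deform_1d([1.0, 2.0, 4.0, 8.0], boundary_disp=1.0)
print(u)
```

    Assigning a higher modulus to regions of fine cells shifts the deformation into coarser regions, which is the role the spatially varying Young's modulus plays in the paper's method.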

  6. Stirling Analysis Comparison of Commercial vs. High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2007-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's Compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  7. Stirling Analysis Comparison of Commercial Versus High-Order Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.; Demko, Rikako

    2005-01-01

    Recently, three-dimensional Stirling engine simulations have been accomplished utilizing commercial Computational Fluid Dynamics software. The validations reported can be somewhat inconclusive due to the lack of precise time accurate experimental results from engines, export control/proprietary concerns, and the lack of variation in the methods utilized. The last issue may be addressed by solving the same flow problem with alternate methods. In this work, a comprehensive examination of the methods utilized in the commercial codes is compared with more recently developed high-order methods. Specifically, Lele's compact scheme and Dyson's Ultra Hi-Fi method will be compared with the SIMPLE and PISO methods currently employed in CFD-ACE, FLUENT, CFX, and STAR-CD (all commercial codes which can in theory solve a three-dimensional Stirling model although sliding interfaces and their moving grids limit the effective time accuracy). We will initially look at one-dimensional flows since the current standard practice is to design and optimize Stirling engines with empirically corrected friction and heat transfer coefficients in an overall one-dimensional model. This comparison provides an idea of the range in which commercial CFD software for modeling Stirling engines may be expected to provide accurate results. In addition, this work provides a framework for improving current one-dimensional analysis codes.

  8. Finite Volume Methods: Foundation and Analysis

    NASA Technical Reports Server (NTRS)

    Barth, Timothy; Ohlberger, Mario

    2003-01-01

    Finite volume methods are a class of discretization schemes that have proven highly successful in approximating the solution of a wide variety of conservation law systems. They are extensively used in fluid mechanics, porous media flow, meteorology, electromagnetics, models of biological processes, semi-conductor device simulation and many other engineering areas governed by conservative systems that can be written in integral control volume form. This article reviews elements of the foundation and analysis of modern finite volume methods. The primary advantages of these methods are numerical robustness through the enforcement of discrete maximum (minimum) principles, applicability on very general unstructured meshes, and the intrinsic local conservation properties of the resulting schemes. Throughout this article, specific attention is given to scalar nonlinear hyperbolic conservation laws and the development of high order accurate schemes for discretizing them. A key tool in the design and analysis of finite volume schemes suitable for non-oscillatory discontinuity capturing is discrete maximum principle analysis. A number of building blocks used in the development of numerical schemes possessing local discrete maximum principles are reviewed in one and several space dimensions, e.g. monotone fluxes, E-fluxes, TVD discretization, non-oscillatory reconstruction, slope limiters, positive coefficient schemes, etc. When available, theoretical results concerning a priori and a posteriori error estimates are given. Further advanced topics are then considered such as high order time integration, discretization of diffusion terms and the extension to systems of nonlinear conservation laws.
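
    A minimal instance of the schemes discussed, first-order upwind for scalar advection on a periodic mesh, exhibits the two properties emphasized above: local conservation and a non-increasing total variation (the monotone-flux/TVD behavior).

```python
# First-order upwind finite-volume step for u_t + a u_x = 0 on a periodic
# mesh. Upwind is a monotone flux, so total variation does not increase.

def upwind_step(u, a, dt_over_dx):
    n = len(u)
    # Flux through the left face of cell i (a > 0 assumed: take upwind value).
    flux = [a * u[(i - 1) % n] for i in range(n)]
    return [u[i] - dt_over_dx * (flux[(i + 1) % n] - flux[i]) for i in range(n)]

def total_variation(u):
    n = len(u)
    return sum(abs(u[(i + 1) % n] - u[i]) for i in range(n))

u = [0.0] * 10
u[3:6] = [1.0, 1.0, 1.0]                    # square pulse
v = upwind_step(u, a=1.0, dt_over_dx=0.5)   # CFL number 0.5

print("TV before:", total_variation(u), "after:", total_variation(v))
print("mass before:", sum(u), "after:", sum(v))
```

    Because each face flux is added to one cell and subtracted from its neighbor, the total "mass" is conserved exactly, which is the intrinsic local conservation property the review highlights.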

  9. Probabilistic methods in fire-risk analysis

    SciTech Connect

    Brandyberry, M.D.

    1989-01-01

    The first part of this work outlines a method for assessing the frequency of ignition of a consumer product in a building and shows how the method would be used in an example scenario utilizing upholstered furniture as the product and radiant auxiliary heating devices (electric heaters, wood stoves) as the ignition source. Deterministic thermal models of the heat-transport processes are coupled with parameter uncertainty analysis of the models and with a probabilistic analysis of the events involved in a typical scenario. This leads to a distribution for the frequency of ignition for the product. In the second part, fire-risk analysis as currently used in nuclear plants is outlined along with a discussion of the relevant uncertainties. The use of the computer code COMPBRN in the fire-growth analysis is discussed, along with the use of response-surface methodology to quantify uncertainties in the code's use. Generalized response surfaces are developed for temperature versus time for a cable tray, as well as a surface for the hot gas layer temperature and depth for a room of arbitrary geometry within a typical nuclear power plant compartment. These surfaces are then used to simulate the cable tray damage time in a compartment fire experiment.
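
    The coupling of a deterministic model with parameter uncertainty described in the first part can be sketched as a Monte Carlo loop; the toy heat-transport model, the ignition threshold, and the input distributions below are invented placeholders, not the study's models.

```python
# Monte Carlo sketch of ignition-frequency estimation: propagate uncertain
# inputs (heater power, separation distance) through a toy deterministic
# surface-temperature model and count threshold exceedances.
import random

random.seed(0)

def surface_temp_c(power_w, distance_m):
    """Toy deterministic heat-transport model: T falls off with distance."""
    return 20.0 + 0.05 * power_w / (distance_m ** 2)

def ignition_probability(trials=20_000, ignition_temp_c=300.0):
    hits = 0
    for _ in range(trials):
        power = random.gauss(1500.0, 200.0)   # heater power, W
        distance = random.uniform(0.3, 1.5)   # separation, m
        if surface_temp_c(power, distance) >= ignition_temp_c:
            hits += 1
    return hits / trials

print(f"P(ignition per exposure) ~ {ignition_probability():.3f}")
```

    Repeating such loops over scenario event probabilities is what produces the distribution for the frequency of ignition that the first part of the work derives.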

  10. Spectroscopic chemical analysis methods and apparatus

    NASA Technical Reports Server (NTRS)

    Hug, William F. (Inventor); Reid, Ray D. (Inventor); Bhartia, Rohit (Inventor)

    2013-01-01

    Spectroscopic chemical analysis methods and apparatus are disclosed which employ deep ultraviolet (e.g. in the 200 nm to 300 nm spectral range) electron beam pumped wide bandgap semiconductor lasers, incoherent wide bandgap semiconductor light emitting devices, and hollow cathode metal ion lasers to perform non-contact, non-invasive detection of unknown chemical analytes. These deep ultraviolet sources enable dramatic size, weight and power consumption reductions of chemical analysis instruments. Chemical analysis instruments employed in some embodiments include capillary and gel plane electrophoresis, capillary electrochromatography, high performance liquid chromatography, flow cytometry, flow cells for liquids and aerosols, and surface detection instruments. In some embodiments, Raman spectroscopic detection methods and apparatus use ultra-narrow-band angle tuning filters, acousto-optic tuning filters, and temperature tuned filters to enable ultra-miniature analyzers for chemical identification. In some embodiments Raman analysis is conducted along with photoluminescence spectroscopy (i.e. fluorescence and/or phosphorescence spectroscopy) to provide high levels of sensitivity and specificity in the same instrument.

  11. EMQN Best Practice Guidelines for molecular and haematology methods for carrier identification and prenatal diagnosis of the haemoglobinopathies.

    PubMed

    Traeger-Synodinos, Joanne; Harteveld, Cornelis L; Old, John M; Petrou, Mary; Galanello, Renzo; Giordano, Piero; Angastioniotis, Michael; De la Salle, Barbara; Henderson, Shirley; May, Alison

    2015-04-01

    Haemoglobinopathies constitute the commonest recessive monogenic disorders worldwide, and the treatment of affected individuals presents a substantial global disease burden. Carrier identification and prenatal diagnosis represent valuable procedures that identify couples at risk for having affected children, so that they can be offered options to have healthy offspring. Molecular diagnosis facilitates prenatal diagnosis and definitive diagnosis of carriers and patients (especially 'atypical' cases who often have complex genotype interactions). However, the haemoglobin disorders are unique among all genetic diseases in that identification of carriers is preferably done by haematological (biochemical) tests rather than DNA analysis. These Best Practice guidelines offer an overview of recommended strategies and methods for carrier identification and prenatal diagnosis of haemoglobinopathies, and emphasize the importance of appropriately applying and interpreting haematological tests in supporting the optimum application and evaluation of globin gene DNA analysis. PMID:25052315

  12. Optical methods for the analysis of dermatopharmacokinetics

    NASA Astrophysics Data System (ADS)

    Lademann, Juergen; Weigmann, Hans-Juergen; von Pelchrzim, R.; Sterry, Wolfram

    2002-07-01

    The method of tape stripping in combination with spectroscopic measurements is a simple and noninvasive method for the analysis of the dermatopharmacokinetics of cosmetic products and topically applied drugs. The absorbance at 430 nm was used to characterize the amount of corneocytes on the tape strips. It was compared to the increase in weight of the tapes after removing them from the skin surface. The penetration profiles of two UV filter substances used in sunscreens were determined. The combined method of tape stripping and spectroscopic measurements can also be used for the investigation of the dermatopharmacokinetics of topically applied drugs passing through the skin. Differences in the penetration profiles of the steroid compound clobetasol, applied to the skin at the same concentration in different formulations, are presented.

  13. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal to electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, they are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HI-Fi technique, is presented in detail.

  14. Backchain Iteration: Towards a Practical Inference Method That Is Simple Enough to Be Proved Terminating, Sound, and Complete

    Microsoft Academic Search

    Adrian Walker

    1993-01-01

    We focus on methods for interpreting stratified datalog programs with negation, and we describe progress towards a practical, yet simple method that treats such programs as executable specifications of deductive database applications.

  15. Adaptive Hierarchical Methods for Landscape Representation and Analysis

    E-print Network

    Sminchisescu, Cristian

    … information systems, topographic analysis, climatic, hydrological or geomorphological simulations … algorithms for terrain analysis. We will present two out of a large variety of methods, both using regular …

  16. Methods for Model Selection & Checking Sensitivity Analysis for Bayes Factors

    E-print Network

    Masci, Frank

    Slide titles: Methods for Model Selection & Checking; Sensitivity Analysis for Bayes Factors; A Radical Suggestion.

  17. A practical method to avoid zero-point leak in molecular dynamics calculations: Application to the water dimer

    NASA Astrophysics Data System (ADS)

    Czakó, Gábor; Kaledin, Alexey L.; Bowman, Joel M.

    2010-04-01

    We report the implementation of a previously suggested method to constrain a molecular system to have mode-specific vibrational energy greater than or equal to the zero-point energy in quasiclassical trajectory calculations [J. M. Bowman et al., J. Chem. Phys. 91, 2859 (1989); W. H. Miller et al., J. Chem. Phys. 91, 2863 (1989)]. The implementation is made practical by using a technique described recently [G. Czakó and J. M. Bowman, J. Chem. Phys. 131, 244302 (2009)], where a normal-mode analysis is performed during the course of a trajectory and which gives only real-valued frequencies. The method is applied to the water dimer, where its effectiveness is shown by computing mode energies as a function of integration time. Radial distribution functions are also calculated using constrained quasiclassical and standard classical molecular dynamics at low temperature and at 300 K and compared to rigorous quantum path integral calculations.

  18. Influence of Analysis Methods on Interpretation of Hazard Maps

    PubMed Central

    Koehler, Kirsten A.

    2013-01-01

    Exposure or hazard mapping is becoming increasingly popular among industrial hygienists. Direct-reading instruments used for hazard-mapping data collection are steadily increasing in reliability and portability while decreasing in cost. Exposure measurements made with these instruments generally require no laboratory analysis, although hazard mapping can be a time-consuming process. To inform decision making by industrial hygienists and management, it is crucial that the maps generated from mapping data are as accurate and representative as possible. Currently, it is unclear how many sampling locations are necessary to produce a representative hazard map. As such, researchers typically collect as many points as can be sampled in several hours, and interpolation methods are used to produce higher resolution maps. We have reanalyzed hazard-mapping data sets from three industrial settings to determine which interpolation methods yield the most accurate results. The goal is to provide practicing industrial hygienists with some practical guidelines to generate accurate hazard maps with ‘off-the-shelf’ mapping software. Visually verifying the fit of the variogram model is crucial for accurate interpolation. Exponential and spherical variogram models performed better than Gaussian models. It was also necessary to diverge from some of the default interpolation parameters, such as the number of bins used for the experimental variogram and whether or not to allow for a nugget effect, to achieve reasonable accuracy of the interpolation for some data sets. PMID:23258453
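
    The variogram models compared in this record can be written down directly. The sketch below (nugget, sill, and range values are illustrative, not from the study) evaluates the standard exponential, spherical, and Gaussian models at a short lag, showing the near-origin behavior that drives the reported differences: the Gaussian model is nearly flat at small lags, which tends to over-smooth the interpolated map.

```python
import math

# Classical variogram models gamma(h) with nugget c0, sill c, and range a.
# Near h = 0 the exponential and spherical models rise roughly linearly,
# while the Gaussian model is flat (parabolic) at the origin.

def exponential(h, c0=0.0, c=1.0, a=10.0):
    return c0 + c * (1.0 - math.exp(-3.0 * h / a))   # "practical range" form

def spherical(h, c0=0.0, c=1.0, a=10.0):
    if h >= a:
        return c0 + c
    r = h / a
    return c0 + c * (1.5 * r - 0.5 * r ** 3)

def gaussian(h, c0=0.0, c=1.0, a=10.0):
    return c0 + c * (1.0 - math.exp(-3.0 * (h / a) ** 2))

# At a short lag the Gaussian variogram is far smaller than the other two:
h = 0.5
print(round(exponential(h), 4), round(spherical(h), 4), round(gaussian(h), 4))
# → 0.1393 0.0749 0.0075
```

    In kriging, these model values weight nearby observations, so the near-zero Gaussian variogram at short lags assigns almost identical weights to close neighbors, consistent with the abstract's finding that exponential and spherical models interpolated more accurately.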

  19. Safeguards systems analysis research and development and the practice of safeguards at DOE facilities

    SciTech Connect

    Zack, N.R.; Thomas, K.E.; Markin, J.T.; Tape, J.W.

    1991-01-01

    Los Alamos Safeguards Systems Group personnel interact with Department of Energy (DOE) nuclear materials processing facilities in a number of ways. Among them are training courses, formal technical assistance such as developing information management or data analysis software, and informal ad hoc assistance, especially in reviewing and commenting on existing facility safeguards technology and procedures. These activities are supported by the DOE Office of Safeguards and Security, DOE Operations Offices, and contractor organizations. Because of the relationships with the Operations Office and facility personnel, the Safeguards Systems Group research and development (R&D) staff have developed an understanding of the needs of the entire complex. Improved safeguards are needed in areas such as materials control activities, accountability procedures and techniques, systems analysis and evaluation methods, and material handling procedures. This paper surveys the generic needs for efficient and cost-effective enhancements in safeguards technologies and procedures at DOE facilities, identifies areas where existing safeguards R&D products are being applied or could be applied, and sets a direction for future systems analysis R&D to address practical facility safeguards needs.

  20. Are larger dental practices more efficient? An analysis of dental services production.

    PubMed

    Lipscomb, J; Douglass, C W

    1986-12-01

    Whether cost-efficiency in dental services production increases with firm size is investigated through application of an activity analysis production function methodology to data from a national survey of dental practices. Under this approach, service delivery in a dental practice is modeled as a linear programming problem that acknowledges distinct input-output relationships for each service. These service-specific relationships are then combined to yield projections of overall dental practice productivity, subject to technical and organizational constraints. The activity analysis reported here represents arguably the most detailed evaluation yet of the relationship between dental practice size and cost-efficiency, controlling for such confounding factors as fee and service-mix differences across firms. We conclude that cost-efficiency does increase with practice size, over the range from solo to four-dentist practices. Largely because of data limitations, we were unable to test satisfactorily for scale economies in practices with five or more dentists. Within their limits, our findings are generally consistent with results from the neoclassical production function literature. From the standpoint of consumer welfare, the critical question raised (but not resolved) here is whether these apparent production efficiencies of group practice are ultimately translated by the market into lower fees, shorter queues, or other nonprice benefits. PMID:3102404

  1. Are larger dental practices more efficient? An analysis of dental services production.

    PubMed Central

    Lipscomb, J; Douglass, C W

    1986-01-01

    Whether cost-efficiency in dental services production increases with firm size is investigated through application of an activity analysis production function methodology to data from a national survey of dental practices. Under this approach, service delivery in a dental practice is modeled as a linear programming problem that acknowledges distinct input-output relationships for each service. These service-specific relationships are then combined to yield projections of overall dental practice productivity, subject to technical and organizational constraints. The activity analysis reported here represents arguably the most detailed evaluation yet of the relationship between dental practice size and cost-efficiency, controlling for such confounding factors as fee and service-mix differences across firms. We conclude that cost-efficiency does increase with practice size, over the range from solo to four-dentist practices. Largely because of data limitations, we were unable to test satisfactorily for scale economies in practices with five or more dentists. Within their limits, our findings are generally consistent with results from the neoclassical production function literature. From the standpoint of consumer welfare, the critical question raised (but not resolved) here is whether these apparent production efficiencies of group practice are ultimately translated by the market into lower fees, shorter queues, or other nonprice benefits. PMID:3102404
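
    The activity-analysis approach in the two records above treats service delivery as a linear program. The toy sketch below is not the study's model; the two services, capacity rows, and revenue figures are invented to show the structure: service-specific input-output coefficients combined under shared capacity constraints, solved here by enumerating vertices of a two-variable feasible region.

```python
from itertools import combinations

# Toy activity-analysis LP (all numbers illustrative, not from the study):
# choose weekly counts of two services, cleanings x1 and crowns x2, to
# maximize revenue subject to chair-time and dentist-time capacity.
# Each row is (a1, a2, b), meaning a1*x1 + a2*x2 <= b.
constraints = [
    (0.5, 1.5, 40.0),   # chair-hours per service, weekly chair capacity
    (0.25, 1.0, 24.0),  # dentist-hours per service, weekly dentist capacity
    (-1.0, 0.0, 0.0),   # x1 >= 0
    (0.0, -1.0, 0.0),   # x2 >= 0
]
revenue = (30.0, 100.0)  # revenue per cleaning, per crown

def solve_2d_lp(constraints, obj):
    # The optimum of a bounded 2-variable LP lies at a vertex of the feasible
    # polygon, so intersect constraint pairs and keep the best feasible point.
    best = None
    for (a1, b1, c1), (a2, b2, c2) in combinations(constraints, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue  # parallel constraint lines
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in constraints):
            val = obj[0] * x + obj[1] * y
            if best is None or val > best[0]:
                best = (val, x, y)
    return best

val, x1, x2 = solve_2d_lp(constraints, revenue)
print(x1, x2, val)  # → 32.0 16.0 2560.0
```

    A real activity-analysis model would carry one column per service and rows for each chair, staff, and equipment constraint, and would be handed to a general LP solver rather than vertex enumeration; comparing optimal values across practice sizes is what yields the cost-efficiency comparison.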

  2. Alignment of patient and primary care practice member perspectives of chronic illness care: a cross-sectional analysis

    PubMed Central

    2014-01-01

    Background Little is known as to whether primary care teams’ perceptions of how well they have implemented the Chronic Care Model (CCM) correspond with their patients’ own experience of chronic illness care. We examined the extent to which practice members’ perceptions of how well they organized to deliver care consistent with the CCM were associated with their patients’ perceptions of the chronic illness care they have received. Methods Analysis of baseline measures from a cluster randomized controlled trial testing a practice facilitation intervention to implement the CCM in small, community-based primary care practices. All practice “members” (i.e., physician providers, non-physician providers, and staff) completed the Assessment of Chronic Illness Care (ACIC) survey, and adult patients with 1 or more chronic illnesses completed the Patient Assessment of Chronic Illness Care (PACIC) questionnaire. Results Two sets of hierarchical linear regression models, accounting for nesting of practice members (N = 283) and patients (N = 1,769) within 39 practices, assessed the association between practice member perspectives of CCM implementation (ACIC scores) and patients’ perspectives of CCM (PACIC). The ACIC summary score was not significantly associated with the PACIC summary score or most PACIC subscale scores, but four of the ACIC subscales [Self-management Support (p …)] […] practice member perspectives when evaluating quality of chronic illness care. Trial registration NCT00482768 PMID:24678983

  3. LISA Data Analysis using MCMC methods

    E-print Network

    Neil J. Cornish; Jeff Crowder

    2005-06-10

    The Laser Interferometer Space Antenna (LISA) is expected to simultaneously detect many thousands of low frequency gravitational wave signals. This presents a data analysis challenge that is very different to the one encountered in ground based gravitational wave astronomy. LISA data analysis requires the identification of individual signals from a data stream containing an unknown number of overlapping signals. Because of the signal overlaps, a global fit to all the signals has to be performed in order to avoid biasing the solution. However, performing such a global fit requires the exploration of an enormous parameter space with a dimension upwards of 50,000. Markov Chain Monte Carlo (MCMC) methods offer a very promising solution to the LISA data analysis problem. MCMC algorithms are able to efficiently explore large parameter spaces, simultaneously providing parameter estimates, error analyses and even model selection. Here we present the first application of MCMC methods to simulated LISA data and demonstrate the great potential of the MCMC approach. Our implementation uses a generalized F-statistic to evaluate the likelihoods, and simulated annealing to speed convergence of the Markov chains. As a final step we super-cool the chains to extract maximum likelihood estimates, and estimates of the Bayes factors for competing models. We find that the MCMC approach is able to correctly identify the number of signals present, extract the source parameters, and return error estimates consistent with Fisher information matrix predictions.
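
    The combination of MCMC exploration with simulated annealing described above can be sketched generically. The toy below is not the authors' F-statistic implementation: it runs a random-walk Metropolis chain on a single made-up "source parameter" with a Gaussian toy likelihood, cooling the temperature from 10 to 1 so the early chain explores broadly before settling into sampling the posterior.

```python
import math, random

# Generic Metropolis-Hastings sketch of the ideas in the abstract (not the
# authors' code): sample one parameter theta from a posterior proportional
# to exp(-0.5 * (theta - 3)^2 / 0.25), using simulated annealing (a
# temperature T that cools toward 1) to speed early exploration.

def log_like(theta):
    return -0.5 * (theta - 3.0) ** 2 / 0.25   # toy Gaussian log-likelihood

random.seed(1)
theta, chain = -10.0, []          # deliberately poor starting point
n_steps, n_burn = 20000, 5000
for i in range(n_steps):
    T = max(1.0, 10.0 * (1.0 - i / n_burn))   # anneal from T=10 down to T=1
    prop = theta + random.gauss(0.0, 0.5)     # symmetric random-walk proposal
    # Accept with probability min(1, exp((logL' - logL) / T)); at high T the
    # chain accepts more downhill moves and explores more broadly.
    if math.log(random.random()) < (log_like(prop) - log_like(theta)) / T:
        theta = prop
    if i >= n_burn:
        chain.append(theta)

mean = sum(chain) / len(chain)
var = sum((t - mean) ** 2 for t in chain) / len(chain)
print(round(mean, 2), round(var, 2))  # posterior mean ~3, variance ~0.25
```

    Super-cooling, taking T below 1 after sampling, would instead sharpen the posterior and drive the chain toward the maximum likelihood estimate, which is the final extraction step the abstract describes.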

  4. (252)Cf-source-driven neutron noise analysis method

    NASA Astrophysics Data System (ADS)

    Mihalczo, J. T.; King, W. T.; Blakeman, E. D.

    The Cf-252-source-driven neutron noise analysis method has been tested in a wide variety of experiments that have indicated the broad range of applicability of the method. The neutron multiplication factor k_eff has been satisfactorily determined for a variety of materials including uranium metal, light water reactor fuel pins, fissile solutions, fuel plates in water, and interacting cylinders. For a uranyl nitrate solution tank which is typical of a fuel processing or reprocessing plant, the k_eff values were satisfactorily determined for values between 0.92 and 0.5 using a simple point kinetics interpretation of the experimental data. The short measurement times, in several cases as low as 1 min, have shown that the development of this method can lead to a practical subcriticality monitor for many in-plant applications. The further development of the method will require experiments oriented toward particular applications including dynamic experiments and the development of theoretical methods to predict the experimental observables.

  5. A high-efficiency aerothermoelastic analysis method

    NASA Astrophysics Data System (ADS)

    Wan, ZhiQiang; Wang, YaoKun; Liu, YunZhen; Yang, Chao

    2014-06-01

    In this paper, a high-efficiency aerothermoelastic analysis method based on unified hypersonic lifting surface theory is established. The method adopts a two-way coupling form that couples the structure, aerodynamic force, aerodynamic heating, and heat conduction. The aerodynamic force is first calculated based on unified hypersonic lifting surface theory, and then the Eckert reference temperature method is used to solve the temperature field, where the transient heat conduction is solved using Fourier's law, and the modal method is used for the aeroelastic correction. Finally, flutter is analyzed based on the p-k method. The aerothermoelastic behavior of a typical hypersonic low-aspect-ratio wing is then analyzed, and the results indicate the following: (1) the combined aerodynamic and thermal loads deform the wing, and this deformation increases with the flexibility, size, and flight time of the hypersonic aircraft; (2) the effect of heat accumulation should be noted, and therefore the trajectory parameters should be considered in the design of hypersonic flight vehicles to avoid hazardous conditions, such as flutter.
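
    The Eckert reference temperature step mentioned above has a standard closed form, T* = Te + 0.5 (Tw - Te) + 0.22 (Taw - Te), evaluated here with invented hypersonic edge conditions; the recovery factor and all numerical values are illustrative assumptions, not values from the paper.

```python
# Eckert reference-temperature sketch (illustrative values): evaluate the
# classical reference temperature used to pick fluid properties for
# compressible boundary-layer heating estimates.

def eckert_reference_temperature(Te, Tw, Taw):
    # T* = Te + 0.5*(Tw - Te) + 0.22*(Taw - Te)
    return Te + 0.5 * (Tw - Te) + 0.22 * (Taw - Te)

def adiabatic_wall_temperature(Te, mach, gamma=1.4, recovery=0.85):
    # Taw = Te * (1 + r*(gamma - 1)/2 * M^2); r ~ 0.85 for a laminar layer
    return Te * (1.0 + recovery * 0.5 * (gamma - 1.0) * mach ** 2)

Te, Tw, mach = 220.0, 600.0, 6.0    # edge temp [K], wall temp [K], Mach number
Taw = adiabatic_wall_temperature(Te, mach)
print(round(Taw, 1), round(eckert_reference_temperature(Te, Tw, Taw), 1))
# → 1566.4 706.2
```

    Fluid properties (density, viscosity, conductivity) evaluated at T* rather than at the edge temperature then feed the heating rates that drive the transient heat conduction solve.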

  6. Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries

    E-print Network

    Livshits, Ben

    Microsoft Corporation, USA. JavaScript is a language that is widely used for both web-based and standalone applications such as those in the Windows 8 operating system. Analysis of JavaScript has long …

  7. Practical Static Analysis of JavaScript Applications in the Presence of Frameworks and Libraries

    E-print Network

    Livshits, Ben

    Microsoft Research Technical Report MSR-TR-2012-66. JavaScript is a language … Analysis of JavaScript has long been known to be challenging due to the language's dynamic nature …

  8. A Practical Approach to Modeling Uncertainty in Intrusion Analysis Xinming Ou Raj Rajagopalan Sakthiyuvaraja Sakthivelmurugan

    E-print Network

    Ou, Xinming "Simon"

    … the seemingly ad hoc human reasoning of uncertain events, and can yield useful tools for automated intrusion analysis. … Uncertainty is an innate feature of intrusion analysis due to the limited views provided by system monitoring tools …

  9. A Policy Analysis: State Teacher Evaluations Policies and Practices in Comparison to Evidenced Based Characteristics of High Performing Teachers

    ERIC Educational Resources Information Center

    Janson, Karl E.; Martin, Patrick N.; Sutton, Marica K.

    2011-01-01

    This policy analysis of teacher evaluations focuses specifically on teacher practices related to student performance. Because evaluation policy should reflect the relationship between teacher practices and student performance, an analysis of state teacher evaluation policies was conducted. The focus of the project…

  10. Dynamic mechanical analysis: A practical introduction to techniques and applications

    SciTech Connect

    Menard, K.

    1999-03-01

    This introduction provides the chemist, chemical engineer, or materials scientist with a starting point for understanding the applications of dynamic mechanical analysis, its workings, and its advantages and limitations. The book supports the systematic study of manufactured polymeric materials and components as well as the development of new materials. Contents include: introduction to dynamic mechanical analysis; basic rheological concepts: stress, strain, and flow; rheology basics: creep-recovery and stress relaxation; dynamic testing; time-temperature scans part 1: transitions in polymers; time and temperature studies part 2: thermosets; frequency scans; DMA applications to real problems: guidelines; and appendix: sample experiments for the DMA.

  11. Deriving a practical analytical-probabilistic method to size flood routing reservoirs

    NASA Astrophysics Data System (ADS)

    Balistrocchi, Matteo; Grossi, Giovanna; Bacchi, Baldassare

    2013-12-01

    In engineering practice, routing reservoir sizing is commonly performed by using the design storm method, although its effectiveness has been debated for a long time. Conversely, continuous simulations and direct statistical analyses of recorded hydrographs are considered more reliable and comprehensive, but are indeed complex or seldom practicable. In this paper a handier tool is provided by the analytical-probabilistic approach to construct probability functions of peak discharges issuing from natural watersheds or routed through on-line and off-line reservoirs. A simplified routing scheme and a rainfall-runoff model based on a few essential hydrological parameters were implemented. To validate the proposed design methodology, on-line and off-line routing reservoirs were first sized by means of a conventional design storm method for a test watershed located in northern Italy. Their routing efficiencies were then estimated by both analytical-probabilistic models and benchmarking continuous simulations. Bearing in mind practical design purposes, the adopted models showed satisfactory consistency.
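
    The kind of simplified routing scheme this record builds on can be illustrated with a standard level-pool routing step; the sketch below is generic, not the paper's formulation, and the reservoir constant, hydrograph, and time step are invented for illustration.

```python
# Level-pool routing sketch (illustrative, not the paper's exact scheme):
# storage balance dS/dt = I(t) - O(t) with a linear outlet law O = S / k.
# A triangular inflow hydrograph is routed; the reservoir attenuates and
# delays the peak, which is what a flood routing reservoir is sized to do.

def inflow(t, peak=100.0, t_peak=3600.0, t_base=10800.0):
    # Triangular hydrograph [m^3/s]: rises to `peak` at t_peak, back to 0.
    if t <= 0.0 or t >= t_base:
        return 0.0
    if t <= t_peak:
        return peak * t / t_peak
    return peak * (t_base - t) / (t_base - t_peak)

k = 3600.0          # linear-reservoir constant [s]
dt = 60.0           # time step [s]
S, out_peak = 0.0, 0.0
for n in range(400):
    t = n * dt
    I = 0.5 * (inflow(t) + inflow(t + dt))   # average inflow over the step
    # Implicit (unconditionally stable) update of S' = I - S/k:
    S = (S + dt * I) / (1.0 + dt / k)
    out_peak = max(out_peak, S / k)

print(round(out_peak, 1))  # routed peak is well below the 100 m^3/s inflow peak
```

    The analytical-probabilistic approach then replaces the single design hydrograph with probability distributions of the inflow peaks, propagated through a routing relation of this kind to obtain the probability function of the routed peak discharge.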

  12. Newborn Hearing Screening: An Analysis of Current Practices

    ERIC Educational Resources Information Center

    Houston, K. Todd; Bradham, Tamala S.; Munoz, Karen F.; Guignard, Gayla Hutsell

    2011-01-01

    State coordinators of early hearing detection and intervention (EHDI) programs completed a strengths, weaknesses, opportunities, and threats, or SWOT, analysis that consisted of 12 evaluative areas of EHDI programs. For the newborn hearing screening area, a total of 293 items were listed by 49 EHDI coordinators, and themes were identified within…

  13. An Analysis of Ethical Considerations in Programme Design Practice

    ERIC Educational Resources Information Center

    Govers, Elly

    2014-01-01

    Ethical considerations are inherent to programme design decision-making, but not normally explicit. Nonetheless, they influence whose interests are served in a programme and who benefits from it. This paper presents an analysis of ethical considerations made by programme design practitioners in the context of a polytechnic in Aotearoa/New Zealand.…

  14. Strategic planning for public health practice using macroenvironmental analysis.

    PubMed Central

    Ginter, P M; Duncan, W J; Capper, S A

    1991-01-01

    Macroenvironmental analysis is the initial stage in comprehensive strategic planning. The authors examine the benefits of this type of analysis when applied to public health organizations and present a series of questions that should be answered prior to committing resources to scanning, monitoring, forecasting, and assessing components of the macroenvironment. Using illustrations from the public and private sectors, each question is examined with reference to specific challenges facing public health. Benefits are derived both from the process and the outcome of macroenvironmental analysis. Not only are data acquired that assist public health professionals to make decisions, but the analytical process required assures a better understanding of potential external threats and opportunities as well as an organization's strengths and weaknesses. Although differences exist among private and public as well as profit and not-for-profit organizations, macroenvironmental analysis is seen as more essential to the public and not-for-profit sectors than the private and profit sectors. This conclusion results from the extreme dependency of those areas on external environmental forces that cannot be significantly influenced or controlled by public health decision makers. PMID:1902305

  15. Digital Data Collection and Analysis: Application for Clinical Practice

    ERIC Educational Resources Information Center

    Ingram, Kelly; Bunta, Ferenc; Ingram, David

    2004-01-01

    Technology for digital speech recording and speech analysis is now readily available for all clinicians who use a computer. This article discusses some advantages of moving from analog to digital recordings and outlines basic recording procedures. The purpose of this article is to familiarize speech-language pathologists with computerized audio…

  16. Thermal Analysis Methods for Aerobraking Heating

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Gasbarre, Joseph F.; Dec, John A.

    2005-01-01

    As NASA begins exploration of other planets, a method of non-propulsively slowing vehicles at the planet, aerobraking, may become a valuable technique for managing vehicle design mass and propellant. An example of this is Mars Reconnaissance Orbiter (MRO), which will launch in late 2005 and reach Mars in March of 2006. In order to save propellant, MRO will use aerobraking to modify the initial orbit at Mars. The spacecraft will dip into the atmosphere briefly on each orbit, and during the drag pass, the atmospheric drag on the spacecraft will slow it, thus lowering the orbit apoapsis. The largest area on the spacecraft, and that most affected by the heat generated during the aerobraking process, is the solar arrays. A thermal analysis of the solar arrays was conducted at NASA Langley, to simulate their performance throughout the entire roughly 6-month period of aerobraking. Several interesting methods were used to make this analysis more rapid and robust. Two separate models were built for this analysis, one in Thermal Desktop for radiation and orbital heating analysis, and one in MSC.Patran for thermal analysis. The results from the radiation model were mapped in an automated fashion to the Patran thermal model that was used to analyze the thermal behavior during the drag pass. A high degree of automation in file manipulation as well as other methods for reducing run time were employed, since toward the end of the aerobraking period the orbit period is short, and in order to support flight operations the runs must be computed rapidly. All heating within the Patran Thermal model was combined in one section of logic, such that data mapped from the radiation model and aeroheating model, as well as skin temperature effects on the aeroheating and surface radiation, could be incorporated easily. This approach calculates the aeroheating at any given node, based on its position and temperature as well as the density and velocity at that trajectory point. 
Run times on several different processors, computer hard drives, and operating systems (Windows versus Linux) were evaluated.

  17. Method and apparatus for simultaneous spectroelectrochemical analysis

    DOEpatents

    Chatterjee, Sayandev; Bryan, Samuel A; Schroll, Cynthia A; Heineman, William R

    2013-11-19

    An apparatus and method of simultaneous spectroelectrochemical analysis is disclosed. A transparent surface is provided. An analyte solution on the transparent surface is contacted with a working electrode and at least one other electrode. Light from a light source is focused on either a surface of the working electrode or the analyte solution. The light reflected from either the surface of the working electrode or the analyte solution is detected. The potential of the working electrode is adjusted, and spectroscopic changes of the analyte solution that occur with changes in thermodynamic potentials are monitored.

  18. Apparatus And Method For Fluid Analysis

    DOEpatents

    Wilson, Bary W. (Richland, WA); Peters, Timothy J. (Richland, WA); Shepard, Chester L. (West Richland, WA); Reeves, James H. (Richland, WA)

    2003-05-13

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  19. Apparatus and method for fluid analysis

    DOEpatents

    Wilson, Bary W.; Peters, Timothy J.; Shepard, Chester L.; Reeves, James H.

    2004-11-02

    The present invention is an apparatus and method for analyzing a fluid used in a machine or in an industrial process line. The apparatus has at least one meter placed proximate the machine or process line and in contact with the machine or process fluid for measuring at least one parameter related to the fluid. The at least one parameter is a standard laboratory analysis parameter. The at least one meter includes but is not limited to viscometer, element meter, optical meter, particulate meter, and combinations thereof.

  20. [Current methods in automated statistical analysis].

    PubMed

    Praganov, D; Kalpazanov, I; Simeonov, G

    1983-01-01

The advantages and disadvantages of so-called minicomputers (e.g. type NOVA) and microcomputers (e.g. type NR 95) are compared with respect to their use for statistical analysis. Despite some advantages of microcomputers, such as autonomy and readiness for immediate use, the minicomputers of type NOVA were found preferable: they relieve non-mathematical specialists of non-specific work and direct them to the most appropriate statistical methods within the organized statistical processing established at the Institute of Hygiene and Occupational Diseases, so that expenditures are lower and reliability higher. PMID:6672822

  1. Scanning methods applied to bitemark analysis

    NASA Astrophysics Data System (ADS)

    Bush, Peter J.; Bush, Mary A.

    2010-06-01

The 2009 National Academy of Sciences report on forensics focused criticism on pattern evidence subdisciplines in which statements of unique identity are utilized. One principle of bitemark analysis is that the human dentition is unique to the extent that a perpetrator may be identified based on dental traits in a bitemark. Optical and electron scanning methods were used to measure dental minutiae and to investigate replication of detail in human skin. Results indicated that, being a visco-elastic substrate, skin effectively reduces the resolution of measurement of dental detail. The conclusions indicate that caution is warranted in individualization statements.

  2. Selective spectroscopic methods for water analysis

    SciTech Connect

    Vaidya, B.

    1997-06-24

    This dissertation explores in large part the development of a few types of spectroscopic methods in the analysis of water. Methods for the determination of some of the most important properties of water like pH, metal ion content, and chemical oxygen demand are investigated in detail. This report contains a general introduction to the subject and the conclusions. Four chapters and an appendix have been processed separately. They are: chromogenic and fluorogenic crown ether compounds for the selective extraction and determination of Hg(II); selective determination of cadmium in water using a chromogenic crown ether in a mixed micellar solution; reduction of chloride interference in chemical oxygen demand determination without using mercury salts; structural orientation patterns for a series of anthraquinone sulfonates adsorbed at an aminophenol thiolate monolayer chemisorbed at gold; and the role of chemically modified surfaces in the construction of miniaturized analytical instrumentation.

  3. Planning for IS applications: a practical, information theoretical method and case study in mobile financial services

    Microsoft Academic Search

    Ken Peffers; Tuure Tuunanen

    2005-01-01

    Abstract We use information theory to justify use of a method,to help managers,better understand what new IT applications and features will be most valued by users and why and then apply this method,in a case study involving the development,of financial service applications for mobile devices. We review five methods for data gathering, analysis, modeling, and decision-making and compare them with

  4. A novel and practical cardiovascular magnetic resonance method to quantify mitral annular excursion and recoil applied to hypertrophic cardiomyopathy

    PubMed Central

    2014-01-01

Background We have developed a novel and practical cardiovascular magnetic resonance (CMR) technique to evaluate left ventricular (LV) mitral annular motion by tracking the atrioventricular junction (AVJ). To test AVJ motion analysis as a metric for LV function, we compared AVJ motion variables between patients with hypertrophic cardiomyopathy (HCM), a group with recognized systolic and diastolic dysfunction, and healthy volunteers. Methods We retrospectively evaluated 24 HCM patients with normal ejection fractions (EF) and 14 healthy volunteers. Using the 4-chamber view cine images, we tracked the longitudinal motion of the lateral and septal AVJ at 25 time points during the cardiac cycle. Based on AVJ displacement versus time, we calculated maximum AVJ displacement (MD) and velocity in early diastole (MVED), velocity in diastasis (VDS) and the composite index VDS/MVED. Results Patients with HCM showed significantly slower median lateral and septal AVJ recoil velocities during early diastole, but faster velocities in diastasis. We observed a 16-fold difference in VDS/MVED at the lateral AVJ [median 0.141, interquartile range (IQR) 0.073, 0.166 versus 0.009, IQR -0.006, 0.037]. Analysis took approximately 10 minutes per subject. Conclusions Atrioventricular junction motion analysis provides a practical and novel CMR method to assess mitral annular motion. In this proof of concept study we found highly statistically significant differences in mitral annular excursion and recoil between HCM patients and healthy volunteers. PMID:24886666

  5. Practical guidance for statistical analysis of operational event data

    SciTech Connect

    Atwood, C.L.

    1995-10-01

    This report presents ways to avoid mistakes that are sometimes made in analysis of operational event data. It then gives guidance on what to do when a model is rejected, a list of standard types of models to consider, and principles for choosing one model over another. For estimating reliability, it gives advice on which failure modes to model, and moment formulas for combinations of failure modes. The issues are illustrated with many examples and case studies.
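The report's mention of moment formulas for combinations of failure modes can be illustrated with the standard rule for independent modes: the total failure rate is the sum of the per-mode rates, and independent uncertainties add in the moments as well. A minimal sketch with invented per-mode values (these are not the report's own formulas or data):

```python
# Combining independent failure modes: with independent per-mode rate
# estimates, the mean and variance of the total rate are the sums of the
# per-mode means and variances.  Values below are illustrative only.
modes = {"mode_a": (1.2e-3, 4.0e-8),   # (mean rate, variance of estimate)
         "mode_b": (5.0e-4, 1.0e-8)}

mean_total = sum(m for m, _ in modes.values())
var_total = sum(v for _, v in modes.values())   # independence assumed

print(mean_total, var_total)
```

If the modes were correlated, covariance terms would have to be added to `var_total`; the simple sum applies only under the independence assumption stated in the comments.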

  6. Measuring Racial/Ethnic Disparities in Health Care: Methods and Practical Issues

    PubMed Central

Cook, Benjamin Lê; McGuire, Thomas G; Zaslavsky, Alan M

    2012-01-01

    Objective To review methods of measuring racial/ethnic health care disparities. Study Design Identification and tracking of racial/ethnic disparities in health care will be advanced by application of a consistent definition and reliable empirical methods. We have proposed a definition of racial/ethnic health care disparities based in the Institute of Medicine's (IOM) Unequal Treatment report, which defines disparities as all differences except those due to clinical need and preferences. After briefly summarizing the strengths and critiques of this definition, we review methods that have been used to implement it. We discuss practical issues that arise during implementation and expand these methods to identify sources of disparities. We also situate the focus on methods to measure racial/ethnic health care disparities (an endeavor predominant in the United States) within a larger international literature in health outcomes and health care inequality. Empirical Application We compare different methods of implementing the IOM definition on measurement of disparities in any use of mental health care and mental health care expenditures using the 2004–2008 Medical Expenditure Panel Survey. Conclusion Disparities analysts should be aware of multiple methods available to measure disparities and their differing assumptions. We prefer a method concordant with the IOM definition. PMID:22353147

  7. The influence of deliberate practice on musical achievement: a meta-analysis

    PubMed Central

    Platz, Friedrich; Kopiez, Reinhard; Lehmann, Andreas C.; Wolf, Anna

    2014-01-01

    Deliberate practice (DP) is a task-specific structured training activity that plays a key role in understanding skill acquisition and explaining individual differences in expert performance. Relevant activities that qualify as DP have to be identified in every domain. For example, for training in classical music, solitary practice is a typical training activity during skill acquisition. To date, no meta-analysis on the quantifiable effect size of deliberate practice on attained performance in music has been conducted. Yet the identification of a quantifiable effect size could be relevant for the current discussion on the role of various factors on individual difference in musical achievement. Furthermore, a research synthesis might enable new computational approaches to musical development. Here we present the first meta-analysis on the role of deliberate practice in the domain of musical performance. A final sample size of 13 studies (total N = 788) was carefully extracted to satisfy the following criteria: reported durations of task-specific accumulated practice as predictor variables and objectively assessed musical achievement as the target variable. We identified an aggregated effect size of rc = 0.61; 95% CI [0.54, 0.67] for the relationship between task-relevant practice (which by definition includes DP) and musical achievement. Our results corroborate the central role of long-term (deliberate) practice for explaining expert performance in music. PMID:25018742
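An aggregated correlation with a confidence interval, like the rc = 0.61; 95% CI [0.54, 0.67] reported above, is typically obtained by Fisher z-transforming each study's correlation, pooling with inverse-variance weights, and back-transforming. A minimal fixed-effect sketch using invented study data, not the 13 studies from this meta-analysis:

```python
import math

# Hypothetical (invented) per-study (correlation, sample size) pairs.
studies = [(0.55, 60), (0.70, 45), (0.58, 80)]

# Fisher z-transform each r; the z-scale variance is 1/(n - 3), so the
# inverse-variance weight is (n - 3).
zs = [(0.5 * math.log((1 + r) / (1 - r)), n - 3) for r, n in studies]
z_bar = sum(z * w for z, w in zs) / sum(w for _, w in zs)
se = 1 / math.sqrt(sum(w for _, w in zs))

r_bar = math.tanh(z_bar)                      # pooled correlation
ci = (math.tanh(z_bar - 1.96 * se), math.tanh(z_bar + 1.96 * se))
print(round(r_bar, 3), [round(c, 3) for c in ci])
```

A random-effects pooling, as is common in modern meta-analysis, would additionally estimate a between-study variance component and add it to each study's weight denominator.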

  8. Evaluating Physician Impact Analysis: Methods, Results, and Uses in Ontario Hospitals.

    ERIC Educational Resources Information Center

    Charles, Cathy; Roberts, Jacqueline

    1994-01-01

    Physician impact analysis (PIA) is a planning tool intended to provide a way to evaluate the impact of a new or replacement physician's practice profile on the clinical program priorities, staffing resources, and costs of a hospital. Key methods for PIA and issues related to its use are considered. (SLD)

  9. The Real-Time Case Method: Description and Analysis of the First Implementation

    ERIC Educational Resources Information Center

    Kilbane, Clare; Theroux, James; Sulej, Julian; Bisson, Barry; Hay, David; Boyer, Dennis

    2004-01-01

    This article describes the first implementation of the "Real-time Case Method" (RTCM)--a new instructional practice that makes use of various technologies to create a new type of case study. Data obtained from five instructors at four business schools in the U.S. and Canada were analyzed using analytic induction. Analysis suggests RTCM was…

  10. Standard practices for dissolving glass containing radioactive and mixed waste for chemical and radiochemical analysis

    E-print Network

    American Society for Testing and Materials. Philadelphia

    2000-01-01

    1.1 These practices cover techniques suitable for dissolving glass samples that may contain nuclear wastes. These techniques used together or independently will produce solutions that can be analyzed by inductively coupled plasma atomic emission spectroscopy (ICP-AES), inductively coupled plasma mass spectrometry (ICP-MS), atomic absorption spectrometry (AAS), radiochemical methods and wet chemical techniques for major components, minor components and radionuclides. 1.2 One of the fusion practices and the microwave practice can be used in hot cells and shielded hoods after modification to meet local operational requirements. 1.3 The user of these practices must follow radiation protection guidelines in place for their specific laboratories. 1.4 Additional information relating to safety is included in the text. 1.5 The dissolution techniques described in these practices can be used for quality control of the feed materials and the product of plants vitrifying nuclear waste materials in glass. 1.6 These pr...

  11. A survey of castration methods and associated livestock management practices performed by bovine veterinarians in the United States

    PubMed Central

    2010-01-01

    Background Castration of male calves destined for beef production is a common management practice performed in the United States amounting to approximately 15 million procedures per year. Societal concern about the moral and ethical treatment of animals is increasing. Therefore, production agriculture is faced with the challenge of formulating animal welfare policies relating to routine management practices such as castration. To enable the livestock industry to effectively respond to these challenges there is a need for more data on management practices that are commonly used in cattle production systems. The objective of this survey was to describe castration methods, adverse events and husbandry procedures performed by U.S. veterinarians at the time of castration. Invitations to participate in the survey were sent to email addresses of 1,669 members of the American Association of Bovine Practitioners and 303 members of the Academy of Veterinary Consultants. Results After partially completed surveys and missing data were omitted, 189 responses were included in the analysis. Surgical castration with a scalpel followed by testicular removal by twisting (calves <90 kg) or an emasculator (calves >90 kg) was the most common method of castration used. The potential risk of injury to the operator, size of the calf, handling facilities and experience with the technique were the most important considerations used to determine the method of castration used. Swelling, stiffness and increased lying time were the most prevalent adverse events observed following castration. One in five practitioners report using an analgesic or local anesthetic at the time of castration. Approximately 90% of respondents indicated that they vaccinate and dehorn calves at the time of castration. Over half the respondents use disinfectants, prophylactic antimicrobials and tetanus toxoid to reduce complications following castration. 
Conclusions The results of this survey describe current methods of castration and associated management practices employed by bovine veterinarians in the U.S. Such data are needed to guide future animal well-being research, the outcomes of which can be used to develop industry-relevant welfare guidelines. PMID:20199669

  12. Comparative analysis of the methods for SADT determination.

    PubMed

    Kossoy, A A; Sheinman, I Ya

    2007-04-11

The self-accelerating decomposition temperature (SADT) is an important parameter that characterizes the thermal safety of self-reactive substances in transport. A great many articles have been published on various methodological aspects of SADT determination. Nevertheless, several serious problems remain that require further analysis and solution; some of them are considered in this paper. Firstly, four methods suggested by the United Nations "Recommendations on the Transport of Dangerous Goods" (TDG) are surveyed in order to reveal their features and limitations. The inconsistency between two definitions of SADT is then discussed. One definition is the basis for the US SADT test and the heat accumulation storage test (Dewar test); the other is used when the adiabatic storage test or the isothermal storage test is applied. It is shown that this inconsistency may result in different and, in some cases, unsafe estimates of SADT. Then the applicability of the Dewar test for determination of SADT for solids is considered. It is shown that this test can be applied to solids only to a limited extent, provided that an appropriate scale-up procedure is available. An advanced method based on the theory of the regular cooling mode is proposed, which ensures more reliable results from application of the Dewar test. The last part of the paper demonstrates how a kinetics-based simulation method helps in evaluating SADT in those complex but practical cases (in particular, stacks of packagings) where neither of the methods recommended by TDG can be used. PMID:16889892

  13. Why and How Do Nursing Homes Implement Culture Change Practices? Insights from Qualitative Interviews in a Mixed Methods Study

    PubMed Central

    Shield, Renée R.; Looze, Jessica; Tyler, Denise; Lepore, Michael; Miller, Susan C.

    2015-01-01

    Objective To understand the process of instituting culture change (CC) practices in nursing homes (NHs). Methods NH Directors of Nursing (DONs) and Administrators (NHAs) at 4,149 United States NHs were surveyed about CC practices. Follow-up interviews with 64 NHAs were conducted and analyzed by a multidisciplinary team which reconciled interpretations recorded in an audit trail. Results The themes include: 1) Reasons for implementing CC practices vary; 2) NH approaches to implementing CC practices are diverse; 3) NHs consider resident mix in deciding to implement practices; 4) NHAs note benefits and few implementation costs of implementing CC practices; 5) Implementation of changes is challenging and strategies for change are tailored to the challenges encountered; 6) Education and communication efforts are vital ways to institute change; and 7) NHA and other staff leadership is key to implementing changes. Discussion Diverse strategies and leadership skills appear to help NHs implement reform practices, including CC innovations. PMID:24652888

  14. Preventing childhood obesity during infancy in UK primary care: a mixed-methods study of HCPs' knowledge, beliefs and practice

    PubMed Central

    2011-01-01

Background There is a strong rationale for intervening in early childhood to prevent obesity. Over a quarter of infants gain weight more rapidly than desirable during the first six months of life, putting them at greater risk of obesity in childhood. However, little is known about UK healthcare professionals' (HCPs) approach to primary prevention. This study explored obesity-related knowledge of UK HCPs and the beliefs and current practice of general practitioners (GPs) and practice nurses in relation to identifying infants at risk of developing childhood obesity. Method Survey of UK HCPs (GPs, practice nurses, health visitors, nursery, community and children's nurses). HCPs (n = 116) rated their confidence in providing infant feeding advice and completed the Obesity Risk Knowledge Scale (ORK-10). Semi-structured interviews with a sub-set of 12 GPs and 6 practice nurses were audio-recorded and transcribed verbatim. Thematic analysis was applied using an interpretative, inductive approach. Results GPs were less confident about giving advice about infant feeding than health visitors (p = 0.001) and nursery nurses (p = 0.009) but more knowledgeable about the health risks of obesity (p < 0.001) than nurses (p = 0.009). HCPs who were consulted more often about feeding were less knowledgeable about the risks associated with obesity (r = -0.34, n = 114, p < 0.001). There was no relationship between HCPs' ratings of confidence in their advice and their knowledge of the obesity risk. 
Six main themes emerged from the interviews: 1) Attribution of childhood obesity to family environment, 2) Infant feeding advice as the health visitor's role, 3) Professional reliance on anecdotal or experiential knowledge about infant feeding, 4) Difficulties with recognition of, or lack of concern for, infants "at risk" of becoming obese, 5) Prioritising relationship with parent over best practice in infant feeding and 6) Lack of shared understanding for dealing with early years' obesity. Conclusions Intervention is needed to improve health visitors' and nursery nurses' knowledge of obesity risk and GPs' and practice nurses' capacity to identify and manage infants at risk of developing childhood obesity. GPs value strategies that maintain relationships with vulnerable families, and interventions to improve their advice-giving around infant feeding need to take account of this. Further research is needed to determine optimal ways of intervening with infants at risk of obesity in primary care. PMID:21699698

  15. A situated practice of ethics for participatory visual and digital methods in public health research and practice: a focus on digital storytelling.

    PubMed

    Gubrium, Aline C; Hill, Amy L; Flicker, Sarah

    2014-09-01

    This article explores ethical considerations related to participatory visual and digital methods for public health research and practice, through the lens of an approach known as "digital storytelling." We begin by briefly describing the digital storytelling process and its applications to public health research and practice. Next, we explore 6 common challenges: fuzzy boundaries, recruitment and consent to participate, power of shaping, representation and harm, confidentiality, and release of materials. We discuss their complexities and offer some considerations for ethical practice. We hope this article serves as a catalyst for expanded dialogue about the need for high standards of integrity and a situated practice of ethics wherein researchers and practitioners reflexively consider ethical decision-making as part of the ongoing work of public health. PMID:23948015

  16. Bias Correction Methods for Misclassified Covariates in the Cox Model: comparison of five correction methods by simulation and data analysis

    PubMed Central

    Bang, Heejung; Chiu, Ya-Lin; Kaufman, Jay S.; Patel, Mehul D.; Heiss, Gerardo; Rose, Kathryn M.

    2013-01-01

Measurement error/misclassification is commonplace in research when variable(s) cannot be measured accurately. A number of statistical methods have been developed to tackle this problem in a variety of settings and contexts. However, relatively few methods are available to handle misclassified categorical exposure variable(s) in the Cox proportional hazards regression model. In this paper, we aim to review and compare different methods to handle this problem - naïve methods, regression calibration, pooled estimation, multiple imputation, corrected score estimation, and MC-SIMEX - by simulation. These methods are also applied to a life course study with recalled data and historical records. In practice, the issue of measurement error/misclassification should be accounted for in design and analysis, whenever possible. Also, in the analysis, it may be preferable to implement more than one correction method for estimation and inference, with a proper understanding of the underlying assumptions. PMID:24072991
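MC-SIMEX, one of the methods compared above, adapts the SIMEX idea to misclassified categorical covariates; the continuous-error version of SIMEX is easier to sketch. The toy illustration below (all data simulated; not the life course study from the abstract) adds extra measurement error at increasing multiples lambda, fits the naive slope at each level, and extrapolates the trend back to lambda = -1, i.e. no error:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data under classical additive measurement error.
n, beta, sigma_u = 2000, 1.0, 0.5
x = rng.normal(size=n)                        # true covariate
w = x + rng.normal(scale=sigma_u, size=n)     # error-prone measurement
y = beta * x + rng.normal(scale=0.3, size=n)

def slope(w_, y_):
    # Ordinary least-squares slope of y on w.
    return np.polyfit(w_, y_, 1)[0]

# SIMEX: add extra error scaled by sqrt(lambda)*sigma_u, average the
# attenuated slope estimates over replicates at each lambda.
lambdas = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
means = []
for lam in lambdas:
    fits = [slope(w + rng.normal(scale=np.sqrt(lam) * sigma_u, size=n), y)
            for _ in range(50)]
    means.append(np.mean(fits))

# Quadratic extrapolation of slope-vs-lambda back to lambda = -1.
coef = np.polyfit(lambdas, means, 2)
beta_simex = np.polyval(coef, -1.0)
print(round(slope(w, y), 2), round(beta_simex, 2))  # naive vs corrected
```

With these settings the naive slope is attenuated toward beta * 1/(1 + sigma_u^2) = 0.8, and the extrapolated SIMEX estimate recovers most of the attenuation; the quadratic extrapolant is an approximation, which is why SIMEX is only approximately consistent.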

  17. Degradation of learned skills. Effectiveness of practice methods on simulated space flight skill retention

    NASA Technical Reports Server (NTRS)

    Sitterley, T. E.; Berge, W. A.

    1972-01-01

    Manual flight control and emergency procedure task skill degradation was evaluated after time intervals of from 1 to 6 months. The tasks were associated with a simulated launch through the orbit insertion flight phase of a space vehicle. The results showed that acceptable flight control performance was retained for 2 months, rapidly deteriorating thereafter by a factor of 1.7 to 3.1 depending on the performance measure used. Procedural task performance showed unacceptable degradation after only 1 month, and exceeded an order of magnitude after 4 months. The effectiveness of static rehearsal (checklists and briefings) and dynamic warmup (simulator practice) retraining methods were compared for the two tasks. Static rehearsal effectively countered procedural skill degradation, while some combination of dynamic warmup appeared necessary for flight control skill retention. It was apparent that these differences between methods were not solely a function of task type or retraining method, but were a function of the performance measures used for each task.

  18. The uniform asymptotic swallowtail approximation - Practical methods for oscillating integrals with four coalescing saddle points

    NASA Technical Reports Server (NTRS)

    Connor, J. N. L.; Curtis, P. R.; Farrelly, D.

    1984-01-01

Methods that can be used in the numerical implementation of the uniform swallowtail approximation are described. An explicit expression for the approximation is presented to lowest order, showing that three problems must be overcome in practice before it can be applied to any given problem. It is shown that a recently developed quadrature method can be used for the accurate numerical evaluation of the swallowtail canonical integral and its partial derivatives. Isometric plots of these are presented to illustrate some of their properties. The problem of obtaining the arguments of the swallowtail integral from an analytic function of its argument is considered, and two methods of solving this problem are described. The asymptotic evaluation of the butterfly canonical integral is also addressed.
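For reference, the swallowtail canonical integral discussed above is conventionally written (notation varies between authors) as the oscillating integral over the quintic phase of the A4 catastrophe:

```latex
S(x, y, z) \;=\; \int_{-\infty}^{\infty}
  \exp\!\bigl[\, i\,(t^{5} + x\,t^{3} + y\,t^{2} + z\,t) \,\bigr]\,\mathrm{d}t .
```

The uniform approximation expresses an integral with four coalescing saddle points in terms of S and its partial derivatives with respect to x, y and z, evaluated at arguments obtained by mapping the original phase onto this quintic normal form.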

  19. A concise method for mine soils analysis

    SciTech Connect

    Winkler, S.; Wildeman, T.; Robinson, R.; Herron, J.

    1999-07-01

A large number of abandoned hard rock mines exist in Colorado and other mountain west states, many on public property. Public pressure and resulting policy changes have become a driving force in the reclamation of these sites. Two of the key reclamation issues for these sites are the occurrence of acid-forming materials (AFMs) in mine soils and acid mine drainage (AMD) issuing from mine adits. An AMD treatment system design project for the Forest Queen mine in Colorado's San Juan mountains raised the need for a simple, usable method for analysis of mine land soils, both for suitability as a construction material and to determine the AFM content and potential for acid release. The authors have developed a simple, stepwise, go/no-go test for the analysis of mine soils. Samples were collected from a variety of sites in the Silverton, CO area, and subjected to three tiers of tests including: paste pH, Eh, and 10% HCl fizz test; then total digestion in HNO{sub 3}/HCl, neutralization potential, exposure to meteoric water, and the toxicity characteristic leaching procedure (TCLP). All elemental analyses were performed with an inductively-coupled plasma (ICP) spectrometer. Elimination of samples via the first two testing tiers left two remaining samples, which were subsequently subjected to column and sequential batch tests, with further elemental analysis by ICP. Based on these tests, one sample was chosen as suitable for constructing the Forest Queen treatment system basins. Further simplification, and testing on two pairs of independent soil samples, has resulted in a final analytical method suitable for general use.

  20. Chapter 11. Community analysis-based methods

    SciTech Connect

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.
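The multivariate cross-comparisons of community profiles mentioned above usually start from a pairwise dissimilarity between samples. One common choice is the Bray-Curtis dissimilarity; a minimal sketch on two invented abundance profiles (e.g. TRFLP fragment counts; the numbers are not from any case study here):

```python
# Bray-Curtis dissimilarity between two community profiles, where each
# position is the abundance of one taxon/fragment in that sample.
a = [10, 0, 5, 3]
b = [6, 2, 5, 0]

num = sum(abs(x - y) for x, y in zip(a, b))   # summed abundance differences
den = sum(x + y for x, y in zip(a, b))        # total abundance in both samples
bc = num / den            # 0 = identical profiles, 1 = no shared taxa
print(round(bc, 3))
```

A matrix of such pairwise dissimilarities is then the input to ordination or clustering methods used to match an environmental sample to candidate host sources.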

  1. Green analytical method development for statin analysis.

    PubMed

    Assassi, Amira Louiza; Roy, Claude-Eric; Perovitch, Philippe; Auzerie, Jack; Hamon, Tiphaine; Gaudin, Karen

    2015-02-01

A green analytical chemistry method was developed for pravastatin, fluvastatin and atorvastatin analysis. An HPLC/DAD method using an ethanol-based mobile phase with octadecyl-grafted silica was studied across various graftings and related column parameters such as particle size, core-shell and monolithic formats. Retention, efficiency and detector linearity were optimized. Even for columns with particle sizes under 2 µm, the benefit of maintaining efficiency over a large range of flow rates was not obtained with the ethanol-based mobile phase, in contrast to an acetonitrile-based one. Therefore, the strategy of shortening the analysis by increasing the flow rate reduced efficiency with the ethanol-based mobile phase. An ODS-AQ YMC column, 50 mm × 4.6 mm, 3 µm, was selected as the best compromise between analysis time, statin separation, and efficiency. HPLC conditions were 1 mL/min, ethanol/formic acid (pH 2.5, 25 mM) (50:50, v/v), thermostated at 40°C. To reduce solvent consumption for sample preparation, a concentration of 0.5 mg/mL of each statin was found to be the highest that respected detector linearity. These conditions were validated for each statin for content determination in highly concentrated hydro-alcoholic solutions. Solubility higher than 100 mg/mL was found for pravastatin and fluvastatin, whereas for atorvastatin calcium salt the maximum concentration was 2 mg/mL for hydro-alcoholic binary mixtures between 35% and 55% of ethanol in water. Using atorvastatin instead of its calcium salt, solubility was improved. Highly concentrated solutions of statins offer a potential fluid for per Buccal Per-Mucous(®) administration, with the advantages of rapid and easy passage of drugs. PMID:25582487

  2. Analysis of methods. [information systems evolution environment

    NASA Technical Reports Server (NTRS)

    Mayer, Richard J. (editor); Ackley, Keith A.; Wells, M. Sue; Mayer, Paula S. D.; Blinn, Thomas M.; Decker, Louis P.; Toland, Joel A.; Crump, J. Wesley; Menzel, Christopher P.; Bodenmiller, Charles A.

    1991-01-01

Information is one of an organization's most important assets. For this reason the development and maintenance of an integrated information system environment is one of the most important functions within a large organization. The Integrated Information Systems Evolution Environment (IISEE) project has as one of its primary goals a computerized solution to the difficulties involved in the development of integrated information systems. To develop such an environment a thorough understanding of the enterprise's information needs and requirements is of paramount importance. This document is the current release of the research performed by the Integrated Development Support Environment (IDSE) Research Team in support of the IISEE project. Research indicates that an integral part of any information system environment would be multiple modeling methods to support the management of the organization's information. Automated tool support for these methods is necessary to facilitate their use in an integrated environment. An integrated environment makes it necessary to maintain an integrated database which contains the different kinds of models developed under the various methodologies. In addition, to speed the process of development of models, a procedure or technique is needed to allow automatic translation from one methodology's representation to another while maintaining the integrity of both. The purpose of the analysis of the modeling methods included in this document is to examine these methods with the goal of including them in an integrated development support environment. To accomplish this and to develop a method for allowing intra-methodology and inter-methodology model element reuse, a thorough understanding of multiple modeling methodologies is necessary. 
Currently the IDSE Research Team is investigating the family of Integrated Computer Aided Manufacturing (ICAM) DEFinition (IDEF) languages IDEF(0), IDEF(1), and IDEF(1x), as well as ENALIM, Entity Relationship, Data Flow Diagrams, and Structure Charts, for inclusion in an integrated development support environment.

  3. How equity is addressed in clinical practice guidelines: a content analysis

    PubMed Central

    Shi, Chunhu; Tian, Jinhui; Wang, Quan; Petkovic, Jennifer; Ren, Dan; Yang, Kehu; Yang, Yang

    2014-01-01

Objectives Incorporating equity into guidelines presents methodological challenges. This study aims to qualitatively synthesise the methods for incorporating equity in clinical practice guidelines (CPGs). Setting Content analysis of methodological publications. Eligibility criteria for selecting studies Methodological publications were included if they provided checklists/frameworks on when, how and to what extent equity should be incorporated in CPGs. Data sources We electronically searched MEDLINE, retrieved references, and browsed guideline development organisation websites from inception to January 2013. After study selection by two authors, general characteristics and checklist items/framework components were extracted from the included studies. Based on the questions or items from checklists/frameworks (unit of analysis), content analysis was conducted to identify themes, and questions/items were grouped into these themes. Primary outcomes The primary outcomes were methodological themes and processes on how to address equity issues in guideline development. Results 8 studies with 10 publications were included from 3405 citations. In total, a list of 87 questions/items was generated from 17 checklists/frameworks. After content analysis, questions were grouped into eight themes (‘scoping questions’, ‘searching relevant evidence’, ‘appraising evidence and recommendations’, ‘formulating recommendations’, ‘monitoring implementation’, ‘providing a flow chart to include equity in CPGs’, and ‘others: reporting of guidelines and comments from stakeholders’ for CPG developers, and ‘assessing the quality of CPGs’ for CPG users). Four included studies covered more than five of these themes. We also summarised the process of guideline development based on the themes mentioned above.
Conclusions For disadvantaged population-specific CPGs, eight important methodological issues identified in this review should be considered when including equity in CPGs under the guidance of a scientific guideline development manual. PMID:25479795

  4. Practical Application of Parallel Coordinates for Climate Model Analysis

    SciTech Connect

    Steed, Chad A [ORNL; Shipman, Galen M [ORNL; Thornton, Peter E [ORNL; Ricciuto, Daniel M [ORNL; Erickson III, David J [ORNL; Branstetter, Marcia L [ORNL

    2012-01-01

The determination of relationships between climate variables and the identification of the most significant associations between them in various geographic regions are important aspects of climate model evaluation. The EDEN visual analytics toolkit has been developed to aid such analysis by facilitating the assessment of multiple variables with respect to the amount of variability that can be attributed to specific other variables. EDEN harnesses the parallel coordinates visualization technique and is augmented with graphical indicators of key descriptive statistics. A case study is presented that focuses on the Harvard Forest site (42.5378N Lat, 72.1715W Lon) as simulated by the Community Land Model Version 4 (CLM4). It is shown that model variables such as land water runoff are more sensitive to a particular set of environmental variables than to a suite of other inputs in the 88-variable analysis conducted. The approach presented here allows climate-domain scientists to focus on the most important variables in the model evaluations.
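The abstract does not show EDEN's implementation; as an illustration of the per-axis min-max normalization that parallel-coordinates displays typically apply so that heterogeneous climate variables share one vertical scale, here is a minimal sketch (the function name and the sample records are hypothetical):

```python
def minmax_scale(rows):
    """Normalize each column of `rows` to [0, 1] -- the per-axis scaling
    commonly used to draw heterogeneous variables on parallel coordinates."""
    cols = list(zip(*rows))
    spans = [(min(c), max(c)) for c in cols]
    return [
        [(v - lo) / (hi - lo) if hi > lo else 0.5
         for v, (lo, hi) in zip(row, spans)]
        for row in rows
    ]

# Hypothetical monthly records: (temperature K, precipitation mm/day, runoff mm/day)
records = [(280.0, 2.0, 0.5), (290.0, 4.0, 1.5), (285.0, 3.0, 1.0)]
scaled = minmax_scale(records)
```

Each scaled row can then be drawn as one polyline across the variable axes; constant columns are pinned to mid-axis to avoid division by zero.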

  5. Primary prevention in general practice – views of German general practitioners: a mixed-methods study

    PubMed Central

    2014-01-01

Background Policy efforts focus on a reorientation of health care systems towards primary prevention. To guide such efforts, we analyzed the role of primary prevention in general practice and general practitioners’ (GPs) attitudes toward primary prevention. Methods Mixed-methods study including a cross-sectional survey of all community-based GPs and focus groups in a sample of GPs who collaborated with the Institute of General Practice in Berlin, Germany in 2011. Of 1168 GPs, 474 returned the mail survey. Fifteen GPs participated in focus group discussions. Survey and interview guidelines were developed and tested to assess and discuss beliefs, attitudes, and practices regarding primary prevention. Results Most respondents considered primary prevention within their realm of responsibility (70%). Primary prevention, especially physical activity, healthy eating, and smoking cessation, was part of the GPs’ health care recommendations if they thought it was indicated. Still, a quarter of survey respondents discussed reduction of alcohol consumption with their patients infrequently even when they thought it was indicated. Similarly, 18% claimed that they discuss smoking cessation only sometimes. The focus groups revealed that GPs were concerned about the detrimental effects an uninvited health behavior suggestion could have on patients and were hesitant to take on the role of “health policing”. GPs saw primary prevention as the responsibility of multiple actors in a network of societal and municipal institutions. Conclusions The mixed-methods study showed that primary prevention approaches such as lifestyle counseling are not well established in primary care. GPs used a selective approach to offer preventive advice based upon indication. GPs had a strong sense that a universal prevention approach carried the potential to destroy a good patient-physician relationship. Other approaches to public health may be warranted, such as a multisectoral approach to population health.
This type of restructuring of the health care sector may benefit patients who are unable to afford specific prevention programmes and who have competing demands that hinder their ability to focus on behavior change. PMID:24885100

  6. APPLYING NEW METHODS TO RESEARCH REACTOR ANALYSIS.

    SciTech Connect

Diamond, D.J.; Cheng, L.; Hanson, A.; Xu, J.; Carew, J.F.

    2004-02-05

    Detailed reactor physics and safety analyses are being performed for the 20 MW D{sub 2}O-moderated research reactor at the National Institute of Standards and Technology (NIST). The analyses employ state-of-the-art calculational methods and will contribute to an update to the Final Safety Analysis Report (FSAR). Three-dimensional MCNP Monte Carlo neutron and photon transport calculations are performed to determine power and reactivity parameters, including feedback coefficients and control element worths. The core depletion and determination of the fuel compositions are performed with MONTEBURNS to model the reactor at the beginning, middle, and end-of-cycle. The time-dependent analysis of the primary loop is determined with a RELAP5 transient analysis model that includes the pump, heat exchanger, fuel element geometry, and flow channels. A statistical analysis used to assure protection from critical heat flux (CHF) is performed using a Monte Carlo simulation of the uncertainties contributing to the CHF calculation. The power distributions used to determine the local fuel conditions and margin to CHF are determined with MCNP. Evaluations have been performed for the following accidents: (1) the control rod withdrawal startup accident, (2) the maximum reactivity insertion accident, (3) loss-of-flow resulting from loss of electrical power, (4) loss-of-flow resulting from a primary pump seizure, (5) loss-of-flow resulting from inadvertent throttling of a flow control valve, (6) loss-of-flow resulting from failure of both shutdown cooling pumps and (7) misloading of a fuel element. These analyses are significantly more rigorous than those performed previously. They have provided insights into reactor behavior and additional assurance that previous analyses were conservative and the reactor was being operated safely.
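The abstract describes a Monte Carlo simulation of the uncertainties contributing to the CHF calculation. The sketch below illustrates that kind of uncertainty propagation in a generic, hypothetical form (the nominal ratio and the relative uncertainty magnitudes are invented for illustration and are not the NIST analysis):

```python
import random

def chf_ratio_samples(n=100_000, seed=1):
    """Illustrative Monte Carlo propagation of uncertainties: a nominal
    CHF ratio is multiplied by independent, normally distributed
    uncertainty factors (hypothetical values, not the actual analysis)."""
    rng = random.Random(seed)
    nominal_ratio = 2.0                   # assumed nominal margin to CHF
    relative_sds = [0.05, 0.08, 0.03]     # assumed uncertainties (power, flow, correlation)
    samples = []
    for _ in range(n):
        factor = 1.0
        for sd in relative_sds:
            factor *= rng.gauss(1.0, sd)
        samples.append(nominal_ratio * factor)
    return samples

samples = chf_ratio_samples()
prob_below_one = sum(x < 1.0 for x in samples) / len(samples)
```

The resulting distribution of the ratio, rather than a single deterministic value, is what supports a statistical statement of protection from CHF.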

  7. Best Practices for Finite Element Analysis of Spent Nuclear Fuel Transfer, Storage, and Transportation Systems

    SciTech Connect

    Bajwa, Christopher S.; Piotter, Jason; Cuta, Judith M.; Adkins, Harold E.; Klymyshyn, Nicholas A.; Fort, James A.; Suffield, Sarah R.

    2010-08-11

Storage casks and transportation packages for spent nuclear fuel (SNF) are designed to confine SNF in sealed canisters or casks, provide structural integrity during accidents, and remove decay heat through a storage or transportation overpack. The transfer, storage, and transportation of SNF in dry storage casks and transport packages are regulated under 10 CFR Part 72 and 10 CFR Part 71, respectively. Finite Element Analysis (FEA) is used with increasing frequency in Safety Analysis Reports and other regulatory technical evaluations related to SNF casks and packages and their associated systems. Advances in computing power have made increasingly sophisticated FEA models more feasible, and as a result, the need for careful review of such models has also increased. This paper identifies best practice recommendations that stem from recent NRC review experience. The scope covers issues common to all commercially available FEA software, and the recommendations are applicable to any FEA software package. Three specific topics are addressed: general FEA practices, issues specific to thermal analyses, and issues specific to structural analyses. The general FEA practices topic covers appropriate documentation of the model and results, which is important for an efficient review process. The thermal analysis best practices are related to cask analysis for steady state conditions and transient scenarios. The structural analysis best practices are related to the analysis of casks and associated payload during standard handling and drop scenarios. The best practices described in this paper are intended to identify FEA modeling issues and provide insights that can help minimize associated uncertainties and errors, in order to facilitate the NRC licensing review process.

  8. Spelling Practice Intervention: A Comparison of Tablet PC and Picture Cards as Spelling Practice Methods for Students with Developmental Disabilities

    ERIC Educational Resources Information Center

    Seok, Soonhwa; DaCosta, Boaventura; Yu, Byeong Min

    2015-01-01

    The present study compared a spelling practice intervention using a tablet personal computer (PC) and picture cards with three students diagnosed with developmental disabilities. An alternating-treatments design with a non-concurrent multiple-baseline across participants was used. The aims of the present study were: (a) to determine if…

  9. The systematical analysis of oriental pulse waveform: a practical approach.

    PubMed

    Lee, Junyoung

    2008-02-01

With a view to setting up an oriental pulse database as well as objective diagnosis standards, this study designed and manufactured a digital pulse diagnosis system that uses a high-performance microprocessor on the basis of systematic pulse diagnosis methodology. An algorithm for extracting significant points has been proposed to precisely interpret pulse signals that contain various kinds of noise, and pulse measurement tests were conducted on many patients at a hospital. The clinical data obtained by the digital pulse diagnosis system have been compared with the clinical findings made by the doctors in charge of the patients. As a result of this comparison and analysis, the study found that the two sets of findings were almost identical. On this basis, an objective diagnostic parameter for clinical diagnosis has been presented. PMID:18333400
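The significant-point extraction algorithm itself is not given in the abstract. As a hedged stand-in, a common baseline for this kind of task is moving-average smoothing followed by thresholded local-maximum detection; the sketch below applies it to a synthetic pulse waveform (function name, window size, and threshold are illustrative choices, not the paper's algorithm):

```python
import math

def significant_peaks(signal, window=5, min_height=0.5):
    """Toy significant-point extraction: moving-average smoothing,
    then local maxima above a height threshold (not the paper's method)."""
    half = window // 2
    smoothed = [
        sum(signal[max(0, i - half):i + half + 1])
        / len(signal[max(0, i - half):i + half + 1])
        for i in range(len(signal))
    ]
    return [
        i for i in range(1, len(smoothed) - 1)
        if smoothed[i - 1] < smoothed[i] >= smoothed[i + 1]
        and smoothed[i] >= min_height
    ]

# Two synthetic pulse beats (period 50 samples) with a little additive noise
sig = [math.exp(-((t % 50) - 10) ** 2 / 20) + 0.02 * math.sin(7 * t)
       for t in range(100)]
peaks = significant_peaks(sig)
```

A real system would add baseline-wander removal and adaptive thresholds, but the structure (denoise, then locate landmark points) is the same.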

  10. Applied analysis of recurrent events: a practical overview

    PubMed Central

    Twisk, J.; Smidt, N.; de Vente, W.

    2005-01-01

Study objective: The purpose of this paper is to give an overview and comparison of different easily applicable statistical techniques to analyse recurrent event data. Setting: These techniques include naive techniques and longitudinal techniques such as Cox regression for recurrent events, generalised estimating equations (GEE), and random coefficient analysis. The different techniques are illustrated with a dataset from a randomised controlled trial regarding the treatment of lateral epicondylitis. Main results: The use of different statistical techniques leads to different results and different conclusions regarding the effectiveness of the different intervention strategies. Conclusions: If you are interested in a particular short term or long term result, simple naive techniques are appropriate. However, if the development of a particular outcome is of interest, statistical techniques that consider the recurrent events and additionally correct for the dependency of the observations are necessary. PMID:16020650

  11. Homotopy analysis method for quadratic Riccati differential equation

    Microsoft Academic Search

    Yue Tan; Saeid Abbasbandy

    2008-01-01

In this paper, the quadratic Riccati differential equation is solved by means of an analytic technique, namely the homotopy analysis method (HAM). Comparisons are made among Adomian’s decomposition method (ADM), the homotopy perturbation method (HPM), the homotopy analysis method, and the exact solution. The results reveal that the proposed method is very effective and simple.
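The quadratic Riccati equation commonly used as the benchmark in this literature is y' = 2y − y² + 1 with y(0) = 0 (assuming, without access to the paper, that this is the test case); its closed-form solution makes the kind of comparison the abstract describes possible. Rather than reproduce HAM itself, the sketch below checks a classical Runge-Kutta integration against the exact solution:

```python
import math

def exact(t):
    """Closed-form solution of y' = 2y - y**2 + 1, y(0) = 0."""
    c = 0.5 * math.log((math.sqrt(2) - 1) / (math.sqrt(2) + 1))
    return 1 + math.sqrt(2) * math.tanh(math.sqrt(2) * t + c)

def rk4(f, y0, t_end, n=1000):
    """Classical 4th-order Runge-Kutta integration from t = 0 to t_end."""
    h = t_end / n
    t, y = 0.0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h * k1 / 2)
        k3 = f(t + h / 2, y + h * k2 / 2)
        k4 = f(t + h, y + h * k3)
        y += h * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        t += h
    return y

def riccati(t, y):
    return 2 * y - y * y + 1

y1 = rk4(riccati, 0.0, 1.0)
```

A series method such as HAM would be judged by how closely its truncated series tracks `exact(t)` over the interval of interest.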

  12. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R. (Albuquerque, NM)

    2010-11-23

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.
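The patent builds on multivariate factor methods such as Principal Component Analysis. As a minimal, self-contained illustration of the factor-extraction step, the sketch below computes the first principal component of tiny synthetic two-channel "spectra" by power iteration on the covariance matrix (one simple way to obtain the leading component; a real spectral-image analysis would use full PCA/MCR with constraints):

```python
def first_principal_component(data, iters=200):
    """Leading principal component via power iteration on the covariance
    matrix of `data` (rows = observations, columns = channels)."""
    n, dim = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(dim)]
    centered = [[row[j] - means[j] for j in range(dim)] for row in data]
    cov = [[sum(r[i] * r[j] for r in centered) / (n - 1)
            for j in range(dim)] for i in range(dim)]
    v = [1.0] * dim
    for _ in range(iters):
        w = [sum(cov[i][j] * v[j] for j in range(dim)) for i in range(dim)]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Synthetic pixels whose variation lies along the direction (2, 1) in channel space
pixels = [[2 * a, 1 * a] for a in [0.0, 1.0, 2.0, 3.0, 4.0]]
pc1 = first_principal_component(pixels)
```

The recovered component points along (2, 1)/√5, the single direction that explains all the variance in this toy data.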

  13. Methods for spectral image analysis by exploiting spatial simplicity

    DOEpatents

    Keenan, Michael R.

    2010-05-25

    Several full-spectrum imaging techniques have been introduced in recent years that promise to provide rapid and comprehensive chemical characterization of complex samples. One of the remaining obstacles to adopting these techniques for routine use is the difficulty of reducing the vast quantities of raw spectral data to meaningful chemical information. Multivariate factor analysis techniques, such as Principal Component Analysis and Alternating Least Squares-based Multivariate Curve Resolution, have proven effective for extracting the essential chemical information from high dimensional spectral image data sets into a limited number of components that describe the spectral characteristics and spatial distributions of the chemical species comprising the sample. There are many cases, however, in which those constraints are not effective and where alternative approaches may provide new analytical insights. For many cases of practical importance, imaged samples are "simple" in the sense that they consist of relatively discrete chemical phases. That is, at any given location, only one or a few of the chemical species comprising the entire sample have non-zero concentrations. The methods of spectral image analysis of the present invention exploit this simplicity in the spatial domain to make the resulting factor models more realistic. Therefore, more physically accurate and interpretable spectral and abundance components can be extracted from spectral images that have spatially simple structure.

  14. Differential method of analysis of luminescence spectra of semiconductors

    SciTech Connect

    Emel'yanov, A. M., E-mail: Emelyanov@mail.ioffe.ru [Russian Academy of Sciences, Ioffe Physical Technical Institute (Russian Federation)

    2010-09-15

    A method for analyzing the luminescence spectra of semiconductors is suggested. The method is based on differentiation of the spectra. The potentialities of the method are demonstrated for luminescence in the region of the fundamental absorption edge of Si and SiGe alloy single crystals. The method is superior in accuracy to previously known luminescence methods of determining the band gap of indirect-gap semiconductors and practically insensitive to different conditions of outputting radiation from the sample.
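The abstract's core operation is differentiation of the measured spectrum. One common way to use a differentiated spectrum is to read the edge position off the derivative extremum; the sketch below applies that idea to a synthetic sigmoidal edge (the edge position, widths, and the extremum criterion are illustrative assumptions, not the paper's full procedure):

```python
import math

def central_derivative(xs, ys):
    """Central-difference derivative dy/dx on a monotone grid."""
    return [(ys[i + 1] - ys[i - 1]) / (xs[i + 1] - xs[i - 1])
            for i in range(1, len(xs) - 1)]

# Synthetic spectrum with a sigmoidal absorption edge near 1.10 eV (illustrative)
energies = [1.0 + 0.001 * k for k in range(201)]          # 1.00 .. 1.20 eV
spectrum = [1.0 / (1.0 + math.exp(-(e - 1.10) / 0.005)) for e in energies]

deriv = central_derivative(energies, spectrum)
edge = energies[1 + max(range(len(deriv)), key=deriv.__getitem__)]
```

Locating the edge in the derivative rather than in the raw spectrum is what makes the approach insensitive to smooth baseline and output-coupling effects.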

  15. International Commercial Remote Sensing Practices and Policies: A Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Stryker, Timothy

In recent years, there has been much discussion about U.S. commercial remote sensing policies and how effectively they address U.S. national security, foreign policy, commercial, and public interests. This paper will provide an overview of U.S. commercial remote sensing laws, regulations, and policies, and describe recent NOAA initiatives. It will also address related foreign practices, and the overall legal context for trade and investment in this critical industry. Licensing and Regulation: The 1992 Land Remote Sensing Policy Act ("the Act"), and the 1994 policy on Foreign Access to Remote Sensing Space Capabilities (known as Presidential Decision Directive-23, or PDD-23), put into place an ambitious legal and policy framework for the U.S. Government's licensing of privately-owned, high-resolution satellite systems. Under the Act, the Secretary of Commerce licenses the operations of private U.S. remote sensing satellite systems, in consultation with the Secretaries of Defense, State, and Interior. PDD-23 provided further details concerning the operation of advanced systems, as well as criteria for the export of turnkey systems and/or components. In July 2000, pursuant to the authority delegated to it by the Secretary of Commerce, NOAA issued new regulations for the industry. Among the license conditions, operators must: observe the international obligations of the United States; maintain positive control of spacecraft operations; maintain a tasking record in conjunction with other record-keeping requirements; provide U.S. Government access to and use of data when required for national security or foreign policy purposes; provide for U.S. Government review of all significant foreign agreements; obtain U.S. Government approval for any encryption devices used; make available unenhanced data to a "sensed state" as soon as such data are available and on reasonable cost terms and conditions; make available unenhanced data as requested by the U.S. Government Archive; and obtain a priori U.S. Government approval of all plans and procedures to deal with safe disposition of the satellite. Further information on NOAA's regulations and NOAA's licensing program is available at www.licensing.noaa.gov.
Monitoring and Enforcement: NOAA's enforcement mission is focused on the legislative mandate which states that the Secretary of Commerce has a continuing obligation to ensure that licensed imaging systems are operated lawfully to preserve the national security and foreign policies of the United States. NOAA has constructed an end-to-end monitoring and compliance program to review the activities of licensed companies. This program includes a pre-launch review, an operational baseline audit, and an annual comprehensive national security audit. If at any time there is suspicion or concern that a system is being operated unlawfully, a no-notice inspection may be initiated. Despite setbacks, three U.S. companies are now operational, with more firms expected to become so in the future. While NOAA does not disclose specific systems capabilities for proprietary reasons, its current licensing resolution thresholds for general commercial availability are as follows: 0.5 meter Ground Sample Distance (GSD) for panchromatic systems, 2 meter GSD for multi-spectral systems, 3 meter Impulse Response (IPR) for Synthetic Aperture Radar systems, and 20 meter GSD for hyperspectral systems (with certain 8-meter hyperspectral derived products also licensed for commercial distribution). These thresholds are subject to change based upon foreign availability and other considerations. It should also be noted that license applications are reviewed and granted on a case-by-case basis, pursuant to each system's technology and concept of operations. In 2001, NOAA, along with the Department of Commerce's International Trade Administration, commissioned a study by the RAND Corporation to assess the risks faced by the U.S. commercial remote sensing satellite industry. In commissioning this study, NOAA's goal was to better…

  16. Using the method explained in module 1(slide 9), practice adding the following decimal numbers. Do not use a calculator.

    E-print Network

Decimals Addition. Using the method explained in module 1 (slide 9), practice adding the following decimal numbers. Do not use a calculator. Worked examples: (a) 14.5+3.9=(14+3… Using the method explained in module 1 (slide 10), practice subtracting the following decimal numbers. Do not use a calculator
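The truncated worked example "(a)14.5+3.9=(14+3" suggests the worksheet's method splits each number into whole and fractional parts before combining them; reading it that way is an assumption. A sketch of that split, using Python's decimal module to keep the arithmetic exact:

```python
from decimal import Decimal

def add_by_parts(a: str, b: str) -> Decimal:
    """Add two (non-negative) decimals the way the worksheet's truncated
    worked example appears to: whole parts first, then fractional parts,
    then combine.  This reading of the method is an assumption."""
    da, db = Decimal(a), Decimal(b)
    whole = int(da) + int(db)                # 14 + 3 = 17
    frac = (da - int(da)) + (db - int(db))   # 0.5 + 0.9 = 1.4
    return whole + frac                      # 17 + 1.4 = 18.4

result = add_by_parts("14.5", "3.9")
```

Using `Decimal` rather than floats mirrors the by-hand arithmetic exactly, with no binary rounding artifacts.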

  17. Robust and non-parametric statistics in the evaluation of figures of merit of analytical methods. Practices for students.

    PubMed

    Cruz Ortiz, M; Herrero, Ana; Sanllorente, Silvia; Reguera, Celia

    2005-05-01

A set of laboratory practices is proposed in which evaluation of the quality of the analytical measurements is incorporated explicitly by systematically applying suitable methodology for extracting the useful information contained in chemical data. Non-parametric and robust techniques useful for detecting outliers have been used to evaluate different figures of merit in the validation and optimization of analytical methods. In particular, they are used for determination of the capability of detection according to ISO 11843 and IUPAC and for determination of linear range, for assessment of the response surface fitted using an experimental design to optimize an instrumental technique, and for analysis of a proficiency test carried out by different groups of students. The tools used are robust regression, least median of squares (LMS) regression, and robust estimators such as the median absolute deviation (m.a.d.) or the Huber estimator, which are very useful as alternatives to the usual centralization and dispersion estimators. PMID:15782337
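The m.a.d. mentioned in the abstract is often used for outlier screening: deviations from the median are compared against a rescaled m.a.d. instead of the (outlier-inflated) standard deviation. A minimal sketch (the cutoff of 3 and the sample data are illustrative choices):

```python
import statistics

def mad_outliers(xs, cutoff=3.0):
    """Flag outliers using the median absolute deviation (m.a.d.), a robust
    alternative to the mean/standard deviation.  The 1.4826 factor rescales
    the m.a.d. to be consistent with the standard deviation for normal data."""
    med = statistics.median(xs)
    mad = statistics.median([abs(x - med) for x in xs])
    robust_sd = 1.4826 * mad
    return [x for x in xs if abs(x - med) > cutoff * robust_sd]

# Replicate measurements with one gross error
measurements = [10.1, 10.2, 9.9, 10.0, 10.1, 9.8, 15.0]
outliers = mad_outliers(measurements)
```

Because both the center (median) and the spread (m.a.d.) resist contamination, the gross error is flagged cleanly, whereas a mean/standard-deviation rule would be dragged toward it.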

  18. Practical Estimates of Field-Saturated Hydraulic Conductivity of Bedrock Outcrops using a Modified Bottomless Bucket Method

    NASA Astrophysics Data System (ADS)

    Mirus, B. B.; Perkins, K. S.

    2012-12-01

    The bottomless bucket (BB) approach (Nimmo et al., VZJ, 2009) is a cost-effective method for rapidly characterizing field-saturated hydraulic conductivity Kfs of soils and alluvial deposits. This practical approach is of particular value for quantifying infiltration rates in remote areas with limited accessibility. A similar approach for bedrock outcrops is also of great value for improving quantitative understanding of infiltration and recharge in rugged terrain. We develop a simple modification to the BB method for application to bedrock outcrops, which uses a non-toxic, quick-drying silicone gel to seal the BB to the bedrock. These modifications to the field method require only minor changes to the analytical solution for calculating Kfs on soils. We investigate the reproducibility of the method with laboratory experiments on a previously studied calcarenite rock and conduct a sensitivity analysis to quantify uncertainty in our predictions. We apply the BB method on both bedrock and soil for sites on Pahute Mesa, which is located in a remote area of the Nevada National Security Site. The bedrock BB tests may require monitoring over several hours to days, depending on infiltration rates, which necessitates a cover to prevent evaporative losses. Our field and laboratory results compare well to Kfs values inferred from independent reports, which suggests the modified BB method can provide useful estimates and facilitate simple hypothesis testing. The ease with which the bedrock BB method can be deployed should facilitate more rapid in-situ data collection than is possible with alternative methods for quantitative characterization of infiltration into bedrock. Typical deployment of bedrock bottomless buckets (BBB's) on an outcrop of volcanic tuff before the application of water.

  19. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton (Albuquerque, NM); Phillips, Cynthia A. (Albuquerque, NM)

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
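The patent weights attack-graph edges by quantities such as attacker effort and then identifies optimal attack paths. A hedged sketch of the path-finding step, using Dijkstra's algorithm over a toy attack graph (the states, edges, and weights are hypothetical; the patent's "epsilon optimal paths" generalize this to near-optimal path sets):

```python
import heapq

def least_effort_path(graph, start, goal):
    """Dijkstra search over an attack graph whose edge weights model
    attacker effort; returns (total effort, state sequence)."""
    queue = [(0, start, [start])]
    seen = set()
    while queue:
        effort, state, path = heapq.heappop(queue)
        if state == goal:
            return effort, path
        if state in seen:
            continue
        seen.add(state)
        for nxt, w in graph.get(state, []):
            if nxt not in seen:
                heapq.heappush(queue, (effort + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical attack states and effort weights
attack_graph = {
    "outside":      [("web_shell", 3), ("phished_user", 1)],
    "web_shell":    [("db_admin", 4)],
    "phished_user": [("db_admin", 7), ("web_shell", 1)],
}
effort, path = least_effort_path(attack_graph, "outside", "db_admin")
```

The cheapest path here chains the phishing step into the web compromise, which is exactly the kind of multi-step route a per-edge review would miss.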

  20. A practical approach to object based requirements analysis

    NASA Technical Reports Server (NTRS)

    Drew, Daniel W.; Bishop, Michael

    1988-01-01

    Presented here is an approach developed at the Unisys Houston Operation Division, which supports the early identification of objects. This domain oriented analysis and development concept is based on entity relationship modeling and object data flow diagrams. These modeling techniques, based on the GOOD methodology developed at the Goddard Space Flight Center, support the translation of requirements into objects which represent the real-world problem domain. The goal is to establish a solid foundation of understanding before design begins, thereby giving greater assurance that the system will do what is desired by the customer. The transition from requirements to object oriented design is also promoted by having requirements described in terms of objects. Presented is a five step process by which objects are identified from the requirements to create a problem definition model. This process involves establishing a base line requirements list from which an object data flow diagram can be created. Entity-relationship modeling is used to facilitate the identification of objects from the requirements. An example is given of how semantic modeling may be used to improve the entity-relationship model and a brief discussion on how this approach might be used in a large scale development effort.

  1. Deliberate Practice and Performance in Music, Games, Sports, Education, and Professions: A Meta-Analysis.

    PubMed

    Macnamara, Brooke N; Hambrick, David Z; Oswald, Frederick L

    2014-07-01

    More than 20 years ago, researchers proposed that individual differences in performance in such domains as music, sports, and games largely reflect individual differences in amount of deliberate practice, which was defined as engagement in structured activities created specifically to improve performance in a domain. This view is a frequent topic of popular-science writing-but is it supported by empirical evidence? To answer this question, we conducted a meta-analysis covering all major domains in which deliberate practice has been investigated. We found that deliberate practice explained 26% of the variance in performance for games, 21% for music, 18% for sports, 4% for education, and less than 1% for professions. We conclude that deliberate practice is important, but not as important as has been argued. PMID:24986855

  2. An easy and practical method for detection and estimation of microcrystalline cellulose in pasteurized milk.

    PubMed

    Zhang, Chi; Yang, Jun; Qiao, Ling

    2011-01-01

    Microcrystalline cellulose (MCC) is suspected to be a new adulteration in pasteurized milk in China, yet an efficient method for MCC detection in dairy has not been established. This study presents a novel procedure to detect and estimate MCC in pasteurized milk using dialysis, cellulase hydrolysis, and a reducing sugar assay. The background value of reducing sugar was eliminated by dialysis, and cellulase activity toward MCC was stable in dialyzed milk. A criterion for MCC detection and an empirical formula for MCC estimation were summarized based on the reducing sugar variation after hydrolysis. The detection sensitivity was below 0.5 g/L. Reducing sugar distribution after cellulase-catalyzed hydrolysis was examined by HPLC, and revealed that most of the detected sugar was glucose. This paper describes a practical method for detection of MCC in pasteurized milk that might benefit dairy QC. PMID:21563704

  3. A method for obtaining practical flutter-suppression control laws using results of optimal control theory

    NASA Technical Reports Server (NTRS)

    Newson, J. R.

    1979-01-01

The results of optimal control theory are used to synthesize a feedback filter. The feedback filter is used to force the output of the filtered frequency response to match that of a desired optimal frequency response over a finite frequency range. This matching is accomplished by employing a nonlinear programming algorithm to search for the coefficients of the feedback filter that minimize the error between the optimal frequency response and the filtered frequency response. The method is applied to the synthesis of an active flutter-suppression control law for an aeroelastic wind-tunnel model. It is shown that the resulting control law suppresses flutter over a wide range of subsonic Mach numbers. This is a promising method for synthesizing practical control laws using the results of optimal control theory.

  4. Visceral fat estimation method by bioelectrical impedance analysis and causal analysis

    NASA Astrophysics Data System (ADS)

    Nakajima, Hiroshi; Tasaki, Hiroshi; Tsuchiya, Naoki; Hamaguchi, Takehiro; Shiga, Toshikazu

    2011-06-01

It has been clarified that abdominal visceral fat accumulation is closely associated with lifestyle diseases and metabolic syndrome. The gold standard in medical fields is visceral fat area measured by an X-ray computed tomography (CT) scan or magnetic resonance imaging. However, these measurements are highly invasive and costly; a CT scan in particular causes X-ray exposure. For these reasons, medical fields need an instrument for visceral fat measurement that is minimally invasive, easy to use, and low cost. The article proposes a simple and practical method of visceral fat estimation by employing bioelectrical impedance analysis and causal analysis. In the method, abdominal shape and dual impedances of the abdominal surface and body total are measured to estimate a visceral fat area based on the cause-effect structure. The structure is designed according to the nature of abdominal body composition and is fine-tuned by statistical analysis. Experiments were conducted to investigate the proposed model: 180 subjects were recruited and measured by both a CT scan and the proposed method. The acquired model explained the measurement principle well, and the correlation coefficient with the CT scan measurements is 0.88.

  5. Systematic Analysis Method for Color Transparency Experiments

    E-print Network

    Pankaj Jain; John P. Ralston

    1993-03-15

    We introduce a data analysis procedure for color transparency experiments which is considerably less model dependent than the transparency ratio method. The new method is based on fitting the shape of the A dependence of the nuclear cross section at fixed momentum transfer to determine the effective attenuation cross section for hadrons propagating through the nucleus. The procedure does not require assumptions about the hard scattering rate inside the nuclear medium. Instead, the hard scattering rate is deduced directly from the data. The only theoretical input necessary is in modelling the attenuation due to the nuclear medium, for which we use a simple exponential law. We apply this procedure to the Brookhaven experiment of Carroll et al. and find that it clearly shows color transparency: the effective attenuation cross section in events with momentum transfer $Q^2$ is approximately $40\ \mathrm{mb}\ (2.2\ \mathrm{GeV}^2/Q^2)$. The fit to the data also supports the idea that the hard scattering inside the nuclear medium is closer to perturbative QCD predictions than is the scattering of isolated protons in free space. We also discuss the application of our approach to electroproduction experiments.
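    The fitting idea can be shown on a toy model. This is not the Brookhaven analysis itself: the sketch below assumes a per-nucleon cross section attenuated exponentially along a path growing like the nuclear radius, sigma(A)/A = sigma_hard * exp(-k * A**(1/3)), where k bundles the effective attenuation cross section and nuclear density, and recovers both parameters from the A dependence alone.

```python
import numpy as np

# Toy version of fitting the shape of the A dependence (illustrative numbers,
# not the experiment's data): generate per-nucleon cross sections from an
# exponential attenuation law, then recover its parameters from the "data".
A_vals = np.array([2.0, 12.0, 27.0, 64.0, 207.0])   # illustrative target nuclei
k_true, sigma_hard_true = 0.45, 3.0                 # made-up "true" parameters
sigma_per_nucleon = sigma_hard_true * np.exp(-k_true * A_vals ** (1 / 3))

# A straight line in the coordinates (A^(1/3), ln(sigma/A)) yields both the
# hard-scattering rate and the attenuation parameter, with no assumption about
# the in-medium hard scattering rate, mirroring the abstract's point.
X = np.column_stack([np.ones_like(A_vals), A_vals ** (1 / 3)])
coef, *_ = np.linalg.lstsq(X, np.log(sigma_per_nucleon), rcond=None)
sigma_hard_fit, k_fit = np.exp(coef[0]), -coef[1]
```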

  6. Methods for analysis of autophagy in plants.

    PubMed

    Bassham, Diane C

    2015-03-01

    The plant vacuole is a major site for the breakdown and recycling of cellular macromolecules. Cytoplasmic components destined for degradation are delivered to the vacuole in vesicles termed autophagosomes, and the breakdown products are transported back into the cytosol for reuse, with the overall process termed autophagy. In plants, autophagy is required for nutrient remobilization and recycling during senescence and nutrient deficiency, for clearance of protein aggregates and damaged organelles during environmental stress, for pathogen defense, and for general cellular maintenance under normal growth conditions. There is growing interest in autophagy in plants due to the wide range of processes in which it functions. While much of the work thus far has used the model plant Arabidopsis thaliana, autophagy is now under investigation in a number of other plants, particularly in economically important crop species. Here, I discuss methods for assessing autophagy activity in plant cells. Microscopic and biochemical assays are described, along with ways to distinguish the steady-state number of autophagosomes from flux through the autophagic pathway. Some deficiencies still exist in plant autophagy analysis, and there is a particular need for more accurate methods of quantifying autophagic flux in plants. PMID:25239736

  7. Method and apparatus for frequency spectrum analysis

    NASA Technical Reports Server (NTRS)

    Cole, Steven W. (inventor)

    1992-01-01

    A method for frequency spectrum analysis of an unknown signal in real time is discussed. The method is based upon integration of 1-bit samples of signal voltage amplitude corresponding to sine or cosine phases of a controlled center-frequency clock, which is changed after each integration interval to sweep the frequency range of interest in steps. Integration of samples during each interval is carried out over a number of cycles of the center-frequency clock spanning a number of cycles of the input signal to be analyzed. The invention may be used to detect the frequency of at least two signals simultaneously. A reference signal of known frequency and voltage amplitude is added to the two signals and processed in parallel in the same way, but in a separate channel sampled at the known frequency and phases of the reference signal. Squaring the sine and cosine integrals of each channel and summing the squares yields relative power measurements in all three channels. From the known voltage amplitude of the reference signal, an absolute voltage measurement for the other two signals is then obtained by multiplying the known voltage of the reference signal with the ratio of the relative power of each of the other two signals to the relative power of the reference signal.
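    The core of the scheme can be modeled in a few lines. The sketch below is a simplified software model, not the patented apparatus: the clocking and channel hardware are abstracted away, and only the 1-bit sine/cosine integration and the square-and-sum relative power step are shown for a single swept channel.

```python
import numpy as np

# Simplified model of the 1-bit scheme: correlate 1-bit signal samples with the
# sine and cosine phases of a stepped test frequency, then square and sum the
# two integrals to obtain a relative power at each step of the sweep.
def one_bit_power(signal_fn, f_test, fs, n):
    t = np.arange(n) / fs
    bits = np.sign(signal_fn(t))                                # 1-bit samples
    s = np.sum(bits * np.sign(np.sin(2 * np.pi * f_test * t)))  # sine integral
    c = np.sum(bits * np.sign(np.cos(2 * np.pi * f_test * t)))  # cosine integral
    return s ** 2 + c ** 2                                      # relative power

fs, n = 1000.0, 4000
signal = lambda t: 0.7 * np.sin(2 * np.pi * 50.0 * t + 0.3)  # unknown tone
freqs = np.arange(20.0, 101.0, 10.0)     # sweep the band of interest in steps
powers = np.array([one_bit_power(signal, f, fs, n) for f in freqs])
best = freqs[np.argmax(powers)]          # the 50 Hz step should dominate
```

    Absolute calibration, per the abstract, would divide such relative powers by that of a reference channel of known amplitude; that step is omitted here.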

  8. A neural-network method for the analysis of multilayered shielded microwave circuits

    Microsoft Academic Search

    Juan Pascual García; Fernando Quesada Pereira; David Cañete Rebenaque; José Luis Gómez Tornero; Alejandro Alvarez Melcón

    2006-01-01

    In this paper, a neural-network-based method for the analysis of practical multilayered shielded microwave circuits is presented. Using this idea, a radial basis function neural network (RBFNN) is trained to approximate the space-domain multilayered media boxed Green's functions used in the integral-equation (IE) method. Once the RBFNN has been trained, the outputs of the neural network (NN) replace the exact
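    The train-once-then-evaluate-cheaply idea behind the RBFNN can be sketched generically. The example below is not the paper's formulation: the boxed multilayered Green's functions are replaced by a smooth toy curve, and a plain Gaussian RBF least-squares regression stands in for the trained network.

```python
import numpy as np

# Generic RBF-regression sketch (toy target, not multilayered Green's
# functions): fit weights once, then evaluate the expansion cheaply.
def rbf_design(x, centers, width):
    # Gaussian basis: one column per center.
    return np.exp(-((x[:, None] - centers[None, :]) / width) ** 2)

x_train = np.linspace(0.1, 2.0, 40)
y_train = np.sin(3.0 * x_train)                     # toy target function
centers = np.linspace(0.1, 2.0, 12)
Phi = rbf_design(x_train, centers, width=0.3)
w, *_ = np.linalg.lstsq(Phi, y_train, rcond=None)   # "training" step

x_test = np.linspace(0.15, 1.95, 25)
y_pred = rbf_design(x_test, centers, 0.3) @ w       # cheap evaluation step
err = np.max(np.abs(y_pred - np.sin(3.0 * x_test)))
```

    In the paper's setting, the payoff is that the trained network replaces repeated, expensive evaluations of the exact Green's functions inside the integral-equation solver.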

  9. Assessing Scientific Practices Using Machine-Learning Methods: How Closely Do They Match Clinical Interview Performance?

    NASA Astrophysics Data System (ADS)

    Beggrow, Elizabeth P.; Ha, Minsu; Nehm, Ross H.; Pearl, Dennis; Boone, William J.

    2013-07-01

    The landscape of science education is being transformed by the new Framework for Science Education (National Research Council, A framework for K-12 science education: practices, crosscutting concepts, and core ideas. The National Academies Press, Washington, DC, 2012), which emphasizes the centrality of scientific practices—such as explanation, argumentation, and communication—in science teaching, learning, and assessment. A major challenge facing the field of science education is developing assessment tools that are capable of validly and efficiently evaluating these practices. Our study examined the efficacy of a free, open-source machine-learning tool for evaluating the quality of students' written explanations of the causes of evolutionary change relative to three other approaches: (1) human-scored written explanations, (2) a multiple-choice test, and (3) clinical oral interviews. A large sample of undergraduates (n = 104) exposed to varying amounts of evolution content completed all three assessments: a clinical oral interview, a written open-response assessment, and a multiple-choice test. Rasch analysis was used to compute linear person measures and linear item measures on a single logit scale. We found that the multiple-choice test displayed poor person and item fit (mean square outfit >1.3), while both oral interview measures and computer-generated written response measures exhibited acceptable fit (average mean square outfit for interview: person 0.97, item 0.97; computer: person 1.03, item 1.06). Multiple-choice test measures were more weakly associated with interview measures (r = 0.35) than the computer-scored explanation measures (r = 0.63). 
Overall, Rasch analysis indicated that computer-scored written explanation measures (1) have the strongest correspondence to oral interview measures; (2) are capable of capturing students' normative scientific and naive ideas as accurately as human-scored explanations, and (3) more validly detect understanding than the multiple-choice assessment. These findings demonstrate the great potential of machine-learning tools for assessing key scientific practices highlighted in the new Framework for Science Education.

  10. Chromium speciation by different methods of practical use for routine in situ measurement

    NASA Astrophysics Data System (ADS)

    Barakat, S.; Giusti, L.

    2003-05-01

    Simple, sensitive, low-cost, and relatively rapid methods for the detection of Cr(III) and Cr(VI) species in natural waters are needed for monitoring and regulatory purposes. Conventional acidification and storage of filtered samples can be a major cause of chromium losses from the `dissolved' phase. In situ monitoring is thus of paramount importance. The practical usefulness of selected chromium speciation methods was assessed in the laboratory and in the field. Significant discrepancies were found in the Cr(VI) detection efficiency of a selective ion meter based on the diphenylcarbazide method when compared with conventional Zeeman graphite furnace AAS. The efficiency of the DGT (diffusive gradients in thin films) method, based on the in situ deployment of gel/resin units capable of separating labile species of Cr(III) and Cr(VI), looks promising, but is limited by cost considerations and by potential complications in the presence of complexing substances. The method based on Sephadex DEAE A-25 ion exchange resins is quite effective in the separation of Cr species, though it requires on-site facilities, is relatively time-consuming, and is potentially affected by complexing substances.

  11. Communication: Quantum polarized fluctuating charge model: A practical method to include ligand polarizability in biomolecular simulations

    NASA Astrophysics Data System (ADS)

    Roy Kimura, S.; Rajamani, Ramkumar; Langley, David R.

    2011-12-01

    We present a simple and practical method to include ligand electronic polarization in molecular dynamics (MD) simulation of biomolecular systems. The method involves periodically spawning quantum mechanical (QM) electrostatic potential (ESP) calculations on an extra set of computer processors using molecular coordinate snapshots from a running parallel MD simulation. The QM ESPs are evaluated for the small-molecule ligand in the presence of the electric field induced by the protein, solvent, and ion charges within the MD snapshot. Partial charges on ligand atom centers are fit through the multi-conformer restrained electrostatic potential (RESP) fit method on several successive ESPs. The RESP method was selected since it produces charges consistent with the AMBER/GAFF force-field used in the simulations. The updated charges are introduced back into the running simulation when the next snapshot is saved. The result is a simulation whose ligand partial charges continuously respond in real-time to the short-term mean electrostatic field of the evolving environment without incurring additional wall-clock time. We show that (1) by incorporating the cost of polarization back into the potential energy of the MD simulation, the algorithm conserves energy when run in the microcanonical ensemble and (2) the mean solvation free energies for 15 neutral amino acid side chains calculated with the quantum polarized fluctuating charge method and thermodynamic integration agree better with experiment relative to the Amber fixed charge force-field.
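    The charge-fitting core of the scheme can be sketched with synthetic data. The example below is only the ESP-to-charges step: a real run would take ESP values from the periodic QM calculations the abstract describes, and the RESP method adds restraints that this plain constrained least squares omits.

```python
import numpy as np

# Sketch of fitting point charges to an ESP grid (synthetic data; RESP-style
# restraints and the QM/MD machinery of the paper are not modeled here).
def fit_charges(atom_xyz, grid_xyz, esp, total_charge=0.0):
    """Least-squares point charges reproducing an ESP, with a fixed net charge."""
    # Coulomb design matrix: potential at each grid point per unit atomic charge.
    d = np.linalg.norm(grid_xyz[:, None, :] - atom_xyz[None, :, :], axis=2)
    A = 1.0 / d
    # Augmented normal equations enforce sum(q) = total_charge via a multiplier.
    n = atom_xyz.shape[0]
    M = np.zeros((n + 1, n + 1))
    M[:n, :n] = A.T @ A
    M[:n, n] = 1.0
    M[n, :n] = 1.0
    rhs = np.append(A.T @ esp, total_charge)
    return np.linalg.solve(M, rhs)[:n]

# Synthetic check: build an ESP from known charges and recover them.
atoms = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
rng = np.random.default_rng(0)
grid = rng.uniform(-3.0, 4.0, size=(200, 3))
grid = grid[np.min(np.linalg.norm(grid[:, None, :] - atoms[None], axis=2), axis=1) > 1.0]
q_true = np.array([0.4, -0.4])
esp = (1.0 / np.linalg.norm(grid[:, None, :] - atoms[None], axis=2)) @ q_true
q_fit = fit_charges(atoms, grid, esp, total_charge=0.0)
```

    In the paper's workflow this fit is repeated over successive ESP snapshots, and the refreshed charges are fed back into the running simulation.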

  12. The experience of implementing evidence-based practice change: a qualitative analysis.

    PubMed

    Irwin, Margaret M; Bergman, Rosalie M; Richards, Rebecca

    2013-10-01

    The Oncology Nursing Society (ONS) and ONS Foundation worked together to develop the Institute for Evidence-Based Practice Change (IEBPC) program to facilitate the implementation of evidence-based practice (EBP) change in nursing. This analysis describes the experience of 19 teams of nurses from various healthcare settings who participated in the IEBPC program. Qualitative analysis of verbatim narratives of activities and observations during the process of implementing an EBP project was used to identify key themes in the experience. EBP implementation enabled participants to learn about their own practice and to experience empowerment through the evidence, and it ignited the spirit of inquiry, teamwork, and multidisciplinary collaboration. Experiences and lessons learned from nurses implementing EBP can be useful to others in planning EBP implementation. PMID:24080054

  13. Integrated force method versus displacement method for finite element analysis

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Berke, Laszlo; Gallagher, Richard H.

    1990-01-01

    A novel formulation termed the integrated force method (IFM) has been developed in recent years for analyzing structures. In this method all the internal forces are taken as independent variables, and the system equilibrium equations (EE's) are integrated with the global compatibility conditions (CC's) to form the governing set of equations. In IFM the CC's are obtained from the strain formulation of St. Venant, and no choices of redundant load systems have to be made, in contrast to the standard force method (SFM). This property of IFM allows the generation of the governing equation to be automated straightforwardly, as it is in the popular stiffness method (SM). In this report IFM and SM are compared relative to the structure of their respective equations, their conditioning, required solution methods, overall computational requirements, and convergence properties as these factors influence the accuracy of the results. Overall, this new version of the force method produces more accurate results than the stiffness method for comparable computational cost.
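    The stacked EE/CC system can be shown on a tiny worked instance. The example below is not taken from the report: it is a two-spring structure, fixed at both ends with a load P at the shared node, where IFM takes both member forces as unknowns and stacks one equilibrium equation with one compatibility condition.

```python
import numpy as np

# Tiny IFM-style instance (illustrative, not from the report): two springs in
# series, fixed-fixed, load P at the middle node; unknowns are member forces.
k1, k2, P = 100.0, 300.0, 12.0
EE = np.array([[1.0, -1.0]])              # node equilibrium: f1 - f2 = P
CC = np.array([[1.0 / k1, 1.0 / k2]])     # member deformations must sum to zero
A = np.vstack([EE, CC])
f1, f2 = np.linalg.solve(A, np.array([P, 0.0]))

# Cross-check against the displacement (stiffness) method, where the single
# nodal displacement is u = P / (k1 + k2) and member forces follow from it.
u = P / (k1 + k2)
```

    Both routes give the same member forces here; the report's comparison concerns how the two formulations scale in conditioning, cost, and accuracy on real finite element models.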

  14. A Quantitative Analysis and Natural History of B. F. Skinner's Coauthoring Practices

    PubMed Central

    McKerchar, Todd L; Morris, Edward K; Smith, Nathaniel G

    2011-01-01

    This paper describes and analyzes B. F. Skinner's coauthoring practices. After identifying his 35 coauthored publications and 27 coauthors, we analyze his coauthored works by their form (e.g., journal articles) and kind (e.g., empirical); identify the journals in which he published and their type (e.g., data-type); describe his overall and local rates of publishing with his coauthors (e.g., noting breaks in the latter); and compare his coauthoring practices with his single-authoring practices (e.g., form, kind, journal type) and with those in the scientometric literature (e.g., majority of coauthored publications are empirical). We address these findings in the context of describing the natural history of Skinner's coauthoring practices. Finally, we describe some limitations in our methods and offer suggestions for future research. PMID:22532732

  15. Practice Makes Perfect: Improving Students' Skills in Understanding and Avoiding Plagiarism with a Themed Methods Course

    ERIC Educational Resources Information Center

    Estow, Sarah; Lawrence, Eva K.; Adams, Kathrynn A.

    2011-01-01

    To address the issue of plagiarism, students in two undergraduate Research Methods and Analysis courses conducted, analyzed, and wrote up original research on the topic of plagiarism. Students in an otherwise identical course completed the same assignments but examined a different research topic. At the start and end of the semester, all students…

  16. Using the Knowledge to Action Framework in practice: a citation analysis and systematic review.

    PubMed

    Field, Becky; Booth, Andrew; Ilott, Irene; Gerrish, Kate

    2014-11-23

    Background: Conceptual frameworks are recommended as a way of applying theory to enhance implementation efforts. The Knowledge to Action (KTA) Framework was developed in Canada by Graham and colleagues in the 2000s, following a review of 31 planned action theories. The framework has two components: Knowledge Creation and an Action Cycle, each of which comprises multiple phases. This review sought to answer two questions: 'Is the KTA Framework used in practice? And if so, how?' Methods: This study is a citation analysis and systematic review. The index citation for the original paper was identified on three databases with the facility for citation searching: Web of Science, Scopus and Google Scholar. Limitations of English language and year of publication 2006-June 2013 were set. A taxonomy categorising the continuum of usage was developed. Only studies applying the framework to implementation projects were included. Data were extracted and mapped against each phase of the framework for studies where it was integral to the implementation project. Results: The citation search yielded 1,787 records. A total of 1,057 titles and abstracts were screened. One hundred and forty-six studies described usage to varying degrees, ranging from referenced to integrated. In ten studies, the KTA Framework was integral to the design, delivery and evaluation of the implementation activities. All ten described using the Action Cycle and seven referred to Knowledge Creation. The KTA Framework was enacted in different health care and academic settings with projects targeted at patients, the public, and nursing and allied health professionals. Conclusions: The KTA Framework is being used in practice with varying degrees of completeness. It is frequently cited, with usage ranging from simple attribution via a reference, through informing planning, to making an intellectual contribution.
When the framework was integral to knowledge translation, it guided action in idiosyncratic ways and there was theory fidelity. Prevailing wisdom encourages the use of theories, models and conceptual frameworks, yet their application is less evident in practice. This may be an artefact of reporting, indicating that prospective, primary research is needed to explore the real value of the KTA Framework and similar tools. PMID:25417046

  17. A Qualitative Analysis Evaluating The Purposes And Practices Of Clinical Documentation

    PubMed Central

    Ho, Y.-X.; Gadd, C. S.; Kohorst, K.L.; Rosenbloom, S.T.

    2014-01-01

    Objectives: An important challenge for biomedical informatics researchers is determining the best approach for healthcare providers to use when generating clinical notes in settings where electronic health record (EHR) systems are used. The goal of this qualitative study was to explore healthcare providers' and administrators' perceptions about the purpose of clinical documentation and their own documentation practices. Methods: We conducted seven focus groups with a total of 46 subjects, composed of healthcare providers and administrators, to collect knowledge, perceptions and beliefs about documentation from those who generate and review notes, respectively. Data were analyzed using inductive analysis to probe and classify impressions collected from focus group subjects. Results: We observed that both healthcare providers and administrators believe that documentation serves five primary domains: clinical, administrative, legal, research, and education. These purposes are tied closely to the nature of the clinical note as a document shared by multiple stakeholders, which can be a source of tension for all parties who must use the note. Most providers reported using a combination of methods to complete their notes in a timely fashion without compromising patient care. While all administrators reported relying on computer-based documentation tools to review notes, they expressed a desire for a more efficient method of extracting relevant data. Conclusions: Although clinical documentation has utility and is valued highly by its users, the development and successful adoption of a clinical documentation tool largely depends on its ability to be smoothly integrated into the provider's busy workflow, while allowing the provider to generate a note that communicates effectively and efficiently with multiple stakeholders. PMID:24734130

  18. Reforming High School Science for Low-Performing Students Using Inquiry Methods and Communities of Practice

    NASA Astrophysics Data System (ADS)

    Bolden, Marsha Gail

    Some schools fall short of the high demand to increase science scores on state exams because low-performing students enter high school unprepared for high school science. Low-performing students are not successful in high school for many reasons; however, using inquiry methods has improved students' understanding of science concepts. The purpose of this qualitative research study was to investigate teachers' lived experiences with using inquiry methods to motivate low-performing high school science students in an inquiry-based program called Xtreem Science. Fifteen teachers were selected from the Xtreem Science program, a program designed to assist teachers in motivating struggling science students. The research questions involved understanding (a) teachers' experiences in using inquiry methods, (b) challenges teachers face in using inquiry methods, and (c) how teachers describe students' responses to inquiry methods. Data collection and analysis aimed at capturing and understanding the teachers' feelings, perceptions, and attitudes in their lived experience of teaching with inquiry methods and of motivating struggling students. Analysis of interview responses revealed that teachers had good experiences with inquiry, expressed that inquiry impacted their teaching style and approach to topics, and observed that using inquiry methods improved student learning. Inquiry gave low-performing students opportunities to catch up and learn information that moved them to the next level of science courses. Implications for positive social change include providing teachers and school district leaders with information to help improve the performance of low-performing science students.

  19. A Content Analysis of "Learning Disabilities Research & Practice:" 1991-2007

    ERIC Educational Resources Information Center

    Vostal, Brooks R.; Hughes, Charles A.; Ruhl, Kathy L.; Benedek-Wood, Elizabeth; Dexter, Douglas D.

    2008-01-01

    The purpose of this review was to analyze the content of "Learning Disabilities Research & Practice" to identify prevalent topics and types and proportions of articles published from 1991 through 2007. Also analyzed was the nature of the research reported including designs, participants, interveners, and settings. Analysis indicated that the three…

  20. Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of 'Complete Streets' Practices

    EPA Science Inventory

    Abstract: Using System Dynamics Analysis for Evaluating the Sustainability of 'Complete Streets' Practices. Primary Author: Nicholas R. Flanders, Research Triangle Park, NC. Topic categ...

  1. 3D Scanning Technology as a Standard Archaeological Tool for Pottery Analysis: Practice and Theory

    E-print Network

    3D Scanning Technology as a Standard Archaeological Tool for Pottery Analysis: Practice and Theory ... a project where 3D scanning technology, and newly developed software to optimally identify the rotation axis, ... While 3D scanning technology has made impressive strides in the last decade, its applications ...

  2. 3D scanning technology as a standard archaeological tool for pottery analysis: practice and theory

    E-print Network

    Smilansky, Uzy

    3D scanning technology as a standard archaeological tool for pottery analysis: practice and theory. 3D scanning technology, and newly developed software to optimally identify the rotation axis of wheel-produced pottery ... sites and periods were scanned, their axis of symmetry computed, and their mean profiles drawn

  3. Identifying Evidence-Based Practices in Special Education through High Quality Meta-Analysis

    ERIC Educational Resources Information Center

    Friedt, Brian

    2012-01-01

    The purpose of this study was to determine if meta-analysis can be used to enhance efforts to identify evidence-based practices (EBPs). In this study, the quality of included studies acted as the moderating variable. I used the quality indicators for experimental and quasi-experimental research developed by Gersten, Fuchs, Coyne, Greenwood, and…

  4. Practical Semantic Analysis of Web Sites and Documents

    E-print Network

    Paris-Sud XI, Université de

    Practical Semantic Analysis of Web Sites and Documents. Thierry ... We make a parallel between programs and Web sites. We present some examples of semantic constraints ... semantics, logic programming, web sites, information system, knowledge management, content management

  5. The State, Legal Rigor and the Poor: The Daily Practice of Welfare Control Social Analysis, forthcoming.

    E-print Network

    Paris-Sud XI, Université de

    The State, Legal Rigor and the Poor: The Daily Practice of Welfare Control. Social Analysis, forthcoming. Abstract: The state comes into being through its acts. This paper focuses on state acts par excellence, by which the state controls its population. It is based

  6. An NCME Instructional Module on Developing and Administering Practice Analysis Questionnaires

    ERIC Educational Resources Information Center

    Raymond, Mark R.

    2005-01-01

    The purpose of a credentialing examination is to assure the public that individuals who work in an occupation or profession have met certain standards. To be consistent with this purpose, credentialing examinations must be job related, and this requirement is typically met by developing test plans based on an empirical job or practice analysis.…

  7. Hotel companies' environmental policies and practices: a content analysis of their web pages

    Microsoft Academic Search

    2012-01-01

    Purpose – The purpose of this paper is to analyze the environmental management policies and practices of the top 50 hotel companies as disclosed on their corporate web sites. Design/methodology/approach – This study employed content analysis to review the web sites of the top 50 hotel companies as defined herein. Findings – Only 46 per cent of the selected hotel

  8. VMCAnalytic: Developing a Collaborative Video Analysis Tool for Education Faculty and Practicing Educators

    Microsoft Academic Search

    Grace Agnew; Chad M. Mills; Carolyn A. Maher

    2010-01-01

    This paper describes the genesis, design and prototype development of the VMCAnalytic, a repository-based video annotation and analysis tool for education. The VMCAnalytic is a flexible, extensible analytic tool that is unique in its integration into an open source repository architecture to transform a resource discovery environment into an interactive collaborative where practicing teachers and faculty researchers can analyze and

  9. Using Performance Analysis for Training in an Organization Implementing ISO-9000 Manufacturing Practices: A Case Study.

    ERIC Educational Resources Information Center

    Kunneman, Dale E.; Sleezer, Catherine M.

    2000-01-01

    This case study examines the application of the Performance Analysis for Training (PAT) Model in an organization that was implementing ISO-9000 (International Standards Organization) processes for manufacturing practices. Discusses the interaction of organization characteristics, decision maker characteristics, and analyst characteristics to…

  10. An Evaluation of EPIC's Analysis of School Practice & Knowledge System. The Effective Practice Incentive Community (EPIC). Research Brief

    ERIC Educational Resources Information Center

    Sloan, Kay; Pereira-Leon, Maura; Honeyford, Michelle

    2012-01-01

    Established in 2006 by New Leaders for New Schools[TM], the Effective Practice Incentive Community (EPIC) initiative rewards high-need urban schools showing significant gains in student achievement. In exchange, schools agree to share the practices helping to drive those gains, which they do through an in-depth study of practice, aided by the EPIC…

  11. Importance of meta-analysis and practical obstacles in oncological and epidemiological studies: statistics very close but also far!

    PubMed

    Tanriverdi, Ozgur; Yeniceri, Nese

    2015-01-01

    Studies of epidemiological and prognostic factors are very important for oncology practice. There is a rapidly increasing amount of research and resultant knowledge in the scientific literature. This means that health professionals have major challenges in accessing relevant information and they increasingly require best available evidence to make their clinical decisions. Meta-analyses of prognostic and other epidemiological factors are very practical statistical approaches to define clinically important parameters. However, they also involve many obstacles in terms of data collection, standardization of results from multiple centers, bias, and commentary for interpretation. In this paper, the obstacles of meta-analysis are briefly reviewed, and potential problems with this statistical method are discussed. PMID:25735371

  12. Three-dimensional Stress Analysis Using the Boundary Element Method

    NASA Technical Reports Server (NTRS)

    Wilson, R. B.; Banerjee, P. K.

    1984-01-01

    The boundary element method is to be extended (as part of the NASA Inelastic Analysis Methods program) to the three-dimensional stress analysis of gas turbine engine hot section components. The analytical basis of the method (as developed in elasticity) is outlined, its numerical implementation is summarized, and the approaches to be followed in extending the method to include inelastic material response are indicated.

  13. Instruments and Methods Singular spectrum analysis and envelope detection

    E-print Network

    Moore, John

    Singular spectrum analysis and envelope detection: methods of enhancing the signal-to-noise ratio in radargrams. The method uses singular spectrum analysis (SSA); it operates in the time domain, and as such is suited to time series containing quasi-periodic signals, rather...

  14. Analysis of an inquiry-oriented inservice program in affecting science teaching practices

    NASA Astrophysics Data System (ADS)

    Santamaria Makang, Doris

    This study was an examination of how science teachers' teaching abilities (content and pedagogical knowledge and skills) were affected by an inquiry-oriented science education professional development program. The study researched the characteristics of an inservice program, Microcosmos, designed to equip teachers with new perspectives on how to stimulate students' learning and to promote a self-reflective approach to the implementation of instructional practices, leading to improved teacher and student roles in the science classroom. The Microcosmos Inservice Program, which focused on the use of microorganisms as a vehicle to teach science in middle and high school grades, was funded by the National Science Foundation and developed by the Microcosmos Project based at the School of Education, Boston University. The teacher-training program had as its main objective to show teachers and other educators how the smallest life forms, the microbes, can be a usable and dynamic way to stimulate science interest in students of all ages. It combines and integrates a number of training components that appear to be consistent with the recommendations listed in the major reform initiatives. The goal of the study was to explore whether the program provoked any change(s) in the pedagogical practices of teachers over time, and if these changes fostered inquiry-based practices in the classroom. The exploratory analysis used a qualitative methodology that followed a longitudinal design for the collection of the data gathered from a sample of 31 participants. The data were collected in two phases. Phase One, the Case History group, involved 5 science teachers over a period of seven years. Phase Two, the Expanded Teacher sample, involved 26 teachers (22 new teachers plus four teachers from Phase One) contacted at two different points in time during the study.
Multiple data sources allowed for the collection of a varied and rigorous set of data for each individual in the sample. The primary data source was semi-structured interviews. Secondary data sources included pre- and post- on-site visits, classroom observations, teacher's self-report protocols and questionnaires, and documents and examples of teacher-work developed during the inservice training. The data were examined for evidence of change on: teachers' self-reported content-specific gains, teachers' self-reported and observed changes in their teaching methods and approach to curriculum, and the teachers' self-reported and observed changes in classroom practices as a result of the content and the pedagogy acting together and supplementing each other. A major finding of the study confirmed the benefits of inservice activities with an integral focus of science content and pedagogy on enhancing teachers' approach to instruction. The findings give renewed emphasis to the importance that inquiry-based practices for working with teachers, combined with a specific subject-matter focus, have in designing effective professional development. This combined approach, in some instances, contributed to important gains in the pedagogical content knowledge that teachers needed in order to effectively implement the Microcosmos learning experiences.

  15. Methods for analysis of fluoroquinolones in biological fluids

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Methods for analysis of 10 selected fluoroquinolone antibiotics in biological fluids are reviewed. Approaches for sample preparation, detection methods, limits of detection and quantitation and recovery information are provided for both single analyte and multi-analyte fluoroquinolone methods....

  16. A Spatial Load Forecasting Method Based on the Theory of Clustering Analysis

    NASA Astrophysics Data System (ADS)

    Bai, Xiao; Peng-wei, Guo; Gang, Mu; Gan-gui, Yan; Ping, Li; Hong-wei, Cheng; Jie-fu, Li; Yang, Bai

    Spatial load forecasting based on time series analysis usually divides the prediction zone into cells, forecasts the load of each cell with a time series model, and aggregates the results into the whole spatial load. Many prediction models based on time series analysis exist, but no single model minimizes the prediction error for every cell, so the accuracy of the spatial load forecast is not guaranteed. Thus, a spatial load forecasting method based on the theory of clustering analysis is proposed here. First, the method analyzes the forecast results and the relative prediction error of each cell under the different load forecasting models. Second, the cells are clustered according to their best forecast models, and cells in the same cluster use the same prediction model to forecast their loads for the target year. The results of a practical example show that the method is correct and effective.
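    As an illustrative sketch of this cluster-by-best-model idea (all data and both candidate models below are hypothetical, not from the paper), one can back-test each candidate model on each cell's history and group the cells that share a best model:

```python
import numpy as np

# Hypothetical per-cell historical loads (rows = cells, columns = years).
rng = np.random.default_rng(0)
history = np.cumsum(rng.uniform(0.5, 1.5, size=(6, 8)), axis=1)

def linear_model(series):
    """Extrapolate one step ahead with a linear trend fit."""
    t = np.arange(len(series))
    slope, intercept = np.polyfit(t, series, 1)
    return slope * len(series) + intercept

def growth_model(series):
    """Extrapolate one step ahead with the last observed growth ratio."""
    return series[-1] * (series[-1] / series[-2])

models = {"linear": linear_model, "growth": growth_model}

# Back-test each model on the last known year and keep, per cell,
# the model with the smallest relative prediction error.
best = {}
for cell, series in enumerate(history):
    errors = {
        name: abs(f(series[:-1]) - series[-1]) / series[-1]
        for name, f in models.items()
    }
    best[cell] = min(errors, key=errors.get)

# Cells sharing a best model form one cluster and are then forecast
# for the target year with that cluster's common model.
clusters = {}
for cell, name in best.items():
    clusters.setdefault(name, []).append(cell)
```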

  17. A Novel Method for Dissolved Phosphorus Analysis

    NASA Astrophysics Data System (ADS)

    Berry, J. M.; Spiese, C. E.

    2012-12-01

    High phosphorus loading is a major problem in the Great Lakes watershed. Phosphate enters waterways via both point and non-point sources (e.g., runoff, tile drainage, etc.), promoting eutrophication, and ultimately leading to algal blooms, hypoxia and loss of aquatic life. Quantification of phosphorus loading is typically done using the molybdenum blue method, which is known to have significant drawbacks. The molybdenum blue method requires strict control of timing, involves toxic reagents that have limited shelf-life, and is generally unable to accurately measure sub-micromolar concentrations. This study aims to develop a novel reagent that will overcome many of these problems. Ethanolic europium(III) chloride and 8-hydroxyquinoline-5-sulfonic acid (hqs) were combined to form the bis-hqs complex (Eu-hqs). Eu-hqs was synthesized as the dipotassium salt via a simple one-pot procedure. This complex was found to be highly fluorescent (λex = 360 nm, λem = 510 nm) and exhibited a linear response upon addition of monohydrogen phosphate. The linear response ranged from 0.5 to 25 µM HPO4²⁻ (15.5 to 775 µg P L⁻¹). It was also determined that Eu-hqs formed a 1:1 complex with phosphate. Maximum fluorescence was found at a pH of 8.50, and few interferences from other ions were found. The shelf-life of the reagent was at least one month, twice as long as most of the molybdenum blue reagent formulations. In the future, field tests will be undertaken in local rivers, lakes, and wetlands to determine the applicability of the complex to real-world analysis.
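    A reagent with a linear fluorescence response like this is used through a calibration line. The sketch below shows the generic workflow with made-up standard concentrations and intensities (not the study's data):

```python
import numpy as np

# Hypothetical calibration standards (µM phosphate) and measured
# fluorescence intensities for an Eu-hqs-type reagent; the numbers
# are illustrative only.
conc_uM = np.array([0.5, 2.0, 5.0, 10.0, 25.0])
intensity = np.array([120.0, 310.0, 700.0, 1350.0, 3290.0])

# Least-squares line through the standards: I = m * c + b.
m, b = np.polyfit(conc_uM, intensity, 1)

def phosphate_from_intensity(i):
    """Invert the calibration line to estimate concentration (µM)."""
    return (i - b) / m

# An unknown sample read at 900 intensity units:
c = phosphate_from_intensity(900.0)
```

    Outside the linear range (here 0.5-25 µM) the inversion is not trustworthy and the sample would need dilution or a different method.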

  18. Quality changes of anchovy (Stolephorus heterolobus) under refrigerated storage of different practical industrial methods in Thailand.

    PubMed

    Chotimarkorn, Chatchawan

    2014-02-01

    Quality changes of anchovy (Stolephorus heterolobus) muscle during 7 days of refrigerated storage with and without ice were studied using several indicators: changes in ATP degradation products, K-value, TVB-N, TMA-N, lactic acid, biogenic amines, and sensory and microbiological analysis. During the 7 days of refrigerated storage with and without ice, K-value, TVB-N, TMA-N and D,L-lactic acid contents increased with longer storage time (p < 0.05). The major biogenic amines found in anchovy muscle during refrigerated storage were cadaverine, agmatine and tyramine, followed by putrescine and histamine. In the sensory evaluation of skin and external odour, progressive decreases were observed as refrigeration time progressed. Storage of anchovy with ice resulted in better maintenance of sensory quality, better control of microbial activity, and a slowing down of biochemical degradation mechanisms. These results support the use of refrigerated storage with ice as a practical preliminary chilling step for anchovy during industrial processing. PMID:24493885

  19. Evaluation of targeting methods for implementation of best management practices in the Saginaw River Watershed.

    PubMed

    Giri, Subhasis; Nejadhashemi, A Pouyan; Woznicki, Sean A

    2012-07-30

    Increasing concerns regarding water quality in the Great Lakes region are mainly due to changes in urban and agricultural landscapes. Both point and non-point sources contribute pollution to Great Lakes surface waters. Best management practices (BMPs) are a common tool used to reduce both point and non-point source pollution and improve water quality. Meanwhile, identification of critical source areas of pollution and placement of BMPs play an important role in pollution reduction. The goal of this study is to evaluate the performance of different targeting methods in 1) identifying priority areas (high, medium, and low) based on various factors such as pollutant concentration, load, and yield, 2) comparing pollutant (sediment, total nitrogen-TN, and total phosphorus-TP) reduction in priority areas defined by all targeting methods, and 3) determining the BMP relative sensitivity index among all targeting methods. Ten BMPs were implemented in the Saginaw River Watershed using the Soil and Water Assessment Tool (SWAT) model following identification of priority areas. Each targeting method selected distinct high priority areas based on the methodology of implementation. The concentration-based targeting method was most effective at reduction of TN and TP, likely because it selected the greatest area of high priority for BMP implementation. The subbasin load targeting method was most effective at reducing sediment because it tended to select large, highly agricultural subbasins for BMP implementation. When implementing BMPs, native grass and terraces were generally the most effective, while conservation tillage and residue management had limited effectiveness. The BMP relative sensitivity index revealed that most combinations of targeting methods and priority areas resulted in a proportional decrease in pollutant load from the subbasin level and watershed outlet. 
However, the concentration and yield methods were more effective at subbasin reduction, while the stream load method was more effective at reducing pollutants at the watershed outlet. The results of this study indicate that emphasis should be placed on selection of the proper targeting method and BMP to meet the needs and goals of a BMP implementation project because different targeting methods produce varying results. PMID:22459068

  20. A new interpretation and practical aspects of the direct-methods modulus sum function. VIII.

    PubMed

    Rius, Jordi; Torrelles, Xavier; Miravitlles, Carles; Amigó, J M; Reventós, M M

    2002-01-01

    Since the first publication of the direct-methods modulus sum function [Rius (1993). Acta Cryst. A49, 406-409], the application of this function to a variety of situations has been shown in a series of seven subsequent papers. In this way, much experience about this function and its practical use has been gained. It is thought by the authors that it is now the right moment to publish a more complete study of this function which also considers most of this practical knowledge. The first part of the study relates, thanks to a new interpretation, this function to other existing phase-refinement functions, while the second shows, with the help of test calculations on a selection of crystal structures, the behaviour of the function for two different control parameters. In this study, the principal interest is focused on the function itself and not on the optimization procedure which is based on a conventional sequential tangent formula refinement. The results obtained are quite satisfactory and seem to indicate that, when combined with more sophisticated optimization algorithms, the application field of this function could be extended to larger structures than those used for the test calculations. PMID:11752759

  1. Optimum compression to ventilation ratios in CPR under realistic, practical conditions: a physiological and mathematical analysis

    Microsoft Academic Search

    Charles F. Babbs; Karl B. Kern

    2002-01-01

    Objective: To develop and evaluate a practical formula for the optimum ratio of compressions to ventilations in cardiopulmonary resuscitation (CPR). The optimum value of a variable is that for which a desired result is maximized. Here the desired result is assumed to be either oxygen delivery to peripheral tissues or a combination of oxygen delivery and waste product removal. Method:

  2. An Analysis of Agricultural Mechanics Safety Practices in Agricultural Science Laboratories.

    ERIC Educational Resources Information Center

    Swan, Michael K.

    North Dakota secondary agricultural mechanics instructors were surveyed regarding instructional methods and materials, safety practices, and equipment used in the agricultural mechanics laboratory. Usable responses were received from 69 of 89 instructors via self-administered mailed questionnaires. Findings were consistent with results of similar…

  3. Interpreting the Meaning of Grades: A Descriptive Analysis of Middle School Teachers' Assessment and Grading Practices

    ERIC Educational Resources Information Center

    Grimes, Tameshia Vaden

    2010-01-01

    This descriptive, non-experimental, quantitative study was designed to answer the broad question, "What do grades mean?" Core academic subject middle school teachers from one large, suburban school district in Virginia were administered an electronic survey that asked them to report on aspects of their grading practices and assessment methods for…

  4. Assessing performance of conservation-based Best Management Practices: Coarse vs. fine-scale analysis

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Background/Questions/Methods Animal agriculture in the Spring Creek watershed of central Pennsylvania contributes sediment to the stream and ultimately to the Chesapeake Bay. Best Management Practices (BMPs) such as streambank buffers are intended to intercept sediment moving from heavy-use areas to...

  5. Principal component analysis: a method for determining the essential dynamics of proteins.

    PubMed

    David, Charles C; Jacobs, Donald J

    2014-01-01

    It has become commonplace to employ principal component analysis to reveal the most important motions in proteins. This method is more commonly known by its acronym, PCA. While most popular molecular dynamics packages inevitably provide PCA tools to analyze protein trajectories, researchers often make inferences of their results without having insight into how to make interpretations, and they are often unaware of limitations and generalizations of such analysis. Here we review best practices for applying standard PCA, describe useful variants, discuss why one may wish to make comparison studies, and describe a set of metrics that make comparisons possible. In practice, one will be forced to make inferences about the essential dynamics of a protein without having the desired amount of samples. Therefore, considerable time is spent on describing how to judge the significance of results, highlighting pitfalls. The topic of PCA is reviewed from the perspective of many practical considerations, and useful recipes are provided. PMID:24061923
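    The "standard PCA" the review refers to can be sketched on a toy trajectory as follows; the synthetic data stands in for aligned atomic coordinates, and nothing here is specific to any MD package:

```python
import numpy as np

# Toy "trajectory": 200 frames of 10 coordinates driven by one dominant
# collective motion plus noise (a stand-in for aligned protein coordinates).
rng = np.random.default_rng(1)
mode = rng.normal(size=10)                      # shape of the dominant motion
frames = np.outer(np.sin(np.linspace(0, 6, 200)), mode)
frames += 0.05 * rng.normal(size=frames.shape)  # small uncorrelated noise

# Standard PCA: mean-centre the frames, form the covariance matrix over
# frames, and diagonalise it.
centred = frames - frames.mean(axis=0)
cov = centred.T @ centred / (len(frames) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)          # eigh returns ascending order
eigvals, eigvecs = eigvals[::-1], eigvecs[:, ::-1]

# Fraction of total variance captured by the first ("essential") component:
explained = eigvals[0] / eigvals.sum()

# Project the frames onto the top mode to get its time evolution:
pc1 = centred @ eigvecs[:, 0]
```

    The review's caution about sampling applies directly here: with too few frames, the leading eigenvectors and their variance fractions become unstable, which is why comparison metrics and significance checks matter.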

  6. Practical methods of tracking of nonstationary time series applied to real-world data

    NASA Astrophysics Data System (ADS)

    Nabney, Ian T.; McLachlan, Alan; Lowe, David

    1996-03-01

    In this paper, we discuss some practical implications for implementing adaptable network algorithms applied to non-stationary time series problems. Two real world data sets, containing electricity load demands and foreign exchange market prices, are used to test several different methods, ranging from linear models with fixed parameters, to non-linear models which adapt both parameters and model order on-line. Training with the extended Kalman filter, we demonstrate that the dynamic model-order increment procedure of the resource allocating RBF network (RAN) is highly sensitive to the parameters of the novelty criterion. We investigate the use of system noise for increasing the plasticity of the Kalman filter training algorithm, and discuss the consequences for on-line model order selection. The results of our experiments show that there are advantages to be gained in tracking real world non-stationary data through the use of more complex adaptive models.
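    As a minimal illustration of the role of system noise as a plasticity knob in Kalman-filter training (a simplified linear analogue with synthetic data, not the paper's RAN or extended Kalman filter):

```python
import numpy as np

# Generate a stable AR(2) series; the filter will track its coefficients.
rng = np.random.default_rng(2)
true_w = np.array([1.2, -0.5])        # AR(2) coefficients (hypothetical)
r = 0.1                               # observation-noise variance
x = [0.1, 0.2]
for _ in range(2000):
    x.append(true_w[0] * x[-1] + true_w[1] * x[-2]
             + rng.normal(scale=np.sqrt(r)))

# Kalman filter with the coefficient vector as the state. The system
# noise q inflates the covariance each step, keeping the filter plastic
# enough to follow drifting coefficients in the non-stationary case.
q = 1e-5
w = np.zeros(2)                       # coefficient estimate (the state)
P = np.eye(2)                         # state covariance
for t in range(2, len(x)):
    h = np.array([x[t - 1], x[t - 2]])    # observation vector (regressors)
    P = P + q * np.eye(2)                 # predict: random-walk state model
    e = x[t] - h @ w                      # innovation
    s = h @ P @ h + r                     # innovation variance
    k = P @ h / s                         # Kalman gain
    w = w + k * e                         # correct the coefficients
    P = P - np.outer(k, h) @ P            # covariance update
```

    Setting q = 0 recovers ordinary recursive least squares, which freezes once P shrinks; a larger q tracks changes faster at the cost of noisier estimates.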

  7. The National Criminal Justice Treatment Practices survey: Multilevel survey methods and procedures?

    PubMed Central

    Taxman, Faye S.; Young, Douglas W.; Wiersema, Brian; Rhodes, Anne; Mitchell, Suzanne

    2007-01-01

    The National Criminal Justice Treatment Practices (NCJTP) survey provides a comprehensive inquiry into the nature of programs and services provided to adult and juvenile offenders involved in the justice system in the United States. The multilevel survey design covers topics such as the mission and goals of correctional and treatment programs; organizational climate and culture for providing services; organizational capacity and needs; opinions of administrators and staff regarding rehabilitation, punishment, and services provided to offenders; treatment policies and procedures; and working relationships between correctional and other agencies. The methodology generates national estimates of the availability of programs and services for offenders. This article details the methodology and sampling frame for the NCJTP survey, response rates, and survey procedures. Prevalence estimates of juvenile and adult offenders under correctional control are provided with externally validated comparisons to illustrate the veracity of the methodology. Limitations of the survey methods are also discussed. PMID:17383548

  8. Multi-Spacecraft Turbulence Analysis Methods

    NASA Astrophysics Data System (ADS)

    Horbury, Tim S.; Osman, Kareem T.

    Turbulence is ubiquitous in space plasmas, from the solar wind to supernova remnants, and on scales from the electron gyroradius to interstellar separations. Turbulence is responsible for transporting energy across space and between scales and plays a key role in plasma heating, particle acceleration and thermalisation downstream of shocks. Just as with other plasma processes such as shocks or reconnection, turbulence results in complex, structured and time-varying behaviour which is hard to measure with a single spacecraft. However, turbulence is a particularly hard phenomenon to study because it is usually broadband in nature: it covers many scales simultaneously. One must therefore use techniques to extract information on multiple scales in order to quantify plasma turbulence and its effects. The Cluster orbit takes the spacecraft through turbulent regions with a range of characteristics: the solar wind, magnetosheath, cusp and magnetosphere. In each, the nature of the turbulence (strongly driven or fully evolved; dominated by kinetic effects or largely on fluid scales), as well as characteristics of the medium (thermalised or not; high or low plasma beta; sub- or super-Alfvénic flow), mean that particular techniques are better suited to the analysis of Cluster data in different locations. In this chapter, we consider a range of methods and how they are best applied to these different regions. Perhaps the most studied turbulent space plasma environment is the solar wind; see Bruno and Carbone [2005] and Goldstein et al. [2005] for recent reviews. This is the case for a number of reasons: it is scientifically important for cosmic ray and solar energetic particle scattering and propagation, for example. 
However, perhaps the most significant motivations for studying solar wind turbulence are pragmatic: large volumes of high quality measurements are available; the stability of the solar wind on scales of hours makes it possible to identify statistically stationary intervals to analyse; and, most important of all, the solar wind speed, VSW, is much higher than the local MHD wave speeds. This means that a spacecraft time series is essentially a "snapshot" spatial sample of the plasma along the flow direction, so we can consider measurements at a set of times ti to be at a set of locations in the plasma given by xi = VSW ti. This approximation, known as Taylor's hypothesis, greatly simplifies the analysis of the data. In contrast, in the magnetosheath the flow speed is lower than the wave speed and therefore temporal changes at the spacecraft are due to a complex combination of the plasma moving over the spacecraft and the turbulent fluctuations propagating in the plasma frame. This is also the case for ion and electron kinetic scale turbulence in the solar wind, and it dramatically complicates the analysis of the data. As a result, the application of multi-spacecraft techniques such as k-filtering to Cluster data (see Chapter 5), which make it possible to disentangle the effects of flow and wave propagation, has probably resulted in the greatest increase in our understanding of turbulence in the magnetosheath rather than in the solar wind. We can therefore summarise the key advantages for plasma turbulence analysis of multi-spacecraft data sets such as those from Cluster, compared to single spacecraft data. Multiple sampling points allow us to measure how the turbulence varies in many directions, and on a range of scales, simultaneously, enabling the study of anisotropy in ways that have not previously been possible. 
They also allow us to distinguish between the motion of fluctuations in the plasma and motion of the plasma itself, enabling the study of turbulence in highly disturbed environments such as the magnetosheath. A number of authors have studied turbulence with Cluster data, using different techniques, the choice of which is motivated by the characteristics of the plasma environment in which they are interested. The complexity of both the Cluster data and the problem of turbulence meant that progress early in the mission was rat
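    Taylor's hypothesis as used above amounts to a simple time-to-space mapping; the sketch below uses typical, illustrative numbers:

```python
import numpy as np

# Taylor's hypothesis: when the solar-wind speed V_SW far exceeds the MHD
# wave speeds, samples taken at times t_i map to positions x_i = V_SW * t_i
# along the flow direction. The values here are illustrative.
v_sw_km_s = 400.0                      # typical solar-wind speed
dt_s = 0.25                            # hypothetical instrument cadence
t = np.arange(0.0, 60.0, dt_s)         # one minute of samples
x_km = v_sw_km_s * t                   # spatial positions of the samples

# A frequency f in the time series then corresponds to a spatial scale
# lambda = V_SW / f, so a 1 Hz fluctuation maps to a 400 km structure.
f_hz = 1.0
wavelength_km = v_sw_km_s / f_hz
```

    In the magnetosheath, where the flow is slower than the wave speeds, this mapping fails, which is exactly why the multi-spacecraft techniques discussed in the chapter are needed there.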

  9. Method Development for Analysis of Aspirin Tablets.

    ERIC Educational Resources Information Center

    Street, Kenneth W., Jr.

    1988-01-01

    Develops a lab experiment for introductory instrumental analysis that requires interference studies and optimizing of conditions. Notes the analysis of the aspirin is by visible spectrophotometric assay. Gives experimental details and discussion. (MVL)

  10. Physical methods for intracellular delivery: practical aspects from laboratory use to industrial-scale processing.

    PubMed

    Meacham, J Mark; Durvasula, Kiranmai; Degertekin, F Levent; Fedorov, Andrei G

    2014-02-01

    Effective intracellular delivery is a significant impediment to research and therapeutic applications at all processing scales. Physical delivery methods have long demonstrated the ability to deliver cargo molecules directly to the cytoplasm or nucleus, and the mechanisms underlying the most common approaches (microinjection, electroporation, and sonoporation) have been extensively investigated. In this review, we discuss established approaches, as well as emerging techniques (magnetofection, optoinjection, and combined modalities). In addition to operating principles and implementation strategies, we address applicability and limitations of various in vitro, ex vivo, and in vivo platforms. Importantly, we perform critical assessments regarding (1) treatment efficacy with diverse cell types and delivered cargo molecules, (2) suitability to different processing scales (from single cell to large populations), (3) suitability for automation/integration with existing workflows, and (4) multiplexing potential and flexibility/adaptability to enable rapid changeover between treatments of varied cell types. Existing techniques typically fall short in one or more of these criteria; however, introduction of micro-/nanotechnology concepts, as well as synergistic coupling of complementary method(s), can improve performance and applicability of a particular approach, overcoming barriers to practical implementation. For this reason, we emphasize these strategies in examining recent advances in development of delivery systems. PMID:23813915

  11. Bayesian methods for data analysis in software engineering

    Microsoft Academic Search

    Mohan Sridharan; Akbar Siami Namin

    2010-01-01

    Software engineering researchers analyze programs by applying a range of test cases, measuring relevant statistics and reasoning about the observed phenomena. Though the traditional statistical methods provide a rigorous analysis of the data obtained during program analysis, they lack the flexibility to build a unique representation for each program. Bayesian methods for data analysis, on the other hand, allow for

  12. MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD

    E-print Network

    DeLucia, Evan H.

    MONTE CARLO ANALYSIS: ESTIMATING GPP WITH THE CANOPY CONDUCTANCE METHOD 1. Overview A novel method performed a Monte Carlo Analysis to investigate the power of our statistical approach: i.e. what and Assumptions The Monte Carlo Analysis was performed as follows: · Natural variation. The only study to date

  13. COMPARISON OF METHODS FOR THE ANALYSIS OF PANEL STUDIES

    EPA Science Inventory

    Three different methods of analysis of panels were compared using asthma panel data from a 1970-1971 study done by EPA in Riverhead, New York. The methods were (1) regression analysis using raw attack rates; (2) regression analysis using the ratio of observed attacks to expected ...

  14. Simple and practical method for correcting the inhomogeneous sensitivity of a receiving coil in magnetic particle imaging

    NASA Astrophysics Data System (ADS)

    Murase, Kenya; Banura, Natsuo; Mimura, Atsushi; Nishimoto, Kohei

    2015-03-01

    Magnetic particle imaging is a novel method of imaging the spatial distribution of magnetic nanoparticles. When considering the practical application of magnetic particle imaging, it is important to correct the inhomogeneous sensitivity of the receiving coil together with the feedthrough interference. In this study, we developed a simple and practical method for these corrections in which projection data are multiplied by correction factors obtained by fitting projection data acquired in a blank scan to a sixth-degree polynomial. Phantom experiments suggest that our method can be simply and easily implemented to realize the above corrections.
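    A minimal sketch of the described correction, assuming a 1-D projection and an entirely made-up blank-scan sensitivity profile:

```python
import numpy as np

# Blank-scan projection of a uniform source: its shape reflects the
# receiving coil's inhomogeneous sensitivity (profile is invented).
pos = np.linspace(-1.0, 1.0, 65)
blank = 1.0 - 0.4 * pos**2 + 0.05 * np.sin(7 * pos)

# Fit the blank projection with a sixth-degree polynomial and take its
# reciprocal (normalised to the peak) as the per-position correction.
coeffs = np.polyfit(pos, blank, 6)
fit = np.polyval(coeffs, pos)
correction = fit.max() / fit

# A measured projection is corrected by pointwise multiplication.
measured = blank * np.exp(-2.0 * pos**2)   # stand-in object projection
corrected = measured * correction
```

    Fitting a smooth polynomial rather than dividing by the raw blank scan keeps the blank scan's own noise out of the corrected projections.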

  15. Applying the 5-Step Method to Children and Affected Family Members: Opportunities and Challenges within Policy and Practice

    ERIC Educational Resources Information Center

    Harwin, Judith

    2010-01-01

    The main aim of this article is to consider how the 5-Step Method could be developed to meet the needs of affected family members (AFMs) with children under the age of 18. This would be an entirely new development. This article examines opportunities and challenges within practice and policy and makes suggestions on how the Method could be taken…

  16. Development and analysis of atomistic-to-continuum coupling methods

    E-print Network

    Weinberger, Hans

    Development and analysis of atomistic-to-continuum coupling methods Mitchell Luskin University analysis has clarified the relation between the various methods and the sources of error. The development. Materials scientists have proposed many methods to compute solutions to these mul- tiscale problems

  17. BIBLIOGRAPHY ON HAZARDOUS MATERIALS ANALYSIS METHODS

    EPA Science Inventory

    A comprehensive annotated bibliography of analytical methods for 67 of the chemicals on the Environmental Protection Agency's Hazardous Substances List is presented. Literature references were selected and abstracts of analytical methods were compiled to facilitate rapid and accu...

  18. Combining the soilwater balance and water-level fluctuation methods to estimate natural groundwater recharge: Practical aspects

    USGS Publications Warehouse

    Sophocleous, M.A.

    1991-01-01

    A relatively simple and practical approach for calculating groundwater recharge in semiarid plain environments with a relatively shallow water table, such as the Kansas Prairies, is outlined. Major uncertainties in the Darcian, water balance, and groundwater fluctuation analysis approaches are outlined, and a combination methodology for reducing some of the uncertainties is proposed. By combining a storm-based soilwater balance (lasting several days) with the resulting water table rise, effective storativity values of the region near the water table are obtained. This combination method is termed the 'hybrid water-fluctuation method'. Using a simple average of several such estimates results in a site-calibrated effective storativity value that can be used to translate each major water-table rise tied to a specific storm period into a corresponding amount of groundwater recharge. Examples of soilwater balance and water-level fluctuation analyses based on field-measured data from Kansas show that the proposed methodology gives better and more reliable results than either of the two well-established approaches used singly. © 1991.
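    The hybrid water-fluctuation idea reduces to simple arithmetic; the sketch below uses invented storm numbers purely for illustration:

```python
# Hybrid water-fluctuation method as a back-of-envelope sketch: a
# storm-period soil-water balance gives the water reaching the water
# table; dividing by the observed rise yields an effective storativity,
# whose site-calibrated average then converts later rises into recharge.
# All numbers below are illustrative, not from the paper.

# Calibration storms: (drainage below root zone in mm, water-table rise in m)
storms = [(18.0, 0.12), (30.0, 0.21), (12.0, 0.09)]

# Effective storativity per storm: Sy = drainage / rise (mm -> m first).
sy_values = [(d / 1000.0) / rise for d, rise in storms]
sy = sum(sy_values) / len(sy_values)      # site-calibrated average

# Translate a new storm's 0.15 m water-table rise into recharge:
recharge_m = sy * 0.15
recharge_mm = recharge_m * 1000.0
```

    Averaging the per-storm storativity estimates is what makes the method "site-calibrated": individual storms are noisy, but the mean stabilises the conversion factor.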

  19. Learning in the Permaculture Community of Practice in England: An Analysis of the Relationship between Core Practices and Boundary Processes

    ERIC Educational Resources Information Center

    Ingram, Julie; Maye, Damian; Kirwan, James; Curry, Nigel; Kubinakova, Katarina

    2014-01-01

    Purpose: This article utilizes the Communities of Practice (CoP) framework to examine learning processes among a group of permaculture practitioners in England, specifically examining the balance between core practices and boundary processes. Design/methodology/approach: The empirical basis of the article derives from three participatory workshops…

  20. METHODS OF FREQUENCY ANALYSIS OF A COMPLEX MAMMALIAN VOCALISATION

    Microsoft Academic Search

    SAFI K. DARDEN; SIMON B. PEDERSEN; TORBEN DABELSTEEN

    2003-01-01

    The prevalence of complex acoustic structures in mammalian vocalisations can make it difficult to quantify frequency characteristics. We describe two methods developed for the frequency analysis of a complex swift fox Vulpes velox vocalisation, the barking sequence: (1) autocorrelation function analysis and (2) instantaneous frequency analysis. The autocorrelation function analysis results in an energy density spectrum of the signal's averaged

  1. Nurses’ self-efficacy and practices relating to weight management of adult patients: a path analysis

    PubMed Central

    2013-01-01

    Background Health professionals play a key role in the prevention and treatment of excess weight and obesity, but many have expressed a lack of confidence in their ability to manage obese patients, with their delivery of weight-management care remaining limited. The specific mechanism underlying inadequate practices in professional weight management remains unclear. The primary purpose of this study was to examine a self-efficacy theory-based model in understanding Registered Nurses’ (RNs) professional performance relating to weight management. Methods A self-report questionnaire was developed based upon the hypothesized model and administered to a convenience sample of 588 RNs. Data were collected regarding socio-demographic variables, psychosocial variables (attitudes towards obese people, professional role identity, teamwork beliefs, perceived skills, perceived barriers and self-efficacy) and professional weight management practices. Structural equation modeling was conducted to identify correlations between the above variables and to test the goodness of fit of the proposed model. Results The survey response rate was 71.4% (n = 420). The respondents reported a moderate level of weight management practices. Self-efficacy directly and positively predicted the weight management practices of the RNs (β = 0.36, p … practices. The final model constructed in this study demonstrated a good fit to the data [χ2(14) = 13.90, p = 0.46; GFI = 0.99; AGFI = 0.98; NNFI = 1.00; CFI = 1.00; RMSEA = 0.00; AIC = 57.90], accounting for 38.4% and 43.2% of the variance in weight management practices and self-efficacy, respectively. Conclusions Self-efficacy theory appears to be useful in understanding the weight management practices of RNs. Interventions targeting the enhancement of self-efficacy may be effective in promoting RNs’ professional performance in managing overweight and obese patients. PMID:24304903

  2. The method of attributes for data flow analysis

    Microsoft Academic Search

    Wayne A. Babich; Mehdi Jazayeri

    1978-01-01

    A new technique for global data flow analysis, called the method of attributes, is introduced. The technique is iterative and operates on a parse tree representation of the program. Application to dead variable and available expression analysis is shown.

  3. DISCRETE FUNCTIONAL ANALYSIS TOOLS FOR DISCONTINUOUS GALERKIN METHODS WITH APPLICATION

    E-print Network

    Paris-Sud XI, Université de

    DISCRETE FUNCTIONAL ANALYSIS TOOLS FOR DISCONTINUOUS GALERKIN METHODS WITH APPLICATION functional analysis tools are used to prove the conver- gence of Discontinuous Galerkin approximations and a conservative one based on a nonstandard modification of the pressure. 1. Introduction Discontinuous Galerkin

  4. Vitamin D Status of Clinical Practice Populations at Higher Latitudes: Analysis and Applications

    PubMed Central

    Genuis, Stephen J.; Schwalfenberg, Gerry K.; Hiltz, Michelle N.; Vaselenak, Sharon A.

    2009-01-01

    Background: Inadequate levels of vitamin D (VTD) throughout the life cycle from the fetal stage to adulthood have been correlated with elevated risk for assorted health afflictions. The purpose of this study was to ascertain VTD status and associated determinants in three clinical practice populations living in Edmonton, Alberta, Canada - a locale with latitude of 53°30’N, where sun exposure from October through March is often inadequate to generate sufficient vitamin D. Methods: To determine VTD status, 1,433 patients from three independent medical offices in Edmonton had levels drawn for 25(OH)D as part of their medical assessment between Jun 2001 and Mar 2007. The relationship between demographic data and lifestyle parameters with VTD status was explored. 25(OH)D levels were categorized as follows: (1) Deficient: <40 nmol/L; (2) Insufficient (moderate to mild): 40 to <80 nmol/L; and (3) Adequate: 80–250 nmol/L. Any cases <25 nmol/L were subcategorized as severely deficient for purposes of further analysis. Results: 240 (16.75% of the total sample) of 1,433 patients were found to be VTD ‘deficient’ of which 48 (3.35% of the overall sample) had levels consistent with severe deficiency. 738 (51.5% of the overall sample) had ‘insufficiency’ (moderate to mild) while only 31.75% had ‘adequate’ 25(OH)D levels. The overall mean for 25(OH) D was 68.3 with SD=28.95. VTD status was significantly linked with demographic and lifestyle parameters including skin tone, fish consumption, milk intake, sun exposure, tanning bed use and nutritional supplementation. Conclusion: A high prevalence of hypovitaminosis-D was found in three clinical practice populations living in Edmonton. In view of the potential health sequelae associated with widespread VTD inadequacy, strategies to facilitate translation of emerging epidemiological information into clinical intervention need to be considered in order to address this public health issue. 
A suggested VTD supplemental intake level is presented for consideration. PMID:19440275

  5. Methods for Mediation Analysis with Missing Data

    ERIC Educational Resources Information Center

    Zhang, Zhiyong; Wang, Lijuan

    2013-01-01

    Despite wide applications of both mediation models and missing data techniques, formal discussion of mediation analysis with missing data is still rare. We introduce and compare four approaches to dealing with missing data in mediation analysis including listwise deletion, pairwise deletion, multiple imputation (MI), and a two-stage maximum…
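    The contrast between these deletion strategies is easy to see in a small sketch. Below, a hypothetical single-mediator data set (all numbers invented) is analyzed after listwise deletion, the simplest of the four strategies: rows with any missing value are dropped, and the indirect effect is the product of the two regression paths a and b.

```python
import statistics as st

# Hypothetical toy data for a single-mediator model X -> M -> Y;
# None marks a missing observation.
X = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
M = [2.8, 3.9, None, 8.4, 10.1, 12.5]
Y = [4.1, 4.8, 7.2, None, 11.0, 12.9]

# Listwise deletion: keep only the rows where every variable is observed.
rows = [(x, m, y) for x, m, y in zip(X, M, Y) if None not in (x, m, y)]
xs, ms, ys = ([r[i] for r in rows] for i in range(3))

def ssd(u, v):
    """Sum of centered cross-products."""
    mu, mv = st.fmean(u), st.fmean(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v))

a = ssd(xs, ms) / ssd(xs, xs)                 # path X -> M
# Path b: coefficient of M in the regression Y ~ X + M,
# solved from the 2x2 normal equations on centered data.
sxx, sxm, smm = ssd(xs, xs), ssd(xs, ms), ssd(ms, ms)
sxy, smy = ssd(xs, ys), ssd(ms, ys)
b = (sxx * smy - sxm * sxy) / (sxx * smm - sxm ** 2)
indirect = a * b          # with this toy data, a is near 2 and b near 1
```

    Pairwise deletion would instead compute each sum of cross-products from all rows where that particular pair is observed, which uses more data per quantity but can yield mutually inconsistent covariance estimates.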

  6. Multiscale Methods for Nuclear Reactor Analysis

    NASA Astrophysics Data System (ADS)

    Collins, Benjamin S.

    The ability to accurately predict local pin powers in nuclear reactors is necessary to understand the mechanisms that cause fuel pin failure during steady state and transient operation. In the research presented here, methods are developed to improve the local solution using high order methods with boundary conditions from a low order global solution. Several different core configurations were tested to determine the improvement in the local pin powers compared with the standard techniques, which use diffusion theory and pin power reconstruction (PPR). Two different multiscale methods were developed and analyzed: the post-refinement multiscale method and the embedded multiscale method. The post-refinement multiscale methods use the global solution to determine boundary conditions for the local solution. The local solution is solved using either a fixed boundary source or an albedo boundary condition; this solution is "post-refinement" and thus has no impact on the global solution. The embedded multiscale method allows the local solver to change the global solution to provide an improved global and local solution. The post-refinement multiscale method is assessed using three core designs. When the local solution has more energy groups, the fixed source method has some difficulties near the interface; however, the albedo method works well for all cases. In order to remedy the issue with boundary condition errors for the fixed source method, a buffer region is used to act as a filter, which decreases the sensitivity of the solution to the boundary condition. Both the albedo and fixed source methods benefit from the use of a buffer region. Unlike the post-refinement method, the embedded multiscale method alters the global solution. The ability to change the global solution allows for refinement in areas where the errors in the few group nodal diffusion are typically large. 
The embedded method is shown to improve the global solution when it is applied to a MOX/LEU assembly interface, the fuel/reflector interface, and assemblies where control rods are inserted. The embedded method also allows for multiple solution levels to be applied in a single calculation. The addition of intermediate levels to the solution improves the accuracy of the method. Both multiscale methods considered here have benefits and drawbacks, but both can provide improvements over the current PPR methodology.

  7. Dot-blot-SNP analysis for practical plant breeding and cultivar identification in rice

    Microsoft Academic Search

    K. Shirasawa; S. Shiokai; M. Yamaguchi; S. Kishitani; T. Nishio

    2006-01-01

    We report dot-blot hybridization with allele-specific oligonucleotides for single nucleotide polymorphisms (SNPs) analysis to be applicable for practical plant breeding and cultivar identification. Competitive hybridization of a digoxigenin-labeled oligonucleotide having the sequence of a mutant allele (or a wild-type allele) together with an unlabeled oligonucleotide having the sequence of a wild-type allele (or a mutant allele) was highly effective to

  8. Mobile IPv6 Deployments: Graph-based Analysis and practical Guidelines

    E-print Network

    Magnien, Clémence

    Mobile IPv6 Deployments: Graph-based Analysis and Practical Guidelines. Guillaume Valadon (LIP6, F-75005, Paris, France; TOYOTA InfoTechnology Center, U.S.A., Inc.). Abstract: Mobile IPv6 is used to provide permanent IP addresses to end-users of WiMAX and 3GPP2. Mobile IPv6 relies on a specific router

  9. A method of nonlinear analysis in the frequency domain.

    PubMed Central

    Victor, J; Shapley, R

    1980-01-01

    A method is developed for the analysis of nonlinear biological systems based on an input temporal signal that consists of a sum of a large number of sinusoids. Nonlinear properties of the system are manifest by responses at harmonics and intermodulation frequencies of the input frequencies. The frequency kernels derived from these nonlinear responses are similar to the Fourier transforms of the Wiener kernels. Guidelines for the choice of useful input frequency sets, and examples satisfying these guidelines, are given. A practical algorithm for varying the relative phases of the input sinusoids to separate high-order interactions is presented. The utility of this technique is demonstrated with data obtained from a cat retinal ganglion cell of the Y type. For a high spatial frequency grating, the entire response is contained in the even-order nonlinear components. Even at low contrast, fourth-order components are detectable. This suggests the presence of an essential nonlinearity in the functional pathway of the Y cell, with its singularity at zero contrast. PMID:7295867
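    The core idea, probing a system with a sum of sinusoids and reading nonlinear interactions off the intermodulation frequencies, can be sketched with a toy memoryless nonlinearity (the quadratic system and the frequencies below are invented for illustration, not the authors' retinal model):

```python
import cmath
import math

N = 64                  # samples over one fundamental period
f1, f2 = 3, 5           # input frequencies (cycles per record)

def nonlinear_system(x):
    # hypothetical memoryless nonlinearity: linear term plus quadratic distortion
    return x + 0.2 * x * x

x = [math.cos(2 * math.pi * f1 * n / N) + math.cos(2 * math.pi * f2 * n / N)
     for n in range(N)]
y = [nonlinear_system(v) for v in x]

def fourier_coeff(sig, k):
    """Complex Fourier coefficient of sig at integer frequency k."""
    return sum(s * cmath.exp(-2j * math.pi * k * n / N)
               for n, s in enumerate(sig)) / N

# The quadratic term puts energy at the intermodulation frequency f1 + f2,
# where the input itself has none; the linear term shows up at f1 and f2.
print(abs(fourier_coeff(y, f1 + f2)))   # about 0.1: second-order interaction
print(abs(fourier_coeff(y, f1)))        # about 0.5: linear response
```

    The second-order interaction appears at f1 + f2 = 8 cycles per record even though the input has no energy there; a purely linear system would show nothing at that frequency, which is what makes intermodulation responses a signature of nonlinearity.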

  10. Obesity in social media: a mixed methods analysis.

    PubMed

    Chou, Wen-Ying Sylvia; Prestin, Abby; Kunath, Stephen

    2014-09-01

    The escalating obesity rate in the USA has made obesity prevention a top public health priority. Recent interventions have tapped into the social media (SM) landscape. To leverage SM in obesity prevention, we must understand user-generated discourse surrounding the topic. This study was conducted to describe SM interactions about weight through a mixed methods analysis. Data were collected across 60 days through SM monitoring services, yielding 2.2 million posts. Data were cleaned and coded through Natural Language Processing (NLP) techniques, yielding popular themes and the most retweeted content. Qualitative analyses of selected posts add insight into the nature of the public dialogue and motivations for participation. Twitter represented the most common channel. Twitter and Facebook were dominated by derogatory and misogynist sentiment, pointing to weight stigmatization, whereas blogs and forums contained more nuanced comments. Other themes included humor, education, and positive sentiment countering weight-based stereotypes. This study documented weight-related attitudes and perceptions. This knowledge will inform public health/obesity prevention practice. PMID:25264470

  11. Women's Access and Provider Practices for the Case Management of Malaria during Pregnancy: A Systematic Review and Meta-Analysis

    PubMed Central

    Hill, Jenny; D'Mello-Guyett, Lauren; Hoyt, Jenna; van Eijk, Anna M.; ter Kuile, Feiko O.; Webster, Jayne

    2014-01-01

    Background WHO recommends prompt diagnosis and quinine plus clindamycin for treatment of uncomplicated malaria in the first trimester and artemisinin-based combination therapies in subsequent trimesters. We undertook a systematic review of women's access to and healthcare provider adherence to WHO case management policy for malaria in pregnant women. Methods and Findings We searched the Malaria in Pregnancy Library, the Global Health Database, and the International Network for the Rational Use of Drugs Bibliography from 1 January 2006 to 3 April 2014, without language restriction. Data were appraised for quality and content. Frequencies of women's and healthcare providers' practices were explored using narrative synthesis and random effect meta-analysis. Barriers to women's access and providers' adherence to policy were explored by content analysis using NVivo. Determinants of women's access and providers' case management practices were extracted and compared across studies. We did not perform a meta-ethnography. Thirty-seven studies were included, conducted in Africa (30), Asia (4), Yemen (1), and Brazil (2). One- to three-quarters of women reported malaria episodes during pregnancy, of whom treatment was sought by >85%. Barriers to access among women included poor knowledge of drug safety, prohibitive costs, and self-treatment practices, used by 5%–40% of women. Determinants of women's treatment-seeking behaviour were education and previous experience of miscarriage and antenatal care. Healthcare provider reliance on clinical diagnosis and poor adherence to treatment policy, especially in first versus other trimesters (28%, 95% CI 14%–47%, versus 72%, 95% CI 39%–91%, p = 0.02), was consistently reported. Prescribing practices were driven by concerns over side effects and drug safety, patient preference, drug availability, and cost. Determinants of provider practices were access to training and facility type (public versus private). 
Findings were limited by the availability, quality, scope, and methodological inconsistencies of the included studies. Conclusions A systematic assessment of the extent of substandard case management practices of malaria in pregnancy is required, as well as quality improvement interventions that reach all providers administering antimalarial drugs in the community. Pregnant women need access to information on which anti-malarial drugs are safe to use at different stages of pregnancy. Please see later in the article for the Editors' Summary PMID:25093720

  12. 7 CFR 58.245 - Method of sample analysis.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ...the USDA, Agricultural Marketing Service, Dairy Programs, or Official Methods of Analysis of the Association of Analytical Chemists or Standard Methods for the Examination of Dairy Products. [67 FR 48976, July 29,...

  13. Adaptive Nodal Transport Methods for Reactor Transient Analysis

    SciTech Connect

    Thomas Downar; E. Lewis

    2005-08-31

    Develop methods for adaptively treating the angular, spatial, and time dependence of the neutron flux in reactor transient analysis. These methods were demonstrated in the DOE transport nodal code VARIANT and the US NRC spatial kinetics code, PARCS.

  14. Common cause analysis : a review and extension of existing methods

    E-print Network

    Heising, Carolyn D.

    1982-01-01

    The quantitative common cause analysis code, MOBB, is extended to include uncertainties arising from modelling uncertainties and data uncertainties. Two methods, Monte Carlo simulation and the Method-of-Moments are used ...

  15. Shear Lag in Box Beams Methods of Analysis and Experimental Investigations

    NASA Technical Reports Server (NTRS)

    Kuhn, Paul; Chiarito, Patrick T

    1942-01-01

    The bending stresses in the covers of box beams or wide-flange beams differ appreciably from the stresses predicted by the ordinary bending theory on account of shear deformation of the flanges. The problem of predicting these differences has become known as the shear-lag problem. The first part of this paper deals with methods of shear-lag analysis suitable for practical use. The second part of the paper describes strain-gage tests made by the NACA to verify the theory. Three tests published by other investigators are also analyzed by the proposed method. The third part of the paper gives numerical examples illustrating the methods of analysis. An appendix gives comparisons with other methods, particularly with the method of Ebner and Koller.

  16. Practical way to assess metabolic syndrome using a continuous score obtained from principal components analysis

    PubMed Central

    Hillier, Teresa A.; Rousseau, A.; Lange, Céline; Lépinay, P.; Cailleau, Martine; Novak, M.; Calliez, Etienne; Ducimetière, Pierre; Balkau, Beverley

    2006-01-01

    Aims/hypothesis We devised a practical continuous score to assess the metabolic syndrome, and assessed whether this syndrome score predicts incident diabetes and cardiovascular disease. Methods Among 5,024 participants of the D.E.S.I.R. cohort, we defined a metabolic syndrome score by the first principal component (PC1), using only the correlations between continuous metabolic syndrome measures (glucose, waist circumference, triglycerides, and systolic blood pressure). This metabolic syndrome score was highly correlated with a similar score also including insulin and HDL-cholesterol (Spearman r=0.94). Over 9 years of follow-up, incident diabetes and cardiovascular disease (CVD) were predicted by logistic regression using the simpler metabolic syndrome score. Results The means of the metabolic syndrome measures differed between men and women. Nevertheless, as the degree of variance explained and the PC1 coefficients were remarkably similar, we used a common metabolic syndrome score. The metabolic syndrome score explained 50% of the variance of the metabolic syndrome measures, and waist circumference had the highest correlation (0.59) with this score. Each standard deviation increase in the metabolic syndrome score was associated with a markedly increased age-adjusted risk of developing diabetes (odds ratio, men: 3.4 [95% CI 2.6–4.4], odds ratio, women 5.1 [3.6–7.2]) and with increased incident CVD of 1.7 (1.4–2.1) in men and 1.7 (1.0–2.7) in women. Conclusions/interpretation Our results, which should be confirmed in other populations, suggest that it is possible to evaluate the risk of the metabolic syndrome in a pragmatic fashion with a continuous score, obtained from a principal components analysis of the basic, continuous syndrome measures. PMID:16752171
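    The scoring idea can be sketched with simulated data (the D.E.S.I.R. measurements are not available here, so four correlated variables standing in for glucose, waist circumference, triglycerides, and systolic blood pressure are generated from a shared latent factor; all numbers are illustrative). PC1 is extracted by power iteration on the correlation matrix of the standardized measures:

```python
import math
import random
import statistics as st

random.seed(0)
# Hypothetical cohort: four correlated "metabolic" measures per subject,
# each driven by one shared latent factor plus independent noise.
n = 500
latent = [random.gauss(0, 1) for _ in range(n)]
data = [[l + random.gauss(0, 0.8) for _ in range(4)] for l in latent]

# z-score each column, then form the 4x4 correlation matrix R
zcols = []
for col in zip(*data):
    mu, sd = st.fmean(col), st.pstdev(col)
    zcols.append([(v - mu) / sd for v in col])
z = list(zip(*zcols))                   # one row of z-scores per subject
R = [[st.fmean([zi[a] * zi[b] for zi in z]) for b in range(4)]
     for a in range(4)]

# Power iteration converges to the leading eigenvector of R,
# i.e. the PC1 loadings.
v = [1.0, 1.0, 1.0, 1.0]
for _ in range(200):
    w = [sum(R[a][b] * v[b] for b in range(4)) for a in range(4)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]

lam = sum(v[a] * sum(R[a][b] * v[b] for b in range(4)) for a in range(4))
explained = lam / 4                     # fraction of variance on PC1
scores = [sum(vi * zi for vi, zi in zip(v, row)) for row in z]
```

    With this setup the PC1 loadings all share the same sign and the component captures a large fraction of the total variance, analogous to the roughly 50% reported in the study; each subject's syndrome score is simply the loading-weighted sum of their standardized measures.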

  17. Sharps-handling practices among junior surgical residents: a video analysis

    PubMed Central

    Tso, David; Langer, Monica; Blair, Geoff K.; Butterworth, Sonia

    2012-01-01

    Background Although “universal precautions” are standard for sharps handling, there has been poor compliance among surgeons. We used video analysis to assess sharps handling practices among junior surgical residents. Methods Postgraduate year (PGY)-2 general surgery and PGY-1 plastic surgery residents were videotaped performing pediatric inguinal hernia repairs. For each procedure, the resident was the principal operator, with the attending surgeon assisting. Retrospective assessment of safe and unsafe sharps handling was determined based on published guidelines. We assessed safety performance in personal sharps tasks, passage of sharps and verbal notification regarding sharps. Data was analyzed using descriptive statistics. Results Data were collected from 18 residents’ videos (4 plastic surgery, 14 general surgery). Residents safely performed sharps tasks, passed and verbally notified about sharps an average of 69.2%, 93.2% and 9.9% of the time, respectively. Suture needle manipulation was handled safely 56.2% of the time (mean 4.4 safe v. 4.3 unsafe actions). Surgical residents demonstrated a safe suture tying technique in 91.8% of cases, proper tissue retraction in 85.2% and safe handling of injection needles in 72.2% of cases. When assessing the safety performance of the surgical team, attending surgeons acting as surgical assistants safely passed sharps 80.0% of the time, while scrub nurses demonstrated safe passing at all times. Attending surgeons used verbal notification when passing sharps 22.7% of the time, while scrub nurses verbally notified the team 4.3% of the time. Conclusion Junior surgical residents consistently passed sharps safely. Personal sharps tasks were less likely to be performed safely, and only a minority of residents verbally notified the team about sharps placement. PMID:22854145

  18. Improved permeability prediction using multivariate analysis methods

    E-print Network

    Xie, Jiang

    2009-05-15

    Predicting rock permeability from well logs in uncored wells is an important task in reservoir characterization. Due to the high costs of coring and laboratory analysis, typically cores are acquired in only a few wells. Since most wells are logged...

  19. A dendrite method for cluster analysis

    Microsoft Academic Search

    T. Caliński; J. Harabasz

    1974-01-01

    A method for identifying clusters of points in a multidimensional Euclidean space is described and its application to taxonomy considered. It reconciles, in a sense, two different approaches to the investigation of the spatial relationships between the points, viz., the agglomerative and the divisive methods. A graph, the shortest dendrite of Florek et al. (1951a), is constructed on a nearest neighbour
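    The variance-ratio criterion proposed alongside this method (now widely known as the Caliński–Harabasz index) compares between-cluster to within-cluster dispersion. A minimal one-dimensional sketch, with invented points:

```python
import statistics as st

def ch_index(points, labels):
    """Caliński–Harabasz variance-ratio criterion for a 1-D clustering."""
    n, ks = len(points), sorted(set(labels))
    k = len(ks)
    grand = st.fmean(points)
    between = within = 0.0
    for c in ks:
        members = [p for p, l in zip(points, labels) if l == c]
        centre = st.fmean(members)
        between += len(members) * (centre - grand) ** 2
        within += sum((p - centre) ** 2 for p in members)
    return (between / (k - 1)) / (within / (n - k))

# two well-separated hypothetical clusters on a line
pts = [1.0, 1.2, 0.9, 1.1, 9.0, 9.3, 8.8, 9.1]
good = [0, 0, 0, 0, 1, 1, 1, 1]     # matches the true grouping
bad = [0, 1, 0, 1, 0, 1, 0, 1]      # arbitrary alternating labels
print(ch_index(pts, good), ch_index(pts, bad))
```

    A compact, well-separated labeling scores orders of magnitude higher than an arbitrary one, which is what makes the criterion usable for choosing among candidate partitions along the dendrite.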

  20. Nonlinear analysis method can improve pipeline design

    Microsoft Academic Search

    A. Aynbinder; B. Taksa; P. Dalton

    1996-01-01

    A nonlinear engineering method for analyzing pipe stress criteria has been developed and can be used in common spreadsheet software for pipeline design. Designs based on this method can enhance the operational reliability of pipeline systems because their designs can more accurately determine actual pipe stress and strain. Most pipeline design codes establish allowable equivalent-stress limits that are higher than

  1. Linear trend analysis: a comparison of methods

    Microsoft Academic Search

    Ann Hess; Hari Iyer; William Malm

    2001-01-01

    In this paper, we present an overview of statistical approaches available for detecting and estimating linear trends in environmental data. We evaluate seven methods of trend detection and make recommendations based on a simulation study. We also illustrate the methods using real data.
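    Two of the simplest tools such comparisons draw on, the parametric OLS slope and the nonparametric Mann-Kendall statistic, can be sketched side by side (the annual series below is invented; the paper's seven specific methods are not reproduced here):

```python
# hypothetical annual series with an upward drift
years = list(range(2000, 2010))
vals = [10.2, 10.5, 10.3, 10.9, 11.0, 11.4, 11.2, 11.8, 12.0, 12.1]

# parametric estimate: ordinary least-squares slope (units per year)
my = sum(years) / len(years)
mv = sum(vals) / len(vals)
slope = (sum((y - my) * (v - mv) for y, v in zip(years, vals))
         / sum((y - my) ** 2 for y in years))

# nonparametric check: Mann-Kendall S statistic, the count of concordant
# minus discordant pairs; S well above 0 suggests an increasing trend
S = sum((b > a) - (b < a)
        for i, a in enumerate(vals) for b in vals[i + 1:])
```

    The two approaches answer related but distinct questions: the slope estimates the magnitude of a linear trend, while S tests monotonicity without assuming linearity or normal errors, which is why trend studies often report both.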

  2. Summarization Evaluation Methods: Experiments and Analysis

    Microsoft Academic Search

    Hongyan Jing

    1998-01-01

    Two methods are used for evaluation of summarization systems: an evaluation of generated summaries against an "ideal" summary and evaluation of how well summaries help a person perform in a task such as information retrieval. We carried out two large experiments to study the two evaluation methods. Our results show that different parameters of an experiment can dramatically affect how well a system scores. For example, summary length

  3. METHODS FOR SAMPLING AND ANALYSIS OF BREATH

    EPA Science Inventory

    The research program surveyed and evaluated the methods and procedures used to identify and quantitate chemical constituents in human breath. Methods have been evaluated to determine their ease and rapidity, as well as cost, accuracy, and precision. During the evaluation, a secon...

  4. PIC (PRODUCTS OF INCOMPLETE COMBUSTION) ANALYSIS METHODS

    EPA Science Inventory

    The report gives results of method evaluations for products of incomplete combustion (PICs): 36 proposed PICs were evaluated by previously developed gas chromatography/flame ionization detection (GC/FID) and gas chromatography/mass spectroscopy (GC/MS) methods. It also gives resu...

  5. Implementation of infection control best practice in intensive care units throughout Europe: a mixed-method evaluation study

    PubMed Central

    2013-01-01

    Background The implementation of evidence-based infection control practices is essential, yet challenging for healthcare institutions worldwide. Although it is acknowledged that implementation success varies with contextual factors, little is known regarding the most critical specific conditions within the complex cultural milieu of varying economic, political, and healthcare systems. Given the increasing reliance on unified global schemes to improve patient safety and healthcare effectiveness, research on this topic is needed and timely. The ‘InDepth’ work package of the European FP7 Prevention of Hospital Infections by Intervention and Training (PROHIBIT) consortium aims to assess barriers and facilitators to the successful implementation of catheter-related bloodstream infection (CRBSI) prevention in intensive care units (ICU) across several European countries. Methods We use a qualitative case study approach in the ICUs of six purposefully selected acute care hospitals among the 15 participants in the PROHIBIT CRBSI intervention study. For sensitizing schemes we apply the theory of diffusion of innovation, published implementation frameworks, sensemaking, and new institutionalism. We conduct interviews with hospital health providers/agents at different organizational levels, along with ethnographic observations, rich artifact collection, and photography, during two rounds of on-site visits, once before and once one year into the intervention. Data analysis is based on grounded theory. Given the challenge of different languages and cultures, we enlist the help of local interpreters, allot two days for site visits, and perform triangulation across multiple data sources. Qualitative measures of implementation success will consider the longitudinal interaction between the initiative and the institutional context. 
Quantitative outcomes on catheter-related bloodstream infections and performance indicators from another work package of the consortium will produce a final mixed-methods report. Conclusion A mixed-methods study of this scale with longitudinal follow-up is unique in the field of infection control. It highlights the ‘Why’ and ‘How’ of best practice implementation, revealing key factors that determine success of a uniform intervention in the context of several varying cultural, economic, political, and medical systems across Europe. These new insights will guide future implementation of more tailored and hence more successful infection control programs. Trial registration Trial number: PROHIBIT-241928 (FP7 reference number) PMID:23421909

  6. Implementing a Virtual Community of Practice for Family Physician Training: A Mixed-Methods Case Study

    PubMed Central

    Jones, Sandra C; Caton, Tim; Iverson, Don; Bennett, Sue; Robinson, Laura

    2014-01-01

    Background GP training in Australia can be professionally isolating, with trainees spread across large geographic areas, leading to problems with rural workforce retention. Virtual communities of practice (VCoPs) may provide a way of improving knowledge sharing and thus reducing professional isolation. Objective The goal of our study was to review the usefulness of a 7-step framework for implementing a VCoP for general practitioner (GP) training and then evaluated the usefulness of the resulting VCoP in facilitating knowledge sharing and reducing professional isolation. Methods The case was set in an Australian general practice training region involving 55 first-term trainees (GPT1s), from January to July 2012. ConnectGPR was a secure, online community site that included standard community options such as discussion forums, blogs, newsletter broadcasts, webchats, and photo sharing. A mixed-methods case study methodology was used. Results are presented and interpreted for each step of the VCoP 7-step framework and then in terms of the outcomes of knowledge sharing and overcoming isolation. Results Step 1, Facilitation: Regular, personal facilitation by a group of GP trainers with a co-ordinating facilitator was an important factor in the success of ConnectGPR. Step 2, Champion and Support: Leadership and stakeholder engagement were vital. Further benefits are possible if the site is recognized as contributing to training time. Step 3, Clear Goals: Clear goals of facilitating knowledge sharing and improving connectedness helped to keep the site discussions focused. Step 4, A Broad Church: The ConnectGPR community was too narrow, focusing only on first-term trainees (GPT1s). Ideally there should be more involvement of senior trainees, trainers, and specialists. Step 5, A Supportive Environment: Facilitators maintained community standards and encouraged participation. 
Step 6, Measurement Benchmarking and Feedback: Site activity was primarily driven by centrally generated newsletter feedback. Viewing comments by other participants helped users benchmark their own knowledge, particularly around applying guidelines. Step 7, Technology and Community: All the community tools were useful, but chat was limited and users suggested webinars in future. A larger user base and more training may also be helpful. Time is a common barrier. Trust can be built online, which may have benefit for trainees that cannot attend face-to-face workshops. Knowledge sharing and isolation outcomes: 28/34 (82%) of the eligible GPT1s enrolled on ConnectGPR. Trainees shared knowledge through online chat, forums, and shared photos. In terms of knowledge needs, GPT1s rated their need for cardiovascular knowledge more highly than supervisors. Isolation was a common theme among interview respondents, and ConnectGPR users felt more supported in their general practice (13/14, 92.9%). Conclusions The 7-step framework for implementation of an online community was useful. Overcoming isolation and improving connectedness through an online knowledge sharing community shows promise in GP training. Time and technology are barriers that may be overcome by training, technology, and valuable content. In a VCoP, trust can be built online. This has implications for course delivery, particularly in regional areas. VCoPs may also have a specific role assisting overseas trained doctors to interpret their medical knowledge in a new context. PMID:24622292

  7. Extraction of brewer's yeasts using different methods of cell disruption for practical biodiesel production.

    PubMed

    Rezanka, Tomáš; Matoulková, Dagmar; Kolouchová, Irena; Masák, Jan; Viden, Ivan; Sigler, Karel

    2014-11-14

    The methods of preparation of fatty acids from brewer's yeast and their use in the production of biofuels and in different branches of industry are described. Isolation of fatty acids from cell lipids includes cell disintegration (e.g., with liquid nitrogen, KOH, NaOH, petroleum ether, nitrogenous basic compounds, etc.) and subsequent processing of the extracted lipids, including analysis of fatty acids and computation of biodiesel properties such as viscosity, density, cloud point, and cetane number. Methyl esters obtained from brewer's waste yeast are well suited for the production of biodiesel. All 49 samples (7 breweries and 7 methods) meet the requirements for biodiesel quality in both the composition of fatty acids and the properties of the biofuel required by the US and EU standards. PMID:25394535

  8. Strategies and Practices in Off-Label Marketing of Pharmaceuticals: A Retrospective Analysis of Whistleblower Complaints

    PubMed Central

    Kesselheim, Aaron S.; Mello, Michelle M.; Studdert, David M.

    2011-01-01

    Background Despite regulatory restrictions, off-label marketing of pharmaceutical products has been common in the US. However, the scope of off-label marketing remains poorly characterized. We developed a typology for the strategies and practices that constitute off-label marketing. Methods and Findings We obtained unsealed whistleblower complaints against pharmaceutical companies filed in US federal fraud cases that contained allegations of off-label marketing (January 1996–October 2010) and conducted structured reviews of them. We coded and analyzed the strategic goals of each off-label marketing scheme and the practices used to achieve those goals, as reported by the whistleblowers. We identified 41 complaints arising from 18 unique cases for our analytic sample (leading to US$7.9 billion in recoveries). The off-label marketing schemes described in the complaints had three non–mutually exclusive goals: expansions to unapproved diseases (35/41, 85%), unapproved disease subtypes (22/41, 54%), and unapproved drug doses (14/41, 34%). Manufacturers were alleged to have pursued these goals using four non–mutually exclusive types of marketing practices: prescriber-related (41/41, 100%), business-related (37/41, 90%), payer-related (23/41, 56%), and consumer-related (18/41, 44%). Prescriber-related practices, the centerpiece of company strategies, included self-serving presentations of the literature (31/41, 76%), free samples (8/41, 20%), direct financial incentives to physicians (35/41, 85%), and teaching (22/41, 54%) and research activities (8/41, 20%). Conclusions Off-label marketing practices appear to extend to many areas of the health care system. Unfortunately, the most common alleged off-label marketing practices also appear to be the most difficult to control through external regulatory approaches. Please see later in the article for the Editors' Summary PMID:21483716

  9. Limits on the resolution of correlation PIV iterative methods. Practical implementation and design of weighting functions

    NASA Astrophysics Data System (ADS)

    Nogueira, J.; Lecuona, A.; Rodríguez, P. A.; Alfaro, J. A.; Acosta, A.

    2005-08-01

    Elsewhere in this volume (Nogueira et al. (2005) Exp Fluids, in press), the conceptual background that explains the possibility of resolving wavelengths smaller than the size of the interrogation window, with no basic restrictions but sampling, has been explained. Here, a practical implementation of the concepts is performed. To achieve this resolution in iterative PIV processing, an appropriate weighting function can be used, as commented in that reference. Here, the constraints for the design of such weighting functions are presented and analysed. This opens a line of work on possible weighting functions to develop, since the weightings used in these iterative methods, like local field correction particle image velocimetry (LFC-PIV) (Nogueira et al. (1999) Exp Fluids 27(2):107 116), have not been optimised yet. As an example, different weighting functions are commented and tested both on synthetic and real images. The results on these new weightings indicate that the current ones can be improved and the optimisation criteria are open for further advancement.

  10. Passive sampling methods for contaminated sediments: Practical guidance for selection, calibration, and implementation

    PubMed Central

    Ghosh, Upal; Driscoll, Susan Kane; Burgess, Robert M; Jonker, Michiel To; Reible, Danny; Gobas, Frank; Choi, Yongju; Apitz, Sabine E; Maruya, Keith A; Gala, William R; Mortimer, Munro; Beegan, Chris

    2014-01-01

    This article provides practical guidance on the use of passive sampling methods (PSMs) that target the freely dissolved concentration (Cfree) for improved exposure assessment of hydrophobic organic chemicals in sediments. Primary considerations for selecting a PSM for a specific application include clear delineation of measurement goals for Cfree, whether laboratory-based “ex situ” and/or field-based “in situ” application is desired, and ultimately which PSM is best-suited to fulfill the measurement objectives. Guidelines for proper calibration and validation of PSMs, including use of provisional values for polymer–water partition coefficients, determination of equilibrium status, and confirmation of nondepletive measurement conditions are defined. A hypothetical example is described to illustrate how the measurement of Cfree afforded by PSMs reduces uncertainty in assessing narcotic toxicity for sediments contaminated with polycyclic aromatic hydrocarbons. The article concludes with a discussion of future research that will improve the quality and robustness of Cfree measurements using PSMs, providing a sound scientific basis to support risk assessment and contaminated sediment management decisions. Integr Environ Assess Manag 2014;10:210–223. © 2014 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals, Inc. on behalf of SETAC. PMID:24288273

  11. Method for chromium analysis and speciation

    DOEpatents

    Aiken, Abigail M.; Peyton, Brent M.; Apel, William A.; Petersen, James N.

    2004-11-02

    A method of detecting a metal in a sample comprising a plurality of metal is disclosed. The method comprises providing the sample comprising a metal to be detected. The sample is added to a reagent solution comprising an enzyme and a substrate, where the enzyme is inhibited by the metal to be detected. An array of chelating agents is used to eliminate the inhibitory effects of additional metals in the sample. An enzymatic activity in the sample is determined and compared to an enzymatic activity in a control solution to detect the metal to be detected. A method of determining a concentration of the metal in the sample is also disclosed. A method of detecting a valence state of a metal is also disclosed.

  12. Imputation of Truncated p-Values For Meta-Analysis Methods and Its Genomic Application

    PubMed Central

    Tang, Shaowu; Ding, Ying; Sibille, Etienne; Mogil, Jeffrey; Lariviere, William R.; Tseng, George C.

    2014-01-01

    Microarray analysis to monitor expression activities in thousands of genes simultaneously has become routine in biomedical research during the past decade. A tremendous number of expression profiles are generated and stored in the public domain, and information integration by meta-analysis to detect differentially expressed (DE) genes has become popular to obtain increased statistical power and validated findings. Methods that aggregate transformed p-value evidence have been widely used in genomic settings, among which Fisher's and Stouffer's methods are the most popular ones. In practice, raw data and p-values of DE evidence are often not available in the genomic studies to be combined. Instead, only the detected DE gene lists under a certain p-value threshold (e.g., DE genes with p-value < 0.001) are reported in journal publications. The truncated p-value information makes the aforementioned meta-analysis methods inapplicable, and researchers are forced to apply a less efficient vote counting method or naïvely drop the studies with incomplete information. The purpose of this paper is to develop effective meta-analysis methods for such situations with partially censored p-values. We developed and compared three imputation methods—mean imputation, single random imputation and multiple imputation—for a general class of evidence aggregation methods of which Fisher's and Stouffer's methods are special examples. The null distribution of each method was analytically derived and subsequent inference and genomic analysis frameworks were established. Simulations were performed to investigate the type I error, power, and the control of the false discovery rate (FDR) for (correlated) gene expression data. The proposed methods were applied to several genomic applications in colorectal cancer, pain and liquid association analysis of major depressive disorder (MDD). The results showed that imputation methods outperformed existing naïve approaches. 
Mean imputation and multiple imputation methods performed the best and are recommended for future applications. PMID:25541588
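The mean-imputation idea in the abstract above can be sketched in pure Python. Under the null a p-value is Uniform(0, 1), so a study reporting only "p < alpha" gets the conditional mean alpha/2 imputed before Fisher's combination. Everything here is illustrative: the function names are made up, and the plain chi-square reference distribution is the uncensored-case baseline (the paper derives the exact post-imputation null analytically).

```python
import math

def chi2_sf_even_df(x, k):
    """P(X > x) for X ~ chi-square with 2k degrees of freedom.
    For even df the survival function has the closed form
    exp(-x/2) * sum_{j=0}^{k-1} (x/2)^j / j!."""
    half = x / 2.0
    term, total = 1.0, 0.0
    for j in range(k):
        total += term
        term *= half / (j + 1)
    return math.exp(-half) * total

def fisher_combine(reported, n_truncated=0, alpha=0.001):
    """Fisher's method with mean imputation for truncated p-values.

    `reported` holds fully reported p-values; `n_truncated` studies
    only reported 'p < alpha', so each is imputed as alpha/2.
    Returns the combined statistic T = -2 * sum(log p) and its
    chi-square (2k df) tail probability.
    """
    pvals = list(reported) + [alpha / 2.0] * n_truncated
    stat = -2.0 * sum(math.log(p) for p in pvals)
    return stat, chi2_sf_even_df(stat, len(pvals))

stat, p = fisher_combine([0.5, 0.5])
# stat = -2 * (ln 0.5 + ln 0.5) ~= 2.7726
```

Adding a censored study (`n_truncated=1`) pulls the combined p-value sharply down, which is the power gain over simply dropping that study.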

  13. Practical considerations for conducting ecotoxicity test methods with manufactured nanomaterials: what have we learnt so far?

    PubMed

    Handy, Richard D; van den Brink, Nico; Chappell, Mark; Mühling, Martin; Behra, Renata; Dušinská, Maria; Simpson, Peter; Ahtiainen, Jukka; Jha, Awadhesh N; Seiter, Jennifer; Bednar, Anthony; Kennedy, Alan; Fernandes, Teresa F; Riediker, Michael

    2012-05-01

    This review paper reports the consensus of a technical workshop hosted by the European network, NanoImpactNet (NIN). The workshop aimed to review the collective experience of working at the bench with manufactured nanomaterials (MNMs), and to recommend modifications to existing experimental methods and OECD protocols. Current procedures for cleaning glassware are appropriate for most MNMs, although interference with electrodes may occur. Maintaining exposure is more difficult with MNMs compared to conventional chemicals. A metal salt control is recommended for experiments with metallic MNMs that may release free metal ions. Dispersing agents should be avoided, but if they must be used, then natural or synthetic dispersing agents are possible, and dispersion controls are essential. Time constraints and technology gaps indicate that full characterisation of test media during ecotoxicity tests is currently not practical. Details of electron microscopy, dark-field microscopy, a range of spectroscopic methods (EDX, XRD, XANES, EXAFS), light scattering techniques (DLS, SLS) and chromatography are discussed. The development of user-friendly software to predict particle behaviour in test media according to DLVO theory is in progress, and simple optical methods are available to estimate the settling behaviour of suspensions during experiments. However, for soil matrices such simple approaches may not be applicable. Alternatively, a Critical Body Residue approach may be taken in which body concentrations in organisms are related to effects, and toxicity thresholds derived. For microbial assays, the cell wall is a formidable barrier to MNMs and end points that rely on the test substance penetrating the cell may be insensitive. Instead, assays based on the cell envelope should be developed for MNMs. In algal growth tests, the abiotic factors that promote particle aggregation in the media (e.g. 
ionic strength) are also important in providing nutrients, and manipulation of the media to control the dispersion may also inhibit growth. Controls to quantify shading effects, and precise details of lighting regimes, shaking or mixing should be reported in algal tests. Photosynthesis may be more sensitive than traditional growth end points for algae and plants. Tests with invertebrates should consider non-chemical toxicity from particle adherence to the organisms. The use of semi-static exposure methods with fish can reduce the logistical issues of waste water disposal and facilitate aspects of animal husbandry relevant to MNMs. There are concerns that the existing bioaccumulation tests are conceptually flawed for MNMs and that new test(s) are required. In vitro testing strategies, as exemplified by genotoxicity assays, can be modified for MNMs, but the risk of false negatives in some assays is highlighted. In conclusion, most protocols will require some modifications, and recommendations are made to aid the researcher at the bench. PMID:22422174

  14. CHAPTER 7. BERYLLIUM ANALYSIS BY NON-PLASMA BASED METHODS

    SciTech Connect

    Ekechukwu, A

    2009-04-20

    The most common method of analysis for beryllium is inductively coupled plasma atomic emission spectrometry (ICP-AES). This method, along with inductively coupled plasma mass spectrometry (ICP-MS), is discussed in Chapter 6. However, other methods exist and have been used for different applications. These methods include spectroscopic, chromatographic, colorimetric, and electrochemical. This chapter provides an overview of beryllium analysis methods other than plasma spectrometry (inductively coupled plasma atomic emission spectrometry or mass spectrometry). The basic methods, detection limits and interferences are described. Specific applications from the literature are also presented.

  15. Supercritical fluid methods for the analysis of complex fuel and environmental samples

    SciTech Connect

    Smith, R.D.; Yonker, C.R.; Kalinoski, H.T.; Chess, E.K.; Udseth, H.R.; Frye, S.L.; Wright, B.W.

    1985-11-01

    The application of supercritical fluid methods can greatly improve the analysis of complex mixtures spanning wide chemical classes. In separations from complex matrices supercritical fluids can be used to improve both extraction efficiency and speed. The variable solvating powers and gas-like diffusion coefficients and viscosities can provide efficient new methods for sample enrichment, cleanup and fractionation. Supercritical fluid chromatography (SFC) is applicable to both thermally labile and less volatile materials. The use of small diameter (25- to 50-µm) fused silica capillary columns provides chromatographic efficiencies for SFC that are nearly comparable to conventional capillary gas chromatography and much greater than practical by liquid chromatography (LC). The development of practical capillary SFC-MS instrumentation, allowing both electron impact and chemical ionization mass spectrometry (MS), is expected to provide a powerful alternative to LC-MS for complex mixture analysis. Recent results will be presented on the development of methods for the analysis of fuels, labile pollutants, hazardous solid waste materials, and related complex environmental mixtures. The application of various supercritical fluids for sample extraction, SFC and SFC-MS, and the recent development of a system for automated supercritical fluid extraction-chromatography will also be described. Finally, the potential role of supercritical fluid methods in routine chemical analysis will be discussed. 16 refs., 11 figs.

  16. Dissolved oxygen: method comparison with potentiometric stripping analysis

    SciTech Connect

    Fayyad, M.; Tutunji, M.; Ramakrishna, R.S.; Taha, Z.

    1987-04-01

    Three methods for the determination of dissolved oxygen in samples of natural water are compared: potentiometric stripping analysis (PSA), oxygen-selective electrodes, and the usual Winkler procedure. PSA compares well with oxygen-selective electrodes. Although potentiometric stripping analysis and oxygen-selective electrode methods are found to be simple, rapid and of higher reproducibility than the usual Winkler procedure, the use of oxygen-selective electrodes has many disadvantages.

  17. Calibration of dynamic analysis methods from field test data

    Microsoft Academic Search

    A. Anandarajah; J. Zhang; C. Ealy

    2005-01-01

    In view of the heterogeneity of natural soil deposits and approximations made in analysis methods, in situ methods of determining soil parameters are highly desirable. The problem of interest here is the nonlinear dynamic behavior of pile foundations. It is shown in this paper that soil parameters needed for simplified dynamic analysis of a single pile may be back-calculated from

  18. Petri nets analysis using incidence matrix method inside ATOM3

    E-print Network

    Bellogin, Alejandro

    (...) As a graphical tool, Petri nets can be used as a visual-communication aid similar to flow charts and block diagrams. (...) Petri nets analysis using incidence matrix method inside ATOM3, Alejandro Bellogín Kouki. (...) a meta-modelling and model-transforming tool called ATOM3. Analysis methods for Petri nets may be classified (...)
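The incidence-matrix view of a Petri net mentioned in this record can be sketched independently of ATOM3 (all names below are illustrative): the matrix C = Post − Pre gives the net token change per place when a transition fires, and a marking evolves as M' = M + C·x for a firing-count vector x.

```python
def incidence_matrix(pre, post):
    """C[p][t] = post[p][t] - pre[p][t]: net token change at place p
    when transition t fires once (pre/post are place-by-transition)."""
    return [[post[p][t] - pre[p][t] for t in range(len(pre[0]))]
            for p in range(len(pre))]

def fire(marking, pre, post, t):
    """Fire transition t if it is enabled (every input place holds at
    least pre[p][t] tokens); return the successor marking."""
    if any(marking[p] < pre[p][t] for p in range(len(marking))):
        raise ValueError("transition %d not enabled" % t)
    return [marking[p] - pre[p][t] + post[p][t]
            for p in range(len(marking))]

# Two places, one transition moving a token from p0 to p1:
pre  = [[1], [0]]   # t0 consumes 1 token from p0
post = [[0], [1]]   # t0 deposits 1 token in p1
print(incidence_matrix(pre, post))   # [[-1], [1]]
print(fire([1, 0], pre, post, 0))    # [0, 1]
```

Structural properties such as place invariants come from this matrix (vectors y with yᵀC = 0), which is one way analysis methods for Petri nets are classified.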

  19. PRACTICAL EXPERIENCE IN ANALYSIS OF ORGANIC COMPOUNDS IN AMBIENT AIR USING CANISTERS AND SORBENTS

    EPA Science Inventory

    Generation of accurate ambient air VOC pollutant measurement data as a base for regulatory decisions is critical. Numerous methods and procedures for sampling and analysis are available from a variety of sources. Air methods available through the Environmental Protection Agency are con...

  20. Methods for Evidence-Based Practice: Quantitative Synthesis of Single-Subject Designs

    ERIC Educational Resources Information Center

    Shadish, William R.; Rindskopf, David M.

    2007-01-01

    Good quantitative evidence does not require large, aggregate group designs. The authors describe ground-breaking work in managing the conceptual and practical demands in developing meta-analytic strategies for single subject designs in an effort to add to evidence-based practice. (Contains 2 figures.)